So you want to calibrate your TV/monitor/beamer, ey? *uncomfortable silence*
Are you sure that you are up to this task?
Ok, if you say so...
It's actually easier than most people want to make you believe - as with all things, it takes some practice to get better, but not that much, really. The only condition is that you have to be thorough.
Let's start with something simple that you have most certainly come in contact with.
> What are TV Presets?
Presets (Standard, Cinema, Gaming, ...) are predefined groupings of settings, in most cases visible to the user via the settings page, but sometimes also producing slightly different picture behavior that the user cannot change (greyed out settings, or altered behavior beyond the user's control).
Presets are used for quick select purposes.
When calibrating a TV you usually restrict yourself to, or at least start with, the "cinema" or "user" preset. Cinema is most of the time the closest to the desired rec 709 standard (the "color" standard that all 16:9 HD material (including Blu-rays) is mastered (and then encoded) in), and in Cinema mode normally all available settings are user controllable. The user preset (if available) is basically an "everything that can be changed on your set is available, play with it" preset and also usable.
Sub presets like Cinema 1, Cinema 2, ... are also used for quick select purposes.
> What is calibrating?
Setting the TV at hand as close as possible to the agreed-upon standard for (in most cases) HD content (rec 709, also called ITU 709).
> What calibrating is not -
Color correcting the original material. If, for example, something is shot/encoded with a sepia tone to it, calibrating won't get rid of it.
If you want to play "artist", stick to switching the color temperature settings.
> Can calibration fix my TV's color reproduction problems?
Only to a certain extent. As in "a bit". The saying that "setting brightness and contrast just right" is 80% of calibration in many cases isn't that far off. To a great deal it depends on the "quality of the panel" itself, where by quality the manufacturer means "measured color accuracy and contrast (mostly black level)".
Calibration on a device level (as in "I bought a colorimeter, or a spectro") is 90% "setting white balance" and the measly rest "setting the 100% saturation points of each primary/secondary color - if the device supports it" (with color and hue mostly already being set accurately at the factory). Point being: if after setting white balance "not everything falls into place", your options are limited. [Unless you either limit yourself to an HTPC with a certain renderer, which heavily limits your media player choice, or invest several hundred bucks in hardware boxes - for 3D LUT generation, which at that point is basically color correcting the input signal based on the persistent flaws of your display (*think* "profiling" each region of the color spectrum on the basis of several hundred automated measurements).]
Setting white balance right can even accentuate color reproduction problems (as in "I could live with everything having a slight green tint, but since I calibrated my set the sky looks purple" (exaggeration)).
> What is calibrating - part 2
First, it's sticking with the warm color temperature profile (warm 2 in most cases). Because - believe it or not - it's the encoding standard. That's the production standard. *Heck*, warm 2 (D65) is the white balance standard for close to EVERY (still used) color space defined by man. On earth. Up till now. [edit: That is, western (Europe, US, ...) ones. Please see randal_r's posting below.]
Then it's getting the warm color temperature >exactly< right. Warm is just a user friendly way of saying "close to the D65 standard of white". As in "a black to white curve (through all the greys there are) with the white point at D65".
Calibrating is also adjusting colors, but more on that later.
> What's D65?
D65 is a certain color of white (red, green, blue proportions) that is defined as 100% white. Why? Simple. Because it's the "agreed upon" color of white that a white sheet of paper has in the midday sun in certain regions of Europe. Sounds logical, right?
But wait, it is. When calibrating white balance on scene, cameramen historically used white sheets of paper to do it - which more often than not were "illuminated by the sun" - and voila: after some guys agreed on what the EXACT color was (probably by the highly scientific method of drawing straws and rolling some dice), it became the standard.
Also, this doesn't mean that your TV set won't be able to reproduce more overcast or more bluish whites - it's just that to do so, it has to be instructed by the video signal itself. The color of white is important as in "everyone open your watercolor set and grab the brush".
Point being - nowadays in production (close to) everything is calibrated to D65. The cameras, the screener monitors, the monitors in the production facilities, the encoding "file format", Blu-rays, ... EVERYTHING.
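If you prefer numbers over paper-in-the-sun stories: D65 is pinned down as the CIE chromaticity coordinate x = 0.3127, y = 0.3290. A minimal Python sketch (the helper function is mine, purely for illustration) converting that to the XYZ white point you'll find hard-coded in just about every piece of color math:

```python
# A minimal sketch: the D65 white point as a CIE xy chromaticity
# coordinate, converted to XYZ tristimulus values (Y normalized to 1).
def xy_to_XYZ(x, y, Y=1.0):
    """Convert a CIE xy chromaticity coordinate to XYZ tristimulus values."""
    X = (x / y) * Y
    Z = ((1.0 - x - y) / y) * Y
    return X, Y, Z

# D65 as defined by the CIE: x = 0.3127, y = 0.3290
print(xy_to_XYZ(0.3127, 0.3290))  # ~ (0.9505, 1.0, 1.0891)
```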
> What about those other color temperature profiles (neutral, vivid, ...)?
Warm 2 (D65) has a white point that should lie at around 6500 K(elvin); all the others have increasingly "colder" looking whites (physically those are actually hotter color temperatures), as in "more bluish". Think 7000 K to 10000 K+.
Point is, we don't use them. As calibrating is all about "getting close to the standard" and not about "color correcting this one scene, where the influence of the film material itself is clearly visible, and...". Because you would be doing it with a broad brush the size of a barn door.
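For the curious, those Kelvin numbers can be estimated from a measured xy coordinate. A sketch using McCamy's approximation (a well-known textbook formula; the second coordinate below is made up as an example of a bluish preset):

```python
# A sketch of McCamy's approximation: correlated color temperature (CCT)
# in Kelvin, estimated from a CIE 1931 xy chromaticity coordinate.
def mccamy_cct(x, y):
    n = (x - 0.3320) / (y - 0.1858)
    return -449.0 * n**3 + 3525.0 * n**2 - 6823.3 * n + 5520.33

print(round(mccamy_cct(0.3127, 0.3290)))  # D65 -> ~6505 K
print(round(mccamy_cct(0.2950, 0.3050)))  # a more bluish white -> ~8000 K
```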
> Why are we talking so much about white balance (= the greyscale curve (from black to white))?
Because it's easy. And it seems to be worthwhile. In encoding nowadays, especially with HD file formats, most of the picture detail information is encoded in greys (as in "black and white"). Because it saves disc space. [Also there is a historical contingency component to it.] Those "black and white" pictures are then "colored over" (as in - not at all colored over, because we are "painting with light" (= additive color space) - but mixing metaphors is just so much fun). Not "colored in". "Colored over."
So you are saying that if this "grey" "black and white" base has a red, or blue, or green, or candycottonribbonpurple "tint", it will be noticeable in the final (colorful) picture? Ah, but yes, sire, this is exactly what I'm saying.
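Here is that grey/color split as a small sketch, using the standard BT.709 coefficients (the function name is mine, not from any particular library). The Y' channel is the "black and white" base that carries the detail; Cb/Cr are the "colored over" part, typically stored at a quarter of the resolution (4:2:0 chroma subsampling):

```python
# A sketch of the BT.709 R'G'B' -> Y'CbCr split: the luma channel (Y')
# is the "grey" base, the two chroma channels (Cb, Cr) carry the color.
KR, KB = 0.2126, 0.0722  # BT.709 luma coefficients

def rgb_to_ycbcr_709(r, g, b):
    """Nonlinear R'G'B' (0..1) to Y'CbCr (Y' in 0..1, Cb/Cr in -0.5..0.5)."""
    y = KR * r + (1 - KR - KB) * g + KB * b
    cb = (b - y) / (2 * (1 - KB))
    cr = (r - y) / (2 * (1 - KR))
    return y, cb, cr

print(rgb_to_ycbcr_709(1.0, 1.0, 1.0))  # white -> (1.0, 0.0, 0.0): pure "grey"
print(rgb_to_ycbcr_709(1.0, 0.0, 0.0))  # red   -> (0.2126, -0.1146, 0.5)
```

A tint in that Y' base sits under everything that gets "colored over" it - which is exactly why white balance errors show up across the whole picture.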
> Why are we talking so little about calibrating "colors" (on top of it)?
Because it's hard. Still worthwhile, though.
If a TV has a CMS (color management system), as in "the ability to also calibrate the three primary colors and three secondary colors at 100% saturation", you are doing exactly that: calibrating six colors out of several million (depending on your color space). What fun. As they are the "end points", it might have an impact on how all your other colors are displayed, and the impact might even be positive - but hey, you are calibrating six colors.
> But how does this whole "color evaluation" work then (I bought a meter to calibrate white balance and six colors?11!?)
Calibrating on an evaluation level is "looking at a subset of points in the color space - each hopefully representative of a certain "sector" of the color space - and then giving a probability based (percentages) assessment of color performance".
Even if you are creating a 3D LUT (beyond the scope of "normal" calibration) and doing close to a thousand different color measurements, you are basically looking at a small sample (averaging, making decisions based on probability) compared to the colors actually available.
Be happy and rejoice, though - because with current technology colors are mixed "together" through basically the three primary colors, there at least is a tendency towards a linear progression of color reproduction and color errors - except when there isn't, which is what we then call "cheap" displays ("cheap" for the manufacturer to buy - regardless of whether they produce the panels themselves; there is a market, you see).
-- BREAK TO REGULATE YOUR BREATHING, GRAB AN APPLE AND REFOCUS --
Back to practical stuff.
Picture Setting Options
Contrast - The "brightness" of white (the entire color spectrum). In most cases.
There might not be "a" right setting for contrast - as "more contrast is better", but not if
- one of the color triplets (RGB) gets overdriven, and the white therefore gets a color tint, as one of the primaries has "run out of color". Look for a tint in the pattern you are setting it with while adjusting the slider (a toy illustration follows after this list).
- the picture becomes too bright for the room light you are dealing with ("my eyyyyeeees!"). Some TVs (depending on the technology) have the option to reduce the backlight (or a similar equivalent) independent of contrast, in which case: turn down the backlight (adjusting contrast in addition might give you a more granular scale (the backlight slider "jumps" between larger steps)). Adjusting the backlight, in theory, shouldn't impact your TV's greyscale or color performance at all.
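To illustrate the first point, a toy model (NOT how any TV processes contrast internally - just the arithmetic of one primary "running out"):

```python
# An illustrative sketch: pushing "contrast" as a naive gain on an 8-bit
# triplet. The channel that hits 255 first clips, so further gain tints
# what should stay a neutral white - and eventually crushes it entirely.
def apply_contrast(rgb, gain):
    return tuple(min(255, round(c * gain)) for c in rgb)

near_white = (250, 246, 240)
print(apply_contrast(near_white, 1.00))  # (250, 246, 240) - neutral-ish
print(apply_contrast(near_white, 1.04))  # (255, 255, 250) - blue lags: yellow tint
print(apply_contrast(near_white, 1.10))  # (255, 255, 255) - near-white detail gone
```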
Brightness - The "brightness of black" (the low end of the color spectrum).
Sharpness - Image detail reproduction/invention/blurage
Those three can be set by eye alone using a calibration disc (you will use individual patterns to calibrate each of them - refer to your calibration disc provider for all "how to" questions, after RTFM).
How about using the free one this forum has sourced, which is one of the best out there anyway?
It's called AVS HD 709 (AVS HD 709 - Blu-ray & MP4 Calibration).
709 refers to rec 709, which means we are setting the HD color space with HD material, which the TV/beamer has correctly identified as HD material and applied the right color decoding process to display. If that chain works as expected, THEORETICALLY this configuration would also be useful for SD material (rec 601, among others), IF the TV identifies and decodes that as such (a sketch of what's at stake follows below).
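To make "the right color decoding process" tangible: the sketch below encodes a color with the HD (BT.709) coefficients and decodes it with the SD (BT.601) ones. The coefficient values are the standard ones; the helper functions are just for illustration:

```python
# A sketch of a matrix mismatch: encode pure green with BT.709 luma
# coefficients, then decode it as if it were BT.601 (SD) material.
def encode(r, g, b, kr, kb):
    y = kr * r + (1 - kr - kb) * g + kb * b
    return y, (b - y) / (2 * (1 - kb)), (r - y) / (2 * (1 - kr))

def decode(y, cb, cr, kr, kb):
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / (1 - kr - kb)
    return r, g, b

ycc = encode(0.0, 1.0, 0.0, 0.2126, 0.0722)  # pure green, encoded as HD
print(decode(*ycc, 0.299, 0.114))            # decoded as SD: ~(0.08, 1.17, 0.03)
```

Not pure green anymore - which is what a wrongly identified signal does to every color in the picture.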
If you are using HCFR, you might also want to download the free GCD disc for the saturation sweeps and color checker patterns: GCD - Gamut Calibration Disk
Also, one additional word concerning the "correct chain". Before you even consider touching the settings you can calibrate by eye, make sure the TV understands the color levels the source device outputs. On most newer TVs it's an "auto setting" which "just works" - except when it doesn't and you have to set it manually.
The problem is as follows. Colors are defined in RGB triplets (e.g. 255,255,255) where each "primary color portion" is described with a value from 0-255 (= 8 bit) - except when it isn't. Like in the (also HD) video standard, where the spectrum gets cut (bottom end and top end) to 16-235.
16-235 is the actual video standard (limited RGB), although on some Blu-rays you'll find values above the 235 level (depending on the encoding).
0-255 is the standard used by computers (full RGB).
Point being that the limited standard reduces the potential number of colors (less disc space), BUT during the decoding process it either gets "displayed correctly" or it gets "expanded to be displayed correctly", depending on which internal processing the TV/beamer/monitor uses.
It's best explained by looking at the black point: 0,0,0 vs 16,16,16. Both represent the same color (black), but the TV has to know according to which signal standard it is being "fed". To display the same color.
If it doesn't, and your TV is "capable of displaying a correct picture, by today's even average standards for TVs", you will either get a washed out picture or heavy black crush, depending on which of the two possible false combinations you hit (expects full, gets limited // expects limited, gets full).
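Both failure modes in a nutshell, as a sketch (the expansion formula is the standard 16-235 -> 0-255 mapping; everything else is simplified):

```python
# A sketch of the limited (16-235) vs full (0-255) handshake. This is what
# a display does to incoming 8-bit codes when it expects limited range.
def expand_limited_to_full(v):
    return max(0, min(255, round((v - 16) * 255 / 219)))

# Correct chain: limited source, display expects limited.
print(expand_limited_to_full(16))   # -> 0   (video black shown as black)
print(expand_limited_to_full(235))  # -> 255 (video white shown as white)

# Failure mode 1: limited source, display expects full -> no expansion
# happens, black stays at code 16 (dark grey) -> washed out picture.

# Failure mode 2: full source, display expects limited -> wrong expansion:
print(expand_limited_to_full(10))   # -> 0, shadow detail below 16 crushed
print(expand_limited_to_full(245))  # -> 255, highlights above 235 clipped
```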
So this is actually the first thing to check while "calibrating" your set.
It's done by looking at the same pattern you use for calibrating brightness.
Also, while calibrating it might be a good idea to "force" a chain [only if the devices specifically state that they support BOTH options AND give you a user controllable way of setting it]. Meaning: setting the TV to full or limited (not auto) manually, while knowing which signal your source device outputs. When using a PC, most graphics cards have the option to output BOTH, so double check this side as well.
Most Blu-ray players nowadays also have the option to output BOTH, so double check here as well. Why do Blu-ray players have the option for both? So you can use them on older PC monitors, for example, which would always expect 0-255 (full RGB).
So which one to use then? First: the one your display expects/supports. If it supports both: the one that produces the least amount of color conversions (think 0<>16) for the material you use most on this input chain. Blu-rays (video in general) for the most part are encoded in 16-235 (limited) (or 16-255 at most), while the PC on which you are surfing the internet "for pictures" will mostly want to output 0-255 (full).
As it is not unheard of that you might also want to use a PC to watch movies, there might be at least one "color conversion" happening in the chain (most often done by the graphics driver without your knowledge) - so that's happening (and hopefully doing its math correctly), meaning the actually crucial point is NOT which of them to use, but that both your source and your display device agree on which one they are using.
Later, while calibrating with a meter, it's also important that you set your calibration software to "produce" the correct levels (0-255 or 16-235) according to the input chain you've set (/or the TV is "detecting"), if you use it (the PC/calibration software) as a signal generator as well.
(edit: This is only partly true - as there is one important exception.
It has been explained later in this thread, please read up on it here: The certainly not complete user guide to get to know and calibrate your TV )
If you use a Blu-ray calibration disc or an HD video file as a signal generator (= to display the colors you are measuring), the player (hardware) will decide on (and possibly expand) which color space to output (the source is encoded in limited (16-235) according to the rec 709 spec).
Calibrating with a meter
> I vont to calibrate my white balance/CMS and look at how my TV is performing - what should I buy?
If you don't want to break the bank: an i1 Display Pro (i1d3) or ColorMunki Display (about the same hardware, takes slower measurements) from X-Rite, according to popular opinion and the article over at Dry Creek Photo ( http://www.drycreekphoto.com/Learn/C...nHardware.html
) - an independent site that did sample variation testing.
[I feel compelled to mention that the sample sizes in the article are way below statistical significance; nevertheless it is the most compelling "effort" at a comparative assessment out there. Take it with a grain of salt, maybe don't take it as gospel, but imho read it anyhow.] Also, both of these colorimeters have the advantage that they don't degrade at all (at least it would not be expected) and work with HCFR (open source calibration software) via plug and play (no driver installs needed).
Also - you could perhaps get them calibrated more "accurately" (always according to "presets" > averages of panel technologies/panels/manufacturers), but even then they (colorimeters in general - you could buy more expensive ones as well, but mostly the i1d3) remain the standard for calibration in the field, because of their reading speed and software compatibility.
> Calibrated more accurately?
The problem with colorimeters is that they need correction tables according to the "spectrum of light" they are working with. Meaning: is it a CCFL backlight (or simply "light" when dealing with projectors), is it an LED backlight, is it a plasma, is it an OLED, ...
Now, according to Dry Creek Photo the mean deviation on the i1d3s they tested was 0.4 dE (on one display); according to resellers of the i1d3 that want to upsell you on THEIR SPECIAL calibration (mostly different device presets, but not only) it's up to a delta error (dE) of 4 - but remember, even they don't care to know the exact display you are working with (panel, manufacturer), which is why you want to save up and buy this great 10,000 USD spectroradiometer which will give you more accurate measurements for up to six months, until it has to be recalibrated, because its filters usually degrade over time.
DeltaE. It's a number. *bushrollsthrough* Basically DeltaE (dE) is a formula that condenses several kinds of color errors (also for the color of black, greys, white) down to one number - one which also includes "psycho visual" correction for "how perceivable a color error is to a human".
Lower is better.
Below 3 is said to be "not humanly perceptible". Which is BS.
Notice that in the example above, dE 0.4 or 4 is a "mean deviation", that is, a number you ought to "add" to your own dE measurements. If you want to question your sanity.
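If you wonder what kind of number dE actually is: here is its simplest ancestor, the 1976 formula - plain straight-line distance in CIELAB space. The dE2000 formula HCFR reports piles perceptual weighting on top, but the principle is the same. The "measured" triplet below is made up:

```python
# The idea behind deltaE in its simplest (CIE76) form: Euclidean distance
# between two colors in CIELAB space. Lower = closer = better.
import math

def delta_e_76(lab1, lab2):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

reference = (53.2, 80.1, 67.2)   # roughly sRGB red in L*a*b*
measured  = (52.8, 78.5, 66.0)   # a hypothetical meter reading
print(round(delta_e_76(reference, measured), 2))  # -> 2.04
```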
> I vont to use a meter, which software should I use?
HCFR. ( HCFR - Open source projector and display calibration software
) The end. If your meter is supported. Because, see, those meter vendors sometimes like to get those juicy paybacks from the software vendors who sell their solutions for thousands of dollars - or at least dream about that - or, if that should turn out not to be the case, HCFR can't implement them for other reasons.
Why? Because it's faster. Faster will encourage you to do more measurements, rather than getting each step spoonfed for the 100th time while in the end marveling over a comparison report that always should show "better numbers", even if a monkey were pressing the buttons - except in those rare cases when "no improvement could be reached", and you congratulate yourself and others for buying such a grand TV.
[Reader discretion advised: the strong recommendation towards HCFR is a strongly held subjective opinion of the guy who has written these texts. There may be differing opinions out there.]
If you want to calibrate a 3D LUT for your devices you will not use HCFR as a software package > but you will probably read zoyd's tutorials on the matter:
> How to calibrate?
Other tutorials. Or just look at the posting below this one...
Just the broad strokes:
- Position the meter. If it is not a direct contact measurement - wait until 00:00 midnight, turn off the lights and mask your windows with aluminum foil (remove room light). It's a standards thing. Measurement targets are calculated without room light influence.
- Set the TV up somewhat ideally for calibration (presets, sometimes a reset)
- Check if the limited RGB / full RGB chain is recognized correctly
- Do your contrast, brightness, sharpness calibration by eye - with test patterns.
- Check gamma (if it's not anywhere near the graph where you would expect it, and you can't dial it in via a gamma slider - you will have a long day ahead of you (10 pt greyscale calibration is your only hope))
- Calibrate white balance (= greyscale)
- Calibrate 100% saturated colors (also check the Color / Hue sliders at this point)
- Calibrate saturation sweeps (check the Color / Hue sliders at this point, too)
- Do a color checker measurement (might want to check the Color / Hue sliders also, just sayin')
If you are looking at that fun color triangle in your calibration software: the Color slider moves ALL points closer to / farther from the center, while the Hue (= Tint) slider rotates ALL of them around the mid point (clockwise / counterclockwise). Also, what you see on the triangle doesn't include color luminance - so it's just a guide, not "picture proof". The proof is in the dE values (or in the other numbers used to calculate them, which will show up in HCFR and ... GO BY dE, DON'T OVERCOMPLICATE YOUR LIFE (unless you are ready to do so)).
A CMS only allows you to modify the 100% saturation points of all primary and all secondary colors (red, green, blue, yellow, cyan, magenta). After that you are hoping that your color readings fall into place. (A toy sketch of the two global sliders follows below.)
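Said toy sketch - pure geometry on the xy diagram, not any TV's actual internals (the function and its parameters are made up for illustration):

```python
# An illustrative sketch of the two global sliders: "Color" scales every
# point's distance from the white point, "Hue/Tint" rotates every point
# around it by the same angle.
import math

D65 = (0.3127, 0.3290)  # white point in CIE xy

def apply_sliders(point, saturation=1.0, hue_degrees=0.0, white=D65):
    dx, dy = point[0] - white[0], point[1] - white[1]
    a = math.radians(hue_degrees)
    rx = dx * math.cos(a) - dy * math.sin(a)
    ry = dx * math.sin(a) + dy * math.cos(a)
    return (white[0] + saturation * rx, white[1] + saturation * ry)

red_709 = (0.640, 0.330)  # rec 709 red primary in xy
print(apply_sliders(red_709, saturation=0.9))   # pulled toward white
print(apply_sliders(red_709, hue_degrees=5.0))  # rotated - like ALL points would be
```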
Also, in general almost all of the steps influence each other in turn, so do at least two, maybe three measurement runs as you close in on the ideal configuration.
Also, there are slightly different formulas to account for the "perceptibility of color errors" (the formulas that calculate the dEs). We mostly use dE2000 (redefined in the year 2000), sometimes also dEuv when looking at the white balance (greyscale).
> Gamma? What is Gamma?
The horror and cause of sleepless nights and endless anxiety. The reason why this whole industry is a scam and people should get fired.
There are two ways we perceive "depth" as humans: stereoscopically (left eye picture different from right eye picture, the brain calculates - *.error does not compute* - the depth positioning of objects), and via "shades" of colors (farther away = darker). As we all hate stereoscopic 3D (*hate those glasses, hate even the glasses-free kind*), we won't talk about it. (Ok, some people like it, ...)
Gamma. Gamma is "depth perception" caused by "color shading" in pictures. (Actually this is wrong, and the actual definition is much more complex, but you know...) Psycho-visual experiments showed that humans prefer gamma at - some scale, let's not talk about it - (power law) 2.2. Great, said the manufacturers, and produced newer TVs/beamers with gamma at 2.2 - because they could.
The ITU forgot to think about gamma at all when defining the rec 709 standard, so they are the ones that should get fired.
Ok, one step back. It was a problem where they would have needed at least one person capable of "thinking hard" (John Oliver segment). Which they hadn't at the time - so let's all pretend it's excusable.
One more step back. Old CRT TVs and monitors had widely differing gamma, all "somewhere in the 2.4 range". But with all kinds of widely differing curves from black > greys > white. Because this was the case with analog cameras as well, just in the inverse - the two somewhat canceled each other out, or not, or who cares, let's party. ITU, ITU, IT...
But then all those newer TVs came along with gamma set to the "correct" linear progression of 2.2.
And everyone was still partying. Until 2014, actually. When someone noticed: the cinema standard had had a defined gamma. All those years. Just sayin'.
So what's the problem here? The actual problem comes into play during post production (color correction): most of post production, according to people who claim to know, is still done/was done the entire time on production level CRTs, which show a different gamma profile than those great new TVs/beamers you can buy today.
Uh, and the scrambling started...
"Use a flat power law 2.4", some said; "use an entirely different curve (close to 2.4) called bt1886", others said.
And suddenly all seemed to agree on bt1886 as a recommendation - except the TV manufacturers and the TV test sites, who said *eff yo* and called it a day. Couldn't blame them.
*But, what does it mean, man?*
You can decide for yourself whether you want to adhere to power law 2.2 or 2.4, or bt1886, or use black compensation on the first two, or criticize bt1886 for an integrated black compensation that is jokingly exponential on displays which don't even need it - the result is -
that color measurements (display performance) will be slightly different depending on which of them you choose - BUT, gladly, it's just in the 0.*something* dE range, so *everything is cool man* - except it isn't, and the resulting pictures look noticeably different, with you guessing the creator's intent (probably aiming for power law 2.4 or bt1886 on Blu-ray movies).
Colors are mostly affected in the mid tone range.
FUN. Now, can someone please get fired?
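If you want to see the contenders side by side, here is a sketch of the flat power law against the BT.1886 EOTF. The a/b constants follow the published BT.1886 formula; the 100 cd/m2 white and 0.05 cd/m2 black are just example measurements:

```python
# A sketch: flat power-law EOTF vs the BT.1886 EOTF, mapping a video
# signal level v (0..1) to luminance in cd/m2.
GAMMA = 2.4

def power_law(v, lw=100.0, lb=0.0):
    return lb + (lw - lb) * v ** GAMMA

def bt1886(v, lw=100.0, lb=0.05):
    # Black compensation: fold the display's real black level into the curve.
    a = (lw ** (1 / GAMMA) - lb ** (1 / GAMMA)) ** GAMMA
    b = lb ** (1 / GAMMA) / (lw ** (1 / GAMMA) - lb ** (1 / GAMMA))
    return a * max(v + b, 0.0) ** GAMMA

for v in (0.0, 0.1, 0.5, 1.0):
    print(v, round(power_law(v), 3), round(bt1886(v), 3))
# v=0.1: ~0.398 vs ~0.862; v=0.5: ~18.95 vs ~20.91 - the curves differ most
# near black and through the mid tones, which is exactly where the
# "creator's intent" guessing happens.
```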
> What about those other fun numbers that show up when I, ...
Mostly stick to the deltaE numbers. They do the "importance weighing" for you, and they are the standard by which we compare - so don't make it harder on yourself than you have to.
> Should I calibrate at 100% luminance or at 75% luminance, or at ... And what size patterns should I use - for measurements?
The answer to the first part is "it shouldn't matter" - except in those cases where it matters. Have fun. Do both!
We want 100% for our meter's accuracy (newer ones should measure both just fine), but there are TVs where the color reproduction differs noticeably, and it is to be expected that more of the real life image color spectrum will fall around the 75% mark.
The answer to the second part is "almost always" 10% or 11% windows. On LCDs maybe full screen fields, but 10% or 11% windows are fine there too. The problem is that some TVs will engage in undefeatable image processing, especially when full screen patterns are shown. We don't want that.
> Which tutorials should I read to learn to calibrate?
Everything you find, really, ...
First starting points would be
and the HCFR thread (maybe start from the back)
HCFR - Open source projector and display calibration software
> How do we vary our backlight/contrast settings depending on the room lighting?
First, we look at the brightness of 100% white to judge it.
There we are looking at the color brightness number, measured either in foot-lamberts (fL) or in candela per square meter (cd/m2).
Then we look at a table of data some folks somewhere, somehow have agreed on:
Bright Room: 50-60 fL
Dim Room: 40-50 fL
Dark Room: 30-40 fL
Theater Room Projector: 14-30 fL
(These values are circulating on this and other forums, so I sadly wasn't able to pin down the original source.)
If you need a calculator thingy to convert from fL to cd/m2:
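Or save yourself the search - 1 fL is about 3.4263 cd/m2 (nit), so the conversion is a one-liner:

```python
# Foot-lambert <-> cd/m2 (nit) conversion: 1 fL = 3.4262591 cd/m2.
FL_TO_NIT = 3.4262591

def fl_to_nits(fl):
    return fl * FL_TO_NIT

def nits_to_fl(nits):
    return nits / FL_TO_NIT

print(round(fl_to_nits(40), 1))   # a dim-room target of 40 fL -> ~137 nits
print(round(nits_to_fl(120), 1))  # a 120 nit display -> ~35 fL
```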
Also - we mostly use gamma 2.4 for dim environments. That's mostly a black correction curve thing. If your TV is "good enough" you can also use it in daytime viewing. But it's gamma, so go crazy and do what you want. There is no standard. Also, the recommendation from the standards body came 10 years too late (*party!...*).