Originally Posted by Agent_C
Let's also consider that there isn't universal agreement that the D65 color standard actually looks like real life, which is perhaps why so many laypeople find a slightly cooler (D75?) picture more "accurate".
I don't buy the typical explanation for this either: that people are "conditioned" to prefer a bluer picture because of the way TVs are displayed in showrooms. The explanation, AFAIC, is much simpler: people compare what they see on their flat screen to what it looks like in the real world, and adjust their sets accordingly. It really is that simple.
A_C (Flame Shields Up!)
1) D65 is the best possible compromise. Dusk and dawn are warmer. Overcast skies are bluer. Interiors are warmer if lit by incandescent bulbs, or greener if lit by fluorescents. 10am is a different color temperature than noon or 4pm.
2) D65 works because it is "in the middle" of what we experience every day (see the first sketch after this list). If you use the D65 standard, footage shot at dawn or dusk will look warmer, interiors will look warmer or greener depending on the lighting used, and 10am and 4pm will both look "right" - provided what is shot is played back on a display calibrated to that same standard. Nobody is shooting at 7500K or 5500K (though the printing and publishing market uses 5500K as its reference white, because most of what it produces is viewed indoors).
3) If the program or movie was shot using the D65 standard (or something close to it), you have no prayer of seeing it as the director/director of photography/cinematographer intended unless your display is also calibrated for D65. No adjustment on the TV will ever undo the mess caused by shooting and viewing at different settings.
4) Human vision, good as it is, has significant problems... put a "real" white next to a blue-ish white and your vision will assign the blue-ish white to be the "real" white, and the ACTUAL white next to it will look pink-ish or yellow-ish or orange-ish (see the second sketch after this list). You cannot stop this, it is EASY to demonstrate with a meter, and it happens with fair regularity when you are calibrating TVs. There are plenty of optical illusions online that prove your eyes are easily fooled.
5) If you want the most accurate images you can get from your display... the AV-SCIENCE (aka AVS) Forums are a great place for that exploration. If you are only interested in "what you like"... there are plenty of other forums for you, and most of the threads here won't be of much interest, since the majority of AVS Forum members are here to find out what makes an accurate image and how to get it on their displays at home. These aren't the "What You Like" Forums... they are the AVSCIENCE Forums, and there are all sorts of reasons the SCIENCE of video imaging works.
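To put some numbers on point 2, here is a minimal sketch (plain Python, no dependencies) using the published CIE daylight-locus polynomial to convert a correlated color temperature into CIE 1931 xy chromaticity. The function name daylight_xy and the list of nominal CCTs are mine, chosen for illustration; the polynomial coefficients are the standard CIE fit.

```python
# Minimal sketch: CIE daylight-locus formula mapping a correlated color
# temperature (CCT) to CIE 1931 xy chromaticity, so you can see where
# D50/D55/D65/D75 sit relative to one another.

def daylight_xy(cct_kelvin):
    """CIE daylight-locus chromaticity for 4000K <= CCT <= 25000K."""
    t = cct_kelvin
    if 4000 <= t <= 7000:
        x = (-4.6070e9 / t**3) + (2.9678e6 / t**2) + (0.09911e3 / t) + 0.244063
    elif 7000 < t <= 25000:
        x = (-2.0064e9 / t**3) + (1.9018e6 / t**2) + (0.24748e3 / t) + 0.237040
    else:
        raise ValueError("CCT outside the CIE daylight-locus range")
    # On the daylight locus, y is constrained as a quadratic in x.
    y = -3.000 * x**2 + 2.870 * x - 0.275
    return x, y

# Note: the D-illuminants sit at slightly shifted CCTs (D65 is ~6504K)
# because of a historical revision to the radiation constant c2.
for name, cct in [("D50", 5003), ("D55", 5503), ("D65", 6504), ("D75", 7504)]:
    x, y = daylight_xy(cct)
    print(f"{name}: x={x:.4f}, y={y:.4f}")
```

Running it puts D65 almost exactly at the familiar x=0.3127, y=0.3291, with D50 and D75 bracketing it on the warm and cool sides - "in the middle" in the literal sense.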
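And to put some numbers on point 4: this second sketch uses the standard Bradford chromatic-adaptation transform to estimate what a physically D65 white looks like once your eye has adapted to a blue-ish (D75) white. The helper names (adapt, xy) and the assumption of complete adaptation to D75 are illustrative simplifications on my part; the Bradford matrix and the white-point XYZ values are the usual published figures.

```python
# Sketch: why a "real" white looks tinted next to a blue-ish white.
# If vision adapts to the blue-ish white (modeled as D75), a patch that is
# physically D65 corresponds to a warmer color. Uses the standard Bradford
# chromatic-adaptation transform; full adaptation to D75 is an assumption.
import numpy as np

BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                     [-0.7502,  1.7135,  0.0367],
                     [ 0.0389, -0.0685,  1.0296]])

def adapt(xyz, white_src, white_dst):
    """Bradford adaptation of a stimulus from one adapting white to another."""
    lms_src = BRADFORD @ white_src
    lms_dst = BRADFORD @ white_dst
    scale = np.diag(lms_dst / lms_src)
    return np.linalg.inv(BRADFORD) @ scale @ BRADFORD @ xyz

def xy(xyz):
    """CIE 1931 xy chromaticity of an XYZ triplet."""
    return xyz[0] / xyz.sum(), xyz[1] / xyz.sum()

D65 = np.array([0.95047, 1.00000, 1.08883])  # XYZ of the D65 white point
D75 = np.array([0.94972, 1.00000, 1.22638])  # XYZ of the D75 white point

# A physically D65 patch, viewed while adapted to D75, corresponds to this
# color for a neutral (D65-adapted) observer; x and y both increase, i.e.
# the "real" white reads warmer (yellow-ish).
appearance = adapt(D65, white_src=D75, white_dst=D65)
print("D65 patch:            x=%.4f y=%.4f" % xy(D65))
print("seen by D75-adapted:  x=%.4f y=%.4f" % xy(appearance))
```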
That said... I've seen a fair number of cases where people say "I really don't like Movie mode and nobody is ever going to convince me it is better than Standard or Optimum. So when you calibrate my X020 (or X010), you can try Movie mode first, but I'm pretty sure I'm going to want to end up calibrating Standard mode." After calibration there's an epiphany of EPIC proportions. Mostly along the lines of "You made me a believer in Movie mode. This is AWESOME and I could NOT be happier with Movie mode. I was SO wrong about Standard and Optimum." In all of my experience with these panels, NOBODY has preferred Standard or Optimum to Movie mode after calibration. Not ONCE. Before calibration, I'd say my customers have been split about 50-50 between those "into" Movie mode and those who would swear that Standard or Optimum was the only way to go.
Nobody calibrates to (say) 7500K to display (say) 6500K content... it just makes no sense. If you are going to run a display at 7500K, you'd better find a source of 7500K movies for your disc player(s) and TV channels, because nothing is going to be close to right if all your sources were mastered to 6500K.
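For a rough sense of how big that mismatch is, here is a back-of-the-envelope sketch comparing the D65 and D75 white points in CIE 1976 u'v', where a difference of roughly 0.004 is commonly cited as just noticeable for a white point. The chromaticities and the JND figure are standard textbook values, not measurements from any particular display, and the helper uv_prime is just an illustrative name.

```python
# Back-of-the-envelope: how far off is white if D65-mastered content plays
# on a display calibrated to D75? Compare the white points in CIE 1976 u'v'
# (roughly perceptually uniform); ~0.004 is a commonly cited white-point JND.
import math

def uv_prime(x, y):
    """CIE 1931 xy -> CIE 1976 u'v' chromaticity."""
    d = -2 * x + 12 * y + 3
    return 4 * x / d, 9 * y / d

d65 = uv_prime(0.3127, 0.3290)   # mastering white
d75 = uv_prime(0.2990, 0.3149)   # miscalibrated display white

delta = math.hypot(d65[0] - d75[0], d65[1] - d75[1])
print(f"delta u'v' = {delta:.4f}  (~{delta / 0.004:.1f}x a typical JND)")
```

That works out to about 0.011, i.e., two to three times a just-noticeable difference, and the error sits on every frame of every source - which is exactly why no picture control on the TV can dial it back out.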