Originally Posted by SierraMikeBravo
Not wanting to fan the flames here, but to counter-argue with you: are you aware that display manufacturers have intentionally ruined image fidelity in order to get you to buy THEIR product? So, given that, are you satisfied that a manufacturer has decided how you are going to watch something? If you are going to correct the display, you might as well do it to the reference standards. This seems to be an argument that goes in circles. My suggestion: chalk it up to those who see the benefits of calibration and those who don't. Just like there are Democrats and Republicans, Atheists and those who believe in God, etc. Just differences in philosophies and beliefs, and those are extremely difficult to change no matter what is said. Just my two cents.
Usually image fidelity (looks identical to the original) is sacrificed for more image quality (looks better than the competing displays). Image quality is determined by attributes like naturalness, colorfulness, visual information, contrast, and sharpness. Image fidelity is determined by the inability to tell the difference between the original and the reproduction.
As I say, I view consumer calibration as setting a display up correctly for the format. That is not overrating it.
The claim that it is highly desirable for the image reproduction to be as accurate as possible, the idea being that viewers are otherwise losing out through lack of faithfulness to artistic intent, strikes me as less true.
Take a DCI theatrical presentation: it is within spec with contrast as low as 1,200:1 sequential and 100:1 intra-frame, luminance uniformity as low as 70%, a center white level anywhere between 11 and 17 ftL, gamma within +/-5%, and color accuracy within 4 delta E.
Take a grade two monitor judged against EBU standards: it is within spec with a contrast ratio above 500:1 (full frame, 1% patch), 100:1 intra-frame, gamma within +/-0.10 over 10-90% of the input signal, a white level of 58 ftL, grey-scale accuracy within +/-4 Δu*v*, and color accuracy within 7 ΔE*.
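Just to illustrate how loose those tolerances really are, here is a rough Python sketch (my own illustration, not from any standard document) of the kind of pass/fail check those specs amount to. The measured numbers are entirely made up; the tolerance figures are simply the DCI numbers quoted above, and 2.6 is the DCI reference gamma.

```python
# Hypothetical pass/fail check against the DCI tolerances quoted above.
# All "measured" values are invented for illustration only.

DCI_TOLERANCES = {
    "sequential_contrast_min": 1200,   # :1
    "intra_frame_contrast_min": 100,   # :1
    "uniformity_min": 0.70,            # luminance as a fraction of centre
    "white_ftl_range": (11.0, 17.0),   # centre white level, foot-lamberts
    "gamma_tolerance": 0.05,           # +/-5% around the 2.6 target
    "delta_e_max": 4.0,                # colour accuracy
}

measured = {                           # hypothetical projector measurements
    "sequential_contrast": 1500,
    "intra_frame_contrast": 110,
    "uniformity": 0.78,
    "white_ftl": 14.0,
    "gamma": 2.55,
    "worst_delta_e": 3.2,
}

def dci_check(m, t=DCI_TOLERANCES, target_gamma=2.6):
    """Return a pass/fail result for each quoted tolerance."""
    return {
        "sequential contrast": m["sequential_contrast"] >= t["sequential_contrast_min"],
        "intra-frame contrast": m["intra_frame_contrast"] >= t["intra_frame_contrast_min"],
        "luminance uniformity": m["uniformity"] >= t["uniformity_min"],
        "white level": t["white_ftl_range"][0] <= m["white_ftl"] <= t["white_ftl_range"][1],
        "gamma": abs(m["gamma"] - target_gamma) / target_gamma <= t["gamma_tolerance"],
        "colour accuracy": m["worst_delta_e"] <= t["delta_e_max"],
    }

for name, ok in dci_check(measured).items():
    print(f"{name}: {'pass' if ok else 'fail'}")
```

Two presentations could both "pass" a check like that while still measuring quite differently from each other, which is really the point.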
These standards are neither very high nor very tight; the reproductions would not be indistinguishable from the original, nor would displays be indistinguishable from each other, including in color reproduction.
Consumer displays ideally want very high contrast and so on, and they are not going to look like DCI cinema presentations or video monitors anyway. They are, after all, consumer products playing consumer sources, not monitors or reference displays used during the mastering of content.
I can understand why color accuracy is desirable on a grade one monitor used by the colorist or director, or in a cinema preview screening room.
But as a consumer, although I obviously do not want a massively inaccurate image that looks odd or unnatural, I do not see the highest image fidelity the display can achieve as a requirement for enjoyment.
As a consumer, image quality is the ultimate basis on which I pick a display and set it up; image fidelity might be the starting point, but it is not the end.