Originally Posted by TweakerInWA
...Point being, there will be slight variations between the same models calibrated by the same tech, most of which are set regardless of environment... So to say most calibrated settings done on television (a) in an igloo on Everest wouldn't help uncalibrated television (b) in Costa Rica would be incorrect IMO... While (b) might not look "EXACTLY" like (a), it would probably still look better than it did before...
Actually, the settings tend to vary more between individual samples than you might think. Some settings, like sharpness, video processing modes, and other image-processing options, are identical between units. Color and brightness type settings tend to vary the most between samples, and changing them to match another set's values rather than measuring your own can result in a worse picture than the original.
Using inadequate color tools to set color is a recipe for disaster. Many sets today have a white balance setting that is pretty close to D65 and tracks reasonably well. If your tools are not good enough, which is frequently the case, you will make things worse than they were to start with. I see this all the time in other people's white balance work done with inadequate tools.
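For a rough sense of what "close to D65" means numerically, the error is usually expressed as the distance from the D65 white point in the CIE 1976 u'v' chromaticity plane (Delta u'v'). Here is a minimal sketch of that calculation; the function names and the example measurement are my own illustration, not anything from this thread:

```python
import math

# CIE 1931 xy chromaticity coordinates of the D65 white point.
D65_X, D65_Y = 0.3127, 0.3290

def xy_to_uv(x, y):
    """Convert CIE 1931 xy chromaticity to CIE 1976 u'v' coordinates."""
    denom = -2.0 * x + 12.0 * y + 3.0
    return 4.0 * x / denom, 9.0 * y / denom

def delta_uv_from_d65(x, y):
    """Euclidean distance from D65 in the u'v' plane (Delta u'v')."""
    u, v = xy_to_uv(x, y)
    u65, v65 = xy_to_uv(D65_X, D65_Y)
    return math.hypot(u - u65, v - v65)

# Hypothetical gray-patch reading that is measurably off D65:
print(delta_uv_from_d65(0.310, 0.320))
```

If the meter's own uncertainty is on the same order as the error you are trying to measure, numbers like this are noise, which is exactly why cheap tools so often make white balance worse.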
Color saturation is a very tricky thing to get right, and many displays are not that far off out of the box. If you adjust it without excellent tools, you are very likely to end up with a desaturated picture, which does not look good. I see this over and over again as well.
Brightness, strangely, seems to elude many people. Setting it requires the right test pattern and careful examination of reference images. Some sets dynamically adjust brightness depending on the average picture level (APL), which renders many test patterns useless.
Gamma is very tricky to get right and will usually be either very right on a product or very wrong. Measuring it requires instruments with good sensitivity down to about 0.2 fL on a flat panel. It also requires measuring in fine increments, down into the dark end of the range. The quality of an image is primarily determined by performance below the 40% signal level. If your tools are not accurate from 10% to 40% signal level, you will do a poor job setting gamma with your tools alone. I see many displays where things are set pretty close above the 50% level but very poorly below 30%, where most of the image lives, and the results are horrible.
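The per-point gamma at each stimulus level falls out of the measured luminance by solving luminance = white x stimulus^gamma. A quick sketch (the helper name is mine; it assumes readings in consistent units, cd/m2 or fL, since only the ratio matters):

```python
import math

def point_gamma(stimulus, luminance, white_luminance):
    """Per-point gamma implied by one measurement.

    Solves luminance = white_luminance * stimulus**gamma for gamma.
    stimulus: signal level as a fraction (e.g. 0.2 for a 20% window).
    luminance, white_luminance: measured light output, same units.
    """
    return math.log(luminance / white_luminance) / math.log(stimulus)

# A display tracking a flat 2.2 power law would measure about 21.76 units
# at 50% stimulus when 100% white measures 100 units:
print(round(point_gamma(0.50, 21.764, 100.0), 2))  # 2.2
```

Running this over readings at 10%, 20%, 30%, and so on is what exposes the low-end tracking described above; a meter that loses accuracy near 0.2 fL makes those low readings, and therefore the computed gamma, meaningless.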
The other problem many people have is knowing where in the chain the problem lies. Some will adjust the TV when the problem is in the source, and some will adjust the source when the problem is in the TV. Getting this wrong is very likely to result in a loss of bit depth and a more pixelated image, a loss of contrast, or bad results on other sources. Not having a video signal generator to isolate where the problem originates is a pitfall many fall into.