Originally Posted by peteylewis
so why not just watch TV on one of the standard picture modes and then apply the settings after that amount of time?
Is there any empirical data suggesting that running these test patterns for 120 hours and then applying calibration settings is measurably superior to just watching the TV normally (mostly or entirely full-screen HD content) and then applying the settings?
Or is this just overkill?
It seems like a tad overkill to me, but I really have no technical basis for that opinion... it just seems like it is.
Disclaimer: I'm a newb myself, as you might guess from my join date and post count. I've never owned an HDTV, though I plan to buy a 12G Panasonic as soon as the S and G series are out, reviewed, compared, and available under MSRP. I just have a habit of absorbing big chunks of arcane info quickly and translating them into layman's terms.
That said, I downloaded the digital slideshow pics from D-Nice's link above to see what they were, and I think they help explain the purpose of the break-in process. They are a series of full-1080p images, each a single solid color: white stepping down through gray to black, then red, green, and blue, each from bright to dark.
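For the curious, a slide set like the one described above is easy to generate yourself. The sketch below is my own approximation, not D-Nice's actual slides: the step count, the exact brightness levels, and the choice to write dependency-free binary PPM files (instead of whatever format the real slideshow uses) are all assumptions. It builds a gray ramp from white to black, then ramps each primary from bright to dark, and writes one solid-color frame per step.

```python
def breakin_colors(steps=8):
    """Return the slide colors: a white-to-black gray ramp, then
    red, green, and blue ramps from bright to dark.
    The number of steps is my assumption, not taken from the real slides."""
    colors = []
    # White stepping down through gray to black.
    for i in range(steps):
        v = 255 - round(i * 255 / (steps - 1))
        colors.append((v, v, v))
    # Each primary from bright to dark (stopping short of pure black,
    # since the gray ramp already ends at black).
    for channel in range(3):  # 0 = red, 1 = green, 2 = blue
        for i in range(steps - 1):
            v = 255 - round(i * 255 / (steps - 1))
            rgb = [0, 0, 0]
            rgb[channel] = v
            colors.append(tuple(rgb))
    return colors

def solid_ppm(width, height, rgb):
    """Encode one solid-color frame as a binary PPM (P6) image:
    an ASCII header followed by raw RGB bytes, one triple per pixel."""
    header = f"P6 {width} {height} 255\n".encode("ascii")
    return header + bytes(rgb) * (width * height)

if __name__ == "__main__":
    # Small demo resolution to keep the files tiny; use 1920x1080
    # for real full-screen break-in slides.
    for n, color in enumerate(breakin_colors()):
        with open(f"breakin_{n:02d}.ppm", "wb") as f:
            f.write(solid_ppm(160, 90, color))
```

Almost any image viewer (and tools like ImageMagick or ffmpeg) can read PPM files, so the frames can be loaded onto a USB stick or burned to disc for a slideshow.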
Seems to me the idea is that you want to put some usage on every little cell of the display, and to do so as evenly as possible. Each cell is a sealed glass bubble (square/rectangular, not round) containing a gas, some phosphor, and the various layers that combine to make it light up in the right way at the right time. The cells are arranged in clusters of red, green, and blue that combine to make the colors you see. Normal viewing content will not consistently and reliably exercise all these clusters, and each color within each cluster, equally: it has a bias toward a certain range of colors and brightness, and tends to be most active toward the center of the screen. Running the break-in images exercises every cell on the screen equally, so that when it's time to adjust the settings, you're working from a predictable starting point.
Now you might ask: if normal content doesn't use the screen this way, why should I care, since normal content is what I plan to watch, not test patterns? The problem is the exceptions. You'll end up saying, "My TV looks great, EXCEPT in really dark scenes," or "EXCEPT when I watch a nature program with shots of the Arctic wilderness," or "EXCEPT when I watch a spaghetti western with desert landscapes," and so on. You want your whole screen to be as good as it can be, all the time.