Higher contrast ratio = lower risk of burn in, if anything.
But really, it's longer phosphor life you're after... The newest models are quoting 60,000 hours. I've come to the conclusion the other stuff is largely a bunch of gibberish. I'd take an orbiter over none, but it wouldn't be a feature I'd seek. White-bar scroll and such are useless for burn-in prevention and merely wipe out image retention/residual voltage, which goes away on its own.
If someone really had a static-image detector / screen saver, that'd impress me.
Originally posted by rogo Higher contrast ratio = lower risk of burn in, if anything.
According to a few industry engineers I know (my usual source of info), contrast is the major culprit in burn-in. That said, a higher contrast number does not necessarily mean that the panel is more susceptible to burn-in.
Panasonic, the industry's leader in claimed contrast numbers, uses a darkened or "painted" pane of glass to help achieve its higher contrast ratio. Since this is not phosphor-driven, it does not make the panel more burn-in prone, but it gives the viewer a more detailed image and a blacker black.
This is considered cheating by industry standards, but hey, I am sure if any of their competitors had thought of it first, they wouldn't be complaining, and I certainly am not!!
Contrast ratio is the ratio of the luminance at white to the luminance at black.
Burn-in is uneven phosphor aging.
Therefore a display with a higher contrast ratio has a higher risk of burn-in, because the excited phosphors are brighter and/or the unexcited phosphors are darker. It is the difference that causes burn-in.
Also, raising the brightness of your display raises the entire luminance curve. This will not only reduce the lifetime of your display but can also increase burn-in risk, because the phosphor-aging-versus-luminance curve is not linear.
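The argument above can be sketched numerically. This is a toy model, not a measured characteristic: the power-law exponent is a hypothetical assumption standing in for "aging grows faster than linearly with luminance."

```python
# Illustrative sketch: uneven phosphor aging as a function of luminance.
# GAMMA is a hypothetical exponent, NOT a measured plasma characteristic.

GAMMA = 1.5  # assumed: aging accelerates faster than linearly with luminance

def aging_rate(luminance_nits):
    """Relative phosphor aging rate at a given luminance (arbitrary units)."""
    return luminance_nits ** GAMMA

white = 600.0   # peak white, nits (figure taken from the thread)
black = 0.6     # black level, nits

# Burn-in risk tracks the *difference* in aging between lit and unlit cells.
risk = aging_rate(white) - aging_rate(black)

# Raising brightness shifts the whole curve up; because the aging curve is
# nonlinear, the aging difference (and thus burn-in risk) grows as well.
boost = 100.0
risk_boosted = aging_rate(white + boost) - aging_rate(black + boost)
print(risk_boosted > risk)  # True under this assumed curve
```

Under a linear curve (GAMMA = 1) the difference would be unchanged by a uniform boost; the nonlinearity is what makes "running hot" worsen uneven aging.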
I agree with you that competitors are just whining. Having a darkened pane of glass is not cheating by any standards. Firstly, it does nothing to improve true black level or dark room contrast ratio. Secondly, it is an absolute requirement to improve lit room contrast ratio as it prevents reflected light from killing your true contrast ratio and black level.
OLED, plasma, LCD and so on will all use this technology in some form or another.
White static images are the most common cause of burn in. White is a combination of all three phosphor colors. When white appears on the plasma, all of the phosphors in each cell displaying white must be excited. This means increased power levels. That is why plasmas "buzz" on an all white screen. If you increase the contrast you are increasing the intensity of white and therefore "burning" or "aging" the phosphors at an increased level.
Burn in can occur with any static color depending on the brightness level, which increases the range of colors between white and black. However, it would take longer than a static white image since not all of the phosphors in each cell need to be excited at such increased power levels.
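A rough sketch of why a white field is the worst case, assuming (hypothetically) that a cell's drive power scales with the sum of its three sub-pixel intensities:

```python
# Toy model: relative drive power for one plasma cell.
# The linear sum is an assumption for illustration, not panel electronics.

def cell_power(r, g, b):
    """Relative power for one cell; sub-pixel intensities in [0, 1]."""
    return r + g + b

white = cell_power(1.0, 1.0, 1.0)  # all three phosphors fully excited
red   = cell_power(1.0, 0.0, 0.0)  # only one phosphor excited

print(white / red)  # 3.0 -- why a full white field draws the most power
```

This is why an all-white screen makes the panel "buzz": every phosphor in every cell is being driven at once.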
Yeah, I've wondered about the contrast/burn-in relationship. I have my contrast (peak white level) turned far down for TV watching and my brightness (illumination of the lower grey scale) set even lower. While a very low brightness no doubt saves phosphor life, it also increases the contrast ratio, which I assume would make burn-in more likely. So I wonder whether turning my brightness down so far is actually advisable.
What you describe is aging. Aging and burn-in, while related, are not the same thing.
Burn-in is caused by uneven aging of the phosphor (i.e., one area of the screen ages faster than another). In this respect, the life of the phosphor, along with the DIFFERENCE between peak luminance and black level, determines how easily you can burn in a screen.
Theoretically, changing the brightness in any way should not change contrast ratio because you change the entire brightness curve simultaneously. In reality it can affect contrast ratio because you can crush the blacks or whites when you go beyond the dynamic range of the display.
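The "crushing" effect can be sketched by modelling the brightness control as a uniform offset clamped to the panel's dynamic range (the grey-level values here are illustrative):

```python
# Sketch: a "brightness" control as a uniform offset, clamped to the
# panel's dynamic range. Grey-level values are illustrative only.

def apply_brightness(levels, offset, lo=0.0, hi=255.0):
    """Shift every grey level by the same offset, clamped to [lo, hi]."""
    return [min(hi, max(lo, v + offset)) for v in levels]

greys = [0.0, 64.0, 128.0, 192.0, 255.0]

# A modest offset preserves the spacing between levels, but a large
# negative offset pushes the low greys past the floor and merges them:
crushed = apply_brightness(greys, -80.0)
print(crushed)  # [0.0, 0.0, 48.0, 112.0, 175.0] -- two levels crushed to black
```

Once distinct input levels map to the same output luminance, the effective dynamic range (and thus the measured contrast ratio) has changed, even though the control "only" shifted the curve.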
With all due respect, I described both aging and burn in. As I stated, burn in is caused by static images, which of course may result in the uneven aging of the phosphors if it remains on the screen too long.
I've read some papers discussing new and upcoming (long-lasting) phosphors for plasma display purposes. As you say, this is the ultimate solution, and the shape of the phosphor aging curve will also be very important. Even if the phosphor lasts forever, if it initially has a very steep aging curve before leveling off there will still be a burn-in issue (albeit one that is easier to fix over time).
Right, xrox, on the last point. The curve is important.
This, however, I don't buy: "Contrast ratio is the ratio of the luminance at white to the luminance at black.
Burn-in is uneven phosphor aging.
Therefore a display with a higher contrast ratio has a higher risk of burn-in because the excited phosphors are brighter and/or the unexcited phosphors are darker."
The propositions are unambiguous; the conclusion is very ambiguous. You conclude that because the phosphor-excitement difference is dramatic, the burn-in is more likely. I don't think the evidence backs that up nor do I think it is a necessary outcome.
I would argue, in spite of what I said above, that it's almost entirely the phosphor design, decline curve, and settings of the panel -- and perhaps it is not at all correlated to panel CR. If you run any panel "too hot", you greatly increase the chance for burn in. That's a given. Why the dynamic range of a given cell would make that more or less noticeable -- absent anything else -- I'm not convinced.
I totally agree that the rate at which the phosphor decays (phosphor design, as you put it) is absolutely the most important variable in aging/burn-in. No argument here.
But the original poster asked if CR played a part and it must because it is a measure of dynamic range (white to black) and burn-in is uneven aging (white ages/ black doesn't). If you change the dynamic range you change this ratio.
Just running a panel hotter would increase the rate of phosphor aging, but every cell at every greyscale value is increased by the same amount; this is what the "brightness" adjustment of a panel does. The "contrast" adjustment changes the full-white luminance only. Since phosphor aging is proportional (though not linearly) to luminance, if you increase the luminance of the white only, you will invariably increase the risk of burn-in.
Again, I agree with your last point. But Panasonic's better contrast ratio, for instance, comes from pulling down the black level, not from increasing white luminance. The difference between the 3000:1 panels and the 1000:1 panels is a black level of 0.2 nits vs. 0.6 nits, not a white luminance difference.
Gotta agree with xrox in theory, but rogo's right for all practical purposes. Differences in CR are primarily differences in black level, as rogo stated, and the phosphor aging rate at black is so low that the difference between .2 nits and .6 nits is insignificant compared to the aging rate at peak white. The contrast setting of your TV will be the dominant factor, assuming the brightness is set correctly. I don't know what the numbers are, but I bet the phosphor half-life at black level is at least several hundred years, compared to a few years at all white.
The reason I cited the 0.2 vs. 0.6 nits example -- which is a real one -- is that if your peak brightness is 600 nits and your bottom is 0.6 nits, you have 1000:1. Make no change other than offering a 0.2-nit black level and you have 3000:1. The white is the same on both panels -- 600 nits max.
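The arithmetic behind that example, using the thread's own figures (same 600-nit peak white, only the black level changes):

```python
# Contrast ratio is peak white luminance divided by black level luminance.

def contrast_ratio(white_nits, black_nits):
    return white_nits / black_nits

print(round(contrast_ratio(600.0, 0.6)))  # 1000
print(round(contrast_ratio(600.0, 0.2)))  # 3000 -- same 600-nit peak white
```

Dropping the black level by a factor of three triples the ratio without the phosphors ever being driven any harder at white.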
1 nit = 1 candela per square meter, a unit often cited in plasma specs. Unfortunately, the number the makers cite tells you nothing, but if measured, you can easily get peak white and darkest black. Peter Putman, www.projectorexpert.com , has done several plasma reviews where he measures those numbers on calibrated displays -- so they have meaning.
The recent highlighting of this issue with plasma screens has somewhat changed the definition of burn-in. It is quite valid to describe uneven aging as burn-in; after all, the uneven luminance of a screen's output gives every appearance of a burned-in image.
However, in the truest sense, burn-in occurred when the phosphors became overheated to the point of permanent damage.
Some older readers may remember the older B&W TVs prior to the introduction of dot suppression circuits. When you turned your TV off the beam would collapse to the very center of the screen before fading away.
Given that the beam faded after a couple of minutes and that TVs were generally kept on for many hours at a time, the percentage of time the beam spent at the very center of the screen was negligible, so uneven aging was not the problem. The problem was that the very intense beam, albeit applied for short durations, was enough to heat the phosphors to the point of damage.
This can still happen today if constant white logos appear on a very bright screen for an extended period of time; if it does, no amount of usage will even out the picture again. That said, improved phosphors, beam-current limiting, and other clever design tricks have minimised this to a degree on modern screens.
......and let's not forget that plasmas don't need dot suppressors.