Originally Posted by smitty
I think someone, perhaps Kenbar, suggested that it might make a difference for image retention, but not for burn-in, which is apparently a different phenomenon and supposedly cumulative. I don't really know the answer.
Yes, the predominant theory is that only the cumulative hours of static elements matter when it comes to burn-in. I tend to think that's the biggest factor, but not the only one.

Let's assume "random" content ages pixels at a 1x rate (there's really no such thing as a truly uniform wear rate, but bear with me), and that the CNN logo's static red pixels age at 5x the rate of random-content pixels. Now take two identical-model OLED TVs with identical settings. TV#1 displayed CNN for a cumulative 5,000 hours (5 hours a day), and that's all it ever displayed. TV#2 also displayed CNN for 5,000 hours, but additionally displayed other random content for 5,000 hours (5 hours a day of CNN plus 5 hours a day of random content), for a total of 10,000 hours. Call the CNN logo area of the screen "area A" and the rest of the screen, which always displays random content, "area B." Burn-in intensity (how light or dark the burn-in appears relative to its surroundings) can then be defined as the ratio of wear between area A (CNN logo area) and area B (rest of the screen).
TV#1 (CNN Only)
Area A (CNN logo area) wear = 5,000 hours @ 5x wear rate = 25,000 wear units
Area B (rest of screen, random content) wear = 5,000 hours @ 1x wear rate = 5,000 wear units
Ratio of A to B is 25,000 / 5,000 = 5. Put another way, the burn-in on this TV should appear 5x darker than the surrounding area (not literally, again, just for illustration).
TV#2 (CNN 50% of the time)
Area A (CNN logo area) wear = 5,000 hours @ 5x wear rate + 5,000 hours @ 1x wear rate (while showing random content) = 30,000 wear units
Area B (rest of screen, always random) wear = 10,000 hours @ 1x wear rate = 10,000 wear units
Ratio of A to B is 30,000 / 10,000 = 3. On this TV, the burn-in of the CNN logo is only 3x darker than the surrounding area.
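If it helps, here's the same arithmetic as a quick Python sketch. Everything in it (the 5x wear factor, the hour counts, the very idea of linear "wear units") is just the made-up assumption from above, not measured data:

```python
# Toy model of the wear-ratio theory above. The wear rates and hours are
# the made-up assumptions from this post, not measured values.

RANDOM_RATE = 1.0  # assumed wear rate for "random" content
LOGO_RATE = 5.0    # assumed wear rate for the static CNN logo pixels

def burn_in_ratio(cnn_hours, random_hours):
    """Wear in the logo area (A) divided by wear in the rest of the screen (B)."""
    # Area A wears at the logo rate during CNN and at the random rate otherwise.
    area_a = cnn_hours * LOGO_RATE + random_hours * RANDOM_RATE
    # Area B shows random content the whole time the set is on.
    area_b = (cnn_hours + random_hours) * RANDOM_RATE
    return area_a / area_b

print(burn_in_ratio(5000, 0))     # TV#1: 25,000 / 5,000  = 5.0
print(burn_in_ratio(5000, 5000))  # TV#2: 30,000 / 10,000 = 3.0
```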
Conclusion (of the theory; not proven): even though both TVs displayed CNN for the same cumulative number of hours, the TV that also played random content may show the burn-in as less severe (lighter), or may take longer to show it at all. In other words, no matter what or how much content with static elements you watch, the more random content you watch alongside it, the longer it will take for burn-in to appear and the slower it will progress in severity. Also, once an OLED TV already suffers burn-in, the more random content you watch afterwards, the lighter and less noticeable the burn-in should become, as the sketch below illustrates.
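To put a number on that last claim with the same toy math: keep piling random-only hours onto TV#2 after the burn-in has appeared, and the ratio keeps creeping toward 1 (i.e., the logo area and the surrounding area converge in total wear):

```python
# TV#2 starts at 30,000 wear units in the logo area (A) and 10,000 in the
# rest of the screen (B); every extra random-content hour adds 1 unit to both.
for extra in (0, 5000, 10000, 20000):
    area_a = 30000 + extra
    area_b = 10000 + extra
    print(extra, round(area_a / area_b, 2))  # 3.0, 2.33, 2.0, 1.67
```

Note that in this model the burn-in never actually disappears (area A stays 20,000 units ahead forever); its contrast against the surroundings just shrinks.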
Disclaimer: This is not very scientific. I'm trying to use simplified terms to illustrate a theory. Nothing has been scientifically proven by this post. This is very basic and doesn't take into account actual wear rates of various subpixels, or compensation methods to combat burn-in, or a bunch of other factors I've not considered.