Originally Posted by LeoRavus
The game I'm currently playing on PC has a couple of bright static HUD elements that seem to get much brighter when HDR is turned on.
I would use common sense. If it's really bright and static, turn down the brightness and minimize on-screen time.
BTW, do you know if the game's visual content was produced with 10-bit pixel depth in a Rec. 2020 color space? Or are you looking at SDR mapped to HDR?
Originally Posted by Ken Masters
What are your SDR settings? If you're using the out-of-the-box SDR settings, your average picture level is probably just as bright as, if not brighter than, HDR. Meaning panel lifespan probably isn't going to be much different whichever you use.
I think people displaying SDR content using the TV's standard or vivid "torch mode" presets are far more likely to run into trouble than people who view a lot of HDR content with a calibrated set or using close to calibrated presets.
Originally Posted by theandies
I heard all the same arguments when I bought my plasma in 2005. 49,000 hours later and my plasma still looked great and zero image retention. The only thing that made me buy a new OLED is my plasma finally died. Of course my plasma wasn't HDR capable.
Yep. I heard the same warnings about how my expensive new Pioneer plasma was doomed to burn-in. 12 years later and it never happened, even with a fair bit of console gaming on it. A decade before that, I kept hearing that my CRT RPTV was doomed. It turned out OK too. So I am not particularly worried about burn-in.
But if I owned a sports bar and needed to choose a TV that could handle displaying the ESPN logo on bright settings for like 12 hours a day, I would not buy an OLED.