
Registered · 11 Posts · Discussion Starter · #1
Hi, folks,

I'm feeling things out for a new TV right now, and the EC9300 has caught my attention with its picture quality. However, some reviews I've read mention that it has issues with image retention. Since I play video games fairly often, that could be a problem: there's often a static health bar or some other onscreen overlay that stays unchanged for long stretches, which might leave an afterimage when switching to a cutscene or whatever.

So I thought I'd ask the folks here, who may have more hands-on experience with the TV: what is the image retention or burn-in actually like on this model? Has anyone played games on it who can share an opinion?

Also, how do people find the input lag? Reviews suggest it's not too bad, but again, I wanted to see what practical usage has shown. This is a lesser factor, since I don't generally play first-person shooters or anything else requiring really fast reflexes. I do occasionally hit the odd quick-time event, though, so I'm looking for a TV with reasonably low lag.
 

Banned · 425 Posts
At the risk of being proven wrong several years from now, I'd go as far as to say burn-in is effectively impossible on LG's OLEDs. They are so anal about dimming the screen when there's a static image, in order to prolong the life of the panel, that it's not something you have to worry about in the short term. In the long term, I don't know if the panels are going to last long enough for burn-in to matter, especially if you play a lot of games on them. The lifespan is much higher with LG's WRGB pixel layout than with RGB, but it's still nothing to write home about.

Temporary image retention does occur. I can produce some that lasts a few frames when transitioning from something extraordinarily bright to black, and in one case I got the Xbox One logo to stay etched into the screen for several hours because I had something snapped to the side while playing a game. Even that long-term retention had miraculously disappeared by the end of the next day, probably because the TV does automated wear management, which eliminates these problems in the short term. On a plasma, it would have taken 50 or 60 hours of consciously ignoring a big black Xbox logo in the upper right corner of the screen for an image that darkly retained to go away :p

In my own experience, input lag isn't what makes gaming questionable on these OLEDs. It's the judder from fast-responding pixels combined with a slow-refreshing image that makes me want to rip my hair out. Without motion blur (yes, motion blur would actually be desirable in this case), if you pan the camera around you're going to see the same image in two places. The faster you move the camera and the lower the framerate, the worse it gets. Other display technologies have a little intrinsic blur that hides the double images during reasonably slow motion, but these panels don't, and I perceive judder with even the slightest motion. The only solution on consoles, since you can't increase their framerate, is to enable TruMotion de-judder processing, but that just might turn input latency into a real issue in addition to making everything look freakishly artificial.
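To put rough numbers on it - this is just a back-of-the-envelope sketch with an assumed pan speed, not anything measured on the TV:

[CODE]
# Why sample-and-hold judder gets worse at low framerates: the eye tracks
# a pan smoothly, but each frame is held as a static image, so the picture
# lands on the retina in discrete jumps instead of moving continuously.
pan_speed = 1920  # assumed pan speed, pixels/second (one screen width/s)

for fps in (30, 60, 120):
    hold_ms = 1000 / fps       # how long each static image stays on screen
    step_px = pan_speed / fps  # how far the image jumps between frames
    print(f"{fps:3d} fps: held {hold_ms:4.1f} ms, jumps {step_px:5.1f} px")

# 30 fps: held 33.3 ms, jumps 64.0 px -> obvious double/stepped image
# 60 fps: held 16.7 ms, jumps 32.0 px -> half the error, far more tolerable
[/CODE]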

A solid 60 FPS game can be tolerable without TruMotion smoothing, but 30 FPS is an absolute nightmare. Ordinarily I'm not one of those people who promotes the merits of high-framerate gaming; 30 FPS is adequate on a CRT or plasma. On LCD or these particular OLEDs, though, it definitely isn't. Instead of borrowing TruMotion from their LCDs, LG should have added a simple black frame insertion feature (flicker would be much more desirable).
 

Registered · 11 Posts · Discussion Starter · #3
Hm - that's an interesting take. I was unaware of the judder issue. Is there any particular brand/model/technology you might recommend that handles it better (apart from plasma)?

I don't know how viable it would be to insert blank frames; deliberately inducing flickering could trigger epileptic seizures in a small portion of the population, for instance.
 

Banned · 3,567 Posts
I have seen about five cases of real burn-in, and most of them were display models that were hardly ever turned off. You will get slight image retention from time to time, but actual burn-in would be almost impossible for a home user.
 

Registered · 2,098 Posts
Allow me to quote a post from Overclock.net - note that he was using a 1st generation LG 55EA9800 OLED TV rather than the newer 55EC9300:

[URL=http://www.overclock.net/t/1542119/sony-sony-expands-trimaster-el-series-with-first-oled-designed-for-pro-video-production#post_23562524]Assimilator87 @ Overclock.net[/URL] said:
I've been using an OLED display for almost a year and have not had any issues with image burn-in. There have even been many occasions when I passed out with the screen still on, displaying my desktop. Considering that's with a first gen OLED display, I'd imagine the new ones fare even better.
Now that's not to say these absolutely can't get burn-in, but it seems to be at a level similar to the final generation of CRT displays, where you have to really abuse the set for several days straight.

And remember, burn-in =/= image retention.


Regarding image retention, to me it also sounds similar to the way late CRTs behave. For example, my Trinitron CRT monitor pretty much always has a little image retention after being on for several hours straight, but it isn't noticeable unless the display is actually off, and even then it always goes away after the display has been off for an hour or so.


Lastly, one thing about framerates - I noticed that you didn't mention any specific systems. If you're planning on mainly playing 2D sprite-based games, PC games, or post-N64 Nintendo-published games (other than F-Zero X), then you'll be dealing with 60 fps for the most part anyway - and possibly Sega-developed games as well (including Dreamcast games). Also, most fighting games are 60 fps by default, since they're timing-critical by nature.

For Sega Saturn, PS1, and N64 games, see this list:
http://www.sega-16.com/forum/showth...ics-running-at-60-fps-5th-Generation-Consoles
 

Registered · 4,574 Posts
It's the judder from fast-responding pixels combined with a slow-refreshing image that makes me want to rip my hair out. Without motion blur (yes, motion blur would actually be desirable in this case), if you pan the camera around you're going to see the same image in two places. The faster you move the camera and the lower the framerate, the worse it gets. Other display technologies have a little intrinsic blur that hides the double images during reasonably slow motion, but these panels don't, and I perceive judder with even the slightest motion. The only solution on consoles, since you can't increase their framerate, is to enable TruMotion de-judder processing, but that just might turn input latency into a real issue in addition to making everything look freakishly artificial.

A solid 60 FPS game can be tolerable without TruMotion smoothing, but 30 FPS is an absolute nightmare. Ordinarily I'm not one of those people who promotes the merits of high-framerate gaming; 30 FPS is adequate on a CRT or plasma. On LCD or these particular OLEDs, though, it definitely isn't. Instead of borrowing TruMotion from their LCDs, LG should have added a simple black frame insertion feature (flicker would be much more desirable).
The double-image ghosting problem is also an issue with plasma and CRT for 30 fps content. There's no way to avoid it, because you're strobing/flashing the same image twice per 60 Hz panel refresh. Even LCD has it when employing scanning or strobing backlights. I have not tried 30 fps gaming on an OLED, but since the LG models are sample-and-hold, why would you see a double ghost image?
 

Banned · 425 Posts
The double-image ghosting problem is also an issue with plasma and CRT for 30 fps content. There's no way to avoid it, because you're strobing/flashing the same image twice per 60 Hz panel refresh. Even LCD has it when employing scanning or strobing backlights. I have not tried 30 fps gaming on an OLED, but since the LG models are sample-and-hold, why would you see a double ghost image?
That may be true for progressive-scan CRTs, but most 30 fps content on a CRT TV is going to be interlaced. However, even when progressively scanned, the old image gradually dims instead of disappearing in one place and suddenly appearing somewhere else. It's that sudden change, after 33 ms of seeing a stationary image, that makes OLED so jarring. The diminishing brightness between impulses is very beneficial to smooth motion perception at low frame rates.

Plasma, of course, works slightly differently. It will illuminate a fully bright pixel for the entire duration of a refresh, but for pixels at lower brightness it only keeps them lit for a fraction of the nominal sub-fields (10 per frame on modern plasmas). That's roughly one impulse every 1.7 ms, followed by a falloff in brightness. It's not as good as the CRT behavior, but it too avoids presenting the brain with a static image for 33 ms at a time.
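The arithmetic behind that impulse spacing, for anyone who wants to check it (a quick Python sketch; the 10-sub-fields-per-frame figure is taken from the post above, not measured):

[CODE]
# Impulse spacing on a plasma driving 10 sub-fields per 60 Hz frame
frame_ms = 1000 / 60  # ~16.7 ms per refresh
subfields = 10
print(f"one impulse every {frame_ms / subfields:.2f} ms")  # ~1.67 ms
# A sample-and-hold panel instead presents one static image for the whole
# 16.7 ms (33.3 ms for 30 fps content), which is what creates the judder.
[/CODE]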
 

Registered · 351 Posts
I don't know how viable it would be to insert blank frames; deliberately inducing flickering could trigger epileptic seizures in a small portion of the population, for instance.
A lot of games have epilepsy warnings on them already... the game would probably cause someone to seize long before the TV did.

Also, some LCD/LED TVs already flicker, all CRTs flicker, and most of the computer monitors you use flicker... so it's already fairly widespread - and totally viable.
 

Registered · 117 Posts
I was unaware of the judder issue. Is there any particular brand/model/technology you might recommend that handles it better (apart from plasma)?

30 fps games are terrible on any display that has accurate motion handling, whilst 60 fps will look stunningly accurate. On a bad LCD, both 30 and 60 fps look pretty sloppy, so people aren't as concerned.
 

Banned · 425 Posts
30 fps games are terrible on any display that has accurate motion handling, whilst 60 fps will look stunningly accurate. On a bad LCD, both 30 and 60 fps look pretty sloppy, so people aren't as concerned.
30 FPS runs considerably better on CRT and plasma. Because the phosphors dim over time, the brain finds the otherwise very stop-motion-like nature of 30 FPS tolerable. CRT does it better than plasma, but plasma still does it better than LCD, and certainly better than OLED.
 

Banned · 425 Posts
Kaldaien, have you tried a custom resolution on your OLED for 1080p @ 120 Hz? Some 4K TVs accept it using Nvidia's custom resolution configuration. That might improve motion handling when using your PC.
It's not in the EDID that the driver reads, but I wouldn't be surprised if it accepts a 120 Hz signal at those resolutions. I think that usually throws TVs into 3D mode, though - not a huge issue, but I have not and probably never will calibrate the TV's 3D picture modes :p

Actually ... now that I think about it, it might be a pretty big deal after all. These TVs don't use active shutters for 3D; they polarize the light instead. That means the left and right images would be visible simultaneously at 60 Hz instead of as 120 discrete images per second.
 

Registered · 4,357 Posts
On the 55EC9300 I'm demoing, I've seen very minor image retention after hours of gaming. I play Killer Instinct a lot on Xbox One, which has static health bars. Normally it's only very bright white elements that leave image retention, and even then it goes away after a few minutes of different content.

As for input lag: if you set the HDMI input's label to PC and then use Game mode, you'll be around 29 ms, which is great. Regular Game mode is in the mid-40 ms range, which isn't terrible for most games.
 

Banned · 10,026 Posts
Kaldaien, have you tried a custom resolution on your OLED for 1080p @ 120 Hz? Some 4K TVs accept it using Nvidia's custom resolution configuration. That might improve motion handling when using your PC.
Don't forget to verify that the TV isn't dropping frames, though. I don't care what's in the manual, and I certainly don't care what the PC reports the TV is accepting.

Many TVs will report back to the source device that they accept a particular frame rate - and even post it in the corner of the screen - but silently fail to display all the frames.

My TV is one such animal. It'll accept 1080p120 and display "1080p 120" in the corner, but it throws out frames. It only actually displays 120 fps if the resolution is set to 720p.

You need to run one of the test utilities (use Mark Rejhon's blurbusters.com site) to verify whether this is happening.
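If you'd rather roll your own check, here's a minimal sketch of the same idea - my own Python/pygame code, not the Blur Busters utility, and the window size and square count are arbitrary. One square lights up per rendered frame; photograph the screen with a long-ish exposure, and any gap in the lit sequence means the display silently dropped frames.

[CODE]
import pygame

# Minimal frame-skipping test: exactly one square in the row is lit on
# each rendered frame. A long-exposure photo of a display showing every
# frame gives an unbroken run of lit squares; gaps mean dropped frames.
pygame.init()
screen = pygame.display.set_mode((1280, 720))
clock = pygame.time.Clock()
COLS = 16  # arbitrary number of squares across the test row
frame = 0
running = True

while running:
    for event in pygame.event.get():
        if event.type in (pygame.QUIT, pygame.KEYDOWN):
            running = False  # quit on any keypress

    screen.fill((0, 0, 0))
    w = screen.get_width() // COLS
    for i in range(COLS):
        lit = (i == frame % COLS)
        color = (255, 255, 255) if lit else (40, 40, 40)
        pygame.draw.rect(screen, color, (i * w, 330, w - 4, 60))
    pygame.display.flip()  # relies on vsync pacing flips to the refresh rate
    clock.tick(120)        # cap only; actual cadence should follow vsync
    frame += 1

pygame.quit()
[/CODE]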
 

Registered · 4,574 Posts
It's not in the EDID that the driver reads, but I wouldn't be surprised if it accepts a 120 Hz signal at those resolutions. I think that usually throws TVs into 3D mode, though - not a huge issue, but I have not and probably never will calibrate the TV's 3D picture modes :p

Actually ... now that I think about it, it might be a pretty big deal after all. These TVs don't use active shutters for 3D; they polarize the light instead. That means the left and right images would be visible simultaneously at 60 Hz instead of as 120 discrete images per second.
I would test 1080p @ 120 Hz just in case it works. If it does, you might be able to "fake" a black frame insertion feature by sending alternating black frames interleaved with regular video frames. This would reduce motion blur at the cost of visible flicker and a 50% loss of brightness. I think the Samsung OLED did this at 240 Hz, which made the flicker less of an issue.
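The render-loop half of that "fake BFI" idea is simple enough - here's a rough sketch (my own Python/pygame code, assuming the display genuinely shows all 120 frames per second; "test_pattern.png" is a placeholder for whatever content you want):

[CODE]
import pygame

# "Faked" black frame insertion at 120 Hz: content on even refreshes,
# black on odd refreshes. Each image then persists ~8.3 ms instead of
# ~16.7 ms, trading hold-type blur for flicker and ~50% brightness.
pygame.init()
screen = pygame.display.set_mode((1920, 1080), pygame.FULLSCREEN)
clock = pygame.time.Clock()
image = pygame.image.load("test_pattern.png").convert()  # placeholder asset
refresh = 0
running = True

while running:
    for event in pygame.event.get():
        if event.type in (pygame.QUIT, pygame.KEYDOWN):
            running = False

    if refresh % 2 == 0:
        screen.blit(image, (0, 0))  # real frame
    else:
        screen.fill((0, 0, 0))      # inserted black frame
    pygame.display.flip()  # only acts as BFI if vsync locks flips to 120 Hz
    clock.tick(120)
    refresh += 1

pygame.quit()
[/CODE]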
 

Banned · 425 Posts
I would test 1080p @ 120 Hz just in case it works. If it does, you might be able to "fake" a black frame insertion feature by sending alternating black frames interleaved with regular video frames. This would reduce motion blur at the cost of visible flicker and a 50% loss of brightness. I think the Samsung OLED did this at 240 Hz, which made the flicker less of an issue.
I've tried a bunch of non-standard refresh rates not listed in the EDID; so far the only ones I've gotten to work are 48 Hz and 72 Hz. It's proving to be an unpleasant experience, because about one in three times the NVIDIA driver somehow manages to deadlock the entire system after testing a refresh rate (this is supposed to be impossible under WDDM, from what I understand) and the computer has to be rebooted :(

This isn't really the best solution anyway; I prefer to play my PC games at 3840x2160, and the cable standard limits that to 60 Hz regardless. Even if I could manually insert black frames, I'd have to drop down to 30 FPS. It sounds like an interesting idea for console gaming, though - if you could somehow get a box to sit between the console and the TV and insert black frames on alternate refreshes, I'd definitely buy it.
 

Registered · 92 Posts
Guys, I hate to revive this, but I'm on the fence about OLED.

When you guys talk gaming and burn-in/IR on OLED, how long are your gaming sessions?

I sometimes play the same game for hours on end, with the health bar/HUD displayed for mad long hours.

What are your opinions on that?
 

Registered · 368 Posts
On the 55EC9300 I'm demoing, I've seen very minor image retention after hours of gaming. I play Killer Instinct a lot on Xbox One, which has static health bars. Normally it's only very bright white elements that leave image retention, and even then it goes away after a few minutes of different content.

As for input lag: if you set the HDMI input's label to PC and then use Game mode, you'll be around 29 ms, which is great. Regular Game mode is in the mid-40 ms range, which isn't terrible for most games.
I wish the 4K model could pull off those input lag numbers. I really want to go OLED, but I might end up with the JS9500 because I need input lag of 30 ms and below for gaming. Anything over 30 ms and I immediately notice the lag.
 