Originally Posted by Scott Wilkinson
In fact, the OLED had a somewhat different color cast than the LCDs on all content, leaning toward teal/green compared with the other two.
That's what happens when you have a WRGB display and your white subpixel isn't D65.
It always surprised me that so many people - even reviewers/professional calibrators - weren't bothered by this.
LG's OLEDs take on an ugly teal/green cast on certain images, which I don't like at all.
OLED is exciting tech with a lot of potential, but I think it's a mistake to discount LCD just yet.
LCDs are still superior in several ways - just not black level, viewing angle, or response times.
Originally Posted by sytech
This guy who went to the demo is claiming minor off-axis blooming and banding on a solid grey screen. He's not the only one, either. One of the more professional reviews also mentions off-axis blooming. Again, very, very minor blooming that may not be an issue for most.
You're not going to have a local dimming LCD without that problem.
When you view an LCD off-axis, its contrast ratio drops.
The brightness of the LED zones, on the other hand, stays relatively static with viewing angle.
So when you view a local dimming LCD off-axis, the zone structure becomes much more apparent the further off-axis you get.
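To put some rough numbers on it (purely illustrative figures, not measurements of any particular set): the light leaking through pixels driven to black is roughly the zone's backlight output divided by the panel's contrast at your viewing angle, so when off-axis contrast collapses, the step between a lit zone and a dimmed zone stops being hidden.

Code:
# Rough sketch of why dimming zones become visible off-axis.
# All numbers are illustrative assumptions, not measurements.

def leakage(backlight_nits, panel_contrast):
    # Luminance leaking through pixels driven to black (nits).
    return backlight_nits / panel_contrast

for label, contrast in [("on-axis", 5000), ("45 deg off-axis", 300)]:
    lit = leakage(500.0, contrast)  # zone at full output behind dark content
    dim = leakage(25.0, contrast)   # neighbouring zone dimmed to 5%
    print(f"{label}: zone-to-zone step = {lit - dim:.3f} nits")

# on-axis:         step ~ 0.095 nits (essentially invisible)
# 45 deg off-axis: step ~ 1.583 nits (clearly visible zone structure)

The zones themselves haven't changed in brightness at all - only the panel's ability to mask them has.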
Originally Posted by Scott Wilkinson
It's been said many times on AVS, but it bears repeating: You can't tell anything useful from a photo of the image on a TV, especially an HDR image, so there's no point drawing any conclusions whatsoever from them. For example, there are many different reasons why an image on a TV might look like it's blooming in a photo that have nothing to do with the TV itself, and viewing the same image on that TV in person, you might not see much blooming at all. I didn't specifically look for off-axis blooming in the Z9D at yesterday's press conference, but I did not notice much if any blooming in the images I saw.
I wouldn't say that you can't tell anything useful from photographs. A camera can be a useful tool for comparing/evaluating displays if you know what you're doing.
However, you're correct that no one should be judging the quality of a display based on images like these.
Originally Posted by Oledtech
Knowing Sony, FALD will 99.9% not work in game mode. And if it does work, input lag will be sky-high.
And it looks to be another VA panel, so not revolutionary for LCD.
Is that something that has changed in recent models? My old HX900 still uses FALD in game mode; however, it's not the "full" FALD.
In game mode it no longer switches zones off entirely, and zones lag a frame or so behind the LCD panel rather than delaying the LCD image to keep them in sync.
Originally Posted by RLBURNSIDE
The thing I don't get with all these FALD innovations is: why on earth did no one patent simply using two LCDs, one in front of the other, before now?
Apple only patented that idea TWO WEEKS ago (to HDR-ify their iPhones and possibly upcoming VR helmets). I thought of it too, after people on AVS mentioned dual DLPs placed in serial to reach the square of the contrast ratio of each DMD (so, 4M:1 instead of 2000:1).
Sharp had a prototype display using this concept back in 2005. I can't understand how Apple managed to patent this.
Originally Posted by http://www.bit-tech.net/news/hardware/2005/10/03/sharp_mega_contrast/1
While Sharp's claim of 1,000,000:1 is indeed impressive, the display has a peak luminance of just 500 cd/m².
In order to hit the magic million-to-one contrast ratio, their engineers have simply reduced the minimum luminance to 0.0005 cd/m².
No local dimming or other trickery. There's an Engadget gallery from 2007.
Unfortunately I don't believe it ever went into production.
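If anyone is wondering where figures like these come from, the arithmetic is straightforward: contrast is just peak luminance divided by black luminance, and two modulating layers in series multiply their ratios, since each layer only passes a fraction of the light it tries to block. A quick sketch using the numbers quoted above:

Code:
# Contrast ratio arithmetic, using the figures from the posts above.

def contrast(peak_nits, black_nits):
    return peak_nits / black_nits

# Sharp's 2005 prototype: 500 cd/m2 peak, 0.0005 cd/m2 black.
print(round(contrast(500, 0.0005)))  # -> 1000000, i.e. 1,000,000:1

# Two panels in series: each layer passes only 1/contrast of the
# light it tries to block, so the leakage compounds and the ratios
# multiply. Two identical 2000:1 modulators square the contrast:
print(2000 * 2000)  # -> 4000000, i.e. 4M:1, as with the dual-DLP idea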
Originally Posted by TuteTibiImperes
For projectors, the use of lasers seems like it could be the better way to go - if you have a three-laser (red, green, and blue) projector that can shine light only on the mirrors that should be lit, while skipping the beams over the mirrors that should be black, wouldn't that create true blacks combined with very bright highlights and a great color spectrum from the extra-pure primaries?
Scanning laser displays have a lot of problems. I would honestly be surprised if they ever become viable as a high-end mainstream technology. (Yes, I know there are low-res scanning laser pico projectors.)
You're more likely to see lasers used to replace traditional light sources than anything else.
Originally Posted by Vader1
Motion? You do realize motion is not at all something LCD is inherently or essentially better at? It's taken 15 years for motion on LCD to get to this point, only Sony really pulls off truly great motion on LCD, and only on their most expensive sets. LG's OLEDs have never been anywhere near as bad as LCDs were for years - and as many low-to-mid-range LCDs still are today.
In fact, if you compare an LCD to an OLED with equal refresh rates and no BFI on either, the OLED would have inherently better motion, because it has the fastest pixel response of any display technology ever.
Image persistence is the primary cause of motion blur on displays.
When you take image persistence out of the equation, yes, pixel response time wins.
However, when you factor in image persistence, pixel response time hardly matters at all.
Comparing like-for-like, the OLED's faster response does result in less ghosting/streaking.
However, reducing image persistence has a far bigger impact on motion blur.
Yes, a low-persistence (strobed) LCD shows after-images caused by its slower pixel response, which would ideally not be there, but its motion blur is significantly lower overall.
So while better response times are always appreciated, they (mostly) don't matter if they come with increased image persistence.
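To put rough numbers on that (a deliberately simplified model with illustrative values): for anything your eyes track across the screen, perceived blur is roughly motion speed multiplied by how long each frame is held on screen, with pixel-transition smear added on top.

Code:
# Deliberately simplified motion blur model. Illustrative values only.

def blur_px(speed_px_per_s, persistence_ms, response_ms=0.0):
    # Blur width ~ tracking speed x (frame hold time + pixel transition).
    return speed_px_per_s * (persistence_ms + response_ms) / 1000.0

speed = 1000.0  # object panning across the screen at 1000 px/s

# Full persistence at 60Hz: each frame is held for ~16.7ms.
print(f"OLED, full persistence: {blur_px(speed, 16.7, 0.1):.1f} px")  # ~16.8
print(f"LCD,  full persistence: {blur_px(speed, 16.7, 5.0):.1f} px")  # ~21.7

# Low persistence: strobed backlight holds the image for ~2ms.
print(f"LCD,  strobed ~2ms:     {blur_px(speed, 2.0, 5.0):.1f} px")   # ~7.0

Even granting the LCD a 5ms pixel response, the low-persistence mode comes out far ahead of the full-persistence OLED - which is the point: persistence dominates.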
Of course, there are many types of content where you cannot reduce image persistence.
You can't use backlight scanning with films: they're 24 FPS, and no one is going to produce a display that strobes at 24Hz (the flicker would be unbearable, and flashing each frame multiple times just creates double/triple images instead).
You can combine strobing with interpolation, but videophiles won't accept that. So for films, you need a full-persistence display.