RGB OLED is the future not WRGB - Page 3 - AVS Forum | Home Theater Discussions And Reviews
post #61 of 107 Old 12-19-2015, 08:33 AM
AVS Forum Special Member
 
Rich Peterson's Avatar
 
Join Date: Jan 2000
Location: St Paul, MN
Posts: 2,816
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 145 Post(s)
Liked: 167
Quote:
Originally Posted by joys_R_us View Post
It will be funny to watch Samsung in 2016 making a 180-degree turn and pushing OLEDs as the best TV technology ever once they have got their act together in the production of OLEDs. I guess they must be panicking now seeing the development of the OLED panels at LG.
Yeah, for sure. It's just a matter of time before there are many others producing large-screen OLEDs. This doom-and-gloom is mostly driven by those not there yet.
Magnesus likes this.
Rich Peterson is offline  
post #62 of 107 Old 12-19-2015, 09:20 AM
AVS Forum Special Member
 
taichi4's Avatar
 
Join Date: Dec 2007
Posts: 3,123
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 535 Post(s)
Liked: 318
Quote:
Originally Posted by rogo View Post

...LG should license the technology behind WRGB to anyone willing to build a fabrication plant at this point. They need others to have skin in the game. LG has leadership. It will have invented the technology, it can win a big share...

Excellent point about getting other companies' skin in the game.

But WOLED was invented by Kodak, and sold to LG. Kodak also invented the digital camera, but similarly mishandled that. What a pity.
taichi4 is offline  
post #63 of 107 Old 12-19-2015, 09:39 AM
AVS Forum Special Member
 
5x10's Avatar
 
Join Date: Sep 2012
Posts: 1,521
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 917 Post(s)
Liked: 624
The "future" is already out of production
5x10 is offline  
post #64 of 107 Old 12-19-2015, 11:09 AM
AVS Forum Addicted Member
 
rogo's Avatar
 
Join Date: Dec 1999
Location: Stop making curved screens
Posts: 32,169
Mentioned: 23 Post(s)
Tagged: 0 Thread(s)
Quoted: 1895 Post(s)
Liked: 2197
Quote:
Originally Posted by taichi4 View Post
Excellent point about getting other companies' skin in the game.

But WOLED was invented by Kodak, and sold to LG. Kodak also invented the digital camera, but similarly mishandled that. What a pity.
Sorry, you're 100% correct. Invented was an errant word choice. "Developed and popularized" would have been better, as would many other phrasings of the point.

There's a saying about "everything in moderation". If only it was applied to well, you know...
rogo is offline  
post #65 of 107 Old 12-19-2015, 11:17 AM
 
tgm1024's Avatar
 
Join Date: Dec 2010
Location: Maybe ⅓ of the way from here to there.
Posts: 10,026
Mentioned: 5 Post(s)
Tagged: 0 Thread(s)
Quoted: 2589 Post(s)
Liked: 2312
Quote:
Originally Posted by rogo View Post
Sorry, you're 100% correct. Invented was an errant word choice. "Developed and popularized" would have been better, as would many other phrasings of the point.
correct? incorrect?
tgm1024 is offline  
post #66 of 107 Old 12-19-2015, 12:18 PM
Advanced Member
 
hungro's Avatar
 
Join Date: Mar 2007
Posts: 898
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 177 Post(s)
Liked: 116
Quote:
Originally Posted by tgm1024 View Post
correct? incorrect?
This is from the FlatpanelsHD website. It pertains to the doomsayer article about OLED and HDR claiming that, because of its peak light output limit, OLED won't be able to "compete" with LED LCD.
First details on 2016 OLED TVs

"But what most of us really want to know is what will be available to buy in 2016 and we received some details. LG will focus on 4K, HDR (high dynamic range) and a wider color space, one of LG’s engineers, Haengjoon Kang, told FlatpanelsHD in an interview. If you have read our EG9600 review you will know that it offers all three components but that it does not reach the finishing line. LG will take another big step further next year.

As for HDR, LG said that "light level will be increased a lot" and even goes so far as to say "nearly two times". Today, LG's OLED TVs can achieve a maximum brightness of about 400 nits, while flagship LCDs reach approximately 700-800 nits. The maximum brightness level is of course only one component for HDR - black levels are at least as important - but many in the industry argue that HDR should be reproduced with bright highlights - for example sun reflections - of up to 800-1200 nits."

You can take that for what it is, straight from an LG engineer's mouth.
hungro is offline  
post #67 of 107 Old 12-19-2015, 12:21 PM
Advanced Member
 
hungro's Avatar
 
Join Date: Mar 2007
Posts: 898
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 177 Post(s)
Liked: 116
Quote:
Originally Posted by mhrir View Post
Here is the transcript of:
Experts Claim That OLED TV Technology Is Expensive, & Not What It's Cracked Up To Be


By David Richards | Friday | 20/11/2015

At CES 2016 there is set to be a lot of debate around HDR 4K content.

Interest in HDR (high dynamic range) 4K technology has resulted in a wider discussion on whether OLED TV displays will be able to handle the display characteristics of this new technology as it arrives.
This is from the FlatpanelsHD website. It pertains to the doomsayer article about OLED and HDR claiming that, because of its peak light output limit, OLED won't be able to "compete" with LED LCD.
First details on 2016 OLED TVs

"But what most of us really want to know is what will be available to buy in 2016 and we received some details. LG will focus on 4K, HDR (high dynamic range) and a wider color space, one of LG’s engineers, Haengjoon Kang, told FlatpanelsHD in an interview. If you have read our EG9600 review you will know that it offers all three components but that it does not reach the finishing line. LG will take another big step further next year.

As for HDR, LG said that "light level will be increased a lot" and even goes so far as to say "nearly two times". Today, LG's OLED TVs can achieve a maximum brightness of about 400 nits, while flagship LCDs reach approximately 700-800 nits. The maximum brightness level is of course only one component for HDR - black levels are at least as important - but many in the industry argue that HDR should be reproduced with bright highlights - for example sun reflections - of up to 800-1200 nits."

We asked LG’s engineers how far they can take OLED. LG will not try to predict that this early in the cycle but they seem convinced that this is only the beginning. They explained to us that a brightness level of 1000 nits is possible today but that it would be disproportionately expensive for consumer products.

Read more at http://www.flatpanelshd.com/focus.ph...SmrtztHftj6.99

You can take that for what it is, straight from an LG engineer's mouth.
hungro is offline  
post #68 of 107 Old 12-19-2015, 12:54 PM
AVS Forum Special Member
 
taichi4's Avatar
 
Join Date: Dec 2007
Posts: 3,123
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 535 Post(s)
Liked: 318
taichi4 is offline  
post #69 of 107 Old 12-19-2015, 03:53 PM
AVS Forum Special Member
 
GregLee's Avatar
 
Join Date: Jul 2002
Location: Waimanalo HI
Posts: 4,524
Mentioned: 5 Post(s)
Tagged: 0 Thread(s)
Quoted: 1101 Post(s)
Liked: 386
Quote:
Originally Posted by hungro View Post
This is from the FlatpanelsHD website. It pertains to the doomsayer article about OLED and HDR claiming that, because of its peak light output limit, OLED won't be able to "compete" with LED LCD.
...
The simplistic level of the discussion about LED/OLED and HDR is a little surprising to me. We all know that contrast is much more important to perception of natural scenes than brightness. We know about Dolby's result on user preference for up to 10,000 nits with their special 20,000 nit LED monster, which presumably didn't have really great black levels. It should follow that lowering the black level should also lower the maximum brightness that people like, though this wouldn't have been testable on Dolby's test rig.

So these figures for maximum brightness that are bandied about are really only valid for LEDs. OLED sets, since they have better black levels, should require lower maximum brightness for highlights. I would have thought that by this time, someone would have devised some way of estimating how much brightness OLEDs actually need for HDR.
homogenic likes this.

Greg Lee
GregLee is offline  
post #70 of 107 Old 12-19-2015, 04:35 PM
AVS Forum Special Member
 
taichi4's Avatar
 
Join Date: Dec 2007
Posts: 3,123
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 535 Post(s)
Liked: 318
Quote:
Originally Posted by GregLee View Post
...So these figures for maximum brightness that are bandied about are really only valid for LEDs. OLED sets, since they have better black levels, should require lower maximum brightness for highlights. I would have thought that by this time, someone would have devised some way of estimating how much brightness OLEDs actually need for HDR.
An excellent point.
taichi4 is offline  
post #71 of 107 Old 12-19-2015, 07:28 PM
AVS Forum Addicted Member
 
darinp2's Avatar
 
Join Date: Sep 2003
Location: Seattle, WA
Posts: 23,188
Mentioned: 7 Post(s)
Tagged: 0 Thread(s)
Quoted: 1135 Post(s)
Liked: 1767
Quote:
Originally Posted by GregLee View Post
We know about Dolby's result on user preference for up to 10,000 nits with their special 20,000 nit LED monster, which presumably didn't have real great black levels.
Have you seen something to indicate that it didn't? Dolby has been capable of very bright whites and very dark blacks for a long time. They bought BrightSide, who had technology to do very bright and very dark on the same display, thanks to having lots of zones. Maybe not 10k to 20k (I don't recall if the one I saw over a decade ago could go that bright).
Quote:
Originally Posted by GregLee View Post
So these figures for maximum brightness that are bandied about are really only valid for LEDs. OLED sets, since they have better black levels, should require lower maximum brightness for highlights. I would have thought that by this time, someone would have devised some way of estimating how much brightness OLEDs actually need for HDR.
The amount they "need" to meet Dolby's specifications is basically what is in the standards. Likewise, LCDs "need" to be able to do the black levels in those standards to meet them.

While OLED's ability to go completely black and achieve huge CR between small areas of an image is a great thing, really doing HDR the way Dolby would like to see it still requires being able to do super bright whites, reds, greens, and blues. I may not want to use it that way, and some others may not either, but being able to do only 400 nits doesn't really meet what those driving HDR would like to achieve.

OLED not meeting those peak values also doesn't mean that we would prefer something that struggles on the other end.

If there were no technological limitations at all it sounds like many of the people driving HDR would like a display to be able to do both 20k nit whites and 0 nit blacks. That is a lot of power to give to people encoding content since they could make really annoying sequences, but with the right artists that whole range might be used over time without getting too annoying.

--Darin

This is the AV Science Forum. Please don't be gullible and please do remember the saying, "Fool me once, shame on you. Fool me twice, shame on me."
darinp2 is offline  
post #72 of 107 Old 12-20-2015, 05:26 AM
AVS Forum Special Member
 
GregLee's Avatar
 
Join Date: Jul 2002
Location: Waimanalo HI
Posts: 4,524
Mentioned: 5 Post(s)
Tagged: 0 Thread(s)
Quoted: 1101 Post(s)
Liked: 386
Quote:
Originally Posted by darinp2 View Post
Have you seen something to indicate that it didn't? ...
No. That's why I wrote "presumably".
Quote:
The amount they "need" to meet Dolby's specifications are basically what is in the standards.
And what's that? I know of some Dolby proposals about source specification for maximum brightness. I'm not aware of a proposal from Dolby about a display specification for maximum brightness.

Greg Lee
GregLee is offline  
post #73 of 107 Old 12-20-2015, 07:51 AM
AVS Forum Addicted Member
 
darinp2's Avatar
 
Join Date: Sep 2003
Location: Seattle, WA
Posts: 23,188
Mentioned: 7 Post(s)
Tagged: 0 Thread(s)
Quoted: 1135 Post(s)
Liked: 1767
Quote:
Originally Posted by GregLee View Post
And what's that? I know of some Dolby proposals about source specification for maximum brightness. I'm not aware of a proposal from Dolby about a display specification for maximum brightness.
I believe the end points of the PQ curve from Dolby are 10k nits and zero nits.

However, the source contains information about the peak level it was mastered to. So, if the source is mastered to 1200 nit white then the consumer display is supposed to be able to do 1200 nit whites, even if it can do zero for black.

One place where I may choose to deviate is with backlighting behind a display. I personally think that turning room lights on degrades images unless it is to overcome a display weakness and would rather leave all lights off other than the display and then choose a white level I am comfortable with for those conditions. That may well be lower than a standard calls for, especially if it calls for backlighting.

--Darin
darinp2 is offline  
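[Editor's note: the PQ curve Darin refers to is standardized as SMPTE ST 2084. As an illustrative sketch (not from the thread), here is the PQ EOTF, which maps a normalized signal value to an absolute luminance. Its endpoints are 0 and 10,000 nits, matching the "zero nits" and "10k nits" endpoints described above, and it also answers how a 10-bit code value relates to nits.]

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF used by Dolby's HDR format.
# Maps a normalized signal value in [0, 1] to absolute luminance in nits.
# The constants come from the ST 2084 specification.

def pq_eotf(signal: float) -> float:
    """Absolute luminance (nits) for a normalized PQ signal in [0, 1]."""
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

# A 10-bit code value n is normalized as n / 1023, so code 1023 always
# represents 10,000 nits and code 0 represents 0 nits -- the curve's
# endpoints -- regardless of what any individual title was mastered to.
print(pq_eotf(1023 / 1023))   # 10000.0
print(pq_eotf(0 / 1023))      # 0.0
```

Mastering metadata (the peak level the content was graded to) travels alongside the signal, which is how a display can know that a given title never codes above, say, 1200 nits even though the container can represent up to 10,000.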
post #74 of 107 Old 12-20-2015, 08:39 AM
 
tgm1024's Avatar
 
Join Date: Dec 2010
Location: Maybe ⅓ of the way from here to there.
Posts: 10,026
Mentioned: 5 Post(s)
Tagged: 0 Thread(s)
Quoted: 2589 Post(s)
Liked: 2312
Actually I wasn't doubting rogo; I was asking if it was a typo. The "sorry" and "you're 100% correct" is a cool stylistic pairing if not a typo.
tgm1024 is offline  
post #75 of 107 Old 12-20-2015, 09:52 AM
AVS Forum Special Member
 
GregLee's Avatar
 
Join Date: Jul 2002
Location: Waimanalo HI
Posts: 4,524
Mentioned: 5 Post(s)
Tagged: 0 Thread(s)
Quoted: 1101 Post(s)
Liked: 386
Quote:
Originally Posted by darinp2 View Post
So, if the source is mastered to 1200 nit white then the consumer display is supposed to be able to do 1200 nit whites, even if it can do zero for black.
What exactly does "is mastered to" mean? The source signal will have a number for the brightness of each subpixel, and if it has 10 bit color depth, the maximum number that could be in the video is 1023. Are you saying that the 1023 value represents 1200 nits in the original scene and recorded by the camera? (Or, if you really mean "whites" and not brightness, I should have said 1023 for all three subpixels.)

I did not interpret the Dolby proposals this way. I thought they meant that the maximum number for brightness in a file would represent 10,000 nits in the scene, rather than some arbitrary value that would differ from video to video.

How do you get to the "is supposed to" part? Do you assume that the maximum of 1200 nits ever actually occurs in this example video?

Greg Lee
GregLee is offline  
post #76 of 107 Old 12-20-2015, 01:45 PM
AVS Forum Addicted Member
 
rogo's Avatar
 
Join Date: Dec 1999
Location: Stop making curved screens
Posts: 32,169
Mentioned: 23 Post(s)
Tagged: 0 Thread(s)
Quoted: 1895 Post(s)
Liked: 2197
Quote:
Originally Posted by tgm1024 View Post
correct? incorrect?
Quote:
Originally Posted by tgm1024 View Post
Actually I wasn't doubting rogo, I was asking if it was a typo. The "sorry" and "you're 100% correct" is cool stylistic pairing if not a typo.
I feel like this is AVS Kremlinology.

I apologized to Taichi with the word "Sorry."

He was (is) 100% correct. The word "invented" was a poor word choice by me, especially since I am well aware that WRGB OLED was a Kodak invention and LG acquired the technology.

I was being polite with an apology, then correcting my error with an attempt at better language.

Nothing more.

There's a saying about "everything in moderation". If only it was applied to well, you know...
rogo is offline  
post #77 of 107 Old 12-21-2015, 05:31 PM
Advanced Member
 
hungro's Avatar
 
Join Date: Mar 2007
Posts: 898
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 177 Post(s)
Liked: 116
Quote:
Originally Posted by GregLee View Post
. I would have thought that by this time, someone would have devised some way of estimating how much brightness OLEDs actually need for HDR.
I think this is what the UHD Alliance was trying to figure out, or come up with a solution for. The UHDA should be making their announcement at CES, along with other details of their UHD specs.

Last edited by hungro; 12-21-2015 at 05:52 PM.
hungro is offline  
post #78 of 107 Old 12-26-2015, 11:39 AM
AVS Forum Special Member
 
saprano's Avatar
 
Join Date: Oct 2007
Location: Bronx NY
Posts: 4,121
Mentioned: 3 Post(s)
Tagged: 0 Thread(s)
Quoted: 582 Post(s)
Liked: 602
Sony's master monitor uses RGB OLED, so maybe they'll use the same thing for their consumer OLEDs if they make any. http://www.postmagazine.com/Post-Blo...Our-World.aspx

Here's hoping.

I agree with the poster earlier in the thread who said LG's OLED gives off an LCD look. The picture is too clean and flat looking. Plasmas have a more organic, textured picture. I don't know if that's because LG uses S&H and WOLED, but I don't like it.

home theater addict
saprano is offline  
post #79 of 107 Old 12-26-2015, 01:17 PM
Senior Member
 
tubby497's Avatar
 
Join Date: Apr 2013
Posts: 489
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 149 Post(s)
Liked: 395
Quote:
Originally Posted by saprano View Post
Sony's master monitor uses RGB OLED, so maybe they'll use the same thing for their consumer OLEDs if they make any. http://www.postmagazine.com/Post-Blo...Our-World.aspx

Here's hoping.

I agree with the poster earlier in the thread who said LG's OLED gives off an LCD look. The picture is too clean and flat looking. Plasmas have a more organic, textured picture. I don't know if that's because LG uses S&H and WOLED, but I don't like it.
Plasma "organic feel" = dithering/pwm noise
Magnesus likes this.
tubby497 is offline  
post #80 of 107 Old 12-26-2015, 06:24 PM
AVS Forum Special Member
 
saprano's Avatar
 
Join Date: Oct 2007
Location: Bronx NY
Posts: 4,121
Mentioned: 3 Post(s)
Tagged: 0 Thread(s)
Quoted: 582 Post(s)
Liked: 602
Quote:
Originally Posted by tubby497 View Post
Plasma "organic feel" = dithering/pwm noise
It's more than that though. The picture just looks more real.

By the way, I don't see either on my kuro anyway.

home theater addict
saprano is offline  
post #81 of 107 Old 12-30-2015, 10:49 PM
Senior Member
 
JaguarCRO's Avatar
 
Join Date: Feb 2003
Location: Sunnyvale, CA
Posts: 309
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 193 Post(s)
Liked: 387
Quote:
Originally Posted by tubby497 View Post
Plasma "organic feel" = dithering/pwm noise
Completely agree with this statement. The pwm noise, dithering and color banding (I may have the term wrong but it is when my current Samsung Plasma keeps changing bright colors into other colors like whites -> purples, which has been getting much worse over time). Some people may like these qualities but I have really grown to dislike them.

The Flat OLEDS at CES 2015 looked so much better both far away and up close.

I personally can't wait to see the 2016 CES TV offerings and am most interested to see what the OLED options are. I really need a new set and think that 2016 will be the year.
JaguarCRO is offline  
post #82 of 107 Old 12-31-2015, 01:42 AM
AVS Forum Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 3,118
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 557
Quote:
Originally Posted by JaguarCRO View Post
Completely agree with this statement. The pwm noise, dithering and color banding (I may have the term wrong but it is when my current Samsung Plasma keeps changing bright colors into other colors like whites -> purples, which has been getting much worse over time). Some people may like these qualities but I have really grown to dislike them.
I really don't see how those ugly digital artifacts create an "organic feel".
If you look at a CRT, it has none of those artifacts and produces a very smooth noise-free, banding-free image - and I doubt anyone would argue that a CRT produces anything other than a very natural organic image.

I think the reason that LG's OLEDs look "LCD-like" is due to their image processing (do they still have forced noise reduction?) and the fact that they are flicker-free displays.
Being 4K native may also be a contributing factor, if you're mostly watching upscaled content and are used to watching content at its native resolution on your 1080p plasma or even downscaled on a 720p plasma.
Personally though, I would say that upscaling - as long as it's done well - is always a positive thing for video. Even on a CRT.

Though I dislike the WRGB pixel structure, I don't think that using white OLED material under RGB color filters should have a negative effect on whether the image looks "organic" or not compared to RGB OLED - as long as the image is not changing with viewing angle.

I think it's the fact that these are flicker-free displays which is the main problem.
Plasmas all flicker at 60Hz and 48/72/96Hz with film, depending on the model.
This has a significant impact on how motion is perceived, and I think that, and the fact that the image remains largely unchanged with viewing angle, has more to do with what people mean by the "organic" look of those displays than anything else.

Quote:
Originally Posted by GregLee View Post
What exactly does "is mastered to" mean? The source signal will have a number for the brightness of each subpixel, and if it has 10 bit color depth, the maximum number that could be in the video is 1023. Are you saying that the 1023 value represents 1200 nits in the original scene and recorded by the camera? (Or, if you really mean "whites" and not brightness, I should have said 1023 for all three subpixels.)
HDR video is not coded the same way as SDR video. The video includes metadata which states the brightness level that the content was mastered for.
I have not been staying up to date with HDR mastering information, but my understanding was that if you have a 600 nit HDR display, any scene which is 600 nits or lower should look exactly the same as it would on an HDR display capable of 1200 nits.

It's only once you have scenes with content greater than 600 nits brightness that things should differ on the displays, as the 600 nit display will have to use highlight compression, whereas a 1200 nit display would not.
If this was SDR, a 1200 nit display would show everything at twice the brightness of a 600 nit display, rather than only the brighter scenes which require >600 nits brightness.
Chronoptimist is offline  
post #83 of 107 Old 12-31-2015, 01:49 AM
AVS Forum Special Member
 
kucharsk's Avatar
 
Join Date: Feb 2004
Location: Louisville, CO
Posts: 6,461
Mentioned: 8 Post(s)
Tagged: 0 Thread(s)
Quoted: 1775 Post(s)
Liked: 1048
Quote:
Originally Posted by mhrir View Post
Sony claim that there are too many problems with the production of OLED technology and that OLED TVs have reliability issues and are costly for what you get.
Expensive perhaps, but good, given all of Sony's professional broadcast monitors are now OLED.
kucharsk is offline  
post #84 of 107 Old 12-31-2015, 09:07 AM
AVS Forum Addicted Member
 
darinp2's Avatar
 
Join Date: Sep 2003
Location: Seattle, WA
Posts: 23,188
Mentioned: 7 Post(s)
Tagged: 0 Thread(s)
Quoted: 1135 Post(s)
Liked: 1767
Quote:
Originally Posted by Chronoptimist View Post
I have not been staying up to date with HDR mastering information, but my understanding was that if you have an 600 nit HDR display, any scene which is 600 nits or lower should look exactly the same as it would on an HDR display capable of 1200 nits.

It's only once you have scenes with content greater than 600 nits brightness that things should differ on the displays, as the 600 nit display will have to use highlight compression, whereas a 1200 nit display would not.
I don't think this is way off, but using highlight compression (which is better than clipping) will mean that the 600 nit display is going to need to use some of its upper range for things encoded between 600 and 1200. So, I would expect a 550 nit object to be dimmer on the 600 nit display than the master specified. That is, unless the display dynamically changes its output curve based on whether the current frame includes values above the display's peak, but I think that could create some issues.

--Darin
darinp2 is offline  
post #85 of 107 Old 12-31-2015, 09:24 AM
AVS Forum Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 3,118
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 557
Quote:
Originally Posted by darinp2 View Post
I don't think this is way off, but using highlight compression (which is better than clipping) will mean that the 600 nit display is going to need to use some of its upper range for things encoded between 600 and 1200. So, I would expect a 550 nit object to be dimmer on the 600 nit display than the master specified. That is, unless the display dynamically changes its output curve based on whether the current frame includes values above the display's peak, but I think that could create some issues.

--Darin
That's true, if you're using highlight compression you would need some degree of headroom, so perhaps you would only get up to 500 nits on a 600 nit HDR display before highlight compression was applied.
Or perhaps they would just clip after 600 nits.

It probably depends on how the individual displays handle it.
But the point is more that, with HDR, content should look at least mostly the same for scenes mastered using a brightness level below the peak for that display, whether that display is only capable of just reaching those peak levels, or if it is capable of significantly higher peak brightness.

You could have a display capable of 3000 nit peaks, and it shouldn't look any different from a display capable of 600 nit peaks in a scene coded for 300 nits - unlike SDR where everything would be displayed 5x brighter on the 3000 nit display.
Chronoptimist is offline  
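[Editor's note: the highlight-compression trade-off Darin and Chronoptimist are debating can be sketched as a simple knee-style tone map. This is only a toy illustration, not any actual display's algorithm (real sets use proprietary, usually smoother, curves); it assumes a linear roll-off above a knee placed at a fraction of the display's peak.]

```python
# Toy knee-style tone map: mastered luminance below the knee passes
# through unchanged; the range from the knee up to the mastering peak
# is compressed linearly into the display's remaining headroom.
# Illustrative only -- real displays use their own (smoother) curves.

def tone_map(nits: float, display_peak: float, master_peak: float,
             knee_fraction: float = 0.75) -> float:
    """Map a mastered luminance (nits) onto a display's capability."""
    if master_peak <= display_peak:
        return min(nits, display_peak)   # no compression needed
    knee = knee_fraction * display_peak
    if nits <= knee:
        return nits                      # shown exactly as mastered
    # Compress [knee, master_peak] into [knee, display_peak].
    t = (nits - knee) / (master_peak - knee)
    return knee + (display_peak - knee) * t

# A 600 nit display showing content mastered to 1200 nits:
print(tone_map(300.0, 600.0, 1200.0))   # 300.0 -- below the knee, unchanged
print(tone_map(550.0, 600.0, 1200.0))   # ~470 -- dimmer than mastered, as Darin predicts
print(tone_map(1200.0, 600.0, 1200.0))  # 600.0 -- mastering peak lands on display peak
```

Note how this reproduces both claims above: scenes entirely below the knee look identical on the 600 nit and 1200 nit displays, while a 550 nit object gets pulled down because the display reserves headroom for the 600-1200 nit range.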
post #86 of 107 Old 01-01-2016, 10:47 AM
AVS Forum Special Member
 
saprano's Avatar
 
Join Date: Oct 2007
Location: Bronx NY
Posts: 4,121
Mentioned: 3 Post(s)
Tagged: 0 Thread(s)
Quoted: 582 Post(s)
Liked: 602
Quote:
Originally Posted by Chronoptimist View Post
I really don't see how those ugly digital artifacts create an "organic feel". If you look at a CRT, it has none of those artifacts and produces a very smooth noise-free, banding-free image - and I doubt anyone would argue that a CRT produces anything other than a very natural organic image.
Like I said above, it has to do with something else, because I don't notice any of that. What you described about CRTs is exactly what I would say about the Kuro.

You said it yourself

don't see how those ugly digital artifacts create an "organic feel"

They wouldn't. There's something about the way plasmas process the picture: self-emissive, better motion, a solid uniform picture all over, the driving system, etc. It could be any combination of plasma tech that gives it a more realistic picture.
Quote:
Originally Posted by Chronoptimist View Post
I think the reason that LG's OLEDs look "LCD-like" is due to their image processing (do they still have forced noise reduction?) and the fact that they are flicker-free displays. Being 4K native may also be a contributing factor, if you're mostly watching upscaled content and are used to watching content at its native resolution on your 1080p plasma or even downscaled on a 720p plasma. Personally though, I would say that upscaling - as long as it's done well - is always a positive thing for video. Even on a CRT.
The 1080p OLEDs look the same, so I wouldn't say it's because of upscaling to 4K.

home theater addict

Last edited by saprano; 01-01-2016 at 11:31 AM.
saprano is offline  
post #87 of 107 Old 01-03-2016, 01:07 AM
Member
 
Join Date: Sep 2015
Posts: 95
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 51 Post(s)
Liked: 19
Quote:
Originally Posted by Chronoptimist View Post
If you look at a CRT, it has none of those artifacts and produces a very smooth noise-free, banding-free image - and I doubt anyone would argue that a CRT produces anything other than a very natural organic image.
I would. CRT looks "organic" because it horribly smooths the image, hiding all source problems like blocks in the shadows or color banding. LCD and OLED are extremely sharp in comparison, and if your source is not good you will see a lot of compression artifacts. I prefer perfect sharpness any day. If you like the CRT look, there are ways to simulate it on LCD or OLED by doing a bit of smoothing and adding a screen effect and image distortions.
Magnesus is offline  
post #88 of 107 Old 01-03-2016, 05:17 AM
 
tgm1024's Avatar
 
Join Date: Dec 2010
Location: Maybe ⅓ of the way from here to there.
Posts: 10,026
Mentioned: 5 Post(s)
Tagged: 0 Thread(s)
Quoted: 2589 Post(s)
Liked: 2312
Quote:
Originally Posted by Magnesus View Post
I would. CRT looks "organic" because it horribly smooths the image, hiding all source problems like blocks in the shadows or color banding. LCD and OLED are extremely sharp in comparison, and if your source is not good you will see a lot of compression artifacts. I prefer perfect sharpness any day. If you like the CRT look, there are ways to simulate it on LCD or OLED by doing a bit of smoothing and adding a screen effect and image distortions.
Blurring an LCD, no matter how carefully, results in a blurry LCD look. Not a CRT look. No matter which you prefer, the two visuals are distinct.
UltraBlack and JeruL01 like this.
tgm1024 is offline  
post #89 of 107 Old 01-15-2016, 10:06 AM
Senior Member
 
Join Date: Jul 2015
Posts: 205
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 151 Post(s)
Liked: 37
Samsung's true RGB OLED still produces a better picture than any LG WOLED.
This just proves the WOLED can't compete with true RGB.
Even Panasonic's CZ OLED has a yellow stain (tint) on whites, which every LG OLED has, plus banding.
Samsung's S9C to this day has no banding, no near-black issues, and no tinting or vignetting.

Let's hope next year we see a true RGB 4K OLED.
Maybe then we'll have something to watch in true 4K.
suge967mari is offline  
post #90 of 107 Old 01-15-2016, 10:08 AM
 
tgm1024's Avatar
 
Join Date: Dec 2010
Location: Maybe ⅓ of the way from here to there.
Posts: 10,026
Mentioned: 5 Post(s)
Tagged: 0 Thread(s)
Quoted: 2589 Post(s)
Liked: 2312
Quote:
Originally Posted by suge967mari View Post
Samsung's true RGB OLED still produces a better picture than any LG WOLED.
This just proves the WOLED can't compete with true RGB.
Even Panasonic's CZ OLED has a yellow stain (tint) on whites, which every LG OLED has, plus banding.
Samsung's S9C to this day has no banding, no near-black issues, and no tinting or vignetting.

Let's hope next year we see a true RGB 4K OLED.
Maybe then we'll have something to watch in true 4K.
Gee, who might this ^^^^ be a rename of? Show of hands, anyone know?

LOL
tgm1024 is offline  