Originally Posted by JaguarCRO
Completely agree with this statement. The PWM noise, dithering, and color banding (I may have the term wrong, but it's when my current Samsung plasma keeps shifting bright colors into other colors, e.g. whites -> purples, and it has been getting much worse over time). Some people may like these qualities but I have really grown to dislike them.
I really don't see how those ugly digital artifacts create an "organic feel".
If you look at a CRT, it has none of those artifacts and produces a very smooth, noise-free, banding-free image - and I doubt anyone would argue that a CRT produces anything other than a very natural, organic image.
I think the reason that LG's OLEDs look "LCD-like" is due to their image processing (do they still have forced noise reduction?) and the fact that they are flicker-free displays.
Being 4K native may also be a contributing factor, if you're mostly watching upscaled content and are used to watching content at its native resolution on your 1080p plasma or even downscaled on a 720p plasma.
Personally though, I would say that upscaling - as long as it's done well - is always a positive thing for video. Even on a CRT.
Though I dislike the WRGB pixel structure, I don't think that using white OLED material under RGB color filters should have a negative effect on whether the image looks "organic" or not compared to RGB OLED - as long as the image is not changing with viewing angle.
I think it's the fact that these are flicker-free displays which is the main problem.
Plasmas all flicker: at 60Hz with video, and at 48, 72, or 96Hz with film, depending on the model.
This has a significant impact on how motion is perceived, and I think that, along with the image remaining largely unchanged with viewing angle, has more to do with what people mean by the "organic" look of those displays than anything else.
Originally Posted by GregLee
What exactly does "is mastered to" mean? The source signal will have a number for the brightness of each subpixel, and if it has 10 bit color depth, the maximum number that could be in the video is 1023. Are you saying that the 1023 value represents 1200 nits in the original scene and recorded by the camera? (Or, if you really mean "whites" and not brightness, I should have said 1023 for all three subpixels.)
HDR video is not coded the same way as SDR video. The video includes metadata which states the brightness level that the content was mastered for.
I have not been staying up to date with HDR mastering information, but my understanding was that if you have a 600 nit HDR display, any scene which is 600 nits or lower should look exactly the same as it would on an HDR display capable of 1200 nits.
It's only once you have scenes with content greater than 600 nits brightness that things should differ on the displays, as the 600 nit display will have to use highlight compression, whereas a 1200 nit display would not.
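To illustrate the point about highlight compression, here's a minimal sketch in Python. This is not any vendor's actual tone-mapping algorithm (real displays use smoother roll-off curves, e.g. the one described in ITU-R BT.2390); the function name and the hard clip at peak are my own simplifications, purely to show why content below a display's peak looks identical on both panels:

```python
def hdr_display_nits(scene_nits: float, peak_nits: float) -> float:
    """Map mastered scene luminance to display luminance.

    Below the display's peak, HDR is absolute: output equals input.
    Above it, highlights must be compressed (here, a crude hard clip;
    real tone mapping rolls off more gently).
    """
    if scene_nits <= peak_nits:
        return scene_nits  # shown at the mastered brightness
    return peak_nits  # simplified stand-in for highlight compression

# A 300-nit scene looks the same on 600-nit and 1200-nit displays:
print(hdr_display_nits(300, 600), hdr_display_nits(300, 1200))   # 300 300
# A 900-nit highlight only differs on the 600-nit panel:
print(hdr_display_nits(900, 600), hdr_display_nits(900, 1200))   # 600 900
```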
If this were SDR, a 1200 nit display would show everything at twice the brightness of a 600 nit display, rather than only the brighter scenes which require >600 nits.
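The SDR contrast can be sketched the same way. SDR carries no absolute brightness, so the whole image scales with whatever the display's peak is set to (gamma omitted here for simplicity; the function is illustrative, not a real EOTF):

```python
def sdr_display_nits(signal: float, peak_nits: float) -> float:
    """Map a normalized SDR signal (0.0-1.0) to display luminance.

    SDR encodes relative levels, so everything scales with the
    display's peak brightness. (Gamma is omitted for simplicity.)
    """
    return signal * peak_nits

# The *same* mid-level signal is twice as bright on the brighter display:
print(sdr_display_nits(0.5, 600), sdr_display_nits(0.5, 1200))  # 300.0 600.0
```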