Originally Posted by John West
ULED (Quantum Dots) is both marketing and a real thing. It has to do with the actual light the TV generates for the backlight. In a traditional (non-quantum-dot) TV the light has to be filtered, and that filtering makes it dimmer. Quantum dots, in the most basic terms, allow unfiltered (and thus brighter) light to be used, which in general terms means brighter images and fuller, broader colors. That said, ULED is not a requirement for HDR; it is simply one way of designing a TV that lets it reach those color and brightness ranges, but the TV still needs, for instance, a 10-bit panel. So a TV could have quantum dots and be especially bright and colorful, yet still lack the panel to display the full range of HDR colors, and thus show banding like a traditional set despite having the nits to otherwise be "HDR". That's part of what's hurting the market right now: "HDR" is such a loose term, really made up of numerous technical components, and the various companies are doing a really poor job of customer education and product identification.
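The 10-bit point above is easy to see with a little arithmetic. This is a rough sketch, not how any panel driver actually works: quantize a smooth brightness ramp at each bit depth and count the distinct output steps available to draw it.

```python
# Hypothetical sketch: why an 8-bit panel bands where a 10-bit panel doesn't.
# We quantize a smooth 0.0-1.0 luminance ramp at each bit depth and count
# the distinct output levels available to represent it.

def quantize(value, bits):
    """Map a 0.0-1.0 signal onto the nearest of 2**bits panel levels."""
    levels = 2 ** bits - 1
    return round(value * levels) / levels

ramp = [i / 9999 for i in range(10000)]  # smooth gradient source

levels_8bit = len({quantize(v, 8) for v in ramp})
levels_10bit = len({quantize(v, 10) for v in ramp})

print(levels_8bit)   # 256 steps per channel -> visible banding on gradients
print(levels_10bit)  # 1024 steps -> 4x finer gradations, smooth ramps
```

256 steps stretched across HDR's much wider brightness and color range is where the visible banding comes from; 10 bits gives four times the gradations over the same range.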
Well, not entirely -- HDR is a software metadata specification for brightness, contrast, and color, while quantum dots (a hardware technology) help pull off HDR by allowing brighter panels, a wider color gamut, and richer color saturation.
There are currently two flavors of HDR metadata: HDR10 and Dolby Vision.
For a TV to support either HDR10 or Dolby Vision, beyond simply being able to reach certain brightness levels or display a specific color gamut, the TV's firmware must be able to decode the HDR metadata.
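To make "decode the HDR metadata" concrete, here's a hedged sketch of the static metadata HDR10 carries: SMPTE ST 2086 mastering-display values plus MaxCLL and MaxFALL. The field names and units follow the public spec; the `tone_map_peak` helper is a made-up illustration of the kind of decision firmware makes, not any TV's actual code.

```python
# Sketch of HDR10 static metadata (SMPTE ST 2086 + MaxCLL/MaxFALL).
# tone_map_peak is a hypothetical illustration, not real firmware logic.

from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    max_display_mastering_luminance: float  # nits (cd/m^2)
    min_display_mastering_luminance: float  # nits
    max_cll: int   # Maximum Content Light Level, nits
    max_fall: int  # Maximum Frame-Average Light Level, nits

def tone_map_peak(meta: HDR10StaticMetadata, panel_peak_nits: float) -> float:
    """Illustrative: firmware compares the content's peak brightness to
    the panel's peak, deciding how much highlight compression to apply."""
    return min(meta.max_cll, panel_peak_nits)

meta = HDR10StaticMetadata(
    max_display_mastering_luminance=4000.0,
    min_display_mastering_luminance=0.005,
    max_cll=1500,
    max_fall=400,
)
# A 600-nit panel has to roll highlights above its peak back down:
print(tone_map_peak(meta, 600.0))  # 600.0
```

This is why a firmware update alone can turn a set into an "HDR TV": the panel hardware was always capable, the set just couldn't interpret these fields yet.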
For example, when I owned the Hisense H9B (great thread here on AVS about it), the firmware initially did not support HDR10 metadata. Nothing about the panel's attributes changed, but with a firmware update that added HDR10 metadata support, the TV was instantly an HDR TV.
I also have a Vizio P50, which initially supported only Dolby Vision but, with a firmware update, can now also decode HDR10.
Are SUHD, ULED, and Triluminos displays better than LCD displays without quantum dots? Yes, definitely! The integration of cutting-edge technologies in color gamut, brightness, contrast, and software decoding simply creates a significantly better picture.
Now, when you add Full Array Local Dimming as opposed to an edge-lit screen, you get yet another jump in picture quality. With FALD, in addition to the other benefits of quantum dot technology, you get dramatically blacker blacks and tremendous contrast! The more zones in the FALD array, the better the black performance and the better the contrast.
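The zone idea can be sketched in a few lines. This is an illustration of the principle only (no vendor's actual dimming algorithm): the backlight is split into zones, and each zone's LED level tracks the brightest pixel it has to show, so dark zones go nearly black while bright zones stay at full output.

```python
# Illustrative FALD sketch (not any vendor's algorithm): each backlight
# zone's LED level follows the brightest pixel inside that zone.

def zone_backlight(frame, zones_x, zones_y):
    """frame: 2D list of luminance values in 0.0-1.0."""
    h, w = len(frame), len(frame[0])
    zh, zw = h // zones_y, w // zones_x
    levels = []
    for zy in range(zones_y):
        row = []
        for zx in range(zones_x):
            block = [frame[y][x]
                     for y in range(zy * zh, (zy + 1) * zh)
                     for x in range(zx * zw, (zx + 1) * zw)]
            row.append(max(block))  # zone LED tracks its brightest pixel
        levels.append(row)
    return levels

# Dark scene with one bright highlight in the top-left corner:
frame = [[0.02] * 8 for _ in range(8)]
frame[1][1] = 1.0
print(zone_backlight(frame, 2, 2))  # [[1.0, 0.02], [0.02, 0.02]]
```

Only the top-left zone drives its LEDs hard; the other three stay nearly off. With more zones, a bright highlight lifts a smaller patch of the surrounding black, which is why zone count matters so much for contrast.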
When it's all working together: the contrast, brightness, and wide color gamut of quantum dots, FALD (and other things like a 120Hz native panel)... then HDR comes in and choreographs the picture dynamics in software from the source (a 4K Blu-ray, for example).
I was just watching Sicario and Deadpool on UHD HDR Blu-ray from the XB1S on the N9000U. The dust floating in the air, moving from a dark shadow through a streak of sunlight in an open window, is awesome. Likewise the lighting and color in Deadpool; I think it's a great example of HDR, both in the bright scenes and the dark ones.
The N9000U is just a tremendous display. I have a long-in-the-tooth 50" TCL 4K display (from the pre-Roku, pre-HDR days) which has been a great display, and I have a Vizio P50... The Sharp is in a completely different weight class in terms of picture quality and overall viewing experience.