
lobotimizer 03-17-2015 01:40 PM

OLED or HDR LED
 
Which looks better with HDR content, the ec9300 or the js9500? I currently own the ec970, but with HDR hyped up so much I'm contemplating returning it. Has any reviewer seen both in person and made the comparison?

Is it possible for them to update the firmware in the ec9300 to show HDR?

Thanks in advance,

NintendoManiac64 03-17-2015 02:08 PM

"Better" is a very subjective term...

lobotimizer 03-18-2015 08:09 AM

Quote:

Originally Posted by NintendoManiac64 (Post 32700153)
"Better" is a very subjective term...

Are the trade-offs roughly equal?

NintendoManiac64 03-18-2015 11:57 AM

Technically OLED is the superior display technology, but HDR displays can have a brighter "torch mode" look that some people like. Not only that, but torch-mode-like brightness can be useful if you watch things during the daytime, whereas the likes of OLED would look better in a darker environment, unless you like to burn your eyeballs out (eyes don't like very bright things in dark environments).

The obvious solution is to get an HDR OLED. :p

As for native HDR content, uh...is there actually any of that readily available yet, other than maybe some demo videos?

GregLee 03-18-2015 05:40 PM

Quote:

Originally Posted by NintendoManiac64 (Post 32724633)
Technically OLED is the superior display technology, but HDR displays can have a brighter "torch mode" look that some people like.

This is not a fair characterization. The attraction of HDR is that it displays natural scenes in higher fidelity than other technologies, since it has a better approximation to the contrast and color we can see. To see the benefits, however, the source video has to record scenes in higher fidelity, before the superior contrast and color can be shown on an HDR TV. Technically, HDR is the superior display technology over non-HDR displays.

NintendoManiac64 03-18-2015 05:47 PM

Quote:

Originally Posted by GregLee (Post 32735057)
The attraction of HDR is that it displays natural scenes in higher fidelity than other technologies, since it has a better approximation to the contrast and color we can see.

To me, what you describe sounds more like both gamut and HDR together, not HDR alone.

Quote:

Originally Posted by GregLee (Post 32735057)
Technically, HDR is the superior display technology over non-HDR displays.

But he was asking about OLED vs HDR LED, not non-HDR OLED vs HDR OLED.

GregLee 03-18-2015 06:59 PM

Quote:

Originally Posted by NintendoManiac64 (Post 32735233)
Quote:

Originally Posted by GregLee (Post 32735057)
The attraction of HDR is that it displays natural scenes in higher fidelity than other technologies, since it has a better approximation to the contrast and color we can see.

To me, what you describe sounds more like both gamut and HDR together, not HDR alone.

No, HDR has both high brightness and more gradations of brightness. Otherwise it would conform to your "torch mode" caricature. The only way of producing more gradations of brightness is to produce more gradations of R, G, B, because that's how color television works. So an HDR display will have improved color, aside from any wider gamut it may have.
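
(A minimal Python sketch of that point, purely illustrative: on an RGB display a neutral grey is just R = G = B, so the count of distinct brightness steps is capped by the bits per channel.)

Code:

# On an RGB display a neutral grey is R = G = B, so the number of distinct
# grey (brightness) levels is capped by the per-channel bit depth.
for bits in (8, 10, 12):
    print(bits, "bits per channel ->", 2 ** bits, "distinct grey levels")
# 8 bits per channel -> 256 distinct grey levels
# 10 bits per channel -> 1024 distinct grey levels
# 12 bits per channel -> 4096 distinct grey levels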

NintendoManiac64 03-18-2015 07:02 PM

Quote:

Originally Posted by GregLee (Post 32737049)
and more gradations of brightness

But that is deep color (10bit, 12bit, 16bit) not HDR.

GregLee 03-18-2015 07:29 PM

Quote:

Originally Posted by NintendoManiac64 (Post 32737137)
But that is deep color (10bit, 12bit, 16bit) not HDR.

You're mistaken. All varieties of HDR have had over 8 bits of brightness, from the original Brightside technology, to Dolby Vision, to the UHD alliance version. Can you refer me to something called HDR which has only 8 bit color depth?

NintendoManiac64 03-18-2015 07:34 PM

Quote:

Originally Posted by GregLee (Post 32737697)
You're mistaken. All varieties of HDR have had over 8 bits of brightness, from the original Brightside technology, to Dolby Vision, to the UHD alliance version.

But that doesn't make deep color the same as HDR; it just makes it a prerequisite for HDR if you don't want banding issues.

Quote:

Originally Posted by GregLee (Post 32737697)
Can you refer me to something called HDR which has only 8 bit color depth?

You do know that HDR is a generic term independent of TV technologies, right? It literally just means "high dynamic range".

There's a reason the specific TV HDR standards are not named just "HDR", because that already means something.

GregLee 03-18-2015 07:52 PM

Quote:

Originally Posted by NintendoManiac64 (Post 32737841)
You do know that HDR is a generic term independent of TV technologies, right?

No, I don't know that. If you are referring to something going on inside your head, I defer to your authority. Otherwise, I would need evidence.

NintendoManiac64 03-18-2015 07:57 PM

Quote:

Originally Posted by GregLee (Post 32738249)
No, I don't know that. If you are referring to something going on inside your head, I defer to your authority. Otherwise, I would need evidence.

I said what the acronym stood for - "high dynamic range"; put that exact phrase into any search engine.

GregLee 03-18-2015 08:06 PM

Quote:

Originally Posted by NintendoManiac64 (Post 32738401)
I said what the acronym stood for - "high dynamic range"; put that exact phrase into any search engine.

And does some search engine tell me that HDR is "independent of TV technologies"? Because that's what you claimed. (If you can recall what you wrote.)

SledgeHammer 03-18-2015 08:07 PM

Comparing OLED to HDR is meaningless. HDR is just a wider color gamut. All the properties that people like about OLED (infinite contrast, perfect blacks, "inky" colors) are characteristics of the OLED panel itself, something LCD will never be able to match. The best FALD sets have a few hundred dimming zones. A 4K OLED has 8,294,400 "dimming zones".
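
(For anyone checking the arithmetic, a quick Python sketch; the 500-zone FALD figure below is just an illustrative round number, not any particular model's spec.)

Code:

# Every pixel of a 4K OLED is its own light source.
pixels = 3840 * 2160
print(pixels)        # 8294400
# Compare with a hypothetical 500-zone FALD LCD backlight:
print(pixels / 500)  # ~16589 pixels sharing each dimming zone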

NintendoManiac64 03-18-2015 08:08 PM

Quote:

Originally Posted by GregLee (Post 32738649)
And does some search engine tell me that HDR is "independent of TV technologies"? Because that's what you claimed. (If you can recall what you wrote.)

It should; otherwise you'd be considering HDR in TV tech, HDR in photos, and HDR in video games to be completely unrelated to one another.

High dynamic range is nothing more than dynamic range large enough to be considered "high" rather than typical or even low - heck, you can have high dynamic range in audio.

GregLee 03-18-2015 08:10 PM

Quote:

Originally Posted by SledgeHammer (Post 32738697)
HDR is just a wider color gamut.

Going from silly to sillier. I give up.

SledgeHammer 03-18-2015 08:19 PM

Quote:

Originally Posted by GregLee (Post 32738777)
Going from silly to sillier. I give up.

How so?

GregLee 03-18-2015 09:12 PM

Quote:

Originally Posted by SledgeHammer (Post 32739001)
How so?

An HDR display has to have peak brightness much higher than lowest black level, but a wide color gamut display need not have enhanced brightness. So to say that HDR is just wide color gamut really makes no sense.
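
(A quick worked example of that contrast point in Python; the peak and black figures are assumed round numbers, not measurements of any particular set.)

Code:

import math

peak_nits, black_nits = 1000.0, 0.05   # illustrative HDR-LCD-ish peak and black level
ratio = peak_nits / black_nits
print(f"{ratio:,.0f}:1 contrast = {math.log2(ratio):.1f} stops of dynamic range")
# 20,000:1 contrast = 14.3 stops of dynamic range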

SledgeHammer 03-18-2015 09:26 PM

Quote:

Originally Posted by GregLee (Post 32740281)
An HDR display has to have peak brightness much higher than lowest black level, but a wide color gamut display need not have enhanced brightness. So to say that HDR is just wide color gamut really makes no sense.

So white is going to be brighter than black on these new sets? Man... what will they think of next??


On a more serious note... so how are they going to increase max brightness on these panels that are claiming to be HDR upgradable via a firmware update? The 2015 OLEDs are 800 nits, and we've heard that they will be HDR upgradable via firmware.


Also, HDR at the end of the day is about bringing out the detail in dark scenes, for example, by, you guessed it, a wider color gamut, less banding, etc. At the end of the day an RGB TV is an RGB TV.


What had to be done to get HDR content into your TV is irrelevant, since we are talking about the TV aspect. So at the end of the day, yes, it's just showing more colors, since that's really what a TV does, last time I checked :).


That being said, "wider color gamut" is an oversimplification of what's going on, but it's still what's going on at a simplified level.

GregLee 03-18-2015 11:26 PM

Quote:

Originally Posted by SledgeHammer (Post 32740513)
That being said, "wider color gamut" is an oversimplification of what's going on, but it's still what's going on at a simplified level.

You know about the hue/saturation/brightness model for describing colors. HDR has more colors in the sense that it reproduces more levels of brightness. Wide color gamut has more colors in the sense that it reproduces more levels of saturation. There are different dimensions involved.
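
(A minimal sketch of the "different dimensions" point using Python's standard colorsys module; mapping brightness to the HDR axis and saturation to the gamut axis is a loose analogy, purely for illustration.)

Code:

import colorsys

h, s, v = colorsys.rgb_to_hsv(0.8, 0.2, 0.2)               # a mid-bright red

# Move along the brightness axis only (the HDR-like direction):
brighter = colorsys.hsv_to_rgb(h, s, min(v * 1.2, 1.0))
# Move along the saturation axis only (the wide-gamut-like direction):
deeper = colorsys.hsv_to_rgb(h, min(s * 1.2, 1.0), v)

print(brighter)   # same hue and saturation, more brightness
print(deeper)     # same hue and brightness, more saturation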

NintendoManiac64 03-19-2015 12:26 AM

Quote:

Originally Posted by GregLee (Post 32742057)
You know about the hue/saturation/brightness model

It's actually "lightness" or "Lum" (unsure what the unabbreviated form of the word is), which is not the same thing as brightness. Case in point, many photo-editing programs have both a "brightness" slider and a "lightness" slider, the latter being a direct adjustment of the 'lum' value.

Direct link to the image in case your display isn't high enough resolution to view it full-size when embedded:
http://i.minus.com/iMtuNCgCDnncO.png

coldkick 03-19-2015 05:57 AM

Really? You're arguing over color models now?
For the record, Hue/Saturation/Lightness, Hue/Saturation/Brightness, and Hue/Saturation/Value are all real color models.
In terms of HDR, you would most likely use the HSB/V color model, as it is the standard model for measuring the power of the light given off by the source.
HSL, however, uses 'Lightness', which is the amount of 'pure' white that is being allowed through.

Case in point: @GregLee wasn't wrong; stop being petty over things and looking for a way to get a one-up.
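
(For what it's worth, Python's standard colorsys module makes the value-versus-lightness distinction easy to check; the color below is arbitrary.)

Code:

import colorsys

r, g, b = 0.9, 0.3, 0.1                     # an arbitrary saturated orange
h, s, v = colorsys.rgb_to_hsv(r, g, b)
h2, l, s2 = colorsys.rgb_to_hls(r, g, b)    # note: colorsys orders this as H, L, S

print(round(v, 3))   # HSV "value"/brightness = max(R, G, B)  -> 0.9
print(round(l, 3))   # HLS "lightness" = (max + min) / 2      -> 0.5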

SledgeHammer 03-19-2015 08:05 AM

You guys do understand all these different color models are just different ways of describing the SAME EXACT colors, right? HSB, CMYK, RGB, etc. cannot describe a single color that cannot be exactly described by all the other systems. Sorry, but to say HDR is not a "wider color gamut" because it uses HSB is pure nonsense. HSB is just describing the color from a different point of view. If you take any color and crank up the brightness component, it becomes white. End of story. It's not a different white than the one described by RGB. HSB doesn't do anything magical to the colors.
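
(A tiny Python check of the interconvertibility claim, using the standard colorsys module; the color is arbitrary and the result is rounded to hide floating-point noise.)

Code:

import colorsys

rgb = (0.25, 0.5, 0.75)
round_trip = colorsys.hsv_to_rgb(*colorsys.rgb_to_hsv(*rgb))
print(tuple(round(c, 6) for c in round_trip))   # (0.25, 0.5, 0.75) -- same color, two descriptions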


At the end of the day, we live in an RGB world and this is an RGB TV, so you can describe a color using the potatoes / ground up unicorn horns / pixie dust system for all I care... it's still an RGB color.


If you don't like it, take it up with Roy G. Biv :).

taichi4 03-19-2015 10:16 AM

Ground up Unicorn horn would never be used in display technology, as it's incredibly difficult to handle, has longevity issues (I'd be surprised if you'd get 5000 hours out of it), and is for all intents and purposes impossible to procure in large amounts nowadays.

lobotimizer 03-25-2015 04:02 PM

To revise my original question: what would you prefer, having HDR (in 12 months when it becomes available) or having an OLED that doesn't support it, such as the ec9300?

EvLee 03-26-2015 10:05 PM

It's going to take a little while for HDR to roll out and become mature. The situation is like this: Manufacturers haven't optimized their consumer displays for HDR because they don't have access to studio HDR content, and studios are struggling to create content because nobody sells a reference quality HDR display yet. Dolby's Pulsar could be a reference HDR display except it is only available on loan... nobody can actually buy one. The majority of what people have seen so far in HDR demos is actually standard content that has been stretched out for HDR. I think if you are happy with your current display, just hold onto it for now and wait through the first round of HDR displays for the bugs to get sorted out. Until content is actually being created with HDR in mind, I don't think you will miss out on too much.

RLBURNSIDE 03-26-2015 10:57 PM

HDR indeed only makes sense on 10+ bit displays. 8-bit would have way too much banding to be usable. It already HAS plenty of banding, even in the scenes it's already expected to display (8 bpc 4:2:0 Blu-ray content).

Increasing the peak white level (in nits) means you need more bits per channel; otherwise there will not be enough gradations in the possible signal to give a smooth transition not only in greyscale (luma), but also in the chroma components. You can swap between YCbCr (luma + 2 chroma) channels or other color spaces and RGB, using any number of bits, but those bits are typically the same per component for easy transfer back and forth (which happens often).

When you increase the peak white from 100 nits to 2600 nits, as the UHD HDR standard does, you will need a minimum of 10 bits per channel to avoid banding. To go beyond 2600 nits, you need 12 bits.
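
(To make that concrete, here is a rough Python sketch using the SMPTE ST 2084 "PQ" transfer function that the UHD HDR formats are built on. It compares how big a luminance jump one code value represents near mid-signal at different bit depths; it's illustrative only and ignores dithering and viewing conditions.)

Code:

# SMPTE ST 2084 (PQ) EOTF constants
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(e):
    """PQ signal value e in [0, 1] -> absolute luminance in nits (10,000 nit peak)."""
    p = max(e, 0.0) ** (1.0 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1.0 / m1)

for bits in (8, 10, 12):
    codes = 2 ** bits
    mid = codes // 2
    step = pq_eotf((mid + 1) / (codes - 1)) - pq_eotf(mid / (codes - 1))
    print(f"{bits}-bit: ~{step:.2f} nits between adjacent codes at mid-signal")
# The 8-bit step is several nits wide here; 10-bit is under a nit; 12-bit is finer still.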

Deep color has been supported in HDMI since 1.3, meaning 30-bit or 36-bit total (10 bits or 12 bits per channel). But most of that data is merely extrapolated. In UHD Blu-ray HDR titles, you will actually get content that's specifically graded for HDR. Dolby Vision is a separate spec that uses 12 bits encoded as 10-bit 4:4:4 YCbCr SDR sent over the wire, then decoded in the TV as 12-bit 4:2:0 HDR with 10,000 nits peak white in the signal, plus metadata saying what the min and max white in the signal are, so that the TV can auto-calibrate itself to reproduce the signal as best it can.

tgm1024 03-28-2015 06:35 AM

Quote:

Originally Posted by RLBURNSIDE (Post 32977121)
HDR indeed only makes sense on 10+ bit displays. 8-bit would have way too much banding to be usable.

Dithering would solve that problem, particularly on 4K. An every-other checkerboard of neighboring values can move a raster from N-bit depth to a cleanly effective N+1-bit depth. It's nearly impossible to detect this on a 2K device. On a 4K device? Forget it. Going to N+2 depth is only slightly more complicated, involving more than one value and/or spacing. And this is entirely using spatial dithering.

Having 10-bit native hardware with a matching source is a win, and if they've cemented 10 bits as the minimum depth for the HDR standard, then so be it. But 8 bits is hardly a death sentence, particularly at 4K.
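
(A small sketch of the spatial-dithering idea in Python with numpy; just a toy patch, not a full dithering algorithm.)

Code:

import numpy as np

def checkerboard_dither(value_lo, height, width):
    """Fill a patch with value_lo and value_lo + 1 in a checkerboard pattern."""
    yy, xx = np.indices((height, width))
    return (value_lo + ((yy + xx) % 2)).astype(np.uint8)

patch = checkerboard_dither(100, 4, 8)   # alternates the 8-bit codes 100 and 101
print(patch)
print(patch.mean())   # 100.5 -- an in-between level that plain 8-bit can't encode directly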

siouxchief 04-07-2015 12:38 AM

Quote:

Originally Posted by SledgeHammer (Post 32740513)
On a more serious note... so how are they going to increase max brightness on these panels that are claiming to be HDR upgradable via a firmware update? The 2015 OLEDs are 800 nits, and we've heard that they will be HDR upgradable via firmware.

I'd be interested to hear opinions on this too please?

Cheers

fafrd 04-07-2015 01:06 PM

Quote:

Originally Posted by siouxchief (Post 33285657)
I'd be interested to hear opinions on this too please?

Cheers

You can't (increase peak brightness through firmware).

You may be able to interpret HDR metadata through firmware, and that is probably the intended meaning of 'add HDR'.

In addition, to the extent there are any ABL-like brightness limiters in LG's current WOLED TVs, you may be able to throttle those back and allow pixels to put out more light in certain situations than they can currently.

But overall, LG is playing catch-up to the hard tack-to-the-right that Samsung has driven (with the entire industry backing them), and the 2015 LG OLEDs are very, very unlikely to support HDR in a meaningful and future-proof manner.

If the 65" OLEDs dropped dramatically in price, I might think about it, but at '3 times the price of High-End LCDs' (to quote an LG marketing executive: http://www.cnet.com/news/the-great-o...ed-at-a-price/), waiting to see what LG announces at CES 2016 is a no-brainer for me...

