
Wouldn't both HDR & Dolby Vision appear to look about the same on 10-bit panels?

#1 ·
If 12-bit Dolby Vision has more color gamut data than any current display panel can display, then I assume 12-bit Dolby Vision's color gamut is reduced to the panel's 10-bit HDR color gamut max of about one billion colors. Wouldn't both HDR & Dolby Vision appear to look about the same?
 
#2 ·
It's not a wider color gamut; higher bit depth can represent more colors within the same gamut and color space.

12 bit: about 68.7 billion colors

10 bit: about 1.07 billion colors
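Those counts come straight from cubing the number of per-channel levels; a quick Python sketch:

```python
# A minimal sketch: the number of representable colors is the per-channel
# level count (2**bits) cubed, one factor each for R, G, and B.
def total_colors(bits_per_channel):
    levels = 2 ** bits_per_channel
    return levels ** 3

for bits in (8, 10, 12):
    print(f"{bits}-bit: {total_colors(bits):,} colors")
# 8-bit:  16,777,216
# 10-bit: 1,073,741,824  (~1.07 billion)
# 12-bit: 68,719,476,736 (~68.7 billion)
```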

There are still some benefits, despite current displays being 10-bit only, like little to no banding and/or posterization.

12-bit can be approximated, albeit artificially, with dithering such as FRC. Honestly, if I were behind display manufacturing, I would use dithering rather than switching to 8K. DV would look amazing, particularly streamed content that is pure DV with no HDR10 base layer.
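To make the FRC idea concrete, here is a hypothetical sketch of temporal dithering (not any manufacturer's actual algorithm): a 12-bit code value is approximated on a 10-bit panel by alternating two adjacent 10-bit levels so the average over a few frames lands on the 12-bit target.

```python
# Hypothetical FRC-style temporal dithering sketch: approximate a 12-bit code
# value on a 10-bit panel by alternating two adjacent 10-bit levels over frames.
# (Assumes value_12bit is small enough that base + 1 stays within 10-bit range.)
def frc_frames(value_12bit, n_frames=4):
    target = value_12bit / 4                   # ideal (fractional) 10-bit level
    base = int(target)                         # lower displayable level
    high = round((target - base) * n_frames)   # frames shown one level higher
    return [base + 1] * high + [base] * (n_frames - high)

seq = frc_frames(2050)                  # 12-bit code 2050 -> ideal 10-bit level 512.5
print(seq, sum(seq) / len(seq))         # [513, 513, 512, 512] 512.5
```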

 
#3 ·
Thanks for the explanation. Is the HDR10 base included with DV for backwards compatibility?
 
#5 ·
The biggest benefit of Dolby Vision is not the bit depth, but the way it analyzes the capabilities of both the mastering display and the output display and creates a custom tonemapping curve for each scene that reproduces the image as close to what the editors saw as your screen can get. And Dolby Vision's tonemapping algorithms all seem to be significantly better than anything from HDR10, and I'd imagine HDR10+. From what I can tell, DV even takes ABL into account.

That being said, if you theoretically had a 10,000-nit OLED with no ABL, there would probably be no tonemapping applied to any content, HDR10 or DV. However, I can't be 100% confident in that, as I have heard conflicting reports that DV tonemapping may still be applied even if your TV has better capabilities than the mastering display. How and to what degree that might happen, I have no idea.
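To make that concrete, here is a toy roll-off curve, purely illustrative and not Dolby's proprietary per-scene algorithm, showing how content graded on a 4,000-nit mastering display might be squeezed onto an 800-nit panel while most of the image passes through untouched:

```python
# Toy tone-mapping sketch (illustrative only, not Dolby's actual algorithm):
# pass everything below a knee point straight through, then roll off the
# highlights so the mastering peak lands exactly at the display peak.
def tonemap_nits(x, master_peak=4000.0, display_peak=800.0, knee=0.75):
    knee_nits = knee * display_peak
    if x <= knee_nits:
        return x                                   # untouched below the knee
    t = (x - knee_nits) / (master_peak - knee_nits)
    return knee_nits + (display_peak - knee_nits) * (1 - (1 - t) ** 2)

for nits in (100, 600, 1000, 4000):
    print(f"{nits:>4} nits on the master -> {tonemap_nits(nits):6.1f} nits on the display")
```

Real DV tone mapping works per scene from the metadata rather than with one fixed knee, but the shape of the problem is the same.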
 
#6 ·
Stacey mentioned that the full enhancement layer carries data from the original HDR master.

The only way I can see DV benefiting content when the display exceeds the master is tone mapping. It's in the patent, but no one is willing to confirm it.

 
#9 ·
@morphinapg @EvLee

Tone mapping means, at least in Dolby Vision's case, that the code values are mapped for black level, APL, color saturation, hue, chroma, etc. This is all done dynamically based on the dynamic metadata, FEL or MEL. Even a 1,000-nit consumer display will still tone map if its gamut coverage doesn't match or exceed the monitor the content was mastered on. Ultimately, you are mapping 4,000 nits down to 800 nits, with the goal being no visible artifacts.
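For reference, the relationship between light level and code value in HDR comes from the PQ curve (SMPTE ST 2084); a minimal sketch of the encode direction, nits to code value, assuming full-range codes:

```python
# Minimal PQ (SMPTE ST 2084) inverse-EOTF sketch: absolute luminance in nits
# to a normalized signal, then to 10-bit and 12-bit code values (full range).
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    y = min(max(nits / 10000.0, 0.0), 1.0)     # PQ is defined up to 10,000 nits
    ym = y ** m1
    return ((c1 + c2 * ym) / (1 + c3 * ym)) ** m2

for nits in (100, 800, 1000, 4000):
    e = pq_encode(nits)
    print(f"{nits:>5} nits -> signal {e:.4f} -> 10-bit {round(e * 1023)}, 12-bit {round(e * 4095)}")
```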

I posted about this statement from Stacey Spears two or three weeks ago, and he said that the enhancement layer is what carries the data from the original HDR master.

How HDR10 and DV look on your display depends a lot on the manufacturer's understanding of how DV/HDR10 works, and on factory calibration that meets the standards.

Many of the old issues from SDR still exist, for which I would blame the display manufacturers and the physical media playback device manufacturers.

Clipping of white, clipping of color, contrast set too high, clipping of detail below 4,000 nits, and color clipping where detail within colors gets crushed.

 
#11 ·
Dolby Vision is a hoax



You will hear all kinds of comments from so-called experts who will defend Dolby Vision for home theaters with their life and give you all sorts of technical babble about why you're seeing Dolby Vision, and how dare you question that at all. They know better, and I would bet they push DV because there is a buck in it. Look, your question should be a statement: believe your eyes and your brain. What you're seeing with Dolby Vision on a 10-bit panel is Dolby Vision enabled, or a dumbed-down version of Dolby Vision, just not true DV like in a DV theater. Why? Because of what you said. It's another gimmick without the 12-bit panel.

And you're right, I have seen side-by-side reviews of DV vs HDR10 and it's hard to see a difference. I would think if DV were so great there would be a huge difference when you see it, and TVs with DV would be flying out of the stores, but they are not! Until we get 12-bit panels it's a waste of money.

This is marketing 101. Samsung has QLED, which used to be SUHD but changed the name because it didn't sound better than OLED, so we got QLED; LG has NanoCell, the same quantum-dot technology; and Sony has their fancy name for the same technology. Some do the technology better than others, but it's all QLED. That's what's going on with DV at this point, just another way to get into your wallet.
 
#12 ·
The 12bit part of Dolby Vision is the least important part of it. Their smart tonemapping solution with dynamic metadata is the far more critical part of the experience. Most people would not notice a difference between 10bit and 12bit even on a 12bit panel. However, there is still benefit to be had in a higher quality source signal, even if your display doesn't do 12bit, as that results in less degradation during signal processing.

That being said, I've never fully understood what a panel being 10bit or 12bit even actually means on a hardware level. The current going to the pixels is going to be analog, not digital.
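On the point above about a higher-precision source degrading less during processing, here is a rough sketch (hypothetical gain values, not any real video pipeline) comparing how far a 10-bit and a 12-bit signal drift from an ideal reference after a few re-quantized processing steps:

```python
# Rough sketch: run the same chain of gain adjustments at 10-bit and 12-bit,
# re-quantizing after every step, and compare against an ideal float reference.
def process(code, bits, gains):
    levels = 2 ** bits - 1
    x = code / levels                                            # normalize to 0..1
    for g in gains:
        x = round(min(max(x * g, 0.0), 1.0) * levels) / levels   # re-quantize each step
    return x

gains = [1.07, 0.93, 1.02, 0.98]      # arbitrary example adjustments
ideal = 0.5
for g in gains:
    ideal *= g                        # what infinite precision would give

err10 = abs(process(round(0.5 * 1023), 10, gains) - ideal)
err12 = abs(process(round(0.5 * 4095), 12, gains) - ideal)
print(f"10-bit drift: {err10:.2e}   12-bit drift: {err12:.2e}")   # 12-bit drifts less
```

The gains are arbitrary; the point is only that every intermediate rounding step is finer at 12-bit.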
 
#29 ·
While it's true that there are a lot more colors, it's not as if most of that is a visible difference. If I made a TV that displayed colors in the infrared or ultraviolet spectrum, I could claim "oh wow, billions more colors," and I'd technically be right, but you're not seeing any of those new colors.

Our eyes have limitations, and 10bit is very close to those limitations. Even if we could see the full 12bit range, it would be an incredibly minor difference, something most people would never be able to notice. And because 10bit is already so close to the eye's threshold, the real-world difference is even smaller than that. It should also be clear that 10 vs 12 bit does not affect how wide the gamut is.
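One way to put rough numbers on that, assuming full-range PQ code values and the standard ST 2084 constants, is to look at how big a luminance jump one code step represents at 10-bit versus 12-bit:

```python
# Sketch: luminance step between adjacent full-range PQ code values at 10-bit
# vs 12-bit, around a code ~60% of full scale (a few hundred nits). Smaller
# steps mean less chance of visible banding in smooth gradients.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(signal):
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for bits in (10, 12):
    code_max = 2 ** bits - 1
    code = round(0.6 * code_max)
    nits = pq_to_nits(code / code_max)
    step = pq_to_nits((code + 1) / code_max) - nits
    print(f"{bits}-bit: ~{nits:.0f} nits, one code step = {step:.3f} nits ({100 * step / nits:.2f}%)")
```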

Ultimately, these are diminishing returns. 12bit isn't the important part of DV; the dynamic metadata and smart tonemapping are. Especially considering a lot of DV is actually only 10bit anyway.

 
#30 ·
The DV grade is always 12-bit; the encode is 10-bit: a 10-bit HDR10 base layer plus a 2-bit enhancement layer, combined at the display or player. Stacey says that the DV enhancement layer carries data from the mastering monitor, along with MaxCLL, MaxFALL, etc.
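As a loose conceptual sketch (not the actual Dolby composer math, which involves a full residual layer and composer metadata), the base-plus-enhancement idea can be pictured as splitting a 12-bit value into a 10-bit part that HDR10-only gear uses and a small extra-precision part that a DV-capable device recombines:

```python
# Loose conceptual sketch only (the real DV composer is far more involved):
# split a 12-bit code into a 10-bit base plus a 2-bit residual, then recombine.
def split(code_12bit):
    base_10bit = code_12bit >> 2        # what an HDR10-only device would use
    residual = code_12bit & 0b11        # extra precision carried separately
    return base_10bit, residual

def recombine(base_10bit, residual):
    return (base_10bit << 2) | residual # full 12-bit value restored

base, res = split(3003)
print(base, res, recombine(base, res))  # 750 3 3003
```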

MEL and FEL come with different file sizes; WB seems to have really come into its own when it comes to their grades.



 
#31 ·
The grade itself may be 12bit, but the enhancement layer does not always contain the extra 2 bits of information. Some discs, and I believe most streaming services, remain 10bit and only use the enhancement layer for the metadata.

 
#33 ·
The difference in what we will see once we get to 12-bit panels isn't what we might assume. It's not so much better color as improvements in areas we usually associate with resolution and bit rate.

We get what Dolby has always advertised as a strength of HDR and, more specifically, Dolby Vision: better pixels, or fine detail. A lifelike level of detail in the image. More color means more detail.

Pair that with higher peak-nit displays and there is a level of depth and sharpness that you won't find a 10-bit panel coming very close to.

 
#34 ·
Fine detail requires contrast between pixels, meaning the RGB values are going to be significantly far apart from each other, so higher bit depth doesn't improve that. Higher bit depth only improves areas where the RGB values are very close to each other.
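A quick numerical sketch of that point, using hypothetical numbers and simply counting how many distinct code values a gentle brightness ramp lands on at each bit depth:

```python
# Sketch: quantize a subtle brightness ramp (like a sky or a plain wall) at
# different bit depths and count how many distinct code values it spans.
def distinct_steps(lo, hi, bits, samples=1920):
    levels = 2 ** bits - 1
    codes = {round((lo + (hi - lo) * i / (samples - 1)) * levels) for i in range(samples)}
    return len(codes)

# A ramp covering just 2% of the signal range across a 1920-pixel-wide frame.
for bits in (8, 10, 12):
    print(f"{bits}-bit: {distinct_steps(0.50, 0.52, bits)} distinct steps across the ramp")
# 8-bit yields only a handful of wide bands; 10-bit and 12-bit give many narrower ones.
```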

Honestly, accurate fine detail can be achieved even with 8bit HDR. For example, I was editing an HDR project, but because of the editing environment I had to send the signal to my TV clamped to 8bit. For the most part, there wasn't a significant difference compared to 10bit. All detail and noise looked the same. Most colors looked the same. Highlights looked the same. Shadows looked the same. The only areas that looked different compared to 10bit were the areas with longer, smoother gradients between colors, like the sky, or a character's skin, or a plain floor. There you would see banding or inaccuracy in color, but even then it wasn't a significant difference; it was like a really subtle band shifted slightly pink or something, barely there. With 10bit, there's no evidence of anything like that: no banding, no color inaccuracy, etc. Everything looks perfect, so I don't see how that can be improved upon. I genuinely think 12bit is beyond what we're capable of seeing, much like 8K would be in nearly all cases.

As for brighter displays, the PQ EOTF allocates its code values according to our eyes' sensitivity to bright and dark, so there would be no difference with brighter panels either.

 
#36 ·
As someone already mentioned, the real benefit of DV is per-scene control of the display. I could be wrong, but I think standard vanilla HDR is just a wider color gamut and brightness/darkness information, and I don't believe it alters those on a scene-by-scene basis. Also, I am probably wrong about this, but I think DV actually takes over your current display settings and uses custom ones. At least I have noticed that DV movies on something like Netflix alter my white brightness on things like on-screen menus, say the brightness of my volume slider.
 