Wouldn't both HDR & Dolby Vision appear to look about the same on 10-bit panels? - AVS Forum
post #1 of 39 - 05-30-2019, 12:25 PM - DERG (Thread Starter)
Wouldn't both HDR & Dolby Vision appear to look about the same on 10-bit panels?

If 12-bit Dolby Vision has more color gamut data than any current display panel can display, then I assume 12-bit Dolby Vision's color gamut is reduced to the panel's 10-bit HDR color gamut max of about one billion. Wouldn't both HDR & Dolby Vision appear to look about the same?

post #2 of 39 - 05-30-2019, 01:07 PM - DisplayCalNoob
It's not a color gamut; 12 bit can achieve more colors within a given gamut and color space.

12 bit: about 68.7 billion colors

10 bit: about 1.07 billion colors

There are still some benefits, even though current displays are 10 bit only, like little to no banding or posterization.

12 bit can be achieved, although artificially, with dithering like FRC. Honestly, if I were behind display manufacturing, I would use dithering over switching to 8K. DV would look amazing, particularly streamed content that is pure DV with no HDR10 base layer.
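To put rough numbers on the bit-depth and FRC points above, here's a quick back-of-the-envelope Python sketch (purely illustrative, not how any particular panel's FRC is actually implemented):

```python
def color_count(bits_per_channel: int) -> int:
    """Total RGB colors a given per-channel bit depth can address."""
    return (2 ** bits_per_channel) ** 3

print(f"10 bit: {color_count(10):,} colors")   # 1,073,741,824  (~1.07 billion)
print(f"12 bit: {color_count(12):,} colors")   # 68,719,476,736 (~68.7 billion)

def frc_sequence(code_12bit: int, frames: int = 4) -> list:
    """Fake a 12-bit level on a 10-bit panel by alternating the two nearest
    10-bit levels over time so the temporal average matches the target."""
    lo = code_12bit // 4        # nearest 10-bit code below (12 -> 10 bit is a divide by 4)
    frac = code_12bit % 4       # how far toward the next 10-bit code (0..3)
    return [lo + 1 if f < frac else lo for f in range(frames)]

seq = frc_sequence(2051)        # a 12-bit code sitting between 10-bit codes 512 and 513
print(seq, "-> average in 12-bit terms:", sum(seq) / len(seq) * 4)
```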

post #3 of 39 - 05-31-2019, 05:39 AM - DERG (Thread Starter)

Quote:
Originally Posted by DisplayCalNoob View Post

Thanks for the explanation. Is the HDR10 base included with DV for backwards compatibility?

post #4 of 39 - 05-31-2019, 12:16 PM - DisplayCalNoob
Quote:
Originally Posted by DERG View Post
Thanks for the explanation. Is the HDR10 base included with DV for backwards compatibility?
Currently, only on disc. I think Netflix still uses that profile, although they can support several profiles. Vudu and Apple use a pure DV stream, while HDR10-only displays get, well, you know.

post #5 of 39 - 05-31-2019, 01:36 PM - morphinapg
The biggest benefit of Dolby Vision is not the bit depth, but the way it analyzes the capabilities of both the mastering display and the output display and creates a custom tonemapping curve for each scene, reproducing the image as accurately as your screen allows relative to what the editors saw. And Dolby Vision's tonemapping algorithms seem to be significantly better than anything from HDR10, and I'd imagine HDR10+. From what I can tell, DV even takes ABL into account.

That being said, if you theoretically had a 10,000 nit OLED with no ABL, there would probably be no tonemapping applied to any content, HDR10 or DV. However, I can't be 100% confident in that, as I have heard conflicting reports that DV tonemapping may still be applied even if your TV has better capabilities than the mastering display. How and to what degree that might happen, I have no idea.
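To make the per-scene idea concrete, here is a minimal Python sketch of dynamic tone mapping. This is not Dolby's actual (proprietary) math; the 75% knee and the ease-out curve are arbitrary choices for illustration only:

```python
def tonemap_nits(x: float, scene_peak: float, display_peak: float) -> float:
    """Map a scene-referred luminance (nits) onto the display.

    Below the knee the signal passes through 1:1; above it, highlights roll
    off smoothly so scene_peak lands exactly at display_peak."""
    if scene_peak <= display_peak:
        return min(x, display_peak)              # nothing to compress
    knee = 0.75 * display_peak                   # arbitrary: start rolling off at 75% of peak
    if x <= knee:
        return x
    t = (x - knee) / (scene_peak - knee)         # 0..1 position above the knee
    return knee + (display_peak - knee) * (t * (2 - t))  # ease-out toward display_peak

# The same 2000-nit highlight gets treated differently depending on the scene's own peak:
print(tonemap_nits(2000, scene_peak=4000, display_peak=800))  # bright scene: squeezed harder (~731 nits)
print(tonemap_nits(2000, scene_peak=2000, display_peak=800))  # dimmer scene: uses the full 800 nits
```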

post #6 of 39 - 05-31-2019, 11:08 PM - DisplayCalNoob

Quote:
Originally Posted by morphinapg View Post

Stacey mentioned that the full enhancement layer carries data from the original HDR master.

The only way I can see DV benefiting content when the display exceeds the master is tone mapping. It's in the patent, but no one is willing to confirm it.

post #7 of 39 - 06-10-2019, 04:02 PM - EvLee

Quote:
Originally Posted by morphinapg View Post

Many studios now are actually limiting their distribution masters to 1000 nits peak in HDR10. So you don't even need a 10,000 nit television; for those titles, a 1000 nit television will see no benefit from Dolby Vision. Also, the accuracy of Dolby's tonemapping is debatable.
post #8 of 39 - 06-10-2019, 04:09 PM - morphinapg

Quote:
Originally Posted by EvLee View Post

Tonemapping isn't really about "accuracy." It's about reproducing the dynamics of the image as closely as possible to what was seen on the reference monitor. A 1000 nit screen may still not be able to reproduce everything exactly the way the reference monitor did, so there will likely always be some tonemapping to correct for those differences unless you're watching on literally the same monitor the master was made on. Tonemapping will inherently be less accurate than displaying the content nit for nit from the source when that's possible, but because it results in an image closer to the mastering display, that's a good thing. Dolby does a very good job and uses much more advanced tonemapping techniques than any other I've seen.

A 1000 nit peak does happen at some studios, yes, but that's more an issue of what mastering displays they are using. When those get better, so will those limits.

post #9 of 39 - 06-10-2019, 05:49 PM - DisplayCalNoob
@morphinapg @EvLee

Tone mapping means, at least in Dolby Vision's case, that the code values for black level, APL, color saturation, hue, chroma, etc. are mapped. This is all done dynamically based on the dynamic metadata, FEL or MEL. Even a 1000 nit consumer display will still tone map if its gamut coverage doesn't match or exceed the monitor the content was mastered on. Ultimately, you are mapping 4000 nits down to 800 nits, with the goal being no visible artifacts.

I posted about this two or three weeks ago, and Stacey Spears said that the enhancement layer is what carries the data from the original HDR master.

How HDR10 and DV look on your display depends a lot on the manufacturer's understanding of how DV/HDR10 works, and on factory calibrating displays to meet the standards.

Many of the old issues from SDR still exist, for which I would blame the display manufacturers and the physical media playback device manufacturers:

Clipping of white, clipping of color, contrast set too high, clipping of detail below 4000 nits, and color clipping where detail in colors is crushed.
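A tiny illustration of why hard clipping is so destructive compared to mapping (hypothetical values, nothing display-specific):

```python
def hard_clip(x: float, display_peak: float = 800.0) -> float:
    return min(x, display_peak)

def crude_map(x: float, source_peak: float = 4000.0, display_peak: float = 800.0) -> float:
    return x * display_peak / source_peak        # crude linear squeeze, just for contrast

highlights = [1200.0, 2500.0, 4000.0]            # distinct highlight detail in the master (nits)
print([hard_clip(x) for x in highlights])        # [800.0, 800.0, 800.0] -- all detail collapses
print([round(crude_map(x)) for x in highlights]) # [240, 500, 800] -- dimmer, but still distinct
```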

post #10 of 39 - 06-27-2019, 04:13 PM - tomvinelli
HDR

Quote:
Originally Posted by morphinapg View Post

That's all fine and good, but look, the normal consumer who knows nothing about Dolby Vision, HDR10, or HDR10+ is given a half-assed explanation at the brick and mortar stores, which works for the store and the manufacturer to sell products. Here is the problem I have with Dolby Vision. If you read the DV PDF on the Dolby website, the spec clearly says (and I'm paraphrasing) that to see true DV you must have a 12-bit panel. When we first heard about DV coming to TVs, 4K Blu-ray players, and discs, my first thought was "oh, we'll see movies just like in a Dolby theater," and without them really saying that, it was marketed as "Dolby Vision, better than HDR10," something like that. And yes, DV uses active metadata, OK. But still, without that 12-bit panel it's not the same. And I have seen reviews, honest reviews, that say that seeing DV vs HDR10 side by side, they would be hard pressed to tell a huge difference. What I think you're really seeing is "Dolby Vision enabled." To me that's important, because we're paying for something that's marketed as something it really isn't. So without a 12-bit panel, let's be honest and not try to justify what we think we have but don't. I'll say it: it's a hoax until Dolby is held accountable for pushing a technology that doesn't really work according to their own specs. In other words, we're getting raked over the coals and they are getting richer. This website and others must get the word out; at least tell the people who come to these sites, when reviewing Dolby Vision, in a nice way, that without a 12-bit panel you're getting a dumbed-down version of Dolby Vision, just not real Dolby Vision.
post #11 of 39 - 06-27-2019, 04:42 PM - tomvinelli
Dolby Vision is a hoax

Quote:
Originally Posted by DERG View Post

You will hear all kinds of comments from so-called experts who will defend Dolby Vision for home theaters with their lives and give you all sorts of technical babble about why you're seeing Dolby Vision, or "how dare you question that at all." They know better, and I would bet they push DV because there is a buck in it. Look, your question should be a statement: believe your eyes and your brain. What you're seeing with Dolby Vision and a 10-bit panel is "Dolby Vision enabled," or a dumbed-down version of Dolby Vision, just not true DV like in a DV theater. Why? Because of what you said. It's another gimmick without the 12-bit panel. And you're right, I have seen side-by-side reviews of DV vs HDR10 and it's hard to see a difference. I would think that if DV was so great there would be a huge difference when you see it, and TVs with DV would be flying out of the stores, but they are not! Until we get 12-bit panels it's a waste of money. This is marketing 101: Samsung has QLED, which used to be SUHD but they changed the name because it didn't sound better than OLED, so we got QLED. LG has NanoCell, the same thing, quantum dot technology, and Sony has their own fancy name for the same technology. Some do the technology better than others, but it's all QLED. That's what's going on with DV at this point, just another way to get into your wallet.
post #12 of 39 - 06-27-2019, 04:48 PM - morphinapg

Quote:
Originally Posted by tomvinelli View Post

The 12-bit part of Dolby Vision is the least important part of it. Their smart tonemapping solution with dynamic metadata is the far more critical part of the experience. Most people would not notice a difference between 10 bit and 12 bit even on a 12-bit panel; however, there is still benefit to having a higher quality source signal even if your display doesn't do 12 bit, as it results in less degradation during signal processing.

That being said, I've never fully understood what a panel being 10bit or 12bit even actually means on a hardware level. The current going to the pixels is going to be analog, not digital.
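Here's a rough sketch of the "less degradation during signal processing" point: quantize a gradient early and run processing on it, versus processing at full precision and quantizing only once at the end. The 0.8-power "processing step" is just a stand-in for whatever a TV's pipeline actually does:

```python
def quantize(x: float, bits: int) -> float:
    levels = 2 ** bits - 1
    return round(x * levels) / levels

def process(x: float) -> float:                   # stand-in processing step (gamma-ish boost)
    return min(1.0, x ** 0.8)

gradient = [i / 1000 for i in range(1001)]        # a smooth 0..1 ramp

# Path A: quantize to 10 bits first, process, then quantize again for the panel.
path_a = [quantize(process(quantize(x, 10)), 10) for x in gradient]
# Path B: process at full precision, quantize to 10 bits only once at the end.
path_b = [quantize(process(x), 10) for x in gradient]

changed = sum(1 for a, b in zip(path_a, path_b) if a != b)
print(f"{changed} of {len(gradient)} samples end up on a different 10-bit code")
```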

post #13 of 39 - 06-27-2019, 05:06 PM - DisplayCalNoob

Quote:
Originally Posted by tomvinelli View Post

One, most reviews are not side by side. Vincent Teoh is the only display reviewer who has done actual side-by-side comparisons, and he has pointed out very visible differences. It is up to the director of cinematography to determine whether a release is DV full enhancement layer or minimum enhancement layer. They can also choose whether or not to enable mapping, which means a display that is technically inferior to the mastering monitor will not have fine-tuned data relating to the consumer display's peak nits.

DV doesn't have its own color gamut; being 12 bit just means it hits more points within Rec.2020, producing far more colors (about 68.7 billion versus roughly 1.07 billion at 10 bit).

Check out Vincent Teoh's (HDTVTest) videos on YouTube.



post #14 of 39 - 06-27-2019, 05:13 PM - tomvinelli

Quote:
Originally Posted by morphinapg View Post

OK, I read most of these comments. Look, if the specs for Dolby Vision on Dolby's website are accurate, then why is everybody jumping through hoops over Dolby Vision? If you can't produce true Dolby Vision with a 10-bit panel and their other specs, then why are people wasting their money on something that doesn't work like it should? How many times in the past have we bought into a new technology only to sit there and say, "nah, I'm not seeing it or hearing it like they hyped it"? The audio end of things is the same way. Listen to a Dolby Atmos demo disc and you go, "yeah, that's awesome," and it is. However, most movies are not mixed that way; with the action going on and the music and bombs going off at the same time, it's hard for the ear to pick all that up. Yeah, I feel like I'm in a dome, but it sounds muddy to me. Go to a store and watch an OLED with a demo loop going on and you say, "wow, what a picture!" Then you get the TV home and it's nothing like that. My point is, don't get caught up in the hype, because you will be disappointed every time.
post #15 of 39 - 06-27-2019, 05:20 PM - morphinapg
Quote:
Originally Posted by tomvinelli View Post
If you can't produce True Dolby Vision with a 10 bit panel
This is false. A lot of DV content is actually 10 bit, first of all, and second, even with 12-bit content, the main benefit of DV is not the 12 bit AT ALL. It's the dynamic metadata and the specialized tonemapping that uses it. That tonemapping produces fantastic results whether the content is 10 bit or 12 bit, and far better results than HDR10. 10 bit is already very close to the limit of human perception; 12 bit just takes it very slightly beyond that limit. I guarantee that if you showed 10-bit vs 12-bit content to people side by side in a blind test, 19 out of 20 people probably wouldn't be able to see a difference at all. Because that's not why DV is good. That's not the main thing we're gaining from it.

There's no hype in this. I've seen the results side by side with actual, real, everyday content.

post #16 of 39 - 06-27-2019, 06:00 PM - tomvinelli

Quote:
Originally Posted by morphinapg View Post

It is confusing, isn't it? Even if we understood the technical side of things, it's still confusing. I suppose on a scope of some kind you would see a difference, but my eyes, which are good, don't see it. And there are other factors involved in getting good HDR, like the more nits you have, the better the HDR, but I have no idea what a nit is. All I know is it makes HDR better. In the audio world a CD is 16 bits, but then there is 24-bit audio, and like you said, it would be hard to hear the difference.
post #17 of 39 - 06-27-2019, 08:44 PM - morphinapg

Quote:
Originally Posted by tomvinelli View Post

A nit is a unit of brightness; it's another name for candelas per square metre. Think of it like this: within a square one meter by one meter, each nit represents roughly another candle flame's worth of light in that area. The more nits there are, the brighter the image is. SDR is mastered with a maximum brightness of 100 nits. This limits what you can do visually, because in real life things can get much brighter, so you're limited to choosing whether to use a higher or lower exposure. With a higher exposure, details in the shadows and most of the middle-bright objects will be nicely visible, but the brighter highlights will essentially crush to white. This is why, for example, the center of a fireplace will appear white in SDR: you simply cannot display the bright yellows and oranges in the middle of a fireplace in an SDR image without lowering the exposure. But if you lower the exposure, then everything else in the image is simply too dark to see. So that's why we have HDR. We can properly display those brighter elements (high nits) without lowering the exposure to compensate, giving them a realistic contrast as they would appear in the real world. HDR allows us to utilize a palette that goes up to 10,000 nits. That means the brightest highlights in an HDR image can potentially be 100 times brighter than in an SDR image, while the shadows and midtones remain at roughly the same brightness. This expansion of dynamic range is what gives HDR its "pop".

The problem is that right now most TVs fall in the range of 500-1000 nits of peak brightness. So what if a movie is encoded with brighter highlights than that? Well, either the set will clip those highlights away, or it will try to "tonemap" them, using a curve that tries to balance preserving those brighter details against not sacrificing the overall brightness of the image. The problem is that standard HDR10 uses the same metadata for the entire movie. You have information about the brightest and darkest capabilities of the mastering monitor, information that describes the brightest pixel in the movie, and the brightest average luminance of a frame. While these are useful for guiding tonemapping, they only describe two frames out of an entire movie, and those pieces of metadata may not be all that representative of the movie as a whole. Dynamic metadata, such as that used by DV, solves this, as it gives the player information about each scene, not just the movie as a whole. Dolby uses this to calculate a different tonemapping curve for each scene in the movie, depending on what that scene contains. However, they go a step above that as well. These tonemapping algorithms are custom designed to take advantage of the specific capabilities of the display, not just how bright or dark it goes, but more in depth than that. For example, on my OLED I've noticed that when you adjust the OLED light setting higher, this adjusts the tonemapping, and DV becomes more aggressive in how highlights are tonemapped to avoid triggering the display's ABL. Little quirks like that with each display, things like color gamut, color volume capabilities, minor differences in panel gammas, local dimming, etc. Dolby compares your display's capabilities to those of the display used to master the movie, and creates a custom-tailored tonemapping curve for every scene. It's really incredible what they accomplish with this.

For the difference between Dolby Vision and HDR10, use a movie mastered at 4000 nits with a high MaxCLL (you'll need to research the metadata for the movie if your player doesn't display it). Look for a bright scene and then compare the two, paying attention to the details in the brightest parts of the image. Dolby Vision will almost always do a much better job of preserving those details than HDR10 will. HDR10+ has the potential to improve on that, but I'm guessing its tonemapping algorithms still won't match how well optimized Dolby's tonemapping curves are. Simply put, if you want an image that looks as close as possible to what the director saw in the editing room, you want Dolby Vision, because its tonemapping will do a much better job of getting you there.
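As a deliberately crude illustration of the static-versus-dynamic-metadata point (the scene values and the 4000-nit MaxCLL below are made up, and real tone mapping is not a single compression ratio):

```python
DISPLAY_PEAK = 800.0
MOVIE_MAXCLL = 4000.0                               # hypothetical whole-film brightest pixel

scenes = {"night exterior": 350.0, "indoor dialog": 900.0, "desert chase": 4000.0}

def compression_needed(assumed_peak: float) -> float:
    """How much the top of the signal must be squeezed to fit the display."""
    return max(assumed_peak / DISPLAY_PEAK, 1.0)

for name, scene_peak in scenes.items():
    static = compression_needed(MOVIE_MAXCLL)       # one curve planned around the film's MaxCLL
    dynamic = compression_needed(scene_peak)        # curve planned around this scene's own peak
    print(f"{name:15s} static {static:.1f}x vs dynamic {dynamic:.1f}x compression")
```

With static metadata the dim scenes get squeezed as if they also contained 4000-nit highlights; per-scene metadata lets them use the display's full range.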

post #18 of 39 - 06-27-2019, 10:09 PM - DisplayCalNoob

Quote:
Originally Posted by morphinapg View Post

With 12 bits you will see a difference. By the time 12-bit panels arrive, displays will need to be able to hit 4000 nits calibrated. That is where you will start to see the difference that 12-bit color accompanied by a 12-bit panel makes.

post #19 of 39 - 06-27-2019, 10:27 PM - morphinapg

Quote:
Originally Posted by DisplayCalNoob View Post

The thing is, 10-bit PQ is already really close to the Barten limit, so I'm not entirely convinced you would. 12 bit is just beyond it. Perceptually, there may not be any visible difference for most people. Remember, PQ is encoded in a way where high nits and low nits are both coded relative to our perceptual capabilities. While yes, high nits get fewer bits, and so naturally you would think higher nits would benefit more, perceptually the code values are distributed in a way where that's not the case, because we're less sensitive at higher nits, so we don't need as many bits up there.
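You can see how PQ distributes its code values by computing the luminance step between adjacent codes at 10 bit versus 12 bit. The constants below are the published SMPTE ST 2084 ones; for simplicity this uses full-range codes rather than the narrower video range real signals use:

```python
# SMPTE ST 2084 (PQ) constants
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:                # luminance -> PQ signal (0..1)
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def pq_decode(signal: float) -> float:              # PQ signal (0..1) -> luminance in nits
    p = signal ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def step_nits(nits: float, bits: int) -> float:
    """Luminance gap between adjacent code values around a given level."""
    levels = 2 ** bits - 1
    code = round(pq_encode(nits) * levels)
    return pq_decode((code + 1) / levels) - pq_decode(code / levels)

for level in (1, 100, 1000, 4000):
    print(f"{level:>5} nits: 10-bit step {step_nits(level, 10):.3f} nits, "
          f"12-bit step {step_nits(level, 12):.3f} nits")
```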



post #20 of 39 - 06-28-2019, 01:22 PM - DisplayCalNoob

Quote:
Originally Posted by morphinapg View Post

If the majority of content today starts out as 12-bit DV deliverables for streaming, and the same for disc, then it's pretty likely that much of this content has colors that cannot be accurately mapped on 10-bit displays, especially content graded up to 4000 nits.

FEL does a pretty good job of taking advantage of the strengths of any DV-enabled display, but there is detail in a film like Mad Max: Fury Road, which has a MaxCLL of 10,000 nits, that we aren't seeing yet. 15-bit processing and a 12-bit panel capable of 4000 nits calibrated will be needed in order to see that level of detail, as well as proper contrast control that doesn't crush whites up to 10,000 nits.

post #21 of 39 - 06-28-2019, 01:55 PM - morphinapg

Quote:
Originally Posted by DisplayCalNoob View Post

While I agree 12bit will allow the source to be displayed more accurately, I don't necessarily think people will be able to tell. Robots and calibration devices, sure, lol.

But of course higher nit displays will see great benefit for content like that, regardless of bits.

post #22 of 39 - 07-13-2019, 09:03 AM - imabel
I recently watched an HDR10 and Dolby Vision comparison, and the 12-bit Dolby Vision showed much smoother gradation.
post #23 of 39 - 07-18-2019, 08:41 PM - tomvinelli

Quote:
Originally Posted by morphinapg View Post

That being said, and what you say sounds good, here is a video link that will prove DV is fake right now: https://youtu.be/kvh3geblZY8
post #24 of 39 - 07-18-2019, 08:47 PM - morphinapg

Quote:
Originally Posted by tomvinelli View Post

Yet another example of someone who clearly doesn't realize bit depth is the least important part of Dolby Vision

post #25 of 39 - 07-18-2019, 09:39 PM - DisplayCalNoob

Quote:
Originally Posted by tomvinelli View Post

You posted a link from the most uninformed YouTuber. He doesn't know anything.

post #26 of 39 - 07-21-2019, 10:35 AM - tomvinelli

Quote:
Originally Posted by morphinapg View Post

This link may clear up some things about Dolby Vision: https://www.youtube.com/watch?v=kvh3geblZY8&t=1s

Keep in mind a 12-bit panel does a few things: you get true Dolby Vision, and the bit depth is important because you get about 68.7 billion colors with a 12-bit panel. We're not seeing all those colors right now with a 10-bit panel. Anybody who says that's not important to seeing Dolby Vision doesn't get it at all.
post #27 of 39 - 07-21-2019, 11:16 AM - morphinapg

Quote:
Originally Posted by tomvinelli View Post

The colors aren't what makes Dolby Vision special; it's the dynamic smart tonemapping. The difference between 10 bit and 12 bit is barely perceptible to the human eye. Do you ever see banding in 10-bit HDR10 content? Probably not.

Also you literally just posted this exact same thing a few posts up.

post #28 of 39 - 07-21-2019, 01:23 PM - DisplayCalNoob

Quote:
Originally Posted by tomvinelli View Post

It's true, 12 bit produces way more color. But you continue to post the same link, from a YouTuber who doesn't know anything.

Why don't you ask sspears, a color scientist, if you really want to understand Dolby Vision and 12-bit color? He has a professional relationship with Dolby, Portrait Displays, film studios, and many display and Blu-ray player manufacturers.

post #29 of 39 - 07-21-2019, 02:03 PM - morphinapg
While it's true that there are a lot more colors, it's not as if most of that is a visible difference. If I made a TV that displayed colors in the infrared or ultraviolet spectrum, I could claim "wow, billions more colors," and I'd technically be right, but you're not seeing any of those new colors.

Our eyes have limitations, and 10 bit is very close to those limitations. Even if we could see the full 12-bit range, it would be an incredibly minor difference, something most people would never be able to notice. And because 10 bit is already so close to the eye's threshold, the actual difference is even smaller than that. It should also be clear that 10 bit vs 12 bit does not affect how wide the gamut is.

Ultimately, these are diminishing returns. 12 bit isn't the important part of DV; the dynamic metadata and smart tonemapping are, especially considering a lot of DV is actually only 10 bit anyway.

post #30 of 39 - 07-21-2019, 02:47 PM - DisplayCalNoob
The DV grade is always 12 bit; the encode is 10 bit: a 10-bit HDR10 base layer plus a 2-bit enhancement layer, combined at the player or display. Stacey says that the DV enhancement layer carries data from the mastering monitor, along with MaxCLL, MaxFALL, etc.

MEL and FEL come with different file sizes. WB seems to have really come into its own when it comes to its grades.
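As a very loose illustration of the base-plus-enhancement-layer idea (this is not Dolby's actual FEL reconstruction, which is proprietary and works on a coded residual layer, not literally the low two bits):

```python
def split(grade_12bit: int) -> tuple:
    """Split a 12-bit code into a 10-bit base code plus a 2-bit residual."""
    return grade_12bit >> 2, grade_12bit & 0b11

def reconstruct(base_10bit: int, residual_2bit: int) -> int:
    """Recombine base + residual back into the original 12-bit code."""
    return (base_10bit << 2) | residual_2bit

original = 3259                                     # an arbitrary 12-bit code from the grade
base, residual = split(original)
print(base, residual, reconstruct(base, residual) == original)   # 814 3 True
```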


