Should I be seeing such a huge difference between Dolby Vision and HDR10? - AVS Forum | Home Theater Discussions And Reviews
post #1 of 10 Old 11-17-2019, 10:08 PM - Thread Starter
Member
 
Join Date: Oct 2019
Posts: 29
Question: Should I be seeing such a huge difference between Dolby Vision and HDR10?

I purchased the movie Christine (4K) from Movies Anywhere. It propagated to Prime Video (standard UHD), VUDU (Dolby Vision), and directly to the Movies Anywhere app (HDR10). I am seeing HUGE differences between the formats. In the Movies Anywhere app on HDR10, the detail is acceptable, but there is a ton of noise. I'm assuming it's film grain, but it's very noticeable and sometimes distracts from the scene, with noticeable patches or splotches of heavier noise. On VUDU, the format is Dolby Vision. This version looks amazing, with great shadows and highlights and very little or no noise/grain. The Prime Video version looks washed out and hazy, but I would expect that, since it's not in an HDR format.

Anyway, my question is: should there be such a noticeable difference between Dolby Vision and HDR10? I will have to watch the movie in its entirety in both formats and form an overall opinion of which I prefer. At the moment, I'm leaning toward Dolby Vision.
cam94zee is offline  
post #2 of 10 Old 11-17-2019, 10:12 PM
Member
 
Join Date: Sep 2017
Location: Detroit MI
Posts: 48
I have a Samsung, so they only provide HDR10, no Dolby Vision.

The guy at Best Buy tried to convince me Dolby Vision is “better”.

I looked at the picture side by side and it looked exactly the same.

Some people are really technical about that stuff, but I don’t care about 2% more color and contrast.

I say enjoy what you like; don’t worry about what others think is best. If it works for you, great.


Sent from my iPhone using Tapatalk Pro
sshuttari is offline  
post #3 of 10 Old 11-18-2019, 03:09 PM
AVS Forum Special Member
 
Join Date: Apr 2007
Posts: 1,108
Best way to compare is on a physical disc. Not the streaming services!


Sent from my iPhone using Tapatalk
Elisha is offline  
post #4 of 10 Old 11-18-2019, 07:24 PM
AVS Forum Addicted Member
 
Join Date: Feb 2011
Location: michigan
Posts: 17,070
Quote:
Originally Posted by cam94zee View Post
On a lower-end TV, Dolby Vision will definitely look better. The higher up you go, the smaller the differences are.

82Q90R*75Q9FN(RIP)*55C8OLED*Galaxy Note10+*Ub820 fed into Oppo 203*XB1X*4k DenonX4200

MASTER LIST OF HDR CONTENT THREAD HERE, UPDATED OFTEN
ray0414 is offline  
post #5 of 10 Old 11-19-2019, 10:01 AM - Thread Starter
Member
 
Join Date: Oct 2019
Posts: 29
Quote:
Originally Posted by Elisha View Post
Best way to compare is on a physical disc. Not the streaming services!


I haven't been able to find a way to switch between HDR formats. Basically if Dolby Vision is available my TV selects it. Otherwise it uses HDR10. Then again, I am watching my physical discs on an Xbox One S, which has very few video options.
cam94zee is offline  
post #6 of 10 Old 11-19-2019, 04:49 PM
AVS Forum Special Member
 
Join Date: Apr 2007
Posts: 1,108
Quote:
Originally Posted by cam94zee View Post
I haven't been able to find a way to switch between HDR formats. Basically if Dolby Vision is available my TV selects it. Otherwise it uses HDR10. Then again, I am watching my physical discs on an Xbox One S, which has very few video options.


Like Ray said, on low-end TVs and low-nit TVs, DV will look better. But if your TV can do more than 1,000 nits, you won't be able to tell much of a difference.
On my KS8000, good HDR content is so brilliant and the highlights are blinding!


Sent from my iPhone using Tapatalk
Elisha is offline  
post #7 of 10 Old 11-19-2019, 06:12 PM - Thread Starter
Member
 
Join Date: Oct 2019
Posts: 29
Quote:
Originally Posted by Elisha View Post
Like Ray said, on low-end TVs and low-nit TVs, DV will look better. But if your TV can do more than 1,000 nits, you won't be able to tell much of a difference.
On my KS8000, good HDR content is so brilliant and the highlights are blinding!


My TV would definitely fall into the "below 1,000 nits" category. The only review I found of my actual model (PCMag) gave it a mediocre contrast ratio. The Rtings review for my model (Vizio M507) is actually of the next series up (M508), and even that set only gets a "decent" HDR peak brightness score, despite being rated 200 nits higher than the M507. So you are probably correct. I will likely be returning this model during the Costco return period. There really don't seem to be many affordable options out there in the 50" size with decent brightness and features. Unfortunately, I am limited to this size.
cam94zee is offline  
post #8 of 10 Old 11-27-2019, 07:50 PM
Advanced Member
 
Join Date: Nov 2004
Location: Washington
Posts: 544
Quote:
Originally Posted by cam94zee View Post
What you're describing has everything to do with the fact that each streaming service often has its own, different encode of the movie.

The grain, detail, and noise you're describing have little or nothing to do with the HDR format.

With that said, there can be a significant difference between DV and HDR10, and the easiest way to observe it is to play back a DV video on a player that allows you to toggle off the DV metadata and watch with just HDR10... Oppo players let you do this.

DV should have more natural brightness/contrast/gamma since the metadata allows the image to be optimally tone-mapped to your display as the mastering engineer intended.
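
To make that concrete, here's a rough Python sketch of the idea (my own toy numbers and a generic extended-Reinhard roll-off, not Dolby's actual tone-mapping math, which is proprietary). With static HDR10 metadata the TV tone-maps every scene against the title-wide peak; per-scene dynamic metadata lets a dim scene pass through with far less compression. The 600-nit display, 4000-nit master, and 300-nit scene peak are all made-up example values.

[CODE]
# Toy comparison of static vs. per-scene ("dynamic") metadata for tone mapping.
# Generic extended-Reinhard roll-off, NOT Dolby's actual algorithm.

def tone_map(pixel_nits, source_peak_nits, display_peak_nits=600.0):
    """Near-linear in the darks; maps source_peak_nits exactly to display_peak_nits."""
    if source_peak_nits <= display_peak_nits:
        return pixel_nits                       # whole scene fits the display: pass through
    l = pixel_nits / display_peak_nits
    lw = source_peak_nits / display_peak_nits
    return display_peak_nits * l * (1.0 + l / lw ** 2) / (1.0 + l)

# A dim interior scene peaking at 300 nits, inside a title mastered to 4000 nits,
# shown on a hypothetical 600-nit TV:
pixel = 250.0
static = tone_map(pixel, source_peak_nits=4000.0)   # keyed to the title-wide peak
dynamic = tone_map(pixel, source_peak_nits=300.0)   # keyed to this scene's own peak
print(f"250-nit pixel -> static metadata: {static:.0f} nits, dynamic metadata: {dynamic:.0f} nits")
[/CODE]

With the title-wide peak, the 250-nit pixel gets pulled down to roughly 178 nits to leave headroom for highlights that never arrive in that scene; with the scene's own peak it passes through untouched.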
Adamg (Ret-Navy) likes this.

SpeedDemon is offline  
post #9 of 10 Old 01-07-2020, 07:45 AM
Senior Member
 
Join Date: Mar 2019
Posts: 428
TL;DR: Higher bit depth (e.g. the 12-bit of Dolby Vision vs. the 10-bit of HDR10/HDR10+) is much better, period. It's better for current budget displays (when combined with dynamic metadata), it's better for current flagship displays, and it's better for future-proofing -- since 4K -> 8K upscaling will work better, and the benefit will be even larger if/when we get extremely high dynamic range microLED TVs.

Quote:
Originally Posted by SpeedDemon View Post
While I don't disagree with your assessment here, I would point out that 12-bit is a distinct improvement over 10-bit, one that will actually show up even on 10-bit (or lesser, for that matter) displays, for reasons that are perhaps somewhat subtle. This could manifest not only as differences in posterization, but also in 'grain' and 'noise', depending on how the 10-bit content was mastered (specifically, to what extent dithering is encoded into the content).

While in theory there would be nothing gained from 12-bit content on a 10-bit display if the content were simply presented as-is, this is not the case in the real world, because each TV has a different subpixel structure, brightness/contrast/sharpness capability, calibration, and core display technology (e.g. FALD LCD vs. OLED). As a result, the input content must be mapped somehow to the TV's capabilities, and the unique limitations of the TV must be factored in as well -- which is always the case, even with HDR10, at least given the current state of the art in consumer TVs.

I hope it's well known that 10 bits is NOT enough precision for much HDR content out there (e.g. posterization in sky gradients, extreme dark scenes, etc.), and that this must be mitigated with various tricks (e.g. dithering), either during mastering or by the TV as it retargets the content from 12-bit to 10-bit (or, in some cases, as with 'smooth gradation' filters, tries to intelligently figure out where quantization is a problem and to recover what was lost during the mastering process).
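
If anyone wants to see where that comes from, here's a quick numerical sketch of mine: compute how big the jump in luminance is between adjacent PQ code values at 10 bits vs. 12 bits. The EOTF and its constants are from SMPTE ST 2084 (the PQ curve HDR10 and Dolby Vision both use); everything else is just illustration.

[CODE]
# How big is the luminance step between adjacent PQ code values at 10 vs. 12 bits?
# Big steps across a slow gradient are what you perceive as banding/posterization.
import numpy as np

# SMPTE ST 2084 (PQ) constants
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(code, bits):
    """Integer code value at the given bit depth -> absolute luminance in nits."""
    e = np.asarray(code, dtype=float) / (2 ** bits - 1)
    p = e ** (1 / m2)
    return 10000.0 * (np.maximum(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for bits in (10, 12):
    nits = pq_eotf(np.arange(2 ** bits), bits)
    steps = np.diff(nits)                        # luminance jump per code value
    near_1000 = np.argmin(np.abs(nits[:-1] - 1000.0))
    print(f"{bits}-bit: step near 1000 nits ~ {steps[near_1000]:.2f} nits, "
          f"largest step ~ {steps.max():.1f} nits (at the top of the range)")
[/CODE]

The 10-bit steps come out roughly four times the size of the 12-bit ones, and that's the gap dithering has to paper over.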

Take dithering, for example: either approach (applied when mastering a 10-bit video, vs. applied dynamically by the TV's processor to map a 12-bit master to a 10-bit panel) works well, but the problem with applying dithering during mastering is that you have now permanently baked your set of tradeoffs into the content, and the parameters you chose for dithering will not (and cannot) be ideal for all TVs, because when nonlinear operations like tone mapping, gamma curves, etc. are applied on top, the dithering no longer 'works' the way it's supposed to. In contrast, when the dithering is applied by the TV's internal processor (e.g. when rendering 12-bit content to a 10-bit panel), the best dithering algorithm for that particular TV can be chosen, and it can be applied after tone mapping, gamma correction, and color calibration. And in the future, when native 12-bit-or-better panels are available, perhaps no additional dithering will need to be applied at all.
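
Here's a tiny sketch of that too (again my own toy example, with made-up gradient values): quantize a shallow 12-bit ramp down to 10 bits, with and without TPDF dither, and look at how long the runs of identical output codes get. Long flat runs are exactly what reads as visible banding on screen.

[CODE]
# Quantize a shallow 12-bit gradient down to 10 bits, with and without dither.
# Without dither you get long flat runs of a single output code (bands); with
# dither the bands break up into fine noise that averages back to the gradient.
import numpy as np

rng = np.random.default_rng(0)
ramp12 = np.linspace(100.0, 228.0, 4096)         # ~128 twelve-bit codes across 4096 pixels

def quantize_to_10bit(x, dither=False):
    x = x / 4.0                                  # rescale 12-bit code values to the 10-bit scale
    if dither:
        # +/- 1 LSB triangular-PDF dither at the target (10-bit) depth
        x = x + (rng.random(x.shape) - rng.random(x.shape))
    return np.clip(np.round(x), 0, 1023).astype(int)

def longest_flat_run(codes):
    best = run = 1
    for prev, cur in zip(codes[:-1], codes[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

for use_dither in (False, True):
    out = quantize_to_10bit(ramp12, dither=use_dither)
    print(f"dither={use_dither}: longest run of a single output code = {longest_flat_run(out)} pixels")
[/CODE]

The +/- 1 LSB TPDF dither is just the textbook choice here; a real TV or mastering tool may well use shaped noise instead.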

So in this sense, 10 bits of precision could very well cause EITHER more grain or more posterization artifacts than 12 bits -- even when the content is being rendered to 10-bit panels!

Also, this is NOT simply a question of Dolby Vision being better just for the less capable TVs of today. The extra bit depth will enable older content to take advantage of future TVs with greater dynamic range, and more than that, it will also yield much better 4K -> 8K upscaling. Again, consider dithering as an example: if dithering is encoded into a 4K 10-bit signal to achieve an effective bit depth of 12 bits on 10-bit panels, that content upscaled to 8K will most likely still carry the 4K-resolution dithering pattern (which will be much more visible as noise), versus the ideal of the TV performing dithering internally at the native 8K resolution *after* upscaling.
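
A quick 1-D sketch of why that matters (my own toy example, using plain 2x nearest-neighbour scaling): dither that is baked in before upscaling comes out as a coarser, two-pixel-scale pattern, so more of its noise power lands at low spatial frequencies, which are the ones that stay visible at normal viewing distances.

[CODE]
# Dither-then-upscale vs. upscale-then-dither, in 1-D with 2x nearest-neighbour scaling.
# Measure how much of the resulting noise power sits in the lower half of the spectrum,
# where it is coarser and easier to see.
import numpy as np

rng = np.random.default_rng(2)
smooth = np.linspace(0.30, 0.34, 2048)           # the "true" shallow gradient, 0..1 signal range

def dither_and_quantize_10bit(x):
    x = x + (rng.random(x.size) - rng.random(x.size)) / 1023.0   # +/- 1 LSB TPDF dither
    return np.round(np.clip(x, 0.0, 1.0) * 1023.0) / 1023.0

def upscale_2x(x):
    return np.repeat(x, 2)                       # nearest-neighbour doubling

baked = upscale_2x(dither_and_quantize_10bit(smooth))    # dither at low res, then upscale
native = dither_and_quantize_10bit(upscale_2x(smooth))   # upscale, then dither at full res

reference = upscale_2x(smooth)
for name, signal in (("dither baked in before upscaling", baked),
                     ("dither applied after upscaling", native)):
    spectrum = np.abs(np.fft.rfft(signal - reference)) ** 2
    low_share = spectrum[: spectrum.size // 2].sum() / spectrum.sum()
    print(f"{name}: {100 * low_share:.0f}% of the noise power sits below half Nyquist")
[/CODE]

A real 8K TV would use a fancier scaler than np.repeat, but the frequency argument is the same: noise created at the lower resolution cannot end up only at the finest scale of the higher one.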

In theory it is possible to retarget dithering across resolutions via “AI” (deep convolutional neural networks), but that is much more computationally expensive AND less accurate (e.g. likely to accidentally smooth over genuine scene detail). Ultimately, you can't circumvent information theory: encoding dithering trades a bit of effective spatial resolution for better effective color and luminance dynamic range. Encoding more bits per channel (when this is appropriate to the 'ground truth' of the scene) is therefore better for old, current, and future TVs alike.

Neumann KH310A | Neumann KH120A | Ascend Sierra Towers & Horizon (RAAL) | Ascend Sierra 2EX | Revel F206 | Rythmik F18 x2 | Rythmik F12 | JL Audio E112 | SMSL M500 | Topping D10 | Sonos Amp | Marantz SR7012

Last edited by echopraxia; 01-07-2020 at 08:26 AM.
echopraxia is offline  
post #10 of 10 Old 01-07-2020, 08:34 AM
Senior Member
 
Join Date: Mar 2019
Posts: 428
Also, here's another reason why 12-bit content could actually look significantly better than 10-bit content when compressed to the exact same bit rate:

Counterintuitive as this may seem, lower-bit-depth content (with dithering) will actually compress less efficiently than higher-bit-depth content (without dithering)! Imagine your source master is a 16-bit-per-channel uncompressed HDR video. Now imagine you encode it into a 10-bit-per-channel uncompressed signal with dithering to reduce posterization artifacts, then compress to a fixed file size. The dithering adds many random high-frequency components to the signal, which are inherently much more difficult to compress with the discrete cosine/sine transforms used in modern video codecs (e.g. H.264, H.265) than the smooth gradient that would appear in an undithered high-bit-depth signal. The latter could be encoded nearly losslessly with only a few DCT coefficients, whereas the former requires the encoding bits to be spread almost evenly throughout just to recover the dithering pattern without terrible compression artifacts.

As a result, the compressed 12-bit (undithered) video could actually suffer from fewer compression artifacts and deliver potentially much greater visual quality overall, because encoding bits can be freed up from struggling to encode the dithering pattern on smooth gradients and reallocated elsewhere in the scene, to genuinely complex textures.
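
Here's a toy version of that argument (my own sketch of the transform-coding intuition, not a claim about what any particular H.264/HEVC encoder actually does): take a smooth gradient, DCT it, and compare how many coefficients you need before and after dithered 10-bit quantization. The gradient values and the 99.9% energy cut-off are arbitrary illustration choices.

[CODE]
# Transform-coding intuition only; real encoders are far more sophisticated.
# A smooth gradient needs only a handful of DCT coefficients; baked-in dither
# spreads energy across many high-frequency coefficients the encoder must then spend bits on.
import numpy as np
from scipy.fft import dct

rng = np.random.default_rng(1)
n = 64
smooth = np.linspace(0.30, 0.34, n)                   # gentle gradient in a 0..1 signal range

master12 = np.round(smooth * 4095) / 4095             # "12-bit master": fine quantization, no dither
tpdf = (rng.random(n) - rng.random(n)) / 1023         # +/- 1 LSB dither at 10 bits
delivery10 = np.round((smooth + tpdf) * 1023) / 1023  # "10-bit delivery" with dither baked in

for name, signal in (("12-bit, undithered", master12), ("10-bit, dithered", delivery10)):
    coeffs = dct(signal - signal.mean(), norm="ortho")
    energy = np.cumsum(np.sort(coeffs ** 2)[::-1])    # energy captured by the k largest coefficients
    k = int(np.searchsorted(energy, 0.999 * energy[-1])) + 1
    print(f"{name}: {k} DCT coefficients hold 99.9% of the AC energy")
[/CODE]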

Disclaimer: this is all theory (I haven't explicitly tested/confirmed it), from my rusty memory of HEVC (H.265) from many years ago, when I did some research work on it long before the standard was finalized. The only situation I can imagine where a dithered 10-bit signal would compress as well as an undithered 12-bit one is if the encoder were capable of somehow intelligently reversing the dithering in the 10-bit input signal prior to compression, with the decoder then re-dithering afterwards when quantizing to the target bit depth. However, reverse dithering is not going to be a lossless process in any case (it will lose spatial resolution), so even if some encoders have this feature, I'm not sure it would be in standard use (or something you'd ever really want to use, to be honest).

I also don't know the technical details about the container formats used for HDR10/HDR10+/Dolby Vision. Strangely, it seems quite hard to find any specs at all online. I'm curious now.

Neumann KH310A | Neumann KH120A | Ascend Sierra Towers & Horizon (RAAL) | Ascend Sierra 2EX | Revel F206 | Rythmik F18 x2 | Rythmik F12 | JL Audio E112 | SMSL M500 | Topping D10 | Sonos Amp | Marantz SR7012

Last edited by echopraxia; 01-07-2020 at 09:56 AM.
echopraxia is offline  

Tags
christine , dolby vision , hdr10 , noise
