"Fake HDR": Does it look better than SDR content on an HDR TV? - AVS Forum | Home Theater Discussions And Reviews
post #1 of 11 Old 11-27-2019, 04:42 PM - Thread Starter
Member
 
Comp625's Avatar
 
Join Date: Nov 2007
Posts: 191
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 32 Post(s)
Liked: 14
"Fake HDR": Does it look better than SDR content on an HDR TV?

In reference to the recent news about Disney+ and The Mandalorian using Fake HDR flags, my brother-in-law says viewing "fake HDR" content on an HDR TV still looks better than viewing regular SDR content on the same HDR TV. He believes HDR TVs will handle this "fake HDR" content more accurately than SDR even though the dynamic range isn't there.

True or not true? Thoughts?
Comp625 is offline  
post #2 of 11 Old 11-27-2019, 07:43 PM
Advanced Member
 
SpeedDemon's Avatar
 
Join Date: Nov 2004
Location: Washington
Posts: 540
Mentioned: 4 Post(s)
Tagged: 0 Thread(s)
Quoted: 186 Post(s)
Liked: 102
I think your brother-in-law must like the brighter look of the picture but is missing the point of HDR. If Mando was graded in SDR then it should be flagged as SDR on D+ so an Apple TV will correctly play it back in SDR and let the TV/display use its calibrated SDR setting.

Playing back SDR in an HDR container results in an erroneous playback experience due to unpredictable gamma curves and dynamic tone mapping in the absence of DV metadata. I’m guessing someone at Disney incorrectly made this decision because they graded in DCI-P3 and wanted to master in Rec. 2020 instead of collapsing into Rec. 709.

This is an awkward time as the industry learns how to properly transition from SDR to HDR, but if creators want Rec. 2020 then they really need to go all in on HDR as well to ensure a consistent playback experience. After all, there’s no consumer standard for using Rec. 2020 without HDR.

Mando is SDR (in an HDR container) but its brightest elements are at 200-nits (they should be at 150-nits). This was just a bad decision no matter which way you look at it, and it’s what results in the washed-out appearance (elevated near-blacks).

With all that said, I think this could start a decent argument for using DV for 150-nit SDR masterings that take advantage of the expanded Rec. 2020 color space, because of the tone-mapping assistance provided by DV. Mando must look especially bad on non-DV HDR TVs running automated tone-mapping algos that expect 1500-nits... cough... Samsung... cough...
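For anyone curious what "SDR in an HDR container" actually means at the signal level, here is a rough Python sketch. It assumes a simple display-referred re-encode (plain 2.4 gamma SDR mapped to a chosen peak, then ST 2084/PQ); the 100-nit vs 200-nit peak is exactly the knob in question. This is only an illustration of the math, not Disney's actual mastering pipeline.

Code:
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> signal 0..1
def pq_encode(nits):
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

# Display-referred SDR: normalized signal -> nits, assuming a pure 2.4 gamma and black at 0
def sdr_to_nits(v, peak_nits):
    return peak_nits * max(v, 0.0) ** 2.4

for v in (0.25, 0.5, 0.75, 1.0):          # SDR code values (normalized)
    for peak in (100.0, 200.0):           # SDR white placed at 100 nits vs 200 nits
        nits = sdr_to_nits(v, peak)
        print(f"SDR {v:.2f} @ {peak:.0f}-nit white -> {nits:6.1f} nits -> PQ signal {pq_encode(nits):.3f}")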

SpeedDemon is offline  
post #3 of 11 Old 11-27-2019, 11:20 PM
AVS Forum Addicted Member
 
ray0414's Avatar
 
Join Date: Feb 2011
Location: michigan
Posts: 17,005
Mentioned: 266 Post(s)
Tagged: 0 Thread(s)
Quoted: 12806 Post(s)
Liked: 12149
There is more to HDR than just raw nits. A member over at bluray.com posted some comparison shots of the highlight detail improvement seen with the HDR grading, and it was quite a big difference. So there are still benefits. It may not be fake HDR, just HDR used differently.
StayingSalty likes this.

82Q90R*75Q9FN(RIP)*55C8OLED*Galaxy Note10+*Ub820 fed into Oppo 203*XB1X*4k DenonX4200

MASTER LIST OF HDR CONTENT THREAD HERE, UPDATED OFTEN
ray0414 is online now  
post #4 of 11 Old 11-28-2019, 04:12 AM
Advanced Member
 
EvLee's Avatar
 
Join Date: Dec 2009
Posts: 704
Mentioned: 7 Post(s)
Tagged: 0 Thread(s)
Quoted: 403 Post(s)
Liked: 413
Quote:
Originally Posted by Comp625 View Post
In reference to the recent news about Disney+ and The Mandalorian using Fake HDR flags, my brother-in-law says viewing "fake HDR" content on an HDR TV still looks better than viewing regular SDR content on the same HDR TV. He believes HDR TVs will handle this "fake HDR" content more accurately than SDR even though the dynamic range isn't there.

True or not true? Thoughts?
People pay a premium for Dolby Cinema in part because of the better image quality compared to regular digital cinema. Dolby Cinema has a peak luminance twice that of regular theaters. Similarly, the "fake HDR" people are talking about goes at least twice as high in peak luminance as regular SDR. That extra stop in exposure for highlights is enough to make a big improvement in the saturation of bright colors and add detail to the image. It's not going to provide the sharp specular highlights some people may expect from HDR, but it's certainly an improvement over SDR. As far as accuracy is concerned, it depends on the television of course. In the absence of metadata to direct the tonemapping, the televisions I am familiar with will track the PQ curve accurately at least up to 200 nits before they start applying any compression to the highlights. So as long as it is set up correctly, it should reproduce the image as accurately as your calibration.
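To put rough numbers on that extra stop: the PQ curve is absolute, so you can compute where a given luminance lands in the signal. A quick sketch using the ST 2084 reference constants (nothing TV-specific):

Code:
# SMPTE ST 2084 (PQ) inverse EOTF: nits -> signal 0..1
def pq_encode(nits):
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

# 100 nits = SDR reference white, 200 nits = the "fake HDR" peak (one stop up),
# 1000/4000 nits = common HDR mastering peaks
for nits in (100, 200, 1000, 4000):
    v = pq_encode(nits)
    print(f"{nits:5d} nits -> PQ signal {v:.3f} (10-bit full-range code ~{round(v * 1023)})")

A display that tracks the curve accurately up to 200 nits is reproducing roughly the bottom 58% of the PQ signal range exactly as mastered.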

Last edited by EvLee; 11-28-2019 at 04:17 AM.
EvLee is offline  
post #5 of 11 Old 11-28-2019, 04:35 AM
Advanced Member
 
EvLee's Avatar
 
Join Date: Dec 2009
Posts: 704
Mentioned: 7 Post(s)
Tagged: 0 Thread(s)
Quoted: 403 Post(s)
Liked: 413
Quote:
Originally Posted by SpeedDemon View Post
I think your brother-in-law must like the brighter look of the picture but is missing the point of HDR. If Mando was graded in SDR then it should be flagged as SDR on D+ so an Apple TV will correctly play it back in SDR and let the TV/display use its calibrated SDR setting.

Playing back SDR in an HDR container results in an erroneous playback experience due to unpredictable gamma curves and dynamic tone mapping in the absence of DV metadata. I’m guessing someone at Disney incorrectly made this decision because they graded in DCI-P3 and wanted to master in Rec. 2020 instead of collapsing into Rec. 709.

This is an awkward time as the industry learns how to properly transition from SDR to HDR, but if creators want Rec. 2020 then they really need to go all in on HDR as well to ensure a consistent playback experience. After all, there’s no consumer standard for using Rec. 2020 without HDR.

Mando is SDR (in an HDR container) but its brightest elements are at 200-nits (they should be at 150-nits). This was just a bad decision no matter which way you look at it, and it’s what results in the washed-out appearance (elevated near-blacks).

With all that said, I think this could start a decent argument for using DV for 150-nit SDR masterings that take advantage of the expanded Rec. 2020 color space, because of the tone-mapping assistance provided by DV. Mando must look especially bad on non-DV HDR TVs running automated tone-mapping algos that expect 1500-nits... cough... Samsung... cough...
The flaw in this argument is that even HDR movies that do utilize 1000+ nits don't stay that bright all the time. Having content in an HDR container that only goes to 200 nits is not by itself going to break the tonemapping, because then all other content would also break as soon as it dips down for a darker scene. If people are seeing elevated blacks, a washed-out appearance, whatever... that is for an entirely different reason, not because of the 200 nits max. The way DV works, it does scene-by-scene analysis and reports the max picture luminance within each scene through the metadata. As long as that metadata is generated correctly, the processing in the television can adapt to any range of content inside the HDR container. But even without DV, TVs like the Samsung should be adapting their tonecurve to the content for the same reason I stated earlier. No HDR content stays at the peak luminance for its entire duration, so any algorithm has to be capable of adjusting to lower peak luminance levels.
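To make the adaptation point concrete, here is a toy tone-curve sketch (not any manufacturer's actual algorithm): given a per-scene max luminance, whether it comes from DV metadata or from the TV's own content analysis, the curve only has to compress when the scene actually exceeds the panel.

Code:
def tone_map(nits, scene_max, display_peak):
    """Toy adaptive tone curve driven by per-scene max luminance."""
    if scene_max <= display_peak:
        return nits                             # nothing exceeds the panel: pass through 1:1
    knee = 0.75 * display_peak                  # track the curve accurately up to a knee...
    if nits <= knee:
        return nits
    t = (nits - knee) / (scene_max - knee)      # ...then roll [knee, scene_max] into [knee, peak]
    return knee + (display_peak - knee) * (2 * t) / (1 + t)

# Hypothetical 600-nit panel:
print(tone_map(200.0, scene_max=200.0, display_peak=600.0))    # 200.0 - a 200-nit-max scene is untouched
print(tone_map(200.0, scene_max=1000.0, display_peak=600.0))   # 200.0 - still below the knee
print(tone_map(1000.0, scene_max=1000.0, display_peak=600.0))  # 600.0 - a bright scene is rolled into the panel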
EvLee is offline  
post #6 of 11 Old 11-28-2019, 06:07 AM
Advanced Member
 
SpeedDemon's Avatar
 
Join Date: Nov 2004
Location: Washington
Posts: 540
Mentioned: 4 Post(s)
Tagged: 0 Thread(s)
Quoted: 186 Post(s)
Liked: 102
"Fake HDR": Does it look better than SDR content on an HDR TV?

Quote:
Originally Posted by EvLee View Post
The flaw in this argument is that even HDR movies that do utilize 1000+ nits don't stay that bright all the time. Having content in an HDR container that only goes to 200 nits is not by itself going to break the tonemapping, because then all other content would also break as soon as it dips down for a darker scene. If people are seeing elevated blacks, a washed-out appearance, whatever... that is for an entirely different reason, not because of the 200 nits max. The way DV works, it does scene-by-scene analysis and reports the max picture luminance within each scene through the metadata. As long as that metadata is generated correctly, the processing in the television can adapt to any range of content inside the HDR container. But even without DV, TVs like the Samsung should be adapting their tonecurve to the content for the same reason I stated earlier. No HDR content stays at the peak luminance for its entire duration, so any algorithm has to be capable of adjusting to lower peak luminance levels.
The problem isn’t simply caused by the 200-nit max of Mando; it’s that the 150-nit SDR grading of Mando was stretched to 200-nits.

As I stated, Mando would have likely done great and looked beautiful in DV if it were left at 150-nits.

The primary reason HDR10 would have a problem vs DV with 150-nit content/scenes is the way HDR tone-mapping algorithms will typically artificially brighten many scenes on most displays.
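To spell out what I mean by "artificially brighten", here is a purely hypothetical example of the kind of dynamic mapping I'm describing (real TVs are more sophisticated, but the effect is in the same direction):

Code:
def naive_dynamic_gain(scene_max_nits, display_peak_nits):
    # Hypothetical "stretch the scene to fill the panel" behaviour
    if scene_max_nits >= display_peak_nits:
        return 1.0                              # scene already fills or exceeds the panel
    return display_peak_nits / scene_max_nits

# A scene graded to top out at 150 nits, shown on a 600-nit panel:
print(naive_dynamic_gain(150.0, 600.0))         # 4.0 -> everything gets pushed ~4x brighter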


Last edited by SpeedDemon; 11-28-2019 at 06:21 AM.
SpeedDemon is offline  
post #7 of 11 Old 11-28-2019, 08:12 AM
Advanced Member
 
EvLee's Avatar
 
Join Date: Dec 2009
Posts: 704
Mentioned: 7 Post(s)
Tagged: 0 Thread(s)
Quoted: 403 Post(s)
Liked: 413
Quote:
Originally Posted by SpeedDemon View Post
The problem isn’t simply caused by the 200-nit max of Mando; it’s that the 150-nit SDR grading of Mando was stretched to 200-nits.

As I stated, Mando would have likely done great and looked beautiful in DV if it were left at 150-nits.

The primary reason HDR10 would have a problem vs DV with 150-nit content/scenes is the way HDR tone-mapping algorithms will typically artificially brighten many scenes on most displays.
That's certainly news to me, considering the televisions I have calibrated will take an HDR10 signal and display a nearly identical match to a reference mastering monitor. Maybe try disabling any additional HDR processing if you are seeing it pump the brightness up.

As for 150 nits... SDR doesn't get graded at 150 nits. It is done at 100 nits reference. Doesn't matter what studio or facility is doing the work, that's the standard everybody has to follow. It's also been shown with screen captures in other threads that the Mandalorian is not a stretched SDR grade. There is highlight information and color in the HDR image that simply does not exist in the SDR version.
SpeedDemon likes this.
EvLee is offline  
post #8 of 11 Old 11-28-2019, 10:21 AM
Advanced Member
 
SpeedDemon's Avatar
 
Join Date: Nov 2004
Location: Washington
Posts: 540
Mentioned: 4 Post(s)
Tagged: 0 Thread(s)
Quoted: 186 Post(s)
Liked: 102
Quote:
Originally Posted by EvLee View Post
That's certainly news to me, considering the televisions I have calibrated will take an HDR10 signal and display a nearly identical match to a reference mastering monitor. Maybe try disabling any additional HDR processing if you are seeing it pump the brightness up.



As for 150 nits... SDR doesn't get graded at 150 nits. It is done at 100 nits reference. Doesn't matter what studio or facility is doing the work, that's the standard everybody has to follow. It's also been shown with screen captures in other threads that the Mandalorian is not a stretched SDR grade. There is highlight information and color in the HDR image that simply does not exist in the SDR version.
You’re right. I meant to say 100-nits.

I agree that disabling the extra HDR10 processing for dynamic tone-mapping would fix some of the issues I described, but my experience has been that all proper HDR10 material benefits from this setting being enabled. I think this will remain true until we have displays capable of true 10k-nits of range.

With that said, certain TVs like Samsungs don’t follow a reference EOTF curve in HDR picture modes regardless, so they’re always going to throw things off, but that’s kind of a different problem.

That’s news to me that Mando isn’t stretched from 100 to 200 nits though. Any source for this?
EvLee likes this.

SpeedDemon is offline  
post #9 of 11 Old 11-28-2019, 03:16 PM
Advanced Member
 
EvLee's Avatar
 
Join Date: Dec 2009
Posts: 704
Mentioned: 7 Post(s)
Tagged: 0 Thread(s)
Quoted: 403 Post(s)
Liked: 413
Quote:
Originally Posted by SpeedDemon View Post
You’re right. I meant to say 100-nits.

I agree that disabling the extra HDR10 processing for dynamic tone-mapping would fix some of the issues I described, but my experience has been that all proper HDR10 material benefits from this setting being enabled. I think this will remain true until we have displays capable of true 10k-nits of range.

With that said, certain TVs like Samsungs don’t follow a reference EOTF curve in HDR picture modes regardless, so they’re always going to throw things off, but that’s kind of a different problem.

That’s news to me that Mando isn’t stretched from 100 to 200 nits though. Any source for this?
These screenshots were shared in the Disney+ thread.

http://www.framecompare.com/image-co...rison/DGWKGNNX
http://www.framecompare.com/image-co...rison/JE0FMNNU

If you compare the SDR to HDR you can see the HDR contains much more detail in the highlights, as well as additional color. Look inside the core of the flames. If the SDR had been stretched to make HDR it would be lacking all that information.

You may be right about the Samsungs. Most of the televisions I have set up have been LG, Sony or Panasonic. Samsung seems determined to do everything their own way.
SpeedDemon likes this.
EvLee is offline  
post #10 of 11 Old 11-28-2019, 09:36 PM
Advanced Member
 
SpeedDemon's Avatar
 
Join Date: Nov 2004
Location: Washington
Posts: 540
Mentioned: 4 Post(s)
Tagged: 0 Thread(s)
Quoted: 186 Post(s)
Liked: 102
Quote:
Originally Posted by EvLee View Post
These screenshots were shared in the Disney+ thread.

http://www.framecompare.com/image-co...rison/DGWKGNNX
http://www.framecompare.com/image-co...rison/JE0FMNNU

If you compare the SDR to HDR you can see the HDR contains much more detail in the highlights, as well as additional color. Look inside the core of the flames. If the SDR had been stretched to make HDR it would be lacking all that information.

You may be right about the Samsungs. Most of the televisions I have set up have been LG, Sony or Panasonic. Samsung seems determined to do everything their own way.
There's clearly a difference in those images as you point out, but do you think that it's possible that Disney simply didn't produce an SDR encode of Mando and that this is a result of tone mapping being performed by whatever device the D+ app is running on?
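For what it's worth, here is roughly what a device-side HDR-to-SDR down-conversion could look like; whether the player clips at 100 nits or rolls the highlights off is exactly what would decide how much of that flame detail survives. (Hypothetical sketch in nits, not the actual D+ player code.)

Code:
def to_sdr_clip(nits):
    # Hard clip at 100 nits: everything above SDR reference white flattens out
    return min(nits, 100.0)

def to_sdr_rolloff(nits, src_max=200.0, knee=80.0):
    # Soft knee: squeeze the 80-200 nit range into 80-100, keeping some highlight gradation
    if nits <= knee:
        return nits
    t = (min(nits, src_max) - knee) / (src_max - knee)
    return knee + (100.0 - knee) * (2 * t) / (1 + t)

for nits in (50.0, 120.0, 160.0, 200.0):
    print(f"{nits:5.0f} nits -> clip {to_sdr_clip(nits):5.1f} / rolloff {to_sdr_rolloff(nits):5.1f}")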

SpeedDemon is offline  
post #11 of 11 Old 11-30-2019, 08:54 AM
Advanced Member
 
SoNic67's Avatar
 
Join Date: Aug 2005
Posts: 725
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 7 Post(s)
Liked: 30
This is a re-hash of the old discussion about audio MP3 files and the various software solutions that will "enhance" them to sound "like" CD quality. I personally never found that to actually make a positive difference.
Same here: if the program was not recorded in HDR (bit depth), then something that fakes HDR will not look better. It might look "different" due to mapping, but... gamma adjusting could do the same.
SoNic67 is offline  