AVS Forum | Home Theater Discussions And Reviews (https://www.avsforum.com/forum/)
-   High Dynamic Range (HDR) & Wide Color Gamut (WCG) (https://www.avsforum.com/forum/465-high-dynamic-range-hdr-wide-color-gamut-wcg/)
-   -   Joe Kane at Samsung QLED/HDR10 Summit (https://www.avsforum.com/forum/465-high-dynamic-range-hdr-wide-color-gamut-wcg/2895289-joe-kane-samsung-qled-hdr10-summit.html)

Scott Wilkinson 07-12-2017 08:35 PM

Joe Kane at Samsung QLED/HDR10 Summit
 
Video guru Joe Kane offered his vision for the future of UHD—multiple resolutions, gamuts, grayscales, peak luminances, and EOTFs, all derived from a single master file.

https://www.avsforum.com/joe-kane-sam...dhdr10-summit/

EvLee 07-12-2017 09:52 PM

IMO, the single master file isn't a particularly good idea creatively, but it's a great idea to sell to studios and distributors, since it promises to reduce inventory management. We already use half float as the intermediate format in post-production, but once you get to grading you are making creative choices that may diverge for different delivery formats. For example, if you are going to IMAX, you as a paying customer want the full IMAX experience, so creatively they will push the image further for that venue, because the audience expects you to showcase the capabilities of the format. Similarly with HDR: even the reviews here on AVS Forum focus first on peak luminance and absolute black, although I will hand it to the reviewers for recognizing that some material just doesn't make sense to push into the extremes. Anyway, if you care about your movie you will spend time making it look good in each format, and I think the difference between SDR and HDR is simply too great to bridge without introducing localized tone mapping, but nobody really wants to implement that.

Other problems... well, there is no single display that outperforms every other class of display, so there is no way to QC a universal master and actually SEE everything. I think this reference-monitor problem may also have been touched on in one of the discussion panels that day? There is also a glaring conceptual problem with the idea of everything being capture-device-referred: for the very obvious case of animation there is no capture device, and that problem also extends to stylized live action, like say Sin City, where the image you are creating is not actually a reproduction of a real-world scene.

Now as for making more displays available to consumers that are easy to calibrate in the same way as a DCI projector, that would be pretty awesome.

Tom Roper 07-12-2017 10:52 PM

Quote:

Originally Posted by Scott Wilkinson (Post 54238097)
Video guru Joe Kane offered his vision for the future of UHD—multiple resolutions, gamuts, grayscales, peak luminances, and EOTFs, all derived from a single master file.

But Joe's vision stems from the present reality: HDR is a jumbled mess of display-targeted parameters strung together by metadata. Log gamma is a much more sensible solution. It's good enough for acquisition. It's good enough for display, and it has the benefit of returning to the user a reasonable amount of picture adjustment control.
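For reference, the log-gamma curve in question is simple to write down: a square-root segment at the bottom of the range (which is what keeps it roughly SDR-compatible) and a logarithmic segment for the highlights. A minimal Python sketch of the HLG OETF, using the constants published in ITU-R BT.2100:

```python
import math

# ITU-R BT.2100 HLG OETF constants
A = 0.17883277
B = 1 - 4 * A                  # 0.28466892
C = 0.5 - A * math.log(4 * A)  # 0.55991073

def hlg_oetf(e):
    """Map normalized scene-linear light e in [0, 1] to an HLG signal in [0, 1]."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)              # square-root segment (SDR-like region)
    return A * math.log(12 * e - B) + C      # logarithmic segment for highlights

# The two segments meet continuously at e = 1/12, where the signal is 0.5
print(hlg_oetf(1 / 12))  # 0.5
print(hlg_oetf(1.0))     # ~1.0
```

Note that the curve is scene-referred, not display-referred: the display applies its own system gamma for its peak luminance and viewing environment, which is exactly where the user's adjustment headroom comes from.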

Chief Technician 07-13-2017 06:02 AM

Which Problem Do You Want To Solve?
 
Quote:

Originally Posted by Tom Roper (Post 54239825)
But Joe's vision stems from the present reality: HDR is a jumbled mess of display-targeted parameters strung together by metadata.

Everything in ATSC 3.0 is going to be dependent upon metadata.
Quote:

Originally Posted by Tom Roper (Post 54239825)
Log gamma is a much more sensible solution.

Do you mean Hybrid Log Gamma?
Quote:

Originally Posted by Tom Roper (Post 54239825)
It's good enough for acquisition.

Log is good enough for acquisition if it is used by people who know what they are doing.
Quote:

Originally Posted by Tom Roper (Post 54239825)
It's good enough for display and has the benefit of returning to the user a reasonable amount of picture adjustment control.

Which problem do you want to solve? Think about it. In both SD (NTSC) and HD (ATSC 1.0), the artistic intent of the video can only be realized on a properly calibrated display. I work at a post-production facility, and we have had our displays calibrated several times over the years. When a client asks us, "Is that what my video will look like?", our response is, "If the viewer's display is calibrated, then yes, that is what it will look like." The use of metadata and a display-referenced system solves this problem. Unless the viewer has either messed with their display's settings or has an inferior display, metadata should allow the viewer to see what the content creators intended.

HLG was the first HDR solution that was usable in live production; Dolby Vision has since caught up. HLG is not display-referenced. If you want to experience what I described in the previous paragraph (relying on viewer calibration) in the era of ATSC 3.0, then HLG is the way to go. I think content creators would rather not have to deal with that. Only people who know what they are doing (most readers here) need the ability to calibrate their displays. Even then, it may not be necessary in a properly designed display-referenced system.

I understand that a single deliverable may require much more metadata if you want it to be compatible with everything without backwards compatibility compromises. Such a compromise might be "Don't push the HDR too hard, otherwise the SDR conversion will be clipped." This problem, a single deliverable that is compatible with SDR and HDR while not compromising what can be done with both HDR and SDR, is something that hopefully SMPTE and EBU will be able to solve.

I think it all goes back to the question I posited earlier. Which problem do you want to solve?

RLBURNSIDE 07-13-2017 12:38 PM

Half float for distribution to consumers? When 12-bit PQ can deliver 100% banding-free HDR from 0 to 10,000 nits in Rec. 2020? I think not. That's 12 extra bits per pixel wasted for nothing (48 vs. 36).

EXR/FP16 is good for the mastering studio; that's it. Full stop. Unless new codecs support it directly and achieve better quality per unit of bandwidth than 12-bit PQ (not likely, but anything's possible).

Shipping EXR to end users is worthless; it confers literally zero benefit. The only reason FP16 is still used in games is for HDR rendering: because it is linear, blending of light just "works," and there are no native PQ-formatted render targets for PQ-correct rendering and blending (the way gamma-correctness worked for 8-bit render targets and texture formats). YET. But there could be. We need native PQ hardware support for textures anyway, so we might as well have it for render targets too (in fact, I think one implies the other).
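The "blending just works in linear" point is easy to demonstrate: averaging two pixels in a gamma-encoded space gives a visibly darker result than averaging their linear-light values. A small Python sketch using the standard sRGB transfer functions, with a hypothetical 50/50 blend of black and white:

```python
def srgb_to_linear(c):
    """sRGB-encoded value in [0, 1] -> linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Linear light in [0, 1] -> sRGB-encoded value."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

black, white = 0.0, 1.0

# Naive blend in the encoded (gamma) domain -- too dark:
gamma_blend = (black + white) / 2                                   # 0.5

# Physically correct blend in linear light, re-encoded for display:
linear_blend = linear_to_srgb(
    (srgb_to_linear(black) + srgb_to_linear(white)) / 2
)                                                                   # ~0.735

print(gamma_blend, linear_blend)
```

FP16 render targets make the linear path essentially free in games; a native PQ render-target format would need this same kind of decode/blend/re-encode wired into the blending hardware.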

And 4:4:4 is already supported over HDMI, unless he means in the compressed video stream itself? Again, NO.

This would double the bandwidth and bitrate budget for video with very little gain in sharpness. If you have 2x the bandwidth to spare, you're MUCH better off either increasing the bitrate or increasing the luma resolution by √2 per axis. This is like comparing 720p in 4:4:4 with 1080p in 4:2:0. Which do you think looks better? 1080p in 4:2:0. Duh.
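The arithmetic behind that comparison is just sample counting: one luma plane plus two chroma planes, with the chroma planes subsampled or not. A quick Python sketch:

```python
def samples_per_frame(width, height, subsampling):
    """Total luma + chroma samples per frame for a given chroma subsampling."""
    sub_x, sub_y = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}[subsampling]
    return width * height + 2 * (width // sub_x) * (height // sub_y)

# Roughly the same raw-sample budget, very different luma resolution:
print(samples_per_frame(1280, 720, "4:4:4"))   # 2,764,800 samples
print(samples_per_frame(1920, 1080, "4:2:0"))  # 3,110,400 samples, 2.25x the luma detail
```

At a fixed resolution, going from 4:2:0 to 4:4:4 exactly doubles the raw sample count, which is where the "double the bandwidth" figure comes from.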

stef2 07-13-2017 01:15 PM

It is always good to read about people who perform extremely well in their own domain. Go Joe!

Tom Roper 07-16-2017 10:04 PM

Quote:

Originally Posted by Chief Technician (Post 54242841)
Which problem do you want to solve? Think about it. In both SD (NTSC) and HD (ATSC 1.0), the artistic intent of the video can only be realized on a properly calibrated display. I work at a post-production facility, and we have had our displays calibrated several times over the years. When a client asks us, "Is that what my video will look like?", our response is, "If the viewer's display is calibrated, then yes, that is what it will look like." The use of metadata and a display-referenced system solves this problem. Unless the viewer has either messed with their display's settings or has an inferior display, metadata should allow the viewer to see what the content creators intended.

The problems I want to see solved: for the colorist, the multiple trim passes; for the user, not having an appropriate display gamma for his viewing circumstances; and for the broadcasters, who have enough trouble with metadata already just keeping audio in sync, the equipment needed to support metadata throughout the broadcast chain, all of which taken together reduces the amount of HDR content available. All of those problems are solved by HLG.

As for the director's intent, it's not sacred once it leaves the confines of the theatrical presentation. It may be the director's intent for the movie to be seen in a darkened theater, but the home enthusiast may want to watch in a bright room. And I'd feel differently about a display-referenced EOTF if it were consistently adhered to, but it's not. If the director intended it to be viewed at 100 cd/m^2 on the silver screen but Dolby Vision tone maps it to 1000 for an LED display, how is that maintaining the director's intent? Even in the answer you give your clients about how a video will look, you must make disclaimers about a calibrated display and non-messed-with settings. So as you can see, it's a step toward consolidating control away from the user that he may not want consolidated, particularly when these topics share common complaints about HDR being too bright or too dark.

That said, I respect the well-reasoned and helpful tone of your posts. There is much (maybe all) I agree with, just not the need to continue down the slippery slopes of multiple trim passes, metadata, evolving HDMI specs, and tone mapping as justification for taking away from the user the adjustment knob that has served well since the dawn.

Chief Technician 07-19-2017 06:14 AM

99 Problems
 
Quote:

Originally Posted by Tom Roper (Post 54330641)
The problems I want to see solved: for the colorist, the multiple trim passes; for the user, not having an appropriate display gamma for his viewing circumstances; and for the broadcasters, who have enough trouble with metadata already just keeping audio in sync, the equipment needed to support metadata throughout the broadcast chain, all of which taken together reduces the amount of HDR content available. All of those problems are solved by HLG.

I have plenty of experience in post with keeping audio in sync; I moderated sessions on the topic at several AES conventions. If broadcasters can keep their audio in sync up to their emission (transmission) point, then the Presentation Time Stamp (PTS) in the MPEG-2 stream is supposed to be used to maintain sync from there. Unfortunately, this is not a requirement for those implementing MPEG-2 decoders, and in the effort to keep costs down (by pennies), this functionality is often not implemented. I agree that there are some audio sync situations that are clearly network related. In most cases, the broadcasters are not the problem.

Quote:

Originally Posted by Tom Roper (Post 54330641)
As for the director's intent, it's not sacred once it leaves the confines of the theatrical presentation. It may be the director's intent for the movie to be seen in a darkened theater, but the home enthusiast may want to watch in a bright room. And I'd feel differently about a display-referenced EOTF if it were consistently adhered to, but it's not. If the director intended it to be viewed at 100 cd/m^2 on the silver screen but Dolby Vision tone maps it to 1000 for an LED display, how is that maintaining the director's intent? Even in the answer you give your clients about how a video will look, you must make disclaimers about a calibrated display and non-messed-with settings. So as you can see, it's a step toward consolidating control away from the user that he may not want consolidated, particularly when these topics share common complaints about HDR being too bright or too dark.

The moment you scale anything (SDR-originated content --> Dolby Vision), one can question artistic intent. I understand your viewpoint. Either the process (scaling) is good or it is not. If we really wanted to maintain artistic intent on an extreme scale, SD content would only be viewable on CRTs.

Quote:

Originally Posted by Tom Roper (Post 54330641)
That said, I respect the well-reasoned and helpful tone of your posts. There is much (maybe all) I agree with, just not the need to continue down the slippery slopes of multiple trim passes, metadata, evolving HDMI specs, and tone mapping as justification for taking away from the user the adjustment knob that has served well since the dawn.

Thank you. You are making valid points in a conversational tone as well. Constructive conversation is better than the alternative of fighting.

