Joe Kane at Samsung QLED/HDR10 Summit - AVS Forum | Home Theater Discussions And Reviews
post #1 of 8 Old 07-12-2017, 08:35 PM - Thread Starter
Scott Wilkinson
AVS Forum Special Member
Join Date: Sep 2001
Location: Burbank, CA
Posts: 3,258
Joe Kane at Samsung QLED/HDR10 Summit

Video guru Joe Kane offered his vision for the future of UHD—multiple resolutions, gamuts, grayscales, peak luminances, and EOTFs, all derived from a single master file.

https://www.avsforum.com/joe-kane-sam...dhdr10-summit/
Haiej and am2model3 like this.
post #2 of 8 Old 07-12-2017, 09:52 PM
EvLee
Advanced Member
Join Date: Dec 2009
Posts: 683
IMO, the single master file isn't a particularly good idea creatively, but it's a great idea to sell to studios and distributors since it promises to reduce inventory management. We already use half float as the intermediate format in post-production, but once you get to grading you are making creative choices that may diverge for different delivery formats. For example, if you are going to IMAX, you as a paying customer want the full IMAX experience, so creatively they will push the image further for that venue because the audience expects you to showcase the capabilities of the format. Similarly with HDR: even the reviews here on AVS Forum focus foremost on peak luminance and absolute black, although I will hand it to the reviewers for recognizing that some material just doesn't make sense to push into the extremes.

Anyway, if you care about your movie you will spend time making it look good in each format, and I think the difference between SDR and HDR is simply too great to bridge without introducing localized tone mapping, but nobody really wants to implement that.

Other problems: there is no single display that outperforms every other class of display, so there is no way to QC a universal master and actually SEE everything. I think this reference-monitor problem may also have been touched on in one of the discussion panels that day. There is also a glaring conceptual problem with the idea of everything being capture-device referred: in the obvious case of animation there is no capture device, and the problem extends to stylized live action, like Sin City, where the image you are creating is not actually a reproduction of a real-world scene.

Now as for making more displays available to consumers that are easy to calibrate in the same way as a DCI projector, that would be pretty awesome.
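For context on the half-float intermediate mentioned above, here's a minimal Python sketch (stdlib `struct` only, not from the post itself) of IEEE 754 half-precision behavior: plenty of range for scene-referred linear light in a mastering pipeline, but coarse precision.

```python
import struct

def fp16_roundtrip(x: float) -> float:
    """Round-trip a Python float through IEEE 754 half precision."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# FP16's largest finite value is 65504 -- lots of headroom
# for scene-referred linear light.
print(fp16_roundtrip(65504.0))   # 65504.0, survives exactly

# But precision is coarse: near 1.0 the spacing between adjacent
# FP16 values is 2**-10 (~0.001), so most values get rounded.
print(fp16_roundtrip(0.1))       # 0.0999755859375, not 0.1

# Tiny values below the smallest subnormal flush to zero.
print(fp16_roundtrip(2**-30))    # 0.0
```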
Rumble Devo and King Richard like this.

Last edited by EvLee; 07-12-2017 at 09:57 PM.
post #3 of 8 Old 07-12-2017, 10:52 PM
AVS Forum Special Member
 
Tom Roper's Avatar
 
Join Date: Apr 2002
Location: Denver, Colorado
Posts: 4,526
Mentioned: 11 Post(s)
Tagged: 0 Thread(s)
Quoted: 455 Post(s)
Liked: 425
Quote:
Originally Posted by Scott Wilkinson View Post
Video guru Joe Kane offered his vision for the future of UHD—multiple resolutions, gamuts, grayscales, peak luminances, and EOTFs, all derived from a single master file.
But Joe's vision stems from the present reality: HDR is a jumbled mess of display-targeted parameters strung together by metadata. Log gamma is a much more sensible solution. It's good enough for acquisition. It's good enough for display and has the benefit of returning to the user a reasonable amount of picture adjustment control.
rak306 and mrtickleuk like this.

HDR Colorist and Conversions
INTO THE CAVE OF WONDERS
Directed by MANUEL BENITO DE VALLE Produced by PEDRO PABLO FIGUEROA
Cast MANUEL ANGEL REINA, CLAUDIA GARROTE
LOVETHEFRAME STORIES, SOUNDTRACKS AND FILMS
Tom Roper is offline  
Sponsored Links
Advertisement
 
post #4 of 8 Old 07-13-2017, 06:02 AM
Chief Technician
Member
Join Date: Jun 2017
Posts: 105
Which Problem Do You Want To Solve

Quote:
Originally Posted by Tom Roper View Post
But Joe's vision stems from the present reality: HDR is a jumbled mess of display-targeted parameters strung together by metadata.
Everything in ATSC 3.0 is going to be dependent upon metadata.
Quote:
Originally Posted by Tom Roper View Post
Log gamma is a much more sensible solution.
Do you mean Hybrid Log Gamma?
Quote:
Originally Posted by Tom Roper View Post
It's good enough for acquisition.
Log is good enough for acquisition if it is used by people who know what they are doing.
Quote:
Originally Posted by Tom Roper View Post
It's good enough for display and has the benefit of returning to the user a reasonable amount of picture adjustment control.
Which problem do you want to solve? Think about it. In both SD (NTSC) and HD (ATSC 1.0), the artistic intent of the video can only be realized on a properly calibrated display. I work at a post-production facility, and we have had our displays calibrated several times over the years. When a client asks us, "Is that what my video will look like?", our response is, "If the viewer's display is calibrated, then yes, that is what it will look like." The use of metadata and display referencing solves this problem. Unless the viewer has either messed with their display's settings or has an inferior display, metadata should allow the viewer to see what the content creators intended.

HLG was the first HDR solution usable in live production; Dolby Vision has since caught up. But HLG is not display referenced, so if you want to relive the situation I described in the previous paragraph in the era of ATSC 3.0, then HLG is the way to go. I think content creators would rather not have to deal with that. Only people who know what they are doing (most readers here) need the ability to calibrate their displays, and even then it may not be necessary in a properly designed display-referenced system.

I understand that a single deliverable may require much more metadata if you want it to be compatible with everything without backwards-compatibility compromises. Such a compromise might be, "Don't push the HDR too hard, or the SDR conversion will be clipped." This problem, a single deliverable that is compatible with both SDR and HDR while not compromising what can be done with either, is something that SMPTE and the EBU will hopefully be able to solve.

I think it all goes back to the question I posited earlier. Which problem do you want to solve?
King Richard likes this.
post #5 of 8 Old 07-13-2017, 12:38 PM
RLBURNSIDE
Join Date: Jan 2007
Posts: 3,901
Half float for distribution to consumers? When 12-bit PQ can deliver 100% banding-free HDR from 0 to 10,000 nits in Rec. 2020? I think not. That's 12 extra bits per pixel wasted for nothing (48 vs. 36).
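For context, the banding-free claim rests on the SMPTE ST 2084 PQ curve, whose 12-bit quantization steps were designed to sit below the threshold of visible banding across the whole 0 to 10,000 nit range. A minimal sketch of the published PQ EOTF (constants straight from ST 2084; not code from the post):

```python
def pq_eotf(signal: float) -> float:
    """SMPTE ST 2084 PQ EOTF: non-linear signal in [0, 1] -> luminance in cd/m^2."""
    m1 = 2610 / 16384       # 0.1593017578125
    m2 = 2523 / 4096 * 128  # 78.84375
    c1 = 3424 / 4096        # 0.8359375
    c2 = 2413 / 4096 * 32   # 18.8515625
    c3 = 2392 / 4096 * 32   # 18.6875
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

# Full-range 12-bit code 4095 of 4095 hits the 10,000 nit ceiling:
print(round(pq_eotf(4095 / 4095)))   # 10000
# Signal 0 maps to absolute black:
print(pq_eotf(0.0))                  # 0.0
# Step between two adjacent 12-bit codes at mid-signal: a tiny
# fraction of a nit, which is why banding stays invisible.
print(pq_eotf(2049 / 4095) - pq_eotf(2048 / 4095))
```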

EXR/FP16 is good for the mastering studio; that's it. Full stop. Unless new codecs support it directly and achieve better quality per bit than 12-bit PQ (not likely, but anything's possible).

Shipping EXR to end users is worthless; it confers literally zero benefit. The only reason FP16 is still used in games is for HDR and because, being linear, blending of light just "works," and there are no native PQ-formatted render targets to do PQ-correct rendering/blending (as gamma-correct rendering did before it for 8-bit render targets and texture formats). Yet. But there could be. We need native PQ hardware support for textures anyway, so we might as well have it for render targets too (in fact, I think one implies the other).

And 4:4:4 is already supported over HDMI, unless he means in the compressed video stream itself? Again, NO.

That would double the bandwidth and bitrate budget for video with very little gain in sharpness. If you have 2x the bandwidth to spare, you're MUCH better off either increasing the bitrate or increasing the luma resolution by root(2). This is like comparing 720p in 4:4:4 with 1080p in 4:2:0. Which do you think looks better? 1080p in 4:2:0. Duh.
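The 720p 4:4:4 vs. 1080p 4:2:0 comparison is easy to sanity-check with raw sample counts. This little sketch (an illustration, not from the post) assumes the standard per-pixel sample factors: 3 for 4:4:4, 2 for 4:2:2, 1.5 for 4:2:0.

```python
def samples_per_frame(width: int, height: int, subsampling: str) -> float:
    """Raw sample count per frame for a given chroma subsampling scheme."""
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    return width * height * samples_per_pixel

p720_444 = samples_per_frame(1280, 720, "4:4:4")    # 2,764,800
p1080_420 = samples_per_frame(1920, 1080, "4:2:0")  # 3,110,400

# 1080p 4:2:0 carries 12.5% more raw samples than 720p 4:4:4, and
# all of the extra resolution goes into luma, where the eye is most
# sensitive -- the poster's point exactly.
print(p1080_420 / p720_444)  # 1.125

# And 4:2:0 -> 4:4:4 at the same resolution doubles the raw samples:
print(samples_per_frame(1920, 1080, "4:4:4") / p1080_420)  # 2.0
```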
post #6 of 8 Old 07-13-2017, 01:15 PM
stef2
AVS Forum Special Member
Join Date: Mar 2005
Location: Canada
Posts: 1,272
It is always good to read about people who perform extremely well in their own domain. Go Joe!
post #7 of 8 Old 07-16-2017, 10:04 PM
Tom Roper
AVS Forum Special Member
Join Date: Apr 2002
Location: Denver, Colorado
Posts: 4,526
Quote:
Originally Posted by Chief Technician View Post
Which problem do you want to solve? Think about it. In both SD (NTSC) and HD (ATSC 1.0), the artistic intent of the video can only be realized on a properly calibrated display. I work at a post-production facility, and we have had our displays calibrated several times over the years. When a client asks us, "Is that what my video will look like?", our response is, "If the viewer's display is calibrated, then yes, that is what it will look like." The use of metadata and display referencing solves this problem. Unless the viewer has either messed with their display's settings or has an inferior display, metadata should allow the viewer to see what the content creators intended.
The problems I want to see solved are, for the colorist, the multiple trim passes; for the user, not having an appropriate display gamma for his viewing circumstances; and for the broadcasters, who have enough trouble with metadata already just keeping audio in sync, the equipment needed to support metadata throughout the broadcast chain, all of which taken together reduces the available amount of HDR content. All of those problems are solved by HLG.
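Part of HLG's appeal to broadcasters is that its transfer function needs no metadata at all. For reference, the BT.2100 HLG OETF is simple enough to sketch in a few lines of Python (constants as published in BT.2100; input is scene-linear light normalized to [0, 1]; an illustration added here, not code from the post):

```python
import math

def hlg_oetf(e: float) -> float:
    """ITU-R BT.2100 HLG OETF: scene-linear light [0, 1] -> signal [0, 1]."""
    a = 0.17883277
    b = 1 - 4 * a                   # 0.28466892
    c = 0.5 - a * math.log(4 * a)   # 0.55991073...
    if e <= 1 / 12:
        return math.sqrt(3 * e)     # square-root (gamma-like) segment for the lows
    return a * math.log(12 * e - b) + c  # logarithmic segment for the highlights

# The two segments meet at E = 1/12, signal 0.5 -- the "hybrid" in HLG:
print(round(hlg_oetf(1 / 12), 6))  # 0.5
# Peak scene light maps to full signal:
print(round(hlg_oetf(1.0), 4))     # 1.0
```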

As for the director's intent, it's not sacred once it leaves the confines of the theatrical presentation. It may be the director's intent for the movie to be seen in a darkened theater, but the home enthusiast may want to watch in a bright room. And I'd feel differently about a display-referenced EOTF if it were consistently adhered to, but it's not. If the director intended it to be viewed at 100 cd/m² on the silver screen but Dolby Vision tone maps it to 1,000 for an LED display, how is that maintaining the director's intent? Even in the answer you give your clients about how a video will look, you must make disclaimers about a calibrated display and non-messed-with settings. So as you can see, it's a step toward consolidating control away from the user, control he may not want consolidated, particularly when these topics share common complaints about HDR being too bright or too dark.

That said, I respect the well-reasoned and helpful tone of your posts. There is much (maybe all) I agree with, just not the need to continue down the slippery slopes of multiple trim passes, metadata, evolving HDMI specs, and tone mapping as justification for taking away the user's adjustment knob, which has served well since the dawn of television.

post #8 of 8 Old 07-19-2017, 06:14 AM
Chief Technician
Member
Join Date: Jun 2017
Posts: 105
99 Problems

Quote:
Originally Posted by Tom Roper View Post
The problems I want to see solved are, for the colorist, the multiple trim passes; for the user, not having an appropriate display gamma for his viewing circumstances; and for the broadcasters, who have enough trouble with metadata already just keeping audio in sync, the equipment needed to support metadata throughout the broadcast chain, all of which taken together reduces the available amount of HDR content. All of those problems are solved by HLG.
I have plenty of experience in post with keeping audio in sync; I moderated sessions on the topic for several AES conventions. If broadcasters keep their audio in sync up to their emission (transmission) point, the Presentation Time Stamp (PTS) in the MPEG-2 stream is supposed to maintain audio sync from there. Unfortunately, implementing this is not a requirement for MPEG-2 decoders, and in the effort to keep costs down (by pennies), the functionality is often not implemented. I agree that some audio sync situations are clearly network related, but in most cases the broadcasters are not the problem.

Quote:
Originally Posted by Tom Roper View Post
As for the director's intent, it's not sacred once it leaves the confines of the theatrical presentation. It may be the director's intent for the movie to be seen in a darkened theater, but the home enthusiast may want to watch in a bright room. And I'd feel differently about a display-referenced EOTF if it were consistently adhered to, but it's not. If the director intended it to be viewed at 100 cd/m² on the silver screen but Dolby Vision tone maps it to 1,000 for an LED display, how is that maintaining the director's intent? Even in the answer you give your clients about how a video will look, you must make disclaimers about a calibrated display and non-messed-with settings. So as you can see, it's a step toward consolidating control away from the user, control he may not want consolidated, particularly when these topics share common complaints about HDR being too bright or too dark.
The moment you scale anything (say, SDR-originated content to Dolby Vision), one can question artistic intent. I understand your viewpoint: either the process (scaling) is good or it is not. If we really wanted to maintain artistic intent to the extreme, SD content would only be viewable on CRTs.

Quote:
Originally Posted by Tom Roper View Post
That said, I respect the well-reasoned and helpful tone of your posts. There is much (maybe all) I agree with, just not the need to continue down the slippery slopes of multiple trim passes, metadata, evolving HDMI specs, and tone mapping as justification for taking away the user's adjustment knob, which has served well since the dawn of television.
Thank you. You are making valid points in a conversational tone as well. Constructive conversation is better than the alternative of fighting.
Tom Roper likes this.
High Dynamic Range (HDR) & Wide Color Gamut (WCG)

Tags
hdr , uhd , wcg

Thread Tools
Show Printable Version Show Printable Version
Email this Page Email this Page


Forum Jump: 

Posting Rules  
You may not post new threads
You may not post replies
You may not post attachments
You may not edit your posts

BB code is On
Smilies are On
[IMG] code is On
HTML code is Off
Trackbacks are Off
Pingbacks are Off
Refbacks are Off