Calibrating HDR, Part 2 - AVS Forum | Home Theater Discussions And Reviews
post #1 of 11 | 07-15-2016, 02:12 PM | Scott Wilkinson (Thread Starter, AVS Forum Special Member)
Calibrating HDR, Part 2

In this episode, I continue my discussion with Florian Friedrich about high dynamic range and wide color gamut, including a recap of the fundamentals of HDR and WCG, the importance of a bias light, the importance of metadata, problems with BT.2020 color primaries, the "golden reference" for consumer TVs in SpectraCal's CalMan software and why Florian thinks it's not the best approach for calibrating consumer TVs, static versus dynamic metadata, Florian's SEIEdit software and free sample video, answers to chat-room questions, and more.

post #2 of 11 | 07-15-2016, 06:11 PM | freeman4 (Senior Member)
A case where the sequel exceeds the original! Great stuff!
post #3 of 11 | 07-16-2016, 09:58 AM | Utopianemo (Advanced Member)
Wow, so frustrating! Like you've said many times before, Scott, this is a great time to be into Home Theater, but it's also a challenge. On the one hand, I am completely surprised to see the UHD spec and its adoption move along at such a high rate of speed. On the other hand, it's very difficult for those of us with less disposable income and a desire for a 'future proof' set to just drop a large sum of money on a TV.

There is so much in flux with HDR, and Mr. Friedrich's comments, as someone with his fingers in the post-production side of things, show how much transition is occurring even now, all the way up the production chain. It seems we have years of incremental improvements yet to trickle down into the tech we put in our living rooms.

And my plasma is starting to die! Which leads me to ask: For your dollar, Scott, which current-gen TV would you say would be the safest bet for HDR playback? Is it the models that have both HDR10 AND Dolby Vision?

Pioneer SC-95, 2x Crown XLS 1000 (LCR amps)
3x Epos Epic 2(LCR), 4x Emotiva UAW 6.2(Surrounds), 4x Aperion Intimus LC-I6(Atmos/DTS:X)
2x Dayton Ultimax 18" sealed(subs), Behringer Inuke 6000DSP(sub amp),
Vizio P-65C1 LED HDR
Oppo UDP-203 UHD Blu-ray Player, Oppo DV971H DVD Player, Roku Premiere+
post #4 of 11 | 07-16-2016, 05:07 PM | Scott Wilkinson (Thread Starter)
Quote:
Originally Posted by Utopianemo View Post
And my plasma is starting to die! Which leads me to ask: For your dollar, Scott, which current-gen TV would you say would be the safest bet for HDR playback? Is it the models that have both HDR10 AND Dolby Vision?
Yes, the best approach is to get a TV that does both. However, few TVs fit that bill: the 2016 LG OLEDs (which are expensive); the LG UH9500, UH8500, and UH7700 LED-edgelit LCDs; and the Philips 8600 FALD LCD (if it ever becomes available at retail). The 2016 Vizio P-Series and M-Series both support Dolby Vision, and Vizio has promised a firmware update that will add HDR10, but that update was promised for the P-Series by July and isn't here yet, so who knows when (or if) it will arrive? I'm not a big fan of edgelit TVs, so that leaves the OLEDs (and the Philips, but who knows when it will be available?). Unless you can afford an OLED, I'm inclined to bet that Vizio will deliver the HDR10 firmware update at some point, though it's not a sure thing until we actually see it. In the meantime, the Vizios can reproduce HDR only from Dolby Vision-enabled sources such as Vudu and Netflix, not from Ultra HD Blu-rays, all of which use HDR10 at this point.
post #5 of 11 | 07-17-2016, 05:24 PM | carneb (Member)
Just went to SEIedit.com. It redirects to the AVTOP website and I can't see any download link for the free test videos. Am I missing something?
post #6 of 11 | 07-18-2016, 04:30 PM | carneb (Member)
Quote:
Originally Posted by carneb View Post
Just went to SEIedit.com. It redirects to the AVTOP website and I can't see any download link for the free test videos. Am I missing something?
It's there now.
post #7 of 11 | 07-19-2016, 12:05 PM | Tom Roper (AVS Forum Special Member)
SMPTE ST 2094 dynamic metadata appears to have no direct application within the HDR10 viewing ecosystem. It is a scene-by-scene mastering transformation for converting from a larger color volume (BT.2020) to a smaller one (BT.709). There is no one-size-fits-all transformation between the two color volumes that maintains artistic creative intent; a single static transform handles a bright scene differently than a dark scene and produces out-of-gamut values (clipping). Hence the recognized need for scene-by-scene dynamic metadata.

So if you are watching HDR10 on an HDR10-compatible display, with dynamic metadata versus HDR10 with no dynamic metadata, there will be no difference whatsoever, because you are not transforming from a larger color volume to a smaller one; you are always watching within the larger color volume.

So this really is just a method for taking HDR10-mastered content and tone mapping it to the HDTV standard gamut (BT.709) while maintaining the creative artistic intent of the original.
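To make the scene-by-scene idea concrete, here is a minimal Python sketch of the general principle: deriving the mapping from a scene's own luminance statistics instead of applying one static curve to everything. The knee/roll-off shape and function names are illustrative assumptions only; they are not the actual ST 2094 transforms, which the standard leaves to implementers.

Code:
# Illustrative sketch (not ST 2094 itself): why one static transform clips,
# and how per-scene statistics can steer the roll-off instead.

def static_tonemap(nits, target_peak=100.0):
    """One-size-fits-all mapping: hard-clip everything above the target."""
    return min(nits, target_peak)

def dynamic_tonemap(nits, scene_max, target_peak=100.0, knee=0.75):
    """Scene-aware mapping: linear below a knee, then a roll-off scaled to the
    scene's own measured peak, so highlight detail compresses instead of clipping."""
    knee_point = knee * target_peak
    if nits <= knee_point or scene_max <= target_peak:
        return min(nits, target_peak)
    # Map [knee_point .. scene_max] into [knee_point .. target_peak].
    t = (nits - knee_point) / (scene_max - knee_point)
    return knee_point + t * (target_peak - knee_point)

# A bright scene (peak 1000 nits) and a dark scene (peak 120 nits) need
# different curves; the per-scene peak is exactly the kind of statistic
# that scene-by-scene metadata carries alongside the content.
for scene_max in (1000.0, 120.0):
    sample = 0.9 * scene_max
    print(f"scene peak {scene_max:6.1f}: static -> {static_tonemap(sample):6.1f}, "
          f"dynamic -> {dynamic_tonemap(sample, scene_max):6.1f}")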





Quote:
10E Dynamic Metadata for Color Transforms of HDR and WCG Images
Project Type: SMPTE Engineering Project (ANSI)
Project Contact: Lars Borg
Project State: DP
Updated: 2016-06-14
Progress Report:
  Part 0 (OV): Overview ready
  Part 1 (ST): in publishing queue
  Part 2 (WD): 31FS pre-FCD closed June 4; in 30MR for registry
  Part 10 (ST): in publishing queue
  Part 20 (DP): DP closed on May 18, ST Audit next
  Part 30 (DP): DP closed on May 5, ST Audit next
  Part 40 (FCD): 1 comment yello
Start: 2014-12-11  Est. Complete: 2016-09-15
80% Complete





Project Description: Develop a standard for the semantics and representation of metadata specifying content-dependent color volume transformation parameters to smaller color volumes in mastering applications.
Project Overview

Problem to be solved: High Dynamic Range / Wider Color Gamut content captured for example using the newly standardized SMPTE ST 2084:2014 Electro-Optical Transfer Function requires a set of industry standardized metadata to ensure a consistent transformation of this content to the existing BT.709 (add in SMPTE references) standard. This metadata is bounded by the characteristics of the mastering display as defined in SMPTE ST 2086:201x, but to ensure creative intent is maintained, content dependent metadata is also required. This standard will define the required metadata set. If color mapping is performed without this metadata, the resulting out of gamut content will suffer severely in visual quality due to clipping.
The SMPTE UHDTV Ecosystem Study Group has reported on this subject in Annex B “High Dynamic Range Imaging” and Annex C “Standard Dynamic Range Color Space Conversion” in the report dated March 28, 2014. The example cited above aligns closely with section C.4 and C.5 specifically cautioning the conversion between UHDTV and HDTV/SDTV.
The science of color volume mapping has seen major advances recently. A one-size-fits-all solution to map between different color volumes does not currently exist. The logical extension to the current color volume mapping techniques common in image processing is the use of a content-dependent dynamically parameterized color transformation. The color volume mapping transformation is derived for every scene/segment based on the metadata that characterizes the content. For example, a very dark scene will be mapped differently from a very bright scene when transforming from a large color volume to a more restricted color volume. This can be effectively achieved only when this transformation is guided by metadata.
and links..
https://www.smpte.org/sites/default/...rch%202015.pdf (see page 11 of pdf)
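The SMPTE text above leans on ST 2084, the PQ EOTF. For readers who want to poke at actual code values, here is a small reference sketch in Python of the published EOTF and its inverse; the constants are the ones given in ST 2084, while the function names and the sanity-check loop are just illustrative.

Code:
# SMPTE ST 2084 (PQ) EOTF and its inverse, using the constants published in
# the standard. Signal values are normalized 0..1; luminance is in cd/m^2.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875
PEAK = 10000.0             # PQ encodes absolute luminance up to 10,000 cd/m^2

def pq_eotf(signal):
    """Non-linear PQ code value (0..1) -> display luminance in cd/m^2."""
    e = signal ** (1.0 / M2)
    return PEAK * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)

def pq_inverse_eotf(nits):
    """Display luminance in cd/m^2 -> non-linear PQ code value (0..1)."""
    y = (nits / PEAK) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

# Sanity check: typical HDR grading levels round-trip through the curve.
for nits in (0.005, 100.0, 1000.0, 4000.0, 10000.0):
    code = pq_inverse_eotf(nits)
    print(f"{nits:9.3f} cd/m^2 -> PQ code {code:.4f} -> {pq_eotf(code):9.3f} cd/m^2")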
post #8 of 11 | 07-20-2016, 10:55 AM | GeorgeAB (AVS Forum Special Member)
Viewing Environment Fundamentals

"Back lighting" or "bias lighting" is a technique recommended by video experts for over half a century, since the black and white CRT days. Video technology and best practices are based upon the characteristics and limitations of human visual perception. TV display technology may change, but the human visual system, generally speaking, does not.

Even if someone invents a screen technology that rejects 100% of ambient light, there will still be a need to control the lighting in a viewing environment. This is due to how the human visual system adapts to ambient light, and perceives a displayed image in that adapted state. For the purposes of faithful image reproduction, it will always be advantageous for the consumer to emulate program mastering environment conditions. In other words, a bright room may be tolerable for casual viewing of sporting events, news programs, reality TV, etc., but not for optimal viewing of cinematic art.

'The Importance Of Viewing Environment Conditions In A Reference Display System'

http://cinemaquestinc.com/ive.htm

'How Viewing Environment Conditions Can Corrupt Or Enhance Your Calibration.'

https://www.avsforum.com/avs-vb/showthread.php?t=849430

'D65 Video Bias Lighting- Fundamental Theory And Practice'
http://cinemaquestinc.com/blb.htm

Best regards and beautiful pictures,
G. Alan Brown, President
CinemaQuest, Inc.
A Lion AV Consultants affiliate

"Advancing the art and science of electronic imaging"

post #9 of 11 | 07-21-2016, 02:36 PM | WiFi-Spy (AVS Forum Special Member)
The larger gamut is the 100% P3 gamut of the mastering display, which is then transformed into BT.2020 signaling for home release. Current home HDR10 displays cover roughly 80-97% of P3, and dynamic metadata becomes more important the farther an HDR10 TV's performance deviates from the mastering display. Beyond a standard for dynamic metadata, what is really lacking with HDR10 is a standardized gamut re-mapping and luminance tone-mapping algorithm, or even a recommendation. For as much as the discussion about Dolby Vision gets bogged down in licensing and political debates, the fact that Dolby has one tone-mapping strategy for all DV TVs makes it a de facto standard for that technology, since the result isn't dictated by how each TV manufacturer thinks HDR should look.
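To put the container-versus-coverage distinction in numbers, here is a small Python sketch: it compares the chromaticity triangles of BT.709, DCI-P3 D65, and BT.2020 by a simple area ratio in CIE 1931 xy. The xy primaries are the published values, but note that marketing and review coverage figures are often computed in CIE 1976 u'v', so the exact percentages will differ; this is only a rough illustration, not how a display's "% of P3" spec is measured.

Code:
# Sketch: P3 is mastered inside a BT.2020 container; a display's "percent of P3"
# describes its native primaries, not the container. Triangle areas in CIE 1931 xy
# give a rough feel for how the gamuts nest.

PRIMARIES = {                                   # (x, y) chromaticities of R, G, B
    "BT.709":     [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3 D65": [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "BT.2020":    [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(points):
    """Shoelace formula for the area of a chromaticity triangle."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

container = triangle_area(PRIMARIES["BT.2020"])
for name in ("BT.709", "DCI-P3 D65"):
    ratio = triangle_area(PRIMARIES[name]) / container
    print(f"{name:10s} covers ~{ratio:.0%} of the BT.2020 container (xy area)")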

Quote:
Originally Posted by Tom Roper View Post
SMPTE ST 2094 dynamic metadata appears to have no direct application within the HDR10 viewing ecosystem. It is a scene-by-scene mastering transformation for converting from a larger color volume (BT.2020) to a smaller one (BT.709). There is no one-size-fits-all transformation between the two color volumes that maintains artistic creative intent; a single static transform handles a bright scene differently than a dark scene and produces out-of-gamut values (clipping). Hence the recognized need for scene-by-scene dynamic metadata.

So if you are watching HDR10 on an HDR10-compatible display, with dynamic metadata versus HDR10 with no dynamic metadata, there will be no difference whatsoever, because you are not transforming from a larger color volume to a smaller one; you are always watching within the larger color volume.

So this really is just a method for taking HDR10-mastered content and tone mapping it to the HDTV standard gamut (BT.709) while maintaining the creative artistic intent of the original.




and links..
https://www.smpte.org/sites/default/...rch%202015.pdf (see page 11 of pdf)

Tyler Pruitt - Technical Evangelist - for CalMAN

10 Bit Gradient Test Patterns (HEVC) - Free Download
post #10 of 11 | 07-22-2016, 11:57 AM | Tom Roper (AVS Forum Special Member)
Quote:
Originally Posted by WiFi-Spy View Post
The larger gamut is the 100% P3 gamut of the mastering display, which is then transformed into BT.2020 signaling for home release. Current home HDR10 displays cover roughly 80-97% of P3, and dynamic metadata becomes more important the farther an HDR10 TV's performance deviates from the mastering display. Beyond a standard for dynamic metadata, what is really lacking with HDR10 is a standardized gamut re-mapping and luminance tone-mapping algorithm, or even a recommendation. For as much as the discussion about Dolby Vision gets bogged down in licensing and political debates, the fact that Dolby has one tone-mapping strategy for all DV TVs makes it a de facto standard for that technology, since the result isn't dictated by how each TV manufacturer thinks HDR should look.
The smaller color volume referred to by ST-2094 is not the percentage of P3 mastering volume a particular consumer display is capable of.

If we back up to what Florian was saying, the point being made was that one of the differences between DV and HDR10 is "dynamic metadata," whereupon Florian said that we have that in HDR10 as well, known as ST-2094.

ST-2094 is a scene-by-scene transformation from a larger color volume into a smaller one that accounts for the out-of-gamut values that occur when creative content graded for a very dark or a very bright scene in the larger color volume is transformed into the smaller one (BT.709). Since the color volume of a 100% P3 mastering monitor is bounded within HDR10's BT.2020, and since ST-2094 is not tied to any specific hardware, there will be no dynamic transformations for out-of-gamut values for an HDR10 display that can't do 100% P3. All that's needed is the existing static metadata for that.

Quote (SMPTE ST 2094 project page):
Project Overview

Problem to be solved: High Dynamic Range / Wider Color Gamut content captured for example using the newly standardized SMPTE ST 2084:2014 Electro-Optical Transfer Function requires a set of industry standardized metadata to ensure a consistent transformation of this content to the existing BT.709 (add in SMPTE references) standard. This metadata is bounded by the characteristics of the mastering display as defined in SMPTE ST 2086:201x, but to ensure creative intent is maintained, content dependent metadata is also required. This standard will define the required metadata set. If color mapping is performed without this metadata, the resulting out of gamut content will suffer severely in visual quality due to clipping.
The SMPTE UHDTV Ecosystem Study Group has reported on this subject in Annex B “High Dynamic Range Imaging” and Annex C “Standard Dynamic Range Color Space Conversion” in the report dated March 28, 2014. The example cited above aligns closely with section C.4 and C.5 specifically cautioning the conversion between UHDTV and HDTV/SDTV.
The science of color volume mapping has seen major advances recently. A one-size-fits-all solution to map between different color volumes does not currently exist. The logical extension to the current color volume mapping techniques common in image processing is the use of a content-dependent dynamically parameterized color transformation. The color volume mapping transformation is derived for every scene/segment based on the metadata that characterizes the content. For example, a very dark scene will be mapped differently from a very bright scene when transforming from a large color volume to a more restricted color volume. This can be effectively achieved only when this transformation is guided by metadata.
Project scope: develop multi-part standards for specifying the semantics and representation of content-dependent metadata needed for color volume transformation of high dynamic range and wide color gamut imagery to smaller color volumes (e.g. BT.709 or Digital Cinema) in mastering applications. The metadata entries constitute the logical concept (semantics) for immediate work followed by the physical encoding (representation) specification based on an extensible metadata international standard such as ISO 16684. The standards should specify metadata necessary to support the mastering of high dynamic range and wide color gamut content for next generation distribution as well as physical media formats.

The metadata should allow for parameterized color transformation that is variable along a timeline. The resulting metadata standard should include clip-based with optional title-level, shot-level, frame-level, and other applicable groupings of color volume transformation metadata and may include both public and private metadata.
The metadata is carried in parallel and synchronized with the mastered content and can be used for real time transformation during mastering or for a deferred transformation in distribution.
The metadata should also characterize certain colorimetric attributes of the content to aid in the transformation process. One example would be minimum, average, maximum luminance values as well as parametric controls for the color volume transformation process derived during mastering performed in a perceptually linear color space.
The standards implementation should not be tied to any specific hardware.
The metadata set specified by the resulting standard should use the mastering display color volume metadata specified in ST 2086 “Mastering Display Color Volume Metadata Supporting High Luminance and Wide Color Gamut Images”.
Specific methods for generation and use of the metadata are out of the scope of this standard.
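As a reader's aid, here is a Python sketch of the two metadata flavors being contrasted in this exchange: the static, once-per-title mastering display description in the style of ST 2086 (plus the MaxCLL/MaxFALL light levels HDR10 carries alongside it), and a per-scene record of the kind the ST 2094 scope describes (content statistics plus transform parameters on a timeline). The field names and structures are illustrative assumptions, not the standards' actual syntax.

Code:
# Sketch of the two metadata flavors. Field names are illustrative only;
# they are not the actual syntax of ST 2086 or ST 2094.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class StaticMasteringMetadata:
    """ST 2086-style description of the mastering display, sent once per title,
    plus the MaxCLL/MaxFALL content light levels HDR10 adds alongside it."""
    display_primaries: Tuple[Tuple[float, float], ...]  # R, G, B chromaticities (x, y)
    white_point: Tuple[float, float]
    max_display_luminance: float   # cd/m^2
    min_display_luminance: float   # cd/m^2
    max_cll: float                 # brightest pixel in the title, cd/m^2
    max_fall: float                # brightest frame-average, cd/m^2

@dataclass
class SceneMetadata:
    """Per-scene record of the kind the ST 2094 project scope describes:
    content statistics plus transform parameters, synchronized to a timeline."""
    start_frame: int
    end_frame: int
    min_luminance: float           # cd/m^2, measured over the scene
    avg_luminance: float
    max_luminance: float
    transform_params: List[float]  # parameterized color volume transform (opaque here)

title_static = StaticMasteringMetadata(
    display_primaries=((0.680, 0.320), (0.265, 0.690), (0.150, 0.060)),  # P3 mastering monitor
    white_point=(0.3127, 0.3290),                                        # D65
    max_display_luminance=1000.0, min_display_luminance=0.005,
    max_cll=950.0, max_fall=400.0,
)
scenes = [
    SceneMetadata(0, 1200, 0.01, 35.0, 120.0, transform_params=[]),      # dark scene
    SceneMetadata(1201, 2400, 0.05, 180.0, 950.0, transform_params=[]),  # bright scene
]
print(title_static.max_cll, [s.max_luminance for s in scenes])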
post #11 of 11 | 07-29-2016, 09:23 PM | carneb (Member)
Anyone know what the metadata is on the sample video? Or is it just blank so that you can add your own with SEIedit?
Tags
calibration , hdr , Home Theater Geeks , wcg
