Spears & Munsil UHD HDR Benchmark Disc - Discussion - Page 21 - AVS Forum | Home Theater Discussions And Reviews
post #601 of 647 Old 11-09-2019, 10:04 AM
lukewayne
Member | Join Date: Mar 2008 | Location: Los Angeles, CA | Posts: 35
Hi Stacey,

Could you please give some guidance on what can be learned from the pictured color patterns from the disc?
I'm struggling through the process of getting an acceptable picture out of my projector, so I'm sure I'll have more questions, but I thought this one would be an easy start.

If you zoom in you can see that the greens make it all the way to the top of the screen, but the other colors clip to nothing at the top. I'm battling very green shadows/blacks even with extreme settings changes. I'm wondering whether this pattern indicates that the projector is unable to produce red and blue at the deepest, darkest levels, causing my green shadow issue.

Thanks,

Luke
Attached: Canon Settings-139.jpg, Canon Settings-141.jpg

Emotiva RMC-1 | Emotiva XPA DR3 | Emotiva XPA-8 Gen 3
7.1.4 - Polk LSiM707 (front 3) - LSiM702F/X (surround) - 265-LS (atmos)
Canon 4K600Z | Seymour Screen Excellence Trim TB-130-NEO
Panasonic DP-UB9000 | Oppo UDP-203 |Xbox One X - PS3
post #602 of 647 Old 11-11-2019, 08:40 AM
sspears
AVS Forum Special Member | Join Date: Feb 1999 | Location: Sammamish, WA, USA | Posts: 5,357
Hi Luke,

I use that particular pattern to see the effects of pre- and post-calibration processing. E.g., if you enable/disable a 3D LUT, does banding become much worse after the 3D LUT is applied? What happens when you enable/disable hue shift compensation (if the device has that capability)?

What you are describing with the green shadow issue sounds like a white balance / grayscale issue on the bottom end of the range. If you step through the grayscale window patterns, at what window level does the green cast go away?

Do you have a colorimeter to measure the grayscale patches?

Stacey Spears
Co-Creator, Spears & Munsil UHD HDR Benchmark
post #603 of 647 Old 11-11-2019, 09:55 PM
lukewayne
Quote:
Originally Posted by sspears View Post
Hi Luke,

I use that particular pattern to see the effects of pre- and post-calibration processing. E.g., if you enable/disable a 3D LUT, does banding become much worse after the 3D LUT is applied? What happens when you enable/disable hue shift compensation (if the device has that capability)?

What you are describing with the green shadow issue sounds like a white balance / grayscale issue on the bottom end of the range. If you step through the grayscale window patterns, at what window level does the green cast go away?

Do you have a colorimeter to measure the grayscale patches?
Hi Stacey,

That is a great tip on looking for color banding with those patterns. I've noticed that with 4K and HDR I'm more aware of processing artifacts... could just be the big screen ;-)

I don't have a colorimeter for checking monitors/projectors. I've got a Minolta Color Meter III (for film lighting) but I imagine it's not specific enough for this purpose.

I got ahold of Canon support, and they suspect there is something wrong with my projector causing the uncorrectable green shadows. I'm bringing it in for service tomorrow; hopefully they'll help me get to the bottom of it. Here's a folder with a bunch of pictures of the issues I was experiencing: https://drive.google.com/drive/folde...PY?usp=sharing
There was a strange thing happening on the Brightness test pattern, where the background checker would appear, then disappear, then appear again while moving the setting in a single direction. Anyway, if it's still doing that when I get it back from service, I'll take some fresh pictures and ask about it.

Thanks for making such a great tool in this test disc!

post #604 of 647 Old 11-12-2019, 10:43 AM
sspears
That is just the behavior of some digital brightness controls. I believe the one in Photoshop behaves the same way. The checkerboard is there for DLP projectors: you see dither in the bright square and no dither in the darker square. I would not worry too much about it.

post #605 of 647 Old 11-13-2019, 11:21 AM
lukewayne
Quote:
Originally Posted by sspears View Post
That is just the behavior of some digital brightness controls. I believe the one in Photoshop behaves the same way. The checkerboard is there for DLP projectors: you see dither in the bright square and no dither in the darker square. I would not worry too much about it.
OK, thanks! My projector is LCoS, so I'll stop worrying about the checkerboard.

Yesterday I got word from Canon that they were able to recreate my various issues in their lab, so I'm hoping to have a repaired projector to play with in a week or so.

post #606 of 647 Old 11-18-2019, 04:45 PM
ConnecTEDDD
AVS Forum Special Member | Join Date: Sep 2010 | Location: Athens, Greece | Posts: 8,786
Hey Stacey,

I have some questions about the color grade of your disc.

BTW, what spectroradiometer/colorimeter combo did you use to calibrate, and to verify that all the monitors used were properly calibrated?

Using REDCINE-X, the RED RAW footage was converted to OpenEXR in RWG/Linear, then imported into Resolve, where you did the master grade @ 4000 nits with a Dolby Pulsar monitor configured @ 4000 nits.

Did you use Resolve color management configured as DaVinci YRGB, YRGB Color Managed, ACEScct, or ACEScc?

Did you use Resolve 14.x or 15.x?

Did you configure Resolve to "Use dual output on SDI"? In that case, did you use the Dolby PRM-4220 configured for REC.709 100 nits to control the SDR trim pass?

...or did you use an external CMU to drive the secondary SDR monitor for control purposes?

Did you also use Resolve to grade to 10000 nits from the 4000-nit master? Or did you use Transkoder to re-map from 4000 to 10000 nits, or some other process?

How did you control the 10000-nit grade? Was it an extrapolation from the 4000-nit color grade at the end of the workflow?

Did you export the master grade at 4000 nits, 10000 nits, or both, and from Resolve or from other software?

After the colorist from Dolby did the primary grade @ 4000 nits REC.2020 PQ ST.2084, what did you use as the "soft CMU" to calculate the Dolby Vision metadata?

Did you use a dual-monitor configuration to control the SDR version from the HDR master, with the first HDR monitor at 4000 nits (Dolby Pulsar) and the second SDR monitor at 100 nits (Dolby PRM-4220)?

How about the other HDR grades @ 2000/1000/600 nits; did you use different dual HDR monitor configurations?

For example, did you use a Sony BVM-X300 to control the 1000-nit version, the Dolby PRM-32FHD to control the 2000-nit version, and of course the Dolby PRM-4220 to control the 600-nit version?

...or did you use the metadata calculation from the 100-nit secondary trim control?

Did the Dolby Vision iCMU/eCMU 2.9 (whether an internal or external CMU is used, it provides the same end results but needs a retouch) calculate the 2000/1000/600/100 trim passes from the 4000-nit or the 10000-nit sequences?

For the SDR 100-nit REC.709 trim, did the colorist use the 5 available trim controls (as the 6th control in CMU 2.9 is disabled for SDR trims) to adjust the picture to match the creative intent of the 4000-nit primary grade on the Dolby Pulsar?

Finally, the SDR 709 100-nit version encoded as "20012.M2TS" on the disc was exported from the Dolby Vision IMF, or from TIFF or EXR plus XML metadata, correct? From Transkoder?

(It looks like you have explained before that it was done with Transkoder from the master graded at 4000 or 10000 nits.)

I understand the whole process is much more complex; I just wanted to find out more details. I'm also interested in the SDR grade (the 20012.M2TS file).

For the Dolby Vision 4.0 version of the Montage, do you plan to re-do the source material to ensure that Resolve will not clip?

Do you think there is some advantage to working in the ACES colorspace?

For example...

1. Export RWG/Log3G10 from REDCINE-X as OpenEXR, no compression, 16-bit ACES ST 2065-1.

2. Use ACES ST 2065-1 as the IDT in an ACEScct workflow, Rec.2020 ST.2084 as the ODT, and the internal CMU configured at 4000 nits REC.2020 D65 ST.2084 as the mastering display.

...or with ACES also, but with Resolve configured in YRGB, using Paul Dore's DCTL and OFX plug-ins to be able to manipulate the ACES IDT and ODT math: https://github.com/baldavenger

3. Use Resolve configured with "Use dual output on SDI", with the Dolby Pulsar as the primary HDR monitor @ 4000 nits and the Dolby PRM-4220 @ 600 and @ 100 nits.

4. Use Resolve configured with "Use dual output on SDI", with the Dolby Pulsar as the primary HDR monitor @ 4000 or 2000 nits and the Dolby PRM-4220 @ 600 and @ 100 nits, and/or a Sony BVM-X300.

5. Use Resolve 16 for the new grade with Dolby's CMU 4.0 trim passes. (Should be a nice improvement over the current version.)

6. Use your de-convolution (possibly the new version that Shaindow developed) for versioning from 8K to 4K/2160p.

7. Use Transkoder to create some of the other versions again... using it with an AJA card with Dolby Vision HDMI tunneling, to QC on an LG C9 OLED.

8. Use the Dolby Encoding Engine to encode Dolby Vision Profile 7 as a control for Sony's Pixelogic encoding.

9. Use the latest x265 to encode HDR10+ as a control for the Transkoder encoding.

10. If possible, include with the next release a USB stick with the Montage version in MP4 Dolby Vision Profile 5 (as Netflix also uses Profile 5), encoded entirely using ICtCp.

Ted's LightSpace CMS Calibration Disk Free Version for Free Calibration Software: LightSpace DPS / CalMAN ColorChecker / HCFR
S/W: LightSpace CMS, SpaceMan ICC, SpaceMatch DCM, CalMAN 5, CalMAN RGB, ChromaPure, ControlCAL
V/P: eeColor 3D LUT Box - P/G: DVDO AVLab TPG
Meters: JETI Specbos 1211, Klein K-10A, i1PRO2, i1PRO, SpectraCAL C6, i1D3, C5

Last edited by ConnecTEDDD; 11-19-2019 at 03:18 AM.
post #607 of 647 Old 11-18-2019, 05:41 PM
ConnecTEDDD
For users who asked about colors outside the P3 colorspace...

I used a scope tool from professional QC software that analyzes the native YCbCr pixel values of each video frame of an HDR10 clip (without expanding the video to PC levels, something Resolve does when you import a video file).

For that analysis I selected the REC.2020 HDR10 10000-nit grade of the Montage (20002.M2TS file) on the S&M3 UHD disc.

After the analysis, the software applies the colorspace conversion sequence YCbCr -> RGB -> XYZ -> xyY, and the color pixel traces for each video frame are displayed on the CIE chart so you can see which colors are used in each frame. (You can watch in real time, while the clip plays back, which colors are used... and Stacey plans to include a CIE picture of the colors displayed per frame as an on-screen overlay on the actual video clip, a very interesting feature.)

I enabled a function that accumulates all the clip's frames on the CIE chart, so in the CIE chart picture below you see all the colors used during playback of the entire clip (11,158 video frames' color traces combined into one picture).

As the CIE chart has two dimensions (x/y), to visualize Y (the 3rd dimension) an RGB color mode is enabled that plots the actual RGB color value at each xy coordinate.
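The conversion chain described above (YCbCr -> RGB -> XYZ -> xyY) can be sketched for a single pixel. This is an illustrative sketch using nominal BT.2020 non-constant-luminance constants and the ST.2084 (PQ) EOTF, not the QC software's actual code; the function names are mine:

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1, M2 = 2610/16384, 2523/4096*128
C1, C2, C3 = 3424/4096, 2413/4096*32, 2392/4096*32

def pq_to_linear(e):
    """PQ-encoded signal (0..1) -> linear light, where 1.0 = 10000 nits."""
    ep = max(e, 0.0) ** (1/M2)
    return (max(ep - C1, 0.0) / (C2 - C3*ep)) ** (1/M1)

# BT.2020 non-constant-luminance luma coefficients
KR, KB = 0.2627, 0.0593
KG = 1 - KR - KB

# Linear RGB -> CIE XYZ for BT.2020 primaries, D65 white
RGB_TO_XYZ = [(0.6370, 0.1446, 0.1689),
              (0.2627, 0.6780, 0.0593),
              (0.0000, 0.0281, 1.0610)]

def ycbcr10_to_xyY(Y10, Cb10, Cr10):
    # De-quantize 10-bit limited-range codes (luma 64..940, chroma 64..960)
    y  = (Y10 - 64) / 876
    cb = (Cb10 - 512) / 896
    cr = (Cr10 - 512) / 896
    # YCbCr -> non-linear R'G'B'
    r = y + 2*(1 - KR)*cr
    b = y + 2*(1 - KB)*cb
    g = (y - KR*r - KB*b) / KG
    # PQ-decode each channel, then RGB -> XYZ -> xy chromaticity
    rgb = [pq_to_linear(c) for c in (r, g, b)]
    X, Ylin, Z = (sum(m*c for m, c in zip(row, rgb)) for row in RGB_TO_XYZ)
    s = X + Ylin + Z
    return X/s, Ylin/s, Ylin

# Reference white (Y'=940, Cb=Cr=512) should land on D65: x ~ 0.3127, y ~ 0.3290
print(ycbcr10_to_xyY(940, 512, 512))
```

Reference white decoding to the D65 white point is a handy sanity check that the matrix and de-quantization constants are consistent.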

Here is the REC.2020 HDR10 10000-nit grade of the Montage (20002.M2TS file):



Using the same analysis method, here is the REC.709 SDR 100-nit grade of the Montage (20012.M2TS):


post #608 of 647 Old 11-19-2019, 01:29 PM
Oscarilbo
Member | Join Date: Feb 2012 | Posts: 76
Hi... I'm having an issue calibrating my projector. I recently bought a somewhat dark gray screen, then proceeded to calibrate the projector accordingly, and all was perfect. My main sources are still physical discs (4K and Full HD). But when I stream, whether Netflix, Amazon, or whatever, the image is a lot darker than physical media. How could I calibrate specifically for streaming (under an additional user mode, of course)? Any suggestions?

Thank you so much in advance
post #609 of 647 Old 11-19-2019, 01:48 PM
mrtickleuk
AVS Forum Special Member | Join Date: May 2016 | Location: Birmingham, UK - you know, the original one! | Posts: 7,551
Quote:
Originally Posted by ConnecTEDDD View Post
Hey Stacey,

I have some LOTS OF questions about the color grade of your disc.
I counted 34! That's a lot more than "some", Ted!

_______________
Denon AVR-X4200W, Arcam Alpha 8P; 5.1.4 setup: Mission 702e, M7C1i, 77DS, 731. Rel T5 Sub. Monitor Audio CT165 4 Tops | LG OLED55C8PLA TV | Samsung UBD-K8500 UHD Blu-Ray

HDMI 2.0 4K modes | Dolby & DTS core+outer audio tracks on (UHD) Blu-Rays | Hello to Jason Isaacs
post #610 of 647 Old 11-19-2019, 01:51 PM
mrtickleuk
Quote:
Originally Posted by ConnecTEDDD View Post
For users who asked about colors outside the P3 colorspace...

I used a scope tool from professional QC software that analyzes the native YCbCr pixel values of each video frame of an HDR10 clip (without expanding the video to PC levels, something Resolve does when you import a video file).

For that analysis I selected the REC.2020 HDR10 10000-nit grade of the Montage (20002.M2TS file) on the S&M3 UHD disc.

Here is the REC.2020 HDR10 10000-nit grade of the Montage (20002.M2TS file):
Thanks for doing that; it's very pretty. I have one question:
I can see quite a lot of pixels in this image that are clearly outside the big outer Rec.2020 triangle. How is this possible? I would not expect a single pixel to be outside the triangle.

Thanks

post #611 of 647 Old 11-19-2019, 05:35 PM
lukewayne
Quote:
Originally Posted by Oscarilbo View Post
Hi... I'm having an issue calibrating my projector. I recently bought a somewhat dark gray screen, then proceeded to calibrate the projector accordingly, and all was perfect. My main sources are still physical discs (4K and Full HD). But when I stream, whether Netflix, Amazon, or whatever, the image is a lot darker than physical media. How could I calibrate specifically for streaming (under an additional user mode, of course)? Any suggestions?



Thank you so much in advance


Sounds like the difference between "Full"/"PC" levels and "Limited"/"Video" levels. Check the settings on your streamer and projector. Google a bit for the specifics, but it comes down to which numbers represent which brightness levels.

Everyone's situation is a bit different, but I'd guess your disc player is using Limited/Video and the streamer is sending Full/PC, and your projector isn't switching between the two modes when you change devices.

An easy solution could be to change everything to Limited/Video.

As an aside, I was reading about color grading Dolby Vision the other day, and I think it uses full range all the time. Not sure how that applies to consumer devices, though.
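The level mismatch described above can be sketched with the standard 8-bit video ranges (limited black/white at codes 16/235, full at 0/255). A minimal sketch; the function names are just for illustration:

```python
def limited_to_full(v):
    """Expand a limited-range (16-235) code value to full range 0-255,
    clamping anything below black (BTB) or above white (WTW)."""
    return min(255, max(0, round((v - 16) * 255 / 219)))

def full_to_limited(v):
    """Compress a full-range (0-255) code value into limited range 16-235."""
    return round(16 + v * 219 / 255)

# Matched ranges: black and white line up.
print(limited_to_full(16), limited_to_full(235))   # 0 255

# Mismatch: a FULL-range source wrongly expanded as LIMITED by the display.
# Shadow detail below code 16 is crushed straight to black and highlights
# above 235 clip, i.e. the picture looks darker and "almost clipping":
print(limited_to_full(10))    # 0   (shadow detail lost)
print(limited_to_full(250))   # 255 (highlight clipped)
```

The opposite mismatch (limited source treated as full) gives the washed-out, gray-blacks look instead.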



post #612 of 647 Old 11-20-2019, 04:50 AM
bobof
aka jfinnie | Join Date: Aug 2015 | Location: Norwich, UK | Posts: 3,516
Quote:
Originally Posted by mrtickleuk View Post
Thanks for doing that; it's very pretty. I have one question:
I can see quite a lot of pixels in this image that are clearly outside the big outer Rec.2020 triangle. How is this possible? I would not expect a single pixel to be outside the triangle.
My (limited) understanding is that the math that defines YCbCr - which is the format typically used for video - means there are some YCbCr pixel values that, when turned into a "real" colour, are actually outside the gamut.
This is because, although the gamut is defined by RGB primaries, the RGB space is effectively rotated to fit entirely within a YCbCr space that is slightly larger for a given set of primaries. So most of the pixel values it's possible to put in the stream are in gamut, but some will be slightly out of gamut.

I'm not sure if the errant points typically come from compression artifacts, or are math artifacts from processing in the post chain. Either way, I'd expect a playback device to ultimately clamp them to an appropriate place on the gamut edge during the eventual conversion back to RGB (or WRGB) to get the image onto a panel.
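This can be shown with a few lines of arithmetic: take a legal YCbCr value, decode it with the standard matrix, and some RGB components come out negative, i.e. outside the RGB triangle. A minimal sketch assuming BT.709 luma coefficients (the same effect occurs with BT.2020's) and already de-quantized values:

```python
# BT.709 luma coefficients
KR, KB = 0.2126, 0.0722
KG = 1 - KR - KB

def ycbcr_to_rgb(y, cb, cr):
    """Y' in 0..1, Cb/Cr in -0.5..0.5 -> non-linear R'G'B'."""
    r = y + 2 * (1 - KR) * cr
    b = y + 2 * (1 - KB) * cb
    g = (y - KR * r - KB * b) / KG
    return r, g, b

# A dark pixel with strong chroma: perfectly legal as YCbCr code values,
# but it decodes to negative green and blue, outside the RGB gamut.
r, g, b = ycbcr_to_rgb(0.1, -0.3, 0.5)
print(r, g, b)  # r > 0, g < 0, b < 0
# A real display pipeline would clamp g and b to 0 at the gamut edge.
```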
post #613 of 647 Old 11-20-2019, 08:32 AM - Thread Starter
WiFi-Spy
AVS Forum Special Member | Join Date: Feb 2004 | Location: Seattle, WA | Posts: 3,937
Quote:
Originally Posted by bobof View Post
My (limited) understanding is that the math that defines YCbCr - which is the format typically used for video - means there are some YCbCr pixel values that, when turned into a "real" colour, are actually outside the gamut.

This is because, although the gamut is defined by RGB primaries, the RGB space is effectively rotated to fit entirely within a YCbCr space that is slightly larger for a given set of primaries. So most of the pixel values it's possible to put in the stream are in gamut, but some will be slightly out of gamut.

I'm not sure if the errant points typically come from compression artifacts, or are math artifacts from processing in the post chain. Either way, I'd expect a playback device to ultimately clamp them to an appropriate place on the gamut edge during the eventual conversion back to RGB (or WRGB) to get the image onto a panel.


Also, this was made using DaVinci Resolve's CIE gamut scopes, where the video has probably been converted to floating-point RGB before the calculations.

Something like the AJA/Colorfront HDR Analyzer is a better tool for this.


Tyler Pruitt - Technical Evangelist - for CalMAN

10 Bit Gradient Test Patterns (HEVC) - Free Download
post #614 of 647 Old 11-20-2019, 08:45 AM
sspears
Quote:
Originally Posted by Oscarilbo View Post
Hi... I'm having an issue calibrating my projector. I recently bought a somewhat dark gray screen, then proceeded to calibrate the projector accordingly, and all was perfect. My main sources are still physical discs (4K and Full HD). But when I stream, whether Netflix, Amazon, or whatever, the image is a lot darker than physical media. How could I calibrate specifically for streaming (under an additional user mode, of course)? Any suggestions?

Thank you so much in advance
It almost sounds like some of your sources are in full range and some in limited range. If everything were in the same range, they would match across sources. What is your source for streaming?

post #615 of 647 Old 11-20-2019, 08:48 AM
sspears
Quote:
Originally Posted by bobof View Post
My (limited) understanding is that the math that defines YCbCr - which is the format typically used for video - means there are some YCbCr pixel values that, when turned into a "real" colour, are actually outside the gamut.
This is because, although the gamut is defined by RGB primaries, the RGB space is effectively rotated to fit entirely within a YCbCr space that is slightly larger for a given set of primaries. So most of the pixel values it's possible to put in the stream are in gamut, but some will be slightly out of gamut.

I'm not sure if the errant points typically come from compression artifacts, or are math artifacts from processing in the post chain. Either way, I'd expect a playback device to ultimately clamp them to an appropriate place on the gamut edge during the eventual conversion back to RGB (or WRGB) to get the image onto a panel.
Using the AJA HDR Analyzer, nothing ends up outside of 2020. So it could be a color conversion issue, scopes set to something larger than 2020, or a bug in the tool. We will provide an HDR Analyzer video with the update. We may post some alternate versions on YouTube as well.

post #616 of 647 Old 11-20-2019, 08:49 AM
sspears
Quote:
Originally Posted by ConnecTEDDD View Post
Hey Stacey,

I have some questions about the color grade of your disc. […]
We are planning to post an interview sometime next year with these types of questions discussed.

post #617 of 647 Old 11-20-2019, 11:20 AM
Oscarilbo
Quote:
Originally Posted by sspears View Post
It almost sounds like some of your sources are in full range and some in limited range. If everything were in the same range, they would match across sources. What is your source for streaming?
Thank you. My source is the same as for my 4K discs: the Sony UBP-X700 UHD player. I use it for discs and as a UHD streamer. I've seen YouTube videos with HDR and they look beautiful, but most Netflix shows (with and without HDR) look darker, almost clipping. The thing is, not all shows and movies look as dark; others look AWESOME, which makes me think it's just the disadvantages of streaming vs. actual physical media.
post #618 of 647 Old 11-20-2019, 05:36 PM
ConnecTEDDD
Quote:
Originally Posted by WiFi-Spy View Post
Also, this was made using DaVinci Resolve's CIE gamut scopes, where the video has probably been converted to floating-point RGB before the calculations.

No, unfortunately you guessed wrong.

Quote:
Originally Posted by WiFi-Spy View Post
Something like the AJA/Colorfront HDR Analyzer is a better tool for this.
Quote:
Originally Posted by sspears View Post
Using the AJA HDR Analyzer, nothing ends up outside of 2020. So it could be a color conversion issue, scopes set to something larger than 2020, or a bug in the tool. We will provide an HDR Analyzer video with the update. We may post some alternate versions on YouTube as well.
Hey guys,

The AJA HDR Analyzer can only analyze an SDI video source input, which is useful for post-production, broadcast, or shooting on location, but NOT for analyzing a final encoded file.

So any test signal you send to the AJA will no longer be the native YCbCr 4:2:0 bit-for-bit video data, as it will have picked up colorspace conversion/processing from the playback software and from the video card used to output the signal to the HDR Analyzer.

The AJA HDR Analyzer is a combination of AJA's Kona 4 HD-SDI input/output cards and an NVIDIA Quadro P4000 GPU, using analysis capabilities licensed from Colorfront's QC Player software. It's a true reference device for specific applications, but it can't perform the kind of test I ran.

The test I performed is more accurate, as I used one of the best QC analysis packages on the professional market, which examines the actual file's YCbCr data (plus metadata) without any scaling.

I used GrayMeta Iris QC Pro DHQC, which retails for $10,000, the same as Colorfront's QC Player; both are among the best solutions for QC testing of video files.

Iris is true reference QC software, used to analyze IMF packages, DCPs, etc.

GrayMeta (previously Archimedia) has also created HDR patterns for Netflix, and it has been on Netflix's list of recommended QC software for many years, so I can't call it 'buggy' or say it 'has issues', as it's designed for exactly this kind of reference-level QC testing:

https://partnerhelp.netflixstudios.c...-by-Archimedia

All the available video patterns/audio tests available for Netflix, all have been created from Archimedia company (Archimedia Changes Name to GrayMeta): https://www.netflix.com/gr-en/watch/80018585

The out of gamut traces, mainly are coming from the fact that the clip has values below 64. There frames with above 940 (10000 nits) also, until ~20000 nits or more.
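To put those code values in perspective, here is a rough Python sketch (my own illustration, not from any QC tool) of the SMPTE ST 2084 PQ EOTF for 10-bit limited-range video. Code 940 decodes to exactly 10,000 nits, and "whiter than white" codes above 940 decode well beyond that:

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_code_to_nits(code: int) -> float:
    """Map a 10-bit limited-range luma code (64 = black, 940 = peak) to cd/m^2."""
    e = max((code - 64) / 876.0, 0.0)  # normalized signal; > 1.0 for codes above 940
    p = e ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_code_to_nits(64))    # 0.0 (reference black)
print(pq_code_to_nits(940))   # 10000.0 (reference peak)
print(pq_code_to_nits(1019))  # ~24000 -- WTW codes land far above 10,000 nits
```

So code values a QC tool flags above 940 really do represent luminance beyond the nominal 10,000-nit PQ ceiling.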




The in-picture frame overlay is the HDR Histogram, which shows the distribution of luminance levels across the entire frame.

Its limits are taken automatically from the Mastering Display Luminance minimum and maximum values in the clip's HDR10 metadata.

So the levels shown in red are brightness levels below and above the Mastering Display Luminance values. This specific HDR10 clip had a 0.005-nit black (min) and a 10,000-nit white (max).

Levels within the Mastering Display Luminance range are shown in green.



Using FilmLight Baselight, the below-64 data is visible and can be double-confirmed.





It can be seen in Resolve as well: the bright elements of the picture sit closer to the center of the REC.2020 CIE diagram, while the black/darker parts of the image hit the edge of REC.2020.

The problem is that, since you used Resolve, you can't bypass the video-to-data range conversion when you import a clip: video levels 64-940 are mapped to 0-1023, so you can't see WTW or BTB issues using Resolve; you have no control over it, and it's impossible to bypass that scaling.

But you can bypass that scaling with FilmLight Baselight, where you can control whether or not the levels are scaled to full range, something Resolve can't do:
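To illustrate why that automatic scaling destroys the information (a hypothetical helper, not Resolve's actual code): expanding legal 64-940 to full 0-1023 clamps everything outside the legal range, so BTB and WTW excursions become indistinguishable from 0 and 1023.

```python
# Sketch of a video (legal) to data (full) range expansion on import.
# 10-bit legal luma 64-940 is stretched to 0-1023; anything below 64 or
# above 940 clips, so BTB/WTW excursions are no longer visible afterwards.

def legal_to_full(code: int) -> int:
    full = round((code - 64) * 1023 / 876)
    return min(max(full, 0), 1023)  # the clamp is where BTB/WTW info is lost

for code in (40, 63, 64, 940, 1000):
    print(code, "->", legal_to_full(code))
# 40 and 63 both map to 0; 940 and 1000 both map to 1023
```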

omarank likes this.

Ted's LightSpace CMS Calibration Disk Free Version for Free Calibration Software: LightSpace DPS / CalMAN ColorChecker / HCFR
S/W: LightSpace CMS, SpaceMan ICC, SpaceMatch DCM, CalMAN 5, CalMAN RGB, ChromaPure, ControlCAL
V/P: eeColor 3D LUT Box - P/G: DVDO AVLab TPG
Meters: JETI Specbos 1211, Klein K-10A, i1PRO2, i1PRO, SpectraCAL C6, i1D3, C5
ConnecTEDDD is online now  
post #619 of 647 Old 11-20-2019, 06:01 PM
AVS Forum Special Member
 
sspears's Avatar
 
Join Date: Feb 1999
Location: Sammamish, WA, USA
Posts: 5,357
Mentioned: 23 Post(s)
Tagged: 0 Thread(s)
Quoted: 340 Post(s)
Liked: 539
Quote:
Originally Posted by ConnecTEDDD View Post

The AJA HDR Analyzer can only analyze an SDI video source input. It is useful for post-production, broadcast, or shooting on location, but NOT for analyzing a final encoded file.

So any test signal you send to the AJA will no longer be the native YCbCr 4:2:0 bit-for-bit video data; it will have picked up colorspace conversion/processing from the playback software and from the video card used to output the signal to the AJA HDR Analyzer.

The AJA HDR Analyzer is a combination of AJA’s Kona 4 HD-SDI input/output cards, an NVIDIA Quadro P4000 GPU, and analysis capabilities licensed from Colorfront’s QC Player software. It is a genuine reference device for specific applications, but it can't match the kind of test I performed.
The CIE diagram you posted was created in xyY, not 4:2:0. The software has to go through the same steps as AJA to create a CIE diagram.

Stacey Spears
Co-Creator, Spears & Munsil UHD HDR Benchmark
sspears is offline  
post #620 of 647 Old 11-20-2019, 06:08 PM
AVS Forum Special Member
 
ConnecTEDDD's Avatar
 
Join Date: Sep 2010
Location: Athens, Greece
Posts: 8,786
Mentioned: 220 Post(s)
Tagged: 1 Thread(s)
Quoted: 3473 Post(s)
Liked: 4309
Quote:
Originally Posted by lukewayne View Post
I got ahold of Canon support, and they suspect there is something wrong with my projector causing the uncorrectable green shadows. I'm bringing it in for service tomorrow, hopefully, they'll help me get to the bottom of it. Here's a folder with a bunch of pictures of the issues I was experiencing. https://drive.google.com/drive/folde...PY?usp=sharing
There was a strange thing happening on the Brightness test pattern, where the background checker would appear and then disappear and then appear again while moving the settings in a single direction. Anyway, if it is still doing that when I get it back from service I'll be sure to make some fresh pictures and ask about it.
Hi, why does it say SDR EOTF? That is not correct.

https://drive.google.com/file/d/1s-M...ew?usp=sharing

Check your player's output or your projector's input colorspace settings, and connect the player directly to the projector to make sure no other device can affect the signal.

Ted's LightSpace CMS Calibration Disk Free Version for Free Calibration Software: LightSpace DPS / CalMAN ColorChecker / HCFR
S/W: LightSpace CMS, SpaceMan ICC, SpaceMatch DCM, CalMAN 5, CalMAN RGB, ChromaPure, ControlCAL
V/P: eeColor 3D LUT Box - P/G: DVDO AVLab TPG
Meters: JETI Specbos 1211, Klein K-10A, i1PRO2, i1PRO, SpectraCAL C6, i1D3, C5
ConnecTEDDD is online now  
post #621 of 647 Old 11-20-2019, 06:24 PM
AVS Forum Special Member
 
ConnecTEDDD's Avatar
 
Join Date: Sep 2010
Location: Athens, Greece
Posts: 8,786
Mentioned: 220 Post(s)
Tagged: 1 Thread(s)
Quoted: 3473 Post(s)
Liked: 4309
Quote:
Originally Posted by sspears View Post
The CIE diagram you posted was created in xyY, not 4:2:0. The software has to go through the same steps as AJA to create a CIE diagram.
The AJA is not getting the exact same signal.

I'm talking about analyzing the YCbCr-encoded file as YCbCr, without any transformation, scaling, or matrix...

What QC software have you used?

About Dolby Vision QC: there is a specific procedure which is not possible to perform with Resolve and the iCMU (and you used the iCMU); the AJA Kona that can be used is limited to HD, so you need an eCMU with Resolve.

Dolby recommends performing QC with 2 different software packages.

Ted's LightSpace CMS Calibration Disk Free Version for Free Calibration Software: LightSpace DPS / CalMAN ColorChecker / HCFR
S/W: LightSpace CMS, SpaceMan ICC, SpaceMatch DCM, CalMAN 5, CalMAN RGB, ChromaPure, ControlCAL
V/P: eeColor 3D LUT Box - P/G: DVDO AVLab TPG
Meters: JETI Specbos 1211, Klein K-10A, i1PRO2, i1PRO, SpectraCAL C6, i1D3, C5
ConnecTEDDD is online now  
post #622 of 647 Old 11-20-2019, 06:46 PM
AVS Forum Special Member
 
sspears's Avatar
 
Join Date: Feb 1999
Location: Sammamish, WA, USA
Posts: 5,357
Mentioned: 23 Post(s)
Tagged: 0 Thread(s)
Quoted: 340 Post(s)
Liked: 539
Quote:
Originally Posted by ConnecTEDDD View Post
The AJA is not getting the exact same signal.

I'm talking about analyzing the YCbCr-encoded file as YCbCr, without any transformation, scaling, or matrix...

What QC software have you used?

About Dolby Vision QC: there is a specific procedure which is not possible to perform with Resolve and the iCMU (and you used the iCMU); the AJA Kona that can be used is limited to HD, so you need an eCMU with Resolve.

Dolby recommends performing QC with 2 different software packages.
We wrote our own QC tool.

An external CMU was used.

Stacey Spears
Co-Creator, Spears & Munsil UHD HDR Benchmark
sspears is offline  
post #623 of 647 Old 11-20-2019, 06:58 PM
AVS Forum Special Member
 
ConnecTEDDD's Avatar
 
Join Date: Sep 2010
Location: Athens, Greece
Posts: 8,786
Mentioned: 220 Post(s)
Tagged: 1 Thread(s)
Quoted: 3473 Post(s)
Liked: 4309
Quote:
Originally Posted by sspears View Post
We wrote our own QC tool.

An external CMU was used.
So you haven't cross-verified with industry-standard QC software, like Colorfront's?

Checking the clips, there are a lot of illegal black values under 64, sometimes close to a true 0; hasn't your QC tool checked for such things?

Dolby allows QC to be performed directly from Resolve with an eCMU and an AJA card, but only over HDMI tunneling with the DoVi metadata connected to a C7/C8 (or better, a C9), and this process is limited to HD resolution.

I think the best way is to encode the file into UltraHD Dolby Vision Profile 5 and play it back directly on different Dolby Vision-compatible TV models.

I don't think it is possible to write your own tool to QC Dolby Vision.

As you know, there are different steps in a Dolby Vision quality check.

QC is based on visual inspection from the color workstation, but that is not possible with Resolve and the internal CMU (on blu-ray.com you said a softCMU was used).

You need to visually check the difference between the master monitor and a consumer Dolby Vision LG OLED TV.

For that reason, the better way is to encode the file into UltraHD Dolby Vision Profile 5 and read it directly from USB on different Dolby Vision-compatible TV models.

The LG OLEDs need to be close to the master monitor to visually check for differences caused by the encoding and by the conversion from RGB full range to YCbCr TV legal range.

I asked you the many questions earlier about the workflow process because I really appreciate your work and because I want to be sure about quality.

I have checked, following all the recommendations, and using many different methods I can see that there are a lot of illegal blacks under 64, in some cases close to 0.

Can you please answer about that?

Ted's LightSpace CMS Calibration Disk Free Version for Free Calibration Software: LightSpace DPS / CalMAN ColorChecker / HCFR
S/W: LightSpace CMS, SpaceMan ICC, SpaceMatch DCM, CalMAN 5, CalMAN RGB, ChromaPure, ControlCAL
V/P: eeColor 3D LUT Box - P/G: DVDO AVLab TPG
Meters: JETI Specbos 1211, Klein K-10A, i1PRO2, i1PRO, SpectraCAL C6, i1D3, C5

Last edited by ConnecTEDDD; 11-20-2019 at 07:21 PM.
ConnecTEDDD is online now  
post #624 of 647 Old 11-20-2019, 07:34 PM
Member
 
lukewayne's Avatar
 
Join Date: Mar 2008
Location: Los Angeles, CA
Posts: 35
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 11 Post(s)
Liked: 15
Quote:
Originally Posted by ConnecTEDDD View Post
Hi, why it says SDR EOTF? This is not correct.

https://drive.google.com/file/d/1s-M...ew?usp=sharing

Check your player output or projector input colorspace settings, connect the player directly with projector to be sure no other device can affect your signal.
I was checking the HDR-to-SDR conversion of my Panasonic UHD player at that moment; most of those pictures were in HDR mode.

Canon currently has my projector; I'm going to pick it up tomorrow. They were able to recreate all the green issues I was having in their lab and make some major adjustments to the projector (not consumer-accessible adjustments). They also had a firmware update that wasn't available on their site, so I'm very excited to get it back.

I'm also excited to bring my S&M Benchmark disc with me when I pick it up, to see how it looks before I drive it back home.

Emotiva RMC-1 | Emotiva XPA DR3 | Emotiva XPA-8 Gen 3
7.1.4 - Polk LSiM707 (front 3) - LSiM702F/X (surround) - 265-LS (atmos)
Canon 4K600Z | Seymour Screen Excellence Trim TB-130-NEO
Panasonic DP-UB9000 | Oppo UDP-203 |Xbox One X - PS3
lukewayne is offline  
post #625 of 647 Old 11-20-2019, 07:40 PM
AVS Forum Special Member
 
sspears's Avatar
 
Join Date: Feb 1999
Location: Sammamish, WA, USA
Posts: 5,357
Mentioned: 23 Post(s)
Tagged: 0 Thread(s)
Quoted: 340 Post(s)
Liked: 539
Quote:
Originally Posted by ConnecTEDDD View Post
So you haven't cross-verified with a industry standard QC software, like ColorFont's?

Checking the clips there a lot of illegal black values under 64 and sometimes close from the real 0 value, does your QC tool hasn't checked for such stuff?

Dolby allow to perform QC directly from Resolve with eCMU and AJA card only over HDMI Tunneling with DoVi Metadata connected to a C7/C8 or better the C9 but this process is limited @ HD resolution.

I think that the best way is to encode the file into UltraHD Dolby Vision Profile 5 and to playback the file directly using different Dolby Vision compatible TV models.
How did you get the idea it was not QC'd, given that it was graded at Dolby?

HDR movies are graded in full range, not limited range. The conversion to limited range happens prior to encoding. Here is the peacock frame; I highlighted the pixels <= 64 in green. The block in the image marks where the pixel values shown on the right come from. This shot has noise around the feather. The dither caused the noise to go below 64, which is normal. If you clamp it, you can end up with little black squares in the image, which looks ugly.
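As a rough illustration of that dither effect (made-up numbers, not our actual pipeline): quantizing a near-black full-range value to limited range with dither naturally produces some codes under 64, and hard-clamping those would flatten the noise into solid black patches.

```python
# Illustrative only: dither a near-black signal into 10-bit limited range.
import random

random.seed(7)

def to_limited_with_dither(value_full: float) -> int:
    """value_full in 0.0-1.0; returns a 10-bit limited-range code."""
    code = 64 + value_full * 876
    return round(code + random.uniform(-1.0, 1.0))  # +/- 1 code of dither

near_black = [to_limited_with_dither(0.0005) for _ in range(1000)]
below = sum(c < 64 for c in near_black)
print(below, "of 1000 dithered near-black pixels landed below code 64")
```

Clamping those sub-64 codes back to 64 turns natural-looking noise into uniform black squares, which is why we leave them alone.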



The pixels >940 (in RGB, not YCbCr) in the Seattle skyline are on the crane on the far right. This was an issue with Resolve, where it clipped the gamut on render but was fine during QC prior to render. It was much worse before Transcoder (Colorfront) corrected it (as best it could, given that one channel was gone). It should be much more blue, with very little purple. There is some purple in the LA skyline that should also be blue; there is purple in there too, but one door has purple light around it that is supposed to be blue. Again, it was blue during QC and clipped on render.

Edit: Sorry, I forgot to mention that all QC is done in full range. This includes the encoded frames: they are decoded back into full range and compared against the source frames.
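A toy sketch of that kind of round-trip check (hypothetical helpers, not our actual QC tool): re-expand the encoded limited-range values back to full range and diff them against the full-range source, flagging any pixel that moved more than a tolerance.

```python
# Illustrative round-trip QC on a toy 10-bit "frame".

def full_to_legal(v: int) -> int:
    """10-bit full (0-1023) -> 10-bit legal (64-940), as an encoder would."""
    return round(v * 876 / 1023) + 64

def legal_to_full(c: int) -> int:
    """Inverse mapping back to full range for comparison against the source."""
    return round((c - 64) * 1023 / 876)

source = [0, 1, 512, 1022, 1023]  # toy full-range source values
decoded = [legal_to_full(full_to_legal(v)) for v in source]

# flag any pixel that drifted more than 1 code from the source
errors = [(s, d) for s, d in zip(source, decoded) if abs(s - d) > 1]
print(decoded)  # matches the source here
print(errors)   # []
```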
KC-Technerd and mrtickleuk like this.

Stacey Spears
Co-Creator, Spears & Munsil UHD HDR Benchmark

Last edited by sspears; 11-20-2019 at 07:50 PM.
sspears is offline  
post #626 of 647 Old 11-21-2019, 04:05 AM
AVS Forum Special Member
 
Light Illusion's Avatar
 
Join Date: Aug 2010
Posts: 1,882
Mentioned: 30 Post(s)
Tagged: 0 Thread(s)
Quoted: 809 Post(s)
Liked: 1213
Quote:
Originally Posted by bobof View Post
My (limited) understanding is that the math that defines YCbCr - which is the format typically used for video - means there are some YCbCr pixel values that, when turned into a "real" colour, are actually outside the gamut.
This is because, although the gamut is defined by RGB primaries, there is effectively a rotation of the RGB space to fit the whole of it within a YCbCr space that is slightly larger for a given set of primaries. So most of the pixel values it is possible to put in the stream are in gamut, but some will be slightly out of gamut.

I'm not sure if the errant points typically come from compression artifacts, or tend to be math artifacts from processing in the post chain, or what. Either way, I guess you'd expect a playback device to ultimately crop them to an appropriate place on the gamut edge when there is the eventual conversion back to RGB (or WRGB) to get the image onto a panel.
It is correct that YCbCr can contain colours that RGB cannot.
But, as images are predominantly generated in RGB and then converted into YCbCr, there should be few 'out-of-gamut' excursions in the final YCbCr version.

I have to say the excursions in these examples from Ted look far greater than expected.
(This was something we had to deal with a lot when I worked at Quantel.)
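For anyone curious, here is a quick numeric sketch (my own illustrative numbers) of the point above: a perfectly legal BT.2020 YCbCr triplet can decode to RGB outside 0-1, because the YCbCr cube is larger than the rotated RGB cube that sits inside it.

```python
# BT.2020 (non-constant-luminance) luma coefficients
KR, KB = 0.2627, 0.0593
KG = 1.0 - KR - KB

def ycbcr_to_rgb(y: float, cb: float, cr: float) -> tuple:
    """Y in 0-1, Cb/Cr in -0.5..0.5; returns normalized (R, G, B)."""
    r = y + 2 * (1 - KR) * cr
    b = y + 2 * (1 - KB) * cb
    g = (y - KR * r - KB * b) / KG  # from Y = KR*R + KG*G + KB*B
    return (r, g, b)

rgb = ycbcr_to_rgb(0.1, 0.4, -0.3)       # legal YCbCr values...
print(rgb)                               # ...but R comes out negative
print(any(c < 0 or c > 1 for c in rgb))  # True: outside the RGB gamut
```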

Steve

Steve Shaw
LIGHT ILLUSION

Light Illusion is offline  
post #627 of 647 Old 11-21-2019, 04:28 AM
aka jfinnie
 
Join Date: Aug 2015
Location: Norwich, UK
Posts: 3,516
Mentioned: 60 Post(s)
Tagged: 0 Thread(s)
Quoted: 2830 Post(s)
Liked: 1961
Quote:
Originally Posted by Light Illusion View Post
It is correct that YCbCr can contain colours that RGB cannot.
But, as images are predominantly generated in RGB and then converted into YCbCr, there should be few 'out-of-gamut' excursions in the final YCbCr version.

I have to say the excursions in these examples from Ted look far greater than expected.
(This was something we had to deal with a lot when I worked at Quantel.)
Fair enough. I was more answering @mrtickle's question about how it could even be possible for there to be excursions beyond the RGB primary points. How those excursions got there, and which of the two views is correct (or whether both are correct but at different points in the chain, one pre-encode and the other post-encode), I'll leave to folk who know more about the subject than I do to investigate.

One thing that is missing in the (2D) CIE charts is of course the luminance of the pixels that have excursions. Has anyone made a similar viewer of the content's pixel colours that places them in their 3D space? If these are all almost-black or insanely bright pixels, it may well end up being a "meh" question.

@ConnecTEDDD - out of interest, have you done similar analysis of other real HDR content mastered to very high saturation and luminance levels? How does that look through the same tools and settings you are using? Are they hard clamped within 0-10000 nits and to the gamut edges?
mrtickleuk likes this.
bobof is online now  
post #628 of 647 Old 11-21-2019, 11:43 AM
AVS Forum Special Member
 
ConnecTEDDD's Avatar
 
Join Date: Sep 2010
Location: Athens, Greece
Posts: 8,786
Mentioned: 220 Post(s)
Tagged: 1 Thread(s)
Quoted: 3473 Post(s)
Liked: 4309
Quote:
Originally Posted by sspears View Post
How did you get the idea it was not QC'd, given that it was graded at Dolby?

HDR movies are graded in full range, not limited range. The conversion to limited range happens prior to encoding. Here is the peacock frame; I highlighted the pixels <= 64 in green. The block in the image marks where the pixel values shown on the right come from. This shot has noise around the feather. The dither caused the noise to go below 64, which is normal. If you clamp it, you can end up with little black squares in the image, which looks ugly.
Dolby Vision encoding is performed from, for example, IMF, TIFF, or JPEG2000 files. The first pre-encoding step is done in full-range YUV colorspace; it produces the pre-encoded file together with the RPU file (metadata).

After that, Dolby's Encoding Engine applies the scaling and encodes the final file as digital YCbCr into the different profiles.

The picture from the software analyzer you posted shows the pre-encoded file, which is YUV but full range; no scaling has been applied at that point.

So the analysis you posted is of neither the master RGB file nor the final YCbCr file; it's the pre-encoded YUV file, somewhere in the middle of the encoding process.

I don't see any serious value in that picture: it has no scaling applied, and because it is in YUV colorspace it isn't analyzing the final delivery file, unlike the QC test I performed.

Because of a limitation in some Resolve workflows, there's no way to correctly handle 10,000-nit footage, because the whole Dolby Vision algorithm works with a maximum 4,000-nit input mastering display. For that reason I asked you earlier whether the master grade was performed at 4,000 nits and then extrapolated to 10,000 nits.

I believe the grade was done at 4,000 nits and you used an algorithm, or maybe a 3D LUT and a 1D LUT, to map it to 10,000 nits.

Maybe the WonderLookPRO software was used, which can easily build LUTs for HDR purposes since it can manipulate signals up to 10,000 nits.

Using Baselight, Resolve, or Nucoda, you can only choose 4,000 nits as the maximum mastering display in the default workflows. Or maybe you used your own color pipeline, with Paul Dore's DCTL or OFX plugins, and manipulated the math yourself?

Can your custom QC software analyze and post the frame from the final YCbCr file in video legal range?

Ted's LightSpace CMS Calibration Disk Free Version for Free Calibration Software: LightSpace DPS / CalMAN ColorChecker / HCFR
S/W: LightSpace CMS, SpaceMan ICC, SpaceMatch DCM, CalMAN 5, CalMAN RGB, ChromaPure, ControlCAL
V/P: eeColor 3D LUT Box - P/G: DVDO AVLab TPG
Meters: JETI Specbos 1211, Klein K-10A, i1PRO2, i1PRO, SpectraCAL C6, i1D3, C5
ConnecTEDDD is online now  
post #629 of 647 Old 11-21-2019, 12:56 PM
AVS Forum Special Member
 
ConnecTEDDD's Avatar
 
Join Date: Sep 2010
Location: Athens, Greece
Posts: 8,786
Mentioned: 220 Post(s)
Tagged: 1 Thread(s)
Quoted: 3473 Post(s)
Liked: 4309
Quote:
Originally Posted by bobof View Post
@ConnecTEDDD - out of interest, have you done similar analysis of other real HDR content mastered to very high saturation and luminance levels? How does that look through the same tools and settings you are using? Are they hard clamped within 0-10000 nits and to the gamut edges?
I'm trying to find a frame still from an actual commercial HDR movie release that has both a large gamut and high brightness at the same time, but it's not that easy. The colors used are not as saturated as people believe; I scanned about 20 HDR10 movies today, and the luminance range commonly used is between 80-500 nits.

But there are demo clips from Samsung/Sony/LG with boosted colors that can only be good for stores. The most boosted clip is LG's demo 'Cymatic Jazz'; below are the RGB color traces of all the clip's frames combined:



There are other clips without such a color boost, like Sony's 'Swordsmith':



or LG's OLED HDR Tech Demo:



Color traces of all frames combined, for the full movie The Lion King (2019) HDR10:



Stacey's Contrast pattern @ 10,000 nits:



Stacey's Contrast pattern @ 1,000 nits:



Below is a still frame from LG's 'Cymatic Jazz' HDR10 demo, which goes over 10K nits.

But it doesn't have values below 64; the HDR Histogram shows red near black because the clip's mastering display metadata has a 0.004-nit black, so the 0-0.004 nit range is marked as out of range.




The clip has a mastering display peak of 1,100 nits; for that reason the HDR Histogram shows a lot of red on the right side (because those levels are above 1,100 nits).
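A minimal sketch (hypothetical, modeled on the histogram behavior described above) of that classification: each pixel's luminance is compared against the clip's Mastering Display Luminance metadata, green inside the [min, max] range and red outside it.

```python
# Classify pixel luminance against HDR10 Mastering Display Luminance metadata.

def classify(nits: float, mdl_min: float, mdl_max: float) -> str:
    """'green' if inside the mastering display range, 'red' otherwise."""
    return "green" if mdl_min <= nits <= mdl_max else "red"

# Cymatic Jazz-style metadata: 0.004-nit min, 1,100-nit max
pixels = [0.001, 0.05, 500.0, 1100.0, 4000.0]
print([classify(p, 0.004, 1100.0) for p in pixels])
# ['red', 'green', 'green', 'green', 'red']
```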

Ted's LightSpace CMS Calibration Disk Free Version for Free Calibration Software: LightSpace DPS / CalMAN ColorChecker / HCFR
S/W: LightSpace CMS, SpaceMan ICC, SpaceMatch DCM, CalMAN 5, CalMAN RGB, ChromaPure, ControlCAL
V/P: eeColor 3D LUT Box - P/G: DVDO AVLab TPG
Meters: JETI Specbos 1211, Klein K-10A, i1PRO2, i1PRO, SpectraCAL C6, i1D3, C5
ConnecTEDDD is online now  
post #630 of 647 Old 11-21-2019, 01:32 PM
AVS Forum Special Member
 
sspears's Avatar
 
Join Date: Feb 1999
Location: Sammamish, WA, USA
Posts: 5,357
Mentioned: 23 Post(s)
Tagged: 0 Thread(s)
Quoted: 340 Post(s)
Liked: 539
Quote:
Originally Posted by ConnecTEDDD View Post
Dolby Vision encoding is performed from, for example, IMF, TIFF, or JPEG2000 files. The first pre-encoding step is done in full-range YUV colorspace; it produces the pre-encoded file together with the RPU file (metadata).

After that, Dolby's Encoding Engine applies the scaling and encodes the final file as digital YCbCr into the different profiles.

The picture from the software analyzer you posted shows the pre-encoded file, which is YUV but full range; no scaling has been applied at that point.

So the analysis you posted is of neither the master RGB file nor the final YCbCr file; it's the pre-encoded YUV file, somewhere in the middle of the encoding process.

I don't see any serious value in that picture: it has no scaling applied, and because it is in YUV colorspace it isn't analyzing the final delivery file, unlike the QC test I performed.

Because of a limitation in some Resolve workflows, there's no way to correctly handle 10,000-nit footage, because the whole Dolby Vision algorithm works with a maximum 4,000-nit input mastering display. For that reason I asked you earlier whether the master grade was performed at 4,000 nits and then extrapolated to 10,000 nits.

I believe the grade was done at 4,000 nits and you used an algorithm, or maybe a 3D LUT and a 1D LUT, to map it to 10,000 nits.

Maybe the WonderLookPRO software was used, which can easily build LUTs for HDR purposes since it can manipulate signals up to 10,000 nits.

Using Baselight, Resolve, or Nucoda, you can only choose 4,000 nits as the maximum mastering display in the default workflows. Or maybe you used your own color pipeline, with Paul Dore's DCTL or OFX plugins, and manipulated the math yourself?

Can your custom QC software analyze and post the frame from the final YCbCr file in video legal range?
The peacock feather image I posted was limited range, not full; you can tell because black is at 64. It is from the 10-bit 4:2:0 YUV base layer created by Pixelogic through their Dolby Vision pipeline.

No LUT was used to convert to 10,000. What was encoded came straight out of Resolve from the grading session. I don't know what WonderLookPRO or Paul Dore's DCTL/OFX plug-ins are; I have never heard of them. Had the content been altered like that after the grade, the Dolby Vision metadata would have been broken.
mrtickleuk likes this.

Stacey Spears
Co-Creator, Spears & Munsil UHD HDR Benchmark

Last edited by sspears; 11-21-2019 at 04:17 PM.
sspears is offline  