
HDR Calibration & Discussion

117K views 884 replies 100 participants last post by  oldmanavsforum 
#1 · (Edited)
I'm opening this thread to collect all current and any upcoming information about the available software solutions that support HDR calibration/profiling, plus HDR calibration methods/workflows/hardware/pattern generators etc.

Here are the available solutions we have now:

CalMAN 5

HDR Calibration Support: ST.2084 Gamma available from CalMAN Ultimate/Studio 5.4.1 released @ 5 March 2015 (or later version).
ST.2084 Gamma available from CalMAN Enthusiast 5.6.0 RC1 released @ 7 October 2015 (or later version).

License Level Required for HDR Calibration: CalMAN Ultimate, CalMAN Studio, CalMAN Enthusiast.

ChromaPure 2.x

HDR Support: Not yet supported. Expected to be supported in ChromaPure 3.x.

License Level Required for HDR Calibration: To be announced.

HCFR

HDR Calibration Support: ST.2084 Gamma available from HCFR 3.3.0 released @ 6 June 2015 (or later version). *Supports HDR Parametric Gamma also.

License Level Required for HDR Calibration: Free (Open Source) Software

LightSpace

HDR Calibration Support: ST.2084 Gamma available from LightSpace 6.6.7.2061 released @ 25 March 2015 (or later version).

License Level Required for HDR Calibration: All license levels support it.

Notes: LightSpace supports HDR Parametric Gamma capability also. Users can download an example HDR Parametric Gamma profile from the LightSpace website; this one is for the Sony BVM-X300, which is used by some major post facilities, including Light Iron, who were one of the first to start using the Sony display.

dispcalGUI (Powered by ArgyllCMS)

HDR Calibration Support: ST.2084 Gamma available from dispcalGUI 3.0.3 released @ 6 July 2015 (or later version).

License Level Required for HDR Calibration: Free (Open Source) Software.
 
#2 · (Edited)
Here is the list of the published Peak Luminance Range Limit for each meter available on the market:

Colorimeters

X-Rite DTP-94 up to 1,000 cd/m2
X-Rite i1Display 2 up to 3,000 cd/m2
Sencore CP-6000 up to 1,000 cd/m2
X-Rite Chroma 5 up to 1,000 cd/m2
X-Rite Hubble up to 1,350 cd/m2
Sencore OTC-1000 up to 1,350 cd/m2
X-Rite i1Display PRO (i1d3 OEM/Retail i1d3) up to 1,000 cd/m2
SpectraCAL C6 (Branded OEM i1d3) up to 1,000 cd/m2
SpectraCAL C6-HDR (Branded OEM i1d3) up to 1,300 cd/m2
Datacolor Spyder 2 up to 5,000 cd/m2
Datacolor Spyder 3 up to 5,000 cd/m2
Datacolor Spyder 4 up to 5,000 cd/m2
Datacolor Spyder 5 up to 5,000 cd/m2
BasICColor Discus up to 2,500 cd/m2
Colorimetry Research CR-100 up to 5,140 cd/m2
Klein K-80 up to 10,000 cd/m2
Klein K-10A up to 10,000 cd/m2
Minolta CS-100A up to 300,000 cd/m2
Minolta CA-210 up to 1,000 cd/m2
Minolta CA-310 up to 1,000 cd/m2
Minolta CS-200 up to 20,000,000 cd/m2

Spectroradiometers/Spectrophotometers

X-Rite ColorMunki up to 1,000 cd/m2
X-Rite i1PRO1 up to 300 cd/m2
X-Rite i1PRO2 up to 1,200 cd/m2
JETI 1201 up to 70,000 cd/m2 (using optional JETI filters... up to 75,000/250,000 cd/m2)
JETI 1211 up to 2,500 cd/m2 (using optional JETI filters... up to 10,000/25,000/50,000/75,000/250,000 cd/m2)
JETI 1501 up to 150,000 cd/m2
JETI 1511 up to 150,000 cd/m2
Colorimetry Research CR-250RH up to 154,180 cd/m2
Photo Research PR-650 up to 5,000 cd/m2
Photo Research PR-655 up to 15,000 cd/m2
Photo Research PR-670 up to 8,566,000 cd/m2
Photo Research PR-680 up to 17,130,000 cd/m2
Minolta CS-1000 up to 80,000 cd/m2
Minolta CS-2000 up to 500,000 cd/m2
Minolta CS-2000A up to 500,000 cd/m2
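To make the table above easier to act on, here is a minimal sketch (the meter dictionary and helper function are illustrative only, not any vendor's API) that filters meters by whether their published range covers a target peak luminance:

```python
# Published peak-luminance limits (cd/m2) for a few of the meters listed above.
METER_PEAK_LIMITS = {
    "X-Rite i1Display PRO": 1_000,
    "SpectraCAL C6-HDR": 1_300,
    "BasICColor Discus": 2_500,
    "Colorimetry Research CR-100": 5_140,
    "Klein K-10A": 10_000,
}

def meters_for_peak(target_nits, limits=METER_PEAK_LIMITS):
    """Return the meters whose published range covers the target peak luminance."""
    return sorted(name for name, limit in limits.items() if limit >= target_nits)

print(meters_for_peak(4_000))  # only the CR-100 and K-10A cover a 4,000-nit peak
```

The same check applied at 1,000 nits passes every meter in this short list, which is why the HDR question only really bites on displays brighter than the classic 1,000 cd/m2 colorimeter limit.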
 
#9 ·
HDR News Updates

CalMAN 5.6.1 RC1 release adds, for the CalMAN Expert, Professional & Ultimate license levels:

HDR-10 support for Quantum Data 780 and 804 pattern generators. (This requires firmware version 15092260 or higher)
HDR-10 support for Astro Design VG-876 & VG-877 Video Signal Generators.

HDR-10 is a standard used for mastering content; it carries metadata describing the mastering monitor's peak brightness and native gamut. That metadata describes the mastering display, not consumer devices (displays/projectors etc.).

Light Illusion added a page related to HDR here: http://www.lightillusion.com/hdr.html
 
#10 ·
That LightIllusion page is wrong in so many ways. Everything from their explanation of useful dynamic range, to the impact of viewing distance, to the claim that 650 nits is excessive. Here is an obvious question: If 650 was excessive then why wasn't the standard limited to 650? And suggesting you can clip the 2084 curve to your display white luminance? That is a guaranteed way to destroy the image quality. Just, wow. They really don't get it.
 
#13 · (Edited)
The problem with HDR is that it is presently the most ill-defined concept we have seen in a long while.

There are no standards that are as yet fully defined, and the various hardware manufacturers have gone off half-cocked in an attempt to sell more displays.

As a 'colour management' company we have to research all the various proposals to make sure LightSpace CMS can manage them as required, and in doing that we obviously have to have a lot of direct access and experience of different displays.

Having sat in front of a large number of HDR displays we have our own 'feelings' on what works and what doesn't.
But we have to be able to deal with all variants.

For example, one of the points that often comes up when calibrating a display is 'fatigue' levels.
Excessive HDR causes (can cause?) serious eye fatigue.

There is also the issue of the way dynamic backlighting works.
ABL with plasmas was a real pain, and HDR is no different - potentially worse.
The last HDR display we played with had a real issue with the screen taking serious time to dim after a bright scene.

And while HDR is High Dynamic Range, WCG (wide colour gamut) has also been brought into the mix.
This is causing yet another level of total standards failure.

This is really VHS vs. Betamax, but about 100 times worse...

Steve
 
#15 ·
Remember regular old HD? It wasn't fully defined until, um… let's see, when was ITU-R BT.1886 published? 2011. But it seemed to work pretty well before then. Here's what you are missing. This new container was made both big (10k nits) and wide (2020 primaries) so it wouldn't have to be redefined another 2, 5, or 10 years down the road every time display technology advances. It was made as large as reasonable with the knowledge that only a limited portion of it would be used initially. This is where 2086 comes into play. You start with a smaller utilization and grow into the larger range over time. Even with HD, 100 nits for white was completely undocumented but informally agreed on in practice, because that was all you could get out of the Sony reference CRTs. So what is going to happen with HDR and WCG? Well, there are two reference displays currently available to studios. Those will set a limit on what is actually produced for distribution. By the nature of the industry and the need to interchange content between facilities, they will converge on a common (although not necessarily SMPTE-standardized) target. But all you'll need to do is take a look inside the 2086 metadata to find out what that is. My suggestion: wait until CES, wait until test discs come out, and then try again.

Also, as to your ABL comment… one of the most linear (flattest APL curve) response displays I have ever seen is HDR and intended for consumer market. I have also seen some very junk ones as well, and I think the biggest risk for HDR is that too many low tier displays, especially with low quality dimming like you described, will be forced into the market by manufacturers and advertised as "HDR". There is a real danger that these will compromise the consumer impression of the format, when it is capable of so much more when done right.
 
#19 ·
HDR display configuration can only be used on material that has been mastered for HDR.
And only on material that has been mastered for the specific HDR format your display uses.

To attempt to use HDR on SDR material, or mix HDR formats (mastering with final display), will result in very inaccurate images.

This is another of the HDR issues - everything is so ill defined that it is almost impossible to guarantee that source matches destination.
And matching source to destination is key in accurate final image display, as defined by calibration.

And honestly, as TVs have been 100+ Nits for years, and have looked great, why would a projector that delivers around 100 Nits all of a sudden be HDR?

That is yet another HDR issue that has not been accurately resolved or defined.

Steve
 
#22 ·
And honestly, as TVs have been 100+ Nits for years, and have looked great, why would a projector that delivers around 100 Nits all of a sudden be HDR?
I fully agree with you on this. HDR projection wasn't defined in any of the standards because there is no HDR projection technology available in the market. Unfortunately, the manufacturers of projectors see the buzz around HDR and want to be able to capitalize on it, so you see products that are being sold as HDR when they really are not. They are "HDR-compatible", meaning they can recognize an HDR signal but don't actually reproduce it accurately. Sort of like a "4K-compatible" display that can receive a 4K signal but has only 720p native resolution.
 
#20 ·
There has to be a reason why directors want to color grade both their films and 4K Blu-ray movies with HDR (besides increasing sales). Are they looking to provide more pop in the mid-tones and shadows, where much of the emotion/envelopment resides? Or are they trying to provide more pop in the highlights to create more emotion/envelopment where it is difficult to do so (because it's so bright)? JVC projectors can create a small amount of pop in the mid-tones just by adjusting several of their controls. Is this what the directors want to do with films: provide more pop, but do it while grading the film and apply it to the full range of shadows, mid-tones and highlights? ... something that the JVC projector cannot tweak unless it too has HDR capabilities. What do directors think?
 
#21 ·
HDR provides directors a larger color palette to work with. It's not just about "pop", it lets them produce images that are fundamentally different from what you are accustomed to seeing on a standard display. What you see in projection, even at a Dolby Cinema location, is not full HDR. It is the bottom half (darks) of HDR plus a little extra range in the highlights. So they are making two different HDR versions of the movie when they do color grading. They make a theatrical HDR grade which is limited in the brights, and they make a home grade which is intended for viewing on a true HDR display. If you try to project a home HDR grade at 100 nits, even with a projector that can go fully black, it won't look right.
 
#23 ·
So the new Star Wars is coming out on December 18 in Dolby HDR. http://www.avsforum.com/forum/44-mo...ill-shown-dolby-vision-hdr-dolby-cinemas.html Scott Wilkinson has seen other HDR movies at the El Capitan Theatre in Hollywood, CA and really liked them. The commercial theater's Ymax is 100 nits (DCI), which falls into the range of these new JVC projectors, along with 12-bit and DCI. So why so cautious with HDR on home cinema projectors?
 
#24 ·
I actually see no difference between the professional so-called HDR projectors and consumer ones.

100 Nits is just not HDR in reality - it is just projection as it should always have been, to match standard home TVs when viewed in similar viewing environments - ie., dark.

Obviously, the size of the screen, and the resulting subtended viewing angle, means we do get close to the desired 45 degrees, which does make a difference to home viewing - but that also brings us back to some of my original points on HDR.

Steve
 
#25 · (Edited)
Looks like what the director and colorists will be grading "with HDR" is shown in the 2 diagrams below, which I grabbed from Scott Wilkinson's talk on the HDR EOTF: http://www.avsforum.com/forum/166-l...33447-smpte-webinar-dolby-vision-pq-eotf.html


These diagrams answer my question of where, among Highlights, Mid-Tones and Shadows, one would see improvements from HDR technology for commercial and home cinema. Instead of the 3 relative luminance ranges which are color graded today (Highlights, Mid-Tones and Shadows), Dolby redefines them as 6 absolute luminance ranges: Super Highlights, Highlights, White Range, "Most Typical Objects", Shadows and Blacks.

At 100 nits PQ for projectors, one could extrapolate that there are major improvements in the shadows and mid-tones at the expense of highlights. To use Dolby's vernacular, at 100 nits there are no Super Highlights, no Highlights, and the White Range content is sacrificed to ensure liberal distribution of the luminance range among the 3 lower Dolby luminance ranges: "Most Typical Objects", Shadows and Blacks.

So for Commercial and JVC (hehehe) HT, one will still see a great picture and it will play into the strengths of the projector ... in a light-controlled room. From what I've seen in Dolby's HDR demos, the shadows and mid-tones will take on slightly brighter colors than what we are used to ... this will make objects in the shadows and mid-tones easier for your eyes to see and more colorful ... a different kind of color pop ... but still based on luminance. JVC has 3 controls (Bright Level, Picture Tone, Dark Level) to fine-tune the effect of HDR.


JJ


****



The last part of the webinar illustrated how different ranges of luminance values are allocated in a gamma-based display and a PQ-based display. In the graphs below, the allocation of luminance values (code words) is shown for different peak light levels.


A gamma-based system allocates very few luminance values (code words) to shadow detail and progressively fewer values to the brightness of most typical objects and whites as the peak brightness increases. In this graph, each cylinder represents a different peak light level (500 nits, 1000 nits, etc.), increasing from right to left, with a gamma of 2.4. (Courtesy of Dolby Laboratories)


In a PQ-based display, more values are allocated to shadow detail at all peak light levels, and the range of values allocated to typical objects and whites remains more constant as the peak brightness increases. In this graph, each cylinder represents a different peak light level (500 nits, 1000 nits, etc.), increasing from right to left. (Courtesy of Dolby Laboratories)
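The allocation argument in those captions can be checked numerically. Here is a rough sketch (assuming full-range 10-bit code values and ignoring the narrow-range 64-940 footroom/headroom) that counts how many codes land below 1 nit under a 2.4 gamma at a 1,000-nit peak versus the ST 2084 PQ curve:

```python
# Compare 10-bit code allocation below a shadow threshold for gamma vs. PQ.
M1, M2 = 2610/16384, 2523/4096*128          # ST 2084 exponents
C1, C2, C3 = 3424/4096, 2413/4096*32, 2392/4096*32

def pq_eotf(v, peak=10_000):
    """ST 2084 EOTF: normalized signal v -> absolute luminance in nits."""
    p = max(v, 0.0) ** (1/M2)
    return peak * (max(p - C1, 0.0) / (C2 - C3*p)) ** (1/M1)

def gamma_eotf(v, peak=1_000, gamma=2.4):
    """Relative power-law EOTF scaled to a given peak luminance."""
    return peak * v ** gamma

def codes_below(eotf, threshold_nits, codes=1024):
    """Count how many of the 10-bit codes map below the luminance threshold."""
    return sum(1 for n in range(codes) if eotf(n / (codes - 1)) < threshold_nits)

print(codes_below(gamma_eotf, 1.0))  # gamma 2.4 at a 1,000-nit peak
print(codes_below(pq_eotf, 1.0))     # PQ (absolute, 10,000-nit reference)
```

Running this shows PQ assigning roughly three times as many codes to the sub-1-nit shadows as the gamma curve does, which is exactly the point the Dolby cylinders are making.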
 
#28 · (Edited)
What we consider to be highlights, midtones and shadows is scene dependent. You can't specify the luminance of white because, perceptually, white is judged relative to a light source. The higher the luminance of the light source in a scene, the higher the luminance of your white and corresponding midtones and shadows will be. If a colorist keeps their "whites" at the same luminance they are used to, like 100 nits, then it allows bright colors to hold saturation and be more colorful. It also allows the reproduction of luminous objects that are brighter than white. However, instead of keeping everything normalized around 100 nits they can also push the entire scene up to a higher luminance range, or a lower luminance range since the shadow detail is better now, and use that to build a very convincing effect like the transition from a dark interior into a bright exterior. Imagine walking from inside an unlit cabin out into a field at midday. This has implications for how display controls need to behave, because making adjustments with fixed pivot positions is not well suited to HDR. The controls should ideally scale with the scene luminance.

I think you are correct that with a 100 nits projector it is near impossible to produce the effect of luminous colors, aka super highlights, but you can produce pretty good highlights if the projection is done with whites held down to the 48 nits we are used to seeing in standard projection.
 
#30 · (Edited)
2084 doesn't define an ecosystem. I don't understand why everybody here keeps acting like it does. It's one component. You can't analyze it as being anything more than what it defines itself to be. A reference EOTF. It astounds me that absolutely nobody has talked, or even asked, about the implications of 2086 or dynamic metadata. I agree that 2084 completely by itself does not work for distribution, but that is not what is being rolled out despite what many here proclaim.

Graeme, what are the parameters that you think constitute a viewing environment? Luminance, black level? These are transferred via 2086. Diffuse white cannot be encoded statically because it depends on the scene content. It can be captured in either dynamic metadata (2094) for color appearance rendering, or be calculated on the fly by a display / color management system. 2084 encodes JNDs, not appearance.

http://www.screenplaysmag.com/2015/...id-growing-clarity-on-bitrate-quality-issues/

"That intent has been bolstered by the development of SMPTE 2094, a draft standard moving to adoption that introduces more dynamism into the color transformation process than provided by SMPTE 2086, otherwise known as the “Master Display Color Volume Metadata Supporting High Luminance and Wide Color Gamut.” SMPTE 2086 serves as the metadata component referenced by the SMPTE 2084 Electro-Optical Transfer Function (EOTF), an alternative to the traditional Gamma or Opto-Electric Transfer Function (OETF) that is now part of the BDA’s UHD standard and the HDR specifications recommended by the Consumer Electronics Association.

By providing a means of assigning a dynamic brightness dimension to the rendering of colors by TV displays, SMPTE 2094, an adaptation of the metadata instruction set used in Dolby Vision, brings the full HDR experience into play with 10-bit as well as 12-bit sampling systems. With SMPTE 2084 and ITU Rec-2020 color gamut now specified in both the BDA’s and the CEA’s HDR profiles, a fairly clear HDR roadmap is in place, leaving it to individual industry players to determine whether they want to utilize 12-bit formats like Dolby Vision and the one developed by Technicolor or the 10-bit formats offered by Samsung and others."
 
#100 · (Edited)
2084 doesn't define an ecosystem. I don't understand why everybody here keeps acting like it does.
Because it's been adopted as a consumer media encoding format, and because it seems incompatible with a functional HDR ecosystem.
It's one component. You can't analyze it as being anything more than what it defines itself to be.
I can't understand how you think that the encoding format for HDR distribution can somehow be isolated. It has broad implications.
It astounds me that absolutely nobody has talked, or even asked, about the implications of 2086 or dynamic metadata.
The dynamic meta-data is basically useless as far as I can ascertain.
I agree that 2084 completely by itself does not work for distribution, but that is not what is being rolled out despite what many here proclaim.
That's not what I'm seeing in specs like CEA-861.3, which is at the heart of HDMI 2.0a; and HDR Blu-ray seems to be heading right down the same path, not surprisingly, since the output of HDR Blu-ray players will be HDMI 2.0a.
Graeme, what are the parameters that you think constitute a viewing environment? Luminance, black level?
Amongst many others to be complete. See CIECAM02 if you want an idea.
These are transferred via 2086.
Peak brightness only - pretty much completely useless for figuring out what the critical reproduction luminance range is.
[ Correction - maybe MaxFALL is slightly more useful, but one is still blundering about in the dark when trying to figure out what part of the luminance range is important, and what part can be compressed or preserved in other ways. ]
Diffuse white can not be encoded statically because it depends on the scene content.
No it doesn't - it isn't in non-HDR encoded video. It would give the director more flexibility to be able to vary the diffuse white level, and the reproduction chain more scope for knowing what is important, but it isn't essential. Given a maximum 100% diffuse white encoding level, it's pretty reasonable to expect good reproduction at any brightness up to that level, just like current SDR systems. Over that level, and you are into the HDR special effects, where the reproduction can start to play fast and loose when it gets near the display limitations. Viewing environment and display brightness limitations will guide where the user sets their maximum diffuse white brightness, and everything else scales proportionally. There's then a firm foundation for adapting to other viewing environment conditions such as relative ambient level etc.

Of course this assumes that people want to be sensible with HDR. If past experience is any guide, though, many people can't help themselves when it comes to a new gimmick to sell more stuff. Stereo sound - let's have steam trains rushing from left to right! "3D" stereo vision - let's have arrows hitting people right between the eyes! Wide gamut - let's sear their retinas with garish colors! HDR - let's burn their eyes out with a dazzling floodlight pointed straight at them!
It can be captured in either dynamic metadata (2094) for color appearance rendering, or be calculated on the fly by a display / color management system. 2084 encodes JND's, not appearance.
I don't trust anything to do real-time, human-level accuracy recognition of moving images just to extract some basic information that should be part of any useful HDR standard, never mind the complete waste of compute resources/power needed to do so. And even if the 2094 standard, when it arrives, were to include some useful information (something that CEA-861.3 and ST 2086 give me no confidence to expect), I'm rather cynical about the possibility of manufacturers of consumer gear successfully implementing inter-operable processing to do anything with dynamic metadata that doesn't degrade the resulting image rather than improve its fidelity. Their record up to now is simply terrible - all sorts of gimmicks to make their displays eye-popping, all of which need to be turned off to make them accurately reproduce what the director intended.

Contrast 2084 with ARIB STD-B67, which has a Reference white value that demarcates normal image content from specular, and also BBC WHP 283 similarly makes allowance for simple transformation to a wide variety of display devices.

(The BBC WHP covers the problems with 2084 explicitly in the 2nd paragraph on page 6, and none of these issues appear to have been addressed in making use of it as a distribution format).

One could shoe-horn 2084 into being a distribution format by nominating an official reference white value (say 100 cd/m^2, corresponding to one of the two viewing conditions mentioned in the standard), but there is no hint of this being the official approach.
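For what it's worth, nominating 100 cd/m^2 as a reference white is easy to express in PQ terms. A small sketch (using the published ST 2084 constants; the 10-bit mapping assumes the usual 64-940 narrow-range convention) finds the signal level and code value that 100 nits would occupy:

```python
# Inverse ST 2084 EOTF: absolute luminance (nits) -> normalized signal.
M1, M2 = 2610/16384, 2523/4096*128
C1, C2, C3 = 3424/4096, 2413/4096*32, 2392/4096*32

def pq_inverse_eotf(nits, peak=10_000):
    """Map an absolute luminance onto the normalized PQ signal range [0, 1]."""
    y = (nits / peak) ** M1
    return ((C1 + C2*y) / (1 + C3*y)) ** M2

v = pq_inverse_eotf(100)            # 100 nits lands near mid-signal (~0.51)
code = round(64 + v * (940 - 64))   # corresponding 10-bit narrow-range code
print(v, code)
```

So a nominated 100-nit reference white would sit roughly in the middle of the PQ signal range, leaving the entire upper half of the code space for the "special effects" highlights described above.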

"By providing a means of assigning a dynamic brightness dimension to the rendering of colors by TV displays, SMPTE 2094, an adaptation of the metadata instruction set used in Dolby Vision, brings the full HDR experience into play with 10-bit as well as 12-bit sampling systems. With SMPTE 2084 and ITU Rec-2020 color gamut now specified in both the BDAs and the CEAs HDR profiles, a fairly clear HDR roadmap is in place, leaving it to individual industry players to determine whether they want to utilize 12-bit formats like Dolby Vision and the one developed by Technicolor or the 10-bit formats offered by Samsung and others."
It's not at all clear how to adapt 2084 absolute HDR encoding to real world variable brightness capability end user display systems, as they seem horribly mismatched, yet it seems that "HDR" systems are being delivered with even more haste than 3D. And that's without even touching on the practical issues of attempting to create displays with Rec2020 primaries for an audience with quite real variations from the standard color observer. A commercial train crash seems one likely outcome.
 
#31 · (Edited)
Hi there,

Interesting discussion. :)

I have two practical questions.

First, does anyone know - for sure - if a display with HDR10 support will be able to play back, in HDR, a Dolby Vision UHD Blu-ray title?
I hear conflicting information on this. Reading various white papers, my understanding was that HDR10 was a mandatory layer, so that every title would have to offer HDR10 as a base layer, and something like Dolby Vision would be another, optional layer adding specific improvements. But some say not at all: what's mandatory is only for every player to support HDR10; there is nothing forcing every title to provide an HDR10 layer, and in fact Dolby Vision UHD Blu-ray titles won't play on HDR10 displays (which often don't support Dolby Vision as well; it seems to be an either/or situation at the moment).

Another question: does anyone know - for sure again - what we'll lose with a UHD display which isn't HDR-compatible, or which doesn't support the right HDR format for the title, supposing everything else is compliant (UHD, HDCP 2.2, 18Gb/s HDMI 2.0a, so enough bandwidth for 10-bit 4:2:2 at any refresh rate)? Would we get the whole shebang (UHD, 10-bit, Rec. 2020 - I know, container, simplifying here - just SDR instead of HDR), or would there be a more severe fall-back? Without going to 1080p, would we get something like UHD Rec. 709 8-bit?

With a projector, I don't really care about HDR as such, as I don't think it can deliver significantly more contrast or highlights from 150nits, in fact we'll probably get less dynamic range and brightness with projectors in HDR than in SDR (like we get less dynamic range on a projector when we follow THX guidelines and resolve up to 255 instead of clipping at 235 or maybe 240 for a bit of headroom), especially if we can't specify our own values for reference white and peak white (if I could get reference white at 50nits and peak white at 150nits with a good roll-off, I'd be quite happy). I just see HDR as a safety line to be sure to get the other goodies. So I wouldn't really care about losing HDR - in fact, I'd rather get SDR 10bits rec2020 because at least we can calibrate to that accurately, unlike the HDR current me$$ - as long as I do get everything else from UHD Bluray.

I know it's the wild west out there, but if someone has a link or a source with an unambiguous answer to these two questions, it would be much appreciated.

My hopes for a final answer on these at this stage are not very high, especially before the UHD Alliance announcement about HDR at CES in a few weeks, but I thought it would be worth a try :)

Please feel free to PM me for a confidential answer if you have some info that cannot be posted publicly (yet).

Thanks.
 
#36 ·
Interesting discussion. :)

I have two practical questions.
There is a relentless lack of specifics in this thread, so let me provide some. Here's the problem. The HDR gamma curve requires 0.024% output @ 20% input, or 2.4 nits on the nominal 10,000-nit display.

However, we don't have 10,000 cd/m2 displays. What we have are, at best, 1,000-nit displays. How do you calibrate the 20% point on these displays? If you shoot for 2.4 nits you are way, way off from the prescribed gamma curve. The SMPTE 2084 curve, treated as relative, would have you shoot for 0.24 nits @ 20% on this display. But that's ridiculous. 0.24 nits is visible, but it is very, very dark. A film in which 20% video - very typical of film content - outputs 0.24 nits would be unwatchable.

If you stuck with the 2.4 nits target, then this would be fine, but now you have deviated by a large degree from the specified curve. So what, exactly, are you supposed to do in these cases? The only suggestion I have seen that makes sense is that you calibrate to the absolute luminance targets prescribed by SMPTE 2084 and stick to the curve AS IF your display output 10,000 nits, and then above about 75% you just clip.
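That last suggestion (track the absolute ST 2084 curve and clip at the display's peak) can be sketched directly. This is one interpretation of the approach described above, not a standardized workflow; the constants are the published ST 2084 values:

```python
# ST 2084 EOTF and a clipped calibration target for a luminance-limited display.
M1, M2 = 2610/16384, 2523/4096*128
C1, C2, C3 = 3424/4096, 2413/4096*32, 2392/4096*32

def pq_eotf(v):
    """Normalized signal -> absolute luminance in nits (10,000-nit reference)."""
    p = max(v, 0.0) ** (1/M2)
    return 10_000 * (max(p - C1, 0.0) / (C2 - C3*p)) ** (1/M1)

def clipped_target(v, display_peak=1_000):
    """Follow the absolute PQ curve, then clip at the display's peak luminance."""
    return min(pq_eotf(v), display_peak)

print(pq_eotf(0.2))         # the ~2.4-nit 20% stimulus discussed above
print(clipped_target(0.8))  # already clipped to 1,000 nits on this display
```

On a 1,000-nit display the clip point falls near 75% signal, which matches the "above about 75% you just clip" observation: below it you track the absolute curve exactly, above it every stimulus measures the display's peak.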
 
#32 ·
I have tried to read and understand this new standard and, frankly, I don't understand what to calibrate a set to. I know my software has the ability, but I would not know where to start. I am conditioned to think about 100% white and gamma curves to/from black.

I dread the day I might actually have to say no if asked about my ability to set HDR properly.

I'm still looking for which patterns to use, what resolution, color space, and the other standards needed to do a good job.

Looking forward to third-grade-level details that I can understand.
 
#33 ·
So how will HDR look today on a 4K HDR display? I still have no idea, but photographers have been doing it for a long time ... so what they are doing will give you some idea. See https://www.newtonplks.org/wp-content/uploads/2009/09/High-Dynamic-Range-HDR.pdf What photographers basically do is take 3 photographs to create an HDR picture. These 3 pictures of the same object are taken at high exposure (overly bright), normal exposure and low exposure (darker), and are then merged into one photo on a per-pixel basis to get the "best" pixels of all 3 ... per the "artistic intent" of the photographer. Still, photographers need a special display that can cover the dynamic range of all 3 exposure levels. Camera stops are either a doubling (x2) or a halving (1/2) of the exposure. The human eye covers a range of 24 stops. See: http://www.picturecorrect.com/tips/what-are-stops-in-digital-camera-settings/ .
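The merge step described above is usually done as an exposure-weighted average of the bracketed shots. Here is a toy sketch of the idea (scalar pixels, linear values, a simple "hat" weighting; real merging tools are far more involved):

```python
# Merge bracketed exposures into one relative-radiance value per pixel.
def hat_weight(p):
    """Trust mid-range pixel values; distrust near-black and near-clipped ones."""
    return max(1e-6, 1.0 - abs(2.0 * p - 1.0))

def merge_radiance(pixels_and_exposures):
    """pixels_and_exposures: [(linear pixel value 0..1, exposure time), ...]"""
    num = sum(hat_weight(p) * (p / t) for p, t in pixels_and_exposures)
    den = sum(hat_weight(p) for p, t in pixels_and_exposures)
    return num / den

# The same scene pixel captured at 1/4x, 1x, and 4x exposure times:
radiance = merge_radiance([(0.10, 0.25), (0.35, 1.0), (0.95, 4.0)])
print(radiance)
```

Dividing each pixel by its exposure time brings all three shots onto one radiance scale; the weighting then favors the shot in which that pixel was well exposed, which is what gives the merged photo detail in both the shadows and the highlights.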


Can anyone describe their reaction to an HDR film vs. its SDR version?


JJ



#34 ·
The result image that you showed is what you get if you take an HDR image and process it for a standard dynamic range reproduction with a certain type of tone mapping that boosts colorfulness and enhances image detail. It has pretty much nothing in common with what you get when showing HDR on an HDR capable display. You can also do HDR tone mapping to standard dynamic range without making the photo look like the garbage that goes up on Flickr as "HDR". I don't think of HDR as having to look one particular way. If you want to stylize it then of course you can do that, but you can also make it look very natural and lifelike, or make it look very filmic just with greater range. One of my biggest fears is that the industry will fall into an "HDR" look much like it has fallen into the orange/teal blockbuster look.
 
#35 · (Edited)
EOTF Centric Calibration

Calibrators are heavily EOTF/gamma focused (ST 2084) because it's the most powerful tool we have to "scientifically/smoothly beautify/adapt" source material: to counteract the ill effects of the ambient light conditions of the home cinema/theater, and to counteract poorly graded Blu-ray movies that are slightly over- or under-exposed.

For example, my JVC 6710 projector has a gamma range from 1.8 to 2.6. I've used 1.8 gamma with 3D Blu-ray movies that are under-exposed, like the Immortals. My JVC 6710 is not bright enough to compensate for the under-exposure of this film, so I use 1.8 gamma so I can at least see brighter mid-tones. On the other hand, I've used 2.6 gamma for the 3D Blu-ray movie "47 Ronin", which is way too over-exposed. The beauty of multiple gammas is that they allow us to adjust for poor Blu-ray exposure levels and ambient light conditions without changing the black levels.

So, how will calibrators scientifically/smoothly correct for under- or over-exposure of 4K HDR Blu-ray movies caused by poor post production? How do we adjust for the ill effects of ambient light conditions in home cinemas in terms of source content? Today we have a choice of gammas between 1.8 and 2.6. What EOTF "choices" will we have with HDR ... are there any? BT.1886 is not dynamic enough. Right now, JVC's HDR-capable RS500/600 projectors have "EOTF/gamma" pseudo controls for "Picture Tone", "Dark Level" and "Bright Level". I doubt other consumer HDR-enabled projectors have even these controls ... anyone? I'm sure this will be addressed over time as user HDR-related feedback starts pouring in. :D
 
#48 ·
Sony has it right. The display in a Rec 2020 mode expects to receive Rec 2020 encoded video and will display it as accurately as it can. If the gamut of the display ends near P3, then your calibration targets should be P3 colors encoded as 2020.

It sounds like MadVR is essentially placing itself between the display and the source and doing the color management itself by performing a conversion from 2020 to your display gamut. In a normal AV setup, the UHD Blu-ray player will send 2020 to the television and then the television handles the conversion from 2020 to its native gamut.
 
#50 ·
Sony has it right. The display in a Rec 2020 mode expects to receive Rec 2020 encoded video and will display it as accurately as it can. If the gamut of the display ends near P3, then your calibration targets should be P3 colors encoded as 2020.

It sounds like MadVR is essentially placing itself between the display and the source and doing the color management itself by performing a conversion from 2020 to your display gamut. In a normal AV setup, the UHD Blu-ray player will send 2020 to the television and then the television handles the conversion from 2020 to its native gamut.
The problem is that I can't have the same calibration for MadVR and another source like UHD Bluray, because with UHD Bluray I'll have to use a rec2020 calibration and with MadVR I have to use a P3 calibration simply because MadVR decides that it should convert the container to the actual gamut used during mastering.

This really confuses the issue, but until experts agree on the way to do this, we're only left with a best guess scenario.

There is no doubt that most displays used for grading content are right now closer to P3 capability than to rec2020 capability, and there is no doubt that most consumer displays are closer to P3 than rec2020 as well. That's not the question.

The UHD Standard doesn't even mention P3. Using rec2020 as a container, and playing back the content with a rec2020 calibration, makes it possible to not have to worry about the actual capability of the display, whether it was P3 or rec2020 (and yes, it's going to change over the next couple of years, by 2020 most likely everything will be mastered to rec2020).

But deciding to convert to P3 makes it 1) not right for all titles, especially going into the future and 2) not right for all displays. It just feels wrong, and it forces two calibrations when one (rec2020) would be enough.

I guess the only positive aspect of calibrating to P3 is to make it possible to show nice little graphs with nice little targets met right at the edge at 100% sat, but that doesn't make it easier to calibrate a set, especially when UHDTV will start to broadcast in rec2020 as well.

Hopefully this will get clearer and we'll stick to a single rec2020 calibration, even on sets limited to P3. Then the hardware and software can grow towards the end goal, which is rec2020, both for mastering and content reproduction.
 
#51 ·
Sounds like you've answered your own question then. MadVR isn't following a standard approach if it expects you to calibrate your display to P3. P3 isn't a standardized consumer delivery color space. Even in digital cinema, the encoding color space is XYZ, and P3 is just the range of colors the projector is required to be able to reproduce. With 2020 it is a similar thing: the encoding color space IS 2020, and then there is some range of colors (i.e. P3) that a calibrated display should be able to hit.

If MadVR is converting its output to the mastering gamut then it is doing it wrong. You would have to change your display configuration every time a different mastering gamut is defined in the metadata. Think of the mastering gamut as an intermediate step between the 2020 encoding gamut and your display's native gamut, sort of like a connection space in ICC profiles.
 
#52 ·
ATM none of the 3 major GPU manufacturers (AMD, NVidia, Intel) have an API available that allows me to signal SMPTE 2084/86 to the display. As a result madVR currently limits itself to convert HDR content to a format all non-HDR displays can understand. That means I'm (properly!) converting HDR content to a conventional gamma curve, using an intelligent (non-clipping) tone mapping algorithm which can be adjusted by the user to the peak white capability of his display.
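For reference, the ST 2084 (PQ) curve that this conversion starts from can be sketched in a few lines; the constants come straight from the standard (a minimal illustration, not madVR's actual code):

```python
# SMPTE ST 2084 (PQ) EOTF: PQ-encoded signal [0..1] -> absolute luminance (nits).
# Constants as defined in the ST 2084 standard.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Decode a PQ signal value (0.0-1.0) to luminance in cd/m2 (nits)."""
    e = signal ** (1 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return 10000.0 * (num / den) ** (1 / M1)

# A full-scale PQ signal decodes to 10,000 nits; a display-side tone mapper
# then has to squeeze that absolute range into the panel's real peak.
print(pq_eotf(1.0))  # -> 10000.0
```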

It's not true that madVR would always convert the gamut to P3. The user can tell madVR which gamut his display is calibrated to. 2020 is a valid choice. If the user tells madVR that his display is calibrated to 2020 then madVR will passthrough the full HDR color data to the display without compressing the gamut in any way. However, if the user tells madVR that his display is calibrated to e.g. P3 or BT.709 (or if the user doesn't provide madVR with any such information at all), madVR will then clip the gamut to P3 or BT.709, by using math that avoids hue or lightness changes. Which means madVR compresses the gamut in higher quality than what any display is likely to do.
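To illustrate what a gamut reduction step involves, here is a deliberately naive sketch in linear light using the commonly published Rec.2020-to-BT.709 matrix. Note that simple hard clipping like this does shift hue on out-of-gamut colors, which is exactly the artifact the smarter math described above avoids:

```python
# Naive Rec.2020 -> BT.709 gamut reduction in linear light.
# The 3x3 matrix is the commonly published conversion; each row sums to ~1,
# so neutral gray/white is preserved.
M_2020_TO_709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def rec2020_to_709(rgb):
    """Convert a linear-light Rec.2020 triplet to BT.709, hard-clipping to [0, 1]."""
    out = [sum(m * c for m, c in zip(row, rgb)) for row in M_2020_TO_709]
    # Colors outside the 709 volume produce components < 0 or > 1; clip them.
    return [min(max(c, 0.0), 1.0) for c in out]

# Pure Rec.2020 red is far outside BT.709, so it clips hard to 709 red:
print(rec2020_to_709([1.0, 0.0, 0.0]))
```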

All the GPU manufacturers are working on HDR support now. Once they add APIs for me to signal SMPTE 2084/86 to the display, madVR will support doing that, too, of course.

-------

One major issue with HDR that I can see is that "external" calibration is becoming very very difficult because the *display* is going to do a lot of both gamut and lightness processing with unknown algorithms. If the UHD Blu-Ray player were responsible for converting HDR content to something the display can handle, we could use an external video processor to do calibration in a good way. But since all the processing is going to be done by the display, calibration has to work on the original HDR data, without knowing what kind of tone and gamut mapping algorithms the display is going to apply afterwards. That makes things very very complicated for calibration companies, as far as I can see.

The benefits of letting madVR do the conversion from HDR to what the display can handle are:

1) madVR probably does tone + gamut mapping in much higher quality than what any of the first generation HDR compatible displays will do.
2) 3dlut calibration still works as usual. No change needed for the calibration software.

Unfortunately UHD Blu-Ray comes with a new copy protection, so using madVR for UHD Blu-Ray playback is probably going to be hard or even impossible, for a while at least.

-------

How to calibrate a display for HDR content? IMHO the best approach would be to use test patterns that are encoded using SMPTE 2084/86, with similar metadata to what actual UHD Blu-Rays are going to use. That appears to be P3 gamut in a 2020 container, using either 1200nits or 4000nits peak luminance. Only this way we are calibrating the display with test patterns that are running through the exact display processing algorithms that are later going to be used for actual UHD Blu-Rays.

Of course another option would be to try calibrating using 2020 in a 2020 container. But I haven't seen any HDR content with these properties yet, so I don't think it's worth the headaches at this point. Just my personal opinion, of course...
 
#53 ·
How to calibrate a display for HDR content? IMHO the best approach would be to use test patterns that are encoded using SMPTE 2084/86, with similar metadata to what actual UHD Blu-Rays are going to use. That appears to be P3 gamut in a 2020 container, using either 1200nits or 4000nits peak luminance.
Two questions: First, what's the difference between calibrating to a "P3 gamut in a 2020 container" and just calibrating to a P3 gamut? Second, how would one calibrate gamma on a 1200 or 4000-nit display when SMPTE 2084 requires 10,000 nits?
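On the second question, the PQ inverse EOTF makes the relationship concrete: PQ is an absolute curve topping out at 10,000 nits, and a display's peak simply determines how far up that fixed curve it can track before it must clip or roll off. A sketch, with constants from ST 2084:

```python
# SMPTE ST 2084 inverse EOTF: absolute luminance (nits) -> PQ signal [0..1].
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_inverse_eotf(nits: float) -> float:
    """Encode an absolute luminance in cd/m2 (nits) to a PQ signal value."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# Where various display peaks land on the 10,000-nit PQ curve:
for peak in (100, 1200, 4000, 10000):
    signal = pq_inverse_eotf(peak)
    print(f"{peak:5d} nits -> PQ signal {signal:.3f} "
          f"(10-bit full-range code ~{round(signal * 1023)})")
```

A 1200-nit display tracks the curve up to roughly 77% signal and a 4000-nit display up to roughly 90%; everything above that has to be clipped or rolled off, not reproduced.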
 
#57 ·
Hello Madshi,


Thanks for pitching in.


A few questions:


1) As far as I can see, MadVR does always convert to P3: even if I have a 3D LUT for rec2020 and no 3D LUT for P3, MadVR still converts HDR content carried in a rec2020 container to P3.
2) As far as I can see, MadVR doesn't let you specify what to do if there is no 3D LUT for a given calibration slot, which is an issue given that displays are going to have different calibrations for rec-709 and HDR/rec2020. For example, a projector like the JVCs will usually have a calibration for rec-709, NTSC and PAL from one baseline which doesn't use a P3 filter, so as not to lose brightness unnecessarily and to keep the best possible saturations from the native gamut. Then a separate calibration will be used for P3/rec2020, either using a P3 filter to extend the capabilities of the native gamut or using a different calibration with P3 or rec2020 saturations. How does MadVR deal with that? At the moment, I think MadVR is only able to deal with one baseline, and when there is no calibration in a slot, it converts to what it believes is the display's capability. So if there is a rec-709 calibration, it will convert P3 or rec2020 to rec-709. But what if we would like no conversion to happen for P3, because the display can reach 90% of P3 without a filter and 100% of P3 with a filter? We can have the display calibrated internally to P3, yet not want to use a 3D LUT because our internal calibration is good enough - for example, a calibration done with the JVC Autocal software using a custom P3 colour profile provides excellent results and can't be improved with a quick LUT. MadVR is unable to deal with such a situation, which is going to cause lots of issues down the line. MadVR needs to provide more flexibility regarding what is done when there is no calibration in a slot. Converting to rec-709 just because there is a 3D LUT for rec-709 isn't necessarily the best thing to do.
3) Regarding calibrating to rec2020, you are saying not to bother, but what is the difference between calibrating to rec2020 and calibrating to P3? One calibration (rec2020) allows us to deal with ALL UHD Bluray content, while another (P3) only allows us to deal with current content. There is already content being mastered to rec2020, and there is likely more content to come with this calibration. So why convert the container to a temporary gamut which isn't even part of the specification for UHD Bluray? Also why would we need to use specific patterns? If we are using madVR to convert PQ gamma to a standard gamma on a non HDR display, why couldn't we use standard patterns and calibrate to P3 or rec2020 without having to switch the display to HDR mode?
4) You say you are not an expert on calibration, yet you are contradicting what calibration experts say we should be doing for UHD Blu-ray calibration, which is to use rec2020 saturations. Why? If you have a different opinion, could you please post, as I have already requested, in the discussion where these experts have given their advice, so that they can weigh in while you explain why you disagree and why you believe a P3 calibration is more appropriate and relevant? I am not in a position to challenge you on this, yet your position is very confusing. I would really appreciate it if you could challenge the position of the experts who have taken the time to explain what, in their opinion, should be done, so that this can be clarified and we can reach a better understanding of the situation. Here is again the thread where all this has been discussed: http://www.spectracal.com/forum/viewtopic.php?f=92&t=5859. Towards the end of the thread, I wrote a recap to make sure that I wasn't misunderstanding anything - I'm not an expert either! - and there was no significant correction to this recap, in particular regarding the need to calibrate to rec2020 saturations, not to P3.
5) MadVR does a great job at converting and displaying HDR content on both HDR and non HDR compatible displays, so thanks for adding this feature. I just hope the calibration situation will be resolved so that we know what we should be doing on the display side to get consistent results and not two or three different calibrations for next gen content, without having to deal with more calibrations than necessary, especially as rec2020 mastered content becomes more frequent. There is content mastered using the Christie rec2020 projector, so it's unavoidable that true rec2020 content will arrive at some point over the next couple of years. We will have to deal with it properly, and converting it to P3 isn't necessarily the best way to deal with it. Displaying the content using rec2020 calibration, even on a P3 limited display, might allow us to get a better representation of the content as it will take a gamut conversion out of the process.
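On the P3-versus-rec2020 target question raised in point 3: the difference between "native P3" targets and "P3 encoded as 2020" targets is just a matrix, which can be derived from the published CIE xy chromaticities. The sketch below is an illustration in pure Python (no colour library assumed), not anyone's calibration workflow:

```python
# Express P3 (D65 white) colors as Rec.2020 RGB triplets, deriving the
# conversion from the published CIE xy chromaticities (linear light).
P3_PRIMARIES = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
BT2020_PRIMARIES = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
D65 = (0.3127, 0.3290)

def xy_to_XYZ(x, y):
    return [x / y, 1.0, (1 - x - y) / y]

def inv3(m):
    (a, b, c), (d, e, f), (g, h, i) = m
    det = a * (e*i - f*h) - b * (d*i - f*g) + c * (d*h - e*g)
    return [[(e*i - f*h) / det, (c*h - b*i) / det, (b*f - c*e) / det],
            [(f*g - d*i) / det, (a*i - c*g) / det, (c*d - a*f) / det],
            [(d*h - e*g) / det, (b*g - a*h) / det, (a*e - b*d) / det]]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def rgb_to_xyz_matrix(primaries, white):
    # Columns are the primaries' XYZ, scaled so RGB=(1,1,1) hits the white point.
    cols = [xy_to_XYZ(x, y) for x, y in primaries]
    m = [[cols[j][i] for j in range(3)] for i in range(3)]
    s = mat_vec(inv3(m), xy_to_XYZ(*white))
    return [[m[i][j] * s[j] for j in range(3)] for i in range(3)]

M_P3 = rgb_to_xyz_matrix(P3_PRIMARIES, D65)
M_2020 = rgb_to_xyz_matrix(BT2020_PRIMARIES, D65)

def p3_to_2020(rgb):
    """Map a linear-light P3 triplet into Rec.2020 coordinates via XYZ."""
    return mat_vec(inv3(M_2020), mat_vec(M_P3, rgb))

# P3 red as a 2020 triplet. Interestingly, the P3 red primary sits a hair
# outside Rec.2020, so its blue component comes out very slightly negative.
print(p3_to_2020([1.0, 0.0, 0.0]))
```

The key property is that white is preserved (P3 white maps to 2020 white exactly), while saturated P3 colors become less-than-full-scale 2020 triplets, which is what "P3 targets encoded as 2020" means in practice.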


Thanks!
 
#58 ·
#59 ·
Thanks, Ted. Those were good reads. I particularly agree with this statement (as it's been discussed on here):


HDR content is often created with some pixels containing increased content peak brightness levels that cannot be reproduced on all HDR displays. High-quality HDR displays should avoid hard clipping at their peak luminance limit and instead roll off bright highlights that exceed the display capabilities, mapping the values that cannot be shown to values that can be shown on the display. Of course, more advanced tone mapping algorithms are desired, potentially even required, especially in scenarios where the frame average luminance exceeds the display capabilities. In either case, consumer displays should be designed expecting to receive content with video parameters exceeding their own display capabilities, and the displays should be prepared to manage the signal without compromising the user experience.


This goes back to the discussion a few posts back (I think it was Tom's post) about hard clipping the source material's MaxCLL to the capability of the display in order to maintain a 25-100 cd/m2 MaxFALL. Hard clipping the material could compromise the user experience, as would bringing the white level control down to prevent hard clipping: the MaxFALL would be too low to be usable and enjoyable for consumers. Think about all of those users who have set up their current TVs to show levels up to 254 @ 120 cd/m2 - their average frame light level is too low, with 235 measuring at 70 cd/m2. For what reason? To preserve what? None of the post companies I calibrate for use levels above 235, and the same can be said for most BDs. But moving forward with HDR this above-reference material will be important, so hard clipping may not be the answer either. Whether consumer displays will be designed to intelligently roll off highlights is a different story...
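A highlight roll-off of the kind the quoted passage calls for might look like the toy knee curve below. This is a hypothetical sketch, not any shipping display's algorithm; `display_peak` and `knee` are made-up parameters for illustration:

```python
# Toy highlight roll-off: pass mid-tones through untouched, then compress
# everything above a knee point asymptotically toward the display's peak,
# instead of hard-clipping at the peak.
def rolloff(nits: float, display_peak: float = 1000.0, knee: float = 0.75) -> float:
    k = knee * display_peak      # start rolling off here (750 nits by default)
    if nits <= k:
        return nits              # tones below the knee are untouched
    span = display_peak - k      # headroom left on the display
    excess = nits - k
    # Rational soft-clip: approaches display_peak but never exceeds it.
    return k + span * excess / (excess + span)

# A 4000-nit highlight is compressed under the 1000-nit panel limit while
# still staying brighter than (and distinguishable from) a 1500-nit one.
print(rolloff(4000.0), rolloff(1500.0))
```

The point of the curve shape is that detail in highlights above the panel's peak is compressed rather than destroyed: two different source levels above the knee still map to two different output levels.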


 
#60 ·
This "HDfury Integral" helped me better understand. I understand the handshake now, and it struck me that pattern generators would need to be able to tell the display that the triplets are for a certain color space.

It sounds like the generator will also need the ability to send metadata as well? And handle three different data specs?

I think I understand the "75% and then let it clip" idea.

The HDfury Integral device helps clear up my pattern-source confusion.

Good discussion.
 
#61 ·