I have two practical questions.
First, does anyone know - for sure - whether a display with HDR10 support will be able to play back a Dolby Vision UHD Blu-ray title in HDR?
I hear conflicting information on this. Reading various white papers, my understanding was that HDR10 is a mandatory base layer, so every title would have to offer HDR10, with something like Dolby Vision as an optional layer adding specific improvements. But some say that's not the case at all: what's mandatory is only that every player support HDR10; nothing forces every title to provide an HDR10 layer, and in fact Dolby Vision UHD Blu-ray titles won't play on HDR10 displays (which often don't support Dolby Vision either; it seems to be an either/or situation at the moment).
Another question: does anyone know - for sure again - what we'd lose with a UHD display which isn't HDR compatible, or which doesn't support the right HDR format for the title, supposing everything else is compliant (UHD, HDCP 2.2, 18Gbps HDMI 2.0a, so enough bandwidth for 10-bit 4:2:2 at any refresh rate)? Would we get the whole shebang (UHD, 10-bit, rec2020 - I know, it's a container, I'm simplifying here - just SDR instead of HDR), or would there be a more severe fallback? Without going to 1080p, would we get something like UHD rec-709 8-bit?
With a projector, I don't really care about HDR as such, as I don't think it can deliver significantly more contrast or highlights from 150 nits. In fact, we'll probably get less dynamic range and brightness with projectors in HDR than in SDR (just as we get less dynamic range on a projector when we follow THX guidelines and resolve up to 255 instead of clipping at 235, or maybe 240 for a bit of headroom), especially if we can't specify our own values for reference white and peak white (if I could get reference white at 50 nits and peak white at 150 nits with a good roll-off, I'd be quite happy). I just see HDR as a safety line to be sure of getting the other goodies. So I wouldn't really care about losing HDR - in fact, I'd rather get SDR 10-bit rec2020, because at least we can calibrate to that accurately, unlike the current HDR me$$ - as long as I do get everything else from UHD Blu-ray.
I know it's the wild west out there, but if someone has a link or a source with an unambiguous answer to these two questions, it would be much appreciated.
My hopes for a final answer at this stage are not very high, especially before the UHD Alliance announcement about HDR at CES in a few weeks, but I thought it would be worth a try.
Please feel free to PM me for a confidential answer if you have some info that cannot be posted publicly (yet).
Last edited by Manni01; 12-18-2015 at 03:29 AM.
Frankly, I don't understand what to calibrate a set to.
I know my software has the ability, but I would not know where to start.
I am conditioned to think about 100% white and gamma curves to/from black.
I dread the day I might actually have to say no if asked about my ability to set HDR properly.
I'm still looking for the patterns to use, the resolution, the color space, and the other standards needed to do a good job.
Looking forward to third-grade-level details that I can understand.
Can anyone describe their reaction to an HDR film vs. its SDR version?
For example, my JVC 6710 projector has a gamma range from 1.8 to 2.6. I've used 1.8 gamma with 3D Blu-ray movies that are underexposed, like Immortals. My JVC 6710 is not bright enough to compensate for the underexposure of this film, so I use 1.8 gamma so I can at least see brighter mid-tones. On the other hand, I've used 2.6 gamma for the 3D Blu-ray movie "47 Ronin", which is way overexposed. The beauty of multiple gammas is that they allow us to adjust for poor Blu-ray exposure levels and ambient light conditions without changing the black levels.
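The effect of that gamma choice on mid-tones, while leaving black and 100% white untouched, is easy to sketch. This is a generic power-law illustration, not the JVC's actual internal processing:

```python
# Relative output of a pure power-law gamma: output = stimulus ** gamma.
# 0.0 (black) and 1.0 (100% white) are unchanged by any exponent; only
# the mid-tones move, which is why a gamma change can rescue an under-
# or overexposed disc without touching black level or peak white.
def power_gamma(stimulus: float, gamma: float) -> float:
    return stimulus ** gamma

for g in (1.8, 2.2, 2.6):
    out = power_gamma(0.5, g)  # 50% stimulus
    print(f"gamma {g}: 50% stimulus -> {out:.3f} relative light output")
```

Lower exponents lift the mid-tones (roughly 0.29 relative output at gamma 1.8 vs. 0.16 at gamma 2.6 for a 50% stimulus), which is exactly the underexposure compensation described above.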
So, how will calibrators scientifically/smoothly correct for under- or overexposure of 4K HDR Blu-ray movies caused by poor post-production? How do we adjust for the ill effects of ambient light conditions in home cinemas in terms of source content? Today we have a choice of gammas between 1.8 and 2.6. What EOTF "choices" will we have with HDR - are there any? BT1886 is not dynamic enough. Right now, JVC's HDR-capable RS500/600 projectors have "EOTF/gamma" pseudo-controls for "Picture Tone", "Dark Level" and "Bright Level". I doubt other consumer HDR-enabled projectors have even these controls - anyone? I'm sure this will be addressed over time as HDR-related user feedback starts pouring in.
Last edited by Carbon Ft Print; 12-23-2015 at 03:42 AM. Reason: typo
However, we don't have 10,000 cd/m2 displays. What we have are, at best, 1,000-nit displays. How do you calibrate the 20% point on these displays? The absolute SMPTE 2084 curve puts 20% at about 2.4 nits. But if you instead scale the curve to the display's 1,000-nit peak, it would have you shoot for 0.24 nits @ 20%. That's ridiculous. 0.24 nits is visible, but it is very, very dark. A film in which 20% video - very typical of film content - output 0.24 nits would be unwatchable.
If you stuck with the 2.4 nits target, this would look fine, but now you have deviated by a large degree from the specified curve. So what, exactly, are you supposed to do in these cases? The only suggestion I have seen that makes sense is to calibrate to the absolute luminance targets prescribed by SMPTE 2084 and stick to the curve AS IF your display output 10,000 nits, and then above about 75% you just clip.
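For concreteness, the ST 2084 (PQ) EOTF can be sketched in a few lines; the constants come from the published spec, and the 2.4-nit figure at 20% falls straight out of it:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized signal value [0..1]
# to absolute luminance in cd/m2 (nits), referenced to 10,000 nits.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Absolute luminance (nits) for a PQ-encoded signal in [0, 1]."""
    p = signal ** (1 / m2)
    num = max(p - c1, 0.0)
    den = c2 - c3 * p
    return 10000.0 * (num / den) ** (1 / m1)

print(f"{pq_eotf(0.20):.2f} nits")  # 20% signal -> about 2.4 nits
print(f"{pq_eotf(0.50):.1f} nits")  # 50% signal -> roughly 92 nits
```

Note that the output is absolute: a 20% signal means ~2.4 nits no matter what the display's peak is, which is exactly the "calibrate as if 10,000 nits and clip" approach described above.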
ChromaPure Software/AccuPel Video Signal Generators
Although these are not answers to the questions I asked, there are already ways to address this if the display handles it properly.
First of all, along with the characteristics of the mastering display's primaries (apart from the Christie projectors, which are able to display rec2020, most grading monitors are closer to P3 than to rec-2020), the actual peak white used for mastering is usually encoded in the content as metadata. 10,000 nits is a theoretical max. Even the Dolby Vision HDR mastering monitor peaks at 4,000 nits. Most consumer content is mastered to something like 700-1200 nits, which is the maximum most consumer displays are capable of. For example, the Exodus and Life of Pi HDR trailers which circulate were encoded at 1200 nits, not at 4000 or 10,000 nits.
Then, for displays such as projectors, which are not capable of 1000 nits but closer to 100-200 nits, you need to convert this mastering curve to the actual capability of the display.
I believe the new JVCs had a setting in pre-production units which allowed you to adjust the value of peak white depending on the mode used (and the model). Sadly, this adjustment seems to be gone in production units. They still offer three gamma adjustments (for dark, mid and high tones). It's a crude control, but it offers a way to get a better result.
But if you want to see how well this can work, try the latest version of MadVR on an HTPC (you will need to enable Full Screen Exclusive mode if you want to keep the 10 bits). It plays HDR content on any display (HDR compatible or not) by reading the metadata and converting the PQ gamma curve to power gamma or BT1886, and this works very, very well. You only have to specify the peak brightness your display is capable of (as low as 120 nits for projectors) and the roll-off of the curve is adjusted. The result looks great. Its OSD also displays some of the HDR metadata of the content being played (such as the value used for peak white during mastering), which is very handy.
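The idea can be sketched in a few lines. To be clear, this is NOT madVR's actual algorithm, just a minimal illustration of the same pipeline: decode PQ to absolute nits, roll off highlights above the display's measured peak, then re-encode with a power gamma:

```python
# Minimal sketch (not madVR's actual algorithm): decode the PQ signal to
# absolute nits, roll off highlights that exceed the display's measured
# peak, then re-encode for an SDR display with a power gamma.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(signal):
    p = signal ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def tone_map(nits, display_peak, knee=0.75):
    """Pass mid-tones through unchanged; roll off highlights above the knee."""
    kn = knee * display_peak
    if nits <= kn:
        return nits
    # Compress everything above the knee into the remaining headroom.
    excess = nits - kn
    span = 10000.0 - kn
    return kn + (display_peak - kn) * (excess / span) ** 0.5

def to_sdr_signal(pq_signal, display_peak=120.0, gamma=2.4):
    nits = tone_map(pq_to_nits(pq_signal), display_peak)
    return (nits / display_peak) ** (1 / gamma)  # power-gamma encode

print(f"{to_sdr_signal(0.5):.3f}")  # a ~92-nit source level on a 120-nit projector
```

The single user-supplied number (display_peak) is what sets where the roll-off starts, which matches the "specify the peak brightness your display is capable of" setting described above.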
Now, it's not what I'd call proper calibration, and Dolby Vision in this respect is much better defined than HDR10, but still, the results are very decent.
Of course, you don't get more dynamic range with HDR on a projector: you get less dynamic range and contrast for reference white and more headroom for highlights, just like when you calibrate SDR following THX recommendations and resolve up to 255 instead of clipping at 235 (or maybe 240 for a bit of headroom) for the color channels.
This is why my questions are not really about calibrating HDR itself, which isn't properly defined yet (we might get more info at CES when the UHD Alliance announces what the HDR standard is going to be for consumer content), but about what happens if a display doesn't support the type of HDR used by a given title (for example, how a Dolby Vision title plays on an HDR10 display), and what happens when a display doesn't support HDR at all (or the right kind of HDR).
This, unlike HDR itself, will impact projector owners, depending on whether a non-HDR UHD display still gets the other goodies (10-bit, wider gamut) or not.
As a projector user in a dedicated room, I'd much rather have had the same HDR as in cinemas, which is mastered to 50cd/m2 for reference white and 100cd/m2 for peak white, but that wouldn't make any difference for flat panels in a living room, which are already calibrated to 100cd/m2 for reference white in SDR.
In fact, HDR is such a mess, and of so little value for projectors in a dedicated room, that I would much rather get SDR 10-bit rec2020/P3, to which I can calibrate properly, than an HDR mode designed for flat panels in a room with ambient light, which is impossible to calibrate accurately at the moment (except with Dolby Vision, which seems better defined: it not only defines a standard for consumer content delivery, which HDR10 hasn't done yet, but it also has a database of the characteristics of all its supported displays, which it communicates to calibration software so that it knows which targets to use for an effective calibration).
Anyway, all this is a work in progress, hence my two practical questions.
Last edited by Manni01; 12-23-2015 at 01:34 AM.
It's not ignoring SMPTE 2084, it's reading it: using the metadata to know which peak white value was used for mastering the specific content being played - as well as which gamut coordinates were used during mastering, as rec2020 is only used as a container - and then converting it to a standard power or BT1886 curve. The metadata is essential for this, as no mastering display is able to support 10,000 nits and full rec-2020. So the metadata tells you what you need to know to display the content properly, either using an ST2084 PQ gamma curve (on an HDR-compatible display), or converting this PQ gamma to a power gamma or BT1886 once you know the capability of the consumer display (as MadVR does on any display).
This is where Dolby Vision is better designed than HDR10 (at the moment): it stores and provides the capability of each display that supports it, which makes consumer content reproduction possible and accurate. At the moment, my understanding is that HDR10 doesn't define any standard for consumer content reproduction, hence the impossibility of calibrating to any standard (yet). This is why MadVR's ability not only to read the metadata (to know which peak white value the content was mastered at, as well as the gamut coordinates used for mastering) but also to let you specify the peak white ability of the display helps to achieve good results.
The biggest difference with what we were doing is two-fold.
First, with SDR/HD/Bluray, we could use a fixed gamut (rec-709) and a fixed gamma (BT1886 has become the standard more recently) to calibrate a display and see exactly what was intended.
Now, because rec2020 is only used as a container most of the time, and because the gamma targets are no longer defined by reading 100% white and black - peak white is fixed for the content, but can vary depending on the choice made during mastering - we need metadata to know: 1) the gamut capability of the mastering display, which is usually around P3, as 100% of the capability of the display will not be encoded as 100% of the gamut in the container (similar to calibrating for rec-709 on SMPTE-C displays); and 2) the value for peak white used for the PQ gamma curve applied during mastering, as this isn't fixed either. But as I said, it's usually around 1000 nits at the moment for consumer content, because that's the capability of most consumer displays (flat panels), not 4000 or 10,000 nits.
So yes, you do need the metadata to be able to display HDR content properly. Without it, you can only guess what's in the content.
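For illustration, the static metadata HDR10 carries can be sketched as a simple structure. The field names here are my own invention, but the fields themselves correspond to the SMPTE ST 2086 mastering display colour volume plus the CTA-861.3 content light levels; this is not a real file-format parser:

```python
# Illustrative sketch of the static metadata HDR10 carries alongside the
# video (SMPTE ST 2086 mastering display colour volume + CTA-861.3
# content light levels). Field names are hypothetical, not a real parser.
from dataclasses import dataclass

@dataclass
class MasteringDisplayMetadata:
    # CIE 1931 (x, y) chromaticities of the mastering display's primaries
    # and white point - this is how a player learns the content was graded
    # on a ~P3 monitor even though the encode uses a 2020 container.
    red: tuple
    green: tuple
    blue: tuple
    white_point: tuple
    max_luminance: float   # nits, e.g. 1200 for much current content
    min_luminance: float   # nits, e.g. 0.005

@dataclass
class ContentLightLevel:
    max_cll: int   # brightest single pixel in the programme (nits)
    max_fall: int  # highest frame-average light level (nits)

# Example: content graded on a P3-D65 monitor peaking at 1200 nits.
meta = MasteringDisplayMetadata(
    red=(0.680, 0.320), green=(0.265, 0.690), blue=(0.150, 0.060),
    white_point=(0.3127, 0.3290),
    max_luminance=1200.0, min_luminance=0.005)
print(meta.max_luminance)
```

Between them, these fields carry exactly the two things argued for above: the mastering gamut and the mastering peak white.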
Second, we used to be able to get precise targets by reading black and white on the user's display, and the gamut was fixed too, but that's gone as well. Until consumer content reproduction is defined for HDR10, it's the wild west.
So both the mastering targets and the consumer display targets are moving at the moment, making accurate HDR10 reproduction close to impossible. Hopefully this will be defined soon...
There is still a debate over whether we should calibrate to rec2020 for HDR content, to get the correct saturations, as this is the container being used, or to P3, as most content today is mastered on monitors with close-to-P3 capabilities.
My understanding was that we were supposed to calibrate to rec2020, so that the saturations are correct, since the content is not encoded in P3, just graded on monitors with close-to-P3 abilities; but some are challenging this, so I guess the experts still need to agree on it.
Last edited by Manni01; 12-23-2015 at 03:45 AM.
First, I should probably be asking Madshi this, but how in the world can madVR "specify the peak white ability of the display" beyond just measuring it as we do now? Dolby Vision displays have firmware that specifies peak output capability, but a Samsung "HDR" display does not.
Second, how can metadata--which is encoded into the source material--tell us anything at all about "the gamut capability of the display"? How can it do anything beyond providing information about the mastering display? I guess I am not up to speed on the whole metadata thingy yet.
Finally, if there is a mismatch between the mastering display and the viewing display, what role does the metadata play? The display can only render the gamut it is physically capable of. Metadata cannot change that.
You cannot use the full SMPTE 2084 gamma curve (shown below) on current HDR displays. They would be MUCH too dim. You can either:
1) Use the SMPTE 2084 luminance targets (assuming falsely that you have a 10000 nit display) and then clip above 75%.
2) Simply use another gamma curve (power law or BT.1886) with your display's actual peak output.
In either case, I don't understand what value there is in knowing the peak luminance of the mastering display. If I have a commercial HDR display with peak luminance of 800 nits and the metadata reports that the content was mastered on a display with peak luminance of 1200 nits, what is the display or the individual calibrating that display supposed to do with that information? I can only work within the capabilities of my display.
SMPTE 2084 gamma curve
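To make that question concrete, here is one generic illustration of what the mastering peak could be used for. This is a hypothetical tone-mapping sketch, not any vendor's documented behaviour: if you know the content never exceeds a 1200-nit mastering peak, you only need to compress 600-1200 nits into your headroom, rather than reserving headroom for a theoretical 10,000-nit source:

```python
# Generic illustration (not any vendor's documented algorithm) of why the
# mastering peak matters: compressing highlights into an 800-nit display
# hurts far less if you know the content tops out at 1200 nits instead of
# having to reserve headroom for a theoretical 10,000-nit source.
def tone_map(nits, display_peak, source_peak, knee=0.75):
    kn = knee * display_peak
    if nits <= kn:
        return nits                      # mid-tones pass through untouched
    excess = (nits - kn) / (source_peak - kn)
    return kn + (display_peak - kn) * min(excess, 1.0) ** 0.5

# A 700-nit highlight on an 800-nit display:
with_meta = tone_map(700, 800, source_peak=1200)    # metadata says 1200-nit master
no_meta = tone_map(700, 800, source_peak=10000)     # worst-case assumption
print(f"{with_meta:.0f} vs {no_meta:.0f} nits")
```

With the metadata, the highlight stays noticeably brighter (less compressed), because the roll-off only has to cover the range the content can actually contain.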
Regarding gamut, I guess that I am still not entirely sure what role the metadata plays beyond perhaps automating what we have always done. If the content is mastered at DCI-P3, then it can only be reproduced properly on a display capable of this gamut. This has always been the case. For example, there was a period in the early days of HD when there was some question about the gamut used to master content: some used SMPTE-C and others used Rec. 709 (Blu-ray content is now all mastered to Rec. 709). It would have been useful during this period to know which gamut was used, because it would have provided targets to aim for that were routinely achievable (most displays have been capable of Rec. 709 for many years). With the transition to UHD, aren't we in pretty much the same situation? It would be useful to know the mastering gamut so we can use that as a calibration target. The limiting factor is just the displays themselves. Are they capable of rendering a full DCI-P3 gamut? Does the metadata simply force the UHD display into the Picture mode that best matches the mastering display's gamut, automating what we would have done manually in the past?
It's not MadVR that specifies the peak white capability of the user's display, it's the user.
You measure your display in the mode you plan to use for HDR, and you enter that peak white (or the closest value) in the MadVR settings for the display, where you also specify the max bit depth and other display-specific things.
The metadata doesn't tell you anything about the gamut or peak white capability of the user's display, it tells you about the gamut capability of the MASTERING display.
Say the mastering display used to grade the content can only reach just above P3 and around 1200 nits. The content will be graded using this, but with rec2020 as a container. This means that even if the grading display is in rec2020 mode, a fully saturated primary on the mastering display will be undersaturated compared to the container's primary.
So a properly calibrated display in rec2020 mode, but only able to display up to just above P3, will display rec2020 (with the correct saturations) up to its P3 limits, and will clip after that. The mastering display's actual capabilities are reported in the metadata, along with the peak white used FOR MASTERING.
Then, at the other end of the chain, an HDR-compatible display will receive the content encoded in rec2020 and display it according to the capabilities of the user's display, performing both the gamut conversion (for example, if the display can only handle rec709) and the PQ gamma conversion (for example, if it can only handle 150 nits peak brightness instead of the 1200 nits the content was graded to).
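The "undersaturated relative to the container" point can be made concrete with textbook colorimetry. Nothing here is vendor-specific; the primaries and white point come from the published specs, and the matrix derivation is the standard chromaticity-to-matrix method:

```python
# Standard colorimetry sketch: where does a fully saturated P3 red land
# inside the BT.2020 container? (Primaries/white from the published specs;
# the math is the textbook chromaticity -> matrix derivation.)

def det(m):
    return (m[0][0]*(m[1][1]*m[2][2]-m[1][2]*m[2][1])
          - m[0][1]*(m[1][0]*m[2][2]-m[1][2]*m[2][0])
          + m[0][2]*(m[1][0]*m[2][1]-m[1][1]*m[2][0]))

def solve3(a, b):
    """Solve a 3x3 linear system by Cramer's rule."""
    d = det(a)
    return [det([[a[i][j] if j != k else b[i] for j in range(3)]
                 for i in range(3)]) / d for k in range(3)]

def rgb_to_xyz(prim, white):
    """RGB->XYZ matrix from (x, y) primaries and white point."""
    a = [[p[0]/p[1], 1.0, (1-p[0]-p[1])/p[1]] for p in prim]  # rows = R,G,B
    a = [[a[j][i] for j in range(3)] for i in range(3)]        # transpose
    wx, wy = white
    s = solve3(a, [wx/wy, 1.0, (1-wx-wy)/wy])                  # channel gains
    return [[a[i][j]*s[j] for j in range(3)] for i in range(3)]

def mul(m, v):
    return [sum(m[i][j]*v[j] for j in range(3)) for i in range(3)]

D65 = (0.3127, 0.3290)
P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

xyz = mul(rgb_to_xyz(P3, D65), [1.0, 0.0, 0.0])       # full P3 red
r, g, b = solve3(rgb_to_xyz(BT2020, D65), xyz)         # in the 2020 container
print(f"P3 red in BT.2020: R={r:.3f} G={g:.3f} B={b:.3f}")
# A container-saturated red would be (1, 0, 0); the mastered red lands
# around R=0.75 - well inside the container's code range, exactly the
# undersaturation described above.
```

So a display fed "100% red" from such a title should light up well short of the 2020 primary, which is why the calibration target (container vs. mastering gamut) matters so much.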
MadVR reads both the metadata relevant to the grading display (peak white value, gamut primaries) and the user specified values (bit depth, peak white for the user's display) to convert the content properly. It's doing what a manufacturer should be doing for a specific display, but as it doesn't know the capability of each display, it allows the user to specify the peak white value to get the best results.
As I said, the big question still up in the air is whether we need to calibrate to rec2020 (the container's primaries) or to P3 (the capabilities reported by most of the displays used to grade HDR content). Most users' displays won't reach above P3, but to get the correct saturations we need to know whether to calibrate to P3 or to rec2020. At the moment, MadVR seems to convert rec2020 content to P3 when the mastering display reports close-to-P3 capability, which would suggest that, for MadVR at least, we should use a P3-calibrated gamut, but that's not what others are recommending. That's what I'm still waiting to get a firm answer about, because the experts seem to disagree on this.
You do need to get up to speed with all this; it took me a while to get my head around it, as a lot of it is still up in the air, at least as far as HDR10 consumer content playback is concerned. My understanding can only go as far as what the experts agree on, and there is no agreement at the moment, as the standard is unfinished as far as consumer playback is concerned.
Last edited by Manni01; 12-23-2015 at 06:34 AM.
My understanding was that content was mastered either to rec-709 or to rec-2020, and that P3 was never used. In that case, the metadata would only be relevant to tell you what the actual gamut was (rec-709 or rec-2020, as both are part of the UHD Blu-ray specs). I was initially told that P3 wasn't even mentioned in the UHD Blu-ray specs, and that we only had to calibrate to rec-709 or rec2020 and choose the calibrated mode according to the content (and the metadata). But then another source said not at all, that I had it all wrong: because most mastering monitors used for grading have close-to-P3 capability, and most content reports a P3 calibration, we shouldn't calibrate to rec2020 but to P3, even though P3 is not part of the UHD Blu-ray specs. Talk about confusion!
Honestly, don't look at me as if I should be clearer. I'm not an expert. I've asked experts, and even they cannot agree on how we should calibrate for UHD Blu-ray. Apparently they are still discussing this amongst themselves.
This is why I was hoping that if you did some research and got up to speed, you could maybe come up with more definitive answers.
You're asking me as if I either had it wrong, or knew the answer.
At first, I knew I didn't know, so I did some research and asked some experts; then I thought I knew, but another expert told me I was wrong, so I'm waiting until the experts manage to agree.
The only thing I know for sure is that, given the various ways UHD Blu-ray content can be mastered (rec-709, rec2020 and, depending on which expert is talking, apparently possibly P3 too), you need the metadata to tell you what was used to master the content.
With Blu-ray, you didn't have that issue, because even when the mastering monitors could only reach SMPTE-C, the standard has officially always been rec-709. So on early Blu-rays you had to try both calibrations and decide - without any way of knowing for sure, except what your eyes told you - which gamut had been used. Same with gamma: there was no standard, and no way to know for sure which power gamma - and later whether BT1886 - had been used. While BT1886 has become more of a standard over the last couple of years, it's still not used all the time, and I frequently have to switch to a 2.4 or 2.35 power gamma to avoid raised black levels.
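The black-level behaviour behind that BT1886 vs. power-gamma choice follows directly from the BT.1886 formula itself, computed here for a typical display:

```python
# ITU-R BT.1886 EOTF vs. a pure 2.4 power gamma, for a display with
# 100-nit white and 0.05-nit black. BT.1886 folds the display's real
# black level into the curve, which lifts the near-black response
# compared with a pure power law - the "raised blacks" effect.
LW, LB, G = 100.0, 0.05, 2.4

a = (LW ** (1 / G) - LB ** (1 / G)) ** G
b = LB ** (1 / G) / (LW ** (1 / G) - LB ** (1 / G))

def bt1886(v):
    return a * max(v + b, 0.0) ** G

def power(v):
    return LW * v ** G

for v in (0.0, 0.1, 0.5):
    print(f"{v:.1f}: BT.1886 {bt1886(v):7.3f} nits, power 2.4 {power(v):7.3f} nits")
```

At 10% stimulus the BT.1886 curve outputs roughly twice the light of the pure 2.4 power law on this display, which is why switching to a power gamma can tame blacks that look raised.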
The metadata in HDR/UHD Blu-ray content is there to prevent this: to tell you which peak white was used for the PQ gamma curve during mastering, and which gamut the mastering display was calibrated to.
If this doesn't make sense, please use your contacts, experience and clout to get some solid answers. I'm still waiting for them
Last edited by Manni01; 12-23-2015 at 02:09 PM.
I guess this means software publishers would have to get a license from Dolby and implement access to their database in some way, so you might want to explore this, as it is proprietary information. Once the user's display is calibrated, the metadata in the content is only interpreted by the display itself, to calculate the proper roll-off for the PQ gamma curve and the gamut conversion; it's not needed during calibration.
As I said earlier, apart from the fact that HDR10 is not completed yet for consumer playback, and that many experts don't agree on which gamut we are supposed to calibrate to depending on how the content was mastered, the UHD Alliance is expected to make an announcement in a few weeks at CES to clarify the consumer HDR standard. Let's hope that they are going to fill all the gaps, and either announce a completed HDR10 standard or select a properly defined standard such as Dolby Vision.
Let me try to be as clear as I can, because I sometimes have a problem with being wordy. :) Unless there are other factors I haven't considered, I've concluded that it would be better to calibrate to Rec.2020 as best the display can, rather than calibrating to P3 targets. Today's displays can't reach those 2020 targets, but the display's measurements will land somewhere between P3 and 2020. There are a handful of displays that are beginning to measure colour outside of some P3 targets, and that will continue as displays improve.
If 2020 is the container and P3 is within the container, we wouldn't want to desaturate P3 content by bringing our 2020 targets inwards to P3 during calibration. My hypothesis is that, when using 2020 signals to calibrate, reducing the wider gamut of the display to match P3 will desaturate both P3 and Rec.709 signals within the 2020 container.
If the calibration software and video generator are sending out red, green, and blue signals at their specified 2020 values, it would be erroneous to reduce the display's saturation control so that these signals measure at P3 (on the assumption that the content will be around P3). By sending 2020 colours to the display, we'll measure to the limitations of the display, despite the large dEs at this time. Again, 2020 will be the container, and the actual colours of the content could be anything inside of that.
If the above is true, then how do we verify that the display is accurately displaying what's inside the container? Would a 2020 saturation sweep be sufficient to see how far out in the gamut the display hits targets and where it clips? But these sweeps wouldn't identify the targets for P3 or Rec.709 gamuts. What would be more appropriate is to send out P3 and 709 colour saturation points within a 2020 signal, then measure and verify. That would be a surefire way to ensure that the display is being faithful to those targets within the 2020 container, using the same white point and luminance value. I say all of this without factoring in colours at HDR levels, measuring instead at what we consider 100% white today.
What's also in question is how consumer displays will handle HD and UHD signals. When feeding the TV an HD/709 signal, an HD picture memory should be used. When feeding it a UHD/2020 signal, the TV should automatically kick into a UHD memory. Two calibrations will be needed, since we can't calibrate a single memory for HDR & WCG and expect those settings to be OK for HD/SD content.
This is a good discussion and I'm glad I stumbled on this thread. I've been discussing this topic for almost a year now with my industry friends, and I'm hoping in the new year there is greater definition around this. UHD Blu-ray is around the corner, and one of our largest cable providers, Rogers, will be delivering 4K and HDR with live sporting events in 2016. I would like to know what their goals and targets are in the studio. My clients are beginning to ask questions and I don't like not having clear answers. I don't roll that way.
THX/ISF Professional Video Calibrator (AVS Listing)
Secrets of Home Theater and High Fidelity
Last edited by Michael Osadciw; 12-26-2015 at 09:49 AM.
For example, although the calibration experts I discussed this with initially seemed 100% sure that we had to calibrate to rec2020 for UHD Blu-ray, Madshi believes that's not the case, so MadVR converts rec2020-encoded content to P3, because he believes the content should be converted to the capability of the display used for mastering (he will clarify or correct me if that's not the case, but that's what I see when I play HDR content encoded with a rec2020 container: MadVR converts it to P3, it doesn't keep the rec2020 coordinates).
So while the recommendation I was given was to calibrate to rec2020 for content using rec2020 as a container, I will have to have a separate P3 calibration for MadVR, simply because Madshi disagrees and has decided this is not the correct way to do it. I have no idea who is right, and won't until a white paper somewhere defines how to calibrate for UHD Blu-ray playback, and everyone reads it and agrees with what's in it.
It looks like display manufacturers follow the same path. For example, Sony calls their UHD/HDR mode rec2020, which suggests they will use rec2020 targets and saturations, but JVC doesn't even have a rec2020 colour profile in their Autocal software; they only offer a P3 filter and a DCI/P3 profile. They do have a colour profile called Reference which seems to use rec2020 saturations, but this is not the profile called up by their HDR mode, which selects P3, not rec2020, when HDR content is detected.
Of course, you can create a custom profile using the rec2020 coordinates and the P3 filter to get as close as possible, but until there is clear agreement between manufacturers and software developers like Madshi, there will be no calibration standard for UHD Blu-ray.
So I'm waiting, hoping things will get clearer by the time UHD Bluray is released. But the lack of agreement between experts certainly doesn't make this easy.
It sounds like MadVR is essentially placing itself between the source and the display and doing the color management itself, performing the conversion from 2020 to your display's gamut. In a normal AV setup, the UHD Blu-ray player will send 2020 to the television, and the television then handles the conversion from 2020 to its native gamut.
As I have argued previously, signal generators do not send out gamut-specific signals. A generator sends out full red (for example), and the color that appears depends upon the physical properties of the display. If the display's reach is P3, then it will be a P3 red. If the display reaches out to 2020 (unlikely), then it will appear as a 2020 red. All you can do is plot the resulting measurements in the software, and you can, of course, select 2020 as the target.
This really confuses the issue, but until the experts agree on the way to do this, we're left with a best-guess scenario.
There is no doubt that most displays used for grading content right now are closer to P3 capability than to rec2020, and there is no doubt that most consumer displays are closer to P3 than rec2020 as well. That's not the question.
The UHD Standard doesn't even mention P3. Using rec2020 as a container, and playing back the content with a rec2020 calibration, makes it possible not to worry about the actual capability of the mastering display, whether it was P3 or rec2020 (and yes, this is going to change over the next couple of years; by 2020 most likely everything will be mastered to rec2020).
But deciding to convert to P3 makes it 1) not right for all titles, especially going into the future and 2) not right for all displays. It just feels wrong, and it forces two calibrations when one (rec2020) would be enough.
I guess the only positive aspect of calibrating to P3 is that it makes it possible to show nice little graphs with nice little targets met right at the edge at 100% saturation, but that doesn't make it easier to calibrate a set, especially when UHDTV starts broadcasting in rec2020 as well.
Hopefully this will get clearer and we'll stick to a single rec2020 calibration, even on sets limited to P3. Then the hardware and software can grow towards the end goal, which is rec2020, both for mastering and content reproduction.
If MadVR is converting its output to the mastering gamut, then it is doing it wrong. You would have to change your display configuration every time a different mastering gamut is defined in the metadata. Think of the mastering gamut as an intermediate step between the 2020 encoding gamut and your display's native gamut, sort of like a connection space in ICC profiles.
It's not true that madVR would always convert the gamut to P3. The user can tell madVR which gamut his display is calibrated to. 2020 is a valid choice. If the user tells madVR that his display is calibrated to 2020 then madVR will passthrough the full HDR color data to the display without compressing the gamut in any way. However, if the user tells madVR that his display is calibrated to e.g. P3 or BT.709 (or if the user doesn't provide madVR with any such information at all), madVR will then clip the gamut to P3 or BT.709, by using math that avoids hue or lightness changes. Which means madVR compresses the gamut in higher quality than what any display is likely to do.
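The simplest possible form of such a conversion can be sketched with the standard linear-light BT.2020-to-BT.709 matrix followed by a hard clip. To be clear, madVR's actual mapping is described above as avoiding hue and lightness changes, which this naive version does not attempt; the sketch only shows the mechanics:

```python
# Naive gamut clip from BT.2020 to BT.709 using the standard linear-light
# conversion matrix, then clamping to [0, 1]. madVR's real mapping is
# smarter (it avoids hue/lightness shifts); this just shows the mechanics.
M2020_TO_709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def clip_2020_to_709(rgb):
    out = [sum(M2020_TO_709[i][j] * rgb[j] for j in range(3)) for i in range(3)]
    return [min(max(c, 0.0), 1.0) for c in out]  # hard clip out-of-gamut values

print(clip_2020_to_709([1.0, 1.0, 1.0]))  # white stays ~(1, 1, 1)
print(clip_2020_to_709([1.0, 0.0, 0.0]))  # a full 2020 red clips to 709 red
```

The negative intermediate values for saturated colors are exactly the out-of-gamut components a display (or madVR) has to decide how to handle; hard clipping them is the crudest option.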
All the GPU manufacturers are working on HDR support now. Once they add APIs for me to signal SMPTE 2084/86 to the display, madVR will support doing that, too, of course.
One major issue with HDR that I can see is that "external" calibration is becoming very very difficult because the *display* is going to do a lot of both gamut and lightness processing with unknown algorithms. If the UHD Blu-Ray player were responsible for converting HDR content to something the display can handle, we could use an external video processor to do calibration in a good way. But since all the processing is going to be done by the display, calibration has to work on the original HDR data, without knowing what kind of tone and gamut mapping algorithms the display is going to apply afterwards. That makes things very very complicated for calibration companies, as far as I can see.
The benefits of letting madVR do the conversion from HDR to what the display can handle are:
1) madVR probably does tone + gamut mapping in much higher quality than what any of the first generation HDR compatible displays will do.
2) 3dlut calibration still works as usual. No change needed for the calibration software.
Unfortunately UHD Blu-Ray comes with a new copy protection, so using madVR for UHD Blu-Ray playback is probably going to be hard or even impossible, for a while at least.
How to calibrate a display for HDR content? IMHO the best approach would be to use test patterns that are encoded using SMPTE 2084/86, with similar metadata to what actual UHD Blu-Rays are going to use. That appears to be P3 gamut in a 2020 container, using either 1200nits or 4000nits peak luminance. Only this way we are calibrating the display with test patterns that are running through the exact display processing algorithms that are later going to be used for actual UHD Blu-Rays.
Of course another option would be to try calibrating using 2020 in a 2020 container. But I haven't seen any HDR content with these properties yet, so I don't think it's worth the headaches at this point. Just my personal opinion, of course...
ChromaPure Software/AccuPel Video Signal Generators
The difference between calibrating to a "P3 gamut in a 2020 container" and "just calibrating to a P3 gamut" would be the test patterns used. For calibrating to a "P3 gamut in a 2020 container" you would need to use SMPTE 2084/86 test patterns which would be encoded as BT.2020, but which would only actually use the P3 subset coverage of BT.2020, and which would come with the proper metadata to tell the display that the data is limited to P3.
For "just calibrating to a P3 gamut" you would probably use simple gamma-encoded test patterns, I suppose?
Why are those 2 things different? They might not be. It depends on the display. The display might use different processing algorithms depending on whether the content is gamma encoded or HDR/BT.2020 encoded, regardless of whether the actual gamut is P3 or not. Or maybe the display would not use different algorithms. It's anyone's best guess right now, and it might differ from display to display. But the "safest" solution to calibrate for UHD Blu-Ray is probably to use the same content type and encoding which UHD Blu-Ray is going to use, namely SMPTE 2084/2086.
I should add that I'm far from a calibration expert, so please take my random guesses with a pinch of salt.
Let's talk about a 2000 nits display. I've seen 1200 nits content and 4000 nits content. Let's suppose we had a test pattern generator which could output any nits number we want, encoded as SMPTE 2084/86 with any metadata we want. One option would be for you to do a full measurement for 4000 nits video content. You would do that e.g. by first measuring the grayscale from 0 to 4000 nits in certain steps. This would allow you to exactly measure the display response for 4000 nits content. Said 2000 nits display might reproduce the grayscale correctly up to 2000 nits and then simply clip all values above that to 2000 nits. Or if the display is any good, it might start compressing the grayscale, e.g. at 1500 nits or so, so that a 4000 nits pixel is displayed as 2000 nits, a 1500 nits pixel is displayed as 1500 nits, and every pixel value between 1500 and 4000 nits is displayed somewhere between 1500 and 2000 nits, using a reasonable compression curve.
Then you could repeat the above procedure with e.g. a 1200 nits test pattern source, measuring the grayscale from 0 to 1200 nits. Why can't you reuse the measurements already taken from 0-4000 nits? The reason is that the display MAY use a different compression curve for content that is limited to 1200 nits. If the display is clever, it will not do any compression at all when it can do 2000 nits and the content is limited to 1200 nits, while it might do some compression for 4000 nits content. Another display might always simply clip; in that case doing just one measurement set for 0-10000 nits content should suffice. But if the display uses different tone mapping curves for different content peak luminance values then you may have to do one calibration for every possible content peak luminance value. Ouch.
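The 1500-to-2000 nits roll-off described above can be sketched in a few lines of Python. This is purely illustrative: the knee point and the tanh shape are arbitrary choices for the sketch, not something any real display is documented to use.

```python
import math

def tone_map(nits, knee=1500.0, display_peak=2000.0):
    """Illustrative roll-off for a 2000 nits display: pass everything
    below the knee through unchanged, then compress values above it so
    the output approaches (but never exceeds) the display peak."""
    if nits <= knee:
        return nits
    headroom = display_peak - knee
    # tanh has unit slope at 0 (so the curve is smooth at the knee)
    # and saturates at 1 (so the output stays below display_peak).
    return knee + headroom * math.tanh((nits - knee) / headroom)
```

With these numbers a 4000 nits pixel comes out just under 2000 nits, a 1500 nits pixel stays at 1500 nits, and everything in between lands between the two, matching the behaviour described above.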
The same problem could also apply to gamut mapping, not only to tone mapping. E.g. you may have to do one calibration for P3 HDR content, one for BT.2020 HDR content, one for BT.709 HDR content, and one for every other non-standard gamut HDR content. It will depend on the display. If the display just clips, one calibration per gamut will probably suffice. If the display performs some clever gamut compression algorithm, depending on the video content, maybe a separate calibration for every possible HDR source gamut might be necessary for "perfect" results. It's really a mess for calibration, IMHO.
Hope that makes sense?
Edit: Maybe one reasonable compromise would be to try calibrating the display with test patterns that match the peak luminance capability of the display, in the hope that no tone mapping is used in that case. Also calibration to just P3 (in BT.2020 container) might be another reasonable compromise. Everything else (tone mapping of content that has higher peak luminance than the display, and gamut mapping of content that has wider than P3 colors) would then be up to the internal tone + gamut mapping algorithms of the display. Which may or may not do a good job. This compromise would make it impossible for the calibration to change the tone + gamut mapping behaviour of the display, obviously. When doing one separate calibration for different content peak luminance values, the calibration could actually apply a modified tone mapping curve, if so desired.
Last edited by madshi; 12-27-2015 at 03:48 AM.
1) The pixel values need to be encoded in SMPTE 2084 and BT.2020. Meaning the desired P3 colors need to be mapped to BT.2020 (3x3 matrix multiplication to get from P3 to XYZ, and another 3x3 matrix multiplication to get from XYZ to BT.2020) and the desired linear light nits values need to be converted using the SMPTE 2084 transfer function. Ideally the final values should be dithered down to output bitdepth.
2) There must be HDMI HDR InfoFrames sent to the display, signalling the use of SMPTE 2084, with the appropriate metadata for content peak luminance and gamut information.
Currently AMD, NVidia and Intel do not support 2) yet, but they're working on it.
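As a concrete illustration of step 1), here is a small Python sketch that takes a linear-light P3 triplet, maps it into BT.2020 via XYZ, applies the SMPTE 2084 (PQ) transfer function, and quantizes to full-range 10-bit code values. The matrices are the standard D65-adapted RGB-to-XYZ matrices, the XYZ-to-2020 matrix is derived by inversion, and the dithering mentioned above is omitted (plain rounding is used instead, and full range rather than video levels, for simplicity).

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

P3D65_TO_XYZ = [[0.486571, 0.265668, 0.198217],
                [0.228975, 0.691739, 0.079287],
                [0.000000, 0.045113, 1.043944]]
BT2020_TO_XYZ = [[0.636958, 0.144617, 0.168881],
                 [0.262700, 0.677998, 0.059302],
                 [0.000000, 0.028073, 1.060985]]

def mat3_inv(m):
    # Inverse of a 3x3 matrix via the adjugate.
    (a, b, c), (d, e, f), (g, h, i) = m
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    return [[(e*i - f*h)/det, (c*h - b*i)/det, (b*f - c*e)/det],
            [(f*g - d*i)/det, (a*i - c*g)/det, (c*d - a*f)/det],
            [(d*h - e*g)/det, (b*g - a*h)/det, (a*e - b*d)/det]]

def mat3_apply(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

XYZ_TO_BT2020 = mat3_inv(BT2020_TO_XYZ)

def pq_encode(nits):
    """SMPTE 2084 inverse EOTF: absolute luminance in nits -> [0,1] signal."""
    y = max(0.0, nits) / 10000.0
    ym = y ** M1
    return ((C1 + C2 * ym) / (1.0 + C3 * ym)) ** M2

def p3_pattern_code_values(rgb_p3, peak_nits, bits=10):
    """Encode a linear P3 triplet (0..1, scaled to peak_nits) as
    full-range PQ/BT.2020 code values (rounded, not dithered)."""
    xyz = mat3_apply(P3D65_TO_XYZ, rgb_p3)
    rgb2020 = mat3_apply(XYZ_TO_BT2020, xyz)
    maxcv = (1 << bits) - 1
    return [round(pq_encode(v * peak_nits) * maxcv) for v in rgb2020]
```

For example, a 100 nits P3 white comes out as three equal code values around 520 of 1023, since P3 and BT.2020 share the D65 white point and 100 nits sits at roughly 51% of the PQ signal range.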
Thanks for pitching in.
A few questions:
1) As far as I can see, MadVR does always convert to P3: if I have a 3D LUT for rec2020 and no 3D LUT for P3, MadVR still converts HDR content using rec2020 as a container to P3.
2) As far as I can see, MadVR doesn't allow us to specify what to do when there is no 3D LUT for a given calibration slot, which is an issue given that displays are going to have different calibrations for rec-709 and HDR/rec2020. For example, a projector like the JVCs will usually have a calibration for rec-709, NTSC and PAL from one baseline which doesn't use a P3 filter, so as not to lose brightness unnecessarily and to keep the best possible saturations from the native gamut. Then a separate calibration will be used for P3/rec2020, either using a P3 filter to extend the capabilities of the native gamut or using a different calibration with P3 or rec2020 saturations. How does MadVR deal with that? At the moment, I think MadVR is only able to deal with one baseline, and when there is no calibration in a slot, it converts to what it believes is the display's capability. So if there is a rec-709 calibration, it will convert P3 or rec2020 to rec-709. But what if we would like no conversion to happen for P3 because the display can reach 90% of P3 without a filter and 100% of P3 with a filter? We can have the display calibrated internally to P3, yet not want to use a 3D LUT because our internal calibration is good enough; for example, a calibration done with the JVC Autocal software using a custom P3 colour profile provides excellent results and can't be improved with a quick LUT. MadVR is unable to deal with such a situation, which is going to cause lots of issues down the line. MadVR needs to provide more flexibility regarding what is done when there is no calibration in a slot. Converting to rec-709 just because there is a 3D LUT for rec-709 isn't necessarily the best thing to do.
3) Regarding calibrating to rec2020, you are saying not to bother, but what is the difference between calibrating to rec2020 and calibrating to P3? One calibration (rec2020) allows us to deal with ALL UHD Bluray content, while the other (P3) only allows us to deal with current content. There is already content being mastered to rec2020, and more is likely to come. So why convert the container to a temporary gamut which isn't even part of the specification for UHD Bluray? Also, why would we need to use specific patterns? If we are using madVR to convert PQ gamma to a standard gamma on a non-HDR display, why couldn't we use standard patterns and calibrate to P3 or rec2020 without having to switch the display to HDR mode?
4) You are saying you are not an expert on calibration, yet you are contradicting what experts on calibration are saying we should be doing regarding UHD Bluray calibration, which is to use rec2020 saturations. Why? If you have a different opinion, could you please post, as I have already requested, in the discussion where these experts have already given their advice, so that, with their oversight, you can explain why you disagree and why you believe a P3 calibration is more appropriate and more relevant? I am not in a position to challenge you on this, yet your position is very confusing. I would really appreciate it if you could challenge the position of the experts who have taken the time to explain what should be done, in their opinion, so that this can be clarified and we can reach a better understanding of the situation. Here is again the thread where all this has been discussed: http://www.spectracal.com/forum/view...hp?f=92&t=5859. Towards the end of the thread, I wrote a recap to make sure that I wasn't misunderstanding anything - I'm not an expert either! - and there was no significant correction to this recap, in particular regarding the need to calibrate to rec2020 saturations, not to P3.
5) MadVR does a great job at converting and displaying HDR content on both HDR and non-HDR-compatible displays, so thanks for adding this feature. I just hope the calibration situation will be resolved so that we know what we should be doing on the display side to get consistent results for next-gen content without having to deal with more calibrations than necessary, especially as rec2020-mastered content becomes more frequent. There is content mastered using the Christie rec2020 projector, so it's unavoidable that true rec2020 content will arrive at some point over the next couple of years. We will have to deal with it properly, and converting it to P3 isn't necessarily the best way. Displaying the content using a rec2020 calibration, even on a P3-limited display, might give a better representation of the content, as it takes one gamut conversion out of the process.
Managing HDR content production and Display Device capabilities
Why does HDR/WCG need metadata?
Proposed Framework for BDA HDR Subjective Tests
HDR content is often created with some pixels containing increased content peak brightness levels that cannot be reproduced on all HDR displays. High-quality HDR displays should avoid hard clipping at their peak luminance limit and instead roll off bright highlights that exceed the display capabilities, mapping the values that cannot be shown to values that can be shown on the display. Of course, more advanced tone mapping algorithms are desired, potentially even required, especially in scenarios where the frame average luminance exceeds the display capabilities. In either case, consumer displays should be designed expecting to receive content with video parameters exceeding their own display capabilities, and the displays should be prepared to manage the signal without compromising the user experience.
This goes back to the discussion a few posts back about hard clipping the source material's MaxCLL to the capability of the display in order to maintain 25-100cd/m2 MaxFALL (I think it was Tom's post). Hard clipping the material could compromise the user experience as will bringing the white level control down to prevent hard clipping. The MaxFALL would be too low to be usable and enjoyable for consumers. Think about all of those users that have set up their current TVs to show levels up to 254 @ 120cd/m2 - their average frame light level is too low, with 235 measuring at 70cd/m2. For what reason? To preserve what? All post companies I calibrate for don't use above 235 and the same can be said for most BDs. But moving forward with HDR this material will be important, so hard clipping may not be the answer either. Whether consumer displays will be designed to be intelligent to roll off highlights is a different story...
I understand the handshake, and it struck me that pattern generators would need to be able to tell the display which color space the triplets are for.
It sounds like the generator will also need the ability to send metadata as well? And three different data specs?
I think I understand the 75% approach, and then letting it clip.
The HDfury Integral device helps me with the pattern source confusion.