AVS › AVS Forum › Display Devices › Display Calibration › How power law gamma calibration can lead to crushed blacks

# How power law gamma calibration can lead to crushed blacks

I was encoding test patterns for a little project I'm working on and started digging into transfer functions and the linkage between gamma and saturation. It dawned on me that the current practice of calibrating one's display gamma to a simple power law with an exponent of 2.2-2.5 can lead to significant black crush, as well as introducing unintended contrast into a scene. I did see a thread on this in the SpectraCal forums but not here, so if it's been discussed before please forgive me.

ITU/EBU recommend that the end-to-end gamma of the video chain be between 1.1 and 1.2, so we have a clearly defined target. It's commonly stated that video RGB is encoded with an exponent of 1/2.22, but the full function is:

v' = 4.5v, for v < 0.018
v' = 1.099v^0.45 - 0.099, for v >= 0.018
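For anyone who wants to play with this numerically, here's a minimal sketch of that encoding function and its exact inverse (Python; the constants are just the BT.709 ones quoted above):

```python
def bt709_oetf(v):
    """BT.709/601 encoding: scene-linear light (0-1) -> video signal v'."""
    return 4.5 * v if v < 0.018 else 1.099 * v ** 0.45 - 0.099

def bt709_oetf_inverse(vp):
    """Exact inverse: video signal v' -> scene-linear light."""
    return vp / 4.5 if vp < 4.5 * 0.018 else ((vp + 0.099) / 1.099) ** (1 / 0.45)
```

An "ideal" display per the recommendation would apply bt709_oetf_inverse and then a further power of 1.1-1.2.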

So what we want our displays to do is invert the function above and then apply a modest power of between 1.1 and 1.2.

What do we do in practice? We feed the display a linear input in 10% steps and (if we have a 10pt gamma control) try to get a nice flat fit to the equation v = v'^gamma (gamma = 2.2, 2.3, etc.). This completely ignores the fact that the real video signals we will eventually watch are encoded using the equation above.

So what are the consequences of the simplified power law calibration approach?

This is a plot of the result of feeding a linear signal, encoded using the BT.709/601 transfer function, into a display with a simple power law response. Ideally we want a flat line somewhere within the cross-hatched region, and you can see we don't get that. Most significantly, the deviation below 20% stimulus can cause significant compression of the response as we go into black. You'll also notice that the response is not particularly flat until the upper stimulus levels. The implication is that contrast which is not in the original video information is being added at many levels.
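You can see the same effect without a plot by computing the effective end-to-end exponent at each stimulus level: encode with the BT.709 function, decode with a straight power law, and solve output = v^g for g (a sketch; 2.2 is just an example display gamma):

```python
import math

def bt709_oetf(v):
    """BT.709/601 encoding: scene-linear light (0-1) -> video signal v'."""
    return 4.5 * v if v < 0.018 else 1.099 * v ** 0.45 - 0.099

def end_to_end_gamma(v, display_gamma=2.2):
    """Effective exponent g such that display_output == v**g for stimulus v."""
    out = bt709_oetf(v) ** display_gamma  # simple power-law display response
    return math.log(out) / math.log(v)

for v in (0.05, 0.1, 0.2, 0.5, 0.9):
    print(f"{v:.0%} stimulus: end-to-end gamma {end_to_end_gamma(v):.3f}")
```

With a 2.2 display the exponent climbs from about 1.09 at 90% stimulus to about 1.23 at 5%, i.e. the response steepens exactly where shadow detail lives.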

I have been calibrating my display to a display gamma of 2.3, and I find I end up raising brightness a bit by eye to counter this effect when watching some material. If a 10pt gamma control is available, a better approach is to calibrate to the BT.709/601 transfer function directly, and this is an option in some of the calibration packages. Because it's not a simple power law, the way it's implemented (at least in HCFR) leads to a non-intuitive target gamma value (discussion here).

If you want to give this approach a try in HCFR use the "Camera gamma with standard offset" fitting function and targets of:

camera gamma = 2.44 for an end-to-end gamma = 1.1
camera gamma = 2.66 for an end-to-end gamma = 1.2
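If I understand the fitting function correctly, "camera gamma" here means the display applies the inverse BT.709 curve raised to that exponent, so the end-to-end exponent works out to 0.45 × camera gamma (2.44 → 1.10, 2.66 → 1.20). A sketch of the resulting per-step luminance targets, under that reading (treat the interpretation of HCFR's function as an assumption):

```python
def display_target(vp, camera_gamma=2.66):
    """Target relative luminance for signal v' when the display applies the
    inverse BT.709 curve raised to camera_gamma ("camera gamma with standard
    offset"). Above the linear toe this yields end-to-end v**(0.45*camera_gamma)."""
    return ((vp + 0.099) / 1.099) ** camera_gamma

# targets at the usual 10% signal steps fed by a pattern generator
for pct in range(10, 101, 10):
    print(f"{pct}%: {display_target(pct / 100):.4f}")
```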

You'll find that flattening out this gamma curve requires raising the 10%, 20%, and 30% controls (in decreasing amounts) relative to your standard calibration. I've seen people mention favoring a lower (power law) gamma at the low end, and this achieves a similar result, but it's a much better match to how the video signal is actually encoded.

Before/After pictures, notice the overemphasis on shadow and loss of detail in the dress in the power law calibration.

Power law gamma = 2.3: end-to-end gamma varies from 1.15 to 1.5

Camera (inverse BT.709 transfer function) gamma = 2.65: end-to-end gamma is flat at 1.2


Camera gamma is way too bright.

But yes, I agree: at a minimum you need to use a black offset. The newer gamma formulas like BT.1886 include a black offset as part of their formulas.

While camera gamma isn't the answer, you're right to be asking the question.
Quote:
Originally Posted by sotti

Camera gamma is way too bright.

If you target 2.2, yes, because that results in an end-to-end gamma of 1.0; but target 2.65 and you'll get to the "dim surround" recommendation. That may still be too bright for theatre conditions, but you can always push it higher (2.8 will approximate a simple power law value of 2.5). The other main advantage, which I didn't mention above, is that you get much better linearity between mid-tones (40-75% stimulus) and darks in the 10-20% range than a power law will give you. This gives a more natural-looking image by avoiding the addition of false contrast (any slope in end-to-end gamma adds contrast to the image which is not encoded in the original).

I don't understand what you mean by "camera gamma isn't the answer". It's the exact inverse of the BT.709 encoding function. You start there to linearize the encoded signal and then tack on the additional 1.1 - 1.2 power.
Quote:
Originally Posted by sotti

The newer gamma formulas like BT.1886 include a black offset as part of their formulas.

BT.1886 recommends two different EOTFs designed to match a CRT response. This seems to be in contradiction to the end-to-end recommendation of 1.1-1.2 from Hunt.
Quote:
Originally Posted by zoyd

BT.1886 recommends two different EOTFs designed to match a CRT response. This seems to be in contradiction to the end-to-end recommendation of 1.1-1.2 from Hunt.

The only thing that matters is that the transfer function of your display is a perceptual match to the function used on the display where they graded the master.

Theory is fun to play with, but the rubber meets the road in the actual studios. You want to do what they are doing. Eventually they'll likely all be using BT.1886; right now they typically use a power law with an exponent of 2.35.
Well, that's a somewhat unsatisfactory state of affairs. So color science recommends a slightly non-linear signal chain for accurate perceptual reproduction, and instead we get a system with variable non-linearity (increasingly so as we approach black), but that's OK because it's being mastered on displays with the same non-ideal response function. Do directors, colorists, etc. realize they are not seeing a perceptually accurate rendition of their camera/telecine transfer signals?* Why weren't encoding transfer functions designed to better match CRTs? That would have made the world a tidier place.

*my guess is yes they do and adjust their displays accordingly.
Quote:
Originally Posted by zoyd

Do directors, colorists, etc. realize they are not seeing a perceptually accurate rendition of their camera/telecine transfer signals?*

Directors and cinematographers don't care what the camera saw; they are going for a specific look (PS: most of those guys don't know that much about calibration). The gamma of the display they work on is simply their palette. All the other stuff is theory; the meat of the issue is having your display look like their display.
What about telecine transfer of already-produced films? Does the digitization workflow include a step which corrects for the mismatch between the encoding function and a 2.35 CRT?
Another problem with mismatched transfer functions is color accuracy when saturation < 100%. You'll notice in the photographs above that in the 2.35 power law response, not only are the shadows overemphasized but the skin tones have been shifted toward red. This can be explained with the following example:

1. Take a light skin tone (color checker pattern 2) with x=0.3871 y=0.3538 and Y=34.95%

2. Calculate R'G'B' using the BT.709 transfer function and you'll get 193,137,117 (0-255)

3. Now calculate the display xyY using a 2.35 straight power law and you'll get x=.4071, y=.3579, Y=28.6%

The y coordinate moves very little but x moves 0.051 towards red and Y is too dark. dE94 for this case is 5.7!
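The three steps above can be reproduced with the standard BT.709/sRGB primary matrices (a sketch; matrix rounding shifts the last digit or two relative to the numbers quoted):

```python
def xyY_to_XYZ(x, y, Y):
    return (x * Y / y, Y, (1 - x - y) * Y / y)

def XYZ_to_xyY(X, Y, Z):
    s = X + Y + Z
    return (X / s, Y / s, Y)

def mat(m, v):
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def oetf(v):
    """BT.709/601 encoding function."""
    return 4.5 * v if v < 0.018 else 1.099 * v ** 0.45 - 0.099

# BT.709 primaries / D65 white (same matrices as sRGB)
XYZ_to_RGB = [[ 3.2406, -1.5372, -0.4986],
              [-0.9689,  1.8758,  0.0415],
              [ 0.0557, -0.2040,  1.0570]]
RGB_to_XYZ = [[0.4124, 0.3576, 0.1805],
              [0.2126, 0.7152, 0.0722],
              [0.0193, 0.1192, 0.9505]]

# 1. light skin tone: x=0.3871, y=0.3538, Y=34.95%
XYZ = xyY_to_XYZ(0.3871, 0.3538, 0.3495)

# 2. linear RGB, then encode with the BT.709 OETF
rgb_enc = [oetf(c) for c in mat(XYZ_to_RGB, XYZ)]
print([round(c * 255) for c in rgb_enc])  # -> [193, 137, 117]

# 3. decode with a straight 2.35 power law and convert back to xyY
rgb_disp = [c ** 2.35 for c in rgb_enc]
x, y, Y = XYZ_to_xyY(*mat(RGB_to_XYZ, rgb_disp))
print(round(x, 4), round(y, 4), round(Y, 3))  # x pushed toward red, Y too dark
```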

input → output
Quote:
Originally Posted by zoyd

Another problem with mismatched transfer functions is color accuracy when saturation < 100%. You'll notice in the photographs above that in the 2.35 power law response, not only are the shadows overemphasized but the skin tones have been shifted toward red. This can be explained with the following example:

I know.
That's why I was saying you want exactly what the studio guys are using.
OK, but what about telecine transfer, or live broadcasts? Are you saying there is someone standing in between every delivery path adjusting levels so that it will appear correct on a 2.35 CRT at the user end?

Also, I've only done the one A/B comparison using the Blu-ray of "Tree of Life", but that material is clearly improved using this approach (on my panel).
Quote:
Originally Posted by sotti

I know.
That's why I was saying you want exactly what the studio guys are using.

So, exactly what are the studio guys using? Your assumption is that they grade and remap gamma from BT.709 to 2.35 and you assume this for all content, regardless of original source (film, video, TV (live or recorded)). If this assumption is incorrect you will have the problem stated in post 1, simple math.

Do we have anyone reading who actually works in these fields and can comment? The only previous comment I could find was from a thread that brought up similar issues:

Quote:
Originally Posted by Light Illusion

As both a professional colourist, and calibrator for professional post-production operations, all I can say is that a display gamma of 2.2 is the industry standard, with approx 23FtL as peak white.

Also for TV colour work (which the above figures are for) the grading room is illuminated to approximate a 'normal' living room environment... what ever that means, the reality is a room similar to your lounge at night with a few room lamps on.

For film grading, things are rather different, but for all films that end up on DVD or BlueRay, etc., they are re-mastered (sometimes by re-grading, some times via a LUT application) to adhere to the above TV figures for display.

This is of course completely contradictory to your assumption, but I have no way to know whether it is an accurate statement of industry-wide practice.

Doug Blackburn takes a practical approach to the topic in that same thread, basically saying don't take the numbers as gospel and see what looks right to you using reference material. Of course this will work for the reference material used, but it assumes all other material followed the same gamma workflow as the material he uses to grade the end-user display.

He also states:

Quote:

I've never seen a TV or projector where 2.4 looked right... it always seems to be too dark. 2.3 works on some displays and 2.2-2.25 is good for others (especially if they haven't got inky-dark blacks).

Which is consistent with the encode/decode mismatch stated in post #1.

If we assume, as above, that gamma is remapped to appear correct on a 2.2 display, the end-to-end gamma becomes:

This situation would also be consistent with Doug's observation and would imply that an inverse BT.709 function would not be an appropriate calibration target. The grading/LUT remapping takes care of it in production.
Zoyd

I feel your pain on this one; I've also found it hard to get anything concrete out of the pros. Charles is probably the most helpful source, but keeping up with his view is hard as it's a bit of a moving target. See for example this

http://www.spectracal.com/downloads/...20or%202.4.pdf

The main point is to forget the camera transfer function; it's a waste of time.

When talking about gamma, always be clear, and demand clarity from others, about which formula is being used: does it have black compensation, what's the offset, etc.

To my mind a log-log plot is the way to go; looking at those you can really see what's going on, as well as avoid any single-number confusion.

The issue around crushed blacks is a real one, but I think it has more to do with how you treat the relative brightness of your display's black (i.e. lowest grey) compared with the values just above it, and how they would have been perceived in the mastering studio.

Personally I think treating desired black as zero is mathematically unhelpful, as well as physically impossible; a better approach is thinking about how to map the entire scale so that the result looks correct.

It helps me to imagine how you would want to display an image on a display that had its black at Y=50%: where would you show 1%, 10%, 50% grey, etc.? Yes, it would look washed out, but what's the best you could do?

John
Thanks for the comments John. I've read that piece a number of times and it's still not getting at the crux of the problem.

Quote:

The key point concerning the consumer's gamma is this: What we
seek to maintain at presentation is the appearance of the colors at
program approval, not necessarily the physical stimuli.

In the previous paragraph he states by way of example that the colorist will increase chroma gain (to match a 2.4 gamma display, btw). So over some range of luminances colors will appear correct, assuming you reproduce the colorist's gamma function. This still does not address the gamma mismatch: the colorist can't make the two functions agree with each other over all luminances with chroma gain alone. And it's still unclear whether this is actually standard practice, given Light Illusion's comments.

I think I'll just have a beer and watch some TeeVee.
Hi all,

You may find this helpful - it's the characterization of a Sony BVM monitor that's used in the most common package for film colour management:

I can't post URLs - Google for "truelight standard colour spaces", check out page 32 of the PDF.

Alternatively it may just create even more confusion

I try to stay aware of what goes on in grading suites - my actual specialty is in VFX (can't post a link, but look up my name on IMDB), including setting up the colour pipeline to match what the grading guys are doing. Sometimes that overlaps into setting up the grading suite too.

As you (and Charles Poynton) have gathered the display spec area is a mess. I think this is mostly because it's really hard for a post facility to change to a new standard. Even if everyone had the will to implement a new standard a lot of suites don't have the hardware to apply a calibration LUT, especially a different one for the reference monitor and the secondary "client" monitor. Even if a new standard was implemented across a whole facility, what happens when the client goes to the audio mix in another company and the picture looks different there? It seems insane, but the Sony BVM CRT set up with a PLUGE chart is still the common element that people can agree to agree on, even though now all those monitors are dying and can't be economically repaired.

It's ironic that everything becomes easier when talking about matching displays to actual analogue film prints, because Kodak set up the film standards in about 1992 and everyone has stuck to them even though it's a far more complex process. The DCI spec is also adhered to well - it's just the "TV" area that's rather grey.

Anyway. All I can say is that I use the above Truelight profile as a target when calibrating VFX workstations, typically using Eizo 241 and 243 monitors, and it's been an excellent match thus far. I've even done the torture test of sitting one next to a reference monitor with the same footage and it matched really well.

I feel your pain at not having a standard, and here's hoping that the proliferation of wide gamut LCDs and OLEDs doesn't make everything even less consistent...
There is some talk here about BT.709; has anyone looked at ITU BT.1886?

It has some specifications about gamma.
Quote:
Originally Posted by zoyd

Thanks for the comments John. I've read that piece a number of times and it's still not getting at the crux of the problem.

Agree, I find he skirts around the issues but he does at least throw some numbers about.

Quote:
Originally Posted by zoyd

In the previous paragraph he states by way of example that the colorist will increase chroma gain (to match a 2.4 gamma display, btw). So over some range of luminances colors will appear correct, assuming you reproduce the colorist's gamma function. This still does not address the gamma mismatch: the colorist can't make the two functions agree with each other over all luminances with chroma gain alone. And it's still unclear whether this is actually standard practice, given Light Illusion's comments.

From what I've gathered by talking to various people involved with the process at various points, it all seems very "artistic" in practice, with a lot of key variables (from a perceptual point of view) left variable/unknown. I also think you have to work out who talks sense in language you can directly use, and who sounds plausible but doesn't really know what they are on about.

The point I think Charles is trying to make is that the colorist has some "dials" they tweak based on what they see; therefore it's what they see that we want to see. To be more precise, we want to have the same perceptual sensation they did when they were looking at the image. So if you assume an old-school grading room, one approach is to build a grading studio complete with 6500K backlights, grey walls, and a small screen (presumably things would be complete if we also had a vision mixing desk in front of us too). But even with this approach, if your black level or screen reflectance is off spec then you won't get the same image, even if your monitor had the same measured response as a BVM CRT.

Quote:
Originally Posted by zoyd

I think I'll just have a beer and watch some TeeVee.

know how you feel
Quote:
Originally Posted by lewis.saunders

Hi all,

You may find this helpful - it's the characterization of a Sony BVM monitor that's used in the most common package for film colour management:

Thanks for the comments, they are very informative. The document you referenced can be found here

Regarding analog film to digital transfers, are you saying the DCI specs are followed as normal practice in the transfer to DVD/BD or were you referring to digital theatre projection?
Quote:
Originally Posted by lewis.saunders

Hi all,

Hi and welcome

Quote:
Originally Posted by lewis.saunders

You may find this helpful - it's the characterization of a Sony BVM monitor that's used in the most common package for film colour management:

I can't post URLs - Google for "truelight standard colour spaces", check out page 32 of the PDF.

Alternatively it may just create even more confusion

Interesting: their formula for the CRT response I calculate as having a log/log slope of 2.6, which sounds right for a CRT.

Quote:
Originally Posted by lewis.saunders

I try to stay aware of what goes on in grading suites - my actual specialty is in VFX (can't post a link, but look up my name on IMDB), including setting up the colour pipeline to match what the grading guys are doing. Sometimes that overlaps into setting up the grading suite too.

As you (and Charles Poynton) have gathered the display spec area is a mess. I think this is mostly because it's really hard for a post facility to change to a new standard. Even if everyone had the will to implement a new standard a lot of suites don't have the hardware to apply a calibration LUT, especially a different one for the reference monitor and the secondary "client" monitor. Even if a new standard was implemented across a whole facility, what happens when the client goes to the audio mix in another company and the picture looks different there? It seems insane, but the Sony BVM CRT set up with a PLUGE chart is still the common element that people can agree to agree on, even though now all those monitors are dying and can't be economically repaired.

It's ironic that everything becomes easier when talking about matching displays to actual analogue film prints, because Kodak set up the film standards in about 1992 and everyone has stuck to them even though it's a far more complex process. The DCI spec is also adhered to well - it's just the "TV" area that's rather grey.

Thanks for that, certainly chimes with what we get as consumers.

Quote:
Originally Posted by lewis.saunders

Anyway. All I can say is that I use the above Truelight profile as a target when calibrating VFX workstations, typically using Eizo 241 and 243 monitors, and it's been an excellent match thus far. I've even done the torture test of sitting one next to a reference monitor with the same footage and it matched really well.

I feel your pain at not having a standard, and here's hoping that the proliferation of wide gamut LCDs and OLEDs doesn't make everything even less consistent...

That formula is nice in that it gives an explicit L value for black, is smooth, etc.

It would be very interesting to see measurements from a display with a raised black level that had been LUT'ed to look like a BVM.

John
Quote:
Originally Posted by visca blaugrana

There is some talk here about BT.709; has anyone looked at ITU BT.1886?

It has some specifications about gamma.

Thanks for the prod; yes, that gamma specification looks interesting. I need to play with it a bit more to see how the near-blacks behave on raising the black level.

John
BTW Charles Poynton has released the Second Edition (2012) of his book:

Digital Video and HD Algorithms & Interfaces, Second Edition
Quote:
Originally Posted by JohnAd

Interesting, their formula for the crt response I calculate at having a log/log slope of 2.6 which sounds right for a CRT.
I don't get a power law response from that formula, I get:

Unless I screwed up the calculation this is far from an idealized 2.4 power law.

Edit: See attached paper, Figure 1 - trace 2. Very similar CRT transfer function to what's plotted above. So if the single parameter power law fit is a poor representation of the actual CRT transfer function in use in production houses, the concept of reproducing a flat line gamma at the user end is seriously flawed.


Quote:
Originally Posted by zoyd

So if the single parameter power law fit is a poor representation of the actual CRT transfer function in use in production houses, the concept of reproducing a flat line gamma at the user end is seriously flawed.

Yep, a flat line target with the simple gamma formula seems wrong to me too.

John
So if a flat line gamma is not the correct gamma setting for an end device (i.e. an LCoS projector), what should it be? Or what would be more correct?
Quote:
Originally Posted by WTS

So if a flat line gamma is not the correct gamma setting for an end device (ie; LCOS projector), what should it be? Or what would be more correct?

Not an easy question to answer; as you have seen, it all depends on your assumptions about the characteristics of the mastering monitor. If we assume a mastering system like the one Lewis Saunders uses (which he refers to as "the most common package for film colour management"), then you would try to reproduce the curvature in my previous post and then adjust it up or down given your viewing environment.

If we just calculate the contrast ratio the colorist sees in such a system vs. the same ratio for flat gamma displays you get this:

2.4 in this case is better than the other options at these stimuli (it's worse above 35% but that has a bigger influence on color accuracy)

If we calculate the perceived differences in the first 12 Munsell color checker patches between a BVM raw display and a flat gamma display (assuming the gamut periphery has been calibrated to BT.709 and we are viewing the two monitors in the same environment) we get this:

| Color | 2.4 dE94 | 2.2 dE94 |
|---|---|---|
| dark skin | 2.7 | 4.8 |
| light skin | 4.5 | 3.0 |
| blue sky | 3.9 | 3.3 |
| foliage | 3.4 | 1.8 |
| blue flower | 4.3 | 1.7 |
| bluish green | 4.2 | 2.4 |
| orange | 2.2 | 1.9 |
| purplish blue | 2.6 | 1.9 |
| moderate red | 2.7 | 1.5 |
| purple | 2.1 | 2.6 |
| yellow green | 4.5 | 3.1 |
| orange yellow | 4.5 | 2.6 |

Other than the two darkest colors, a 2.2 gamma will give better color reproduction. Btw the above errors for either power law response would be considered unacceptable by most calibrators, but I don't know of anyone who actually measures "inside the edges". Of course, without a reference or an outboard LUT there isn't a whole lot you can do other than to characterize this region.

However, if the mastering system is using LUT'ed displays that produce nice flat gamma responses, then the current practice is fine; but we have no idea what percentage of what type of material is mastered on one system or the other. You could hedge your bets and average the two, or come up with a more complicated tailored function that minimizes the errors in the above table.

Edited by zoyd - 6/21/12 at 2:26pm
Quote:
Originally Posted by zoyd

Not an easy question to answer, as you have seen it all depends on your assumption about the characteristics of the mastering monitor. If we assume that a mastering system like the one Lewis Saunders uses (which he refers to as "the most common package for film colour management") then you would try to reproduce the curvature in my previous post and then adjust it up or down given your viewing environment.

If we just calculate the contrast ratio the colorist sees in such a system vs. the same ratio for flat gamma displays you get this:

2.4 in this case is better than the other options at these stimuli (it's worse above 35% but that has a bigger influence on color accuracy)

If we calculate the perceived differences in the first 12 Munsell color checker patches between a BVM raw display and a flat gamma display (assuming the gamut periphery has been calibrated to BT.709 and we are viewing the two monitors in the same environment) we get this:

| Color | 2.4 dE94 | 2.2 dE94 |
|---|---|---|
| dark skin | 2.7 | 4.8 |
| light skin | 4.5 | 3.0 |
| blue sky | 3.9 | 3.3 |
| foliage | 3.4 | 1.8 |
| blue flower | 4.3 | 1.7 |
| bluish green | 4.2 | 2.4 |
| orange | 2.2 | 1.9 |
| purplish blue | 2.6 | 1.9 |
| moderate red | 2.7 | 1.5 |
| purple | 2.1 | 2.6 |
| yellow green | 4.5 | 3.1 |
| orange yellow | 4.5 | 2.6 |

Other than the two darkest colors, a 2.2 gamma will give better color reproduction. Btw the above errors would be considered unacceptable by most calibrators but I don't know of anyone who actually measures "inside the edges". Of course, without a reference or an outboard LUT there isn't a whole lot you can do other than to characterize this region.

However, if the mastering system is using LUTed displays that produce nice flat gamma responses then the current practice is fine but we have no idea what percentage of what type of material is mastered on one or the other systems. You could hedge your bets and average the two or come up with a more complicated tailored function that minimizes the errors in the above table.

A lot of that is beyond my comprehension. Let's say on a high-end PDP display: what would an optimal gamma curve look like? I understand there are a lot of variables, but I'm looking for the most shadow detail while retaining the best MML possible. Also lots of pop in the colors.
If this could be ballparked I would appreciate it.

Thanks
Quote:
Originally Posted by wmwilker

A lot of that is beyond my comprehension. Let's say on a high-end PDP display: what would an optimal gamma curve look like? I understand there are a lot of variables, but I'm looking for the most shadow detail while retaining the best MML possible. Also lots of pop in the colors.
If this could be ballparked I would appreciate it.

Thanks

This is more of a preference question than a calibration question so I can't give you specific numbers. If you don't care to reproduce "what the colorist/director sees" then to increase shadow detail you lower gamma from 10%-30% and to increase contrast/saturation ("pop") you increase gamma from 30%-90% with a smooth transition between the two regions. Some displays have a dynamic contrast control which will do something similar but you'd have to decide for yourself what numbers to end up at.
The target for an ideal display is a "flat" 2.40 gamma. Sony's OLED monitors are capable of this, and that's how they are calibrated. Sony's CRTs were not. (0.01cd/m2 black level)

If your display has less than perfect black levels, the BT.1886 transfer function compensates for this.

If you want to emulate a CRT (and there's no real reason for an end user to do so) there is the alternate EOTF as defined in BT.1886. Most CRTs will be LUTed to as close to 2.40 as possible these days.
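For reference, the BT.1886 Annex 1 EOTF mentioned here folds the measured black level into an offset rather than clipping at zero (a sketch; Lw and Lb are the display's measured white and black luminances, and the example values are arbitrary):

```python
def bt1886_eotf(V, Lw=100.0, Lb=0.1, gamma=2.4):
    """BT.1886 reference EOTF: video signal V (0-1) -> luminance in cd/m^2.
    With Lb = 0 this reduces to a pure 2.4 power law."""
    a = (Lw ** (1 / gamma) - Lb ** (1 / gamma)) ** gamma             # user gain
    b = Lb ** (1 / gamma) / (Lw ** (1 / gamma) - Lb ** (1 / gamma))  # black lift
    return a * max(V + b, 0.0) ** gamma

print(bt1886_eotf(1.0), bt1886_eotf(0.0))  # endpoints hit Lw and Lb exactly
```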
Quote:
Originally Posted by zoyd

This is more of a preference question than a calibration question so I can't give you specific numbers. If you don't care to reproduce "what the colorist/director sees" then to increase shadow detail you lower gamma from 10%-30% and to increase contrast/saturation ("pop") you increase gamma from 30%-90% with a smooth transition between the two regions. Some displays have a dynamic contrast control which will do something similar but you'd have to decide for yourself what numbers to end up at.

I guess I'll have to "play" with different calibrations.
You mentioned "what the colorist/director sees". Is that based on a flat gamma?
Quote:
Originally Posted by Chronoptimist

Most CRTs will be LUTed to as close to 2.40 as possible these days.

Define "most"? "These days" is fine going forward, but what of legacy material mastered using raw CRT EOTFs? The one industry person who has commented so far claims that "the most common package for film colour management" is such a system.