Gamma & BT.1886 explained. Lightspace users check in here - need your help! - AVS Forum
post #1 of 200 Old 12-11-2013, 09:03 PM - Thread Starter
Advanced Member
 
spacediver's Avatar
 
Join Date: May 2013
Location: Toronto
Posts: 738
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 25 Post(s)
Liked: 65
The following is my current understanding of the issue. I do not claim to have perfect knowledge, and welcome any suggestions for improvement and correction.

Many of us have heard about calibrating our displays so that the gamma is correct (2.2, 2.4, take your pick). The reason often given is that it makes your display more likely to match the reference displays on which content was mastered. This is correct, but there is something deeper at play, and understanding it is important for appreciating the EOTF recommended in BT.1886.

In an 8 bit PC RGB framework, we have 256 possible shades of gray. Ideally, we want each successive shade of gray to be just distinguishable from the previous shade, and we want this to hold true whether we're talking about the difference between [23 23 23] and [24 24 24] or whether we're talking about [210 210 210] and [211 211 211].

This will ensure that we're making efficient use of the available dynamic range. If each successive step were too easy to distinguish, we'd be wasting our visual system's ability to discriminate finely. If successive steps were not distinguishable, we'd be limiting our overall dynamic range, and we'd be limited to fewer than 256 perceivable shades of gray.

In the former case, we'd get some nasty artifacts, where scenes that are supposed to have smoothly varying shades instead have unsightly banding/posterization. In the latter case, we'd get a lot of perceptual crushing/clipping.

So how do we go towards achieving a perceptually uniform relationship between our input level (V) and our perceptual lightness response (L*)?

A naive solution is to set up our systems so that the luminance (L) is a linear function of our input level (V). So if we double our RGB signal, we double our luminance output.

The problem with this is that the human lightness response (L*) is not a linear function of luminance (L).

See the following set of plots, which shows the results from several studies on the relationship between luminance and lightness.



Notice how the curves depict a nonlinear relationship, where the slope of the line starts out high and decreases as luminance increases.

What this means is that we are more sensitive to changes in luminance at the dark end than we are at the lighter end.
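One widely used model of this lightness response is the CIE 1976 L* function. A short sketch (not from the thread; `cie_lightness` is a made-up helper name) shows how much more visible a fixed luminance step is in the darks:

```python
# CIE 1976 lightness L* as a function of relative luminance Y (0..1).
# One standard model of the nonlinear lightness response described above.
def cie_lightness(y):
    """Return L* (0..100) for relative luminance y (0..1)."""
    if y > (6 / 29) ** 3:                  # threshold ~0.008856
        return 116.0 * y ** (1.0 / 3.0) - 16.0
    return (29.0 / 3.0) ** 3 * y           # linear segment near black (~903.3 * y)

# The same 5% luminance step is far more visible near black:
dark_step = cie_lightness(0.10) - cie_lightness(0.05)    # ~11 L* units
light_step = cie_lightness(0.95) - cie_lightness(0.90)   # ~2 L* units
```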

Now look at the natural relationship between the input voltage in a CRT monitor, and the resulting light output:



Notice how the nonlinear relationship here is "opposite" to that in the previous graph? In fact, the relationship of input signal (V) to luminance (L) is very close to the inverse of the relationship between luminance (L) and lightness (L*)!

Think about what this means. If we have a nonlinear relationship (A) between input (V) and luminance (L) and the inverse relationship (A') between luminance (L) and lightness (L*), we can state the following:

L = A(V)

L* = A'(L)

therefore L* = A'(A(V))

Since A' is the inverse of A, A'(A(V)) = V, a linear function (the two "cancel out").

And thus, a CRT's natural response characteristics will produce a perceptually uniform set of brightness levels!

This is not by design, but is rather an extremely fortunate coincidence (as Charles Poynton points out in this paper, from which the above two images were taken).
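The cancellation can be sketched numerically. Here the lightness response is approximated as a pure 1/2.4 power, which is a simplification (the real curve is only roughly a power law):

```python
GAMMA = 2.4

def crt_response(v):            # A: input signal -> luminance
    return v ** GAMMA

def approx_lightness(lum):      # A': luminance -> lightness (toy inverse model)
    return lum ** (1.0 / GAMMA)

# Composing the two gives back the input: perceived lightness tracks
# the input signal linearly, so the code values are spent in
# perceptually even steps.
for v in (0.1, 0.25, 0.5, 0.9):
    assert abs(approx_lightness(crt_response(v)) - v) < 1e-12
```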


Ok, so now we have an understanding of why we calibrate so that the relationship between our input signal (V) and our measured luminance (L) is of the form:

L = V^Gamma

where Gamma is an exponent value (typically 2.4).

Now this is fine when we are dealing with displays that have very low black levels. However, in reality, many displays do not have reference grade black levels.

Suppose we calibrate a display that has a minimum black level of 10 percent of the maximum luminance. On a display that had a maximum luminance of 100 cd/m2, this would mean a black level of 10 cd/m2. That's high, but it's good to use extreme examples for purposes of illustration.

How do we calibrate? We want a gamma of 2.4, but this assumes a very low black level.

A naive solution would be to do something like the following. Take a look at the image below. The blue curve represents a gamma of 2.4 with a black level of 0. The red curve represents a gamma of 2.4 with a black level of 10 percent.



Now there's an immediate problem we can see. The red curve reaches maximum luminance prematurely. This would clip the whites.

The solution is to scale the function so that L(1) = 1. See below:




This particular shifting and scaling approach, however, is not the best way to compensate for a non zero black level.
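For concreteness, the naive shift-and-scale can be written in a few lines (a sketch; `naive_eotf` is a made-up name, with luminance normalized so peak white is 1.0):

```python
GAMMA = 2.4

def naive_eotf(v, lb):
    """Shift the 2.4 power curve up to the black level lb, then rescale
    so that v = 1 still reaches peak white (normalized to 1.0)."""
    return lb + (1.0 - lb) * v ** GAMMA

# With a 10% black level the curve starts at 0.1 and still peaks at 1.0:
assert abs(naive_eotf(0.0, 0.1) - 0.1) < 1e-12
assert abs(naive_eotf(1.0, 0.1) - 1.0) < 1e-12
# But near black it only adds the same tiny power-law increments on top
# of 0.1, where the eye is less sensitive than it is near zero:
assert naive_eotf(0.1, 0.1) - naive_eotf(0.0, 0.1) < 0.004
```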

Remember the nonlinear human lightness response? Notice how the function starts at a luminance of 0. What do you suppose would happen if we did those same original experiments, but tested observers starting at 10 cd/m2 instead of 0 cd/m2? The naive solution above assumes that the lightness response would shift! But of course, it doesn't. Our ability to detect differences between any two light levels does not depend on the black level of the display we are viewing!

(that last bit isn't strictly true - we do actually undergo light adaptation, and the dynamic range of our neural response to luminance does recalibrate based on the surrounding light conditions, including ambient light, and light in other parts of the image, but we can ignore this for now, as it's not relevant to the endpoint of this discussion).

Remember, the luminance response of a display to the input signal is quite flat near black. This is great, because we are naturally very sensitive to changes in luminance at low luminances. But if we simply shift the display's luminance function up to our new black level, the display will be outputting very small changes in luminance at the new (and higher) black level. Because we are not as sensitive at this higher luminance, we will not be able to distinguish different shades of gray as easily anymore, and we will experience perceptual clipping.

So, given the fixed human lightness response to luminance, how do we deal with displays that have non-zero black levels?


Enter BT.1886


This recommendation specifies a formula that scales and offsets the gamma curve in such a way that two things happen:

1) The curve begins at the black level of the display (which one ideally measures).

2) The shape of the curve is steepened at the low end of the signal range.

Crucially, the degree of steepening is calculated as a function of the black level. Higher black levels require steeper functions so that shades retain their discriminability at those higher black levels.
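The BT.1886 EOTF can be sketched directly from the recommendation's equation, L = a(max[(V + b), 0])^2.4, with a and b derived from the display's measured white (Lw) and black (Lb):

```python
def bt1886_eotf(v, lw=100.0, lb=0.0, gamma=2.4):
    """BT.1886 EOTF: L = a * max(V + b, 0) ** gamma, with a and b
    derived from the display's measured white (lw) and black (lb)."""
    n = lw ** (1.0 / gamma) - lb ** (1.0 / gamma)
    a = n ** gamma                      # scale, so that L(1) = lw
    b = lb ** (1.0 / gamma) / n        # offset, so that L(0) = lb
    return a * max(v + b, 0.0) ** gamma

# The curve starts exactly at the black level and ends at peak white:
assert abs(bt1886_eotf(0.0, lb=10.0) - 10.0) < 1e-6
assert abs(bt1886_eotf(1.0, lb=10.0) - 100.0) < 1e-6
# With a zero black level it reduces to a pure 2.4 power law:
assert abs(bt1886_eotf(0.5) - 100.0 * 0.5 ** 2.4) < 1e-9
```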

See below, where I have contrasted the naive shift and scaling approach with the correct BT.1886 implementation:




Now based on discussion in another thread, it appears to be the case that LightSpace implements the naive solution, rather than the BT.1886 recommendation. However, it is hard to get clarification from Steve Shaw, possibly because he doesn't want to risk making his code public, which is understandable.

It may well be the case that LS does a fantastic job of implementing BT.1886, but I'm sure users of LS would like to know for sure.

There may be a way to find out if LS uses the correct BT.1886 implementation, if LightSpace users on this forum can do a simple experiment.

This would involve importing a profile into LightSpace, and seeing how LightSpace judges the luminance response (specified in the profile) relative to its implementation of the BT.1886 target.

The profile used in this experiment should have an artificially high black level, as discrepancies between naive and correct BT.1886 implementations will increase as the black level increases (this is why naive implementations will not cause as much harm in studios with reference displays).

To aid in this, here is a set of values you can use, representing a correct BT.1886 function with a black offset of 10% of maximum luminance.

Input signal (V)    Luminance (L)
  0                  10
 10                  14.3091
 20                  19.5425
 30                  25.7497
 40                  32.9765
 50                  41.2658
 60                  50.6583
 70                  61.1922
 80                  72.9041
 90                  85.8289
100                 100
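These values can be re-derived in a few lines (a sketch of the BT.1886 equation with Lw = 100, Lb = 10):

```python
GAMMA = 2.4
LW, LB = 100.0, 10.0                       # peak white and black level, cd/m2

# BT.1886 coefficients from the measured white and black levels:
n = LW ** (1 / GAMMA) - LB ** (1 / GAMMA)
a = n ** GAMMA
b = LB ** (1 / GAMMA) / n

# Luminance at each 10% input step:
table = {pct: a * (pct / 100.0 + b) ** GAMMA for pct in range(0, 101, 10)}
for pct, lum in table.items():
    print(f"{pct:3d}  {lum:8.4f}")
```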



If you need me to convert the input signal into PC or Video RGB levels, let me know.

I don't have LS, and have never used it, so I don't know how to do this experiment, but Steve has said that you can manually enter profiles in XML format.
post #2 of 200 Old 12-13-2013, 03:01 AM
PE06MCG
As a novice I have been often confused by gamma and its apparent importance during Display calibration so thanks for detailing what appears to be disagreement amongst the experts.

Does this gamma uncertainty occur only at Display calibration stage or have the same uncertainties been transcoded by the mastering process of the film / program makers?

In other words are the cameras used by the program directors / producers subject to the same problem and if so what have they decided is the correct solution ?
post #3 of 200 Old 12-13-2013, 03:25 AM - Thread Starter
spacediver
That's a good question, and it illustrates where my current understanding of the issue breaks down. There is something called camera gamma, which I think is meant to mimic the human lightness response. From what I understand, this is done to compensate for the display gamma. Where I get confused is that I thought human perception already naturally compensates for the display gamma.

Hopefully someone with a more nuanced understanding can chime in.
post #4 of 200 Old 12-13-2013, 03:53 AM - Thread Starter
spacediver
Just thought about it some more, and I think I get it now.

Suppose in the world, there exist eleven stones, side by side in a row. The first stone is black, and the eleventh is white. The (reflected) luminance off these stones is such that they appear to change lightness in equal increments. This would mean that if you were to plot the stones (x-axis) against the reflected luminance (y axis), you'd end up with a curve very similar to the CRT gamma function (gamma ~2.4).

So while the luminance of these stones rises non-linearly, the (perceptual) lightness of these stones rises linearly.

Our goal is to reproduce this perception when an observer views the representation of that scene on a display.

If our camera transduced the luminance energy linearly, so that a doubling of luminance meant a doubling of the digital signal produced by the camera, we'd end up with the following situation, when we fed information from the camera to the display:

The camera has a digital representation of each of the stones, and if you were to plot the stones (x-axis) against the camera's represented digital magnitude of each stone, you'd end up with that gamma 2.4 curve.

The display now converts the digital magnitude of each stone (actually the pixels that represent the stone) into a voltage (assuming a CRT). So now if we plot each stone against the CRT voltage, we have that gamma 2.4 curve.

Now remember, the function that defines how a CRT converts voltage into light is also a gamma 2.4 function, so the final luminance output from the monitor has the gamma 2.4 function applied twice (an end-to-end exponent of roughly 2.4 x 2.4, about 5.8)!

The solution is to use the inverse gamma function at the camera end (I think the exponent is around 0.45 or something), which will ensure that the scene is reproduced on the display as it would have originally been viewed in "real life".
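The stones example can be sketched numerically (toy numbers; an ideal 2.4 power stands in for both the perceptual ramp and the CRT):

```python
GAMMA = 2.4

# Normalized luminance of eleven perceptually even "stones":
stones = [(i / 10) ** GAMMA for i in range(11)]

# A linear camera feeding a CRT applies the 2.4 power a second time,
# so stone i renders at (i/10) ** (2.4 * 2.4): far too dark.
linear_chain = [lum ** GAMMA for lum in stones]
assert abs(linear_chain[5] - 0.5 ** (GAMMA * GAMMA)) < 1e-12

# Camera gamma (1/2.4 here; real cameras use roughly a 0.45 power plus
# a linear toe) undoes the display power, so scene luminances come
# back out unchanged:
encoded = [lum ** (1 / GAMMA) for lum in stones]
restored = [v ** GAMMA for v in encoded]
assert all(abs(r - s) < 1e-12 for r, s in zip(restored, stones))
```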
post #5 of 200 Old 12-13-2013, 04:30 AM
PE06MCG
So is the mastering program for all purposes (bLu Ray, Cinema etc.) always produced by all studios with a gamma of 2.4 or is there some dispute about that?
post #6 of 200 Old 12-13-2013, 05:13 AM - Thread Starter
spacediver
Historically, CRTs have been used as reference displays while mastering. There may have been some variability from display to display, and I'm not sure if there was a specified standard, or how well it was adhered to, but I believe it was somewhere between 2.2 and 2.5.

There is some discussion of this here, although I haven't read that thread yet.

Nowadays, high end LCDs and OLEDs are being used, which have their own characteristics and flexibilities.

The recent BT.1886 recommendation specifies an exponent of 2.4 in its equation, although this technically is somewhat different from a gamma exponent, since gamma is a term traditionally used for the function L = V^Gamma. The BT.1886 equation is slightly different.

You mentioned a dispute earlier. There really isn't a dispute among experts. The BT.1886 recommendations are very clear, and serve a well defined purpose.

Everything I've written in this thread can be pieced together based on reading the paper from Charles Poynton, and the actual BT.1886 document.
post #7 of 200 Old 12-13-2013, 05:36 AM
PE06MCG
I was really trying to make sure that all the media we view, in whatever form, is produced at the same gamma; otherwise the fidelity of reproduction would have an obvious variable.
I try to compensate for my ill-suited viewing area by using different gamma values, but all this would be a fruitless exercise if the mastered origin was not produced universally at the same value.
I can see that different Displays may be technically different, so each will have its particular peculiarities and calibration must take this into account, but at least let's start with a constant source; that was my point in questioning how all sources are mastered.
post #8 of 200 Old 12-13-2013, 06:03 AM
zoyd
Quote:
Originally Posted by spacediver View Post

Just thought about it some more, and I think I get it now.

Suppose in the world, there exist eleven stones, side by side in a row. The first stone is black, and the eleventh is white. The (reflected) luminance off these stones is such that they appear to change lightness in equal increments. This would mean that if you were to plot the stones (x-axis) against the reflected luminance (y axis), you'd end up with a curve very similar to the CRT gamma function (gamma ~2.4).

So while the luminance of these stones rises non-linearly, the (perceptual) lightness of these stones rises linearly.

Our goal is to reproduce this perception when an observer views the representation of that scene on a display.

You can also think of it more simply. The reflection of light (meaning photons) in nature is linear, if there are twice as many photons falling on one stone than the other, the goal of the video chain is to generate twice* as many photons emanating from that stone relative to the other at the display.

Inverse gamma encoding at the camera and the gamma response of displays ensures end-to-end linearity of the system and is an efficient way of transport because it aligns the bit levels in a perceptually uniform manner.



*not exactly 1:1, for dim surround viewing a slight power of 1.1 - 1.2 is recommended at the display end.
post #9 of 200 Old 12-13-2013, 07:32 AM
 
MonarchX
Could someone be so kind and post an HCFR/CalMAN representation of a BT.1886 gamma and Rec.709 2.4 gamma reference curves? HCFR has BT.1886 reference curve already, but I would like to see how 2.4 gamma differs from it, using calibration software such as XY charts/graphs.
post #10 of 200 Old 12-13-2013, 10:25 AM
zoyd
Quote:
Originally Posted by spacediver View Post


Enter BT.1886

To understand the whole purpose of the recommendation, this statement is key:
Quote:
With the introduction of new display technologies which have entirely different characteristics to
the CRT displays, it is necessary to define the EOTF of new devices that emulate that of the CRT
displays. In measuring the EOTF of a large number of CRTs it was determined that the EOTF of the
CRT was in fact highly variable when the brightness/contrast was adjusted, it is therefore not
possible to 100% emulate CRT capability (or limitations).

Users of this Recommendation in combination with the new technologies should be able to achieve
a higher degree of image presentation repeatability than that offered in the past.
post #11 of 200 Old 12-13-2013, 01:40 PM - Thread Starter
spacediver
Quote:
Originally Posted by PE06MCG View Post

I was really trying to make sure that all the media we view in whatever form is produced at the same gamma otherwise the reproduction for fidelity purposes would be an obvious variable.
I try to change my ill suited viewing area by using different gamma values but all this would be a fruitless exercise if the mastered origin was not produced universally at the same value.
I can see that different Displays may be technically different so each will have their particular peculiarities so calibration must take this into account but at least lets start with a constant source was my point for questioning how all sources are mastered.

Rec 709 by itself doesn't specify a display gamma. It specifies a gamma at the encoding level (i.e. camera gamma). The assumption is that the display gamma of the reference CRTs of the time were such that the combination of the encoding gamma (~0.45) and the CRT display gamma (around 2.2-2.5) would result in an end-to-end gamma of about 1.1 (see Zoyd's post). According to wiki, in practice, the reference displays used a display gamma of 2.4, although I have also heard the value 2.2 thrown around.

What this tells me is that there probably wasn't an explicit standard that was adhered to (not surprising; see my other post on the gamut issue).

BT.1886 is valuable since it not only accounts for different black levels, but also explicitly defines a display "gamma" (2.4).
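For reference, the Rec.709 encoding curve mentioned above is roughly a 0.45 power with a short linear toe near black; a sketch:

```python
def rec709_oetf(l):
    """Rec.709 camera (opto-electronic) transfer function,
    for scene-relative luminance l in [0, 1]."""
    if l < 0.018:
        return 4.500 * l                    # linear segment near black
    return 1.099 * l ** 0.45 - 0.099        # ~0.45 power above the toe

# Peak white maps to full code value; the toe is linear:
assert abs(rec709_oetf(1.0) - 1.0) < 1e-9
assert abs(rec709_oetf(0.009) - 4.5 * 0.009) < 1e-12
```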
post #12 of 200 Old 12-13-2013, 02:23 PM
sotti
Quote:
Originally Posted by MonarchX View Post

Could someone be so kind and post an HCFR/CalMAN representation of a BT.1886 gamma and Rec.709 2.4 gamma reference curves? HCFR has BT.1886 reference curve already, but I would like to see how 2.4 gamma differs from it, using calibration software such as XY charts/graphs.

In all examples the measured data is a simulated perfect 2.4 gamma with a black offset; the target is the reference BT.1886 formula. Lw is 100 cd/m^2, Lb is 0.05 cd/m^2.

Standard Luminance chart:


Gamma chart:



One of my personal favorites, luminance with a log scale for the Y axis; note how smooth BT.1886 is and how 2.4 has a kink as it approaches its black level.



Also a great chart from CalMAN the color comparator:



A DeltaE 2000 chart showing how much error there is, if the intent is to use BT.1886:


The actual data at 1 bit increments.
bt.1886vblackcomp.csv 34k .csv file

Quote:
Originally Posted by Light Illusion View Post

The naming of the function as a power law with offset is just semantic. It can be referred to by many different names.

To answer specifically, equation 1

L = a(max[(V + b),0])^γ

Is identical to equation 2

L = V^γ

No, it's not. It's not even close. How anyone's reputation can stand up after making such a claim is beyond me.
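A quick numeric check backs this up: with any nonzero black level the two equations give very different curves (a sketch with normalized Lw = 1, Lb = 0.01):

```python
gamma = 2.4
lw, lb = 1.0, 0.01                          # normalized white and black levels

# Equation 1 (BT.1886): L = a * max(V + b, 0) ** gamma
n = lw ** (1 / gamma) - lb ** (1 / gamma)
a, b = n ** gamma, lb ** (1 / gamma) / n

v = 0.1
eq1 = a * max(v + b, 0.0) ** gamma          # BT.1886, ~0.030
eq2 = v ** gamma                            # pure power law, ~0.004

# Equation 1 starts at the black level; equation 2 dives below it:
assert abs(a * b ** gamma - lb) < 1e-9
assert eq1 > lb > eq2
```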

Joel Barsotti
SpectraCal
CalMAN Lead Developer
post #13 of 200 Old 12-13-2013, 03:11 PM - Thread Starter
spacediver
The kink and color comparator ones beautifully illustrate the core issue here (how BT.1886 avoids perceptual clipping at low input levels).
post #14 of 200 Old 12-13-2013, 03:32 PM
 
MonarchX
OK, so the Rec709 2.4 gamma IS what is known as power law gamma, right? I thought it was a lot like BT.1886, just slightly darker. I think BT.1886 makes sense just from thinking about its smooth effect - blackness goes from lighter to blacker, creating more depth perception, instead of just being black. The Dark Knight Rises Bane fight scene looks horrible with power law gamma 2.3, most likely worse with 2.4. With BT.1886, you can actually see them fight instead of seeing black blobs. You can see every detail of Batman's suit and cloak, something that is crushed with power law gamma. Power law gamma does create this higher-contrast perception, but it also flattens the image.

So, I understand how power law crushes example-wise, but I don't understand why black clipping tests don't show that. Also, one way or another power law 2.4 gamma is just way too dark. Isn't that why people use 2.2? It's the golden rule, so to speak. But even then BT.1886 makes more sense since it starts out low and gradually goes up, past 2.2, all the way to 2.3 @ 90% IRE, where the luminance is high, benefiting from a darker gamma. I realize this is highly shallow newbie talk not based on numbers or math, just a common-sense sort of thing.

And I think I answered my own question about whether production uses power law gamma or BT.1886. Back to the Batman Bane fight - there is just no way that camera saw pitch-black blobs. It would be too unreasonable to film and camera gamma is shaped like BT.1886 too. I was wondering whether this was taken into consideration post-production. Maybe it really was that black and they altered the image in such a manner that 2.2 gamma would re-produce that blackness exactly as it was seen. Ever since HD tech came out - I noticed that there was too much black present on HDTVs. People were raving about blacks, even truly crushing blacks, nothing was more important than a stringent abrupt blackness, killing all the detail.

So, I take it LightSpaceCMS (or whichever software we're talking about here?) simply does not have this type of gamma set up properly? If HCFR, CalMAN, ChromaPure, and ArgyllCMS all do it the same way, then the answer is obvious. Is there a graph/chart that shows the LS reference line for BT.1886 against the real BT.1886 reference line?
post #15 of 200 Old 12-13-2013, 03:41 PM
zoyd
Quote:
Originally Posted by MonarchX View Post

So, I understand how power law crushes example-wise, but I don't understand why black clipping tests don't show that.

Because they are not literally clipped (missing), you can still see them, but they will be very hard to distinguish from one another, or from level 16, especially when your vision is light adapted to a typical video image (15%).
post #16 of 200 Old 12-13-2013, 03:42 PM
CalWldLif
It is my understanding that BT.1886 came about to help TVs with poor black levels.
Yes, it mimics the CRT, but it has the advantage of scaling to non-CRT black levels.
post #17 of 200 Old 12-13-2013, 03:47 PM
sotti
Quote:
Originally Posted by MonarchX View Post

OK, so the Rec709 2.4 gamma

There is no such thing as Rec709 2.4 gamma; I believe when the two are used together it's just shorthand for the combination of two separate pieces.

Rec 709 is a colorspace recommendation; it does not make any recommendation for display gamma.

2.4 gamma is shorthand for using a basic power formula with an exponent of 2.4 as the EOTF. You could use that formula with or without black level compensation.

post #18 of 200 Old 12-13-2013, 04:00 PM
 
MonarchX
Quote:
Originally Posted by CalWldLif View Post

It is my understanding the bt1886 can about to help TVs with poor black level.
yes it mimics the CRT but has the advantage of scaling to non CRT blacklevels.

I thought it was the other way around. BT1886 looks better on TVs with the lowest black levels because the smooth gray effect is dark enough. On IPS panels, blacks are gray, and adding BT.1886 only makes those grays lighter, which is why some people prefer power law gamma. Although it doesn't make much sense, because the blackest black is still going to be just as black with BT.1886.
post #19 of 200 Old 12-13-2013, 04:10 PM - Thread Starter
spacediver
Quote:
Originally Posted by MonarchX View Post

I thought it was the other way around. BT1886 looks better on TVs with the lowest black levels because the smooth gray effect is dark enough. On IPS panels, blacks are gray, and adding BT.1886 only makes those greys whiter, thus having some people prefer power law gamma. Although, it doesn't make much sense because blackest black is still going to be just as black with BT.1886.

Look closely at Sotti's color comparator chart. The top row shows what happens when you use a pure gamma 2.4, and the bottom shows what BT.1886 looks like. Both rows have the same non-zero black level.

Notice how there is clipping in the first two bars on the top row, but not the bottom?
post #20 of 200 Old 12-13-2013, 04:18 PM
zoyd
Quote:
Originally Posted by spacediver View Post

Look closely at Sotti's color comparator chart. The top row shows what happens when you use a pure gamma 2.4, and the bottom shows what BT.1886 looks like. Both rows have the same non-zero black level.

Notice how there is clipping in the first two bars on the top row, but not the bottom?

Also to note is that if there were a third bar there showing a flat 2.2 response (where a lot of people end up calibrating to) you would see better differentiation at the low end but midtones would have less contrast relative to either end. This midtone contrast is very important in creating perceptual depth.
post #21 of 200 Old 12-13-2013, 04:27 PM
derekjsmith
Quote:
Originally Posted by zoyd View Post

Also to note is that if there were a third bar there showing a flat 2.2 response (where a lot of people end up calibrating to) you would see better differentiation at the low end but midtones would have less contrast relative to either end. This midtone contrast is very important in creating perceptual depth.

This is one of the main reasons we switched to using BT1886 as our default Gamma choice in CalMAN 5.2 going forward. We have also heard of several big post houses that are switching over to BT1886 next production season in the spring.

Derek

CTO / Founder - SpectraCal Inc.
post #22 of 200 Old 12-13-2013, 04:43 PM
buzzard767
Quote:
Originally Posted by sotti View Post

How anyone's reputation can stand up after making such a claim is beyond me.

The title of this thread pretty much covers it. You're in a glass house.

Buzz
THX Certified Video Calibrator

 

post #23 of 200 Old 12-13-2013, 07:10 PM - Thread Starter
spacediver
Quote:
Originally Posted by MonarchX View Post


So, I take it LightSpaceCMS (or whichever software we're talking about here?) simply does not have this type of gamma set up properly? If HCFR, CalMAN, ChromaPure, and ArgyllCMS all do it the same way, then the answer is obvious


Yes. Thus far, it appears that LightSpace implements a pure power law gamma and calls it BT.1886. Nobody associated with LightSpace (either coders or users) has offered up solid data to prove otherwise. All it would require is a table of numbers showing the luminance associated with each input level, in 10 percent increments, that represent the LightSpace BT.1886 target, similar to how Sotti has done in the image below:


(the stimulus percent should run from 0 to 100 instead of 0 to 1, but the numbers are solid and fit the BT.1886 function perfectly).

b85df991_BT.1886_calman.png



Quote:
Originally Posted by MonarchX View Post

Is there a graph/chart that shows how LS reference line for BT.1886 and the real BT.1886 reference line?


See the last image in my original post - that plots a real BT.1886 line against what LightSpace probably does (and I say "probably", because Steve Shaw has been quite explicit in detailing how he interprets the BT.1886 function. He has clearly said that he simply fits a 2.4 gamma function between minimum and maximum luminance, and then scales the curve so the luminance reaches maximum value at the maximum input value).

Also see the first image in Sotti's first post in this thread, which plots the same curves but with a different black offset from my example.
post #24 of 200 Old 12-13-2013, 11:18 PM
Advanced Member
 
xvfx's Avatar
 
Join Date: Feb 2013
Location: UK
Posts: 569
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 35 Post(s)
Liked: 61
Is anyone using 2.4 in this thread? I think I might try BT.1886 with 2.4 tonight.
xvfx is offline  
post #25 of 200 Old 12-14-2013, 01:39 AM
Advanced Member
 
PE06MCG's Avatar
 
Join Date: Jun 2010
Location: West Yorkshire, UK
Posts: 729
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 48
Quote:
Originally Posted by sotti View Post

There is no such thing as "Rec709 2.4 gamma"; I believe that when the two are used together it's just shorthand for the combination of two separate pieces.

Rec 709 is a colorspace recommendation; it does not make any recommendation for display gamma.

"2.4 gamma" is shorthand for using a basic power formula with an exponent of 2.4 as the EOTF. You could use that formula with or without black level compensation.

Hi Joel,

Slightly off topic, but it does have to do with the source material we expect to reproduce exactly: I am still a little uncertain about the mastering process prior to Blu-ray manufacture and distribution.

Probably I am concerned about something that has already been taken care of, but is this standardised to a specific EOTF value (e.g. 2.4), so that my Blu-ray will always present the same 2.4 value at the start of my chain to my display, irrespective of my choice of Blu-ray?

Peter
PE06MCG is offline  
post #26 of 200 Old 12-14-2013, 02:14 AM - Thread Starter
I just went through the LightSpace CMS Lumagen thread carefully, to ensure I wasn't misrepresenting Steve Shaw.

It is crystal clear that this individual has absolutely no clue what BT.1886 is:
Quote:
Originally Posted by Light Illusion View Post

BT1886 is an idea to match modern flat panel displays to older CRTs - it is not a recommendation that has been adopted widely at all, and will probably (and should?) just disappear...
When most displays in the world are flat panels why attempt to make them emulate old CRT technology?

The BT1886 standard defines the transfer characteristics of a display - the electro optical transfer function - and defines this using a function which is simply a gamma 2.4 function between black (minimum light) and white (maximum light). There is no change in the blacks in the standard.


I strongly urge any LightSpace users to put pressure on Light Illusion to correct this.

It is one thing to choose not to implement BT1886.

But to implement it incorrectly, and then stubbornly refuse to acknowledge this error when it is painstakingly pointed out by numerous patient individuals is appalling, and reflects extremely poorly on the company.

Again, I would like to point out here that the lead figures from Chromapure, Calman, HCFR, and ArgyllCMS have all directly tried to explain this error to Steve.


I do hope that Steve comes around and gracefully figures this out.
derekjsmith likes this.
spacediver is offline  
post #27 of 200 Old 12-14-2013, 04:10 AM
Senior Member
 
Wouter73's Avatar
 
Join Date: Jun 2012
Location: Alkmaar, Netherlands
Posts: 321
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 3 Post(s)
Liked: 33
If I understand correctly, the following:

"And thus, a CRTs natural response characteristics will produce a perceptually uniform set of brightness levels!
This is not by design, but is rather an extremely fortunate coincidence (as Charles Poynton points out in this paper, from which the above two images were taken)."


means that we are not trying to emulate a CRT response in a flat panel, as stated:

"BT1886 is an idea to match modern flat panel displays to older CRTs - it is not a recommendation that has been adopted widely at all, and will probably (and should?) just disappear... When most displays in the world are flat panels why attempt to make them emulate old CRT technology?"


But we are trying to adjust the flat panel so it works best with our human vision.

Do I understand this correctly?
Wouter73 is offline  
post #28 of 200 Old 12-14-2013, 10:52 AM - Thread Starter
It's a bit of both.

Emulating a CRT means that we are optimizing our displays to work well with human vision.

My understanding is that BT1886 accomplishes CRT emulation in two ways:

1: by explicitly specifying a luminance exponent (2.4) on the display end

2: by ensuring that the perceptual benefits of CRT emulation carry across a range of black levels

As a result of this, we end up with more consistent AND better outcomes.
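A quick sketch of point 2, using assumed example values: the BT.1886 offset term b grows as the display's black level rises, and at a true zero black the curve collapses to a pure 2.4 power law, so shadow handling stays consistent across displays with different black levels:

```python
GAMMA = 2.4

def bt1886_coeffs(lw, lb):
    """Return (a, b) for the BT.1886 EOTF L = a * max(v + b, 0) ** GAMMA."""
    a = (lw ** (1 / GAMMA) - lb ** (1 / GAMMA)) ** GAMMA
    b = lb ** (1 / GAMMA) / (lw ** (1 / GAMMA) - lb ** (1 / GAMMA))
    return a, b

# The offset b rises smoothly with the black level; at Lb = 0 the
# formula reduces to a plain 2.4 power law (b = 0, a = Lw).
for lb in (0.0, 0.01, 0.05, 0.2):  # black levels in cd/m^2 (assumed examples)
    a, b = bt1886_coeffs(100.0, lb)
    print(f"Lb={lb:4.2f}  a={a:7.2f}  b={b:.4f}")
```

This is what makes the standard repeatable: two calibrators with displays of different native blacks still land on the same family of curves, rather than each inventing their own black compensation.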

edit: keep in mind that the revealing part of Steve's quote is the final paragraph, where he describes what he thinks BT1886 does mathematically.
spacediver is offline  
post #29 of 200 Old 12-14-2013, 10:58 AM
AVS Special Member
 
sotti's Avatar
 
Join Date: Aug 2004
Location: Seattle, WA
Posts: 6,581
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 6 Post(s)
Liked: 163
Quote:
Originally Posted by Wouter73 View Post

If I understand correctly, the following:

"And thus, a CRTs natural response characteristics will produce a perceptually uniform set of brightness levels!
This is not by design, but is rather an extremely fortunate coincidence (as Charles Poynton points out in this paper, from which the above two images were taken)."


means that we are not trying to emulate a crt respons in a flat panel, as stated:

"BT1886 is an idea to match modern flat panel displays to older CRTs - it is not a recommendation that has been adopted widely at all, and will probably (and should?) just disappear... When most displays in the world are flat panels why attempt to make them emulate old CRT technology?"


But we are trying to adjust the flat panel so it works best with our human vision

Do I understand this correctly?

http://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.1886-0-201103-I!!PDF-E.pdf

If you read the spec, it's one page of relatively plain English and a second page that is the formula. It doesn't really talk about the human visual system.

The idea is that the spec slides into what we are already doing and doesn't make a significant change, but does provide a repeatable standard - since, up to the point where BVMs became unavailable, all content was mastered on them.

Joel Barsotti
SpectraCal
CalMAN Lead Developer
sotti is offline  
post #30 of 200 Old 12-14-2013, 02:00 PM
What is the actual reason this has not been widely implemented yet?
Wouter73 is offline  