Contrast Revisited - Page 2

post #31 of 162
Thread Starter 
Quote:
Originally Posted by Chuck Anstey View Post

From what I have been reading in gregr's posts, you wouldn't use the power curve right to Y=17. At the tail of the curve, Y < 24, you would use a different curve that is more linear....

Agreed. 709 specifies a linear tail for encoding the values. The pj should use the same to decode, transform, and display.
post #32 of 162
Thread Starter 
Quote:
Originally Posted by madshi View Post

Actually, what I take from this discussion is that my initial reaction was right: Calculating the luminance for all the Y steps without subtracting the black level first is *not* the way to go, IMHO. Instead the black level should be taken into account. That would allow us to calibrate a 5k:1 projector to a pure power 2.4 curve without any loss of shadow detail.

Why are you turning this into a calibration discussion? Why would you try to calibrate a 5k:1 pj to a 2.4 pure curve?
post #33 of 162
Quote:
Originally Posted by HoustonHoyaFan View Post

Agreed. 709 specifies a linear tail for encoding the values. The pj should use the same to decode, transform, and display.

BT.709 specifies content encoding; it doesn't say anything about how to calibrate displays. Both Charles Poynton and the EBU recommend calibrating displays to a pure power curve, not to a 709 curve. The native gamma response of a CRT is also a pure power curve, and that's the type of display everybody used when 709 was written. Why would the 709 fathers write a spec that requires something no display of their time was capable of? That makes no sense. 709 was surely written with CRTs in mind, and CRTs always had a pure power gamma response.

Quote:
Originally Posted by HoustonHoyaFan View Post

Why are you turning this into a calibration discussion?

Because the two issues are related. How the projector is calibrated has a direct effect on the contrast between Y=17 and Y=235. Also calibration was a big part of the discussion in the other thread, from which this thread was split.

And btw, your second post in this thread is directly about calibration, too. So it's not like I turned this thread into something it wasn't from the beginning.

Quote:
Originally Posted by HoustonHoyaFan View Post

Why would you try to calibrate a 5k:1 pj to a 2.4 pure curve?

Why not? Ok, let's make it 2.2, makes no difference to my argument, really.
post #34 of 162
Quote:
Originally Posted by HoustonHoyaFan View Post

Why are you turning this into a calibration discussion?

This isn't much of a contrast discussion.

What is it really?
post #35 of 162
Quote:
Originally Posted by madshi View Post

Actually, what I take from this discussion is that my initial reaction was right: Calculating the luminance for all the Y steps without subtracting the black level first is *not* the way to go, IMHO. Instead the black level should be taken into account. That would allow us to calibrate a 5k:1 projector to a pure power 2.4 curve without any loss of shadow detail.

But in your first example the projector has a CR of 2. Nobody would buy this beast...

In your 2nd example the projector has a CR of 5000, which does not support a pure gamma of 2.4 without suffering Black Crush. In actuality, would one really want to use a gamma of 2.4 with this projector? If so, then I think Chuck mentioned the solution. But maybe a better solution is to use a gamma of 2.3 or 2.22 (etc.). The lower gamma setting avoids black crush.....
post #36 of 162
Thread Starter 
Quote:
Originally Posted by Lawguy View Post

...But, to bring it back to contrast, this discussion does demonstrate that where you have a display with less on/off contrast, you will have to run a lower gamma and that will result in near blacks being less black than they might be on a projector with higher on/off CR that can run a higher gamma. That, and the Video Black measurement will, of course be higher on the lower CR projector.

I view the discussion so far as a prequel to the contrast discussion. We should all be on the same page on a simple visual for image contrast. I suggest we use a hypothetical RS40 to avoid sidetracks on technology, DIs, etc. Our RS40 measures 20K:1 with the iris open and 40K:1 with the iris closed, calibrated to gamma 2.2.

I believe (thanks to gregr) we have established the following:
RS40 at 20K:1 would have a luminance ratio between white (235) and near black (17) of ~ 1000:1 (985), between near black (17) and black (16) of 20, and between white (235) and black (16) of 20K:1

RS40 at 40K:1 would have a luminance ratio between white (235) and near black (17) of ~ 1000:1 (985), between near black (17) and black (16) of 40, and between white (235) and black (16) of 40K:1

The only difference between the RS40-20K and RS40-40K is the ratio between 17 and 16. For an RS40-400K, the only difference would still be the ratio between 17 and 16!

I would use a gamma 2.5 example but I do not know how to calculate a V' for gamma 2.5 in the following formula:
reference white / ((reference white/219) / 4.5) = CR
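
Edit: here is a rough Python sketch of how I got the 985 number, plus what a pure power curve would give instead. It assumes reference white normalized to 1.0 and video levels 16-235 mapped to 0-1 stimulus; for the pure power cases I simply assume there is no V'/4.5 linear segment at all, so no separate V' is needed - please correct me if that's wrong.

Code:
# Rough sketch: 235-to-17 luminance ratios for different decode curves.
# Reference white is normalized to 1.0; video levels 16-235 map to 0-1 stimulus.

def bt709_decode(v):
    """Inverse of the BT.709 encoding curve, including the linear tail."""
    if v < 0.018 * 4.5:                  # below the knee (V' = 0.081)
        return v / 4.5
    return ((v + 0.099) / 1.099) ** (1 / 0.45)

def pure_power_decode(v, gamma):
    """Pure power curve: no linear segment, just an exponent."""
    return v ** gamma

def stimulus(level):
    return (level - 16) / 219.0          # video level -> 0..1 stimulus

curves = [("BT.709 tail", bt709_decode),
          ("pure 2.2",    lambda v: pure_power_decode(v, 2.2)),
          ("pure 2.5",    lambda v: pure_power_decode(v, 2.5))]

for name, decode in curves:
    ratio = decode(stimulus(235)) / decode(stimulus(17))
    print(f"{name:12s} white(235) : near-black(17) = {ratio:,.0f}:1")

If I have that right, the 709 tail gives ~985:1 between 235 and 17, while pure 2.2 and 2.5 curves give roughly 141,000:1 and 710,000:1 - far more than the projector's on/off CR, which is why the deep shadows end up limited by the display rather than by the curve.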
post #37 of 162
Quote:
Originally Posted by Geof View Post

In your 2nd example the projector has a CR of 5000 which does not support a pure gamma of 2.4 without suffering Black Crush.

Actually, it does. Just add the black level (0.2 lumens) to all the IREs and you have a pure power gamma of 2.4 without Black Crush.

Edit: Just to make sure. This is just an idea I had and I'm not sure if it would indeed look correct in real life.
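
Here's a minimal sketch of what I have in mind, assuming the 0.2 lumen black floor from the example and the 1000 lumen peak that a 5000:1 ratio implies (my numbers, purely illustrative):

Code:
# Illustrative only: a 2.4 power target with the native black level folded in,
# so no video level falls below what the projector can actually produce.

PEAK  = 1000.0           # assumed peak output in lumens (illustrative)
BLACK = PEAK / 5000.0    # 5000:1 on/off -> 0.2 lumen black floor

def target_luminance(stim, gamma=2.4):
    """Target output for a 0-1 stimulus: pure power shape riding on the
    black floor, scaled so peak white still lands at PEAK."""
    return BLACK + (PEAK - BLACK) * stim ** gamma

# Level 17 is no longer crushed into level 16:
print(target_luminance(0.0), target_luminance(1 / 219))

Whether that actually looks right on real material is exactly the open question, of course.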

Quote:
Originally Posted by HoustonHoyaFan View Post

I believe (thanks to gregr) we have established the following:
RS40 at 20K:1 would have a luminance ratio between white (235) and near black (17) of ~ 1000:1 (985), between near black (17) and black (16) of 20, and between white (235) and black (16) of 20K:1

RS40 at 40K:1 would have a luminance ratio between white (235) and near black (17) of ~ 1000:1 (985), between near black (17) and black (16) of 40, and between white (235) and black (16) of 40K:1

That is true only if you calibrate to a BT.709 curve with a gamma value of 1/0.45, which is not what Charles Poynton and the EBU recommend. Which brings us back to the topic "calibration".
post #38 of 162
Thread Starter 
madshi -

If you don't mind, I am going to take gregr and Guy Kuo as gospel on this subject. Gamma correction has been argued in the thread I linked in this OP.

I will add that at the post company I work with we encode and decode with 709 using the specified linear tail.
post #39 of 162
Well, who to take as gospel? That's a difficult decision. Charles Poynton and the EBU should not be rejected as valid opinions, either. Mr.D also says he's using a pure power curve, IIRC. And the Calman help suggests calibrating to a pure power curve for batcaves and using a BT.709 curve for rooms with ambient light or light-colored walls.

And then, even if you do use a BT.709 calibration for your contrast measurements, you should surely not use 1/0.45. That's the gamma value used for encoding. The viewing gamma should have a 1.2x factor. If you calibrate a display to a BT.709 curve with a gamma value of 1/0.45, the result is surely much too bright. 1.2x * 1/0.45 sounds more like it to me. And then the contrast numbers change; it's not 985 anymore.
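
To put a rough number on that last sentence (my sketch, one possible reading of the 1.2x factor applied on top of the full 709 decode, tail included):

Code:
# Rough sketch: 235-to-17 ratio with an extra 1.2 "system gamma" applied
# after the BT.709 decode (linear tail included). Constants are assumptions.

def bt709_decode(v):
    if v < 0.081:                        # linear tail of the encode curve
        return v / 4.5
    return ((v + 0.099) / 1.099) ** (1 / 0.45)

def display_luminance(v, system_gamma=1.2):
    return bt709_decode(v) ** system_gamma

white      = display_luminance(1.0)      # video level 235
near_black = display_luminance(1 / 219)  # video level 17
print(f"235:17 with the 1.2x factor: {white / near_black:,.0f}:1")

If I've done that right, it comes out around 3,900:1 instead of 985:1.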

Edit: Read here for Charles Poynton's view on the pure power vs. BT.709 gamma curve topic:

http://www.poynton.com/notes/PU-PR-IS/index.html
post #40 of 162
Thread Starter 
Quote:
Originally Posted by madshi View Post

Well, who to take as gospel? ...

So you want to first have a discussion on how to gamma calibrate a projector for values below 1.8% stim?
post #41 of 162
Quote:
Originally Posted by HoustonHoyaFan View Post

So you want to first have a discussion on how to gamma calibrate a projector for values below 1.8% stim?

Well, since the calibration directly affects the contrast, there's no other way, IMHO. Unless we all agree how calibration should be done. But that doesn't seem to be the case, does it? FWIW, here's what the EBU has to say about this:

http://tech.ebu.ch/docs/tech/tech3320.pdf

"monitor gamma is not, and never has been, the inverse of the camera gamma"

"The conclusion must be that any new monitor technology should retain the same electro-optical transfer function as has historically been used."
post #42 of 162
Quote:
Originally Posted by Lawguy View Post

But, to bring it back to contrast, this discussion does demonstrate that where you have a display with less on/off contrast, you will have to run a lower gamma and that will result in near blacks being less black than they might be on a projector with higher on/off CR that can run a higher gamma. That, and the Video Black measurement will, of course be higher on the lower CR projector.

I disagree; contrast does not dictate display gamma except near black. Going by the EBU and British Film Council recommendations for consumers using DVD or Blu-ray, etc., gamma is 2.35 for a flat panel or 2.45 with a projector in a dark room, regardless of display contrast ratio.

Commercial cinema DCI only faithfully tracks gamma down to 5% of peak white; the last 5% is determined by the display's black level.

Video monitors used in mastering, according to the EBU, only faithfully track gamma from 90% down to 10%; above 90% and below 10% the response is determined by the display's white level and black level.

Video calibration seems to be done to make 2% above black just visible on a black screen in dim surroundings, or 4% above black just visible on black with a higher APL picture or in brighter surroundings.

All the sources are contrast limited as far as any image detail is concerned anyway. I detailed what I think is the contrast limitation of the source chain (film print, DCI, video) in this post: http://www.avsforum.com/avs-vb/showt...0#post20495770
post #43 of 162
Quote:
Originally Posted by HoustonHoyaFan View Post

I view the discussion so far as a prequel to the contrast discussion. We should all be on the same page on a simple visual for image contrast. I suggest we use a hypothetical RS40 to avoid sidetracks on technology, DIs, etc. Our RS40 measures 20K:1 with the iris open and 40K:1 with the iris closed, calibrated to gamma 2.2.

I believe (thanks to gregr) we have established the following:
RS40 at 20K:1 would have a luminance ratio between white (235) and near black (17) of ~ 1000:1 (985), between near black (17) and black (16) of 20, and between white (235) and black (16) of 20K:1

RS40 at 40K:1 would have a luminance ratio between white (235) and near black (17) of ~ 1000:1 (985), between near black (17) and black (16) of 40, and between white (235) and black (16) of 40K:1

The only difference between the RS40-20K and RS40-40K is the ratio between 17 and 16. For an RS40-400K, the only difference would still be the ratio between 17 and 16!

I would use a gamma 2.5 example but I do not know how to calculate a V' for gamma 2.5 in the following formula:
reference white / ((reference white/219) / 4.5) = CR

Okay, we have established these things. But I question what any of this has to do with a contrast discussion because it really seems like the calibration discussion that you don't want to have.

What is your thesis here?

If your premise is that regardless of the CR of a projector, the only differences in CR will be between 17 and 16, that simply isn't so unless you choose to calibrate your two displays identically (which is part of your premise). Lower contrast projectors cannot use higher gammas but higher contrast projectors can. So, the right thing to do would be to compare the CRs of the low contrast projector at 2.2 and the higher contrast projector at 2.4. If you do this (and I have not) I expect that you will find that there are more CR differences than just 16 and 17.

Plus, even if you do calibrate them the same, do you disagree with gregr, who finds that "the intra-image contrast will be worse in dark scenes on the projector with the lower full-field CR (all else being equal)"? So, at least gregr believes that a simple difference between 17 and 16 will improve dark scenes.

So, I am puzzled about what is left unresolved.
post #44 of 162
Quote:
Originally Posted by dovercat View Post

Commercial cinema DCI only faithfully tracks gamma down to 5% of peak white; the last 5% is determined by the display's black level.

If this is so then how do you avoid crushing blacks?
post #45 of 162
For true reference displays, the gamma curve should be strictly adhered to.
Reference displays should be capable of displaying right down to black accurately. (I don't really consider LCD "monitors" suitable for that task)

For consumer-grade displays, black level compensation should be used. All the major calibration packages offer this feature. If it is not used, you are clipping all the lower-end shadow detail out of the image, which is clearly wrong.

Madshi is right.

Quote:
Originally Posted by HoustonHoyaFan View Post

Agreed. 709 specifies a linear tail for encoding the values. The pj should use the same to decode, transform, and display.

No. The display should be using a pure 2.4 power curve.

[gamma curve comparison chart]

With their older BVM-L monitors (low contrast LCD) Sony calibrated to 2.2 gamma.
Now that they have sufficiently high contrast displays, Sony is using 2.4 gamma for their ITU-R BT.709 color profile on the new BVM-E OLED monitors.

They also use a different gamma, "CRT BVM", with their SMPTE-C and EBU color profiles, but those pretty much tracked 2.4 already:

[BVM gamma tracking chart]

Quote:
Originally Posted by dovercat View Post

All the sources are contrast limited as far as any image detail is concerned anyway. I detailed what I think is the contrast limitation of the source chain (film print, DCI, video) in this post: http://www.avsforum.com/avs-vb/showt...0#post20495770

Source contrast (film type used, digital camera dynamic range etc.) has no relevance to display/final image contrast.
post #46 of 162
Quote:
Originally Posted by Lawguy View Post

If this is so then how do you avoid crushing blacks?

Since PLUGE patterns mean 2% above black is visible on a black screen, and 4% above black is visible in a higher APL picture, the shadow detail is there; it just has less contrast. But that is better than using a lower gamma across the entire greyscale and making the whole image look flatter, or losing shadow detail by clipping above black.
post #47 of 162
Quote:
Originally Posted by Chronoptimist View Post

With their older BVM-L monitors (low contrast LCD) Sony calibrated to 2.2 gamma.
Now that they have sufficiently high contrast displays, Sony is using 2.4 gamma for their ITU-R BT.709 color profile on the new BVM-E OLED monitors.

They also use a different gamma, "CRT BVM", with their SMPTE-C and EBU color profiles, but those pretty much tracked 2.4 already:

Assuming this is true, it reinforces what I wrote earlier.

Lower contrast displays require lower gammas, which deprive the image of black level performance.

Higher contrast displays can take advantage of higher gammas, which provide greater black level performance.

Again, the comparison should be: compare a low contrast (5k:1) display at 2.2 with a high contrast (70k:1) display at 2.4.

I can't do the math, can someone else?

Chronoptimist's chart shows more contrast at 2.4 than at 2.2 throughout most of the spectrum.
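
Edit: here is a rough script for anyone who wants to check the numbers - it assumes both displays are normalized to the same peak, calibrated to uncompensated pure power curves, with anything below the panel's black floor simply clipped to the floor.

Code:
# Rough comparison: 5,000:1 at gamma 2.2 vs 70,000:1 at gamma 2.4.
# Same peak, pure power curves, no black-level compensation.

def luminance(level, gamma, on_off_cr, peak=1.0):
    floor = peak / on_off_cr
    stim = max(level - 16, 0) / 219.0
    return max(peak * stim ** gamma, floor)

def first_level_above_black(gamma, on_off_cr, peak=1.0):
    floor = peak / on_off_cr
    return next(l for l in range(17, 236)
                if peak * ((l - 16) / 219.0) ** gamma > floor)

for name, gamma, cr in [("5k:1 @ 2.2", 2.2, 5_000),
                        ("70k:1 @ 2.4", 2.4, 70_000)]:
    white, lvl30, black = (luminance(l, gamma, cr) for l in (235, 30, 16))
    print(f"{name}: first level above black = {first_level_above_black(gamma, cr)}, "
          f"235:16 = {white / black:,.0f}:1, 30:16 = {lvl30 / black:.0f}:1")

If I've set that up correctly, the 5k:1/2.2 display crushes everything below roughly level 21 and the 70k:1/2.4 display below roughly level 19, and the level-30-to-black ratio comes out around 12:1 versus 95:1 - so the differences do go beyond just 16 and 17.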
post #48 of 162
The EBU clearly states 2.35 gamma (with 2.2-2.5 leeway) in its recommendations for video mastering monitors, and recommends 2.35 for consumer flat panels. The British Film Council recommends 2.45 gamma for projectors used with material mastered to Rec. 709, such as Blu-rays. So maybe in the UK we do things a bit differently.

I believe gamma should be dictated by the environment (room lighting, screen brightness and field of view), not by the display contrast.
Video material should be mastered to look OK at 2.2-2.5 gamma, so if the room is brightly lit you could tweak gamma down a bit, or if the room is pitch black you could tweak gamma up a bit, to retain the perception of contrast.

Quote:
Originally Posted by Chronoptimist View Post

Source contrast (film type used, digital camera dynamic range etc.) has no relevance to display/final image contrast.

The post I referred to covered the entire chain, camera to review-room screen or video mastering monitor. I was trying to figure out if there was anything to see down there except for black; the answer I came to is no, you just get darker black, and no detail is present to be revealed by having higher contrast.

I guess whether you want higher contrast than the review-room cinema or video monitor depends on whether you are only trying to reproduce a reference cinema, or what the director saw, and so only need a setup that is good enough, or whether you desire an exaggerated black for exaggerated contrast, especially in dark scenes, with a setup that is as good as possible.

You can display the source at lower contrast than was used during mastering.
When the master is encoded, the lowest code value of luminance is assigned to the lowest luminance / black level the mastering display can reach. When displayed with a higher black level (lower contrast than the master), the bottom of the curve (5 or 10%, depending on cinema or video) is made gentler - compressed - so all the shadow detail is still displayed.
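
A purely illustrative sketch of that kind of bottom-end compression (the 10% knee, 2.4 exponent and 5000:1 floor are placeholders, not anyone's standard):

Code:
# Illustrative only: compress the bottom of the curve onto a higher black
# floor instead of clipping it; everything above the knee is untouched.

def compressed_eotf(stim, gamma=2.4, black=1 / 5000, knee=0.10):
    target = stim ** gamma               # what the master intended (peak = 1.0)
    if stim >= knee:
        return target                    # untouched above the knee
    knee_lum = knee ** gamma
    # remap [0, knee_lum] onto [black, knee_lum]: gentler slope, no clipping
    return black + (knee_lum - black) * (target / knee_lum)

print(compressed_eotf(0.0), compressed_eotf(0.02), compressed_eotf(0.10))

The near-black steps stay distinct, just at lower contrast, which is the trade-off described above.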

You can also display it at a higher contrast than was used during mastering, but some standards do not recommend it.
Commercial cinema DCI standards state "Reference Image Parameters and Tolerances. In order to eliminate unwanted detail or discoloration in near blacks, it is critical that Mastering Projectors have an equal or higher sequential contrast than all exhibition projectors."
post #49 of 162
Quote:
Originally Posted by madshi View Post

This still doesn't make any sense to me. Let's think through an example. ...

I haven't read every word in this thread, but I don't think gregr would disagree with the point that gamma curves should be adjusted for poor absolute black levels (in fact I think he would agree). It may be important to note that he said, "exactly the same gamma curves" and not the same gamma numbers. You can get to the same gamma number multiple ways, but if every point on the gamma curve is the same between two displays except for the absolute black then it is like Greg said. That doesn't necessarily mean that you want to ignore absolute black when deciding how to calibrate.

The contrast calculator here:

http://home.roadrunner.com/~res18h39/contrast.htm

takes the absolute black level into account and spreads the "error" out and is described here:

http://home.roadrunner.com/~res18h39/curve.htm

Just for information, the related intrascene calculator is here:

http://home.roadrunner.com/~res18h39/intrascene.htm

--Darin
post #50 of 162
Quote:
Originally Posted by Lawguy View Post

Again, the comparison should be: compare a low contrast (5k:1) display at 2.2 with a high contrast (70k:1) display at 2.4.

Assuming both projectors have the same peak output, the 2.4 display will simply look darker on all scenes, including the high APL ones, but may be a bit more pleasing in the very low APL scenes as "black" looks blacker. Most people prefer "brighter is better," the same as comparing speakers where "louder is better." A few might notice a slight lowering of contrast in the mid-levels if they are looking for it, but the increase in brightness using 2.2 will likely be preferred over the loss of contrast of 2.4 because they are close. I just did this with my CRT projector and I prefer 2.2 even in my bat cave. I can see the change in contrast if I really look for it, but the increase in brightness was obvious.

Now if my projector was 3000 lumens @ 40K:1, I might be using 2.5+ just to get the brightness down on average and get increased contrast to boot although there is a risk of visible banding even between single steps of gray.
post #51 of 162
Quote:
Originally Posted by dovercat View Post

You can display the source at lower contrast than was used during mastering.
........
You can also display it at a higher contrast than was used during mastering, but some standards do not recommend it.
.......
Commercial cinema DCI standards state "Reference Image Parameters and Tolerances. In order to eliminate unwanted detail or discoloration in near blacks, it is critical that Mastering Projectors have an equal or higher sequential contrast than all exhibition projectors."

Yes, this is why in our other thread people were confusing the term "infinite" contrast of the source, taking it to mean anything other than an artifact of the measurement.

And back to the other thread that originated this discussion: just because you can change the brightness of a scene and it may look better is no indication that you needed more native contrast on the projector. Sure, it can be an indication, but it's not a reliable method of looking at an image and saying "OK, this is for sure due to a lack of native contrast, because when I changed the brightness of the scene it looked better."
post #52 of 162
Just to be clear, the BVM-L monitors using 2.2 gamma have a 1,000:1 contrast ratio.

Quote:
Originally Posted by Lawguy View Post

Again, the comparison should be: compare a low contrast (5k:1) display at 2.2 with a high contrast (70k:1) display at 2.4.

A 5,000:1 display at 2.2 gamma with no compensation will likely clip below 3% grey.

A 70,000:1 display with 2.4 gamma is accurate down to black without clipping, but 1% grey will barely be above black. In reality, on a consumer display, 1% is likely to be clipped.
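
Rough arithmetic behind those two numbers, assuming uncompensated pure power curves normalized to the same peak (my sanity check, not a measurement):

Code:
# Sanity check: where does an uncompensated pure power curve fall below the
# panel's black floor?

def above_floor(stim_percent, gamma, on_off_cr):
    stim = stim_percent / 100.0
    return stim ** gamma > 1.0 / on_off_cr

print("5,000:1 @ 2.2: ", [(p, above_floor(p, 2.2, 5_000)) for p in (1, 2, 3, 4)])
print("70,000:1 @ 2.4:", [(p, above_floor(p, 2.4, 70_000)) for p in (1, 2, 3, 4)])

Which comes out as 1% and 2% falling below the floor in the 5,000:1/2.2 case, and 1% only just clearing it in the 70,000:1/2.4 case.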


Quote:
Originally Posted by Chuck Anstey View Post

A few might notice a slight lowering of contrast in the mid-levels if they are looking for it, but the increase in brightness using 2.2 will likely be preferred over the loss of contrast of 2.4 because they are close.

I really don't like how 2.2 looks. The image seems washed out with no mid-tone contrast.
post #53 of 162
Quote:
Originally Posted by Chuck Anstey View Post

Assuming both projectors have the same peak output, the 2.4 display will simply look darker on all scenes, including the high APL ones, but may be a bit more pleasing in the very low APL scenes as "black" looks blacker.

It would depend on how a scene is composed. Where a scene mixes bright and dark, I think those kinds of scenes would look more contrasty because both displays have the same peak output but the higher gamma projector would mix in deeper blacks.

But, many scenes would look darker too.
post #54 of 162
Quote:
Originally Posted by Chronoptimist View Post

. . . but 1% grey will barely be above black.

Sounds right to me. 1% grey is barely above black.
post #55 of 162
@Chronoptimist, thanks for your input, appreciated!

Quote:
Originally Posted by Lawguy View Post

Assuming this is true, it reinforces what I wrote earlier.

Lower contrast displays require lower gammas

Not if you use black level compensation.

Quote:
Originally Posted by darinp2 View Post

I haven't read every word in this thread, but I don't think gregr would disagree with the point that gamma curves should be adjusted for poor absolute black levels (in fact I think he would agree). It may be important to note that he said, "exactly the same gamma curves" and not the same gamma numbers.

You're right, it seems.
post #56 of 162
Quote:
Originally Posted by Lawguy View Post

It would depend on how a scene is composed. Where a scene mixes bright and dark, I think those kinds of scenes would look more contrasty because both displays have the same peak output but the higher gamma projector would mix in deeper blacks.

But, many scenes would look darker too.

There are always potential optical illusions, but in the overwhelming majority of content it would just make it look darker.
post #57 of 162
Quote:
Originally Posted by Lawguy View Post

It would depend on how a scene is composed. Where a scene mixes bright and dark, I think those kinds of scenes would look more contrasty because both displays have the same peak output but the higher gamma projector would mix in deeper blacks.

But, many scenes would look darker too.

No, 100% of all scenes except a full screen 100 IRE will look darker. Look at the gamma curves. 2.4 is always lower luminance than 2.2 except at 100 IRE.
post #58 of 162
Comparing one after another, sure, it's always going to look darker. He might have been referring to the illusion of brighter images when you increase the perceived contrast, if you are not doing a side-by-side or an A/B, but yes, I agree it is just plain flat darker.
post #59 of 162
Quote:
Originally Posted by coderguy View Post

There are always potential optical illusions, but in the overwhelming majority of content it would just make it look darker.

Not so. Look at the gamma curve comparison pic in Chronoptimist's post and track the differences in the 2.2 and 2.4 curves. At the high end, the values get very close and obviously converge at 100%.

Where those higher end values are in a scene mixed with something darker (as is seen on most of the rest of the chart), that scene will have better intrascene contrast at 2.4 than at 2.2 (assuming, of course, that the display is up to the job).
post #60 of 162
The overwhelming majority of scenes don't use both the high and low end of the spectrum, though; at best it can just match if you are staring at a 100% white background. Or am I missing something?