
Banned · 1,762 Posts · Discussion Starter #1
Just tweaking my OLED TV and like most, I'm getting some black crush. It's certainly not terrible, but there's definitely some shadow detail being hidden. So here's the question: I watch mostly in a dark/dim room, and many people seem to recommend 2.4 gamma for that. The TV also has BT.1886, but it's the same thing as 2.4 since blacks are 0 nits. Anyway, BT.1886 looks nice and contrasty, but there's some black crush. If I switch to gamma 2.2, things look better. I can barely see black level 17 if I stick my face to the screen, and 18 is visible even from my seating position.

I guess it could boil down to personal preference, but using 2.2 seems to provide the most accurate picture. So... BT.1886 with crushed blacks but more punch, or gamma 2.2 with more shadow detail? :confused: As a side note, 2.2 definitely makes cable channels look worse, but that's only because BT is crushing those low-level artifacts.
 

Banned · 1,762 Posts · Discussion Starter #3
Quote:
Do you have a meter? If so you can use the 20 point control to get a gamma in between the two, and it can look great.
I really wish they made a preset in between BT.1886 and 2.2. It seems like a waste to include both BT.1886 and 2.4, since they're exactly the same (at least I think so). I do have an i1D3 that I've used for other TVs, but it's not profiled for OLED. Not sure how essential that is, but it still needs to read colors correctly in order to adjust gamma, right? Someone said their profiled meter actually reads pretty close to how it reads in HCFR in non-refresh mode, but I'm not sure they're all like that.

Also, I tried raising just the 5% luminance like a few people over in the OLED forum suggested, but I'm not sure I like the effect. It does bring out more detail, but it also seems to change the color in those areas.
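For what it's worth, if you do go the 10/20-point route, the luminance targets for any power-law gamma are easy to generate yourself. A minimal sketch, assuming a 100-nit peak and a 2.3 target purely as examples:

```python
# Minimal sketch: luminance targets for a power-law gamma, to dial in with a
# 10/20-point control. Peak white and the gamma value are example assumptions.
peak_white = 100.0   # measured 100% white in nits (example)
gamma = 2.3          # whatever target you settle on (example)

for step in range(1, 21):                  # 5%, 10%, ... 100% stimulus
    v = step * 0.05
    target = peak_white * v ** gamma
    print(f"{step * 5:3d}%  ->  {target:7.3f} nits")
```

Measure each step in HCFR and nudge the corresponding point until it lands near the printed target.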
 

Registered · 792 Posts
Haha - I tried to bring this up for discussion in the actual gamma thread, which had every expert in here suggesting all kinds of different things - but no luck.

http://www.avsforum.com/forum/139-display-calibration/1758185-who-prefers-higher-gamma-than-2-2-a.html

People have far too much to lose by acknowledging that bt.1886 (the Calman default *hurray!*) is utter bull****.

The explanation for why it was implemented is basically completely untrue, and it doesn't in the least achieve the "proclaimed desired effect" on any modern TV (especially OLEDs and IPS displays) - and calibrators run around in a frenzy proclaiming all kinds of completely different solutions.

But that's only the first part of the problem.

:)

Let's continue.

i1d3s are factory calibrated and don't color shift over time; spectrophotometers do, within the first year. And then the next. And the next. And the...
There is a publicly available test of about a dozen of them out there suggesting that all i1d3s (= i1 Display Pro) are "different" by about 0.4 dE mean (1.6 dE max):
https://www.drycreekphoto.com/Learn/Calibration/MonitorCalibrationHardware.html

So yes - you can share calibrations between them. In fact, that's the entire business model of X-Rite to begin with. (Otherwise they couldn't sell them as "standalones".) (Because the display correction profiles they pack in are identical.)

In fact, the average calibrator that drags along an old Spectro probably introduces more color errors into his calibrations, based on "trust" in his more expensive equipment alone.

The flipside is that you need spectros to create the correction profiles in the first place. The ideal scenario is a spectro that gets vendor calibrated every 6 months, with you correcting your i1d3 against it.
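If anyone wants to see what such a correction actually is: at its core it's just a 3x3 matrix fitted from readings of the same patches taken with both meters (ArgyllCMS's ccmx corrections are roughly this idea). A minimal sketch, with made-up placeholder readings standing in for real measurements:

```python
# Minimal sketch: derive a 3x3 correction matrix for a colorimeter (e.g. an i1d3)
# from readings of the same R, G, B patches taken with a reference spectro.
# All XYZ numbers below are made-up placeholders - substitute real measurements.
import numpy as np

# Columns are XYZ readings of full-field red, green and blue patches.
spectro_xyz = np.array([[41.2, 35.8, 18.0],   # X
                        [21.3, 71.5,  7.2],   # Y
                        [ 1.9, 11.9, 95.0]])  # Z

colorimeter_xyz = np.array([[39.8, 36.5, 19.1],
                            [20.9, 72.8,  7.6],
                            [ 2.1, 12.4, 92.3]])

# Solve M such that M @ colorimeter_reading ~= spectro_reading.
M = spectro_xyz @ np.linalg.inv(colorimeter_xyz)

raw = np.array([28.3, 30.1, 25.7])   # any later colorimeter reading (placeholder)
corrected = M @ raw
print(np.round(M, 4))
print(np.round(corrected, 2))
```

The point being: the correction itself is tiny, simple math - the hard (and drift-prone) part is the spectro reference it is built from.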

Let's continue again.

What Chad B is selling you as "can look great" is Bull****.

Not because it wouldn't in fact look good - but because he has no idea how much to bend the gamma curve at the low end, or where to lift it at the top end (higher IREs). The curve he uses (the one shared in the 2016 LG OLED calibration thread) is purely made up.

But that's not the entire issue - actually, by at least mentioning that the bt.1886 target doesn't look great on OLEDs, he has already done you a favor (in a world where people are mostly concerned with not telling the public when things break).

The issue is that "can look great" isn't substantiable. No one can or will tell you "how much to bend the gamma curve" at low IREs (and remember, that is only after people have agreed that the bt.1886 target of flat 2.4 on OLEDs is bull**** (doesn't look right, loses shadow detail, crushes 17, 18, 19, the near blacks, by definition...)).

And what that means, you have already found out on your own: color changes no one can attest are necessary or intended, and an image depth that's HIGHLY variable.

Also - no one agrees on the curve's progression. Some use bt.1886 at artificially fixed black levels - some use slightly different custom curves (looking at Chad B), some only brighten up 5% IRE; some brighten up 5% and 10% IRE -- and the color differences basically get masked by the dE2000 formula that "doesn't care" - and no one has any idea what depth ("plasticity") of the image to shoot for.

This goes as far as bt.1886 being conceptualized as a compromise between people who maintain that "flat gamma is beautiful and 2.4 is where it's at" and people who maintain that "the gamma curve needs to include a CRT correction" (that's the increased curve at low IREs), because most movies were mastered on CRTs that didn't have linear gamma.

The notion that you would fall into one camp or the other because of the black level of your TV is - you guessed it - bull****.

bt.1886 is basically two groups of engineers coming up with a compromise that in reality introduced MORE color errors (because it suggests such a wide variety of gamma curves, depending on whether a display was produced yesterday or the day before, ...), thus failing entirely at the proclaimed goal.

Color correction professionals cursed, calibrators covered it up. As always. (That's why what Chad B does - speaking up that bt.1886 looks bad on low-black-level devices, because it suggests a flat PLG 2.4 calibration - has to be seen as "progressive", so thank you for at least that much.)
--

If you take a cross-section of what "experts" recommend as the "correct" gamma setting for 2016 LG OLEDs in a dark room environment, you are presented with the following (a rough numerical comparison follows the list) --

German Hdtvtest forum:
bt.1886 with a custom black point of 0.0005 (basically flat 2.4 with the slightest of curves - 5% IRE at basically 2.33).

Chad B:
custom curve with high IRE at 2.34 and a somewhat smooth rolloff to 2.21 at 5%

ConnecTEDD:
Flat PLG 2.4, because the standard says so.

Me:
Probably 2.35 at high IREs with a gradual rolloff at low IREs to 2.1 or even lower. But I'm still evaluating.

Vincent Teoh:
Custom curve - only adjust 5% and 10% IREs to either 2.2 or 2.1

The next guy:
Custom curve - only adjust 5% IRE at all.

The guy next to the next guy:
Use a flat PLG 2.2 gamma, because that's the HDTV standard. And post-production studios are supposed to work with this in mind.
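Just to put numbers on how far apart these recommendations land, here's a minimal sketch comparing the 5% gray patch each one asks for. The 100-nit peak is an arbitrary assumption, and the per-target gammas are the 5% figures quoted in the list above:

```python
# Minimal sketch: luminance of a 5% gray patch under each recommendation above.
# L = Lw * V**gamma, with a 100-nit peak assumed purely as an example.
peak_white = 100.0
v = 0.05  # 5% stimulus

targets_at_5pct = {
    "flat PLG 2.4 (ConnecTEDDD)":               2.40,
    "bt.1886 w/ 0.0005 black (Hdtvtest forum)": 2.33,
    "Chad B style rolloff (2.21 at 5%)":        2.21,
    "flat PLG 2.2 (the last guy)":              2.20,
    "5%/10% lifted to ~2.1 (Teoh and others)":  2.10,
}

for name, g in targets_at_5pct.items():
    print(f"{name:44s} -> {peak_white * v ** g:5.3f} nits")
```

That is roughly a factor of 2.5 between the darkest and the brightest prescription for the very same near-black patch.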
--

What's correct? No one effing knows. It's completely messed up. Also - it was completely messed up in post-production as well, because those geniuses also didn't know what they were calibrating for.

Therefore, it also changes depending on the material you watch and how it was mastered, without there ever having been a standard at all - and the currently suggested one (bt.1886) breaking in all kinds of funny ways - which means director's intent is a myth.

Something calibrators sell - without being able to replicate it.

The latest suggestion from ConnecTEDDD was to buy a 3D LUT box, calibrate 5 different gamma curves, and switch whenever you feel funny.
--

This is the stuff other people in this forum will be reluctant to tell you (they will also make sure to amp up the myth of high colorimeter variability to book their calibration services, while at the same time NEVER talking about spectrophotometer color shift and the uncertainty that accumulates because of it - or about how likely it really is that calibrators "ship in" their spectros for recalibration every 6 months to a year).

-

Oh, and let's end with the color difference in the image you noticed when changing 5% IRE luminance slightly.

The dE2000 formula with a bt.1886 gamma target will underreport that, and never - NEVER - show it as a color difference that even exceeds a dE of 2 - and no one in the world can tell you which color (or image depth) was what the post-production artist saw, or was "intended" by the director.
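A minimal sketch of why near-black shifts barely register: for neutral patches CIEDE2000 collapses to a lightness-only term, and near black the L* values involved are tiny. The 100-nit peak and the gamma 2.4 vs 2.1 comparison are assumptions picked for illustration (and this ignores the chroma shift you saw, which would only add a little on top):

```python
# Minimal sketch: dE2000 for two neutral near-black patches. For neutral colors
# (a* = b* = 0) the CIEDE2000 formula reduces to |dL'| / S_L.
import math

def Y_to_Lstar(Y, Yn=100.0):
    """Luminance in nits to CIE L*, relative to a white of Yn nits."""
    t = Y / Yn
    return 116.0 * t ** (1.0 / 3.0) - 16.0 if t > (6.0 / 29.0) ** 3 else 903.3 * t

def dE2000_neutral(Y1, Y2, Yn=100.0):
    L1, L2 = Y_to_Lstar(Y1, Yn), Y_to_Lstar(Y2, Yn)
    Lbar = (L1 + L2) / 2.0
    S_L = 1.0 + (0.015 * (Lbar - 50.0) ** 2) / math.sqrt(20.0 + (Lbar - 50.0) ** 2)
    return abs(L1 - L2) / S_L

# Example: a 5% stimulus shown at gamma 2.4 vs gamma 2.1 on a 100-nit display.
Y_g24 = 100.0 * 0.05 ** 2.4   # ~0.075 nits
Y_g21 = 100.0 * 0.05 ** 2.1   # ~0.185 nits, about 2.5x brighter
print(round(dE2000_neutral(Y_g24, Y_g21), 2))
```

A 5% patch that is about 2.5 times brighter still comes out around dE 0.6 - nowhere near the usual "visible above 3" threshold.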

It's a sorry a** situation, but that's what people thought they had to come to terms with.

Also, the answer "if you've got a colorimeter and 10/20pt controls, you can curve gamma a little at the low end to make 'it' look better" masks the actual issues that are at play here.

Oh, and of course there is no such notion as "calibrating to prevent black crush" (loss of black detail) - because no one agrees what that is. Calibrators will happily calibrate 5% and 10% IRE to 2.0 or 2.4 gamma, and all proclaim that this is "the ideal 2.4 curve for dark room viewing". And of course - even 2.0 still crushes some detail, depending on the source material. So the next time someone suggests that you should let an expert "calibrate your TV" because you want to see that black detail - or the movie - just as the director intended -

tell them that they are lying - and that they have been covering up indecision and their "standards" breaking for the better part of 20 years.

Welcome to this community - have a cookie.
 

Banned · 1,762 Posts · Discussion Starter #5
Sounds like there's a lot of debate when it comes to gamma :) I've heard people say that BT.1886 is the standard, but that only happened recently, so the majority of material out there was probably mastered at something else. This is just personal experience, but what I've found so far on my OLED is this: some cable channels look better with it set to BT1886 and some don't. In particular, news stations look better, but I can flip to a movie channel and things are too dark. So, like you mentioned, source variability is all over the place.

I used a handy pattern someone on the forum made that's composed of squares and goes from black level 16-27 or something. In a dark room, the only way I can get the pattern nearly correct is to use gamma 2.2 and bump up the brightness setting to 51 (from default 50). I could up the brightness more and keep using BT1886, but then it ruins perfect black.
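In case anyone wants to roll their own version of that kind of pattern, here's a minimal sketch using Pillow: squares at 8-bit video levels 17-27 on a level-16 (reference black) background. Sizes and layout are arbitrary, and it only tells you something if your playback chain passes those levels through untouched:

```python
# Minimal sketch: near-black test pattern - squares at 8-bit video levels 17..27
# on a level-16 (reference black) background. Layout and sizes are arbitrary.
from PIL import Image, ImageDraw

W, H, size, gap = 1920, 1080, 120, 40
img = Image.new("RGB", (W, H), (16, 16, 16))        # level-16 background
draw = ImageDraw.Draw(img)

levels = list(range(17, 28))                        # 17..27
x = (W - len(levels) * size - (len(levels) - 1) * gap) // 2
y = (H - size) // 2
for v in levels:
    draw.rectangle([x, y, x + size, y + size], fill=(v, v, v))
    x += size + gap

img.save("near_black_pattern.png")
```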

So, I don't know. I think things are looking good now at 2.2. Some people say it looks washed out compared to the BT1886 setting, but I really think that's because BT is crushing so much black to begin with. Blu-rays are certainly watchable using BT, but I find that my eyes are always searching for detail that I know should be there but isn't visible, if that makes sense. Also, I do a lot of photography and have my PC monitor set to 2.2, so maybe that's just the kind of image I'm used to seeing.
 

Registered · 792 Posts
Oh no - there is basically no debate at all - just a bunch of people seeing that stuff breaks and choosing not to talk about it, basically.

They see that they all recommend something different, they just don't take issue with it - it's like a sort of bro code between professional (as in tax-paying) members of this community.

There is also an issue with the "use 2.2 for bright room environments" rule not being able to be upheld any longer, because TVs nowadays can blow your eyes out at 2.4 gamma (avg) in any room under any lighting conditions at all -- and with the HDR EOTF (gamma) being the exact same for dark and bright environments - that calls into question one of the most basic calibration rules from the past 10 years. Also, nada - no one talking about it, just people ignoring that the overarching logic breaks, and me acting like a shoutbox whenever I get the chance, to compensate for it... ;)

There is also an issue with contrast not having an "upper limit" for rec709 content, because with TVs becoming brighter all the time without losing saturation, at some point we also have to talk about the director's intent on color saturation. An upper limit on luminance for rec709 content would do, for example - but the industry would not take that well....
 

Registered · 758 Posts
Quote:
...Also, I do a lot of photography and have my PC monitor set to 2.2, so maybe that's just the kind of image I'm used to seeing.
Rec 709 and sRGB standards were both targeting 2.2 display gamma.

While the two power functions appear different, the average gamma (~1/2.2) is the same. This way both Rec 709 content and web-based sRGB content will look the same with the display set to 2.2. Keep in mind that even though the Rec 709 color space does not technically define a display gamma, the reference-grade CRTs used in the mastering process were designed to produce a standard 2.2 response.

The sRGB standard included a display target gamma of 2.2 so that legacy Rec 709 content would be displayed properly as well.
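For anyone who wants to see how close the two actually are, here's a minimal sketch of the piecewise sRGB decoding function next to a pure 2.2 power law (signal and light both normalized 0..1; these are just the published sRGB constants, nothing display-specific):

```python
# Minimal sketch: piecewise sRGB EOTF vs a pure 2.2 power law.
def srgb_eotf(v):
    """Encoded sRGB value (0..1) to linear light (0..1)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def power_22(v):
    return v ** 2.2

for pct in (1, 2, 5, 10, 20, 50, 90):
    v = pct / 100.0
    print(f"{pct:3d}%  sRGB: {srgb_eotf(v):.5f}   2.2 power: {power_22(v):.5f}")
```

They line up near white, but the sRGB curve is noticeably lighter at the low end - which is exactly where this whole thread lives.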
 

Banned · 1,762 Posts · Discussion Starter #8
Quote:
Rec 709 and sRGB standards were both targeting 2.2 display gamma. [...] The sRGB standard included a display target gamma of 2.2 so that legacy Rec 709 content would be displayed properly as well.
Thanks. This whole gamma business can be tricky! I think I've arrived at a reasonable compromise for now. Since I do most of my viewing in the dark, gamma 2.2 does look a bit too bright and washed out. I was able to raise the brightness setting on my OLED one click without ruining perfect black, then I switched to BT.1886 and bumped up the 5% luminance just a little bit. Still not perfect, but it brings out some more shadow detail while still retaining the depth of the BT setting.
 

Registered · 184 Posts
This in-depth gamma stuff is way above my understanding, but as Harlekin pointed out - and as my eyes tell me as a home theatre hobbyist - BT1886 seems to be a fricking mess. Depending on the monitor and what content you are watching, it either looks good or like washed-out crap. Flat 2.4 gives a lot of punch but at the cost of severe black crush. Flat 2.2 also tends to crush the first couple of steps, but it's the least offensive "standard" of the three. Even if it was never defined as the de facto gamma, it was still used for years (for a reason, apparently), and everything that I have watched, from DVDs and Blu-rays to computer/console games and graphics, looks good with it no matter the lighting. Sometimes I switch to 2.4 for kicks and extra punch, but in general I guess it's better to stick with 2.2 until HDR fully takes over, which, if I am getting it right, has no gamma as we understand it now?
 

Registered · 792 Posts
A PLG (power law gamma) gamma of 2.2 (flat) is not a solution, and is not the default fallback either.

Remember that bt.1886 produces an average gamma closer to 2.2 than 2.4 on most devices (black point of 0.02 nits or higher, which is the black point of your typical VA panel without local dimming - and that's not even on an ANSI checkerboard, where the black level will be even higher).
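The Annex 1 formula is easy to sanity-check yourself. A minimal sketch (the 100-nit peak and the two black levels are example values, not measurements of any particular panel); with the 0.02-nit black the low end falls well below 2.4 even though 100% white still lands on the measured peak:

```python
# Minimal sketch of the bt.1886 Annex 1 EOTF and the per-stimulus "effective
# gamma" it implies. Peak white and black levels are example values.
import math

def bt1886(v, Lw=100.0, Lb=0.0, g=2.4):
    """Annex 1: L = a * max(v + b, 0)**g, with a and b derived from Lw and Lb."""
    a = (Lw ** (1 / g) - Lb ** (1 / g)) ** g
    b = Lb ** (1 / g) / (Lw ** (1 / g) - Lb ** (1 / g))
    return a * max(v + b, 0.0) ** g

def effective_gamma(v, Lw=100.0, Lb=0.0):
    return math.log(bt1886(v, Lw, Lb) / Lw) / math.log(v)

for lb in (0.0, 0.02):                 # OLED-ish zero vs typical VA black level
    gammas = [effective_gamma(s / 100, Lb=lb) for s in (5, 10, 20, 50, 90)]
    print(f"Lb = {lb:5.3f} nits: " +
          "  ".join(f"{s}%={g:.2f}" for s, g in zip((5, 10, 20, 50, 90), gammas)))
```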

The sRGB standard has a different gamma curve, which is also curved and not "PLG flat".

So essentially just telling people to trust in a flat 2.2 gamma is also a ruse to tell a story that's not true.

The only people that were adhering to a flat 2.2 gamma were TV manufacturers, because they could. And people "color correcting" on or for uncalibrated monitors.

The story goes a little like this:

In the olden years it was a problem for manufacturers to get a flat (PLG) gamma response. So when they were able to get one on the cheap, marketing told them it's good, because it's "pure".

Also, they started with 2.2 (PLG) because most earlier devices weren't able to produce 2.4.

But this created three problems.

- Movies are produced in DCI-P3 with a 2.6 gamma, so transfers to 2.2 look very different.
- 2.2 never was a defined standard; it was a suggested standard for bt.709 - "because it was the manufacturers' default" - in the days when none of them could produce 2.4.
- Many movie and post-production studios used CRTs to master material until very recently - and those monitors' gamma curves never were "PLG flat". Also, their average gamma was often around 2.5.

The whole "we need a different gamma standard" movement started, because of the resulting issues.

There are widely circulated anecdotes out there of directors telling screeners "this isn't my movie" - because the projector was set to PLG 2.2.

There are only two reasons for making PLG 2.2 your gamma target -

- because your TV can't handle a higher gamma target (can't produce high-contrast 100% colors), respectively because your TV can't even produce VA black levels - and curving 5% and 10% IREs even more will produce an even more washed-out image

or

- Because you are banking on the notion that the entire movie industry color corrected on or for LCDs with factory presets.
--

So either you are banking on people in post-production having worked on crappy screens, or you are banking on people being idiots (and now you want to replicate their settings... ;) ).

The idea to leave the "manufacturers' default" of PLG 2.2 wasn't a bad one, and the reasons for it still hold. It's just that the presumed "follow-up standard", bt.1886, is an utter mess.

So here is the conclusion - it's probably a good thing to use a "black compensated" gamma (curving it at lower IREs), it's probably a good thing to adjust the higher end to 2.3 or 2.35 - it's just that no one can do it to a standard that makes any sense.

Because in making bt.1886 a compromise between "people that want flat gamma" and "people that want CRT characteristics" it lost all of its meaning and value.

So the actual solution would be to ask the governing bodies again what the "intent of bt.1886" was in the first place - but they can't give an answer without acknowledging that they bastardized the "standard" in the first place - that Spectracal jumped with JOY to integrate it to ruin calibration even further (more variability) - and without kindling another fight with the manufacturers.

"directors intent" vs. "linear is beautiful, because now we can do linear" -

The reason why we arrived at this stage is that in the early days some idiots in high positions proclaimed that PLG 2.2 is "good enough" as reverse gamma on early LCDs - so we transitioned from "CRT gamma was all over the place" to "LCD gamma is vastly different, but we declared it good enough" to "now we have perfect screens for bt709 content, but we messed up forming a best-practice standard - and instead caused more issues".

There are a couple of other perceptual issues as well - such as the visual system cutting contrast sensitivity in dark surroundings - so PLG 2.4 crushing lower IREs becomes even more of an issue in dark rooms.

Or the visual system expanding the contrast area in better lit environments, making bt.1886 on some LCDs even more of a disaster, because the worse blacks get perceptually enhanced...

But no one said anything about that ever. Although it was visible and more than obvious, for the entire time. I could punch the ITU, Spectracal, ISF "theorists" in their freaking face - for letting all of this happen.

As a result btw -

We know btw - what you should calibrate a crappy LCD (not even VA panel blacks) to -
namely PLG 2.2 (because curving gamma would accentuate the poor black levels / because maybe it couldn't handle more contrasty IREs at the high end) -

but we don't know at all - what to calibrate our good devices to.

Because no one knows "how much to curve gamma on the low end" (arguably this could have something to do with room lighting, or brightness "bleeding" off the screen - but it has almost nothing to do with a screen's black level being "super dark" or "super, super dark" or "super, super, super ..." - although the bt.1886 formula suggests that) - and no one knows why bt.1886 suggests both "black level compensation" and "perfect PLG 2.4 flat" depending on display black level.

Because either black level compensation is something we want because most mastering displays "used it" (CRT argument), or it is something we don't want, because "linear is beautiful" (and crushing 16, 17, 18 by default - because bt.1886 says so - in the formula) - but not

"something my black level measurement decided for me" - *shrug*

Also - again, how come no journalist has even recognized this as an issue... Again, people like Chad B are out there calibrating TVs to "fantasy curves" - but no one has ever criticized the big three (ITU, ISF, Spectracal) for having caused those situations by being utter, utter morons.

TL;DR - A flat 2.2 gamma is NOT the solution; the issue is the "variability" of the "black level compensation curve for 2.4-derived gammas" that is out there. You can't fix that by just not looking at the issue - and proclaiming PLG (flat) 2.2 the promised land "because it's like the old transitioning times - still broken, but everyone being forced to deal with it, so let's copy that".

Also - always remember: if we can't agree on gamma, the notion of "just noticeable difference" ("dE below 3!") is bull****, so is a "black detail standard", and so is basically calibrating TV screens.

Photographers and visual artists/Photoshop jockeys don't have this issue - because they work within sRGB or Adobe RGB, which have a defined gamma curve -

If you are looking for a common thread that runs through all of this, it is this: the TV industry always gets what the industry wants (selling you "better" that's actually worse); they decide how strict a standard is in the end, or whether it is a loose piece of crap like bt.1886. If they want to sell more TVs with worse black levels, they just make PLG 2.2 a "de facto standard" and it gets "copied" by "experts" who then proclaim it "good enough", money exchanges hands (speculatively), and hilarity ensues.

It's the same thing with HDR right now btw. Color accuracy is broken five ways till sunrise, everyone knows. (Dolby this time has made it their business model to "try and fix it" - but isn't there yet (hasn't convinced studios or post-production facilities to apply their logic and "limit" their material to a certain standard - that's proprietary) - but all you hear are the freaking ahs and ohs from journalists that are nothing more than empty faces repeating industry factoids and the usual "opinions of the masses" crap that gets formed with every press release.)

Also, as we have learned by now - calibrators never say anything, regardless of how broken the logic or systems are that they are selling to people -

and Spectracal ("industry leading") happily applies target curves and makes them "the standard" - curves that break six different ways six months after they are "established" and introduce more color variability than they are able to reduce.

Hurray - it's a mess. And no one wants to do anything about it.
 

Registered · 792 Posts
And thinking about it - with bt.2020 ("HDR") this doesn't necessarily get better -
the uncertainty just gets shifted around -

So while you now have a "fixed" gamma target that stays the same on every device, regardless of room light -

- "Directors intent" isn't upheld, because DCI-P3 gamma is different.

- A correct gamma and color reproduction is still "variable", because according to the now "open" standard of bt.2020, max gamma in reality is defined by "whatever capabilities the industry can produce to" - and that changes every year. Also, you have color correction professionals trying to create corrections based on different monitor calibrations (peak light output can be vastly different) -- and you now have a separate entity that promises "to calculate most of the resulting errors away" (with metadata) - but you need to buy the newer display where they can't.

- The bt.2020 ("HDR") EOTF was made up by Dolby (who are now selling you the math to "adhere to this curve" as a proprietary solution), based on "perceptual experiments" that weren't peer reviewed or replicated - then they "open sourced" the curve to make sure it got picked up (but kept their math for "adhering to it" under wraps) -- also they made sure to craft it with a target that the consumer market will only ever reach in 10-15 years, if at all -- so they have ample time to sell you their correction math...

You know what - looking at it that way - this whole "science" is a joke - this is an industry creating "sales opportunities" that serve them and not the film fan, or the end user.

The important point is always to keep enough "uncertainty" around all the time, to be able to proclaim that everything you produce is "up to the standard" - regardless of what you produce or how it performs. And because this time around more color variance is to be expected, they created a new black box (Dolby Vision) that calibrators can now blame for no screen ever looking the same. For the next 15 years...

But that's the new beautiful ("we should only be limited by what a display can produce" > the solution is always "spend more") - according to the industry-leading committees.
 

Registered · 758 Posts
Quote:
So essentially just telling people to trust in a flat 2.2 gamma is also a ruse to tell a story that's not true.
Actually, professional-grade CRT monitors that were used as reference in the mastering environment were extremely flat 2.2. In theory BT.1886 is trying to emulate their (CRT) response by targeting 2.4, based on the assumption that's what they measured. Turns out not to be the case though.
Quote:
I only ever report our actual findings - and as we have profiled hundreds of grade 1 Sony CRTs over the years, that is what I base my comments on.

From my professional background I can state that the old Sony Grade 1 CRTs were the standard for many years, and when profiled they had a display gamma that was very close to 2.2. That is why that became the default for calibration on later, more modern, displays in the professional world, as there needed to be back compatibility.
The quote is from Steve Shaw of Light Illusion.
 

Registered · 9,309 Posts
Quote:
Actually, professional-grade CRT monitors that were used as reference in the mastering environment were extremely flat 2.2. In theory BT.1886 is trying to emulate their (CRT) response by targeting 2.4, based on the assumption that's what they measured.
2.4 is not an "assumption" made by the BT.1886 people.

EBU TECH-3221 (2007) says:
4a Display gamma The electro-optical transfer function should be a power law (commonly referred to as "Gamma"). The default value of display gamma that is required to match the television programme producer’s intent is 2.35 in a “dim-surround” environment [6], as per the measurements reported in section 4.2 in [5].
...
It has been found from measurement techniques, progressively refined over several decades, that a correctly designed CRT display has an EOTF gamma of approximately 2.35.
[5] is BBC Research Department Report RD 1992/13, by A. Roberts ("Measurements of display transfer characteristics using test pictures"), which says:
https://archive.org/stream/bbc-rd-reports-1992-13/1992_13#page/n1/mode/2up

4.2 Transfer characteristic
A range of conventional professional and domestic displays were measured using the test signal and produced gamma value of 2.3 or 2.4.
BT.1886 itself says
The EOTF specified in Annex 1 is considered to be a satisfactory, but not exact, match to the characteristic of an actual CRT.
...
the electro-optical transfer function (EOTF) of CRT displays differs amongst manufacturers, amongst models, and amongst regions, as well as varying with the settings of contrast and brightness;
They then included in Appendix 1 an alternative approximation that can be customized to closely match a specific CRT display.
 

Registered · 792 Posts
I can pull quotes as well:

A cathode-ray tube (CRT) is inherently nonlinear: The intensity of light reproduced at the screen of a CRT monitor is a nonlinear function of its voltage input.
From a strictly physical point of view, gamma correction can be thought of as the process of compensating for this nonlinearity in order to achieve correct reproduction of intensity.
...

In practice, most CRTs have a numerical value of gamma very close to 2.5.
...

The actual value of gamma for a particular CRT may range from about 2.3 to 2.6. Practitioners of computer graphics often claim numerical values of gamma quite different from 2.5. But the largest source of variation in the nonlinearity of a monitor is caused by careless setting of the Black Level [...]
Those are quotes from Charles A. Poynton's "Digital Video and HD: Algorithms and Interfaces"

https://books.google.at/books?id=Sa-cEY_ZeEQC&pg=PA315&lpg=PA315

- which gets quoted in many papers as one of the standard literature books on calibration math.

(e.g. http://www.hpaonline.com/wp-content/uploads/2015/02/Poynton-TR2015-PUDI-pp.pdf )
--

Also -

I refer you to a posting from zoyd from 2012 - where you can see both aspects plotted out ("usual avg gamma of 2.5+", "inherently not linear" - in "strange ways") -

edit: Found the posting I was initially looking for -
http://www.avsforum.com/forum/139-display-calibration/1409045-how-power-law-gamma-calibration-can-lead-crushed-blacks-2.html#post22011784

- a second one would be: http://www.avsforum.com/forum/139-display-calibration/1409045-how-power-law-gamma-calibration-can-lead-crushed-blacks.html#post22004457


Also, I see no reason why Sony would want to produce CRTs that can display flat 2.2 gamma curves... PLG 2.2 was a standard "in practice" but never defined as an official standard, so why would a manufacturer go out of its way to produce something that's inherently hard to do with that technology, without there actually being a standard that would demand it... ;)

In general, CRTs weren't PLG flat, and they didn't produce an average gamma of 2.2.
You can also read Spectracal's "argument" for bt.1886 (which now holds a record consisting of one untrue statement every second paragraph) - because they sold you on bt.1886 (derived from a 2.4 gamma target and inherently non-linear) based on this being needed to "emulate CRTs from the past".

http://www.spectracal.com/Documents/White Papers/BT.1886.pdf

That said - flat doesn't equal flat as we talk about it in here. PLG 2.2 is a power function; the question in here is mostly whether you need to curve it to apply black compensation (respectively whether you should push higher IREs to 2.3 or 2.35...). The initial argument for that is also CRT-related, in that production professionals would push black to hide poor low-light performance of equipment, especially to combat noise in such situations.
 

Registered · 792 Posts
@Dominic Chan:

Quote:
The default value of display gamma that is required to match the television programme producer’s intent is 2.35 in a “dim-surround” environment [6], as per the measurements reported in section 4.2 in [5].
This is an argument for perceptual correction in "dim surroundings", which is based on the finding that in low-light environments our eyes reduce the contrast range we see.

The idea is to push gamma to balance that at high IREs. (More "contrasty" picture.)

But - the same perceptual argument, if I understand it correctly, also suggests that if you do that by pushing gamma to PLG 2.35, you will also reduce the visibility of darker colors, which now become even darker, some of them crushing.
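For what it's worth, here's where the dim-surround numbers come from, as a minimal sketch: content is encoded with the Rec.709 camera OETF (roughly a 0.5 exponent), so the end-to-end "system gamma" depends on the display gamma you decode with, and the usual argument is that an overall gamma somewhat above 1.0 (around 1.2) compensates for reduced contrast perception in a dim room. The mid-gray evaluation point is just an example:

```python
# Minimal sketch: end-to-end ("system") gamma for Rec.709-encoded content shown
# on displays with different decoding gammas, evaluated at an 18% gray (example).
import math

def rec709_oetf(l):
    """Rec.709 camera OETF: scene-linear light (0..1) to encoded signal (0..1)."""
    return 4.5 * l if l < 0.018 else 1.099 * l ** 0.45 - 0.099

scene = 0.18                                    # mid-gray scene luminance
for display_gamma in (2.2, 2.35, 2.4):
    displayed = rec709_oetf(scene) ** display_gamma
    system_gamma = math.log(displayed) / math.log(scene)
    print(f"display gamma {display_gamma}: system gamma ~ {system_gamma:.2f}")
```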
-

Also, I find the notion that gamma needed to be pushed beyond 2.2 because of "room-light conditions" somewhat suspect, if I am being honest. First we move away from the original content's gamma target by a mile - then we move back a little (once TVs were able to), because of room-light conditions?

Wasn't the understanding always that "calibrating" in the industry should be done with a dim-room "target"? So what's the big surprise that 2.2 isn't perceptually ideal - just as 2.4 became more widely available commercially...
--

Also understand that I have noted I believe a 2.4-derived gamma (with high IREs in the 2.35 range) would probably be the correct one for bt.709 content -

The issue is that with bt.1886, that 2.4-"derived" gamma can range from 1.8 avg to 2.4 avg just because of the black point alone. It can lose all of its "black compensation" aspect just because of the black point alone. It can tell you to crush near blacks, just because of the black point, and it can tell you to wash out everything up to IRE 50, just because of the black point. The industry groups never bothered to define "an ideal 2.4-derived curve", or at least limits for where to take that formula - so now we have a big problem.

The idea that the resulting differences are intended, because they would magically produce a perceptual match on different displays, is complete bull****.

bt.1886 doesn't mean gamma 2.4 anymore. It can just as well mean gamma 1.9 - because of the variability that function induces on actual displays. bt.1886 isn't a simple power function - and the fact that it gets harder to equate it with PLG 2.4 is actually part of the issue.
 

Registered · 792 Posts
Here is something interesting - in the thread I linked to where zoyd plotted the "alternative bt.1886 curve" (in the bt.1886 paper the ITU actually defined two potential formulas - a general one ("perceptually satisfying match to a CRT") and a stricter one ("for a more accurate match to a CRT")) - he also shared the spreadsheet he used to plot it - which allows us to compare curves at different settings.

The alternative bt.1886 function has you take a screen measurement at the 1.83% IRE level instead of measuring the black point. Which is good for looking at "intent", as 1.83% isn't always 0.00 nits (one extreme of the contrast spectrum). ;)

Measuring 1.83% (actually 2%, but I shifted 100 IRE down to 90 nits to compensate ;) ) on an LG 2016 OLED leads to anything from 0.000 nits (2% crushing at default settings) up to 0.008 nits (with a hefty black compensation curve) -

If you plot the two values you get the following:

[Graph: alternative bt.1886 curve for the 0.008 nits measurement]

[Graph: alternative bt.1886 curve for the 0.000 nits measurement]

- the blue line on both graphs is bt.1886 "normal" (not alternate), plotted for a black level of 0.0005 nits (so "almost 0").

Also be aware that the curves above include 1%-4% points - so if you are used to looking at those graphs in 5% intervals, you have to ignore the first 4 points of the curves.
-

What you can see here is that the black compensation part of the bt.1886 curve is only intended for devices with a poor black level - in the "more CRT-like" alternative bt.1886 function, the ITU actually recommends "crushing black" even further, by increasing gamma rather than decreasing it.

So the "necessary for CRT emulation" part of their argument is almost exclusively "moving gamma up" to 2.4 or even 2.6 - and the black compensation part is mostly "tucked on" for bad black level devices.

Then I reread the specifications again - and indeed -

Quote:
For moderate black level settings, e.g. 0.1 cd/m2, setting the LB [= luminance of black = black level] of the EOTF to 0.1 will give a satisfactory match to the CRT. In the event the CRT is operated at a lower black level, e.g. 0.01 cd/m2, the EOTF will provide a better match with LB set to a lower value such as 0.0 cd/m2. When it is necessary to more precisely match a flat panel display characteristic to a CRT, the alternative EOTF formulation specified below may provide a solution.
So the ITU actually suggests using a black level of 0.0 nits in the bt.1886 formula (= PLG 2.4 flat) if you want to emulate a CRT with a 0.01 nits black level rather than a CRT with a 0.1 nits black level.
--

So what does this mean?

The ITU tells you to crush lower IREs and use 2.4 to 2.6 for upper IREs - but basically 2.4 or 2.6 flat - if you want to "emulate a CRT" with a somewhat decent black level.

Why the ISF (and Calman) suggest black compensation - and then, based on the black level of the LCD/OLED you are calibrating, "take it away" - is unknown. It doesn't make any sense - for perfect-black-level TVs you can calibrate to any target --

so what the ITU is actually saying is the following --

If you want to emulate the gamma curve of a CRT with a "moderate black level" (0.1 nits), use black compensation to the extent of 5% IRE landing at a gamma of 1.8; if you want to emulate a CRT with a black level closer to 0.01, make it 2.4 PLG flat, or 2.6 PLG flat (basically).

Funny.

Because that means that the huge variety of bt.1886 curves we see on today's displays is intended (from gamma 1.8 at 5% IRE to gamma 2.4 (and even above) at 5% IRE) - just not that the black level of every TV is supposed to define how the curve looks.

You are basically meant to decide based on "what kind of CRT you want to match".

The black level of your actual LCD/OLED only factors in as far as "what calibrations are possible" and what the actual black level comes down to - but actual black level is less characteristic than the picture differences (depth, colors) introduced by two moderately different bt.1886 curves.

Also near black crush is intended, according to the ITU.

So bt.1886 was formulated to introduce a huge potential variety of gamma curves.

And then implemented wrong by calibrators, the ISF and Calman - in that they suggested that your display's (LED/OLED) black level should decide what "kind of CRT" you are emulating.

And even at a 0.01 nits black level (which OLEDs and good LCDs can reach), the ISF and Calman go against the ITU recommendation, which is - if you want to emulate a CRT with that black level - to use an Lb (luminance of black) of 0.0 cd/m2 rather than 0.1 cd/m2.

While this looks like a moot point in the sentence above, the resulting gamma targets are vastly different (5% IRE at 2.1 vs 2.4).
 

Registered · 792 Posts
So when the head of the ISF, Joel Silver, started to lobby manufacturers to include a "bt.1886 preset", they must indeed have thought that he was an idiot - because what he basically did was tell manufacturers to include a mode that would make every modern TV look like a different CRT.

NEVER was the idea of those different curves "perceptually matching" indicated or spoken out loud. It's just what the ISF and Spectracal - wrongly - derived from it.

bt.1886 was used as something it never was - a curve that could be a standard for TV calibration, where the actual black level of the OLED/LCD TV being calibrated would make up its progression. It was never meant to.

So instead of defining "an ideal CRT", the ISF and Spectracal actually made sure that you made your TVs look like "all kinds of different CRTs from the past". There never was a perceptual match indicated or promised for the different gamma formulas resulting from different CRT characteristics - according to bt.1886.
 

Registered · 9,474 Posts
Quote:
And thinking about it - with bt.2020 ("HDR") this doesn't necessarily get better - the uncertainty just gets shifted around - [...] You know what - looking at it that way - this whole "science" is a joke - this is an industry creating "sales opportunities" that serve them and not the film fan, or the end user.
There is no need to re-calibrate an HDR TV for the different mastering monitor levels each studio used; you just calibrate up to the levels the display is capable of.

Since ST 2084 is an absolute transfer function, if in the future you have a calibrated display capable of 4,000 nits and you want to view a movie mastered at 1,000 nits, when you watch that movie you will see up to 1,000 nits.
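To illustrate the "absolute" part, here's a minimal sketch of the SMPTE ST 2084 (PQ) EOTF: each normalized signal value maps to a fixed luminance, so roughly 0.508 of signal is ~100 nits and 0.75 is ~1,000 nits whether the panel tops out at 1,000 or 4,000 nits (a panel that can't reach a coded level has to tone-map or clip, which is a separate problem):

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF: normalized signal -> absolute
# luminance in nits, independent of the display it is shown on.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(e):
    """Normalized non-linear PQ signal (0..1) to luminance in cd/m2 (nits)."""
    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for e in (0.25, 0.5, 0.508, 0.75, 1.0):
    print(f"signal {e:.3f} -> {pq_eotf(e):8.1f} nits")
```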

Also, from the metadata I have for about 90 movies: there are titles that have been mastered at 4,000 nits but have a brightest sub-pixel (MaxCLL) of only 247 nits (GoodFellas)... there are also movies that have been mastered at 1,000 nits but have a MaxCLL of 1,095 nits (The Hunger Games: Catching Fire), or, as another example, a movie mastered at 4,000 nits with a MaxCLL of 6,968 nits (The Magnificent Seven).

Each edition of a movie for a different distribution gets its own mastering, so there will be masters for theatrical release (2K SDR, 4K SDR; some movies additionally get 3D, IMAX, IMAX 3D, Dolby Cinema HDR) and for home release (UHD HDR10, Dolby Vision, Blu-ray, Blu-ray 3D, DVD)... so each release will have its own look, mastered under different specs by the team of colorists/cinematographer/director.

A good article about this: https://www.questia.com/magazine/1P3-3765389451/making-the-most-of-disparate-displays
 