Originally Posted by Iron Mike
0.5 en with 2.4 de will get you the desired final image gamma of 1.2... and using 2.4 in the inverse Rec 709 encoding function will approximate a straight 2.2 gamma curve...
just because the specification has always been a mess, does not mean that post (--> esp. VFX) has not needed to decode camera footage... and (for the lack of a direct specification) a lot of people used the exact inverse encoding function...
and IIRC - for the sake of history - 1.0/2.2 - ca. 0.4545 - was the standard for television camera encoding before the advent of color TVs and was formalized in 1953 with the NTSC broadcast television standards...
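The exponent arithmetic in the quote is easy to verify; a minimal sketch using pure power curves only (ignoring 709's linear toe and scale/offset, so the exponents just multiply):

```python
# Pure-power camera encode and display decode (no linear segment, no offset).
# With these simplifications, end-to-end gamma is just the product of the
# exponents: 0.5 * 2.4 = 1.2, the "desired final image gamma" in the quote.

def encode(L, gamma_enc=0.5):
    """Encode scene-linear luminance L with a pure power curve."""
    return L ** gamma_enc

def decode(V, gamma_dec=2.4):
    """Decode a video signal V with a pure-power display gamma."""
    return V ** gamma_dec

for L in (0.01, 0.18, 0.5, 0.9):
    # decode(encode(L)) == L ** (0.5 * 2.4) == L ** 1.2
    assert abs(decode(encode(L)) - L ** 1.2) < 1e-12
```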
Yes, though I don't know anyone advocating using an inverted 709 gamma curve with 2.4 substituted for the 1/0.45 exponent. That would be odd, though I guess without a standard people will do all kinds of wacky stuff.
And yes, until BT.709, the NTSC standards specified 1/2.2 for the camera encoding. The fact that the current standard includes a scaled power curve with a roughly 1/2.2 exponent (0.45) is yet another source of confusion, because it's not obvious that the overall 709 curve is intended to approximate 1/2; in effect, the standard changed from 1/2.2 to ~1/2.
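You can see this numerically: the 0.45 exponent in the BT.709 formula is pulled toward an effective ~0.5 power by the 1.099 scale and -0.099 offset. A quick sketch (my numbers, computed from the published OETF):

```python
import math

def rec709_oetf(L):
    """BT.709 camera encoding: linear toe below 0.018, scaled 0.45 power above."""
    if L < 0.018:
        return 4.5 * L
    return 1.099 * L ** 0.45 - 0.099

# The "effective exponent" ln(V)/ln(L) of the overall curve at a few
# midtone luminances: it comes out near 0.5, not 0.45.
for L in (0.05, 0.18, 0.5):
    V = rec709_oetf(L)
    eff = math.log(V) / math.log(L)
    # eff lands roughly in the 0.50-0.56 range, closer to 1/2 than to 0.45
    assert abs(eff - 0.5) < abs(eff - 0.45)
```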
Poynton used to talk about scene-referenced gamma and screen-referenced gamma, which was confusing. I think the breakthrough was when he and others realized that once something has been recorded, it's always screen-referenced from that point on. The camera gamma curve is interesting, but the important curve is the one for the screen. Once people got away from trying to figure out where the inverse camera gamma fit into the whole scheme (the answer: nowhere), things fell into place pretty rapidly.
If you look at Poynton's gamma FAQ today, it's kind of all over the map. It could really use an edit to bring it up to his current thinking and add references to BT.1886. He talks about the 2.5 gamma of the electron gun and how TVs should be modeled as L = (V + e) ^ 2.5, but he also talks about the inverse of 709 as being relevant for decoding camera material, which I think he would probably repudiate today. But I'm not sure - maybe I'm misunderstanding him.
In the end, I think for the folks who are calibrating to 2.2, absolute 2.2 (with whatever random black point compensation) and BT.1886 are not going to be that far apart in the midtones and low-midtones. Where BT.1886 is going to help a lot is in smoothing out the shadow detail.

Right now, when you calibrate any real-world monitor to 2.2, you have to decide how to handle the fact that an LCD monitor can't actually display the levels near black at the levels implied by straight 2.2 (or even inverted 709, for that matter). So all the calibration software has a choice of clipping the black detail or putting some kind of knee on the low range, and exactly which knee you use affects the shadows a lot. Switching to BT.1886 should make that situation better.

Of course, it'll also mean that the midtones will shift up and down on different LCD monitors with different absolute black levels, and people will complain about that. People will complain: of this I am sure.
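For anyone who hasn't looked at the spec: BT.1886's reference EOTF folds the display's measured white and black luminance into the curve itself, which is exactly why the near-black region comes out smooth instead of clipped or kneed. A sketch of the published formula (Lw and Lb values here are just illustrative):

```python
def bt1886_eotf(V, Lw=100.0, Lb=0.1):
    """BT.1886 reference EOTF: L = a * max(V + b, 0) ** 2.4.

    a and b are derived from the display's white luminance Lw and
    black luminance Lb (cd/m^2), so the curve hits Lb exactly at V=0
    instead of demanding an impossible zero-luminance black.
    """
    g = 2.4
    n = Lw ** (1 / g) - Lb ** (1 / g)
    a = n ** g                      # gain
    b = Lb ** (1 / g) / n           # black lift
    return a * max(V + b, 0.0) ** g

# With a perfect black (Lb = 0) this collapses to a pure 2.4 power curve:
assert abs(bt1886_eotf(0.5, Lw=1.0, Lb=0.0) - 0.5 ** 2.4) < 1e-12

# V = 0 lands exactly on the display's real black level:
assert abs(bt1886_eotf(0.0, Lw=100.0, Lb=0.1) - 0.1) < 1e-9

# And the midtone shift mentioned above: a display with a higher black
# level produces brighter midtones from the same code value.
assert bt1886_eotf(0.5, Lw=100.0, Lb=0.5) > bt1886_eotf(0.5, Lw=100.0, Lb=0.0)
```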
I'm going to Disney World next week, so I may drop out of the conversation for a while. Hope everyone has a great holiday season and New Year!