Originally Posted by Robbks
I find the target gamma standard issue an incredible one.
I come from a photography/science background for my understanding of colour/gamma/calibration, etc., and it's just common knowledge.
Profile your display for 2.2 gamma and a D65 white point. Simple...
Yes and no:
1) Simple in the sense that for many, many years there really wasn't much the "home viewer" could do about "gamma." ... If you had a CRT, chances are you got something roughly resembling a power-law curve, with a gamma ranging between 2.0 and 2.6 depending upon where you measured, and probably averaging somewhere around 2.4.
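A quick way to see why the measured value depends on where you measure: model the CRT-like response as a power law with a small black offset folded in, then compute the "effective gamma" (log of relative luminance over log of stimulus) at different stimulus levels. The offset and exponent below are illustrative picks, not measurements from any particular display; a minimal Python sketch:

```python
import math

def crt_luminance(v, offset=0.06, exponent=2.4):
    """Normalized CRT-like response: a power law with a small
    black-level offset folded in (illustrative numbers only)."""
    return ((v + offset) / (1.0 + offset)) ** exponent

def effective_gamma(v, offset=0.06, exponent=2.4):
    """'Gamma' as a meter would report it at stimulus v:
    log(relative luminance) / log(stimulus)."""
    return math.log(crt_luminance(v, offset, exponent)) / math.log(v)

for v in (0.1, 0.3, 0.5, 0.9):
    print(f"stimulus {v:.1f}: effective gamma {effective_gamma(v):.2f}")
```

Even though the underlying exponent is fixed at 2.4, the reported "gamma" climbs from roughly 2.0 near black toward the exponent near white, which is exactly why single-number gamma readings disagree depending on the measurement point.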
2) Simple in the sense that even on most modern displays (those with only 2-point or, worse, 1-point CUT and DRIVE controls) you're not going to be able to implement BT.1886 anyway.
In this case, a power-law gamma of 2.2 with linear black-level offset compensation* is probably the best you can do to approximate the BT.1886 function ... depending upon how bad (or good) your black level is. *Oops, just realized you'll need at least a 10-point white balance for that too.
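For concreteness, here is the BT.1886 EOTF (the formula from the Recommendation's Annex 1, where the a and b coefficients are derived from the measured white and black luminance) alongside the power-2.2-plus-linear-offset stand-in described above. The Lw and Lb defaults are placeholder measurements for a display with mediocre blacks, not values from the standard. A minimal Python sketch:

```python
import math

def bt1886(v, lw=100.0, lb=0.1):
    """BT.1886 EOTF: L = a * max(v + b, 0) ** 2.4, with a and b
    derived from measured white (lw) and black (lb) luminance in cd/m^2."""
    g = 2.4
    a = (lw ** (1 / g) - lb ** (1 / g)) ** g
    b = lb ** (1 / g) / (lw ** (1 / g) - lb ** (1 / g))
    return a * max(v + b, 0.0) ** g

def power22_with_offset(v, lw=100.0, lb=0.1):
    """The stand-in: a pure 2.2 power law plus a linear black offset."""
    return lb + (lw - lb) * v ** 2.2

# Both curves hit the same endpoints (lb at v=0, lw at v=1);
# how far apart they drift in between depends on the black level.
for v in (0.0, 0.1, 0.5, 1.0):
    print(f"v={v:.1f}  BT.1886={bt1886(v):7.3f}  2.2+offset={power22_with_offset(v):7.3f}")
```

Running it with a lower lb shrinks the gap between the two curves, which is the point made above: the worse your black level, the more BT.1886 diverges from a plain 2.2 power law, and the more the offset compensation matters.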
A word of advice, from someone who's "been there": before launching a knee-jerk, reactionary assault upon the first actual "gamma" standard ever, perhaps one should first investigate it, and even experiment with it on real-world displays ... particularly ones with poor black levels, where it would make the most difference.
PS: It's probably time to stop calling it "gamma." Gamma is the name of the power-law exponent, not the name of the function.