Ok, I have just been over this with our colour scientist, and no, we are not going to discuss this any further.
We have very valid reasons for the value we use, and we will not be changing it.
Very happy for others to disagree with us, as that is everyone's prerogative.
But we stand by the value we use.
But please understand that this does not limit our users to our values.
Since we do not link profiling with calibration, all users can define their own 'values' for final calibration after the profiling process has finished.
Just to be clear, you say sRGB is 1.95, but you don't have any data to back that up.
You also won't disclose what formula you use to calculate that.
We are wrong, you are right, and the reason you are right is proprietary.
Thank you for the contribution to the forum.
CalMAN Lead Developer
I've always heard that sRGB was 2.2, but from trying to follow this thread, you guys seem to be leaning towards 2.3? I'm calibrating to sRGB mainly for games, so if I'm wrong in assuming it's 2.2, which is the right value to choose?
I believe the authors of sRGB were trying to approximate 2.2. However, the effective exponent of the sRGB reverse transformation varies from 1.0 at black to 2.275 at white. It's about 2.224 at "middle gray" though, as shown in the table below.
| % Stimulus | Relative Luminance | Effective Power |
|---|---|---|
| 4.045% (break point) | 0.0031 | 1.798 |
There are a lot of different ways of computing an "average" exponent on a function like this. One of those methods might well yield a value in the 1.95 range. But based on the table above and graph below, I think a 1.95 gamma would be a poor approximation to the sRGB reverse transformation.
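As a concrete check on the numbers above, here's a short Python sketch (my own, not from any calibration tool) that evaluates the standard sRGB reverse transform and its effective exponent, plus one crude way of "averaging" that exponent. The sample ranges chosen below are arbitrary; the point is just that the answer depends on the averaging scheme:

```python
import math

def srgb_eotf(s):
    """sRGB reverse (decode) transform: stimulus (0..1) -> relative luminance."""
    if s <= 0.04045:
        return s / 12.92
    return ((s + 0.055) / 1.055) ** 2.4

def effective_power(s):
    """Exponent g such that s**g equals the decoded luminance."""
    return math.log(srgb_eotf(s)) / math.log(s)

print(f"{effective_power(0.04045):.3f}")  # break point: 1.798
print(f"{effective_power(0.5):.3f}")      # middle gray: 2.224

# One crude "average": the mean effective exponent over evenly spaced
# stimulus samples. The result depends heavily on the sample range chosen,
# which is the point -- different averaging schemes give different gammas.
# Including more near-black samples pulls the average down toward 1.0.
for lo in (0.01, 0.10):
    samples = [lo + i * (0.95 - lo) / 99 for i in range(100)]
    mean = sum(effective_power(s) for s in samples) / len(samples)
    print(f"mean over [{lo}, 0.95] = {mean:.3f}")
```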
It would not have been that difficult to design the sRGB reverse transformation to hit exactly 2.20 at 50% stimulus rather than 2.224, btw. IMO though, the real target of the sRGB transformation was probably 0.45 (or 1/2.2222...) on the encoding side, as shown on the graph below.
Prior to the implementation of sRGB, most PC graphics applications (excluding Macs and SGI) typically used either 0.45 or 1/2.2 (0.454545...) for gamma correction. My guess is that the designers of sRGB felt 0.45 was the simpler and perhaps more commonly used value, and used that as their guide.
The 0.45 and 1/2.2 values were most likely adopted by the PC graphics industry from the older NTSC camera gamma standard btw.