Originally Posted by SiegeX
Why did the manufacturers of my plasma decide to make gamma dependent on the average picture level?
I'm not familiar with your TV. My TV has a setting that changes overall brightness depending on the APL of the on-screen image, and the basic idea is that it allows for a bigger difference between dark and bright scenes than the TV could otherwise produce. Gamma doesn't necessarily depend on APL, but the way gamma is typically measured is thrown off when the TV changes its brightness with APL.
How exactly does a correct gamma manifest itself as far as PQ is concerned? For example, if I decided to go with an average gamma of 1.9, how would that PQ compare to an average gamma of 2.2? What would/should change exactly? Is it just the max brightness?
To understand gamma I think it's helpful to look at the luminance graph. The luminance graph simply represents measured brightness. The graph in ColorHCFR is normalized, so regardless of whether white measures 20 FtL or 40 FtL, white always shows up on the graph as 100% brightness.
Let's say you open a new measurement file in ColorHCFR and go to the luminance graph. There will be a dotted reference line on the graph. If you mouse over the reference points you can see exactly where they fall between black and white for brightness. For example, at the default 2.22 gamma, a 50% gray has a normalized brightness of 21.46%. If you were to lower the reference gamma to 1.90 (Advanced pull-down menu, Preferences, References tab, Reference Gamma) you would see the normalized brightness of a 50% gray increase to 26.79%.
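If it helps to see where those numbers come from, here's a quick sketch of the math in Python (the reference_luminance helper is just my own name for illustration). The reference curve is simply the stimulus level raised to the power of gamma, with white pinned to 100%:

def reference_luminance(stimulus, gamma):
    # Normalized luminance (0-100%) for a stimulus level between 0.0 (black) and 1.0 (white)
    return 100.0 * (stimulus ** gamma)

print(round(reference_luminance(0.5, 2.22), 2))  # 21.46 -> 50% gray at gamma 2.22
print(round(reference_luminance(0.5, 1.90), 2))  # 26.79 -> 50% gray at gamma 1.90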
My main point in bringing up the luminance graph is to call attention to the fact that gamma is relative to black and white. What matters for gamma is how brightness changes between black and white. For example, if you increase the measured brightness of 50% gray relative to white (going from 21.46% to 26.79%), then you have lowered the gamma for that point. Another way to say it: if white remains the same, a darker 50% gray represents a higher gamma.
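Going the other direction, if black is effectively zero you can estimate a point's gamma from how bright it measures relative to white. This is just a rough sketch (ColorHCFR's actual calculation may handle the black level differently), but it shows why a brighter 50% gray means a lower gamma:

import math

def point_gamma(measured, white, stimulus):
    # Gamma implied by one gray point, measured in the same units as white (e.g. FtL)
    return math.log(measured / white) / math.log(stimulus)

print(round(point_gamma(21.46, 100.0, 0.5), 2))  # 2.22 -> darker 50% gray, higher gamma
print(round(point_gamma(26.79, 100.0, 0.5), 2))  # 1.9  -> brighter 50% gray, lower gamma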
To answer the main question, what I notice with a higher gamma is that the TV comes out of black more slowly. If gamma is too low, the image can begin to look "washed out" as the near blacks come out of black too quickly and become too bright. To combat a lower gamma, you can add more light to the room to offset the brighter image near black.
Is it worth it to get an accurate average gamma of 2.2 even if that means my max luminance is ~20 FtL? Tom Huffman's guide says plasmas should be in the 30-40 FtL range. Things do look considerably dimmer and my contrast ratio has fallen to ~400, but perhaps this is normal and it's just that my eyes need to adjust to these "correct" settings.
In my opinion, any discussion of gamma also has to bring up room lighting. Generally, with more light in the room you will want a brighter white. More light in the room will also tend to make near blacks appear darker, so a lower gamma is also acceptable in a brighter room. From what I remember, whites on my TV have generally measured under 30 FtL, but I also tend to use my TV to watch movies with almost no light in the room.