AVS Forum Special Member
Join Date: May 2008
Location: San Francisco - East Bay area
No, that's not exactly right...
Average gamma is a BAD indicator of what a display's REAL gamma is doing, because a gamma value can be calculated for every point in the grayscale (except 100% and 0%). You don't want a gamma of 2.5 at 90% that falls to 1.8 at 10%. It might AVERAGE 2.2, but images are going to look like crap.

What you want is your target gamma at every step in the grayscale, or as close to it as you can get. (I prefer 2.25 as the target for most "modern" displays, since mastering is done on displays set to 2.2 or 2.3 gamma. No matter WHAT anybody else says, that's what is actually being used during editing/grading and evaluation.) A variation of 0.05 in gamma is fairly difficult to see, so if the gamma stays between 2.2 and 2.3 at every grayscale step, you'll have good-looking images. A variation of 0.1 is moderately visible, so +/- 0.1 (a total range of 0.2) is getting fairly obvious and will impact the appearance of images.
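To make the per-step idea concrete, here's a minimal sketch of how calibration software derives a gamma value at each grayscale step: solve Y/Ywhite = stimulus^gamma for gamma. The meter readings below are made-up numbers for a display tracking roughly 2.25, and the 2.2-2.3 tolerance band is the one described above; the function name and values are mine, not from any particular calibration package.

```python
import math

def step_gamma(stimulus, luminance, white_luminance):
    # Gamma at one grayscale step: solve Y/Ywhite = stimulus**gamma
    # => gamma = log(Y/Ywhite) / log(stimulus)
    return math.log(luminance / white_luminance) / math.log(stimulus)

# Hypothetical meter readings (cd/m^2) at 10%..90% stimulus,
# with white (100%) measuring 100 cd/m^2.
white = 100.0
readings = {0.10: 0.56, 0.20: 2.7, 0.30: 6.7, 0.40: 12.8, 0.50: 21.0,
            0.60: 31.8, 0.70: 44.9, 0.80: 60.5, 0.90: 78.9}

for stim, lum in sorted(readings.items()):
    g = step_gamma(stim, lum, white)
    flag = "" if 2.20 <= g <= 2.30 else "  <-- outside 2.2-2.3 band"
    print(f"{int(stim * 100):3d}%: gamma = {g:.2f}{flag}")
```

Every step in this hypothetical set lands between 2.2 and 2.3, which is the kind of flat tracking you're after, not just a good-looking average.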
Bottom line is that AVERAGE gamma just takes all the per-step gamma values and averages them. You could have half the grayscale at 1.2 gamma and the other half at 3.3 gamma, and the average would be 2.25, but images would be horrendous. You really need to focus on the gamma at each grayscale step and try to minimize the gamma differences between steps.
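A quick sketch of that pitfall with made-up per-step numbers: half the scale tracking 1.2 and half tracking 3.3 averages to a "perfect" 2.25 even though every individual step is badly off target.

```python
# Hypothetical per-step gamma readings: half the grayscale tracks 1.2,
# the other half tracks 3.3.
gammas = [1.2, 1.2, 1.2, 1.2, 3.3, 3.3, 3.3, 3.3]

average = sum(gammas) / len(gammas)   # 2.25 -- looks "calibrated"
spread = max(gammas) - min(gammas)    # 2.1  -- image is actually ruined

print(f"average gamma:        {average:.2f}")
print(f"step-to-step spread:  {spread:.2f}")
```

The average alone says the display is fine; the spread is what tells you the images will look wrong.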
"Movies is magic..." Van Dyke Parks
THX -- ISF -- HAA