Originally Posted by PE06MCG
My second sentence in your quote of my post was in fact not an argument, merely my submission/capitulation to the advance of the digital age, and was not an attempt to continue my 'Luddite' tendencies.
I enjoy my amateur calibration and hopefully will continue to learn.
Incidentally, whilst digital technology is, as you say, a series of 0s and 1s, am I correct in saying its purpose is to emulate analogue technology (including the old-fashioned voltage and presumably valves) and is not necessarily a completely new science?
What are your views on using the term gamma?
Digital video doesn't obviate gamma... at this stage in digital video, gamma is still a necessity. 8 bits (minus codes 0-15, but including 236-254; 255 is "reserved" and supposedly never used) is only enough if you "bend the line" into a gamma curve so that more bits are allocated to highlights to prevent contouring. Someone else (don't recall if it was this thread or a different thread) seemed to indicate gamma was necessary to provide more bits to avoid contouring in SHADOWS... that's not true. In fact, a gamma curve like the one we use allocates fewer bits to shadows. Just replace % white on the x-axis with the corresponding digital levels: the flatter the curve, the more distance along it each bit covers. At the right-hand side of the gamma curve the slope gets steep, and many more bits are allocated to highlights than to any other part of the luminance scale.
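If it helps to see that in numbers, here's a rough Python sketch (my own illustration, not pulled from any spec) that maps 8-bit codes to display light using a simple 2.2 power law and the nominal 16-235 range, then prints the size of the light step between adjacent codes near black, mid-scale, and near white:

# Rough sketch: how 8-bit video codes map to display light under a simple
# 2.2 power-law gamma, assuming the nominal 16-235 video range.
GAMMA = 2.2
BLACK, WHITE = 16, 235   # 8-bit video black and white

def light(code):
    """Normalized display light (0.0 to 1.0) for an 8-bit code value."""
    signal = (code - BLACK) / (WHITE - BLACK)   # 0.0 at black, 1.0 at white
    return max(signal, 0.0) ** GAMMA

# How big is the light step between adjacent codes at different points?
for label, code in [("near black", 20), ("mid-scale ", 126), ("near white", 234)]:
    step = light(code + 1) - light(code)
    print(f"{label}: code {code} -> {code + 1} changes light by {step:.6f}")

The step near white comes out roughly a hundred times larger than the step near black, which is the steep right-hand end of the curve described above.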
So 8-bit digital video is finicky and fussy... if it is not PERFECT through the entire playback chain, you end up with visible problems in the image (most often, contouring). 10-bit digital video is NEARLY foolproof; it can still be screwed up, but it would take a much more egregious error than it takes to upset 8-bit video. Once 12-bit video becomes standard/normal, it would become almost impossible to introduce video problems. Obviously you could still induce problems if you were trying, but it would be pretty difficult for visible problems to exist if you were reasonably competent at creating the content.

At the point 12-bit video becomes the norm (perhaps with Samsung-like internal processing done at 18 bits so controls are useful over their entire range of settings), we could, in theory, do away with gamma and use a linear relationship between digital levels (or % white) and stimulus. That means 50% white would be 50% stimulus (a gamma in the range of 2.2 to 2.3 puts 50% stimulus at about 75% white). And legacy content (everything made prior to the "switchover date") could carry a "gamma flag" telling displays to apply an appropriate gamma curve, while new content would have either no flag or a "no gamma for this content" flag. It would then be simple to manage.
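If you want to check that arithmetic yourself, here's a tiny Python sketch (again just my own illustration, assuming a pure power-law gamma rather than any particular broadcast curve):

# Where does 50% stimulus (light) land on the % white scale for a pure
# power-law gamma?  (Pure 2.2/2.3 power law assumed for illustration.)
for gamma in (2.2, 2.3):
    pct_white = 100 * 0.5 ** (1 / gamma)
    print(f"gamma {gamma}: 50% stimulus sits at about {pct_white:.0f}% white")

# And how many evenly spaced light levels a linear (no-gamma) encoding
# would have at each bit depth:
for bits in (8, 10, 12):
    print(f"{bits}-bit linear: {2 ** bits} levels")

Gamma 2.2 puts 50% light at roughly 73% white and 2.3 at roughly 74%, which is where that "about 75% white" figure comes from; the second loop just shows how quickly the number of available levels grows with bit depth, which is why a linear encoding only becomes realistic once we get to 12 bits or more.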
So gamma COULD disappear in the future, but now is much too soon to get rid of it: we just don't have enough bits in current consumer video to get good results without gamma.