That thought occurred to me as well. However, the last link above suggests that both computer and TV CRTs have about the same gamma of 2.2 (+/- 0.2), and that differences in measurement are due mostly to "user error" in Brightness and Contrast adjustment... which I can somewhat believe, since changing Brightness gives a different gamma reading on my 34XBR800. (FWIW, in typical viewing conditions, with Brightness/black level adjusted for low ambient lighting, my 34XBR800 gives a gamma reading in Pro mode of about 2.25, which is well within the 2.2 +/- 0.2 range.)
The main reason I was interested in average CRT gamma is that I was hoping it might provide some more insight into the correction used on DVDs, and into the difference between DVD video in an overlay and other software applications on my PC desktop. If I had to estimate the difference visually, I would have guessed it was somewhere around 1.25, or possibly higher, rather than just 1.14. Most mainstream DVDs seem to have pretty consistent correction, though, so they seem to be following some kind of standard, maybe something like the ~0.5112 camera correction standard. And I'd guess other apps on my PC are probably using 1/2.2 or ~0.4545 correction. (2.2 is the monitor gamma setting I use in Photoshop to match other desktop applications, so that seems about right.) And the difference between those two correction levels (really the ratio of the exponents, ~0.5112 / ~0.4545) works out to something in the neighborhood of 1.125, which is pretty close to 1.14.
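Just as a quick sanity check on that last bit of arithmetic, here's a minimal Python sketch, assuming the DVD material is encoded with the ~0.5112 camera exponent and the desktop apps with 1/2.2 (both of those are my guesses from above, not confirmed values):

```python
# Ratio of the two assumed encoding exponents gives the effective
# gamma difference between DVD overlay video and desktop graphics.
dvd_encode = 0.5112        # assumed DVD/camera correction exponent
desktop_encode = 1 / 2.2   # ~0.4545, assumed PC desktop correction exponent

ratio = dvd_encode / desktop_encode
print(f"Effective gamma difference: {ratio:.3f}")  # ~1.125, close to 1.14
```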