Rolls-Royce, your testing methodology worked perfectly, and I am forced to eat my hat. Calibrated Gamma of 0 tracks 2.2 perfectly. It's the pre-calibration settings (which I was also able to test) that were off, with blue tracking closer to 2.2 and red and green both tracking closer to 2.4, which I think is what I was seeing. (The funny thing is, both the new Sony and my old LG seemed to handle gamma similarly when uncalibrated!) At any rate, I guess I'll need to either get used to the new 2.2 or switch to 2.4 - still deciding.
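In case it helps anyone following the 2.2 vs. 2.4 discussion, here's a quick sketch of the power-law math behind why the two targets look different. This is just the standard `output = input ^ gamma` relationship in plain Python - illustrative numbers, not measurements from my Sony or anyone's actual calibration:

```python
def display_luminance(signal: float, gamma: float) -> float:
    """Relative luminance (0-1) a display produces for a normalized signal,
    using the simple power-law model: out = in ** gamma."""
    return signal ** gamma

# Compare a mid-gray (50%) signal under both gamma targets:
mid = 0.5
lum_22 = display_luminance(mid, 2.2)  # ~0.218
lum_24 = display_luminance(mid, 2.4)  # ~0.189

# Gamma 2.4 renders the same mid-tones darker (deeper-looking shadows),
# which is why a set tracking 2.2 can look a bit brighter or flatter
# by comparison.
print(f"gamma 2.2: {lum_22:.3f}, gamma 2.4: {lum_24:.3f}")
```

Roughly a 13% luminance drop at mid-gray going from 2.2 to 2.4, which lines up with 2.4 generally being preferred for dark-room viewing.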
Thirdkind, thanks for verifying that the gamma test patterns just don't work. I've tried multiple times (uncalibrated, and then recently calibrated) and couldn't use them at all. You're probably right about the upscaling, and since the days of the brightness setting causing clipping or crush are gone on modern TVs, I'm wondering if that has something to do with it as well.
You make a good point re: source material. I've tested on shows I recently watched, both digital (AppleTV) and DVR, since I'm familiar with how they looked prior to calibration.
Thanks all for all the help figuring this out.