Originally Posted by Phase700B
I would agree, but only to a point. If one were to be totally subjective, then each input device would have to be verified as to whether it tracked all calibration parameters according to the Rec 709 spec you speak of. As soon as you have all devices going through an AVR and one cable, the only device that is really "calibrated" with the TV is the device used for the calibration (usually a DVD/Blu-ray player). I have run Blu-ray players through cal and they all have enough difference to warrant individual calibration. So if an owner has a Blu-ray player, Xbox or PS3, and a cable or satellite box, the only one that is truly calibrated to the input is the one used during the calibration. There is that much difference between devices. And when using a Blu-ray player as the signal source (such as the Sony S550), it also has its own video settings, ergo, those would have to be at some default.
Any device that can itself generate the test patterns (grey scale, R/G/B/C/Y/M) will result in a proper calibration for that device. I always use the blu-ray/dvd player attached to a tv (directly or through an a/v receiver, however it is connected) to calibrate the tv, as that is the device that will be used for source content (movies). It doesn't make sense to generate the test patterns with a different blu-ray player than the one actually used to view movies, or to connect the blu-ray player directly to the tv for the calibration if it is normally connected through an a/v receiver. The signal path should be left exactly as it is normally used.
But for devices like a cable box, DirecTV receiver, etc., there may not be a way to have them generate their own test patterns. The optimal settings generated using the blu-ray player can be applied to other HD sources, like a DirecTV receiver, on the assumption that the DirecTV receiver is also outputting a Rec 709 HD signal. I have never seen a case where settings generated from blu-ray test patterns looked far off when applied to other HD input devices. Granted, the content from channel to channel on DirecTV can be of drastically different quality, but that's a completely different issue.
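The reason those settings transfer is that every compliant HD source encodes to the same Rec 709 targets (same luma coefficients, same D65 white point), so a greyscale dialed in against one source's patterns should track for another. A minimal sketch of that idea, using the published Rec 709 luma coefficients (the function name here is just for illustration, not from any calibration tool):

```python
# Rec 709 luma coefficients, as published in the spec.
KR, KG, KB = 0.2126, 0.7152, 0.0722

def rec709_luma(r, g, b):
    """Luma (Y') of a non-linear R'G'B' triple per Rec 709."""
    return KR * r + KG * g + KB * b

# For any grey patch (R' = G' = B'), luma equals the component value,
# no matter which source device generated the pattern -- this is why a
# greyscale calibration done with one Rec 709 source carries over to another.
for level in (0.1, 0.5, 1.0):
    assert abs(rec709_luma(level, level, level) - level) < 1e-9

# The coefficients sum to 1, so full white maps to full luma.
print(round(rec709_luma(1.0, 1.0, 1.0), 6))
```

Devices still drift from the spec in practice (which is exactly why the individual differences mentioned above show up), but the targets themselves are shared.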
If a blu-ray player has its own video settings (like our BDP-S550 does), those settings should be nulled out prior to calibration anyway since they would of course affect the results.
If I had a system with multiple source devices that could generate test patterns (say a blu-ray player and an Xbox 360 with an HD DVD drive), I would do an initial calibration with one device and then verify it with the other, making any adjustments as necessary, of course. But it's not often people have multiple devices like that anyway.
In my theater, I have another BDP-S550, but I also have two 400-disc DVD changers that are run through a Pioneer VSX-1120 and upscaled from 480i to 1080i. In my own case, I use the blu-ray player to calibrate my projector for it and my DirecTV receiver, but I used a DVD with test patterns from the DVD changer to calibrate my projector for the DVD changers, thus giving me two sets of projector calibrations even though everything runs through my a/v receiver.