Ah, input lag, another little headache to add to the epic migraine of choosing an HDTV.
As a frequent gamer, I've been getting caught up in the mania of lag figures and stopwatch screenshots while trying to choose a 40-46 inch LCD. With the advent of Rock Band 2 and its hardware-based lag calibrator, I've come up with some surprising numbers of my own on my CRT HDTV.
The Rock Band 2 hardware uses a light sensor placed in front of the screen and a series of flashes to measure the time between the console outputting a flash and the sensor seeing it on screen. I'd always assumed that, being a CRT, my Sony Wega HD (model KV-30HS420) had no lag as long as it was sent its native resolution, 1080i, but apparently that's not the case. Here are the results I got when outputting from the Xbox 360 at different resolutions:
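For anyone curious, the principle is simple enough to sketch in a few lines. Something like the loop below, where flash_screen and sensor_triggered are stand-ins I've made up for "render a bright frame" and "the light sensor saw it" (this is just the idea, not Harmonix's actual code):

```python
import time

def measure_lag(flash_screen, sensor_triggered, samples=10):
    """Rough sketch of flash-and-measure calibration.
    flash_screen and sensor_triggered are hypothetical callbacks:
    one asks the game to output a bright frame, the other polls the
    light sensor sitting in front of the screen."""
    deltas = []
    for _ in range(samples):
        start = time.perf_counter()
        flash_screen()                    # output the flash
        while not sensor_triggered():     # wait for the sensor to see it
            pass
        deltas.append(time.perf_counter() - start)
        time.sleep(0.5)                   # let the screen settle before the next sample
    return 1000 * sum(deltas) / len(deltas)  # average lag in milliseconds
```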
1080i: 31 ms
720p: 14 ms
480p: 30 ms
480i: 45 ms
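To put those figures in perspective, a frame at 60 Hz lasts roughly 16.7 ms, so converting them is simple arithmetic:

```python
# Convert the measured lag into approximate frames at 60 Hz.
FRAME_MS = 1000 / 60  # ~16.7 ms per frame

results = {"1080i": 31, "720p": 14, "480p": 30, "480i": 45}
for mode, lag_ms in results.items():
    print(f"{mode}: {lag_ms} ms ~ {lag_ms / FRAME_MS:.1f} frames")
# 1080i ~ 1.9 frames, 720p ~ 0.8, 480p ~ 1.8, 480i ~ 2.7
```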
These were surprising results to me. Even the Rock Band 2 manual suggests a setting of 0 ms when using a CRT, but I don't think they figured on the oft-forgotten HD CRT.
The good part is that I've apparently been playing Xbox 360 with 31 ms of lag since it launched, completely unaware. To me, this means that any LCD with 31 ms of lag or better is fast enough. Comforting; the Samsung 6 series and the Sharp tested by ARogan seem able to get there, albeit only via VGA. However, I'll be playing at 1080p, not the 720p used in those tests, and our measuring methods are completely different, so it would be silly to treat the results as directly comparable. What I need, and am searching for, is a list from other Rock Band 2 players who have run the auto-calibration and posted the figures for their TVs! As it's hardware based, it would be a nice standardized way to compare input lag across a range of current HDTVs. If I do find such a thread, perhaps on the Rock Band forums, I'll post it here.
Aside from this, I'm baffled by my much faster lag result when setting the Xbox 360 to 720p. If this were an LCD I'd understand: sending 720p would mean the TV only has to scale to 1080, not de-interlace and scale. But my CRT HDTV certainly can't display 720p natively (as I recall from their heyday, almost none could), so presumably it has to scale the signal to its native 1080 and, erm, 'interlace' it too. How can it do that in half the time it takes to simply display a signal sent in its native 1080i, with no conversion required? Some of it may come down to the Xbox itself, which people often describe as rendering largely at 720p internally and then scaling to whatever resolution you've chosen in the display settings, which must take some time. This theory doesn't bode particularly well for my hope that, say, feeding a Samsung A630 a signal in its native 1080p will yield faster results than ARogan's 720p test, which the TV obviously had to scale.
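The way I think about it, rightly or wrongly, is that the total lag is just the sum of whatever processing happens on each side of the cable, and the Rock Band measurement only ever sees that sum. The figures below are invented placeholders purely to illustrate the point, not measurements of the Xbox or my Wega:

```python
# Toy model: total input lag as a sum of processing on each side of the cable.
# Every figure here is a made-up placeholder; the point is only that the
# measurement can't tell you how the total splits between console and TV.
def total_lag(console_ms: float, tv_ms: float) -> float:
    return console_ms + tv_ms

print(total_lag(console_ms=15, tv_ms=16))  # one hypothetical way to arrive at ~31 ms at 1080i
print(total_lag(console_ms=4, tv_ms=10))   # one hypothetical way to arrive at ~14 ms at 720p
```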
All of which, of course, leaves me flummoxed until I can find that mythical thread of auto calibration numbers from Rock Band 2 players.