Originally Posted by tommyv2
I stand by my logic: If there's multiple factors/distractions (something moving + audio) AND a user has to have some input and their own timing + reaction, there's no way that's accurate. I've had a friend "calibrate" his Rock Band and then I played and found it way off. That's an opinion, I suppose, but this is AV*S* forum, you know, science!
There are many problems with Rock Band lag calibration; we've gone over this a lot earlier in this thread. One of the major problems is that it deals with three types of lag, not just one, which means too many variables for it to work as a video input lag test. The three variables are:
video lag (typically TV-generated, but it can be console-generated as well)
audio lag (surround receivers can delay the audio due to decoding)
and the least known, controller lag! (there is wide variance in how much lag certain controllers, especially wireless ones, add)
Controller lag + human reaction time + TV input lag = completely worthless numbers.
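To see why the manual calibration result is worthless as an input lag measurement, here is a minimal sketch with hypothetical millisecond values (none of these numbers come from an actual test): the one offset you dial in is the sum of four unknowns, so there is no way to recover the display lag alone from it.

```python
# Hypothetical lag sources, all in milliseconds. The only number we
# actually want is display_lag, but the calibration only ever sees the sum.
display_lag = 45       # TV input lag (the quantity of interest)
audio_lag = 20         # surround receiver decoding delay
controller_lag = 15    # wireless controller latency
reaction_time = 200    # human reaction time, and it varies run to run

# The single offset a manual calibration produces:
measured_offset = display_lag + audio_lag + controller_lag + reaction_time
print(measured_offset)  # one equation, four unknowns -- under-determined
```

Even if the other numbers were stable, reaction time alone swamps the display lag and changes from attempt to attempt, which is why two people "calibrating" the same setup get different results.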
The only Rock Band lag tests that are *somewhat* accurate are the ones that use the sensor built into the guitar and auto-calibrate with it.
The manual calibrations are flawed for obvious reasons: human reaction time is bad. Even a good reaction time exceeds what many people find acceptable as input lag, which is also one of the reasons input lag matters so much: it stacks on top of your own reaction time.
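A quick back-of-the-envelope with hypothetical numbers shows the stacking point: the thresholds and reaction time below are illustrative assumptions, not measurements.

```python
# Hypothetical milliseconds, for illustration only.
good_reaction_time = 180   # a fast human reaction to an on-screen cue
acceptable_input_lag = 30  # a lag figure many gamers would call acceptable
tv_input_lag = 50          # a laggier display

# Even a quick player's reaction time dwarfs the "acceptable" lag budget:
print(good_reaction_time > acceptable_input_lag)  # True

# And display lag adds directly on top of that reaction time:
print(good_reaction_time + tv_input_lag)  # 230 ms from cue to registered input
```

That is the whole point: you can't train away the display's contribution, it just gets added to whatever your hands can already do.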