Originally Posted by tgm1024
First of all, WOW, what a well thought out informative post (and thread!) you started here.
The thing that threw me a little was the analog->digital statement above. I think I'm reading it wrong.
According to you and the Wikipedia link you supplied, the delay culprits are from the digital signal processing, not the conversion from analog to digital. The possible culprits listed were HDCP, Digital Rights Management, and intensive DSP issues for ghosting, etc.
A lot has been learned since this thread was started, and even since the Wikipedia article was last revised.
It's telling that the definition of "input lag" doesn't include the word "input" anywhere in it. It's actually quite difficult to define input lag in terms of input that lags, which is why Wikipedia has renamed the article and now redirects "input lag" to "display lag". While it's interesting to see how readily Wikipedia editors can defy prevailing usage and coin a new term at will, the attempt is fitting here: "input lag" is not very accurate, and it's not yet an industry-standard term, so there's still time to change it.
Both definitions you're looking at cite observations of particular causes of lag on particular displays, though they may not recognize it. Displays process digital and analog signals alike for color correction, upscaling, frame interpolation, noise reduction, edge enhancement, motion blur reduction, motion resolution enhancement, 3D crosstalk reduction, etc. Some displays are much faster or slower than others at performing these tasks, and some don't perform them at all or allow them to be disabled. Any of these tasks can be a factor on some display.

However, only a few processing tasks guarantee a non-trivial amount of lag. Frame interpolation is one: it requires data from at least two frames before processing can even begin, and that processing must complete before display. Otherwise, it is not the particular processing task that makes latency high or low, but the display performing it. D/A conversion, upscaling, HDCP processing, and crosstalk reduction are specific examples of tasks that have been observed to cause lag problems in some, but not all (or even most), displays. On some other displays there is no way to reduce the latency at all; for those it is both difficult and irrelevant to pin the lag on a specific processing technology, because the latency was designed in across the board.
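To put a number on the frame-interpolation point: buffering future frames imposes a lag floor that no faster chip can remove. A rough back-of-the-envelope sketch (the function name and frame counts here are illustrative, not from any spec):

```python
# Illustrative sketch: the *minimum* latency added by frame interpolation,
# which must hold back display until future frame(s) have arrived.

def min_interpolation_lag_ms(refresh_hz: float, frames_buffered: int = 1) -> float:
    """Lag floor in ms from waiting on `frames_buffered` future frames."""
    frame_time_ms = 1000.0 / refresh_hz
    return frames_buffered * frame_time_ms

# At a 60 Hz source, waiting for just one future frame costs ~16.7 ms
# before any time is spent on the interpolation math itself.
print(round(min_interpolation_lag_ms(60.0, 1), 1))  # ~16.7
print(round(min_interpolation_lag_ms(60.0, 2), 1))  # ~33.3
```

That floor is why interpolation-heavy "motion smoothing" modes are the usual first thing to disable in a game mode, regardless of how fast the display's processor is.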
There is no single root cause; the only way to avoid being stung by display lag is to know about the display you're considering, and whether its lag-prone scenarios are important to you.
Actually, there is a single root cause: there is no industry driver for reducing display lag, whereas there are many drivers (specifically, journalist reviews) for increasing the amount of processing done to a signal in order to improve picture quality. As a result, most manufacturers deprioritize or ignore the video-game-playing scenario. I'd like to say "to their folly", but they've been doing it for years, so it must not be hurting enough to notice.