Originally Posted by skschatzman
See, this is where your information is skewed. The importance of low input lag isn't always just about human reaction time. It is more about timing controller input to the visual information at hand. I can easily feel and see the difference between a 52ms display and a lower-input-lag display. When I turn left and right in a 3D game like COD, I can see the lag difference between a 52ms display and my 1ms monitor. Put me in front of whatever scientific measuring equipment you want. I can feel/see the difference 100% of the time. Same with GLL on these displays: I can feel the difference when it is toggled on or off, 100% of the time.
First, there are no monitors with 1ms of input lag. There are monitors with 1ms response time, certainly, but the lowest input lag measured from a monitor is still about 9ms (depending on how you measure; you could argue that no monitor has less than 14ms of input lag using the Leo Bodnar method of testing). That's relevant because, at 60fps (which is what this discussion revolves around), even the fastest display available still carries about 1 frame of input lag. You believe that you can "see" the difference between 1 frame and 3 frames... but I still posit that it's largely psychological. In a blind test, I'm pretty confident you would not be able to distinguish 1/60th of a second from 1/20th of a second; there's enough variance in button/stick travel alone to swamp that gap. The vast majority of people can't detect input lag until about 4-5 frames. And if the P-series displays are measuring 52ms using the Leo Bodnar test or its equivalent, that's more like 2 frames in actual practice.
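To put those numbers in frame terms, here's a quick back-of-the-envelope conversion (just a sketch; the inputs are the figures already mentioned above):

```python
# Rough conversion of measured input lag (ms) into frames at a given refresh rate.
def lag_in_frames(lag_ms, refresh_hz=60.0):
    frame_time_ms = 1000.0 / refresh_hz  # ~16.7 ms per frame at 60 Hz
    return lag_ms / frame_time_ms

for lag in (9, 14, 52):
    print(f"{lag} ms = {lag_in_frames(lag):.1f} frames at 60 Hz")

# 9 ms  -> 0.5 frames (fastest measured monitors)
# 14 ms -> 0.8 frames (lowest via the Leo Bodnar method)
# 52 ms -> 3.1 frames (the P-series figure under discussion)
```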
Originally Posted by DJ Lushious
In other news, water is wet. Anyone with a brain can tell the difference between those two, since 1ms is effectively no lag. It's not the most fair comparison you're making. What about the difference between an LG OLED and your Vizio? Could you tell the difference between them? I'd wager not, since the differences are much smaller. EDIT - at least the difference would be negligible, at best.
I would hope so! Turning on GLL (Vizio's Game Low Latency mode) can, in some cases, cut the input lag by half or more.
What he was trying to say, I believe, is that skschatzman's post is hyperbolic to a fault.
That's exactly what I was saying. It's an overhyped metric. People lean on it for competitive gaming (though even the best fighting game players don't feel a 3 frame difference), but in practice, it's negligible. For 60Hz console gaming, unquestionably. But people believe it's a thing, so... it's a thing.
Originally Posted by skschatzman
Then he should have left human response time out of it, as that information isn't relevant to what everyone means by display input lag. That is my point. If you are going to make an argument, use relevant information.
It's absolutely relevant, because input lag is usually brought up in terms of how lower lag benefits competitive gamers. For gaming purposes, you are reacting to things on-screen. I mention human response time because, for online gaming, most people don't realize that you could have near-zero input lag and still be beholden to the network code. But then, most people don't understand things like hit detection and prediction.

If you're in the same room as me, on a local network, playing on a monitor with 1 frame of input lag compared to my 3 frames, and we both pull the trigger at the same time (which is nigh impossible in practice), the networking code still has to GUESS which of us "shot first" based on the information available. That guesswork can fall anywhere within a 100-200ms window, depending on the game's code. And that's on a local network, where your latency would be nil. Over the internet, you'd add another 40-70ms of latency, depending on your connection to the host.
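For the curious, here's a minimal sketch of what that "guess" often looks like: rewind-style lag compensation, which many shooters use in some form. Everything here (names, the 200ms limit, the tick spacing) is illustrative, not any specific game's code:

```python
# Toy rewind-style lag compensation: the server rewinds the target to where
# the SHOOTER saw it, so "who shot first" depends on latency, not just on
# who pulled the trigger at the earlier wall-clock moment.
REWIND_LIMIT_MS = 200  # game-specific cap on how far back the server rewinds

def rewound_target_position(server_now_ms, shooter_latency_ms, target_history):
    """target_history: {server_time_ms: position}, one snapshot per tick."""
    rewind = min(shooter_latency_ms, REWIND_LIMIT_MS)
    fire_time = server_now_ms - rewind
    past_ticks = [t for t in target_history if t <= fire_time]
    # Hit detection then runs against this past position, not the current one.
    return target_history[max(past_ticks)] if past_ticks else None

# Two players firing "simultaneously" get rewound to different world states
# if their latencies differ:
history = {t: (t // 33, 0) for t in range(0, 1000, 33)}  # target moving right
print(rewound_target_position(1000, 20, history))   # low-latency shooter
print(rewound_target_position(1000, 150, history))  # high-latency shooter
```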
Hell, the game itself may have its own internal detection latency, even in single-player, when certain systems run at a fraction of the targeted refresh rate. A good example would be a game that targets 60fps (16.6ms/frame) but only updates its physics/location data every 33ms, i.e., every other frame.
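That render/physics split typically comes from a fixed-timestep loop. A bare-bones sketch of the 60fps-render / 30Hz-physics case described above (function names are placeholders, not any engine's API):

```python
import time

PHYSICS_STEP = 1 / 30  # physics/location updates every ~33 ms
RENDER_STEP  = 1 / 60  # frames drawn every ~16.6 ms

def update_physics(dt):  # placeholder: move objects, resolve collisions/hits
    pass

def render():  # placeholder: draw the current world state
    pass

def game_loop():
    accumulator = 0.0
    previous = time.perf_counter()
    while True:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        # Physics only advances once a full 33 ms step has accumulated, so an
        # input can wait up to one physics tick before the game "sees" it,
        # even though a new frame is rendered every ~16.6 ms.
        while accumulator >= PHYSICS_STEP:
            update_physics(PHYSICS_STEP)
            accumulator -= PHYSICS_STEP
        render()
        time.sleep(RENDER_STEP)  # crude frame pacing, good enough for a sketch
```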
My point remains: characterizing 3 frames as AWFUL for a 60Hz source is a bit of an overreaction. If you don't agree, that's fine... but people worry about input lag way more than it matters in practice.