Honestly, I was disappointed with the uniformity compared to the Samsung F8000 at Best Buy.
Very dirty. Bjorn's stays away from all-white screens, so it was fine. That's all one can do.
Well, first, before anyone hops on this trivial detail: when I talk about input port speed, I mean the rates accepted as a video feed, not the "undocumented" higher rates some TVs can handle (when driven by a PC, for instance).
But no, you've misunderstood something. Lag is defined as the amount of time between a signal being sent to the TV and the TV actually displaying it. That lag can be much longer than the time between input frames, as in the case of the car wash. The fastest that car wash can handle is 12/hour (5 minutes between each), but the lag is a full hour. There's no issue with the car wash's input outracing its output, nor with the input outracing the TV's ability to output.
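To make the latency-vs-throughput distinction concrete, here's a toy sketch using only the car-wash numbers above (nothing measured from a real TV): cars enter every 5 minutes, each spends an hour inside, yet cars also exit every 5 minutes.

```python
# Toy model of latency vs. throughput: cars enter every 5 minutes,
# each takes 60 minutes to pass through, yet cars still EXIT every
# 5 minutes. High latency does not reduce throughput.
ENTRY_INTERVAL = 5   # minutes between arrivals (12 cars/hour)
LATENCY = 60         # minutes each car spends inside

entries = [i * ENTRY_INTERVAL for i in range(12)]   # arrival times
exits = [t + LATENCY for t in entries]              # departure times

gaps = [b - a for a, b in zip(exits, exits[1:])]
print(gaps)                    # exits are still 5 minutes apart
print(exits[0] - entries[0])   # but each car is delayed a full hour
```

Swap "cars" for "frames" and "minutes" for "milliseconds" and the same arithmetic applies to a TV's processing pipeline.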
You'll routinely see the motion handling of some TVs push the lag into the 60s or higher. That's just a more complicated car wash. Interpolation is where the analogy to cars breaks down: basically, it's like doubling each car mid-stream and outputting them 2 1/2 minutes apart.
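The "doubling each car mid-stream" idea can be sketched in a few lines. This is a deliberate oversimplification (frames here are just numbers standing in for image content, and `interpolate` is my own hypothetical name): between every pair of source frames, synthesize a midpoint, which halves the output interval.

```python
# Sketch of motion interpolation as "doubling each car mid-stream":
# between every pair of source frames, synthesize a midpoint frame,
# halving the interval between outputs. Frames are just numbers here,
# standing in for image content.
def interpolate(frames, interval):
    """Return (new_frames, new_interval) with midpoints inserted."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append((a + b) / 2)   # synthesized in-between frame
    out.append(frames[-1])
    return out, interval / 2

frames, gap = interpolate([0, 10, 20], 5.0)
print(frames)  # [0, 5.0, 10, 15.0, 20]
print(gap)     # 2.5 -- outputs now arrive "2 1/2 minutes" apart
```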
Thanks for the compliment on my analogy.
Motion resolution is hugely affected by the sample-and-hold smearing on the retina. It's a very complicated set of algorithms now involving many variables including, but not limited to:
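Whatever ends up on that list of variables, the sample-and-hold component by itself is easy to put rough numbers on. This is my assumption of the usual rule of thumb, not anything a specific TV documents: while the eye tracks a moving object, a frame held on screen smears across the retina by roughly (tracking speed) times (frame persistence).

```python
# Back-of-envelope sample-and-hold smear: while the eye tracks motion,
# a held frame smears across the retina by roughly
# (tracking speed) x (frame persistence). Numbers are illustrative.
def smear_px(speed_px_per_s, persistence_ms):
    return speed_px_per_s * (persistence_ms / 1000.0)

# 960 px/s motion on a 60 Hz full-persistence display (~16.7 ms hold):
hold_60hz = smear_px(960, 1000 / 60)
# Same motion with a hypothetical 2 ms backlight strobe:
strobed = smear_px(960, 2)
print(round(hold_60hz, 1), round(strobed, 2))  # 16.0 vs 1.92 pixels
```

Which is why strobing/pulsing buys so much motion resolution even at the same frame rate.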
I'm not so sure about the 8ms. 1/120th of a second? I almost doubt it, but can't fully articulate why, because I don't know 1. what the TVs are actually doing in detail (they keep that @#$% a secret), and 2. what the Leo Bodnar device is doing (I mean precisely). Especially since there were a few times that I think FourWude (from displaylag.com) found a 0ms reading from it. Eeeeeeeek. Pretty soon the thing will read -60000 ms and you'll have a TV that displays your first-person shooter's motion a minute before you make it.
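For what it's worth, the arithmetic behind my "1/120th of a second?" double-take:

```python
# One 120 Hz refresh period in milliseconds vs. a flat 8 ms reading.
ms_per_refresh_120 = 1000 / 120
print(ms_per_refresh_120)   # ~8.33 ms, so "8 ms" is just under one
print(1000 / 8)             # 120 Hz refresh; 8 ms exactly is 125 Hz
```

So an 8ms reading is suspiciously close to a single refresh period, which is part of why I wonder what the device is really measuring.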
BTW, if you haven't been following the motion blur thread, you really should. Sorry, I don't remember if I saw you there or not. You'd love Mark Rejhon's most recent blur example. It's very cool.
Which makes sense, or at least is good news. It's pulsing ideally.
By the way, Mark, regarding lag timing: in the CRT case you're talking about, the vertical blanking period is substantial. I (like all graphics engineers, really) used to use that window (triggered by interrupt) to transfer image data to the raster and prevent flicker; double-buffering approaches do that behind your back all the time. The top-to-bottom scan time isn't like it is for digital devices: a CRT is displaying as it receives.
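That vblank trick is simple enough to model. A toy sketch (line counts and names invented for illustration; real code would hook a vblank interrupt rather than poll a line counter): only swap the framebuffer during the blanking interval, so the beam never scans out a half-updated image.

```python
# Toy model of the old trick: only swap the framebuffer during the
# vertical blanking interval, so the scan-out never shows a
# half-updated (torn) image.
SCAN_LINES = 480       # visible lines per frame
VBLANK_LINES = 45      # blanking lines after the visible area

front = [0] * SCAN_LINES     # buffer being scanned out
back = [1] * SCAN_LINES      # buffer being drawn into
swap_requested = True
tearing = False

for line in range(2 * (SCAN_LINES + VBLANK_LINES)):   # two frames
    pos = line % (SCAN_LINES + VBLANK_LINES)
    in_vblank = pos >= SCAN_LINES
    if swap_requested and in_vblank:
        front, back = back, front   # safe: nothing is being displayed
        swap_requested = False
    if not in_vblank and len(set(front)) > 1:
        tearing = True              # a mixed frame would have torn

print(tearing)   # False: the swap waited for blanking
```

Swap outside the blanking window and you get exactly the tearing this dance was invented to avoid.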
In the case of an LCD, though, the top-to-bottom scan time is exceedingly, and unpredictably, fast. I looked quickly (probably too quickly) through your links and didn't see an average timing for it, nor would I expect one of any real value. But there's something more: an LCD doesn't strictly *need* that scanning mechanism. Unlike an analog CRT, the image data is already 100% present by the time the LCD decides to do anything with it, correct? So displays are free to play games with how the LCD "fetches" that information and displays it, no?
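Here's the point in miniature (purely illustrative; real panel timing controllers have their own constraints, and the names are mine): once the whole frame is buffered, rows can be "fetched" in any order and the end result is the same image.

```python
import random

# With a complete frame already in memory, the order in which rows
# are driven out to the panel doesn't change the final image --
# top-down scanning is a convention, not a necessity.
ROWS = 8
frame = [[r * 10 + c for c in range(4)] for r in range(ROWS)]

def scan_out(source, order):
    panel = [None] * ROWS
    for r in order:
        panel[r] = source[r]    # update row r of the panel
    return panel

top_down = scan_out(frame, range(ROWS))
shuffled = scan_out(frame, random.sample(range(ROWS), ROWS))
print(top_down == shuffled == frame)   # True either way
```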