Originally Posted by Chronoptimist
"Lines of resolution" is a completely meaningless test
It was originally thought up because it puts LCD in a very bad light. With a sample-and-hold LCD at 60Hz you get 300 lines of resolution no matter what - this made plasma look a lot better.
The problem is that when you have a 240Hz sample-and-hold LCD, you now have 1200 lines of resolution, beating plasma displays. Guess why they no longer advertise that.
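To illustrate the scaling implied above: for an idealized sample-and-hold display, motion resolution scales roughly linearly with refresh rate, since persistence is 1/refresh. A minimal sketch, using the 300-lines-at-60Hz baseline from the post as the assumed starting point:

```python
# Rough sketch: extrapolate "lines of motion resolution" for an idealized
# sample-and-hold display, assuming linear scaling with refresh rate.
# The 300-lines-at-60Hz baseline is the figure quoted in the post above.

def motion_resolution_lines(refresh_hz: float,
                            base_hz: float = 60.0,
                            base_lines: float = 300.0) -> float:
    """Extrapolated lines of motion resolution for sample-and-hold."""
    return base_lines * (refresh_hz / base_hz)

print(motion_resolution_lines(60))   # 300.0
print(motion_resolution_lines(240))  # 1200.0
```

This reproduces the 60Hz → 300 lines and 240Hz → 1200 lines figures; it ignores real-world limits like LCD pixel response.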
We also have the Panasonic NeoPDPs, which use short-persistence phosphors. Those have WAY more motion resolution.
In addition, some strobe-backlight LCDs have over 10x more motion resolution than a 60Hz LCD. The motion test patterns start to hit their limiting factor.
Technically, if you could get a fine enough pattern, you'd essentially have over 3,600 lines of resolution on a LightBoost LCD, if you linearly scale motion resolution by how much the blur trail shortens (e.g. a 35-pixel motion blur trail shrinking to only 3 pixels at a specific motion speed). But the test patterns for "lines of motion resolution" cap out at a specific number, such as "1080 lines of motion resolution", which is also meaningless.

Computer motion and game motion can be much faster than movie/television motion. In addition, source-based blur is added less frequently to videogame motion. So humans can easily tell apart a display of "2000 lines of motion resolution" (extrapolated from existing standards) versus "5000 lines of motion resolution" (extrapolated likewise) -- but those numbers are meaningless and beyond the limits of a common motion resolution test pattern. Recent vision research has shown that people can tell the difference between 240Hz-equivalence, 480Hz-equivalence and 960Hz-equivalence displays under ideal motion conditions (~4 pixels, ~2 pixels and ~1 pixel of motion blur respectively, for 1000 pixel/sec motion). Virtual reality is an excellent use case. Of course, many displays fall short of these equivalences, due to various scientific factors: phosphor decay; continuous modulation, such as the temporal dithering necessary for DLP, which affects motion resolution; diffusion between adjacent scanning-backlight segments, which also affects motion resolution; or LCD pixel response being too slow to fit within the black frame interval between strobes; etc.
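The persistence-to-blur arithmetic above can be sketched as follows. This assumes an idealized display whose only blur source is persistence (sample-and-hold time or strobe length), with eye-tracked motion; pixel response, phosphor decay, etc. are ignored:

```python
# Sketch of the persistence -> blur-trail arithmetic described above.
# Assumes an idealized display where motion blur comes only from
# persistence (how long each frame stays lit while the eye tracks motion).

def blur_trail_px(speed_px_per_sec: float, persistence_ms: float) -> float:
    """Motion blur trail length in pixels for eye-tracked motion."""
    return speed_px_per_sec * persistence_ms / 1000.0

def hz_equivalence(persistence_ms: float) -> float:
    """Refresh rate whose full-persistence sample-and-hold gives this blur."""
    return 1000.0 / persistence_ms

# For 1000 pixel/sec motion:
print(round(blur_trail_px(1000, 1000 / 240), 2))  # ~4.17 px (240Hz-equivalence)
print(round(blur_trail_px(1000, 1000 / 960), 2))  # ~1.04 px (960Hz-equivalence)
```

This matches the ~4 px / ~2 px / ~1 px figures for 240Hz / 480Hz / 960Hz equivalence at 1000 pixel/sec, and it is why halving persistence halves the blur trail regardless of the actual refresh rate.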
See thread in Display Calibration: Standardizing Motion Resolution: "Milliseconds of motion resolution" is better than "lines of motion resolution"
It's no less legitimate than, say, measuring contrast ratios. Just as "ANSI checkerboard contrast ratio" is fairer (despite still being occasionally problematic, e.g. due to ambient light or halos), "milliseconds of motion resolution" is fairer than "lines of motion resolution". "Motion Picture Response Time" is what vision researchers and scientists use, and the milliseconds method is already used in other scientific papers. It's both subjectively and objectively measurable, at least on an averaged basis -- with test equipment and with test patterns.

Edited by Mark Rejhon - 9/16/13 at 5:36pm