Originally Posted by p5browne
See my posting #68 - Computer settings on the HDMI input are NOT the same as inputting a video source on the same HDMI, such as a Blu-ray player.
But I am not talking about picture settings.
I am asking about benchmarking display motion blur
(separately from just benchmarking ghosting effects, like Raymond Soneira did).
Right now, we're isolating the motion blur down to the display's own sample-and-hold behaviour (as explained in scientific papers). This is still quite relevant, because the sample-and-hold effect (a known major problem in past OLEDs) sets the absolute floor for the minimum motion blur created by the display's own sample-and-hold characteristic.
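To illustrate the sample-and-hold floor, here is a minimal numeric sketch. The `min_blur_px` helper is hypothetical (not a TestUFO tool); it assumes the MPRT-style relationship that eye-tracked blur width in pixels is roughly tracking speed times frame persistence:

```python
# Hedged sketch of the sample-and-hold motion blur floor.
# Assumption: blur trail width (px) ~= tracking speed (px/s) x persistence (s),
# for eye-tracked motion on a full-persistence sample-and-hold display.

def min_blur_px(speed_px_per_sec: float, persistence_ms: float) -> float:
    """Approximate minimum blur trail width, in pixels, from sample-and-hold."""
    return speed_px_per_sec * (persistence_ms / 1000.0)

# A 60 Hz full-persistence display holds each frame for ~16.7 ms.
print(min_blur_px(960, 1000 / 60))   # 960 px/s panning at 60 Hz -> 16.0 px of blur
print(min_blur_px(960, 1000 / 120))  # same panning at 120 Hz -> 8.0 px of blur
```

This is why halving the persistence (higher refresh rate, or strobing) halves the guaranteed minimum blur, regardless of pixel response speed.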
...Video mode motion blur benchmarking -- One can also run certain TestUFO benchmarks (e.g. the panning photo tests) in video mode, with interpolation enabled if you wish, rather than Game Mode, to get a representation of the guaranteed minimum motion blur the display would create from a regular video source in your favourite motion mode (treating the computer input as a regular video).
...Game mode motion blur benchmarking -- For benchmarking computer-based motion blur, use Game Mode / PC Mode -- people do use computers, consoles, etc. with HDTVs too. Computer/game material often moves faster than video and movie material, leading to more opportunity to notice motion blur. Also, some people, like me and thousands of Blur Busters readers, are very sensitive to motion blur (the videophiles of motion blur, much as there are audiophiles, or videophiles about contrast ratio or color gamut). Some people see motion blur more readily than they see colors, even if they're not color blind (remember, about 8% of the population is color blind). Just as color sensitivity varies across the population, motion blur sensitivity varies too, and some of us are quite interested to know about this.
Another thing I am curious about is the black frame insertion feature of some of the OLEDs -- specifically, what the black frame insertion duty cycle is (50%:50% black frame insertion only reduces motion blur by 50%), and whether adjusting brightness adjusts the strobe length of the black frame insertion (e.g. a dimmer picture yielding better motion clarity). It has already been proven mathematically that the guaranteed minimum amount of motion blur is dictated by the length of the sample-and-hold period. Usually, source-based motion blur (e.g. softness in video and movies) is the dominant factor, but during video games and computer use, or even simple things like scrolling, the sample-and-hold effect (as seen at www.testufo.com/eyetracking ) is the dominant cause of motion blur on many displays. This is relevant in an era where people increasingly connect computers and consoles to HDTVs, as such material pushes the motion blur limits of displays harder than video-based material does.
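The duty cycle point above can be sketched numerically. This is a hedged illustration (the `effective_persistence_ms` helper is hypothetical, not a manufacturer spec); it assumes each frame is visible only for the lit fraction of the refresh period:

```python
# Hedged sketch: how black frame insertion (BFI) duty cycle shortens persistence.
# Assumption: effective persistence = refresh period x fraction of time the frame is lit.

def effective_persistence_ms(refresh_hz: float, duty_on: float) -> float:
    """Visible time per frame, in ms, given a BFI duty cycle (duty_on = lit fraction)."""
    return (1000.0 / refresh_hz) * duty_on

full_persistence = effective_persistence_ms(60, 1.0)  # no BFI: ~16.7 ms
bfi_persistence = effective_persistence_ms(60, 0.5)   # 50%:50% BFI: ~8.3 ms

# Halving persistence halves the guaranteed minimum motion blur.
print(full_persistence / bfi_persistence)  # -> 2.0
```

This is why a 50%:50% duty cycle can only ever halve the blur; shorter lit phases (at the cost of brightness) would reduce it further.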
So I reiterate the question: can someone do some real *scientific* motion blur benchmarks on these OLED displays, please? You don't have to use my tests, by the way -- as long as the motion blur measurement takes sample-and-hold into account, such as the MPRT scientific measurement standard (pursuit camera technique). Or is the manufacturer too afraid to let people actually measure motion blur on material (games, computers) that's more demanding than mere video?

Edited by Mark Rejhon - 9/29/13 at 2:18pm