I've done tests -- as have dozens of others via my Blog -- and it is surprisingly clearly detectable under ideal conditions. With 8-bit pixel art (high-contrast, perfect boundaries between pixels) moving at 1920 pixels per second, the blur difference between 1/400sec strobes (4.8 pixels of blur) and 1/700sec strobes (2.7 pixels of blur) becomes clearly noticeable to people sitting in front of a computer monitor at a 1:1 viewing distance to screen width.
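The arithmetic behind those blur numbers is simply eye-tracking speed multiplied by visible persistence (strobe length). A minimal sketch reproducing the figures above:

    def blur_px(speed_px_per_sec, strobe_sec):
        # While the eye tracks a moving object, each strobed frame smears
        # across the retina for as long as it stays illuminated.
        return speed_px_per_sec * strobe_sec

    print(blur_px(1920, 1 / 400))  # 4.8 pixels of blur
    print(blur_px(1920, 1 / 700))  # ~2.7 pixels of blur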
In general video situations (source-based static blur, source-based motion blur, and compression-based blur), this will be impossible to notice.
My point is, 600fps@600Hz isn't even the ultimate frontier.
Yes, most people do not care.
Yes, it usually won't be noticed in video material (except for perfect pans with ultra-fast shutter speeds, viewed while sitting very close).
But the point is: under _ideal_ conditions, running motion tests (such as PixPerAn, or the upcoming Blur Busters Motion Tests) at 1:1 view distance, with 8-bit pixel art, framerate=Hz, and perfect motion moving in exact pixel steps per frame, a 2 millisecond difference in motion blur is _definitely_ noticeable by >50% of the human population. It becomes even easier to see the blur differences if you turn your head while tracking the moving object (to improve eye-tracking accuracy further). But I don't even need to turn my head to notice the difference -- eye tracking alone is good enough to detect 1ms blur differences under ideal conditions.

Useful tracking-accuracy test: Hold up a magazine steadily with both of your hands outstretched. Stand up. Now spin while reading the magazine. Spin at ~30 degrees per second -- that's one revolution every ~12 seconds -- the rough tracking speed needed for 2000 pixels/sec at 1:1 view distance from a 27" monitor.
More than half of the human population is still able to accurately read the magazine text while spinning at this speed. Maybe your reading speed slows down, but you won't be completely unable to read the magazine. These people will see 1ms differences in motion blur under ideal conditions (1:1 view distance from a 27" 1080p monitor, 8-bit pixel art in existing motion test software). Once you're trained to see the differences (like learning to detect judder, or learning to detect DLP rainbows), it's quite easy to see. You do get spoiled by motion clarity once you're used to it (fast panning motion as perfectly clear as a stationary image).
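As a back-of-envelope check (my own geometry sketch, not a calculation from the post, and the exact figure depends on where on the screen the object is), here is the eye-rotation rate implied by tracking 2000 pixels/sec on a 27" 1080p monitor at 1:1 view distance. It comes out somewhat higher than the round ~30 degrees/sec spin figure, so treat the magazine test as a rough stand-in rather than an exact equivalence:

    import math

    # 27" 16:9 1080p monitor, viewed at 1:1 distance (distance = screen width)
    DIAG_IN = 27.0
    H_PIXELS = 1920
    width_in = DIAG_IN * 16 / math.hypot(16, 9)  # ~23.5 inches wide
    view_dist_in = width_in                      # 1:1 view distance
    px_pitch_in = width_in / H_PIXELS

    def tracking_deg_per_sec(speed_px_per_sec, offset_px=0):
        # Instantaneous eye-rotation rate while tracking an object moving
        # horizontally, offset_px from screen center: theta = atan(x/D),
        # so d(theta)/dx = D / (D^2 + x^2).
        x = offset_px * px_pitch_in
        rad_per_in = view_dist_in / (view_dist_in**2 + x**2)
        return math.degrees(speed_px_per_sec * px_pitch_in * rad_per_in)

    print(tracking_deg_per_sec(2000))       # ~60 deg/sec at screen center
    print(tracking_deg_per_sec(2000, 960))  # ~48 deg/sec at the screen edge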
I have Blog readers who told me they wouldn't return to LightBoost=100% (400fps@400Hz motion blur equivalence) after becoming used to LightBoost=10% (700fps@700Hz motion blur equivalence). And that's in regular video games (not the ideal scenario of motion tests). A small, picky segment of the population, but not a non-existent one. I can personally vouch for seeing the motion blur difference between 1/400sec strobes and 1/700sec strobes in actual video games when playing VSYNC ON at framerate=Hz, while strafing sideways (via arrow keys) at high speed in front of high-detail wall textures (e.g. posters on virtual walls). The motion blur difference is less noticeable (or almost unnoticeable) with VSYNC OFF at fluctuating framerates (judder), but it immediately reveals itself when the motion becomes perfectly matched to the refresh rate (e.g. playing old Source-engine games at 120fps on GeForce 600-series or 700-series cards). It becomes even easier to see when you use ultra-high-contrast boundaries.

Good example video game test case: Borderlands 2 (released September 18th, 2012): cartoony, rotoscoped-style graphics with lots of thin, pixel-thick black lines and super-sharp contrasts. My trained eye can instantly see 0.5 millisecond blur differences (while turning or strafing) in THAT game during solo play under these conditions: VSYNC=ON, framerate=Hz, a powerful GPU, in-game view render distance slightly reduced until you get a consistent framerate=Hz with no framedrops, and a gaming-caliber laser mouse (precise enough to eliminate visible microstutters). LightBoost=10% versus 50% versus 100% is very clearly distinguishable to my eyes (1.4ms vs 1.9ms vs 2.4ms strobe lengths -- and yes, I measured them with my oscilloscope, which I also did for my high speed video of LightBoost=100% at 2.4ms). The differences are not night and day (like 60Hz vs 120Hz), but they are easy to identify instantly during a smooth in-game turn or strafe.
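To put those oscilloscope numbers into pixel terms, a small sketch (the 1000 pixels/sec strafe speed is an assumed illustrative figure, not a measurement from the post):

    # Strobe lengths I measured with the oscilloscope, per LightBoost setting:
    persistence_ms = {"LightBoost=10%": 1.4, "LightBoost=50%": 1.9, "LightBoost=100%": 2.4}

    STRAFE_PX_PER_SEC = 1000  # assumed strafe speed for illustration

    for setting, ms in persistence_ms.items():
        blur = STRAFE_PX_PER_SEC * ms / 1000.0
        print(f"{setting}: ~{blur:.1f} px blur trail at {STRAFE_PX_PER_SEC} px/sec")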
Bottom line: During motion under ideal conditions, 600fps@600Hz (sample-and-hold method), or the use of 1/600sec strobes (flicker method), is certainly NOT the final frontier in motion blur for a five-sigma population (i.e. finding a Hz or strobe length short enough that 99.999% of the population, even when _fully_ trained and told what to look for, couldn't detect any display-enforced motion blur in 'perfect' blur-free source material). Even 1/700sec refresh samples don't exceed a single sigma when the humans are pre-trained and the _ideal_ test case (a motion test app) is used. Sure, most of the population won't notice, just like most people won't see 3:2 pulldown judder (until they are trained to see it). But once you are trained to see the blur (and use VSYNC=ON, framerate=Hz, for the 'perfect synchronized motion effect' -- a la Nintendo's 60fps "Super Mario Brothers" silky-smooth pan effect on a CRT as the benchmark of motion perfection), it becomes easy to see 0.5ms differences in motion blur at typical computer monitor viewing distances. Yes, most modern games like Crysis 3 won't permit the perfect synchronized motion effect, but many not-too-old games do at reduced detail settings or on Titan/700-series cards (e.g. Borderlands 2, Bioshock, Portal 2, etc.) -- once you configure a game for perfect refresh-synchronized motion, the tiny differences in motion blur become easy to see once you're trained to do so.
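A side note on why the sample-and-hold and strobe cases land in the same place: with perfect eye tracking, blur scales with how long each frame stays lit, so 600fps@600Hz sample-and-hold and a 1/600sec strobe both amount to ~1.67ms of visible persistence. A sketch, using 2000 pixels/sec as an assumed tracking speed:

    def blur_px(speed_px_per_sec, persistence_sec):
        return speed_px_per_sec * persistence_sec

    print(blur_px(2000, 1 / 600))  # ~3.3 px: 600fps@600Hz hold, or a 1/600sec strobe
    print(blur_px(2000, 0.0005))   # 1.0 px: what a 0.5ms persistence difference adds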
Displays are getting bigger, and many home theater users are viewing displays at 1:1 view distances -- the view distances necessary to easily see (once pre-trained) minor 0.5ms differences in motion blur in perfectly sharp source material such as video games. More and more people are using home theaters to play games too, so there's some overlap there. Video games are an extreme test case for seeing display-based motion blur; mind you, first-person shooters and racing games often make people's eyes track faster than most hockey/football material does. Enough semi-new and many older PC video games exist today that can be configured (VSYNC=ON, framerate=Hz, a sufficiently powerful GPU, and a controller that introduces no visible stutters) to approach the ideal test-case scenario necessary for humans to see (after pre-training) 0.5ms differences in motion blur. Sometimes motion blur is good, natural, and intended, but there are use cases where zero motion blur is the goal.
Future display tech such as OLEDs (flicker-based), blue-phase LCDs (microsecond-speed LCDs), and strobe backlights in regular LCDs (backlight strobes bypassing LCD pixel-speed limitations) can all theoretically, eventually, gain the motion resolution necessary to completely eliminate human-perceptible motion blur for a five-sigma pre-trained population -- possibly using 0.2ms strobes (a guessed number, based on my ability to easily tell apart the motion blur of 1.4ms strobes versus 1.9ms strobes). This is similar to a CRT with a shorter-persistence phosphor versus a CRT with a medium-persistence phosphor (another excellent example of humans seeing millisecond differences in motion blur/phosphor ghosting). At that point, 2000 pixels/sec would produce less than half a pixel of motion blur. Manufacturers may not go for it this decade, but the potential is there to successfully pull this off on a flat-panel technology.
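Sanity check on that guessed 0.2ms figure (the same persistence-times-speed arithmetic as above):

    print(2000 * 0.0002)  # 0.4 px of blur at 2000 px/sec -- under half a pixel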