Originally Posted by MrBonk
It's not just the eye though.
The heart of it lies in the content. Much of the content has that blur (Especially motion blur) to begin with.
Freeze frame during any film or TV with the camera in motion.
That's why Blur Busters focuses on motion blur elimination for computers and video games:
1. Close viewing distances (1:1 viewing distance) -- motion blur becomes easier to see.
2. Ultra high definition (1080p and up) -- motion blur becomes easier to see relative to stationary images.
3. Faster movement speeds (faster than video/movie panning).
4. No content-based motion blur (unlike movies/video), since GPU blur effects are easily disabled in Game Options.
We have run into situations where we even see motion blur with just 2ms of persistence -- which translates to 2 pixels of motion blurring during 1000 pixels/second motion. (Persistence, not GtG -- persistence and pixel transitions are two different things -- see www.testufo.com/eyetracking.)
Tomorrow, it's virtual reality goggles strapped to our faces, where we turn our heads. Here, even 1ms of persistence becomes a problem -- 4 pixels of motion blurring during 4000 pixels/second motion (one full screen width per second on a 4K display). Imagine turning your head and suddenly seeing unwanted display-based motion blur forced upon you. It doesn't matter whether that 1ms of persistence comes from strobing (1ms flashes) or from short frames (1000fps@1000Hz) -- flickerfree 1000fps@1000Hz (1ms persistence per refresh) would still produce human-visible motion blur even if pixel transitions were instantaneous at 0ms. Short-persistence CRTs have even less motion blur than that (e.g. 0.1ms persistence).
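The arithmetic behind these numbers is simple: during eye-tracked motion, the blur trail width equals persistence multiplied by tracking speed. A minimal sketch (the function name is my own illustration, not a Blur Busters API):

```python
def motion_blur_px(persistence_ms: float, speed_px_per_s: float) -> float:
    """Blur trail width in pixels during eye-tracked motion.

    During smooth pursuit, the eye moves continuously while each frame
    is displayed statically, smearing the image across the retina for
    the duration of the frame's persistence.
    """
    return (persistence_ms / 1000.0) * speed_px_per_s

print(motion_blur_px(2.0, 1000))  # 2ms persistence @ 1000 px/s -> 2.0 px
print(motion_blur_px(1.0, 4000))  # 1ms persistence @ 4000 px/s -> 4.0 px
print(motion_blur_px(0.1, 4000))  # 0.1ms CRT-like persistence -> 0.4 px
```

This is why halving persistence halves motion blur, regardless of whether the persistence reduction comes from strobing or from higher framerates.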
And video games / computer use are generally so demanding of low latency that interpolation is verboten unless it can be achieved at low latencies. For example, native 240fps@240Hz could be interpolated to 960fps@960Hz with only ~4.1ms of added latency using 1-frame lookahead. But at current (today's) native refresh rates, interpolation is simply not an option: no supercomputer can accurately interpolate without frame lookahead, which requires buffering, which causes input lag. So many of today's "960Hz" televisions with interpolation don't work very well for video games...
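The lookahead latency figure falls out of the native frame period: interpolating between frame N and frame N+1 means waiting for frame N+1 before displaying anything. A quick sketch of that arithmetic (my own illustration):

```python
def lookahead_latency_ms(native_fps: float, frames_lookahead: int = 1) -> float:
    """Added latency from buffering future frames for interpolation.

    Each frame of lookahead costs one native frame period, because the
    interpolator must hold back output until the later frame arrives.
    """
    return frames_lookahead * 1000.0 / native_fps

print(lookahead_latency_ms(240))  # ~4.17 ms at native 240fps
print(lookahead_latency_ms(60))   # ~16.7 ms at native 60fps
```

At 60fps sources the same 1-frame lookahead costs ~16.7ms, which is why interpolating televisions feel laggy for gaming today.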
Finite/discrete refresh rate displays are always going to cause problems for media that needs to be blur-free, such as video games and virtual reality -- and eventually tomorrow's holodecks.
Longer run, more exotic solutions -- dynamic higher-speed refreshing of the screen region where the eye is pointing, or refreshrateless display technologies (continuous movement without the need for interval-based refreshing) -- may save the day. Zero strobing, zero phosphor decay, zero laser scanning, perfectly continuous light. But the refreshrate-driven display metaphor will stick with us for a long time yet.
The good news is manufacturers are FINALLY starting to recognize computer/game-friendly strobe backlights, to fill these needs:
- NVIDIA LightBoost
-- the one that started it all! -- unofficial for 2D
- NEW: NVIDIA G-SYNC's optional strobe mode
-- Official "sequel" to LightBoost
- NEW: Eizo Turbo240 Mode (FG2421)
-- official strobe backlight
- NEW: BENQ Blur Reduction Mode (XL2720Z)
-- official strobe backlight
- Samsung 120Hz 3D Mode
-- unofficial for 2D
- Sony Motionflow Impulse
-- 60Hz interpolation-free low-latency mode available in Game Mode on certain models.
As you can see, we've come a long way -- and what I said last year was prescient. LCDs that beat CRT motion clarity now exist (at least for LightBoost=10%, benchmarked against a Sony FW900 CRT -- see testimonials). My prediction from one year ago was correct: the day of CRT-motion-quality computer monitors -- that aren't CRTs -- has finally arrived.
These ultra-high-efficiency strobe backlight displays are vastly superior to scanning backlights in terms of motion quality, as they have none of the inefficiencies of scanning backlights.
As we approach the Holodeck towards the end of this century (bigger displays, bigger FOV, bigger resolutions, less content motion blur, more human-natural motion blur, less display-forced motion blur), eliminating unavoidable display-based motion blur through various technologies becomes necessary.
Strobe backlights are only a band-aid stopgap for now, simply because we can't yet achieve low persistence with steady-light-output displays (non-light-modulated, zero flicker even under a high-speed camera). So for now we need scanning, strobing, phosphor, subfields, or another form of light modulation -- and strobe backlights are a great, inexpensive improvement to today's LCDs, now that it's become much easier to bypass LCD pixel-speed limits (on modern/fast panels) as the motion-blur limiting factor for games/computers. Order-of-magnitude motion blur improvements are now routinely possible with fast-response LCDs through ultra-high-efficiency backlight strobing. A good big incremental step to enjoy while waiting for OLEDs to mature.
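To make the strobing tradeoff concrete, here's a minimal sketch (my own illustration, not any manufacturer's formula): with a strobed backlight, persistence equals the flash length rather than the refresh period, and the brightness penalty before backlight boosting is roughly the flash's duty cycle.

```python
def sample_and_hold_persistence_ms(refresh_hz: float) -> float:
    """Flickerfree (sample-and-hold) persistence: the full refresh period."""
    return 1000.0 / refresh_hz

def strobe_duty_cycle(flash_ms: float, refresh_hz: float) -> float:
    """Fraction of each refresh period the backlight is lit.

    Shorter flashes mean lower persistence (less motion blur) but a
    dimmer image, unless the backlight is driven brighter during the
    flash to compensate.
    """
    return flash_ms / (1000.0 / refresh_hz)

# Flickerfree 120Hz holds each frame for ~8.3ms of persistence;
# a 1.4ms strobe flash cuts persistence to 1.4ms at ~17% duty cycle.
print(sample_and_hold_persistence_ms(120))
print(strobe_duty_cycle(1.4, 120))
```

This is why strobe modes sidestep LCD pixel-speed limits: the backlight stays off while pixels transition, so only the brief flash reaches the eye.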