
LCD motion blur: Eye-tracking now dominant cause of motion blur (not pixel transition/GtG) - Page 5

post #121 of 184
Quote:
Originally Posted by Mark Rejhon View Post

CROPPED_LightBoost50-1024x341.jpg

At 120fps@120Hz, a 1/30second camera exposure captures 4 refreshes. All 4 refreshes are stacked on each other, because the pursuit camera is moving in sync with the 120fps@120Hz moving object at a 1/30second camera exposure. The brief backlight flash prevents tracking-based motion blur.

There is extremely little leftover ghosting caused by pixel persistence (virtually invisible to the human eye), since over 99% of pixel-persistence ghosting and overdrive artifacts are kept unseen while the backlight is turned off between refreshes. The backlight strobe flash, measured at 1.5ms by TFT Central, is more than 90% shorter than a 60Hz refresh (16.7ms). The LightBoost 10% setting uses 1.5ms strobe flashes, while the LightBoost 100% setting uses 2.4ms strobe flashes. Either is still far shorter than even a 120Hz refresh (8.3ms)! As a result, motion clarity on a LightBoost monitor is comparable to a CRT display.

Observe the leftmost end of the bottommost image. You can see a very faint dot to the left of the line of white dots (in the red base). That's an example of the tiny remnant of pixel persistence the human eye sees. This pursuit-camera photograph proves that the vast majority of pixel persistence (roughly 99%) is successfully hidden from the human eye.

Please explain these physics.
I have sent you a PM with a VIP invite to the Blur Busters Motion Tests.

Couple of verifications I need:

1. Is the camera film or digital? This by itself doesn't necessarily matter, but I'd like to know. Discrete mechanisms (even assuming a shutter held wide open for 1/30th of a second) can cause beat frequencies against each other, depending upon exactly what is being done to produce the 1/30th-of-a-second effective shutter. Even CCD and CMOS arrays are scanned internally, I think, so I'm simply curious about that.

2. When the backlight strobe length is measured by TFT Central, is that for the entire frame, or for each of the individual pixels? That is, is it possible in this scenario for the pixels to be flashing for a much shorter time than the entire frame seems to be? Is the top of the frame drawn and held there while the rest of the frame is drawn, or does it have a much briefer strobe?

All that aside, isn't what you've just shown proof that by limiting the pixel persistence you get a clearer picture?


post #122 of 184
Thread Starter 
Contrast enhancement of the LightBoost photograph, showing the ultra-faint pixel persistence that's still seen by the human eye. It's almost below the noise floor. Most of the pixel persistence (transitions and overshoots) is hidden during the time period the backlight is turned off.

faint_pixel_persistence-246x300.jpg
Quote:
Originally Posted by tgm1024 View Post

1. Is the camera film or digital? This by itself doesn't necessarily matter, but I'd like to know. Discrete mechanisms (even assuming a shutter held wide open for 1/30th of a second) can cause beat frequencies against each other, depending upon exactly what is being done to produce the 1/30th-of-a-second effective shutter. Even CCD and CMOS arrays are scanned internally, I think, so I'm simply curious about that.
The camera is a Casio EX-FC200S. Any common film or digital camera can be used; they all yield the same motion blur result at the same camera exposure, provided you properly synchronize the camera motion to the moving object onscreen (within an accuracy of +/-1 pixel for the duration of the exposure). I've confirmed there's no difference; I also tried an old Panasonic Lumix.
Quote:
2. When the backlight strobe length is measured by TFT Central, is that for the entire frame, or for each of the individual pixels? That is, is it possible in this scenario for the pixels to be flashing for a much shorter time than the entire frame seems to be?
At the LCD panel level, the pixel transitions take longer than the backlight flash. However, the pixel transitions take place in the dark. Marc Repnow (a vision scientist) reverse engineered LightBoost and found it uses a vertical blanking interval of approximately 5 milliseconds. See his discoveries as well as his HardForum post. (Frames are buffered to allow accelerated scan-outs, which in turn allow a long blanking interval.) The refresh is scanned out fast (top to bottom) in total darkness, to allow a bigger idle period between refreshes.

5ms -- backlight turned off; LCD is being refreshed in total darkness. Pixel transitions occur in the dark.
2ms -- backlight turned off; idle period, to wait for the last pixel transitions to settle in total darkness.
2ms -- backlight flash; fully refreshed frame seen by human eye.

This corresponds to one refresh cycle (8.3ms) at 120Hz. During longer strobe flashes, the backlight flash actually overlaps the next refresh slightly, causing slightly increased ghosting at the edges of the screen (a few percent more visible, only on the top/bottom 10% of the screen, and not objectionable). During shorter strobe flashes, the backlight flash fits completely within the artificially lengthened blanking interval between refreshes. Thus, by this technique, a good strobe backlight on a fast LCD prevents more than 99 percent of pixel persistence from becoming visible to the human eye.
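As a rough sanity check of this timing budget, here is a minimal sketch. The 5ms/2ms/2ms figures are the approximate values described above (they slightly exceed the nominal 8.3ms cycle, consistent with the note that longer strobes can overlap the next refresh):

```python
# Rough budget of one 120Hz LightBoost refresh cycle, using the
# approximate timings described above (assumed values, not measured).
REFRESH_HZ = 120
cycle_ms = 1000 / REFRESH_HZ      # ~8.33 ms per refresh

scanout_ms = 5.0   # backlight off: accelerated top-to-bottom scan-out
settle_ms = 2.0    # backlight off: idle wait for pixel transitions to settle
strobe_ms = 2.0    # backlight flash: the only part the eye ever sees

visible_fraction = strobe_ms / cycle_ms
print(f"cycle = {cycle_ms:.2f} ms; the eye sees only {visible_fraction:.0%} of it")
```

The point of the arithmetic: roughly three quarters of every refresh cycle happens in total darkness, which is where the pixel transitions hide.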

There is definitely some vertical non-linearity in ghosting; e.g. the top and bottom edges have more remnant pixel persistence than the center of the screen. That's an unavoidable situation caused by the top-to-bottom scan-out of the LCD versus the all-at-once stroboscopic flash. But this is no longer a motion blur bottleneck, unlike scanning backlights, which still have unavoidable light leakage between on and off segments. Only strobe backlights (given a sufficiently fast LCD) currently make unbounded improvement in motion clarity possible.
Quote:
All that aside, isn't what you just shown proof that by limiting the pixel persistence you get a clearer picture?
Therein lies the confusion. The LCD has the same pixel persistence in all of these photographs; it's the same LCD in every photo. The LCD isn't accelerating pixel transitions by 10x. (Yes, it's accelerating the top-to-bottom refreshing, but the individual pixels still take the same amount of time to transition.) The backlight is simply hiding the pixel transitions by being turned off between refreshes.

There is little difference in the LCD panel's pixel transition speed (overdrive tweaks) between the 60Hz, 120Hz and LightBoost photographs. All of the photos were taken using the same camera on the same computer monitor. The physics of the LCD remain the same; it doesn't transition pixels faster in 120Hz LightBoost than in plain 120Hz (i.e. the pixel persistence, from the LCD's point of view, is unchanged). Marc Repnow mentioned to me that if you open up a LightBoost computer monitor and force the backlight to shine continuously during LightBoost mode, the motion blur instantly comes back -- including pixel persistence/overdrive artifacts, which immediately become far more visible to the human eye. Further proof that turning the backlight off is successfully hiding pixel persistence. This A/B test of overriding the strobing proves it.

The fact that the backlight is able to eliminate motion blur in this A/B test, also proves the following:
Quote:
Recently, with today’s faster LCD’s, pixel persistence is now only a minor factor in motion blur.

Edited by Mark Rejhon - 5/19/13 at 7:18pm
post #123 of 184
Let's take this offline (off thread) for now. It requires way too much back and forth here to be productive, and I'm very interested in a bunch of what still seems vague. We'll chat soon! Thanks for the PMs.
post #124 of 184
Thread Starter 
Quote:
Originally Posted by tgm1024 View Post

Let's take this offline (off thread) for now. It requires way too much back and forth here to be productive, and I'm very interested in a bunch of what still seems vague. We'll chat soon! Thanks for the PMs.
I prefer to educate hundreds of people publicly in forums and blogs (efficient education) rather than educate one person privately (inefficient education) -- unless it's paid work, of course!

If sending PM, please keep your questions short enough that I can reply to them in less than 5 minutes. Otherwise, I prefer to discuss publicly.

I hereby invite you to do the following steps:
1. Buy a LightBoost monitor. Supported monitors are listed in the LightBoost HOWTO.
2. Download a motion test such as PixPerAn. (Or use Blur Busters Motion Test if you have an account).
3. Run the motion test. Observe the motion blur with your eyes.
4. Enable LightBoost.
5. Run the motion test again. Observe the lack of motion blur with your eyes. The ghosting/overdrive almost completely disappears.*

*Some LightBoost monitors (e.g. VG278HE instead of VG278H) don't do as good a job of eliminating the visibility of pixel transitions (e.g. 95% gone instead of 99% gone). I also have a BenQ XL2411T, which does an excellent job of hiding pixel transitions.

If you want to actually run a true scientific A/B test with exactly the same LCD refresh pattern (the LightBoost-tweaked overdrive algorithm), then do this too:
6. Open the LightBoost monitor and make the backlight shine continuously while in LightBoost mode.
7. Run the motion test again. Observe the return of motion blur (and ghosting/overdrive artifacts) with your eyes. (unchanged LCD panel pixel persistence)
8. You can run A/B test by disconnecting/reconnecting the backlight strobe feature, while keeping the LCD panel refresh behavior unchanged. Even while the monitor is still running!
9. Observe that whenever the backlight shines continuously, there's a lot of motion blur and ghosting (like regular 120Hz). But if you let the backlight strobe once per refresh, the motion blur and most LCD transition artifacts disappear -- including most of the ghosting/overdrive.

This proves: "Recently, with today’s faster LCD’s, pixel persistence is now only a minor factor in motion blur."
For Blur Busters Blog purposes, "LCD pixel persistence" means "physical pixel transitions within the LCD panel, independently of whether the backlight is ON or OFF". Some sites, including prad.de and PixPerAn, have tended to use this terminology. Others standardize on "pixel transitions" terminology. If this is the confusion, then now we're clear!

That's all, folks. I gotta focus on family and work!
Edited by Mark Rejhon - 5/19/13 at 2:45pm
post #125 of 184
Thread Starter 
Confusion is resolved over PM.
"pixel" = A single controllable LCD element, independent of whether it's seen by the human eye or not (e.g. backlight turned off)
"pixel persistence" = The transition of an LCD element from one state to a different state, independent of whether it's seen by the human eye or not (e.g. backlight turned off)

These terminologies have precedent:
"pixel" can be used from the perspective of a digital technology, e.g. an array of CCD pixels, even if the CCD chip is not currently capturing pictures, or when an LCD is not currently visible (e.g. backlight off).
"pixel persistence" has long been used in many contexts, such as PRAD's Pixel Persistence Analyzer.

That said, Blur Busters is going to address this potential terminology confusion, such as by posting a Blur Busters Glossary Page, to keep things consistent. The phrase "pixel response time" is more common nowadays, and may be clearer.

"Recently, with today’s faster LCD’s, the LCD panel's native pixel response time is now only a minor factor in motion blur."
Edited by Mark Rejhon - 5/19/13 at 7:25pm
post #126 of 184
^What he said.

Yeah, it's a good idea to start with known stakes in the sand. I was trying to come up with various symbolic ways of representing this stuff, but it's been done 100 times already I'm sure.

The confusion came in (in my case) because of backward comparisons to CRT persistence of phosphor, which by itself is a crummy analogy: in the CRT case, phosphor persistence is the only thing remotely synonymous with "pixel persistence", and being based upon phosphor excitation it always refers to light flying out. So Mark was referring to LCD element state with or without light, and I thought he was referring to the pixel as viewed (with light coming out of it). LCD element persistence, perhaps? "Response time" is also well known, as you pointed out.
post #127 of 184
Thread Starter 
What's so exciting to me about LCD today is that it has become possible for the motion clarity of LCD to become unbounded from the LCD's own panel limitation (provided certain stringent requirements are met). When these requirements are successfully met, the LCD panel's native pixel response is no longer a factor in motion blur. Pixel transitions complete in total darkness, all the way down below the noise floor of human vision at normal viewing distances, before the backlight is strobed.

For all practical purposes (to human vision), it becomes a true impulse-driven display to the human brain. It doesn't matter that parts of the technology are intrinsically, unavoidably sample-and-hold; the teamwork between the LED backlight and the LCD panel is all that matters.

That said, other limitations apply (e.g. imperfect black levels are an unavoidable limitation, and there are faint vertical non-linearities in picture quality caused by the different ages of pixel transitions under the LCD scan-out versus the full backlight strobe). Some of the transitions are faintly visible, as seen, and some GtG transitions are worse than others, but they are otherwise impossible to see in most scenery (ultra-faint 3D crosstalk too). There can be interplay between the strobing and LCD inversion, causing amplified inversion artifacts (but not on all LightBoost monitors; it's a non-issue on the XL2411T, while far more so on the VG278HE). Also, some LightBoost monitors are better than others. Older LightBoost monitors, such as the XL2420T, have more visible ghosting, while newer ones, like the VG248QE, can have nearly perfectly hidden pixel transitions (at least across a large middle band of the screen, with slight faint ghosting at the top/bottom edges).

The pursuit camera photos truly show how successfully the panel's own pixel transitions can be hidden. For many competitive FPS video gamers, the remaining (and created) artifacts are not issues.
Edited by Mark Rejhon - 5/21/13 at 7:55am
post #128 of 184
Thread Starter 
I've created a new graph to compare the amount of motion blur between different common LightBoost and non-LightBoost modes:

motion-blur-graph.png

It's now added to PHOTOS: 60Hz vs 120Hz vs LightBoost.
It shows that even the worst 100Hz LightBoost mode has far less motion blur than 144Hz non-LightBoost.
post #129 of 184
Mark, to your knowledge, which LED or LCD TV panels will include LightBoost, or is it a PC monitor thing applicable to video graphics only?

thank you,

G
post #130 of 184
Thread Starter 
Quote:
Originally Posted by GmanAVS View Post

Mark, to your knowledge, which LED or LCD TV panels will include LightBoost, or is it a PC monitor thing applicable to video graphics only?
Some HDTV's do. The problem is that the 60Hz native refresh rate causes a lot of flicker for pure backlight-only motion blur reduction technologies. Video standardization to 120Hz can't come soon enough for fans of impulse-driven displays.

Some Sony HDTV's (e.g. the HX950) include a LightBoost-like mode called Motionflow Impulse, which strobes the backlight (once per refresh) without using interpolation. Unfortunately, this flickers like a 60Hz CRT for 60Hz signals. There are also scanning backlights, which are like strobe backlights but strobe segments of the screen at a time (usually sequentially).

As we know, we need the once-per-refresh flash for the most efficient motion blur elimination. The lower the number of unique frames, the lower the Hz of flashing needed to eliminate motion blur. That is why interpolation is done -- to create unique frames (even if "faked") to add more Hz. By itself, that reduces motion blur down to the sample-and-hold limit (e.g. 100Hz = 10ms of sample-and-hold). To go further without raising Hz, you can shorten the visible length of frames by adding black periods between refreshes (e.g. CRT flicker, plasma flicker, black frame insertion, backlight strobes, etc.), reducing motion blur even further.
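The sample-and-hold arithmetic above can be sketched in a few lines. This is a simplified model (perceived blur trail ≈ eye-tracking speed × visible frame duration); the 960 px/s panning speed is just an illustrative number:

```python
def motion_blur_px(speed_px_per_sec, visible_ms):
    """Width of the perceived blur trail when the eye tracks motion:
    each frame is smeared across the time it stays visible."""
    return speed_px_per_sec * visible_ms / 1000.0

speed = 960  # pixels/second panning speed (illustrative)
print(motion_blur_px(speed, 1000 / 60))   # 60Hz sample-and-hold: 16 px trail
print(motion_blur_px(speed, 1000 / 100))  # 100Hz sample-and-hold (10 ms hold)
print(motion_blur_px(speed, 1000 / 960))  # 1/960 s strobe: ~1 px trail
```

This is why shortening the visible portion of each frame (strobing) reduces blur just as effectively as adding more unique frames.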

So to eliminate the 60Hz flicker, that's where interpolation comes in on these HDTV's, in addition to scanning backlights. The "960" HDTV's (which simulate the motion clarity of 960Hz) -- such as Sony Motionflow XR 960 or Samsung Clear Motion Ratio 960 -- utilize interpolation to go to 240Hz, then use 1/960sec strobes (240 times per second per scanning backlight segment) to go the rest of the way. See existing HDTV scanning backlight technology (Sony, Samsung, Panasonic, etc.). Interpolation has a lot of compromises/problems for video games and computer use, due to the input lag it creates.

What makes LightBoost special is that it is a computer-friendly strobe backlight that is very efficient at eliminating motion blur, AND without the input lag of interpolation, AND without creating noticeable flicker (for most people) thanks to 120Hz. There are, of course, some image quality tradeoffs (lower contrast ratio, slight vertical non-linearities).

There are many FPS video game players who value motion clarity above color quality. (Examples of FPS games include Quake Live, Team Fortress 2, Counter-Strike, Call of Duty, etc.) For the most perfect motion (120fps@120Hz) without blur, 120Hz creates a disadvantage: stringent demands on GPU horsepower to pull off the frame rates necessary to match the refresh rate. A lot of users on HardForum, OCN, etc. favour the high-end GeForce 600-series, 700-series or Titan graphics cards, sometimes multiple GPU's running in parallel (SLI). Otherwise, repeated refreshes (e.g. 60fps @ 120Hz) add back perceived motion blur (and/or artifacts, such as a double-edge effect similar to 30fps @ 60Hz).

So in a nutshell:
60Hz native video sources -- Impulse-driven displays with too much flicker (like a 60Hz CRT), *unless* you also add in interpolation
120Hz native video sources -- Impulse-driven displays with less flicker (like a 120Hz CRT), making interpolation less necessary
Edited by Mark Rejhon - 6/6/13 at 7:59am
post #131 of 184
Quote:
Originally Posted by Mark Rejhon View Post

As we know, we need the once-per-refresh flash for the most efficient motion blur elimination. The lower the number of unique frames, the lower the Hz of the flashing needed to eliminate motion blur.

Yes. And this is being completely ignored by the manufacturers. They're producing 120Hz monitors with ever higher strobed repeats. And each flash is showing up "in the wrong place" on the retina... the more of them, the closer you get to the smear. Again, as you state endlessly, this is not with interpolation.

The only potentially necessary reason for it that I can think of is if the screen just isn't bright enough when strobing with too short a flash. Additional strobes can bring the brightness level back up.
post #132 of 184
Repeat strobes are bad but then again interpolation lags down gamers. I suspect interpolation will never do for games.
But interpolation takes what, 4-6 frames...why couldn't you add a couple more processors, get it down to one frame and then...
watch those ugly artifacts.
post #133 of 184
Thread Starter 
Quote:
Originally Posted by tgm1024 View Post

Yes. And this is being completely ignored by the manufacturers. They're producing 120Hz monitors with ever higher strobed repeats. And each flash is showing up "in the wrong place" on the retina... the more of them, the closer you get to the smear.
Regularly spaced pulses of the same brightness, at multiple per refresh, produces PWM motion artifacts as follows:

pursuitcam_pwm.jpg
(From LCD Motion Artifacts 101).

As you track your eyes continuously across the screen, the multiple pulses put copies of the same frame on different parts of your retina, which produces the multiple-edge sensation. (A similar edge-repeating phenomenon during panning is often observed at 30fps@60Hz on CRT/plasma, and at 24fps@96Hz on plasma when interpolation is turned off. Also at 24fps@48Hz strobing on old film projectors.)

If you cluster the flashes closer together, with bigger black gaps between the individual frames, motion blur gets reduced. Having fewer bright pulses spaced closer together, like Panasonic's 2500Hz Focused Field Drive, does solve a lot of that repeat-edge issue. Good motion-blur-reducing plasmas try to avoid too many bright subfield pulses spaced too far apart, and some add motion-compensated subfield refreshes -- e.g. interpolated 600Hz -- to help reduce plasma motion blur even further, from the repeated subfield flickers. Mind you, even the Panasonic 2500 FFD is still hamstrung by the 5ms-8ms phosphor decay limitations of red/green phosphor, so you get the yellow ghosting effect for sufficiently fast motion. (Although plasmas have vastly better color, LightBoost monitors now actually outperform plasma in terms of motion clarity -- thanks to the very clean one-strobe-per-refresh they use.)
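A quick way to see the repeated-edge spacing: during eye tracking, each extra flash of the same frame lands offset by the distance the eye travels between flashes. A hypothetical sketch (function name and numbers are illustrative):

```python
def duplicate_edge_spacing_px(speed_px_per_sec, flash_hz):
    """Offset between repeated copies of the same frame on the retina,
    when a frame is flashed repeatedly at flash_hz during eye tracking."""
    return speed_px_per_sec / flash_hz

# 30fps content double-flashed at 60Hz: copies of each frame land
# 16 px apart at a 960 px/s tracking speed -- the double-edge effect.
print(duplicate_edge_spacing_px(960, 60))
```

Clustering the flashes (raising the effective flash rate within a frame) shrinks this spacing until the copies merge, which is the repeat-edge mitigation described above.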
Quote:
Originally Posted by borf View Post

Repeat strobes are bad but then again interpolation lags down gamers. I suspect interpolation will never do for games.
But interpolation takes what, 4-6 frames...why couldn't you add a couple more processors, get it down to one frame and then...
watch those ugly artifacts.
Some interpolation algorithms already reduce it down to roughly one or two frames of lookahead, but there can be compromises in interpolation quality, since you need good lookahead and lookbehind information to interpolate efficiently.

Throwing more processors at it won't solve the problem. Online competitive gamers can feel an 8ms lag: when both parties press fire at the same time, the one whose shot registers first wins. It's like crossing the finish line 8 milliseconds sooner. There won't ever be lag-free interpolation, due to the mathematical lookahead requirement. Milliseconds can matter a huge deal. (In addition, elimination of motion blur can improve reaction time too. LightBoost is known to add a very minor amount of input lag, from waiting for the LCD to refresh in the dark before strobing -- on average less than one frame of lag -- so there's a tradeoff between slight input lag and elimination of motion blur via full-strobe backlights.)

If you had real time 1000fps at real time 1000Hz, interpolation and flicker would not be necessary.
Edited by Mark Rejhon - 6/7/13 at 11:15am
post #134 of 184
Quote:
Originally Posted by Mark Rejhon View Post

there can be compromises in interpolation quality, as you need good lookahead and lookbehind information to efficiently interpolate.

Best case is to always let the GPU interpolate. It has the 3D information a 2D TV can only guess at (and get wrong). But if new games forever leapfrog the newest GPU's, will TV interpolation be the only strategy?

Quote:
Originally Posted by Mark Rejhon View Post

Online competition gamers can feel an 8ms lag -- so there's a trade off between slight input lag and elimination of motion blur via full-strobe backlights

160ms reaction time at best here so not a problem. Detecting 8ms would be superhuman.
But good to know interpolation is down to 1ms (wouldn't be that sony you've mentioned would it?)
post #135 of 184
Thread Starter 
Quote:
Originally Posted by borf View Post

Best case is to always let the GPU interpolate. It has the 3D information a 2D TV can only guess at (and get wrong). But if new games forever leapfrog the newest GPU's, will TV interpolation be the only strategy?
This is correct, that the GPU can theoretically be more efficient at interpolating. But it is still input lag.
Quote:
160ms reaction time at best here so not a problem. Detecting 8ms would be superhuman.
Borf, do you understand that it is not a reaction-time problem?
It is a cross-the-finish-line problem.

Scenario:
1. First-person shooter game (e.g. Counter-Strike, Team Fortress, Quake Live, Battlefield 3). Two people playing online against each other.
2. Two people with exactly the same 200ms reaction time.
3. They shoot at each other at exactly the same moment they see each other in the video game.
4. The person with the display with less input lag (the rest of the system being identical, including Internet) can be the winner here.

This is real life; it actually happens statistically: the average online scores of equally-skilled people with higher input lag tend to be lower than those of people with less input lag (good Internet, good mouse, good display with less motion blur and less input lag, fast GPU with high framerate, etc.). Good skills can compensate for input lag, but when evenly-matched skills and reaction times meet, that's where input lag matters. One frame at 60Hz -- 16ms -- can be a massive chasm that turns a 160ms reaction time into a 176ms effective reaction time, making it harder for you to out-compete someone else of the same skill as yours. A very laggy display and too many buffer layers (e.g. double buffering, buffering through a scaler, etc.) can turn your 160ms reaction time into a 250ms-300ms effective reaction time. Triple ouch. In such a scenario, how does a 160ms person handicapped to 300ms shoot first, before another person with a 160ms reaction time, if you both attempt to draw at the same time? That's why gamers hate interpolation -- even one frame.
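To make the cross-the-finish-line point concrete, here is a toy calculation with two players of identical reaction time but different total pipeline lag (all numbers hypothetical):

```python
# Toy "cross the finish line" model: identical 160 ms reaction times,
# different display/pipeline lag (hypothetical numbers in milliseconds).
def shot_registers_at_ms(reaction_ms, pipeline_lag_ms):
    """Time from target appearing to the shot registering."""
    return reaction_ms + pipeline_lag_ms

player_a = shot_registers_at_ms(160, 16)  # one extra 60Hz frame of lag
player_b = shot_registers_at_ms(160, 8)   # lower-lag display
print("B wins" if player_b < player_a else "A wins")  # B wins by 8 ms
```

Reaction time never enters the comparison; with equal skill, the lag delta alone decides who crosses the line first.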

Again, it is not a reaction-time problem but a cross-the-finish-line-first problem.
The problem even affects you.
Quote:
But good to know interpolation is down to 1ms (wouldn't be that sony you've mentioned would it?)
That's not what I said -- The new Motionflow Impulse mode does not use interpolation. There's no interpolation that fast (1ms).
Edited by Mark Rejhon - 6/7/13 at 3:58pm
post #136 of 184
Input lag is surely a debated topic (Guitar Hero) and I dislike it almost as much as you. Good point about the cumulative lag effect and that anybody can benefit from just a few milliseconds in a "cross-the-finish-line scenario". A question to ask is how often one encounters that scenario.
post #137 of 184
Quote:
Originally Posted by borf View Post

160ms reaction time at best here so not a problem. Detecting 8ms would be superhuman.
Actually, display latencies are very noticeable, especially when you're playing PC games with a mouse at high framerates.

The issue is not whether your reaction time is 160ms at best, the issue is that your reaction time has now been increased to 168ms, and actions in the game lag behind your physical inputs.
In PC gaming where you want to be using a mouse that is polled at 1ms, that has no acceleration or other "processing" and inputs are 1:1, latency is incredibly noticeable.

Here's a demo from Microsoft that shows just how easy it is to notice latency: http://www.youtube.com/watch?v=vOvQCPLkPt4

Now imagine that rather than a small box displayed on the screen, it's your whole view of the "world" that is being delayed like that.

Reaction times are only one factor when it comes to latency, and not even the most important one as far as I'm concerned. (I don't play multiplayer games much any more)
Edited by Chronoptimist - 6/8/13 at 8:46am
post #138 of 184
Great video... never thought you could see 10ms lag. I agree one shouldn't minimize the lag issue, but giving up a few milliseconds of lag to something like LightBoost seems worth the CRT motion you get. Not an ideal scenario, though.
post #139 of 184
Thread Starter 
Quote:
Originally Posted by borf View Post

Great video... never thought you could see 10ms lag. I agree one shouldn't minimize the lag issue, but giving up a few milliseconds of lag to something like LightBoost seems worth the CRT motion you get. Not an ideal scenario, though.
Yes, this is an excellent video of differential lag.

Also, in fact, it's possible to see a 1ms differential if you're watching two fast-moving objects. For example, at 1000 pixels/second movement, the object would lag behind by 1 pixel. If the DPI is low enough (e.g. 72dpi) and your pointing device is sharp (e.g. a stylus instead of a finger), then a 1ms input lag differential would definitely still be noticeable during fast stylus movement on a 1000Hz low-resolution touchscreen: it lags behind by 1 pixel... There are laws of diminishing returns here, but differential lag is much easier to detect when you are watching two frames of reference (e.g. finger versus screen) that can be used as measuring sticks against each other.
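The 1-pixel arithmetic here is simply lag converted to distance at the tracked speed; a one-line sketch:

```python
def lag_offset_px(speed_px_per_sec, lag_ms):
    """How far a moving object trails its true position for a given lag."""
    return speed_px_per_sec * lag_ms / 1000.0

print(lag_offset_px(1000, 1))  # 1000 px/s with 1 ms lag -> 1.0 px offset
```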

Fortunately, LightBoost doesn't give you the opportunity to see differential input lag -- you're not watching two objects measured against each other (e.g. finger versus screen). So you can't feel the input lag difference (provided you've followed the new ToastyX LightBoost technique, the lowest-input-lag method of enabling LightBoost for 2D). The old LightBoost enable method had >1 extra frame of lag, but the new ToastyX LightBoost enabling method (2D-only, VSYNC OFF allowed, Control+T no longer necessary) reduces input lag even further.

That said, the effective LightBoost input lag is approximately 5-6ms on average (roughly ~4ms for the bottom edge, ~8ms for the top edge). The blanking interval between refreshes is artificially lengthened during strobe backlight operation. Half a frame is buffered, then an accelerated LCD scan-out executes (catching up to the input signal), followed by a pixel transition waiting period, before the all-at-once strobe (reverse engineered by Marc Repnow, aka StrobeMaster, a vision researcher in Europe).

This minor increase in input lag is small enough that the improved reaction time from the elimination of motion blur can greatly outweigh it for many people, and to many people it feels like less input lag. In a game like Battlefield 3, you're running while you're shooting, or you're doing a high-speed flyby in a helicopter. The motion blur makes it harder to identify far-away enemies. There are reports of better scores with LightBoost than without, including this user and this user (look at those graphs!). Having a full order of magnitude less motion blur during fast motion makes a real big difference for many gamers. Motion blur can hide a semi-camouflaged enemy, so the lack of motion blur gives you a competitive advantage.



Look at the red vertical line at the right edge. The user enabled LightBoost just shortly after 2012-12-19. His game stats improved steadily afterwards, thanks to the elimination of motion blur, despite the average +5ms LightBoost input lag penalty. This is not the only one. There are other charts/graphs of other users who enabled LightBoost, and noticed that their scores improved.

NOTE: Technically, it's a +6ms delta, if calculated based on StrobeMaster's data.
However, it's really effectively a +5ms delta to the human eye. This is because we're comparing against a directly-visible pixel rise time (a 2ms TN LCD pixel transition takes approximately 1ms before the pixel has visibly transitioned close to its new color), so we add a 1ms offset for that. Comparing a slowly rising pixel (approaching its final color value) versus a delayed-but-instantaneously-flashed pixel (already near its final value) can be challenging from an input lag perspective, since the start of a traditional LCD pixel transition is not visible to the human eye until a tiny bit of time has passed.
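The +6ms-versus-+5ms bookkeeping above amounts to subtracting the time before a normal GtG transition becomes visibly underway (both figures are the post's assumed values):

```python
# Effective LightBoost lag delta, per the reasoning above (assumed values).
measured_delta_ms = 6.0     # strobed vs. non-strobed delay (per StrobeMaster's data)
visibility_offset_ms = 1.0  # time before a normal GtG transition is visibly underway
effective_delta_ms = measured_delta_ms - visibility_offset_ms
print(effective_delta_ms)   # 5.0
```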
Edited by Mark Rejhon - 6/8/13 at 7:27pm
post #140 of 184
Quote:
Originally Posted by Mark Rejhon View Post

Fortunately, LightBoost doesn't give you the opportunity to see differential input lag -- you're not watching two objects measuring against each other (e.g. finger versus screen). So you can't feel the input lag difference
When you are using a proper gaming mouse - one which is using an optical sensor so there's no acceleration and your inputs are displayed 1:1 on the screen, you absolutely feel that differential lag in action - you don't need to have something moving in front of the screen to notice it. Even when moving the mouse cursor around on the desktop you can feel the disconnect between your inputs and the cursor moving on-screen.

Considering how drastic the difference in motion handling is with Lightboost enabled, and how minimal the latency impact is, it's a trade-off worth making - but you can feel the difference when using a mouse, rather than it being limited to touch.
Now if you're using a wireless controller and sitting further back from the display, I can see an argument being made that latency is far less noticeable in that situation, as it's a more "disconnected" experience.
post #141 of 184
Thread Starter 
Quote:
When you are using a proper gaming mouse - one which is using an optical sensor so there's no acceleration and your inputs are displayed 1:1 on the screen, you absolutely feel that differential lag in action
You are right -- however, 5ms isn't usually directly felt by itself for these situations (eye-hand-coordination delays, rather than visible touchscreen delays) -- it's the accumulation that you directly feel, in the whole input lag chain:

[Image: input lag chain diagram]

--versus--

[Image: input lag chain diagram]

(Source: AnandTech: Exploring Input Lag Inside & Out)
A gaming mouse can reduce input lag quite a lot (I have a Logitech G9x -- a 1000Hz gaming mouse -- and it makes a BIG difference in games and in Windows use, from sheer accuracy). A regular mouse adds about +7ms more lag than a gaming mouse: a common 125Hz mouse means 8ms between position reports, while a 1000Hz gaming mouse means only 1ms between reports. There are also bigger sources of lag to fix. As you can see, running at only 30 frames per second adds approximately +33ms of input lag, while running at 100 frames per second adds only about +10ms. Frame rate has a massive impact on input lag, as do other things such as buffering layers. This is why some competitive gamers like an uncapped framerate and VSYNC OFF -- letting the game run at reduced detail levels at 500 frames per second, even on a 120Hz monitor, means each displayed frame was rendered only 1/500th of a second ago, giving a competitive advantage in a "cross the finish line first" scenario. So the insane framerates you hear about aren't silly (although not every gamer does that; some prefer to match framerate to refresh rate for better fluidity). That said, even 1ms can give you a winning advantage, even if you can't "feel" it.
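As a rough back-of-envelope sketch of two of the lag contributors above -- mouse polling interval and frame age -- using the post's own numbers (this is an illustration of the arithmetic, not a measurement):

```python
# Two lag contributors from the input lag chain discussed above.

def poll_interval_ms(poll_hz):
    """Delay between consecutive mouse position reports."""
    return 1000.0 / poll_hz

def frame_age_ms(fps):
    """Approximate lag added by rendering at a given frame rate."""
    return 1000.0 / fps

print(poll_interval_ms(125))   # 8.0 ms between reports (regular mouse)
print(poll_interval_ms(1000))  # 1.0 ms (gaming mouse)
print(frame_age_ms(30))        # ~33.3 ms
print(frame_age_ms(100))       # 10.0 ms
print(frame_age_ms(500))       # 2.0 ms -- why uncapped framerates can help
```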

But yes, you're right, the 5ms can be the "straw that broke the camel's back" to actually directly feeling the lag.
However, if the rest of your chain is really good, most people won't feel the +5ms delta directly.
That said, for the most part, the lack of motion blur from a strobe backlight reduces human reaction lag enough to compensate for LightBoost's ultra-minor contribution to the input lag chain.
Edited by Mark Rejhon - 6/10/13 at 5:31am
post #142 of 184
Thread Starter 
I've created an excellent web-based animation that demonstrates eye-tracking-based motion blur.
You're probably viewing this AVSFORUM post on your existing LCD computer monitor, iPad, or laptop, so it's just an easy click:

TestUFO Animation: Eye-Tracking Motion Blur

You need a VSYNC-capable web browser such as IE10+, Chrome 16+, Opera 15+, Safari 6+, or Firefox 24+ (pre-beta).
The common IE9 and Firefox 22 releases won't work smoothly. The preferred web browser is Chrome, as it also supports 120fps@120Hz.
* This is a beta web site. To diagnose bugs with animations; please send me a PM.

Traditional sample-and-hold LCDs will make this animation look like a checkerboard. The animation looks very different on CRT, plasma, LCD, and DLP. DLP produces interesting temporal dithering artifacts if you use dark or pastel shades. Plasma shows patterns caused by subfield refreshes. LCD and non-flickering OLED show simple motion blur. CRT and LightBoost look nearly identical, showing thin broken lines. Sony Trimaster OLED (PVM) shows a reduced checkerboard effect (7.5ms of motion blur), somewhere in between CRT and traditional LCD. This temporal test pattern is a rather interesting demonstration of how eye-tracking brings out motion artifacts in displays.
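The checkerboard-vs-blur behavior follows from a simple rule: during smooth eye tracking, perceived blur width is roughly pixel visibility time multiplied by motion speed. A minimal sketch, using persistence figures from this thread and an assumed tracking speed of 960 pixels/second (the speed is an assumption for illustration):

```python
# Tracking-based motion blur: blur width ~= persistence * eye-tracking speed.

def blur_px(persistence_ms, speed_px_per_sec):
    """Approximate perceived blur width in pixels during eye tracking."""
    return persistence_ms / 1000.0 * speed_px_per_sec

speed = 960  # pixels/second (assumed tracking speed for this example)
for name, ms in [("60Hz sample-and-hold", 16.7), ("120Hz sample-and-hold", 8.3),
                 ("Trimaster OLED", 7.5), ("LightBoost 10%", 1.5)]:
    print(f"{name}: ~{blur_px(ms, speed):.1f} px of tracking blur")
```

This is why a 1.5ms strobe looks CRT-sharp (about 1.4 px of blur at this speed) while a 60Hz hold-type display smears the same motion across roughly 16 px.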
Edited by Mark Rejhon - 7/17/13 at 11:38am
post #143 of 184
Quote:
Originally Posted by Mark Rejhon View Post

I've created an excellent web-based animation that demonstrates eye-tracking-based motion blur.
You're probably viewing this AVSFORUM post on your existing LCD computer monitor, iPad, or laptop, so it's just an easy click:

TestUFO Animation: Eye-Tracking Motion Blur

You need a VSYNC-capable web browser such as IE10+, Chrome 16+, Opera 15+, Safari 6+, or Firefox 24+ (pre-beta).
The common IE9 and Firefox 22 releases won't work smoothly. The preferred web browser is Chrome, as it also supports 120fps@120Hz.
* This is a beta web site. To diagnose bugs with animations; please send me a PM.

Traditional sample-and-hold LCDs will make this animation look like a checkerboard. The animation looks very different on CRT, plasma, LCD, and DLP. DLP produces interesting temporal dithering artifacts if you use dark or pastel shades. Plasma shows patterns caused by subfield refreshes. LCD and non-flickering OLED show simple motion blur. CRT and LightBoost look nearly identical, showing thin broken lines. Sony Trimaster OLED (PVM) shows a reduced checkerboard effect (7.5ms of motion blur), somewhere in between CRT and traditional LCD. This temporal test pattern is a rather interesting demonstration of how eye-tracking brings out motion artifacts in displays.

 

I LOVE THAT THING!  Very cool animation.  I'll have to think some about the details of it.  But that's really really cool that you can pull that off with a browser!

 

I didn't know that browsers were so capable of vertical sync control and detection like that!  Awesome!  Thanks!!!!

post #144 of 184
Thread Starter 
Quote:
Originally Posted by tgm1024 View Post

I LOVE THAT THING!  Very cool animation.  I'll have to think some about the details of it.  But that's really really cool that you can pull that off with a browser!

I didn't know that browsers were so capable of vertical sync control and detection like that!  Awesome!  Thanks!!!!
Yes, Blur Busters has launched the world's first web-based precision motion tests (announcement), thanks to modern web browsers. I even convinced Mozilla to support 120Hz VSYNC in Firefox, now in the Version 24+ pre-beta. This is great when you run these motion tests on a computer outputting 120fps to a 120Hz monitor.

Don't forget to try these other tests:

- TestUFO: Black Frame Insertion Demo
- TestUFO: 15fps vs 30fps vs 60fps (adds "vs 120fps" if you are using a 120Hz monitor)
- TestUFO: Scrolling Text at 30fps vs 60fps
- TestUFO: Moving Photo

Over the course of this year, I will be adding extra selectable tests in the top selector, including moving test patterns.
Once it exits Beta, I'll make a wider announcement.
Edited by Mark Rejhon - 7/17/13 at 6:13pm
post #145 of 184
Fantastic site Mark, thanks for posting it. Your examples - particularly the scrolling photographs - are some of the best examples I have seen to illustrate just how much of an improvement even motion interpolation on its own (without backlight scanning) can make.
It's funny that so many people argue it doesn't make a difference. Clearly they have not seen a proper demo.
My set has 240Hz processing, and you can optionally enable backlight scanning for 480Hz, if I recall correctly (it's a few years old now).

Even just enabling interpolation on its own is a stark difference compared to 60Hz, and motion is extremely sharp once you enable either of the backlight scanning options. (I was going to say "perfectly sharp" but you would probably disagree wink.gif)
Unfortunately my set does not allow you to use backlight scanning on its own (it's an HX900, and it was the HX920 that introduced Impulse mode), but it is the best local-dimming set they have produced so far, so that's a compromise I am willing to make. (The Sharp UV2A panel it uses is better than the ones in the HX920/950.)
This means I can get interpolation artifacts and it is unsuitable for gaming, and there are some minor artifacts on the leading and trailing edges at high speeds. (overdrive?)


It's too bad that TVs aren't going to reach where LCD monitors are now for a long time.
In the next few years they'll be focusing on selling OLED for its "perfect motion" based on its switching time, and not on retinal persistence (which is where strobed LED backlights win).
post #146 of 184
Quote:
Originally Posted by Mark Rejhon View Post


Traditional sample-and-hold LCD's will make this animation look like a checkerboard. This animation looks very different on CRT

I get no checkerboard on my CRT at any speed. This is a good thing, right? FYI, I'm not getting smooth motion at 120Hz. I have a good card, and Firefox 24.0a2. The only other problem is judder interfering with the black frame insertion test, due to the fps/Hz differential. Running it with fps=Hz might avoid people crying foul.
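The fps/Hz judder borf describes can be illustrated with a hypothetical simulation (not TestUFO's actual code): when the framerate doesn't divide evenly into the refresh rate, content frames occupy an uneven number of refresh cycles, producing a visible cadence.

```python
# Judder from fps/Hz mismatch: count how many refresh cycles each content
# frame occupies when frames are snapped to the display's refresh ticks.

def refreshes_per_frame(fps, hz, n_frames=120):
    """Return the number of refresh cycles each content frame is held for."""
    counts = []
    for i in range(n_frames):
        start = round(i * hz / fps)        # refresh tick where frame i appears
        end = round((i + 1) * hz / fps)    # tick where the next frame appears
        counts.append(end - start)
    return counts

print(set(refreshes_per_frame(60, 120)))  # every frame held 2 refreshes: smooth
print(set(refreshes_per_frame(24, 60)))   # mix of 2 and 3: 3:2-style judder
```

Running the test with fps equal to the refresh rate makes every frame occupy exactly one refresh, which is why fps=Hz avoids the complaint.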
post #147 of 184
Quote:
Originally Posted by Chronoptimist View Post

In the next few years they'll be focusing on selling OLED for its "perfect motion" based on its switching time, and not retinal persistence.

I'm sure they will do that, and it'll be meaningless.
post #148 of 184
Quote:
Originally Posted by borf View Post

I get no checkerboard on my CRT at any speed. This is a good thing, right? FYI, I'm not getting smooth motion at 120Hz. I have a good card, and Firefox 24.0a2. The only other problem is judder interfering with the black frame insertion test, due to the fps/Hz differential. Running it with fps=Hz might avoid people crying foul.
It doesn't play smoothly in Firefox for me either (25.0a1), though that is at 60Hz -- it keeps dropping one frame each time it crosses the screen. IE10 works fine.
post #149 of 184
Quote:
Originally Posted by Chronoptimist View Post

Quote:
Originally Posted by borf View Post

I get no checkerboard on my CRT at any speed. This is a good thing, right? FYI, I'm not getting smooth motion at 120Hz. I have a good card, and Firefox 24.0a2. The only other problem is judder interfering with the black frame insertion test, due to the fps/Hz differential. Running it with fps=Hz might avoid people crying foul.
It doesn't play smoothly in Firefox for me either (25.0a1) though that is at 60Hz - it keeps dropping one frame each time it crosses the screen. IE10 works fine.

 

Borf: yes, not getting a checkerboard on a CRT is as expected.

 

Chron: FF has been pissing me off incrementally for some time now.  If it weren't for the fact that I can't live without a few of the FF addons, I'd have switched to Chrome a long time ago.  Chrome, BTW, is by far the most popular browser out there---and it took me by surprise that it happened "that quickly".  Chrome runs that thing flawlessly.

post #150 of 184
Quote:
Originally Posted by tgm1024 View Post

Chron: FF has been pissing me off incrementally for some time now.  If it weren't for the fact that I can't live without a few of the FF addons, I'd have switched to Chrome a long time ago.  Chrome, BTW, is by far the most popular browser out there---and it took me by surprise that it happened "that quickly".  Chrome runs that thing flawlessly.
It may be one of the more popular "alternative" browsers out there, but its market share has started going down again recently. Frankly, I dislike Chrome, and I hate the trend of all browsers moving towards WebKit now.
  • I don't trust any browser created by an advertising company that has proven they have no regard for privacy laws. (though that may be irrelevant if you are a US citizen)
  • Chrome skips a lot of things about rendering quality for the sake of speed. It's probably been fixed since I last used it (a few months ago) but font rendering used to be horrible in Chrome. Nothing compares to the color management and font rendering of Firefox - especially once tweaked for your display & preferences.
  • The Firefox team have been doing great things for performance and efficiency for some time now. I keep hearing it parroted that Chrome uses so much less memory than other browsers etc. but that has not been true for a long time. In fact Chrome has become bloated, and may only appear to use less resources through obfuscation.
  • Chrome's "better stability" due to process separation doesn't actually seem to help. I don't think I have ever seen one window/tab crash in Chrome and not take down the rest of the browser as well.

Firefox just recently won Tom's Hardware's latest performance review: http://www.tomshardware.com/reviews/chrome-27-firefox-21-opera-next,3534-12.html
And there is still nothing which has the flexibility of Firefox. I know that most people don't care, but for power users, it's great being able to tweak just about any aspect of the program to work the way you want it to, and it allows for more powerful extensions than Chrome does.

Surprisingly, I actually don't mind the direction that Internet Explorer has been heading recently. IE10 is a pretty decent browser these days, though I'm still not a huge fan of its UI.
But IE10 has Flash built in, which means that I don't need an actual Flash installation on my system, so it's useful to have around when I encounter a site that still requires it rather than using HTML5 video.


What are the Firefox extensions you can't live without? For me, the main ones are Adblock Plus (Chrome's Adblock is not the same), Cookie Monster, DownThemAll, NoScript, and RequestPolicy.