
LCD motion blur: Eye-tracking now dominant cause of motion blur (not pixel transition/GtG) - Page 3

post #61 of 184
Quote:
Originally Posted by Mark Rejhon View Post

Via following "LightBoost HOWTO" (to enable the strobe backlight mode in 2D) (requires nVidia graphics card)

- BENQ XL2411T (best; near zero crosstalk, lowest input lag)
- ASUS VG248QE (best; near zero crosstalk)

I take it that for 120Hz PC gaming, the BenQ XL2411T and ASUS VG248QE are indeed the best currently available? Are there any other models expected to replace or outperform them in the next 6 months? Is the XL2420 the replacement model for the 2411?


post #62 of 184
Thread Starter 
Quote:
Originally Posted by rgb32 View Post

I take it that for 120Hz PC gaming, the BenQ XL2411T and ASUS VG248QE are indeed the best currently available? Are there any other models expected to replace or outperform them in the next 6 months? Is the XL2420 the replacement model for the 2411?
Yep, the XL2411T and the ASUS VG248QE are the very best non-CRT video game monitors ever made for motion quality in fast-action gaming -- FPS games. They were introduced only weeks ago, and there's great buzz in all the gaming forums (e.g. teamfortress.tv, quakelive.com, etc.) with rave reviews for these monitors, even without LightBoost enabled. They are also the only 1ms monitors on the market right now.

Both XL2410 and XL2420 are older models (2ms). The XL2411T is a newer model (1ms), which just came out last month (December). The ASUS VG248QE (1ms) just came out at the end of January 2013.

The telltale sign is to look for a "1ms" LCD. Although "1ms" is an exaggeration(!) as a metric, the technology apparently proved itself by delivering amazingly near-zero crosstalk with 3D stereoscopic glasses. I'm not really interested in 3D; I'm more interested in motion blur elimination in 2D. But I've never, never, never seen LCD's with crosstalk this close to zero between the left eye and right eye, so the "1ms" technology is doing something right. It also benefits 2D zero motion blur, by preventing the previous refresh from leaking into the next refresh. So it is really good when you combine "1ms" + a strobe backlight (LightBoost) for regular PC gaming without 3D glasses.

But these monitors have awful color out of the box, and you need to calibrate the color via the nVidia Control Panel, using test patterns (Lagom Contrast and Lagom Black Level). The colors then look much better, even if they will never be as good as IPS. But you'll get CRT-quality motion resolution!

Make sure you have a powerful nVidia GPU (or an SLI setup), since you need to eliminate judder and run fps=Hz (or fps>Hz with VSYNC OFF). That means 120fps@120Hz (or at the very least, 100fps@100Hz), since the strobe backlights on these monitors only function at these refresh rates. I recommend a GTX 680 at the minimum.

Play slow games (non-FPS) or mainly use desktop? Get a good 1440p IPS instead (Catleap 2B overclockable to 1440p@120Hz refresh, Overlord Monitor is another brand). Even at high refresh rates, it will be far more motion-blurry than all the LightBoost monitors, but have far better color.

Pick your poison -- color quality? motion quality? The only way to get both at CRT quality in the same LCD monitor is this $10,000+ monitor (23 inch) -- the Viewpixx scientific vision research monitor.
Edited by Mark Rejhon - 2/22/13 at 3:36pm
post #63 of 184
Quote:
Originally Posted by Mark Rejhon View Post

Pick your poison -- color quality? motion quality?

Are they using (8-bit?) 1ms TN monitors out of necessity? Can IPS or Samsung's S-PVA be used? LightBoost is great, but I still see demand for 1) better color, 2) a GPU-independent solution, 3) a big-screen solution (HTPC is the future). So I'm just hoping you haven't dropped the backlight project.
post #64 of 184
Quote:
Originally Posted by Mark Rejhon View Post

The only way to get both at CRT quality in the same LCD monitor is this $10,000+ monitor (23 inch) -- the Viewpixx scientific vision research monitor.
Or buy OLED for half that? http://www.bhphotovideo.com/c/product/766411-REG/Sony_PVM_2541_Professional_OLED_Picture_Monitor.html
post #65 of 184
Quote:

Has anyone verified that these Sony OLED's have CRT-level motion quality? The mobile OLED's don't really do any better than a typical LCD, based on my experience with the Sony Vita. They would need to utilize some type of strobe to overcome the sample-and-hold induced blur.
post #66 of 184
Thread Starter 
Quote:
Originally Posted by borf View Post

Are they using (8-bit?) 1ms TN monitors out of necessity? Can IPS or Samsung's S-PVA be used? LightBoost is great, but I still see demand for 1) better color, 2) a GPU-independent solution, 3) a big-screen solution (HTPC is the future). So I'm just hoping you haven't dropped the backlight project.
The faster the panel, the more effective scanning backlights and strobe backlights are at eliminating motion blur. Here, a 1ms difference actually makes a big chasm of a difference for backlights that require pixel persistence to fit completely inside the length of a vertical blanking interval between refreshes. Read further to understand why:

By having only 1ms out of a refresh for pixel persistence (pixel transition 90% complete), you theoretically have plenty of time to let the pixel finish fully (let the pixel settle; pixel transition 99.5%+ complete) before strobing the backlight before the next refresh begins. A refresh at 120 Hz is 8.33 milliseconds long, and pixel persistence is now a tiny fraction of a refresh, giving plenty of time to "finish off" the pixel transition.

However, strobe backlights have a much tighter "timing window" than a scanning backlight does, because the vertical blanking interval between refreshes is usually only a millisecond long! You need to squeeze the pixel persistence completely inside the vertical blanking interval. (You can also use logic to create artificially longer blanking intervals by fast-scanning the panel, but that can add input lag due to pre-buffering the refresh in order to make the fast scanout possible.)
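As a rough sketch of that timing budget (the blanking-interval and persistence numbers below are illustrative assumptions, not measurements of any specific monitor):

```python
# Back-of-the-envelope check: can pixel persistence hide completely
# inside the vertical blanking interval before the backlight strobes?
def strobe_fits(refresh_hz, persistence_ms, blanking_ms):
    """True if persistence fits entirely inside the blanking interval."""
    frame_ms = 1000.0 / refresh_hz      # e.g. ~8.33 ms per refresh at 120 Hz
    assert blanking_ms < frame_ms       # blanking is a fraction of the frame
    return persistence_ms <= blanking_ms

# A "1ms" panel with an assumed ~1.3 ms lengthened blanking interval: fits.
print(strobe_fits(120, 1.0, 1.3))   # True
# A ~5 ms IPS-class transition cannot hide inside the same interval.
print(strobe_fits(120, 5.0, 1.3))   # False
```

This is why the post stresses that even a 1ms vs 2ms difference matters: the budget is the blanking interval, not the whole 8.33 ms refresh.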

Past scanning backlights don't meet manufacturer claims well (20-30% blur elimination, and resulting motion equivalence ratios closer to "200" instead of "960") because of pixel persistence leakage between frames, and because of backlight diffusion (leakage between on-segments and off-segments). Strobe/scanning backlights work best when pixel persistence is darn near 100% complete at the moment the LCD is illuminated. 3D panels forced that to become possible, because you need separate frames for the left and right eye; incomplete pixel persistence shows up as crosstalk with 3D glasses. That benefitted strobe backlights hugely. You pretty much want pixel persistence to be gone before the backlight is flashed. LightBoost strobe backlights provide motion blur elimination of about one order of magnitude -- in one case, the BENQ XL2411T measured 92% less motion blur (MPRT=1.4) than a sample-and-hold non-PWM 60 Hz LCD (MPRT=16.7).
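The 92% figure is straightforward arithmetic on the two MPRT numbers quoted above; a quick sanity check:

```python
# Motion blur reduction is just the ratio of MPRT values:
# sample-and-hold MPRT ~= frame period, strobed MPRT ~= strobe length.
mprt_60hz_hold = 1000.0 / 60   # 60 Hz sample-and-hold: ~16.7 ms
mprt_lightboost = 1.4          # measured strobe MPRT quoted above (ms)

reduction = 1 - mprt_lightboost / mprt_60hz_hold
print(f"{reduction:.0%}")      # -> 92%
```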

Alas, it is true, color quality is terrible with TN compared to the alternatives (IPS and S-PVA). You can use a scanning backlight, but they appear to be unable to achieve the "order-of-magnitude" motion blur elimination (yet). An IPS panel often takes much longer to finish transitioning, and there's still some leakage of the pixel transition well into the next refresh, so it becomes much trickier to strobe.

Here's a very good comparison:
High Speed Video of 2007-era LCD refresh pattern (5-8ms monitor)
High Speed Video of 2012-era LCD refresh pattern (2ms monitor)

These videos make it much easier to understand the tricky science of scanning/strobe backlights.
-- Notice how the backlight is strobed during the blanking interval on the 2012 LCD?
-- The 2007 refresh pattern shows that strobe backlights won't work well on this type of LCD; you need a scanning backlight (lighting up one row of LED's at a time) to keep synchronized with the refresh pattern of a "slow LCD".
-- Notice how the strobe backlight lights up the whole backlight/edgelight all at once.
-- Strobe backlights are much simpler than scanning backlights, but they only work with very fast LCD's that can finish refreshing before the next refresh begins, as you can see in the 2012 high speed video. Strobe backlights eliminate backlight diffusion issues (between on/off segments), provided you've got a fast LCD that can present a fully clear frame across the whole screen to strobe the backlight on. However, strobe backlights require really fast LCD's (1ms or 2ms) because you don't have much time in the vertical blanking interval between refreshes.

It was only in the last few years that LCD's were finally able to erase more than 99% (and for certain GtG transitions, 99.8%+) of the remnants of pixel persistence before the next refresh begins. This allows pixel persistence to be virtually completely hidden simply by turning off the backlight between refreshes -- resulting in an impulse-driven display as seen by the eyes. The new "1ms" panels (a controversial spec, but actually VERY beneficial for 3D and strobe backlights) actually seem to be proving themselves in gaming forums with near-zero crosstalk between refreshes (1ms apparently turned out to be good for eliminating crosstalk -- good for 3D shutter glasses, good for strobe backlights, good for zero motion blur operation). Once pixel persistence is fully hidden in total darkness between refreshes, there is no upper limit on how "clear" LCD motion can be; the limiting factor is simply how briefly you can strobe the backlight, like a high-speed flash. At that point, it becomes possible for an LCD to have less motion blur than a CRT, since a backlight can be flashed faster than CRT phosphor decays.

The only commercially available LCD's with true motion-test measured "order-of-magnitude" motion blur elimination today (vs 60Hz LCD), are full-strobe LightBoost backlights on 1ms and 2ms TN panels (with the 1ms panels apparently clearly superior, due to the need to squeeze the pixel persistence completely inside the vertical blanking interval, as shown in the 2012 high speed video). This is a situation where even a 1ms difference in pixel persistence matters!

Summary:
-- Strobe backlights allow much more motion blur elimination than scanning backlights.
-- Scanning backlights work with almost any LCD technology, including IPS and PVA technologies.
-- Strobe backlights currently only work really well on TN panels (which means lower color quality).
-- Scanning backlights work with longer pixel persistences.
-- Strobe backlights require a very short pixel persistence that fits completely in the vertical blanking interval (less than 2ms).
-- 1ms difference in pixel persistence is a major difference when it comes to scanning backlights
-- 1ms difference in pixel persistence is a galaxy-sized, massive chasm of difference when it comes to 3D crosstalk and strobe backlights that must fit in blanking intervals.

When it comes to 3D crosstalk and inter-frame leakage in strobe backlights, sub-millisecond differences in pixel persistence are actually noticeable to human eyes, in the form of fainter/stronger crosstalk and artifacts with strobe backlight technologies. I'm talking about the average pixel persistence for ALL GtG transitions (not just BWB or WBW transitions).
Edited by Mark Rejhon - 2/7/13 at 8:44am
post #67 of 184
Thread Starter 
Quote:
Originally Posted by Wizziwig View Post

Has anyone verified that these Sony OLED's have CRT-level motion quality? The mobile OLED's don't really do any better than a typical LCD, based on my experience with the Sony Vita. They would need to utilize some type of strobe to overcome the sample-and-hold induced blur.
Correct. Some Crystal LED's and OLED's are strobe-driven, while other OLED's are sample-and-hold. Unfortunately, the "sample-and-hold" type OLED's are the same as LCD in motion blur. That's why the Sony Vita OLED is no better than a fast LCD (1ms/2ms) in motion blur, even though the OLED colors are so much better.

The expensive Sony Crystal LED prototype does strobe, from what I heard -- so you'll get really good motion on that one (as long as the strobe lengths are <2ms, similar to the illumination period of a medium-persistence CRT phosphor, and much better than plasma phosphor).
post #68 of 184
Quote:

Could OLED be driven bright enough to support the short hold times needed to eliminate blur? Would blue lifetime suffer? If not, LCD could be the only possible option for CRT-like motion for the next 10-20 years, despite OLED's ultra-fast response.


Quote:
Originally Posted by Mark Rejhon View Post

An IPS panel often takes much longer to finish transitioning, and there's still some leakage of the pixel transition well into the next refresh, so it becomes much trickier to strobe.

But like you say, scanning should be used instead of strobing with slower panels (IPS). Do you think the reduced effectiveness and more complicated implementation still make it feasible? Can't download those videos right now at work.
post #69 of 184
Thread Starter 
Quote:
Originally Posted by borf View Post

Could OLED be driven bright enough to support the short hold times needed to eliminate blur? Would blue lifetime suffer? If not, LCD could be the only possible option for CRT-like motion for the next 10-20 years, despite OLED's ultra-fast response.
I hope not. The dream display would be a strobe-driven Crystal LED display, but the cost of over 6 million separate LED chips is mind-blowing. LCD panels have fundamental limitations, such as imperfect black levels (although various workarounds exist, with pros and cons, such as local dimming).
Quote:
But like you say, scanning should be used instead of strobing with slower panels (IPS). Do you think the reduced effectiveness and more complicated implementation still make it feasible? Can't download those videos right now at work.
Scanning backlights still benefit LCD quite a lot, but they don't seem to give LCD less motion blur than plasma. Presently, only strobe backlights (e.g. LightBoost) successfully make LCD better than plasma in motion blur, but you lose color quality because those are TN LCD's. So full-strobe backlight LCD's are not that suitable for home theater, yet. They are, however, of great interest to competitive online gamers due to the reaction time advantage of eliminating motion blur. Although effective, it is a trick/workaround (some say "gimmick", though it's far less of a gimmick now on better LCD's) adapted to LCD. The scanning backlight can achieve good motion blur reduction, but not the full order(s) of magnitude that a full-strobe backlight can achieve.

If you skip plasma displays for any reason, then the closest thing to a full-panel strobe in a home theater LCD HDTV is Sony's "Motionflow Impulse" mode (in sets with the Motionflow XR 960 feature). The "Motionflow Impulse" setting does not use interpolation. It dims the backlight dramatically, and it flickers annoyingly for many (60 Hz). But if you want the 'purest' non-interpolated CRT-style motion in a home theater LCD display, it's the closest thing in a non-plasma flat panel. If you despise plasma for some reason (??) and you despise interpolation, it's one option to try. But not many people like the CRT-style 60 Hz flicker of the "Motionflow Impulse" mode. Strobe backlights are far more attractive and effective at 120 Hz, due to the lack of flicker (to most eyes). But the world is not standardized on 120fps source material; we may have to wait two decades for 120 Hz to be widespread, e.g. NHK 8K 120Hz.

Wish there was a simple answer / simple solution to the motion blur problem. It's a flicker-versus-blur tradeoff, and it will probably be something we will need to keep fighting for decades to come. Flicker-free operation is a strong advantage for eyestrain for many people. However, that automatically means motion blur (even on instant-pixel-response displays) unless you shorten the frame sample lengths without adding flicker (black periods between samples). 1ms samples would require an insane real-time 1000fps@1000Hz if you wanted to avoid interpolation *and* flicker, and get CRT-motion clarity on a sample-and-hold display.
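The "frame sample length is proportional to motion blur" rule above can be sketched numerically, assuming perfect eye tracking (the 1000 pixels/second pan speed is just an arbitrary example):

```python
# Eye-tracking motion blur width (in pixels) for a tracked moving object:
# blur is proportional to how long each frame sample stays lit.
def blur_px(speed_px_per_s, sample_ms):
    return speed_px_per_s * sample_ms / 1000.0

speed = 1000  # a fast pan: 1000 pixels per second
print(blur_px(speed, 16.7))  # 60 Hz sample-and-hold: ~16.7 px of smear
print(blur_px(speed, 8.3))   # 120 Hz sample-and-hold: ~8.3 px
print(blur_px(speed, 1.0))   # 1 ms sample (1000fps@1000Hz, or a 1 ms strobe): 1 px
```

Halving the sample length halves the blur, which is why 120 Hz sample-and-hold still blurs far more than a 1-2 ms strobe.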

Summary:
- Some people hate flicker. (Most notice it at 60 Hz, others can tell at 85 Hz, a few can see 120 Hz)
- Some people hate motion blur.
- Some people hate both.
- There can be stroboscopic effects in fast motion even at 120fps (e.g. wagon wheel effect, phantom array effect)
- Some people hate interpolation artifacts.
- There's input lag with interpolation.
- Some people even get headaches with 360 Hz PWM dimming (so 360fps@360Hz impulse-driven isn't yet the final frontier)
- It's confirmed that motion blur of 120Hz sample-and-hold (LCD) is far worse than 60 Hz impulse-driven (CRT)
- It's confirmed that diminishing returns extend for quite a long time beyond 1/120sec samples, and even beyond 1/1000sec samples (1ms).
- It's confirmed motion blur is still visible even at 240fps@240Hz on sample-and-hold displays, when testing common motion test patterns.

The only way to solve ALL the above problems simultaneously (at least to five-nines percentages -- 99.999% of the population) is an insanely high framerate on a sample-and-hold display. Real life does not flicker, so why should displays? That means real-time 1000fps@1000Hz -- and can you imagine 8K@1000fps, displayed on 1000Hz displays in real time? That's probably not going to happen in the mainstream within our lifetimes (perhaps in the distant future, of course). This essentially guarantees that motion blur will remain a topic of discussion, and a field of study, for decades to come, because you can't satisfy every single criterion.

I think a compromise will probably happen within a lifetime, however: eventual (within two decades) standardization on 120Hz source material for video. The shortening of frame sample lengths will continue to be done via existing means such as impulse driving (adding a black period between samples) and/or interpolation (adding more samples). These are the only two practical ways to shorten frame sample lengths, since frame sample length is unavoidably linearly proportional to display motion blur (above and beyond eye-tracking inaccuracies and source-based motion blur). This is regardless of technology (OLED, plasma, LCD, Crystal LED, CRT, etc.).

OLED is the correct direction to go for image quality. It solves the black level and color problems, and OLED's CAN be impulse-driven for CRT-quality zero motion blur. Traditional LED's have become brighter and brighter since the 1960's, to the point where you can use LED's as a light source for projectors, streetlamps, and even 100% LED stadium lighting! OLED should follow a similar brightness improvement trajectory over the years, so impulse driving becomes more and more feasible. 1ms impulse-driving timescales for OLED are possible; this only requires OLED to be ~8x brighter relative to 120 Hz sample-and-hold (1ms vs 8.33ms), and ~16x brighter than 60 Hz sample-and-hold (1ms vs 16.7ms). In the interim, interpolation can be used for OLED too (240fps@240Hz, 480fps@480Hz, etc.) to avoid the need for flicker (but interpolation does disqualify them from computer/game use, due to the input lag of interpolation).
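The ~8x and ~16x figures fall directly out of the duty cycle; a quick sketch (using the 1ms strobe length stated above):

```python
# To keep the same average brightness, an impulse-driven display must be
# brighter by the inverse of its duty cycle (frame period / strobe length).
def brightness_multiplier(refresh_hz, strobe_ms):
    frame_ms = 1000.0 / refresh_hz
    return frame_ms / strobe_ms

print(round(brightness_multiplier(120, 1.0), 1))  # ~8.3x vs 120 Hz sample-and-hold
print(round(brightness_multiplier(60, 1.0), 1))   # ~16.7x vs 60 Hz sample-and-hold
```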

At the same time, LCD's will be a viable technology for a very long time, and there are still lots of incremental improvements left in LCD before _cheap_ large-screen super-bright OLED's exist (with the ability to optionally impulse-drive at 1ms strobes in a computer/game-friendly manner, without interpolation and without becoming dim) -- which would not easily be until the 2020's.

P.S. To complicate things.... For movies, I still prefer 24fps over 48fps, and non-interpolated.
Edited by Mark Rejhon - 2/7/13 at 12:43pm
post #70 of 184
Thanks Mark, interesting read.

I was curious if motion blur had been solved on Home Theater LCDs, but sounds like not.
post #71 of 184
Thread Starter 
Quote:
Originally Posted by VarmintCong View Post

Thanks Mark, interesting read.

I was curious if motion blur had been solved on Home Theater LCDs, but sounds like not.
You can eliminate a lot of it, on some high-end models such as Elite LCD HDTV. Not all, but most of it. But not in a game-compatible manner (lots of input lag). Then you've got the LightBoost LCD computer monitors, which are amazing for completely blur-free gaming (but not much else).

100% interpolation-free + 100% flicker-free + fully CRT-clear motion = making 99.999% of the population happy is probably an unsolved problem this century. And I haven't even added the "+ videophile-friendly" requirement either!
post #72 of 184
Quote:
Originally Posted by Mark Rejhon View Post

Scanning backlights still benefits LCD quite a lot. But it doesn't seem to make LCD bypass plasma. Presently, only strobe backlights (e.g. LightBoost) successfully make LCD be better than plasma in motion blur.

Pretty straightforward. I guess that's partly due to segment crosstalk?
Quote:
Originally Posted by Mark Rejhon View Post

The closest thing in a full HDTV set, that comes to a full-panel strobe in a home theater LCD, is Sony's "Motionflow Impulse" mode.

i'll look into it, thanks (I do prefer the sharper, brighter lcd image over plasma, especially for PC use)
Quote:
Originally Posted by Mark Rejhon View Post

Strobe backlights are far more attractive at 120 Hz. We'd have to wait two decades for 120 Hz to be widespread; e.g. NHK 8K 120Hz

Depressing but here's to seeing 8k 120hz sooner with decent eyesight.
Quote:
Originally Posted by Mark Rejhon View Post

1ms samples would require an insane real-time 1000fps@1000Hz if you wanted to avoid interpolation *and* flicker, and get CRT-motion clarity on a sample-and-hold display.

Yea, I guess this would be the holy grail for frame-based displays. Honestly, 75Hz/fps (1ms strobes) is just fine for me, going by my CRT. I still have a hard time believing some notice flicker at 120Hz. A fluorescent-lit room must look like a disco by that logic. Surely 120Hz, if noticeable, is not objectionable.
Quote:
Originally Posted by Mark Rejhon View Post

I think a compromise will probably happen within a lifetime, however: eventual (within two decades) standardization on 120Hz source material for video.

That's my theory reading how things are progressing. At least now there will be Lightboost in the meantime.
Quote:
Originally Posted by Mark Rejhon View Post

P.S. To complicate things.... For movies, I still prefer 24fps over 48fps.

Ha ha, well I admit the "soap opera" look is unappealing to me sometimes. Other times I love it.
post #73 of 184
Thread Starter 
Quote:
Originally Posted by borf View Post

Pretty straightforward. I guess that's partly due to segment crosstalk?
Yes, two causes:
-- Backlight diffusion between adjacent backlight segments (on segments vs off segments); and
-- Pixel persistence limitations of IPS or PVA; which have far more pixel persistence than TN panels.

Trying to squeeze a massive elephant into a vertical blanking interval (squeezing pixel persistence into the time period of a vertical blanking interval) is extremely hard to do. Very few IPS panels are able to do it (for 3D shutter glasses), and there's a lot of crosstalk on those. Inter-refresh crosstalk also directly interferes with the motion blur elimination of scanning backlights (even for non-3D); less crosstalk automatically allows more successful motion blur elimination via backlight control. Dramatic motion improvements occur once you squeeze more and more of the pixel persistence into the time period of a lengthened vertical blanking interval (and only then can you finally, completely hide pixel persistence by strobing the whole backlight before the next refresh begins, with no backlight diffusion issue).
Edited by Mark Rejhon - 2/7/13 at 2:25pm
post #74 of 184
Thread Starter 
Quote:
Originally Posted by borf View Post

A fluorescent lit room must look like a disco by that logic.
My mom was sensitive to 120 Hz flicker. She got splitting headaches. Even when it does not look like a disco, the light often "doesn't look right".

Fortunately, most fluorescent lights in today's offices (here in Canada) have switched to electronic ballasts (well over 1 kHz), so they don't flicker anymore. High-frequency electronic ballasts are highly recommended installations in new offices nowadays. They have an economic benefit too: they make tubes last longer. Ergonomic guidelines have been written that highly recommend electronic ballasts due to better work productivity:

"When compared to regular fluorescent lights with magnetic ballasts, the use of high frequency electronic ballasts (20,000 Hz or higher) in fluorescent lights resulted in more than a 50% drop in complaints of eye strain and headaches."
Source: Canadian Centre for Occupational Health and Safety
Here in Canada, we have gotten rid of old-fashioned ballasts from most workplaces because of 120 Hz flicker. Most USA and European offices have also followed suit.

My Casio Exilim EX-FC200 high-speed 1000fps camera shows that my office fluorescent lights are 100% flicker-free at these timescales. Not even a 1% brightness fluctuation in high-speed video.

Even the 120Hz fluorescent lights have not flickered very much for a long time around here. With a traditional fluorescent fixture (a 30-year-old ballast and cheap non-full-spectrum tubes), although I can't tell directly, I can easily tell the 120 Hz fluorescent lamp flicker indirectly through the phantom array effect / wagon wheel effect (the same effect that causes tire hubcaps to look stationary on speeding cars under harsh sodium-arc street lamps). I can see the flicker indirectly, even if it doesn't bother my eyes. So I'm not surprised it probably bothers some people.

An alternate way to test fluorescent flicker is the hand-wave test. Stare at the ceiling fixture and wave your hand quickly across it -- you can tell whether your hand wave is a smooth blur or a stroboscopic blur (usually with yellow/blue fringes if it's a very old fluorescent fixture with cheap old-style "cold white" tubes). These days, the 120Hz flicker amplitude of modern fluorescent lighting is subtle due to better phosphor decay on newer tubes, and some good expensive "full-spectrum" tubes flicker as subtly as the faint 120Hz incandescent flicker. Even modern cheap tubes only dim to approximately ~50%-75% during the trough of their flicker on an old-fashioned ballast, unlike tubes from half a century ago. However, electronic ballasts (20kHz) make even cheap tubes perfectly flicker-free to humans, and they prolong the tube's life too.
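Those trough figures can be turned into a "percent flicker" number using the common (max - min) / (max + min) modulation-depth definition -- a sketch, with the ~50%-75% trough values from the post used as assumed inputs rather than measurements:

```python
# Modulation depth (percent flicker) of a periodically varying light source,
# using the common (max - min) / (max + min) definition.
def modulation_depth(peak, trough):
    return (peak - trough) / (peak + trough)

# Old magnetic ballast, cheap tube dimming to ~50% at the trough:
print(f"{modulation_depth(1.0, 0.5):.0%}")   # 33%
# Better phosphor decay, dimming only to ~75%:
print(f"{modulation_depth(1.0, 0.75):.0%}")  # 14%
```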

Link: Some good further science reading
(including being able to detect 500 Hz indirectly via the stroboscopic / phantom array effect)
Edited by Mark Rejhon - 2/7/13 at 2:52pm
post #75 of 184
Quote:
Originally Posted by Mark Rejhon View Post

I can easily tell 120 Hz fluorescent lamp flicker indirectly through the phantom array effect / wagon wheel effect. (Same effect that causes tire hubcaps to look stationary in speeding cars under harsh sodium-arc street lamps). I can see the flicker indirectly, even if it doesn't bother my eyes. So I'm not surprised it probably bothers some people.

You convinced me. It's been a pleasure following this, thanks.
post #76 of 184
Had to try this for myself so I picked up the VG248QE. It is indeed quite impressive, to the point where you can move entire windows of small text around the screen and read it completely clearly. The panel itself is... not great. I haven't used a TN panel at home for years aside from my laptop. There's some backlight bleed, and a general lack of uniformity on the panel. The entire top 1/4 is somewhat darker than the rest of the screen (regardless of viewing angle), and I'm wondering if this is normal due to how the monitor is built or if some of the LEDs aren't on up there. On the other hand, the response time is very impressive. Even with LightBoost off it's the best LCD motion I've ever seen. You can tell that the blur is almost entirely from your eyes, as you actually see a blur of sharp edges instead of the liquid-like smear that slower LCDs produce. For example, when moving a window of text with LB off you can clearly see a wall of sharp-edged text moving about instead of a black/gray smear. With LightBoost on, it'd be hard to differentiate the motion from a CRT. Of course, I haven't used a CRT in a few years, but I'll always remember how you could read any moving text clearly. When you're over 120fps the motion clarity is initially a little shocking, and I'd see it as a huge advantage for competitive gamers.

At the old age of nearly 30 I don't really keep my CS skills up, but I loaded up CSS and played a few games just to see what it was like. I'm quite rusty but I definitely felt like I had an advantage over some of the players I encountered (others were likely already on similar displays). I went to Battlefield 3 after that, which is more my speed now (a bit less emphasis on rushes to choke points and face to face headshot competition). I had to drop the mesh quality to keep at the 100+ range on 64 player maps, but it again seemed like a huge advantage. BF3 is especially bad on a 60hz IPS, as the detailed environments are much harder to read correctly if they start to blur even slightly. I definitely ran into situations where I could identify enemy players long before they were even aware of me. As everyone told me, it does look bad if the framerate drops much below 100; there's a ton of judder at this point. It's not as bad as 60hz LCD blur, but at this point plasma wins easily.

I can't see any flicker, however you can pass your hand back and forth in front of the screen and clearly see the strobe effect that way. It seems like it might bother my eyes a bit, but certainly not as much as a CRT. It's also hard to tell if this is because of the strobe or the fact that the panel is very bright in any LightBoost mode if you're in a dark room. I've heard people complaining about it not being bright enough, but I have to imagine that these people must be in very bright spaces and should get some curtains. I actually feel the need to turn more light on than I do for gaming on my plasma at night.

Overall it's a very impressive technical feat. I think many of us are now considering the same thing: how long will it be until we see the first "1ms GtG" IPS at a commercial level that can work effectively with this type of strobing (that 10k monitor looks great, but... yeah). The fact that this functionality only gets marketed as a "3D" feature suggests that NVidia and the monitor manufacturers currently don't have a lot of confidence in a market for high-performance monitors for serious gamers, but I would imagine they're looking at how to get IPS to the level where it performs this well for 3D anyway.

*edit: Moved discussion about BLB to a thread specifically about the monitor, this isn't the right place. Short version: Has plenty of backlight bleed.
Edited by headlesschickens - 2/8/13 at 2:46pm
post #77 of 184
Thread Starter 
Quote:
Originally Posted by headlesschickens View Post

Had to try this for myself so I picked up the VG248QE. It is indeed quite impressive, to the point where you can move entire windows of small text around the screen and read it completely clearly. The panel itself is... not great. I haven't used a TN panel at home for years aside from my laptop. There's some backlight bleed, and a general lack of uniformity on the panel. The entire top 1/4 is somewhat darker than the rest of the screen (regardless of viewing angle), and I'm wondering if this is normal due to how the monitor is built or if some of the LEDs aren't on up there.
This is a known problem with 27" TN monitors, especially the VG278HE.
Also, you apparently ended up picking the LightBoost monitor with the worst crosstalk problem (I didn't know it at the time). A person on the overclock.net forums had the VG278H and VG278HE side by side and found that the HE has more crosstalk between refreshes. Myself, I own an ASUS VG278H and a BENQ XL2411T, and the latter has even less crosstalk. The VG248QE and XL2411T are the only computer monitors with a 1ms-rated pixel response time; while the merits of that rating are debatable, it has a major benefit for 3D/strobe backlights: in my testing between my ASUS VG278H (2ms) and the BENQ XL2411T, the inter-refresh crosstalk is much fainter (about 5x) for many GtG pixel transitions. I'm far more impressed by the XL2411T than by the VG278HE.
Quote:
Originally Posted by headlesschickens View Post

On the other hand, the response time is very impressive. Even with LightBoost off it's the best LCD motion I've ever seen. You can tell that the blur is almost entirely from your eyes, as you actually see a blur of sharp edges instead of the liquid-like smear that slower LCDs produce. For example, when moving a window of text with LB off you can clearly see a wall of sharp-edged text moving about instead of a black/gray smear. With LightBoost on, it'd be hard to differentiate the motion from a CRT. Of course, I haven't used a CRT in a few years, but I'll always remember how you could read any moving text clearly. When you're over 120fps the motion clarity is initially a little shocking, and I'd see it as a huge advantage for competitive gamers.
I have very interesting quotes from several gamers that the competitive advantage was quite massive for them, far outweighing the input lag disadvantage of LCD over CRT. (The BENQ is measured to be really good for this).
Quote:
Originally Posted by headlesschickens View Post

At the old age of nearly 30 I don't really keep my CS skills up, but I loaded up CSS and played a few games just to see what it was like. I'm quite rusty but I definitely felt like I had an advantage over some of the players I encountered (others were likely already on similar displays). I went to Battlefield 3 after that, which is more my speed now (a bit less emphasis on rushes to choke points and face to face headshot competition). I had to drop the mesh quality to keep at the 100+ range on 64 player maps, but it again seemed like a huge advantage. BF3 is especially bad on a 60hz IPS, as the detailed environments are much harder to read correctly if they start to blur even slightly. I definitely ran into situations where I could identify enemy players long before they were even aware of me. As everyone told me, it does look bad if the framerate drops much below 100; there's a ton of judder at this point. It's not as bad as 60hz LCD blur, but at this point plasma wins easily.
Yes, judder control is even more important on an impulse-driven display (you no longer have the additional motion blur hiding the judder). Compare a plasma with lots of judder to LightBoost with lots of judder: plasmas run at a 60 Hz refresh rate while LightBoost runs at 120 Hz, so it's easier to get fps=Hz when running at only 60fps. Note that you can run LightBoost at refresh rates as low as 100Hz, though it uses longer strobe lengths (albeit adjusting the LightBoost OSD setting shortens this).
Quote:
Originally Posted by headlesschickens View Post

I can't see any flicker, however you can pass your hand back and forth in front of the screen and clearly see the strobe effect that way. It seems like it might bother my eyes a bit, but certainly not as much as a CRT. It's also hard to tell if this is because of the strobe or the fact that the panel is very bright in any LightBoost mode if you're in a dark room. I've heard people complaining about it not being bright enough, but I have to imagine that these people must be in very bright spaces and should get some curtains. I actually feel the need to turn more light on than I do for gaming on my plasma at night.
It's true that LightBoost brings back the CRT flicker disadvantage, but it doesn't seem to bother my eyes at all. That's because of the very even 120 Hz flicker, unlike most CRT's, which ran at a lower refresh rate (except for the very good ones).
Quote:
Overall it's a very impressive technical feat. I think pretty many of us are now considering the same thing: how long it will be until we see the first "1ms GTG" IPS at a commercial level that can work effectively with this type of strobing (that 10k monitor looks great, but ... yeah). The fact that this functionality only gets to be marketed as a "3D" feature speaks to the fact that NVidia and the monitor manufacturers currently don't have a lot of confidence in a market for high-performance monitors for serious gamers, but I would imagine they're looking at how they can get IPS to the level where it performs this well for 3D anyway.
I did several MPRT measurements recently (Motion Picture Response Time). Provided a strobe backlight is done properly (turning it off during pixel persistence), you can have MPRT's much lower than the pixel response time. I'm going to post in the next reply:
Edited by Mark Rejhon - 2/9/13 at 8:45pm
post #78 of 184
Thread Starter 
I'm getting 1.4ms MPRT from my 2ms ASUS VG278H in tests.

....drum roll....
Measured MPRT < panel's pixel response time !!!!!!
Yes, measured MPRT that's LESS than the LCD panel native pixel response time!

You heard me right: via synchronized backlight strobes, a measured motion picture response time (MPRT) that's lower than the physical LCD pixel response time! PixPerAn confirmed this, and my own motion tests (now in beta) confirmed it too. I also put my oscilloscope with a photodiode against my screen -- and saw the ~1.4-1.5ms strobe waveform. The pixel-persistence stage of the refresh is simply kept in total darkness with the backlight turned off while waiting for pixel transitions; then the backlight is strobed quickly on a fully refreshed frame (the full backlight strobe is timed during the vertical blanking interval). The strobes (seen by eye) can be shorter than the pixel persistence itself (kept in the dark). Already here, already in actual off-the-shelf monitors!

Very impressive -- successfully bypassing pixel persistence as the motion blur barrier.
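The timing described above can be sketched as a simple budget check. All numbers here are illustrative assumptions for a 120 Hz panel, not measurements from any specific monitor:

```python
# Rough timing budget for a full-panel strobe backlight at 120 Hz.
# All numbers are illustrative assumptions, not datasheet values.

REFRESH_HZ = 120
period_ms = 1000.0 / REFRESH_HZ   # 8.33 ms per refresh cycle
scanout_ms = 5.5                  # assumed top-to-bottom LCD scanout time
settle_ms = 1.0                   # assumed worst-case GtG settling after scanout
strobe_ms = 1.4                   # strobe length (the MPRT the viewer sees)

# The backlight stays off during scanout and settling, hiding the pixel
# transitions, then flashes the fully refreshed frame once per cycle.
dark_ms = scanout_ms + settle_ms
margin_ms = period_ms - dark_ms - strobe_ms
assert margin_ms >= 0, "strobe would overlap the next scanout"

print(f"dark {dark_ms:.1f} ms, strobe {strobe_ms} ms, margin {margin_ms:.2f} ms")
# -> dark 6.5 ms, strobe 1.4 ms, margin 0.43 ms
```

The point of the sketch: as long as scanout plus settling plus strobe fits inside one refresh period, the eye only ever sees the strobe, so the visible persistence can be shorter than the panel's pixel response.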
(High speed 1000fps video proof -- for LightBoost 100% setting)

This video shows LightBoost set to a mode that matched an MPRT of about 2.4ms. It was made before I made this measurement discovery of a new LightBoost setting (the LightBoost OSD setting readjusted to "10%" rather than "OFF" -- too dim for most uses except in darkness, but with the interesting quirk of shortcutting around the LCD law-of-physics limitation via the backlight strobe). I knew (MPRT < pixel response) was possible, but I only now discovered an actual real-world example, currently sitting on my desktop, of what I had correctly predicted. I'll create a new, second video proof of MPRT's lower than pixel persistence.

Manufacturers trying to advertise to competitive gamers should put a big, bleeping, large "10x CLEARER MOTION" sticker on these -- it is an actual, measurable order-of-magnitude improvement. Recently, I tested PixPerAn on both my BENQ XL2411T and ASUS VG278H and came up with these numbers; both showed roughly identical amounts of improvement.

LCD 60 Hz = baseline (MPRT = 16.7ms)
LCD 120 Hz = 50% less motion blur than 60 Hz (2x sharper motion) (MPRT = 8.3ms)
LCD 120 Hz (LightBoost at 100% setting) = 85% less motion blur than 60 Hz (7x sharper motion) (MPRT = 2.4ms)
LCD 120 Hz (LightBoost at 10% setting) = 92% less motion blur than 60 Hz (11x sharper motion) (MPRT = 1.4ms !!! !!! !!!)
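The arithmetic behind those numbers is simple: perceived blur scales with how long each frame stays lit while the eye tracks motion. A quick sketch (hypothetical helper names, derived only from the figures above):

```python
# MPRT is roughly the time each frame is visible per refresh: the whole
# refresh period for sample-and-hold, or the strobe length when strobed.

def mprt_sample_and_hold_ms(refresh_hz):
    """Frame stays lit for the entire refresh period."""
    return 1000.0 / refresh_hz

def sharpness_vs_60hz(mprt_ms):
    """How many times sharper motion is than the 60 Hz baseline."""
    return mprt_sample_and_hold_ms(60) / mprt_ms

print(round(mprt_sample_and_hold_ms(60), 1))    # 16.7 ms baseline
print(round(mprt_sample_and_hold_ms(120), 2))   # 8.33 ms -> 2x sharper
print(round(sharpness_vs_60hz(2.4), 1))         # 6.9 (LightBoost 100%)
print(round(sharpness_vs_60hz(1.4), 1))         # 11.9 (LightBoost 10%)
```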

There is growing buzz on several forums (600 posts and 50,000 views in the HardForum.com gamers thread) -- so I hope that monitor manufacturers will eventually start to advertise this feature better. Two have already privately contacted me to inquire. My sense, really, is that it's very hard to advertise a strobe backlight, and it's easier to advertise "Hz" or "3D". Another sense I get is that many people at monitor manufacturers (not as motion-oriented as HDTV manufacturers) are not as familiar with user demand for CRT-quality LCD's. BENQ tried to advertise their black frame insertion in 2006 ("AMA-Z") but it reduced motion blur by only about 30%, which is less than a 1.5x improvement. People were disappointed and monitor makers stopped advertising backlight modulation as a motion-blur-eliminating feature. Today's strobe backlights are vastly superior -- some yield a 7x to 11x improvement in motion clarity -- so they should be given a second chance to be advertised well. True measured LCD MPRT's are finally approaching CRT league (at least for medium-persistence phosphor monitors such as the Sony FW900 CRT).

MPRT's faster than panel native pixel response are now possible with LCD's due to a confluence of several major factors:
- Panels that refresh quickly before next refresh (necessary to make 3D shutter glasses possible)
- Full-panel strobe backlights (scanning backlights have a backlight diffusion problem which interferes)
- Pixel persistence that fits inside a vertical blanking interval (as seen in high speed video). Necessary for full-strobe backlight.

When the above factors are satisfied, there is no upper limit to how brief the LCD MPRT's can become. At that point, the LCD motion blur limiting factor becomes how briefly the LED backlight can be strobed. It is possible to get MPRT 0.5ms, or even MPRT 0.1ms -- better than CRT. This became possible only recently, thanks to 3D-compatible panels, which have the side effect of making possible the odd situation of (MPRT < pixel response).

We need to get better color quality, no doubt -- and it'll probably happen within a few years -- but the "zero motion blur" holy grail is proven possible with LCD's! And there's plenty of time before inexpensive good short-impulse-driveable 4K OLED's become popular, anyway (especially since impulse-driving an OLED to the point of motion clarity (of a typical CRT) *without* motion interpolation, requires an approximately 10x brighter OLED than for sample-and-hold).
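The brightness requirement follows from duty cycle alone: average brightness is peak brightness times the fraction of time the display is lit. A hedged back-of-envelope check (the nit figures are made up purely for illustration):

```python
# Average luminance = peak luminance x duty cycle, so a short impulse needs
# a proportionally brighter flash. Example nit values are illustrative only.

def required_peak_nits(target_avg_nits, duty_cycle):
    """Peak brightness needed to hit a target average at a given duty cycle."""
    return target_avg_nits / duty_cycle

strobe_ms = 1.4
period_ms = 1000.0 / 120
duty = strobe_ms / period_ms                 # a 1.4 ms flash at 120 Hz

print(round(duty, 3))                        # 0.168
print(round(required_peak_nits(120, duty)))  # 714 nits peak for a 120-nit average
# A CRT-like ~1 ms impulse at 60 Hz has an even smaller duty cycle,
# hence the roughly 10x brightness budget mentioned in the text.
```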

Keep tuned for my upcoming motion tests; the world's easiest MPRT measurement test...
Edited by Mark Rejhon - 2/22/13 at 3:40pm
post #79 of 184
Quote:
Originally Posted by Mark Rejhon View Post

This is a known problem with 27" TN monitors, especially the VG278HE.
Also, you apparently actually ended up picking the LightBoost monitor with the most crosstalk problem (I didn't know it at the time). A person on the overclock.net Forums had both the VG278H/VG278HE side-by-side and found that the HE has more crosstalk between refreshes.

The monitor I picked up is the VG248QE, not either of the 27" versions. Its overall screen uniformity is terrible, especially for a 24" screen, with massive backlight bleed that covers something like 75% of the screen. The motion performance is so amazing that it pretty much wipes this concern away, but I do wish they'd tightened the quality control a bit... I would have been willing to pay more than ~$270 to know I was getting a display with less chance of such severe uniformity problems. I posted about this on the long thread over at overclock.net and the general response I got was a lot of agreement that the BLB is fairly bad. Somehow LB makes it far worse, which I gather is a result of the fact that the individual strobes are so much brighter than the normal brightness settings you'd use on the monitor.

Here's a shot in case anyone's curious:

Quote:
I have very interesting quotes from several gamers that the competitive advantage was quite massive for them, far outweighing the input lag disadvantage of LCD over CRT. (The BENQ is measured to be really good for this).

This is why I'll definitely be keeping it (well I also want to support this technology in general). Regardless of the uniformity issues, I could never go back to "normal" LCD or 60hz plasma for competitive games now that I've experienced 120fps with no motion blur. The screen could lose all color except green tomorrow and I'd still feel like I had a serious advantage in competitive shooters. I'll likely continue to throw games like Skyrim or Far Cry 3 up on the plasma (basically anything I feel okay vsyncing), but for fast-paced competition there's just no comparison.
Quote:
There is a growing amount of buzz on several forums (600 posts and 50,000 views in the HardForum.com gamers thread) -- so I hope that eventually monitor manufacturers will start to advertise this feature better. Two have already privately contacted me already inquiring interest. My sense, really, is that it's very hard to advertise a strobe backlight, and it's easier to advertise "Hz" or "3D". Another sense I get, is that many people at monitor manufacturers (not as motion-oriented as HDTV manufacturers) are not as familiar with user demand for CRT-quality LCD's. BENQ tried to advertise their black frame insertion in 2006 ("AMA-Z") but it only reduced motion blur only by about 30%, which is less than a 1.5x improvement. People were dissappointed and monitor makers stopped advertising backlight modulation as a motion blur eliminating feature. Today's strobe backlights are vastly superior -- with some yielding a 7x to 11x improvement in motion clarity, so they should be given a second chance to be advertised well. LCD true measured MPRT's are finally approaching CRT league (at least for medium-persistence phosphor monitors such as Sony FW900 CRT's).

I imagine it's a tough sell in general. Even on the forums where people are supposedly "hardcore" gamers I've seen a lot of posts suggesting that 144hz would of course be significantly better than some "gimmicky 3D mode," or that the monitor has "zero motion blur, even without LB." Obviously neither of these is true, but in general people prefer stats that they at least think they understand, and the current implementation makes it hard for the average user to even test it out. I saw a post somewhere a couple days ago where the user suggested that they didn't need to even try the 2D LB trick because the monitor was "already bright enough."

NVidia needs to jump on this immediately; that would be the best chance for it to gain traction in the wider market. Optimally this technology will be available on any video hardware, but since it's currently not, I'm shocked NV isn't all over this. Just add a toggle in the drivers that enables LB without the need for any 3D functionality, spoofed or real. And call it something other than LB if that helps, like NVidia (tm) ZeroBlur (tm) LightBlaster (tm) 9000 (tm?). The best-case scenario would probably be monitor manufacturers making this a toggle in the OSD, but that would require a new batch of hardware.
Quote:
We need to get better color quality, no doubt -- and it'll probably happen within a few years -- but the "zero motion blur" holy grail is proven possible with LCD's! And there's plenty of time before inexpensive good short-impulse-driveable 4K OLED's become popular, anyway (especially since impulse-driving an OLED to the point of motion clarity (of a typical CRT) *without* motion interpolation, requires an approximately 10x brighter OLED than for sample-and-hold).

I'm glad that got cleared up in this thread. I never understood why OLED was being hailed as the thing that would soon destroy plasma and LCD by solving motion blur and burn-in simultaneously. In the current implementations my understanding is that it has both of those issues. The low response time will improve motion blur on its own compared to slow LCDs, but I haven't heard about any manufacturer planning to strobe them at this stage (am I wrong, is this a topic that's being discussed beyond the enthusiast circles?). At best these early displays are going to have >= plasma colors with "fast LCD" level motion blur and a potential for burn in. Seems like it's going to be a while before these displays can beat any existing technology.
post #80 of 184
Thread Starter 
Quote:
Originally Posted by headlesschickens View Post

http://www.youtube.com/watch?v=hD5gjAs1A2s

I posted about this on the long thread over at overclock.net and the general response I got was a lot of agreement that the BLB is fairly bad. Somehow LB makes it far worse, which I gather is somehow a result of the fact that the individual strobes are so much brighter than the normal brightness settings you'd use on the monitor.
No kidding about the backlight bleed, but there's actually a different reason: the LCD pixel black level is set slightly higher in LightBoost mode than in non-LightBoost mode, to better hide pixel persistence between refreshes. This makes pixel overshoot/undershoot more predictable at the ends of the greyscale range, and a big purpose is to reduce 3D crosstalk by calibrating the LCD greyscale for ideal pixel-persistence parameters for strobing, rather than for ideal contrast. The technology does need to continue to improve. The new 24" 1ms panels are a big improvement for LightBoost operation (more even uniformity, less crosstalk), showing one of the few benefits of the 1ms-vs-2ms response time panels. But yes, enabling this motion-blur-elimination mode has this nasty side effect. The easiest way to turn LightBoost on/off is simply to switch between 120Hz and 144Hz, which can easily be done via a utility.
Quote:
Regardless of the uniformity issues, I could never go back to "normal" LCD or 60hz plasma for competitive games now that I've experienced 120fps with no motion blur. The screen could lose all color except green tomorrow and I'd still feel like I had a serious advantage in competitive shooters. I'll likely continue to throw games like Skyrim or Far Cry 3 up on the plasma (basically anything I feel okay vsyncing), but for fast-paced competition there's just no comparison.
I imagine it's a tough sell in general. Even on the forums where people are supposedly "hardcore" gamers I've seen a lot of posts suggesting that 144hz would of course be significantly better than some "gimmicky 3D mode," or that the monitor has "zero motion blur, even without LB." Obviously neither of these are true but in general people prefer stats that they at least think they understand, and the current implementation makes it hard for the average user to even test it out. I saw a post somewhere a couple days ago where the user suggested that they didn't need to even try the 2D LB trick because the monitor was "already bright enough."
It's true that it's quite a tough sell, but it's at least time to change perceptions -- the causes of LCD motion blur are apparently solvable, and future LCD technology improvements can lead to superior impulse-driven LCD's (in the league of a full order-of-magnitude reduction in perceived motion blur, like LightBoost LCD's now provide).
My general feeling is that there's more latent demand for zero-motion-blur LCD's (10x clearer motion than traditional LCD's) than for stereoscopic 3D LCD's, but the market is not taking advantage of the latent demand from people who dislike LCD motion blur. This might solve itself within a few years (as LCD quality improves), especially if OLED's take much longer than expected to arrive cheaply in the home theater.
Quote:
NVidia needs to jump on this immediately, that would be the best chance for it to gain traction in the wider market. Optimally this technology will be available on any video hardware, but since it's currently not I'm shocked NV isn't all over this. Just get a toggle in the drivers that enables LB without the need for any 3D functionality, spoofed or real. And call it something other than LB if that helps, like NVidia (tm) ZeroBlur (tm) LightBlaster (tm) 9000 (tm?). The best case scenario would probably be monitor manufacturers making this a toggle in the OSD, but that would require a new batch of hardware
Agreed. It's already an undocumented toggle on the Samsung SA700D, SA750D and SA950D -- works on ATI and nVidia cards -- but it has the disadvantage of only being advertised as a 3D-benefiting feature (reduces crosstalk) rather than a 2D-benefiting feature (eliminates motion blur). See the Samsung Zero Motion Blur HOWTO.

A case study to be aware of:
Certain BENQ computer monitors had a button to enable a scanning backlight in 2006 (AMA-Z), but it eliminated only about 30% of motion blur (less than a 1.5x improvement) and it flickered annoyingly at 60Hz. LightBoost, today, provides about a 10x improvement in motion blur (relative to 60Hz LCD), so the full order-of-magnitude improvement pulls LCD motion clarity into CRT territory, and the 120Hz flicker is far less annoying to most people. It should now be a much easier feature to advertise than it was in the past.
Quote:
I'm glad that got cleared up in this thread. I never understood why OLED was being hailed as the thing that would soon destroy plasma and LCD by solving motion blur and burn-in simultaneously. In the current implementations my understanding is that it has both of those issues. The low response time will improve motion blur on its own compared to slow LCDs, but I haven't heard about any manufacturer planning to strobe them at this stage (am I wrong, is this a topic that's being discussed beyond the enthusiast circles?). At best these early displays are going to have >= plasma colors with "fast LCD" level motion blur and a potential for burn in. Seems like it's going to be a while before these displays can beat any existing technology.
From what I heard, Sony's "Crystal LED" prototype is actually impulse-driven. It strobes the LED's and produces excellent motion clarity. Alas, it's not OLED, but discrete crystalline LED's.
The cheap PS Vita OLED has lots of motion blur during fast panning motion, alas.

I really want OLED to succeed, if they can solve the problems, but it's easily another decade before *low-price* and *quality* and *zero-motion-blur* is possible simultaneously in the same OLED panel, all at the same time.

Meanwhile, we might as well milk higher quality out of LCD technology as much as possible while we're at it -- the plants are built, the panels can be relatively cheap, there are really good-quality LCD panels today already (color-wise), and it's now proven that pixel persistence is bypassable (what a technical achievement to witness MPRT benchmarks lower than manufacturer-rated pixel persistence!). But that's not enough. We need to combine the best qualities of good IPS LCD's with the order-of-magnitude blur elimination of a videogame-friendly synchronized strobe backlight. It costs far less, too (a good LightBoost monitor such as the ASUS VG248QE can be had for about $300).
Edited by Mark Rejhon - 2/22/13 at 3:39pm
post #81 of 184
Thread Starter 
Quote:
From what I heard, Sony's "Crystal LED" prototype is actually impulse-driven. It strobes the LED's and produces excellent motion clarity. Alas, it's not OLED, but discrete crystalline LED's.
Actually, just found out it doesn't -- there's some motion artifacts on the Crystal LED. I guess the refresh pattern on that display is not very good (yet).
Quote:
I'm quite rusty but I definitely felt like I had an advantage over some of the players I encountered (others were likely already on similar displays). I went to Battlefield 3 after that, which is more my speed now (a bit less emphasis on rushes to choke points and face to face headshot competition). I had to drop the mesh quality to keep at the 100+ range on 64 player maps, but it again seemed like a huge advantage.
Do you see further improvement when you adjust the LightBoost OSD setting downwards (not "OFF")? That uses shorter strobes. It does dim the picture, so there's a tradeoff. Some gamers, such as Vega, have reported that a LightBoost setting of 10% gives even more clarity for the very fastest motion.

So far, these types of video game motions benefit hugely from LightBoost:
-- Fast 180-degree flick turns in FPS shooting.
-- Shooting while turning, without stopping turning (easier on CRT or LightBoost)
-- Close-up strafing, especially circle strafing, you aim better.
-- Running while looking at the ground (e.g. hunting for tiny objects quickly).
-- Identifying multiple far-away enemies or small targets, while turning fast
-- Playing fast characters such as "Scout" in Team Fortress 2
-- High-speed low passes, such as low helicopter flybys in Battlefield 3; you aim better.
post #82 of 184
Thread Starter 
Tests show LightBoost outperforms scanning backlights in eliminating motion blur:
Quote:
TFTCentral has tested LightBoost with their equipment and found
LightBoost outperforms all past scanning backlights they have ever tested,
including the old BENQ AMA-Z and Samsung MPA from 2006.

[TFTCentral test result image]

Check out TFTCentral's Motion Blur Reduction Backlights article!
post #83 of 184
So, Mark, pardon me if I'm repeating myself, are any of Sony OLED BVMs strobe-driven? How about that 55 inch LG one? I thought all AMOLEDs were sample and hold currently.

It's good to see technology improving to cover up the deficiencies. 2013 is a good year for plasmas as well. The ZT60 is reported to have even better motion performance and ABL. The input lag of the ST60, however, is disappointing. The Samsung F8500 also has meh input lag, just when I thought I would have a perfect flat panel TV.

So, instead of replacing my GDM-FW900 with an LCD or a plasma, I replaced it with...another CRT. lol
In an age where I've seen the XBR960 being dumped, it seemed unwise to pay hefty money for a CRT monitor, but this Sony BVM was worth every penny. It easily bests my FW900. Thanks to Mark, I'm happy to know that motion blur and input lag are being taken care of, but those are still only two criteria out of the three that I deem important in my gaming display. The final one? The scaling.

I was actually shocked to find out those scaling differences are huge, even on 480i. My BVM easily shows those differences. On my BVM, native 240p games look the best and extremely organic. But those same games line-doubled to 480i showed a nasty symptom: a very emulator-looking pixelation. I used GSM to cut those ugly even lines to create scanlines, and while there was an improvement, that emulator-looking feeling still didn't go away. For a few seconds I was wondering why, but then I realized the horizontal resolution needs to be chopped as well to truly get back to 320x240. Simply discarding a field still meant a 640x240 resolution; that was why I was still seeing that nasty emulation-like pixelation.

After seeing how true 240p games look on my BVM, I immediately regretted having bought all those scaling devices like the XRGB3. They were simply a waste of my money.
post #84 of 184
Quote:
Originally Posted by KOF View Post

So, instead of replacing my GDM-FW900 with an LCD or a plasma, I replaced it with...another CRT. lol
In an age where I've seen XBR960 being dumped, it seemed unwise to actually pay a hefty money for a CRT monitor, but this Sony BVM was worth every penny. It easily bests my FW900.
What model BVM do you have and in what way it is superior?
P.S. I have highly tweaked GDM-FW900.
post #85 of 184
Right now, I have a BVM-20F1U because 15kHz support was direly lacking from my FW900, but after seeing what this one is capable of, I'm definitely getting another one, probably a D32WU or an F24. The most obvious difference is contrast ratio. The FW900, being a computer monitor, never gave me an exciting pop to begin with. That was one area where I thought the XBR960 was better as well. But I've been using my FW900 extensively for gaming since 2005 and it has taken a lot of abuse. The damper lines are vertically shifted, blacks got a lot worse, and the monitor got dimmer from what I remember as well. Maybe it's possible to restore my FW900's contrast ratio? I don't know.

My BVM also looks crisper at 480i than my FW900 at 1080p. I don't know how that's possible, but that's my observation. Of course I'm not talking about resolving detail (which my BVM can do plenty of, but still wouldn't match higher-resolution ones), but how pixels look to me. Scanlines from classic console games are very sharp, and I don't know if I'm going to like that. It was the first CRT display that I thought was very LCD-like. The FW900, even when running at far higher resolutions, still felt very CRT to me. The BVM just seems to strike a nice balance between LCD-like crispness and CRT-like smoothness.

It's a bit too revealing for classic game consoles for my liking. For PS2, I think this one puts out a lovely picture. I never thought there would be a display that could actually handle shimmering-hell games such as Gran Turismo 3 and Virtua Fighter 4. But for Sega Genesis, I prefer my Barco RGB monitor. While its contrast ratio is also nothing to write home about, it offers 3D-like depth for parallax-scrolling-heavy games better than the BVM.
post #86 of 184
Quote:
Originally Posted by KOF View Post

So, Mark, pardon me if I'm repeating myself, are any of Sony OLED BVMs strobe-driven? How about that 55 inch LG one? I thought all AMOLEDs were sample and hold currently.

Watch the video at the bottom of this article:

http://www.engadget.com/2013/04/08/sony-4k-oled-prototypes/

When they switch to the smaller panel, you will notice a distinct flicker/strobing that is absent on the larger 56" version (which is the same as the 56" OLED TV shown at CES). All the videos I've seen of the 55" LG OLED show no visible flicker. Based on these videos, I would say only the smaller Sony OLED panels have any chance to be free from sample-and-hold blur.
post #87 of 184
This is an older video but gives further evidence that Sony uses some type of CRT-like scanning refresh on their smaller OLED sets:

http://www.youtube.com/watch?v=jTfvwOGu4EI

Too bad they can't release a much larger version of this.
post #88 of 184
Thread Starter 
Quote:
Originally Posted by Wizziwig View Post

This is an older video but gives further evidence that Sony uses some type of CRT-like scanning refresh on their smaller OLED sets:
http://www.youtube.com/watch?v=jTfvwOGu4EI
Too bad they can't release a much larger version of this.
In this high-speed video, it looks like a 50%:50% bright:dark impulse ratio.
This would unfortunately reduce motion blur on this OLED by only 50%. Not as good for motion blur as Plasma or LightBoost LCD.
For clean stroboscopic flashes such as this, eye-tracking-based motion blur is directly proportional to impulse length.

Plus, it looks like it would create an annoying 60 Hz flicker. It would be nice to see them come out with 120Hz OLED's. Otherwise, manufacturers have to decide whether to make them flicker-free (eliminate eye annoyance) or make them flicker (eliminate motion blur). Needs to be a configurable option, to accommodate people who are motion-blur-sensitive or flicker-sensitive.
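Since blur is proportional to impulse length for clean global flashes, the 50%:50% duty cycle seen in the video pins the numbers down directly. A small sketch (the duty cycles here are read off the videos, not measured):

```python
# Perceived blur ~ impulse length: visible time per refresh, in ms.

def impulse_mprt_ms(refresh_hz, duty_cycle):
    """MPRT for a display lit for duty_cycle of each refresh period."""
    return (1000.0 / refresh_hz) * duty_cycle

oled_ms = impulse_mprt_ms(60, 0.50)   # 50%:50% bright:dark at 60 Hz
hold_ms = impulse_mprt_ms(60, 1.0)    # sample-and-hold baseline

print(round(oled_ms, 2))              # 8.33 ms: half the hold-type blur
print(round(hold_ms / oled_ms, 1))    # 2.0x improvement, vs ~7-12x for LightBoost
```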
post #89 of 184
Mark Rejhon: Can you make LCD not suck?

If you can then they should give you a job here.
post #90 of 184
Thread Starter 
Quote:
Originally Posted by Artwood View Post

Mark Rejhon: Can you make LCD not suck?
If you can then they should give you a job here.
There are now thousands of enthusiast gamers who love LightBoost -- see media coverage of LightBoost and testimonials. ASUS, NewEgg, pcmonitors.info, TFTCentral, even Ars Technica. Of course, LightBoost only solves motion blur and doesn't solve LCD color, and the colors are worse (at least until you calibrate).

If you have $10K+ to spend, check out the Viewpixx Scientific Research LCD. It's got 1920x1200 with 120Hz refresh, better-than-CRT color gamut, scanning backlight with 1ms response, full array RGB LED backlight. NOW.... That's an LCD that doesn't seem to suck! Well, except for black levels, maybe.

As we now know, some OLED's have motion blur because of sample-and-hold. This is a problem that is easily solved because OLED's switch very fast. However, due to the sample-and-hold issue, most OLED's still fall short of many of the better displays today in terms of motion blur. This will change in the future, but these things take time. LCD technology has been around for 40 years, while OLED's have been around for far less.
Edited by Mark Rejhon - 4/30/13 at 11:40pm