Outlook for Blur Reduction in OLED (or the lack thereof) - Page 4 - AVS Forum
View Poll Results: How will OLED compare to LED/LCD on blur reduction by 2018?
No better than a 60Hz LED/LCD (same as today - no improvement) 2 22.22%
Equivalent to a 120Hz native refresh LED/LCD with 2X frame interpolation (resulting in 120fps SOE) 0 0%
Equivalent to a 240Hz effective refresh rate LED/LCD with 2X frame interpolation and 1/2 backlight scanning (120fps SOE) 0 0%
Equivalent to a 240Hz native refresh LED/LCD with 4X extreme frame interpolation (resulting in 240fps SOE) 0 0%
Equivalent to a 360Hz effective refresh rate LED/LCD with 2X frame interpolation and 1/3 backlight scanning (120fps SOE) 0 0%
Equivalent to a 480Hz effective refresh rate LED/LCD with 2X frame interpolation and 1/4 backlight scanning (120fps SOE) 0 0%
Equivalent to a 480Hz effective refresh rate LED/LCD with 4X extreme frame interpolation and 1/2 backlight scanning (240fps SOE) 0 0%
Equivalent to plasma one way or the other (600Hz equivalent, ~1200 lines of motion resolution) 2 22.22%
Better than plasma and close to CRT (the Holy Grail) 5 55.56%
Voters: 9. You may not vote on this poll

post #91 of 120 Old 05-16-2014, 08:36 PM
Wizziwig (AVS Special Member)
This is not a computer monitor forum so this entire discussion is utterly pointless. Who cares if a tiny TN LCD with horrible image quality can produce a blur free image? There are no LCD TVs on the market with equivalent results and there likely never will be. The compromises are just too great.

Real world LCD TVs have horrible GtG which does visibly affect blur. If they didn't, nobody would complain about crosstalk when watching active 3D on LCD TVs.

The people who prefer LG OLED are comparing it to similar sized TVs, not small TN computer monitors. Why keep posting pictures and discussing them?
post #92 of 120 Old 05-16-2014, 09:01 PM
Mark Rejhon (AVS Special Member)
Quote:
Originally Posted by Wizziwig View Post

This is not a computer monitor forum so this entire discussion is utterly pointless. Who cares if a tiny TN LCD with horrible image quality can produce a blur free image? There are no LCD TVs on the market with equivalent results and there likely never will be.
Wrong. See Sony Motionflow Impulse, a successful pure-strobing, low-latency blur-elimination mode that works great with zero interpolation artifacts and is suitable for computer/gaming, too. Granted, it flickers a lot (60Hz), but the option is there for those comfortable with the flicker. Several Blur Busters readers have purchased this TV and have complimented its motion-blur-eliminating ability, calling it the most CRT-like LCD HDTV they have ever seen for the purposes of 60fps@60Hz gaming. Just look at the Comments section on Blur Busters. It's not perfect, but it only needs to strobe at 60Hz (CRT-style flicker), which gives incomplete GtG more time to finish than TN panels strobing at 120Hz get.
Quote:
Originally Posted by Wizziwig View Post

Real world LCD TVs have horrible GtG which does visibly affect ghosting. If they didn't, nobody would complain about crosstalk when watching active 3D on LCD TVs.
Fixed it for you.
Also, it is quite true that crosstalk is a problem on many LCDs, especially VA LCDs, which are slower than TN LCDs.
However, the point remains that the motion blur reductions are huge.
And the crosstalk is not visible in all material; it mainly shows in material with high-contrast boundaries.

Also, 2D strobe crosstalk (from incomplete GtG) is a lot less visible than 3D stereoscopic crosstalk for the same contrast images because:
(A) You have the shutter-glasses leakage to contend with, as well (above and beyond the GtG limitations of the LCD); and
(B) The 2D crosstalk only appears during fast motion, and the separation is only one frame apart. Even fast 1/2-screen-width-per-second motionspeeds (960 pixels/sec) create crosstalk images that are only 16 pixels apart, which is often less than the separation between objects during 3D stereoscopic crosstalk.
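If you want to sanity-check the arithmetic in (B), here's a minimal back-of-envelope sketch (illustrative Python, assuming the leaked ghost refresh trails the tracked refresh by exactly one frame at 60fps):

Code:
# Ghost offset of 2D strobe crosstalk during eye tracking, assuming the
# leaked refresh is exactly one frame behind the tracked refresh (60fps).

def crosstalk_separation_px(speed_px_per_sec, frame_rate_hz):
    # Pixels the tracking eye travels in one frame time = ghost offset.
    return speed_px_per_sec / frame_rate_hz

# Half-screen-width-per-second panning on a 1920-wide screen:
print(crosstalk_separation_px(960, 60))   # -> 16.0 pixels, as stated above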
Quote:
Originally Posted by Wizziwig View Post

The people who prefer LG OLED are comparing it to similar sized TVs, not small TN computer monitors. Why keep posting pictures and discussing them?
Because new TVs have come out that use strobing to reduce motion blur.
For example, several new Vizio TVs have recently introduced a low-persistence Game Mode, and more manufacturers have started studying ways to add low persistence (at low latency, either via strobing and/or ultralow-latency interpolation) for gaming.

Examples:
- Sony Motionflow Impulse in "Game Mode".
- Vizio High Velocity Mode in "Game Mode", debuted at CES 2014
etc.

It's a tsunami occurring right now, and manufacturers have suddenly started paying attention. Two years ago, when I brought a high-speed camera (Casio EX-ZR200) into a TV showroom, only the plasmas (and 2 or 3 high-end LCD TVs) were flickering in high-speed video. But earlier this month, when I brought a camera into a TV showroom, more than one-third of the LCDs were flickering in the high-speed video at the strobe frequency of the video (e.g. 120 strobes per second for 120fps interpolated video, or 240 strobes per second for 240fps interpolated video). It's pretty clear that over the last 2 years, LCD manufacturers have made huge strides in adding strobe-based (scanning/strobed) motion blur reduction to LED/LCD televisions, even in midrange models and certain semi-entry-level models. On most, the strobe modes can be disabled, but strobing has become the default mode on a lot of models, as it reduces motion blur above and beyond interpolation alone.

My eyes prefer the colors of OLED, and I have already seen really good low-persistence, darn-near-CRT-quality OLEDs (the Oculus Development Kit 2 virtual reality headset), so I hope the awesome CRT-approaching OLED persistence I witnessed comes to large-screen-format OLEDs. OLEDs will certainly make it, but right now, for large-format TVs, LCDs have been racing remarkably forward in blur-elimination technologies, both in the computer display arena and the big-screen arena. Manufacturers need to ramp up the luminance of OLEDs and make sure adequate low-persistence modes are available on OLEDs going forward, including low-latency, interpolation-free methods of low persistence.

As you can see, my posts are fully relevant to this discussion, especially if you value low persistence even in video material, like a lot of big-screen gaming and sports aficionados do -- more and more of us are using our big-screen displays in persistence-limits-pushing situations. Computers and gaming are more mainstream in home theaters today than they were 10 years ago. Most people don't care (like the people who don't care about Plasma vs LCD), but the slowly increasing availability of optional low-persistence modes is always a Good Thing, regardless of OLED or LCD. Also, just as we have niches of videophiles in various categories (e.g. contrast ratio nuts, or color uniformity nuts), we also have the niche of motion-clarity nuts (e.g. lovers of plasma, CRT, and the newer strobed LCDs), and motion clarity is a big niche: a lot of users prefer the 120fps/240fps soap opera effect (low persistence achieved via interpolation). Most users don't understand this stuff. But there are also many who see motion on TVs in a showroom and think "Hey, that looks very clear in motion" during sports (that person is instantly seeing the benefits of low persistence). Obviously, some of us motion-clarity nuts are a videophile league above that: wanting blur-free motion clarity without the soap opera effect or interpolation artifacts (motion more like a CRT), and largely achieving that overall effect on certain strobed LCDs (including certain full-sized HDTVs). And some of us (like me) want that low-persistence effect in a low-latency manner suitable for Game/PC Mode, which is appearing on an increasing number of HDTV models nowadays. If you're denying that market exists, you've been living under a rock with 2012-era LCD opinions.

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

post #93 of 120 Old 05-17-2014, 08:15 AM
tgm1024 (AVS Special Member)
Quote:
Originally Posted by Wizziwig View Post

This is not a computer monitor forum so this entire discussion is utterly pointless. Who cares if a tiny TN LCD with horrible image quality can produce a blur free image? There are no LCD TVs on the market with equivalent results and there likely never will be. The compromises are just too great.

 

Mark is addressing the general field of blur reduction.  This involves discussions of theory, which pulls in technologies that have proven themselves elsewhere and may one day end up in TVs.  Besides, there's no way to discuss blur within the confines of what currently exists for TVs, because the TV world is slowly catching up, most likely because the awareness and demands of the public keep rising.

 

The problem I have is that I want the field to listen more to people's first-hand experiences.  As you were speculating earlier, there are reasons for impressions such as Conan's, he is absolutely not alone, and I'm not sure we're properly accounting for it.

 

Quote:
Originally Posted by Mark Rejhon View Post
 
Quote:
Originally Posted by tgm1024 View Post

Where I'm heading with this ultimately is that a pursuit camera might not capture what folks like Conan are experiencing.  He's seeing phenomenal motion handling (low discomfort I suppose) even with the persistence that the LG OLED is having.  And I don't believe he's limiting this to stationary eye viewing.  Even though that eye tracking graph of yours (and earlier, Microsoft) is sound, the pursuit camera might show a clearer image than we (or at least some) see with non-OLED GtG.  The camera doesn't move like an eye, the eye can stutter (I think), and I think the discomfort might be more from GtG than you or I are giving credit for.
That's called eye saccades (eye-tracking errors). However, that's insignificant at 960 pixels/second for most humans at a 1:1 view distance from a 1080p gaming display, based on my tests. Eye tracking in these situations is accurate. Consistent motion artifacts tend to show up both to vision and to a camera (e.g. a pursuit camera successfully captures plasma contouring artifacts that are also seen by the human eye).

 

No, I know what eye saccades are; and those are much slower things that happen with or without attempted tracking.  And I'm talking about more than just a physical stutter, so I shouldn't have just listed that alone.

 

Your reply was thorough and complete, but the camera tracking still doesn't quite address what I'm trying to explain.

 

Even if the camera was moving along steadily, and even if the eye moved just as steadily (I don't believe that's true, but we'll stick with it for the moment), I believe there is a kind of unpredictable relaying of information from the eye->brain.  Our eyes are not CCD arrays, and the information isn't continuously fed.  Even though the camera can pick up artifacts that the human eye can pick up, I believe that the camera cannot pick up the sum total of what the person is experiencing.  For instance, the clarity of the edges of the object itself would alter how well we track: this might dramatically change how we are then able to interpret the object, because the eye loses acuity off center.  That's not the case with a pursuit camera: that will track the same regardless.

 

Start with the basics for a sec.  Which of these two people experiences more discomfort (both eye tracking and stationary), in a 16.67 ms refresh with no pulse.

 

Person A: <--2ms GtG--><---------------8 ms static hold------------------><--2ms GtG--><-----4.67 ms blanking----->

Person B: <------------------12 ms static hold with mythical 0ms GtG------------------><-----4.67 ms blanking----->

 

?


WARNING: You have now entered a no @#$%tard zone. Please gather your anti-vaccine propaganda nonsense and slowly back out the way you came in.
post #94 of 120 Old 05-17-2014, 09:37 AM
Mark Rejhon (AVS Special Member)
Quote:
Originally Posted by tgm1024 View Post

The problem I have is that I want the field to listen more to people's first-hand experiences.  As you were speculating earlier, there are reasons for impressions such as Conan's, he is absolutely not alone, and I'm not sure we're properly accounting for it.
Yep: correct. That is what a lot of my previous post covers.
Quote:
Originally Posted by tgm1024 View Post

No, I know what eye saccades are; and those are much slower things that happen with or without attempted tracking.  And I'm talking about more than just a physical stutter, so I shouldn't have just listed that alone.
Yes. It is all relative. Eye saccades and any other eye-tracking errors of any kind create additional tracking inaccuracies. No dispute. Display stutters are the same: additional tracking inaccuracy. Relative eye position versus object position is, at the end of the day, what matters, and accuracy can degrade at both ends (e.g. saccades at the source, stutter at the destination, or whatever other tracking degradations).

In fact, persistence blur can also be categorized as a form of tracking inaccuracy. Lower framerates lead to lower-accuracy tracking, defined as worse sync between source (eye) and destination (display). Lower framerates produce more of a motion blur sensation, down to a certain point where the blur looks more like stutter. High-frequency stutters convert into motion blur; persistence-based motion blur is simply high-frequency stutter turning into blur. The stutter (seen in high-speed video) is invisible to the human eye and just looks like natural motion blur. The 60fps@60Hz UFO, viewed on a 60Hz LCD, is blurrier than on a 60Hz CRT. Longer persistence means more time a frame is displayed, which produces more time for a frame to fall behind during tracking. From this, you recognize persistence-based blur is simply high-frequency stutter/judder that is so fast that it blends into blur, which we call motion blur. It is the interaction between continuously moving eyes and the discrete stepping forward of static frames, and this occurs well beyond stutter-detection frequencies. It is so natural looking at high frame rates (when GtG is a nonissue) that it often looks like source-based blur.

View the 5-UFO version on a native 120Hz computer monitor with strobing disabled, www.testufo.com#count=5 and you will recognize that persistence-based blur is simply high-frequency stutters blending into blur. On flicker-free displays with average GtG significantly less than a refresh cycle, this is exactly what the animation demonstrates.

Viewing www.testufo.com/#count=5 on a 120Hz computer monitor:
120fps - shortest trail
60fps - twice as long a trail as 120fps at the same motionspeed
30fps - four times as long a trail as 120fps at the same motionspeed; the blur trail starts to look stuttery/juddery
15fps - eight times as long a trail as 120fps at the same motionspeed, but the blur trail looks visually more like stutter than blur
7.5fps - sixteen times as long a trail as 120fps at the same motionspeed; it is simply stutter, as it is too slow to blend into blur.

You can clearly tell the amplitudes of the stutters halve as you double the framerate, and this keeps going beyond 60Hz and 120Hz, after the stutters have already blended into motion blur (that is what persistence-based motion blur is!). The thresholds vary from person to person (some see 30fps more as stutter, others more as additional motion blur, depending on how sensitive you are to stutters), but the trail length is always consistent, regardless of whether "trail" describes the stutter amplitude or the length of blurring.
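If you want to play with the numbers behind that list, here is a toy calculation (illustrative Python; it assumes an idealized flicker-free sample-and-hold display with instant GtG, so all blur comes purely from persistence):

Code:
# Toy model of persistence-based blur on an idealized flicker-free
# sample-and-hold display with instant GtG: each frame is held for
# 1/fps seconds, so a tracking eye smears it across (speed/fps) pixels.

def trail_px(speed_px_per_sec, fps):
    return speed_px_per_sec / fps   # hold time x tracking speed

speed = 960  # pixels/sec, the same motionspeed for every UFO row
for fps in (120, 60, 30, 15, 7.5):
    print(fps, "fps ->", trail_px(speed, fps), "px trail / stutter amplitude")
# 120fps -> 8.0 px, 60fps -> 16.0 px, 30fps -> 32.0 px, 15fps -> 64.0 px,
# 7.5fps -> 128.0 px: the amplitude doubles every time the framerate halves.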

(Bonus exercises for anyone equipped with extra displays to test on:
Turn on strobing, or use a CRT. You will see multi-image effects instead of blur. Now run the above tests on an adjustable-persistence display (e.g. BENQ Z-Series). As you adjust persistence, the amount of motion blur dynamically changes. For the low framerates, the multi-image effect becomes fainter at higher persistence and stronger at lower persistence. Once the persistence hits a full frame cycle, the multi-image effect disappears for the lower frame rates (e.g. the 30fps@60Hz double-image effect), as there cease to be black periods between refreshes chopping the motion blur up into multi-image effects.)

Eye-tracking inaccuracies at the eye level do not change the fact that tracking is more accurate on the display plane at higher framerates or lower persistence levels.

The educational comparative animation can still be seen on your existing 60Hz display (iPad, computer monitor, laptop), but it is even more obvious on a native-120Hz non-impulse display. This is because you get multiple samples of pure persistence-based blur (60fps and 120fps) and multiple samples of stutter amplitudes (30fps, 15fps, 7.5fps), which lets you more easily reach the "Eureka, I understand" moment and fully recognize the continuum between persistence blur amplitude (blur trail length) and stutter amplitude. Persistence blur is simply a high-frequency stutter/judder blending into blur, and thus an increase in persistence is simply an enforced degradation in tracking accuracy (as visible stutters are, as well). It looks so natural as motion blur that we never call it stutter when the stutter frequency is beyond human detection limits. Persistence blur is surprisingly simple & elementary once one gets the "Eureka" moment. Like successfully teaching someone to see 3:2 judder.
Quote:
Originally Posted by tgm1024 View Post

Even if the camera was moving along steadily, and even if the eye moved just as steadily (I don't believe that's true, but we'll stick with it for the moment), I believe there is a kind of unpredictable relaying of information from the eye->brain.  Our eyes are not CCD arrays, and the information isn't continuously fed.  Even though the camera can pick up artifacts that the human eye can pick up, I believe that the camera cannot pick up the sum total of what the person is experiencing.  For instance, the clarity of the edges of the object itself would alter how well we track: this might dramatically change how we are then able to interpret the object, because the eye loses acuity off center.  That's not the case with a pursuit camera: that will track the same regardless.
Although not relevant to this thread, as it has nothing to do with impressions here except insofar as tracking accuracy relates to seeing persistence, you are certainly right: static photos do not capture the dynamic nature of things like temporal dithering or other temporal effects, which just get averaged out when the camera, for instance, captures multiple plasma subfields. Info is continuously fed to the eye, but only a slice of time is fed to a camera. So on that note, you are right. However, this effect doesn't play here in why some do not notice the persistence blur. Unless you're just simplifying to "their eyes don't track as often and/or they don't track as accurately" (which is true; these inaccuracy factors overwhelm persistence), in which case we can skip the complex speculation, as all that matters is sync between source (eyes) and destination (display).

I can at least tell you I have confirmed that these factors do not cover impressions such as Conan's, so I fail to see how it is relevant to a person not seeing persistence-related motion blur, except that the more inaccurate eye tracking is, the harder it is to see persistence-related blur. That is a simpler, and scientifically confirmable, statement than your speculation. Persistence-based blur is tracking-based blur, and tracking accuracy degradations affect it (whether via source stutter like eye saccades or destination stutter such as video stutter). Tracking inaccuracies added by eye saccades simply degrade the ability to see persistence-based motion blur, but don't change the fact that low framerates and stutter always degrade tracking accuracy. The tracking errors add up. More stutter, more tracking error.

Examples:
- Lower framerates - www.testufo.com/framerates#count=3 - the higher the framerate, the more accurate the sync between eye position and object position over time.
- Stutters - www.testufo.com/stutter - tracking is more accurate the less object-position error (stutter) there is over time.

Now, additionally, view these on a CRT, plasma, LightBoost, or other low-persistence tech, and what I am talking about suddenly becomes obvious: persistence blur is minimized in the framerate-matching-refreshrate, consistent-motion scenario. Lower framerates create positional inconsistencies (showing up as stutter, judder, natural-looking persistence-based blur, or multi-image effects; all of which are tracking inconsistencies!).

Inability to see persistence blur is simply tracking-related, such as not tracking at all. That part is not speculation. You do not easily see stutters either when not tracking motion. (Ever tried to see 3:2 judder while keeping your eyes perfectly stationary? It is much harder.) There are speculations in this thread going in obviously wrong directions (wrong answers for why persistence blur is not seen); I am simply steering them in a more correct direction based on known science.

Obviously, for real-world material, the importance of persistence varies a lot, as most material watched on a screen is not persistence-critical except for certain sports broadcasts, certain computer activities, and several kinds of video games. So a heck of a lot of us don't exercise our eyes much while watching material, especially at typical TV viewing distances. If you really wanted to run those motion-test Blu-rays, I should point out that successfully resolving the "1080 lines of motion resolution" test patterns (at fast 16ppf motionspeeds) already means the display has low persistence one way or another; it is not possible to see all the lines at such motionspeeds without a low-persistence technique of some kind. So those who claim excellent motion handling on that pattern at fast motionspeeds are already testing a low-persistence mode (even without always knowing how the display is achieving low persistence). I am simply cutting to the chase here, since that is a litmus test of low persistence. The test is always guaranteed to fail to reach full motion resolution (at 16ppf - 16 pixel steps per frame - 960 pixels/sec - half a screen width per second) when you disable BFI or interpolation. So a display that passes 1080 lines at 16ppf by default is already impulsing and/or interpolating.

*Note: Stutter and judder are often confused with each other; even I may have interchanged the words. But the basic point is the same (both are tracking inconsistencies).
Quote:
Originally Posted by tgm1024 View Post

Start with the basics for a sec.
I just did.
Quote:
Originally Posted by tgm1024 View Post

Which of these two people experiences more discomfort (both eye tracking and stationary), in a 16.67 ms refresh with no pulse.
All humans are different. On Blur Busters, there are hundreds of people who get flicker eyestrain and hundreds who get motion blur eyestrain during PC gaming. People who get eyestrain from flicker will prefer flicker-free (full persistence). People who get eyestrain from motion blur will prefer low persistence, especially at high flicker frequencies (the lesser of evils). Blurry motion, especially at close viewing distances, creates eye-focusing strain as the eye tries to focus on the moving object but cannot, due to persistence-based blur. There are people reporting reduced eyestrain from strobe backlights during fast-motion gaming; this is why the LightBoost FAQ has entries for both "Why Does LightBoost Have MORE Eyestrain?" and "Why Does LightBoost Have LESS Eyestrain?".

More study is needed on this (no dispute) to reach an objective answer. Adjustable-persistence displays are a new invention in the gaming world, for the first time showing a same-display, same-environment effect, with a statistically significant boom (hundreds) of anecdotal reports in both directions at Blur Busters and in comments/forums relating to LightBoost and similar tech. Regardless, it had no relationship to how much blur reduction was actually witnessed by users; people who had flicker eyestrain still typically saw the persistence blur anyway (and the blur-reducing effect of lower persistence), if they tracked their eyes in a motion test.

Thanks,
Mark Rejhon


post #95 of 120 Old 05-17-2014, 10:21 AM - Thread Starter
fafrd (AVS Special Member)
Mark,

appreciate your continued contributions to the thread. Here's a repost of the two questions I asked you earlier in the thread in case you missed them:
Quote:
Mark,

I've learned a great deal from all of your posts but there are two questions that I would appreciate further clarity on:

  • Is there any difference between the persistence and perceived motion blur between a 120Hz 50% BFI refresh (60Hz framerate with 8.3ms persistence) and a 240Hz 50% BFI with frame-repeat (120Hz framerate with 4.1ms persistence but each frame repeated twice)?
  • In terms of perceived 'flicker' when using BFI (or a scanning backlight), are higher frequencies of BFI/ flicker harder to detect? So if 50% BFI is going to be used to reduce persistence with 60fps source material, will 120Hz 50% BFI no frame repeat have more noticeable flicker than 240Hz 50% BFI with single-frame-repeat both of which will be more noticeable than 480Hz 50% BFI with triple-frame-repeat?


The reason I ask is that the newer LED/LCD displays are developing faster and faster action rates for their scanning backlights, and I am trying to understand the advantages these higher action rates may provide in the case that no frame interpolation is used.

Thanks in advance.
post #96 of 120 Old 05-17-2014, 02:43 PM
Mark Rejhon (AVS Special Member)
In the whirlwind of this thread, I missed your questions, apologies. Here are my answers:
Quote:
Originally Posted by fafrd View Post

Is there any difference between the persistence and perceived motion blur between a 120Hz 50% BFI refresh (60Hz framerate with 8.3ms persistence) and a 240Hz 50% BFI with frame-repeat (120Hz framerate with 4.1ms persistence but each frame repeated twice)?
They are totally different.
You must do only one impulse per unique frame, for full motion clarity.

Repeat refreshes of the same frame, during eye-tracking of motion, on impulse displays, always creates a double-image effect or multi-image effect.
  • CRT 30fps@60Hz = double image effect during eye tracking of motion
  • Plasma 30fps@60Hz = double image effect during eye tracking of motion
  • LightBoost 60fps@120Hz = double image effect during eye tracking of motion
  • PWM dimming 60fps@180Hz (example) = triple image effect during eye tracking of motion
  • CRT/LightBoost 120Hz doing 30fps@120Hz = quadruple image effect during eye tracking of motion

If you're doing 60 unique frames per second at 120Hz + 50% BFI (240Hz equivalence) with no frame interpolation, you get a double-image effect. I've seen this happen on several strobed LCDs, including Eizo's Turbo240 with a 60Hz signal, which enables an annoying double-strobe mode during 60Hz operation to reduce flicker (120 dominant strobes per second instead of 60), but at the cost of a visible double-image effect. The cause is eye tracking occurring between repeat refreshes: the repeat flickers land on different parts of the retina, due to the temporal offset between repeat impulses of the same static frame, as eye tracking has already moved on between the repeat impulses. Double strobing is pretty common (fortunately for flicker haters, but unfortunately for motion clarity lovers). Manufacturers are scared to strobe at low frequencies because it flickers so painfully for general computer use, so they prefer to double-strobe (120 strobes per second during 60Hz). This is the unfortunate non-overridable situation for the EIZO FG2421, as well as Version 1 of the BENQ XL2411Z, XL2420Z and XL2720Z.
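To put rough numbers on that double-image effect, here's an illustrative sketch (Python; it assumes steady eye tracking and evenly spaced strobes):

Code:
# Repeat strobes of the SAME frame land on different retinal positions
# during eye tracking; each copy is offset by (speed / strobe_rate) pixels.

def images_per_frame(strobe_rate_hz, unique_fps):
    return strobe_rate_hz / unique_fps       # strobes per unique frame

def image_separation_px(speed_px_per_sec, strobe_rate_hz):
    return speed_px_per_sec / strobe_rate_hz

# 60fps console game on a 120Hz double-strobe backlight, 960 px/sec pan:
print(images_per_frame(120, 60))             # -> 2.0 (double image)
print(image_separation_px(960, 120))         # -> 8.0 px between the copies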

I convinced BENQ (press release) to release Version 2 firmware that permits single-strobe at 60Hz on the BENQ Z-Series, so when you click the "Override" checkbox in Blur Busters Strobe Utility, it disables the flicker-reducing-but-double-image-inducing 120Hz strobe during 60Hz, and does only 60 strobes per second at 60Hz. This brings CRT clarity to 60fps @ 60Hz on this LCD, but at the severe cost of annoying flicker. It is the first LCD computer monitor to bring CRT-league motion clarity to PlayStation 4 and Xbox 360 60fps games. It's not comfortable for computer desktop use (imagine sitting at similar vision coverage to a 60Hz CRT or a low-persistence plasma (e.g. Panasonic VT50) at a 1:1 view distance: eye-searing flicker), but it is fine for several users playing video games at slightly further distances in properly lit rooms, since games have a lower average picture level than the computer desktop, and the motion clarity of fast motion starts to balance things out.

So we now have several LCD blur-reducing strobe backlights (in both the monitor and HDTV worlds) that work at 60fps@60Hz with one impulse per refresh, for proper blur reduction without artifacts (except flicker). A lot of users hate flicker (and just use 120Hz strobing during 120fps@120Hz PC gaming to solve it), but it's an optional feature that other users love, since gaming consoles don't support 120Hz. The option is there, enableable/disableable, just as it should be. It's much easier to strobe an LCD at 60Hz than at 120Hz; manufacturers are just reluctant to sear users' eyes with the CRT 60Hz experience. Besides, shutter glasses give your eyes the 60Hz flicker experience anyway, so it's no more eyestrain-inducing than shutter glasses running at 60/60, especially if you adjust brightness (independently of persistence: pulse height) and persistence (independently of brightness: pulse width) accordingly, something you're able to do with the BENQ Z-Series, as the "Brightness" OSD adjustment adjusts pulse height, and the "Persistence" Strobe Utility adjustment adjusts pulse width. So the strobe duty cycle of a BENQ Z-Series is customizable by advanced users, to minimize the eyestrain tradeoff of flicker discomfort versus motion blur discomfort versus brightness discomfort.
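That pulse-height/pulse-width tradeoff can be modeled crudely. A minimal sketch (Python, illustrative numbers only; it assumes average luminance scales with pulse height times duty cycle, and MPRT-style blur with pulse width times tracking speed):

Code:
# Crude model of the strobe-backlight tradeoff: "Brightness" = pulse
# height, "Persistence" = pulse width (per the BENQ Z-Series mapping).

def avg_brightness(pulse_height, pulse_width_ms, strobe_rate_hz):
    duty_cycle = (pulse_width_ms / 1000.0) * strobe_rate_hz
    return pulse_height * duty_cycle         # average luminance (relative)

def motion_blur_px(pulse_width_ms, speed_px_per_sec):
    return (pulse_width_ms / 1000.0) * speed_px_per_sec   # MPRT-style blur

# Halving pulse width halves blur but also halves average brightness,
# unless pulse height is raised to compensate (960 px/sec pan, 60Hz strobe):
print(motion_blur_px(2.0, 960), avg_brightness(1.0, 2.0, 60))  # 1.92 px, 0.12
print(motion_blur_px(1.0, 960), avg_brightness(1.0, 1.0, 60))  # 0.96 px, 0.06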
Quote:
Originally Posted by fafrd View Post

In terms of perceived 'flicker' when using BFI (or a scanning backlight), are higher frequencies of BFI/ flicker harder to detect? So if 50% BFI is going to be used to reduce persistence with 60fps source material, will 120Hz 50% BFI no frame repeat have more noticeable flicker than 240Hz 50% BFI with single-frame-repeat both of which will be more noticeable than 480Hz 50% BFI with triple-frame-repeat?
Flicker visibility is independent of framerate; it is based on the flicker frequency and flicker duty cycle. So the answer is yes: 120 impulses per second (regardless of framerate or unique refreshes) has less visible flicker than 60 impulses per second.

Duty cycle also has a big effect on how easy it is to see flicker. An example is www.testufo.com/blackframes#count=3 ... Longer black-frame duty cycles make flicker easier to see, so very-short-persistence displays work best at higher strobe rates (e.g. 120 impulses per second or higher). That's why motion interpolation is common, even in combination with BFI (certain 240Hz HDTVs use interpolation to go 60Hz->120Hz, and then BFI to emulate 120Hz->240Hz). It is important if you want to avoid the double-image effect (or quadruple-image effect, e.g. 30fps@120 strobes per second). Many displays use interpolation to increase frame rates to 120 or 240 unique images per second, and then strobe each of them at 120 or 240 impulses per second (strobes or scanning-backlight passes per second). Some displays DO use shortcuts such as repeat impulses on the same frame, but that is deleterious to motion clarity, as already described above. It's certainly a "pick your poison" situation when you're trying to balance flicker annoyance and interpolation annoyance while designing motion-blur-eliminating schemes for displays. It's one of the reasons I wish video were based on true 120fps or true 240fps, so we could have low persistence without the flicker problems of 60Hz impulsing (plasma/CRT/BFI/strobe flicker), and without the artifact problems of interpolation.

So as long as we're stuck with the 60fps video standard, those of us who seek maximum motion sharpness/clarity above all else (regardless of display tech) are stuck with the compromise soup of impulsing+interpolation combinations, or putting up with the 60Hz flicker of interpolation-free/repetition-free impulsing like CRTs, plasmas, 60Hz LCD strobing (Sony Motionflow Impulse & BENQ Z-Series V2), Sony PVM-series rolling-scan OLED displays, or _any_ low-persistence 60Hz mode that avoids motion interpolation and avoids multi-image effects -- all of them flicker at 60 cycles per second for people who are sensitive to 60Hz flicker.

All of this is relevant to OLED, as we'll have to cope with OLED flicker during rolling-scan 60Hz operation. If you've seen a Sony PVM series, you'll know what I mean about 60Hz OLED flicker. It also happens with the Samsung curved OLED when you fiddle the settings into the pure strobe mode (without interpolation). It's certainly a tradeoff. Most future OLEDs will continue to use BFI and interpolation to achieve low persistence without flicker, as long as we're stuck on the 60fps video standard.

Achieving low persistence can be verified by testing the Blu-ray motion tests on HDTVs, too. While a TestUFO.com user can test the ability to pass the Panning Map Test, a Blu-ray user can also verify seeing the full "1080 lines of motion resolution" at similar motion speeds (assuming 1-pixel-thick lines). These are reasonable litmus tests of low persistence (a neighborhood of ~2ms or less causes either of these TestUFO or Blu-ray tests to reliably pass at 960 pixels/sec motion speeds). An advantage of frame interpolation is that it brightens the image, since you can get the same motion clarity at double the strobe rate (2x brighter) or quadruple the strobe rate (4x brighter), while maintaining strobe lengths and one strobe per frame. This is why some OLED HDTVs continue to use interpolation (MCFI) to lower motion blur (aka lower persistence).

Thanks,
Mark Rejhon


post #97 of 120 Old 05-17-2014, 03:16 PM
tgm1024 (AVS Special Member)
Quote:
Quote:
Originally Posted by Mark Rejhon View Post
 
Quote:
Originally Posted by tgm1024 View Post

Even if the camera was moving along steadily, and even if the eye moved just as steadily (I don't believe that's true, but we'll stick with it for the moment), I believe there is a kind of unpredictable relaying of information from the eye->brain.  Our eyes are not CCD arrays, and the information isn't continuously fed.  Even though the camera can pick up artifacts that the human eye can pick up, I believe that the camera cannot pick up the sum total of what the person is experiencing.  For instance, the clarity of the edges of the object itself would alter how well we track: this might dramatically change how we are then able to interpret the object, because the eye loses acuity off center.  That's not the case with a pursuit camera: that will track the same regardless.

Although not relevant to this thread, as it has nothing to do with impressions here except insofar as tracking accuracy relates to seeing persistence, you are certainly right: static photos do not capture the dynamic nature of things like temporal dithering or other temporal effects, which just get averaged out when the camera, for instance, captures multiple plasma subfields. Info is continuously fed to the eye, but only a slice of time is fed to a camera. So on that note, you are right. However, this effect doesn't play here in why some do not notice the persistence blur.

 

Backing up several steps, there are a few (meta) observations here in our conversation:

  1. I'm not making clear that I'm not just referring to physical tracking, but to an overall tracking of some kind.  We might want to call it some kind of discomfort, or just the overall impression of following it.
  2. You're giving me an enormous amount of information, 99% of which I agree with, but it is still predicated on a concept I'm merely wondering out loud has too much faith put in it.
  3. I'll keep trying to explain this, because you're still convinced of one kind of blur being one thing and another kind of blur being another (substitute blur for whatever you like), and I'm trying to say we might be overlooking something.  This is frustrating for both of us, because you believe it to be a misunderstanding of your research, which I've been following, and I'm trying to explain that it isn't, but I can't because it involves speculation.
     
Quote:
I can at least tell you I have confirmed that these factors do not cover impressions such as Conan's, so I fail to see how it is relevant to a person not seeing persistence-related motion blur, except that the more inaccurate eye tracking is, the harder it is to see persistence-related blur. That is a simpler, and scientifically confirmable, statement than your speculation.

 

I don't believe it is, and the reason is that you haven't yet found a way to simulate eye tracking.  And a pursuit camera is not it.  At least I don't think it is.

 

 

Quote:

Quote:

Originally Posted by tgm1024 View Post

Which of these two people experiences more discomfort (both eye tracking and stationary), in a 16.67 ms refresh with no pulse.

All humans are different.

 

Precisely.  But which of the two do you suspect BEST fits your theories and observations?  If you answer anything in this post, answer that.

 

I'm afraid a lot of the "persistence blur is only related to eye tracking" thinking is leaving out a distinct possibility: someone who tracks perfectly might see persistence blur AND someone who tracks poorly might see persistence blur.  Again, substitute blur for discomfort or judder or whatever.  The pursuit camera does not prove what's happening neurologically as the eye tries to keep up with a stuttering object----we've had theories on this for nearly (14 years?), but as soon as we say "all people are different", we're exposing the holes in a theory that tries to lock things down without exception.

 

Quote:

 

On Blur Busters, there are hundreds of people who get flicker eyestrain and hundreds who get motion blur eyestrain during PC gaming. People who get eyestrain from flicker will prefer flicker-free (full persistence). People who get eyestrain from motion blur will prefer low persistence, especially at high flicker frequencies (the lesser of evils). Blurry motion, especially at close viewing distances, creates eye-focusing strain as the eye tries to focus on the moving object but cannot, due to persistence-based blur. There are people reporting reduced eyestrain from strobe backlights during fast-motion gaming; this is why the LightBoost FAQ has entries for both "Why Does LightBoost Have MORE Eyestrain?" and "Why Does LightBoost Have LESS Eyestrain?".

More study is needed on this (no dispute) to reach an objective answer.

 

Exactly.  Now this is where I'm starting from, not ending at, and what I'm saying is that it's entirely possible we're overlooking something with persistence.  It is possible for GtG to dramatically trump persistence for some people, and when we overly focus (no pun intended) on certain tests we may miss the forest for the trees.

 


post #98 of 120 Old 05-17-2014, 03:22 PM
tgm1024 (AVS Special Member)

I'm sorry, but I'm punting this sub-conversation.  As soon as it turns into "you're not listening", "no, you're not listening", etc., it's gone too far.


post #99 of 120 Old 05-17-2014, 03:34 PM
Mark Rejhon (AVS Special Member)
I feel a lot of this discussion is being sidetracked from the topic of why people don't see persistence-based blur. There are many legitimate sub-topics (I agree with most of what you're saying, tgm1024, except for their significance), which can conceivably cover a minority of factors. If the goal of the discussion is to "find all possible reasons, including minority reasons", then I am the one at fault for sidetracking it (and for that, I apologize). If the goal is to find the dominant reason, then I am the one steering it back on topic. Otherwise it's like "humans can't tell apart 30fps and 60fps", which was a flamewar-worthy discussion in the 80s/90s but is more of a dead horse in the 21st century.

It's a forest of factors out there, but as I'm trying to re-explain, the minority factors you speak of can be safely overlooked because of an overwhelming and obvious factor. The obvious primary reason a lot of people don't immediately notice persistence-based blur is that persistence-based blur is so natural looking. It looks exactly like source-based motion blur, especially when GtG is a nonfactor (as it is for OLEDs) or close to it (as it is for the better strobed LCDs) for real-world material, to the point where persistence blur is not noticed at all because it just looks like source-based blur. One can train oneself to pay more attention to persistence-based blur via a self-test that makes it obvious (especially if you own a 120fps@120Hz non-impulse display, such as a 120Hz monitor in non-strobe mode, a 120Hz-overclockable HDTV such as the SEIKI 50" HDTV, or a true-120Hz-capable DLP projector with BFI/3D mode disabled), and it quickly becomes obvious that this is the overwhelming factor.
Quote:
Originally Posted by Mark Rejhon
In fact, persistence blur can also be categorized as a form of tracking inaccuracy. [...] Persistence blur is surprisingly simple & elementary once one gets the "Eureka" moment. Like successfully teaching someone to see 3:2 judder.


Motion clarity can be subjective, but there is a subjective-versus-objective issue at play here.

Let's look at a bit of history. When the FPD test came out in 2007 from the plasma coalition for measuring motion clarity on plasmas, it was standardized at a pokey slow rate of 6.5 pixels per frame. That motionspeed was chosen because it was the average motionspeed of several pieces of test video material measured (from movies to sitcoms to sports), so the arbitrary motionspeed of 6.5ppf was chosen. Motion of 6.5 pixels per frame is too slow to see the effects of 1ms-vs-2ms persistence, and too slow to really distinguish 2ms-vs-4ms persistence. But this motion speed was perfect for the plasma coalition, since plasma persistence isn't a motion blur bottleneck at 6.5 pixels per frame (390 pixels/second). A convenient number for FPD to choose: it was more or less the fastest motionspeed that plasmas could display without any noticeable persistence-based blur (caused by the persistence of plasma phosphor) at the higher line densities (e.g. the neighbourhood of 1080 lines). Currently, I do not recommend testing motion at that slow motionspeed, but rather at 16ppf (960 pixels/second), which is more than twice the speed. Sports/games speeds. These speeds are fast enough to really reveal persistence-based motion blur at persistences less than 5ms (typical plasma phosphor). Such faster motionspeeds are witnessed more often in sports material, and in computer/game material, which often goes even faster than sports broadcasts. Many motion tests are still standardized on slow motion speeds, so it's not too difficult to achieve "1080 lines of motion resolution" on a display with 4ms persistence if the motion is going only 6.5 pixels per frame. Ramp up to 960 pixels/second and faster, and this is where 1ms-vs-2ms begins to readily reveal itself, and the difference between high-quality 120Hz MCFI (~8ms persistence) and 240Hz MCFI (~4ms persistence) becomes much more obvious. Several Blu-ray motion tests include the faster 16ppf (@ 60Hz) tests, but not everyone owns them or standardizes on those faster motionspeeds. This is one of the reasons I frown on the "lines of motion resolution" test; but if the motionspeeds are defined and the lines are 1 pixel thick, the "lines of motion resolution" tests are still valid limit tests of low persistence, if done at fast motion speeds.
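For reference, the ppf-to-pixels-per-second conversion I keep using (a trivial Python helper, assuming 60fps video):

Code:
# "Pixels per frame" (ppf) to pixels per second, assuming 60fps video.

def ppf_to_px_per_sec(ppf, fps=60):
    return ppf * fps

print(ppf_to_px_per_sec(6.5))   # -> 390.0  (the 2007 FPD standard speed)
print(ppf_to_px_per_sec(16))    # -> 960    (sports/game speeds)
print(ppf_to_px_per_sec(32))    # -> 1920   (one 1080p screen width per second)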

At the end of the day, we have to keep in mind that these impressions are subjective and vary with a person's idea of perfection. People have often posted statements in OLED-related threads resembling:
- "I don't see motion blur on my display"
- "My motion handling is good"
- "I've never seen any other TV with motion as clear as my OLED HDTV"
Statements like these don't define the variables, or even establish whether the posters are actually looking at or paying attention to persistence at all; they're seeing motion blur, but it completely looks like part of the source material because it's so natural looking.

However, the below are confirmed tests of persistence, as they force the human eye to pay attention to persistence-based motion blur effects:
- "I see a fully distinct 1080 lines of motion resolution at X pixels/sec on my Blu-ray test disc"
- "I see the street name labels in the TestUFO Panning Map Test at X pixels/sec"
Passing these at ~500 pixels/sec, you're already at ~4ms persistence or less.
Passing these at ~1000 pixels/sec, you're already at ~2ms persistence or less.
These tests force human attention to persistence, full stop. They are impossible to pass at 16ppf (960 pixels/sec) at full 16.7ms-persistence 60Hz (even with OLED, even with 0ms GtG); passing either of them requires lower persistence than that (impulsing and/or interpolating).

At 2ms persistence and 1000 pixels/sec, single-pixel details are already motion-blurred across 2 pixels, sufficient to blot out single-pixel-thick lines in a "lines of motion resolution" test (Blu-ray), or to make tiny text hard to read in a panning map test (TestUFO). Persistence is the dominant factor affecting the results of these specific motion tests. There are certainly subjective factors (see David Katzmaier's article on CNET covering FPD scrolling monoscope tests at 6.5ppf), but this is far more objective than generic "my motion handling looks good" talk. The numbers generally converge on approximate ballparks (e.g. 500, 600, 500 and 900, 1080, 1200) despite different numbers being read by different humans. It only takes very light/mild BFI (120Hz BFI) for an OLED to pass the FPD monoscope test at 1080 lines at the common pokey medium speed of 6.5ppf, which doesn't make the lower-persistence displays (CRT, short-pulse OLED such as the Oculus DK2, or strobed LCD such as LightBoost) stand out dramatically the way 16ppf or 32ppf does. It scales very linearly: if you resolve, say, 600 lines of motion resolution at 6.5ppf, you'll resolve only 300 lines of motion resolution at 13ppf (assuming 13ppf is still within your eye-tracking speed; otherwise you resolve even less). You then need to halve persistence to get back to the original measurement. Resolving at 32ppf versus resolving at 6.5ppf requires approximately 1/5th (~6.5/32nd) the persistence to resolve the same detail. These kinds of motion resolution tests are unambiguously persistence blur tests.
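A minimal sketch of that scaling (Python; it assumes blur extent is simply persistence times tracking speed, the MPRT-style approximation used above, and that single-pixel detail is lost once the smear exceeds roughly 2 pixels):

Code:
# Blur extent = persistence x tracking speed; single-pixel detail is
# blotted out once the smear exceeds roughly 2 pixels.

def blur_extent_px(persistence_ms, speed_px_per_sec):
    return (persistence_ms / 1000.0) * speed_px_per_sec

print(blur_extent_px(4.0, 500))    # -> 2.0 px (borderline pass at ~500 px/sec)
print(blur_extent_px(2.0, 1000))   # -> 2.0 px (borderline pass at ~1000 px/sec)
print(blur_extent_px(16.7, 960))   # -> ~16 px: full-persistence 60Hz fails badly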

Thanks,
Mark Rejhon


post #100 of 120 Old 05-17-2014, 05:18 PM
Wizziwig (AVS Special Member)
Quote:
Originally Posted by Mark Rejhon View Post

Also, 2D strobe crosstalk (from incomplete GtG) is a lot less visible than 3D stereoscopic crosstalk for the same contrast images because:

(A) You have the shutter-glasses leakage to contend with, as well (above and beyond the GtG limitations of the LCD); and

I hear that excuse a lot. Try those exact same shutter glasses with a DLP TV or Projector and then tell me why there is zero visible crosstalk? Even with 24 Hz content, which allows a huge amount of time to complete the frame transitions, people complain of crosstalk. No complaints on DLP, even at 60Hz per eye. The LCD response time specs just don't add up. The GtG tables I linked prove it, just like those old DisplayMate pictures.
Quote:
Originally Posted by Mark Rejhon View Post

Because new TVs have come out that use strobing to reduce motion blur.
For example, several new Vizio TVs have recently introduced a low-persistence Game Mode, and more manufacturers have started studying ways to add low persistence (at low latency, either via strobing and/or ultralow-latency interpolation) for gaming.

Examples:
- Sony Motionflow Impulse in "Game Mode".
- Vizio High Velocity Mode in "Game Mode", debuted at CES 2014
etc.

I have a USB thumb drive filled with 60 fps video game content that I take to stores every few months. I have tested the Sony Impulse mode, Sammy OLED, LG OLED, plasmas, etc. Nothing currently available in a viable TV size comes even remotely close to my 34" Sony CRT or smaller 22" Diamondtrons when playing the same content. Any fast complex motion (especially with rotation) completely breaks all the interpolation algorithms. Just using pulsing/strobing helps to an extent but still is not at CRT level. On LCD, the strobing actually makes the ghosting/halos more obvious to me on some backgrounds, but at least it does reduce texture blur.

Quote:
Originally Posted by Mark Rejhon View Post

It's a tsunami occurring right now, and manufacturers have suddenly started paying attention. Two years ago, when I brought a high-speed camera (Casio EX-ZR200) into a TV showroom, only the plasmas (and 2 or 3 high-end LCD TVs) were flickering in high-speed video. But earlier this month, when I brought a camera into a TV showroom, more than one-third of the LCDs were flickering in the high-speed video at the strobe frequency of the video (e.g. 120 strobes per second for 120fps interpolated video, or 240 strobes per second for 240fps interpolated video). It's pretty clear that over the last 2 years, LCD manufacturers have made huge strides in adding strobe-based (scanning/strobed) motion blur reduction to LED/LCD televisions, even in midrange models and certain semi-entry-level models. On most, the strobe modes can be disabled, but strobing has become the default mode on a lot of models, as it reduces motion blur above and beyond interpolation alone.

I hope your vision of the future comes true. But we're not anywhere near where we were 10 years ago when it comes to motion quality on televisions. I remain skeptical until I see proof with my own eyes - not more marketing BS.

I also don't trust reviewers or people on forums unless they have a quality CRT to use as reference and do more than just run the FPD Motion Resolution pattern.
Wizziwig is offline  
post #101 of 120 Old 05-17-2014, 05:36 PM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Quote:
Originally Posted by Wizziwig View Post

I hear that excuse a lot. Try those exact same shutter glasses with a DLP TV or Projector and then tell me why there is zero visible crosstalk?
Right, it's a very minor factor, but it's there.
Given sufficiently high contrast material on a bright DLP light cannon (e.g. white stuff on black), the crosstalk does show up. And LCD shutter glasses have become much better (higher contrast ratio between shutter open versus shutter closed -- and better black frame insertion for the shutter open/close intervals to prevent frame transitions from becoming visible -- DLP does a very good completely-black frame), so this factor could rightfully be insignificant at typical large-screen home theater brightnesses. Also, LCD displays are much brighter than DLPs. Granted, there's often much more crosstalk on VA/IPS HDTVs than on TN 120Hz monitors, and we wouldn't even consider current TN tech for full size televisions. That said, strobe crosstalk is typically less visible in 2D than in 3D, since the crosstalk disappears during 2D when motion stops.
Quote:
Originally Posted by Wizziwig View Post

I have a USB thumb drive filled with 60 fps video game content that I take to stores every few months. I have tested the Sony Impulse mode, Sammy OLED, LG OLED, Plasmas, etc.. Nothing currently available in a viable TV size comes even remotely close to my 34" Sony CRT or smaller 22" Diamondtrons when playing the same content. Any fast complex motion (especially with rotation) completely breaks all the interpolation algorithms. Just using pulsing/strobing helps to an extent but still is not at CRT level. On LCD, the strobing actually makes the ghosting/halos more obvious to me on some backgrounds but at least it does reduce texture blur.
This is quite true, as I agree none of them eliminate motion blur as well as a good ultralow-persistence CRT (<1ms) like those CRTs mentioned. (On the other hand, compared to a medium-persistence CRT (~2ms) such as the Sony FW900, it becomes a different story, especially with the lower-persistence gaming strobe backlights recently encountered; you can now get less motion blur with some of those models than with an FW900 CRT, if you can stand the large brightness loss of the ultralow persistence settings.) And, yes, solid backgrounds do make sub-refresh-cycle GtG ghosting more visible (2ms vs 3ms vs 4ms GtG becomes much more noticeable during strobing), since every millisecond of incomplete GtG leads to progressively more visible crosstalk effects, GtG no longer being obscured by persistence-based motion blurring.
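
A toy first-order model shows why every extra millisecond of GtG matters under strobing (Python; it assumes exponential settling, which real panels only approximate):

Code:
import math

# Fraction of a pixel transition still unfinished when the strobe fires;
# that residual shows up directly as ghost/crosstalk contrast.
def gtg_residual(settle_time_ms, gtg_tau_ms):
    return math.exp(-settle_time_ms / gtg_tau_ms)

# 60Hz strobing allows ~16.7ms of dark settling time; 120Hz only ~8.3ms:
for tau_ms in (1.0, 2.0, 4.0):
    print(tau_ms, round(gtg_residual(16.7, tau_ms), 4), round(gtg_residual(8.3, tau_ms), 4))
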
Quote:
Originally Posted by Wizziwig View Post

I also don't trust reviewers or people on forums unless they have a quality CRT to use as reference and do more than just run the FPD Motion Resolution pattern.
For FPD talk, see the post above (that I made an hour ago) about why I am somewhat disdainful of those types of tests (as well as my old post, Standardizing Motion Resolution: "Milliseconds of Motion Resolution" (MPRT) better than "Lines of Motion Resolution"). So I have a fairly low opinion of those arbitrary tests, but I still bring them up since they are litmus tests of reasonably low persistence (~2ms league), especially when run at adequate motion speeds (~16ppf or 960 pixels/second, about half a screen width per second).

You've probably heard of my old NEC XG135 CRT projector (2001), and I've used a variety of 21" CRTs over the years. Regardless of display tech (CRT or BENQ Z-Series strobe LCD), I can tell apart 0.5ms and 1.0ms persistence quite easily (at super fast motion speeds, ~48ppf at 60fps@60Hz or ~24ppf at 120fps@120Hz, right at the limit of my accurate eye-tracking speed of 3000pix/sec at 1:1 view distance from a 1080p display). I agree that if you're comparing to those CRTs you mentioned, Sony Motionflow Impulse doesn't hold a candle to that reference benchmark. And none of the LCDs I have come close to the black levels and color quality of the CRTs. Now, that said, some of the better computer monitor strobe backlights are now competitive with, or surpass, the medium-persistence phosphor CRTs (e.g. FW900 and similar) in motion clarity on real-world game material. The FW900 CRT (longer persistence) is the motion clarity benchmark that I compare the strobed gaming monitors to, and they are competitive. I hope such motion clarity becomes more common for HDTVs regardless of tech (OLED or LCD). Either way, as many people posted earlier, we need ultrabright and ultrashort strobing capability to approach a low-persistence CRT, which uses thousands of cd/m2 of momentary brightness at the beam spot. An ultrabright Trinitron CRT with ultralow persistence (sub-millisecond) is obviously extremely hard to beat, since those are sub-millisecond-persistence displays that still remain bright at sub-millisecond persistence.

I think OLED is a better ultimate goal, though during the time of OLED progression (price falls, brightness improvements, persistence improvements), I see lots of time for LCDs to continue improving over the years: extra strobe brightness for even lower persistence, fainter GtG leaks between refreshes, VA/IPS getting closer to TN speeds, better per-scanline-optimized response time acceleration (LightBoost uses a different RTC algorithm per scanline, accounting for the freshness of GtG from scanout, to compensate for the bottom edge having less GtG settling time than the top edge before the strobe), and other advanced engineering improvements that are going on as we speak. It's a fun progression to watch, even if a lot of us are disappointed that the reference CRTs remain untouched. But a lot of us didn't think LCD displays would ever approach a medium-persistence CRT, and some of them actually did (only in the specific category of motion clarity, not blacks or colors).

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #102 of 120 Old 05-17-2014, 06:32 PM - Thread Starter
AVS Special Member
 
fafrd's Avatar
 
Join Date: Jun 2002
Posts: 3,063
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 14 Post(s)
Liked: 418
Quote:
Originally Posted by Mark Rejhon View Post

In the whirlwind of this thread, I missed your questions, apologies. Here are my answers: Warning: Spoiler! (Click to show)
Quote:
Originally Posted by fafrd View Post

Is there any difference between the persistence and perceived motion blur between a 120Hz 50% BFI refresh (60Hz framerate with 8.3ms persistence) and a 240Hz 50% BFI with frame-repeat (120Hz framerate with 4.1ms persistence but each frame repeated twice)?
They are totally different.
You must do only one impulse per unique frame, for full motion clarity.

Repeat refreshes of the same frame, during eye-tracking of motion on impulse displays, always create a double-image or multi-image effect.
  • CRT 30fps@60Hz = double image effect during eye tracking of motion
  • Plasma 30fps@60Hz = double image effect during eye tracking of motion
  • LightBoost 60fps@120Hz = double image effect during eye tracking of motion
  • PWM dimming 60fps@180Hz (example) = triple image effect during eye tracking of motion
  • CRT/LightBoost 120Hz doing 30fps@120Hz = quadruple image effect during eye tracking of motion

If you're doing 60 unique frames per second at 120Hz+50% BFI (240Hz equivalence) with no frame interpolation, you get a double-image effect. I've seen this happen on several strobed LCDs, including Eizo's Turbo240, which enables an annoying double-strobe mode during a 60Hz signal to reduce flicker (120 dominant strobes per second instead of 60), at the cost of a visible double-image effect. The cause is eye tracking occurring between repeat refreshes: the repeat flickers land on different parts of the retina, due to the temporal offset between repeat impulses of the same static frame, as eye tracking has already moved on between the repeat impulses. Double strobing is pretty common (fortunately for flicker haters, but unfortunately for motion clarity lovers). Manufacturers are scared to strobe at low frequencies because it flickers so painfully for general computer use, so they prefer to double-strobe (120 strobes per second during 60Hz). This is the unfortunate non-overridable situation for the EIZO FG2421, as well as Version 1 of the BENQ XL2411Z, XL2420Z and XL2720Z.
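
The geometry behind the multi-image effect is easy to compute -- a toy Python sketch (my own illustration, not any monitor's actual firmware logic):

Code:
# Eyes track at constant speed; every repeat strobe of the SAME frame lands
# on a different retinal position, offset by (speed x time between strobes).
def strobe_offsets_px(speed_px_per_sec, frame_rate, strobes_per_frame):
    gap_sec = 1.0 / (frame_rate * strobes_per_frame)
    return [speed_px_per_sec * gap_sec * i for i in range(strobes_per_frame)]

print(strobe_offsets_px(960, 60, 2))  # [0.0, 8.0] -- 60fps double-strobed: double image, 8px apart
print(strobe_offsets_px(960, 30, 4))  # [0.0, 8.0, 16.0, 24.0] -- 30fps@120 strobes: quadruple image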

I convinced BENQ (press release) to release Version 2 firmware that permits single-strobe at 60Hz on the BENQ Z-Series, so when you click the "Override" checkbox in Blur Busters Strobe Utility, it disables the flicker-reducing-but-double-image-inducing 120Hz strobe during 60Hz, and does only 60 strobes per second at 60Hz. This brings CRT-clarity 60fps@60Hz to this LCD, but at the severe cost of annoying flicker. It is the first LCD computer monitor to bring CRT-league motion clarity to PlayStation 4 60fps games and Xbox 360 60fps games. It's not comfortable for computer desktop use (imagine sitting at similar vision coverage to a 60Hz CRT or a low-persistence plasma (e.g. Panasonic VT50) at 1:1 view distance: eye-searing flicker), but it is fine for several users playing video games at slightly further distances in properly lit rooms, since games have a lower average picture level than the computer desktop, and the motion clarity of fast motion starts to balance things out.

So, we now have several LCD blur-reducing strobe backlights (in both the monitor & the HDTV world) that work at 60fps@60Hz at one impulse per refresh, for proper blur reduction without artifacts (except flicker). A lot of users hate flicker (and just use 120Hz strobing to solve flicker, during 120fps@120Hz PC gaming) but it's an optional feature that other users love -- since gaming consoles don't support 120Hz. The option is there, and can be enabled or disabled, just as it should be -- it's much easier to strobe an LCD at 60Hz than at 120Hz; it's just that manufacturers are reluctant to kill users' eyes with the CRT 60Hz experience. Besides, shutter glasses give your eyes the 60Hz flicker experience anyway, so it's no more eye-strain-inducing than shutter glasses running at 60/60, especially if you adjust brightness (independently of persistence: pulse height) and persistence (independently of brightness: pulse width) accordingly, something you're able to do with the BENQ Z-Series, as the "Brightness" OSD adjustment adjusts pulse height, and the "Persistence" Strobe Utility adjustment adjusts pulse width. So the strobe duty cycle of a BENQ Z-Series is customizable by advanced users, to minimize the tradeoff of flicker discomfort versus motion blur discomfort versus brightness discomfort.
Quote:
Originally Posted by fafrd View Post

In terms of perceived 'flicker' when using BFI (or a scanning backlight), are higher frequencies of BFI/flicker harder to detect? So if 50% BFI is going to be used to reduce persistence with 60fps source material, will 120Hz 50% BFI with no frame repeat have more noticeable flicker than 240Hz 50% BFI with single-frame-repeat, both of which will be more noticeable than 480Hz 50% BFI with triple-frame-repeat?
Flicker is independent of framerate, and is based on the flicker frequency and flicker duty cycle. So the answer is yes, 120 impulses per second (regardless of framerate or unique refreshes) has less visible flicker than 60 impulses per second.

Duty cycle also has a big effect on how easy it is to see flicker. An example is www.testufo.com/blackframes#count=3 ... Longer black-frame duty cycles make flicker easier to see, so very short-persistence displays work best at higher strobe rates (e.g. 120 impulses per second or higher). That's why motion interpolation is common, even in combination with BFI (certain 240Hz HDTVs use interpolation to go 60Hz->120Hz, and then BFI to emulate 120Hz->240Hz). This is important if you want to avoid the double-image effect (or quadruple-image effect, e.g. 30fps@120 strobes per second). Many displays use interpolation to increase frame rates to 120 or 240 unique images per second, and then strobe each of them at 120 or 240 impulses per second (strobes or scanning backlight passes per second). Some displays DO use shortcuts such as repeat impulses on the same frame, but that is deleterious to motion clarity, as already described above. It's certainly a "pick your poison" effect that occurs when you're trying to balance flicker annoyance and interpolation annoyance while designing motion-blur-eliminating schemes for displays. It's one of the reasons I wish video was based on true 120fps or true 240fps, so we could have low persistence without the flicker problems of 60Hz impulsing (plasma/CRT/BFI/strobe flicker), and without the artifact problems of interpolation.

So as long as we're stuck with the 60fps video standard, those of us who seek maximum motion sharpness/clarity above all else (regardless of display tech) are stuck with the compromise soup of impulsing+interpolation combinations, or putting up with the 60Hz flicker of interpolation-free/repetition-free impulsing like CRTs, plasmas, 60Hz LCD strobing (Sony Motionflow Impulse & BENQ Z-Series V2), Sony PVM-series rolling-scan OLED displays, or _any_ low-persistence 60Hz display that avoids motion interpolation and avoids multi-image effects -- all of them flicker at 60 cycles per second for people who are sensitive to 60Hz flicker.

All of this is relevant to OLED, as we'll have to cope with OLED flicker during rolling-scan 60Hz operation. If you've seen a Sony PVM series, you'll see what I mean about 60Hz OLED flicker. It also happens with the Samsung curved OLED when you fiddle the settings into the pure strobe mode (without interpolation). It's certainly a tradeoff that occurs. Most future OLEDs will continue using BFI and interpolation to achieve low persistence without flicker, as long as we're stuck with the 60fps video standard.

Achieving low persistence can be verified with the Blu-ray motion tests on HDTVs too. While a TestUFO.com user can test the ability to pass the Panning Map Test, a BluRay user can also verify seeing the full "1080 lines of motion resolution" at similar motion speeds (assuming 1-pixel-thick lines). These are reasonable litmus tests of low persistence (a neighborhood of ~2ms or less lets either of these TestUFO or BluRay tests pass reliably at 960 pixels/sec motion speeds). An advantage of frame interpolation is that it brightens the image, since you can get motion clarity at double strobe rate (2x brighter) or quadruple strobe rate (4x brighter), while maintaining strobe lengths and one strobe per frame. This is why some OLED HDTVs continue to use interpolation (MCFI) to lower motion blur (aka lower persistence).


Mark,

thanks for the detailed reply. I think you've helped me to understand the areas I was unclear on. I'm going to attempt a recap of the overall topic to see if I have it right:

1/ for 24fps film source, either you accept the natural persistence that makes the material 'film-like', or it bothers you so much that you prefer the 'soap opera effect' and increase the effective framerate to 48, 60, 120, or whatever higher framerate, with a corresponding reduction in persistence-based motion blur (with BFI or whatever -- the gap between 24fps and high effective frame rates is so large that there is not much more to be said about 24fps material: accept the soap opera effect or accept persistence-based motion blur).

2/ for 60fps fast-motion video such as sports, reduction of motion blur is most effective in the following order:

-BEST: FI to the highest frame rate possible (240fps or 120fps) coupled with minimum persistence per frame (<1ms, or 25% persistence at 240fps, would be 'CRT-like' assuming frame interpolation did not introduce artifacts).

-NEXT BEST: No FI but BFI to at least 50% - this will halve persistence-based motion blur but will introduce noticeable flicker. Higher BFI % will further decrease persistence motion-blur but at the expense of even more noticeable 60Hz flicker.

-NEXT BEST (PICK YOUR POISON): No FI but frame-repeat coupled with BFI will allow flicker to be less noticeable (because the BFI will be at 120Hz or higher) at the expense of a double-image effect (or worse).

3/ for 120fps material such as video-gaming (hopefully :-), BFI alone to 50%, 75%, or even higher would allow persistence to be reduced to 4.2ms, 2.1ms, or even lower levels without introducing noticeable flicker. FI to 240Hz (if effective and not introducing artifacts) could complement that and further reduce persistence-based motion-blur by another 50%

I have read repeatedly that 240Hz LCD panels offer no real benefit over 120Hz LCD panels, but this analysis is leading me to a different conclusion (unless triple-frame-interpolation is crappy).
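
(For anyone wanting to double-check the rankings in points 2/ and 3/ above, the persistence arithmetic reduces to a one-liner -- a Python sketch under idealized assumptions, zero GtG:)

Code:
def persistence_ms(base_fps, interp_factor, bfi_fraction):
    # Light-emitting window per unique frame: frame time after interpolation,
    # shortened further by the black-frame (BFI) fraction.
    return (1000.0 / (base_fps * interp_factor)) * (1.0 - bfi_fraction)

print(persistence_ms(60, 2, 0.75))  # ~2.1 ms: 120Hz FI + 75% BFI
print(persistence_ms(60, 2, 0.50))  # ~4.2 ms: 120Hz FI + 50% BFI
print(persistence_ms(60, 1, 0.75))  # ~4.2 ms: 60Hz + 75% BFI
print(persistence_ms(60, 2, 0.00))  # ~8.3 ms: 120Hz FI, no BFI
print(persistence_ms(60, 1, 0.00))  # ~16.7 ms: 60Hz, no FI, no BFI
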
fafrd is offline  
post #103 of 120 Old 05-17-2014, 06:57 PM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,559
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 203
Quote:
Originally Posted by fafrd View Post

I have read repeatedly that 240Hz LCD panels offer no real benefit over 120Hz LCD panels, but this analysis is leading me to a different conclusion (unless triple-frame-interpolation is crappy).
I have read this posted here a lot too. If you use a test like the ones on Mark's TestUFO site, the differences between 120Hz and 240Hz interpolation are obvious.

I was fortunate that the store I bought my display from had 60Hz, 120Hz, and 240Hz sets, all of the same make and size, lined up next to each other with fast-moving demo material like this.
The difference between all the panels was huge. It was even greater once you combined that with backlight scanning. ("Clear" motionflow modes)

Even at really slow speeds such as 120px/sec, interpolation and backlight scanning make a huge difference.
On my set, switching to the "Clear Plus" mode (also called Clear 2 I think) significantly reduces the occurrence of interpolation artifacts compared to the other modes.
If I put it in a mode that just uses 240Hz interpolation there's a lot of flickering around some of the text, which disappears in Clear Plus mode.

Running at 60Hz without any interpolation, even the 120px/sec speed is a blur on this TV.
In the "Clear Plus" mode, I can easily read the text at 1920px/sec. At higher speeds it is still sharp, but difficult to make out simply due to the speed it's moving across the screen.
Chronoptimist is offline  
post #104 of 120 Old 05-17-2014, 07:21 PM - Thread Starter
AVS Special Member
 
fafrd's Avatar
 
Join Date: Jun 2002
Posts: 3,063
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 14 Post(s)
Liked: 418
Quote:
Originally Posted by Chronoptimist View Post

Quote:
Originally Posted by fafrd View Post

I have read repeatedly that 240Hz LCD panels offer no real benefit over 120Hz LCD panels, but this analysis is leading me to a different conclusion (unless triple-frame-interpolation is crappy).
I have read this posted here a lot too. If you use a test like the ones on Mark's TestUFO site, the differences between 120Hz and 240Hz interpolation are obvious.

I was fortunate that the store I bought my display from had 60Hz, 120Hz, and 240Hz sets, all of the same make and size, lined up next to each other with fast-moving demo material like this.
The difference between all the panels was huge. It was even greater once you combined that with backlight scanning. ("Clear" motionflow modes)

Even at really slow speeds such as 120px/sec, interpolation and backlight scanning make a huge difference.
On my set, switching to the "Clear Plus" mode (also called Clear 2 I think) significantly reduces the occurrence of interpolation artifacts compared to the other modes.
If I put it in a mode that just uses 240Hz interpolation there's a lot of flickering around some of the text, which disappears in Clear Plus mode.

Running at 60Hz without any interpolation, even the 120px/sec speed is a blur on this TV.
In the "Clear Plus" mode, I can easily read the text at 1920px/sec. At higher speeds it is still sharp, but difficult to make out simply due to the speed it's moving across the screen.

What is the native refresh rate of Gen-1 LG WOLEDs?

What is the refresh rate expected on Gen-2 LG WOLEDs?

What is the outlook for 240Hz native refresh WOLED panels?
fafrd is offline  
post #105 of 120 Old 05-17-2014, 07:32 PM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,559
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 203
The problem with OLED is that, because it has essentially perfect response times, you don't really want to "ruin" that by using interpolation.
You really want to be using dark frame insertion instead.
Chronoptimist is offline  
post #106 of 120 Old 05-17-2014, 07:47 PM - Thread Starter
AVS Special Member
 
fafrd's Avatar
 
Join Date: Jun 2002
Posts: 3,063
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 14 Post(s)
Liked: 418
Quote:
Originally Posted by Chronoptimist View Post

The problem with OLED is that, because it has essentially perfect response times, you don't really want to "ruin" that by using interpolation.
You really want to be using dark frame insertion instead.

The two are not mutually exclusive.

If you have 60fps fast-action sports content, it will look better interpolated to 120fps and then strobed with 50% BFI.

If we just focus on 120Hz refresh for the moment, persistence (and perceived motion blur) would be ranked as follows:

BEST
  • 2.1ms: 120Hz FI + 75% BFI
  • 4.2ms: 120Hz FI + 50% BFI
  • 4.2ms: 60Hz (no FI) + 75% BFI (no interpolation but most noticeable 60Hz flicker)
  • 8.4ms: 120Hz FI (no BFI)
  • 8.4ms: 60Hz+ 50% BFI (no interpolation but noticeable 60Hz flicker)
  • 16.7ms: 60Hz (no FI and no BFI)
WORST

For fast-action sports, soap-opera concerns don't really arise, so unless the frame interpolation is really crappy, why would you think frame interpolation would 'ruin' it?

And if and when video games support 120Hz framerates, will OLED panels be able to refresh quickly enough to take advantage of that or not?

Can OLEDs refresh at 240Hz?
fafrd is offline  
post #107 of 120 Old 05-17-2014, 08:27 PM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Quote:
Originally Posted by fafrd View Post

I have read repeatedly that 240Hz LCD panels offer no real benefit over 120Hz LCD panels, but this analysis is leading me to a different conclusion (unless triple-frame-interpolation is crappy).
There is quite obviously a point of diminishing returns, so you really want bigger steps in persistence to "make it worth it".

Motion clarity ratio 60 = 16.7ms persistence = baseline
Motion clarity ratio 120 = 8.3ms persistence = 50% less motion blur
Motion clarity ratio 240 = 4.2ms persistence = 75% less motion blur
Motion clarity ratio 480 = 2.1ms persistence = 87.5% less motion blur
Motion clarity ratio 960 = 1.0ms persistence = 93.75% less motion blur

Samsung uses "Clear Motion Ratio #"
Sony uses "Motionflow XR #"
LG uses "Motion Clarity Index #"
Sharp uses "Aquomotion #"

Motion clarity ratios are more representative of motion blur than "Hz" marketing -- they are closer approximations of actual observed motion blur. Although motion blur numbers are highly subjective (and sometimes controversial), they are less controversial than the "Hertz Hype". What makes ratios simple is that the inverse of a motion clarity ratio is persistence. Voilà -- a more proper marketing number than Hz, if you're a person concerned about display persistence above all else. It fully accounts for black frame insertion and/or interpolation. I do, however, wish more TV manufacturers would start advertising motion clarity ratio numbers and latency numbers for Game Mode, because achieving large motion clarity ratios at low latencies (needed for games/computers) is a difficult engineering achievement for any display. For now, we can only enjoy the nice clarity ratios on our sports television broadcasts, since the 960-class algorithms all have a lot of input lag.

To estimate the persistence of these TVs, you simply take 1000 divided by "#", and you get milliseconds of persistence. Although these are often hyped numbers, the ratio numbers are finally starting to approach sensibility (FINALLY, YOU TELEVISION SET MANUFACTURERS!). Measured persistence values still vary a lot (like claimed contrast ratio versus measured contrast ratio), but at least it isn't marketing stupidity such as "600Hz subfield rate", which has absolutely zilch to do with motion blur, especially if the subfields are simply temporally dithered patterns derived from exactly the same refresh.
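
In code form, this rule of thumb is trivial (a Python sketch -- treat the outputs as marketing claims, not measurements):

Code:
def claimed_persistence_ms(clarity_ratio):
    return 1000.0 / clarity_ratio  # e.g. ratio 960 -> ~1.0 ms

def claimed_blur_px(clarity_ratio, speed_px_per_sec=960):
    return (claimed_persistence_ms(clarity_ratio) / 1000.0) * speed_px_per_sec

for ratio in (60, 120, 240, 480, 960):
    print(ratio, round(claimed_persistence_ms(ratio), 2), "ms persistence,",
          claimed_blur_px(ratio), "px of blur at 960 px/sec")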

Some of these new TVs go all the way up to "960" and sometimes beyond. So we already have some new LED HDTVs that achieve 1.0ms persistence via a combination of MCFI + BFI techniques. Now, you can see 16.7ms->8.3ms persistence is 50 percentage points less motion blur than 60Hz full-persistence, while 8.3ms->1.0ms persistence is another 43.75 percentage points less motion blur than 60Hz full-persistence. In other words, to get a jump almost as dramatic as 60Hz->120Hz, you need to jump from 120Hz all the way to 960. Due to the diminishing returns, the 120Hz->240Hz upgrade is quite small, so you need a bigger jump upwards. Going 120Hz->"motion clarity ratio 960" is a much bigger step, with an impact more similar to the 60Hz->120Hz jump, at least for sports-based material.

Only Panasonic still sticks with the stupid Hz silliness. "Panasonic 1600Hz scanning backlight" -- which doesn't even produce 1/1600sec persistence! Yes, granted, the Panasonic scanning backlight does an impressive job, but the number is still hype that doesn't map accurately to persistence numbers. Motion clarity ratios DO correspond to persistence numbers (if measured correctly!). Of course, they are still often hyped numbers, as inefficiencies come into play (e.g. GtG issues, bleed between scanning backlight segments, and other inefficiencies that worsen persistence), so real-world motion clarity can vary.

However, motion clarity ratios are the exact inverse of persistence, so a motion clarity ratio is conveniently the claimed reduction in persistence-based motion blur you will get. It's not perfect, and the worst "motion clarity ratio 960" LCD will look worse than the best "motion clarity ratio 480" LCD; however, they are at least more accurate marketing numbers (for motion blur) than Hertz.
Quote:
Originally Posted by fafrd View Post

BEST
  • 2.1ms: 120Hz FI + 75% BFI
  • 4.2ms: 120Hz FI + 50% BFI
  • 4.2ms: 60Hz (no FI) + 75% BFI (no interpolation but most noticeable 60Hz flicker)
  • 8.4ms: 120Hz FI (no BFI)
  • 8.4ms: 60Hz+ 50% BFI (no interpolation but noticeable 60Hz flicker)
  • 16.7ms: 60Hz (no FI and no BFI)
WORST
Good math, you've learned the motion blur mathematics properly. These numbers are correct.
(Except 8.4ms should be 8.333333ms, but I'm stupidly super-nitpicking that. GtG slowness also fudges the numbers upwards a little bit.)
Also, I've seen displays that do better than this by achieving ~1.0ms persistence.
Quote:
Originally Posted by fafrd View Post

For fast-action sports, soap-opera concerns don't really arise, so unless the frame interpolation is really crappy, why would you think frame interpolation would 'ruin' it?
Indeed, when it comes to sports material, frame interpolation is the lesser evil for a lot of people.
It's DOA for videogames (lag), and controversial for movies (soap opera effect), but more acceptable for sports.
Quote:
Originally Posted by fafrd View Post

And if and when video games support 120Hz framerates, will OLED panels be able to refresh quickly enough to take advantage of that or not?
OLEDs have no problem refreshing faster than 60Hz; it is a matter of drive electronics. OLEDs already run at 120fps via motion interpolation. It's a matter of letting that be driven from the video input circuitry, rather than via the interpolator circuitry.

Also, consider the gaming HTPC situation. Computer video games can already routinely achieve 120fps@120Hz if you have a fast enough GPU (e.g. a GeForce Titan) and adjust the graphics detail. Plus, a lot of 2-year-old-and-older games successfully run at max detail levels. Today, I'm running 1920x1080 120fps on my GeForce GTX Titan, in games as recent as Bioshock Infinite (at near full graphics detail, except shadow detail down 1 notch, view distance down 2 notches, and using FXAA instead of FSAA). I can reach 120fps@120Hz at reduced detail levels in Crysis 3/Battlefield 4. But at max detail level on the newest games, I can't reach those framerates.
Quote:
Originally Posted by fafrd View Post

Can OLEDs refresh at 240Hz?
I don't see why not. I think 480fps@480Hz should eventually be doable.

120fps at 50% BFI has the same amount of motion blur as 240fps full-persistence, anyway. OLED persistence is easy to lower to less than 1ms (it has already been done on existing OLEDs in the lab), but the light output sucks (90% BFI on 300cd/m2 yields only 30cd/m2). It looks like full CRT motion clarity, like a CRT with its brightness adjusted WAY down -- the motion looks really pure when a rolling-scan OLED is adjusted to 1ms persistence and less (Oculus experimented with all kinds of persistence numbers in DK2). Adjusting persistence in an OLED rolling scan is simply adjusting the chase distance of the OFF-scan-pass behind the ON-scan-pass. Even though sub-1ms on an OLED is easy to achieve in the lab (at low light output), you won't commonly be seeing such low persistence numbers in an interpolation-free manner yet, due to insufficient light output during 1ms interpolation-free persistence. But I've got faith it will happen eventually.
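
A toy model of that chase-distance adjustment (a Python sketch, idealized -- real panels have drive overhead that I'm ignoring):

Code:
# Rolling-scan OLED: an OFF-wave chases the ON-wave down the panel; each row
# stays lit only while the gap between the two waves passes over it.
def rolling_scan_persistence_ms(refresh_hz, total_rows, chase_gap_rows):
    return (1000.0 / refresh_hz) * (chase_gap_rows / total_rows)

def average_nits(panel_nits, total_rows, chase_gap_rows):
    # Duty cycle == persistence / frame time == gap / total rows.
    return panel_nits * (chase_gap_rows / total_rows)

print(rolling_scan_persistence_ms(60, 1080, 108))  # ~1.67 ms at a 10% chase gap
print(average_nits(300, 1080, 108))                # 30.0 -- the 300 -> 30 cd/m2 case above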

________

On a related topic, if only broadcasters would start broadcasting native 240fps or 480fps video, so we can have low persistence without always needing MCFI. It wouldn't help non-persistence-critical material such as movies and sitcoms. However, it would certainly help persistence-critical material (fast motion) such as sports broadcasts, allowing them to pull off motion clarity while completely avoiding visible flicker (60Hz) and completely avoiding interpolation.

Optimistically, an easier interpolation-free HFR scenario (easier to imagine) is that the emerging 120fps HFR video talk is a distant early signal of 120fps standardization (e.g. cinemas of the 2020's displaying 120fps HFR movies), which could propagate to displays quite easily (since lots of them already do 120fps by MCFI), eventually leading to widespread standardized true-120fps video broadcasts in the 2030's. The Moore's Law of bandwidth/cameras/display tech has already made it possible, and it could happen, especially if the snowball starts rolling in the cinemas, since the whole technological chain already exists. You can already click to play back 120fps video if you own a true-120Hz monitor. But once true 120fps hits movie theaters, that might seed the latent 120Hz capabilities to be standardized at the broadcasting level and the HDMI specification level (rather than undocumented HDTV overclocking for true 120Hz from a computer). The bar is quite low because of the huge amount of 120fps MCFI capacity that simply isn't standardized at the cabling and content-distribution level. Ironically, it is technologically easier to do true 120fps (less processing) than 120fps MCFI (more processing), except for the added difficulty for the cameras and the bandwidth needed (and those are being solved -- the upcoming GoPro Hero 4 can do 120fps@1080p for under $500). But one small domino such as a 120fps HFR movie at the theaters, and BAM! More 120fps content that impresses with low persistence, without the side effect of 60Hz flicker. Who knows what may arrive over the coming years? At 120fps, at the very least, persistence can finally be comfortably lowered via impulsing alone (120Hz flicker is far more comfortable than 60Hz flicker). You can easily achieve "motion clarity ratio 960" (1ms persistence) on 120fps material via pure strobe-based methods, without soap opera effect, and without annoying 60Hz flicker. Wouldn't it be wonderful if our sports broadcasts were at least, merely, true 120fps -- so we could comfortably use strobe-based techniques to get CRT-quality persistence without CRT-annoying flicker? Doing 120fps at 90% BFI (<1ms persistence) produces essentially CRT-clarity motion (especially if done on OLED) without the 60Hz CRT flicker, and without soap opera effects. Achieving good brightness at 90% BFI is a simple matter of engineering extra light output over the years: let the Moore's Law equivalent happen on light output too -- LEDs are still getting brighter and brighter, and the same is happening to OLED.
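
(The required light output is just the inverse of the duty cycle -- a quick Python sketch with made-up target numbers:)

Code:
def required_burst_nits(target_average_nits, bfi_fraction):
    # At 90% BFI, the panel is lit only 10% of the time, so the burst
    # must be 10x the desired average brightness.
    return target_average_nits / (1.0 - bfi_fraction)

print(required_burst_nits(200, 0.90))  # 2000.0 nits burst for a 200-nit average
print(required_burst_nits(200, 0.50))  # 400.0 nits -- only 2x at 50% BFI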

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #108 of 120 Old 05-17-2014, 09:17 PM - Thread Starter
AVS Special Member
 
fafrd's Avatar
 
Join Date: Jun 2002
Posts: 3,063
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 14 Post(s)
Liked: 418
Quote:
Originally Posted by Mark Rejhon View Post

Quote:
Originally Posted by fafrd View Post

I have read repeatedly that 240Hz LCD panels offer no real benefit over 120Hz LCD panels, but this analysis is leading me to a different conclusion (unless triple-frame-interpolation is crappy).
There is quite obviously a point of diminishing returns, so you really want bigger steps in persistence to "make it worth it".

Motion clarity ratio 60 = 16.7ms persistence = baseline
Motion clarity ratio 120 = 8.3ms persistence = 50% less motion blur
Motion clarity ratio 240 = 4.2ms persistence = 75% less motion blur
Motion clarity ratio 480 = 2.1ms persistence = 87.5% less motion blur
Motion clarity ratio 960 = 1.0ms persistence = 93.75% less motion blur

Samsung uses "Clear Motion Ratio #"
Sony uses "Motionflow XR #"
LG uses "Motion Clarity Index #"
Sharp uses "Aquomotion #"

Motion clarity ratios are more representative of motion blur than "Hz" marketing -- they are closer approximations of actual observed motion blur. Although motion blur numbers are highly subjective (and sometimes controversial), they are less controversial than the "Hertz Hype". What makes ratios simple is that the inverse of a motion clarity ratio is persistence. Voilà -- a more proper marketing number than Hz, if you're a person concerned about display persistence above all else. It fully accounts for black frame insertion and/or interpolation. I do, however, wish more TV manufacturers would start advertising motion clarity ratio numbers and latency numbers for Game Mode, because achieving large motion clarity ratios at low latencies (needed for games/computers) is a difficult engineering achievement for any display. For now, we can only enjoy the nice clarity ratios on our sports television broadcasts, since the 960-class algorithms all have a lot of input lag.

To estimate the persistence of these TVs, you simply take 1000 divided by "#", and you get milliseconds of persistence. Although these are often hyped numbers, the ratio numbers are finally starting to approach sensibility (FINALLY, YOU TELEVISION SET MANUFACTURERS!). Measured persistence values still vary a lot (like claimed contrast ratio versus measured contrast ratio), but at least it isn't marketing stupidity such as "600Hz subfield rate", which has absolutely zilch to do with motion blur, especially if the subfields are simply temporally dithered patterns derived from exactly the same refresh.

Some of these new TVs go all the way up to "960" and sometimes beyond. So we already have some new LED HDTVs that achieve 1.0ms persistence via a combination of MCFI + BFI techniques. Now, you can see 16.7ms->8.3ms persistence is 50 percentage points less motion blur than 60Hz full-persistence, while 8.3ms->1.0ms persistence is another 43.75 percentage points less motion blur than 60Hz full-persistence. In other words, to get a jump almost as dramatic as 60Hz->120Hz, you need to jump from 120Hz all the way to 960. Due to the diminishing returns, the 120Hz->240Hz upgrade is quite small, so you need a bigger jump upwards. Going 120Hz->"motion clarity ratio 960" is a much bigger step, with an impact more similar to the 60Hz->120Hz jump, at least for sports-based material.

Only Panasonic still sticks with the stupid Hz silliness. "Panasonic 1600Hz scanning backlight" -- which doesn't even produce 1/1600sec persistence! Yes, granted, the Panasonic scanning backlight does an impressive job, but the number is still hype that doesn't map accurately to persistence numbers. Motion clarity ratios DO correspond to persistence numbers (if measured correctly!)
Quote:
Originally Posted by fafrd View Post

BEST
  • 2.1ms: 120Hz FI + 75% BFI
  • 4.2ms: 120Hz FI + 50% BFI
  • 4.2ms: 60Hz (no FI) + 75% BFI (no interpolation but most noticeable 60Hz flicker)
  • 8.4ms: 120Hz FI (no BFI)
  • 8.4ms: 60Hz+ 50% BFI (no interpolation but noticeable 60Hz flicker)
  • 16.7ms: 60Hz (no FI and no BFI)
WORST
Good math, you've learned the motion blur mathematics properly. These numbers are correct.
(Except 8.4ms should be 8.333333ms, but I'm stupidly super-nitpicking that. GtG slowness also fudges the numbers upwards a little bit.)
Also, I've seen displays that do better than this by achieving ~1.0ms persistence.
Quote:
Originally Posted by fafrd View Post

For fast-action sports, soap-opera concerns don't really arise, so unless the frame interpolation is really crappy, why would you think frame interpolation would 'ruin' it?
Indeed, when it comes to sports material, frame interpolation is the lesser evil for a lot of people.
Quote:
Originally Posted by fafrd View Post

And if and when video games support 120Hz framerates, will OLED panels be able to refresh quickly enough to take advantage of that or not?
OLEDs have no problem refreshing faster than 60Hz; it is a matter of drive electronics. OLEDs already run at 120fps via motion interpolation. It's a matter of letting that be driven from the video input circuitry, rather than via the interpolator circuitry.

Also, consider the gaming HTPC situation. Computer video games can already routinely achieve 120fps@120Hz if you have a fast enough GPU (e.g. a GeForce Titan) and adjust the graphics detail. Plus, a lot of 2-year-old-and-older games successfully run at max detail levels. Today, I'm running 1920x1080 120fps on my GeForce GTX Titan, in games as recent as Bioshock Infinite (at near full graphics detail, except shadow detail down 1 notch, view distance down 2 notches, and using FXAA instead of FSAA). I can reach 120fps@120Hz at reduced detail levels in Crysis 3/Battlefield 4. But at max detail level on the newest games, I can't reach those framerates.
Quote:
Originally Posted by fafrd View Post

Can OLEDs refresh at 240Hz?
I don't see why not. I think 480fps@480Hz should eventually be doable.

If only broadcasters would start broadcasting native 240fps or 480fps video, so we can have low persistence without the need for MCFI. It won't help movies and sitcoms, but it will certainly help sports broadcasts pull off motion clarity while completely avoiding flicker and completely avoiding interpolation.

Perhaps sometime next century, or best case, by the second half of this century. Optimistically, an easier scenario (easier to imagine) is that the emerging 120fps HFR video talk is a distant early signal (e.g. cinemas of the 2020's displaying 120fps HFR movies), which could propagate to displays quite easily (since lots of them already do 120fps by MCFI), eventually leading to widespread standardized true-120fps video broadcasts in the 2030's. The Moore's Law of bandwidth/cameras/display tech has already made it possible, and it could happen, especially if the snowball starts rolling in the cinemas, since the whole technological chain already exists. You can already click to play back 120fps video if you own a true-120Hz monitor. Who knows what may arrive over the coming years? At 120fps, at the very least, persistence can finally be comfortably lowered via impulsing alone (120Hz flicker is far more comfortable than 60Hz flicker).

120fps at 50% BFI has the same amount of motion blur as 240fps full-persistence, anyway. And 120fps at 90% BFI produces essentially CRT-clarity motion without the 60Hz CRT flicker. Achieving good brightness at 90% BFI is a simple matter of engineering extra light output over the years: let the Moore's Law equivalent happen on light output too -- LEDs are still getting brighter and brighter, and the same is happening to OLED.

How do you interpret the Vizio 'Clear Action Rate'?

The Reference Series has a Clear Action Rate of 1800Hz and the backlight has a native refresh rate of 120Hz. If the Clear Action Rate is a Motion Clarity Index like the others, it would mean persistence of 0.6ms and 93% BFI. That is all fine, but even with a backlight with 800 Nits, the resulting brightness would be reduced to 53 Nits.

The 2014 M Series has a Clear Action Rate of 720 and (we believe) a backlight with a 120Hz native refresh rate. If the Clear Action Rate is a Motion Clarity Index like the others, it would mean persistence of 1.4ms and 83% BFI. The backlight is 400 Nits, so the resulting brightness would be reduced to 67 Nits.

I suppose if you're watching hockey or playing video games in the dark, it's possible that the Vizio Clear Action Rate is a true Motion Clarity Index like the others, but it seems like a stretch -- do you know what Vizio is doing?
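
(For reference, here is the arithmetic behind my numbers above as a Python sketch -- the assumption in question being that Clear Action Rate really is a clarity index:)

Code:
def persistence_ms(clear_action_rate):
    return 1000.0 / clear_action_rate

def bfi_percent(clear_action_rate, refresh_hz=120):
    frame_ms = 1000.0 / refresh_hz
    return (1.0 - persistence_ms(clear_action_rate) / frame_ms) * 100.0

def effective_nits(clear_action_rate, backlight_nits, refresh_hz=120):
    frame_ms = 1000.0 / refresh_hz
    return backlight_nits * (persistence_ms(clear_action_rate) / frame_ms)

print(bfi_percent(1800), effective_nits(1800, 800))  # ~93%, ~53 nits (Reference Series)
print(bfi_percent(720), effective_nits(720, 400))    # ~83%, ~67 nits (2014 M Series)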

For OLED, do you know if the current LG WOLED TVs support 120Hz refresh (with frame interpolation if needed)? What about the Gen 2 WOLEDs coming later this year?

True 120Hz native refresh rate seems like the enabling technology to use high %s of BFI without introducing noticeable flicker - that should be the minimum refresh rate requirement for OLED TVs (whether the customer wants to make use of that capability or not).
fafrd is offline  
post #109 of 120 Old 05-17-2014, 09:27 PM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Quote:
Originally Posted by fafrd View Post

For OLED, do you know if the current LG WOLED TVs support 120Hz refresh (with frame interpolation if needed)? What about the Gen 2 WOLEDs coming later this year?
No idea, and no idea. Hopefully.
That said, for any panel that supports 120fps MCFI, it's a matter of motherboard design, without any panel modifications necessary.
Quote:
Originally Posted by fafrd View Post

How do you interpret the Vizio 'Clear Action Rate'?
It's truly a good question.
The slapping on of a "Hz" at the end of the "Clear Action Rate" is a big whopping red flag to me.
Maybe it's a legitimate ratio that's reasonably close to persistence, but I need to see for myself...

But if it's a sequential scanning backlight, that number is likely just the frequency at which successive segments get illuminated (e.g. an 1800-segments/second scanning backlight should never be called "1800Hz"). Higher-resolution (finer-granularity) scanning backlights scan at a higher scan rate, until reaching line-based granularity like a CRT. But you don't market motion blur via scan rate. By that logic, a CRT running a 37.5kHz horizontal scan rate at 60Hz would have been advertised as "37500Hz" (the horizontal scan rate) instead of "60Hz" (the vertical refresh rate). But as everyone with enough brain cells knows, horizontal scan rate has no bearing on motion resolution. I call out those TV manufacturers tempted to advertise the segment scan rate of their scanning backlight, as that's not a valid motion clarity measurement. Ideally, persistence equals how briefly each segment is illuminated (e.g. 1/1800sec), but light bleed between adjacent segments always lengthens persistence (usually by a factor of 3-4x), so it may be an effective motion clarity ratio of 400 or 500. Global strobe backlights have no adjacent-segment backlight diffusion inefficiencies, so motion clarity ratios of full-strobe backlights are more closely representative. Of course, scanning backlights can keep GtG artifacts more vertically consistent than global strobe backlights (due to the asymmetry between LCD scanout and the global strobe), at the great cost of a large increase in persistence compared to global strobe backlights.
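
A rough Python sketch of that bleed inefficiency (the 3-4x factor is the ballpark estimate above, not a measured constant):

Code:
def ideal_segment_persistence_ms(segments_per_sec):
    return 1000.0 / segments_per_sec

def effective_clarity_ratio(segments_per_sec, bleed_factor):
    # Light leaking into neighbouring "dark" segments lengthens persistence.
    return 1000.0 / (ideal_segment_persistence_ms(segments_per_sec) * bleed_factor)

print(ideal_segment_persistence_ms(1800))  # ~0.56 ms -- only if bleed were zero
print(effective_clarity_ratio(1800, 3.5))  # ~514 -- i.e. an effective ratio of ~400-500, not "1800"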

A lot of displays like to fudge the motion clarity ratios already (I've witnessed some of the better 480-ratio displays with better-looking motion than the worse 960-ratio displays), but I've also seen Hz numbers that are an order of magnitude away from the motion blur I predicted. So I don't trust all manufacturers to always handle the motion clarity ratios properly. Some are doing a good job, but obviously not all. And of course, GtG limitations can interfere with reality. There are even manufacturers that surpass expectations, where the persistence actually falls below what the motion clarity index numbers suggest. For example, some 240Hz TVs use 120Hz MCFI + variable BFI, where the BFI ratio becomes bigger the more you darken brightness (because that specific TV manufacturer chose to use strobe pulse-width modulation as the method of brightness adjustment during 240Hz). So at least one 240Hz TV I ran into actually ended up having a motion clarity ratio far exceeding 480; although it had some mediocre GtG artifacts, it successfully passed the TestUFO Panning Map Test at 1920 pixels/sec (requires sub-2ms persistence to pass). So there's been one unexpected motion-resolution side effect. Motion clarity testers should try adjusting brightness upwards/downwards in strobe mode during ultrafast moving-photo tests, to see if the brightness adjustment is actually also a strobe pulse-width adjustment -- aka a persistence adjustment! -- like it is on some strobed LCDs.

Someone needs to pick a few HDTVs of each brand, measure the actual motion blur of each, and find out how it compares to the motion clarity indexes. Perhaps a good website candidate would be RTings.com, which has already done some excellent motion blur photography utilizing my invention (which costs as little as $200 -- just a camera rail & an off-the-shelf camera -- the world's most inexpensive pursuit camera, far more accurate than static photography).

w800b-motion-blur-small.jpg Sony W800B, interpolation disabled

w800b-soap-opera-effect-small.jpg Sony W800B, interpolation enabled

w800b-impulse-small.jpg Sony W800B in Motionflow Impulse Mode

In these above images, you can see GtG-related components of blurring, and persistence-related components of blurring.
-- GtG artifacts during eye tracking are the ghosting artifacts at the left edge.
-- Persistence blurring is the overall global symmetric blurring during tracking (natural-looking).
-- Strobing reduces the persistence based blurring.
-- In some cases like this, incomplete GtG artifacts can become easier to see during strobing, because the GtG artifacts are no longer hidden by motion blur.
-- However overall motion clarity still improves anyway, independently of GtG. The faintness of GtG ghosting in this example shows that GtG is mostly complete (approx ~95% in this specific case) by the first subsequent strobe and far more complete (approx ~99%) by the second subsequent strobe. This isn't the most GtG-complete strobe backlight I've seen, but it's a good example of GtG-versus-persistence, and a very clear visual demonstration of how strobing reduces the tracking-based motion blur independently of the GtG ghosting artifacts.
-- Human eye observations confirm that the photos are accurately representative for normal vision.

There's also plenty of other wonderful motion blur photography, via the same pursuit camera technique I developed, now made cheap, easy and accessible to everyday reviewers and bloggers.

The blurring is the same amount seen by the human eye during accurate eye-tracking situations. Every single W800 user I've met has confirmed these images are an accurate representation of what their eyes saw in 960 pixels/sec motion tests. Google "MPRT pursuit camera" on Google Scholar, and you'll see pursuit cameras are already a scientifically accepted tool. It just never became cheap to do accurately (until my invention -- scroll down that page halfway to see the temporal test patterns that allowed inexpensive tracking-accuracy verification). It is somewhat more visually accurate for LCD than plasma, since LCD motion blur doesn't temporally vary like it does on plasmas (no subfield dithering, etc). It still looks reasonably representative on plasmas, but it's even more accurate on LCDs: since their motion blur doesn't temporally vary, capturing a slice is accurately representative of tracking by the majority (i.e. the average human). It covers a wide variety of motion artifacts, both from GtG (including strobe crosstalk) and from persistence (natural blurring). Obviously, it's not a perfect equivalent of human-eye tracking, but it is far closer and more representative than static camera photography attempting to capture motion blur -- and an excellent comparative tool between displays -- since the more motion blur in the photographs, the more motion blur will be seen by eye (during the persistence-critical situation of eye tracking). The pursuit camera method works well on OLED displays too.

At the moment, I'd think testers such as RTings would be among the websites currently best equipped to call out how "motion clarity ratios" correspond to the actual amount of motion blurring seen. I may write a letter to RTings to ask them to loudly call out mis-marketing if they notice a large divergence (e.g. off by a factor of more than 2x) between the motion blur photographed and the motion clarity ratios. Who knows -- if just a mere few review websites did this, all the TV manufacturers would standardize motion clarity ratios more consistently and accurately as the inverse of persistence, as a few of the better manufacturers have finally started doing.

Note -- a group of researchers is about to start work on a scientific paper about my pursuit camera technique within the year (yep -- peer reviewed), so this will be the first real science paper I'm mentioned in. They were very impressed at what I have done -- my $200 pursuit camera setup outperformed a $50,000 commercial rig!

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #110 of 120 Old 05-18-2014, 07:04 AM
AVS Special Member
 
tgm1024's Avatar
 
Join Date: Dec 2010
Posts: 6,083
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 239 Post(s)
Liked: 644
Quote:
Originally Posted by conan48 View Post

Yes, an OLED only passes 300 lines on the MOTION TEST. Find me 1 complaint about motion with ACTUAL CONTENT, either from a reviewer or owner. Most reviews only mention the motion res test, which by the way was specifically created to favor plasma, but fail to run other tests such as image smearing/trailing motion tests, or phosphor lag tests. These additional tests would clearly show that LED LCD with scanning backlight, and plasma, clearly have some serious motion issues, and yet they all pass 1080 lines on the "motion resolution test". Don't drink the motion-test Kool-Aid and believe that it's the only measurement for great motion.

I'm one of the pickiest SOBs when it comes to motion, and almost every tech other than CRT has caused me issues with motion. I have every motion test known to man, and ran them all on the LG OLED and many LCDs, Plasmas, DLPs, and LCOS projectors, and I can definitely say that overall the OLED is second only to CRT. **** the one single motion test they do run. They should test overall motion with many more varied tests.

BTW, as a few reviews have mentioned, it's possible to run the LG OLED with Dejudder and Deblur at a setting of 0, which raises the motion res to over 600 lines and doesn't add SOE at all. Not sure how they do this technically -- one would assume frame interpolation -- but there is no evidence of SOE with these settings, so I'd say the motion res right now is 600 lines for ALL CONTENT.

I'm not saying it wouldn't be nice to have some kind of dark frame insertion in the future, but as of now the motion is GREAT on OLED and I don't think people will have any issues whatsoever. I will certainly be buying the upcoming 4K 65", even if the motion res is still only 600.

 

Conan, on a scale of 1 to 10, with 10 being the best, can you give me a sense of your experience with the other TVs in relation to your LG OLED?  Just a single number is ok for now...no need yet to break it into multiple metrics.


WARNING: You have now entered a no @#$%tard zone. Please gather your anti-vaccine propaganda nonsense and slowly back out the way you came in.
tgm1024 is offline  
post #111 of 120 Old 05-18-2014, 10:19 AM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
I have some curiosities myself:
Quote:
Originally Posted by conan48 View Post

Yes, an OLED only passes 300 lines on the MOTION TEST.
Did you do the motion test yourself?
If so, this confirms your vision is capable of seeing persistence-related blur.
Quote:
Originally Posted by conan48 View Post

...say that overall the OLED is only second to CRT.
Mind elaborating on which metric distinguishes OLED from CRT?
Is this because CRT motion is still sharper? And OLED motion looks so much more natural compared to other non-CRT tech?
I know there are no temporals (dithering, rainbows) and no blur distortions (unbalanced blur/GtG ghosting/corona effects), despite more persistence-related blur.

Not all of us are picky about natural persistence-only blur -- only a niche of us (60fps gamers/sports/etc.) want to push motion clarity to its limits, and the OLED market is small.

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #112 of 120 Old 05-18-2014, 10:52 AM
AVS Special Member
 
8mile13's Avatar
 
Join Date: Nov 2009
Posts: 3,820
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 195 Post(s)
Liked: 209
Conan is talking about his LG OLED, probably a curved one.

LG OLED_300 lines of motion resolution
http://www.hdtvtest.co.uk/news/55ea980w-201312083487.htm


According to Doug Blackburn, there are some LCDs with pixels that can change state faster (high-performance computer monitors and Sony's LCoS digital cinema projectors). In his view, the blur and 3D problems of Sony's $25,000 4K 1000ES projector are VERY small compared to other consumer LCD displays. It seems he thinks LCD blur problems are related to slow pixel state changes (not sample-and-hold).
http://www.avsforum.com/t/1523682/auto-motion-plus
8mile13 is online now  
post #113 of 120 Old 05-18-2014, 03:03 PM
AVS Special Member
 
Wizziwig's Avatar
 
Join Date: Jul 2001
Location: SoCal, USA
Posts: 1,161
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 81 Post(s)
Liked: 94
Quote:
Originally Posted by Mark Rejhon View Post

In these above images, you can see GtG-related components of blurring, and persistence-related components of blurring.
-- GtG artifacts during eye tracking are the ghosting artifacts at the left edge.
-- Persistence blurring is the overall global symmetric blurring during tracking (natural-looking).
-- Strobing reduces the persistence based blurring.
-- In some cases like this, incomplete GtG artifacts can become easier to see during strobing, because the GtG artifacts are no longer hidden by motion blur.
-- However overall motion clarity still improves anyway, independently of GtG. The faintness of GtG ghosting in this example shows that GtG is mostly complete (approx ~95% in this specific case) by the first subsequent strobe and far more complete (approx ~99%) by the second subsequent strobe. This isn't the most GtG-complete strobe backlight I've seen, but it's a good example of GtG-versus-persistence, and a very clear visual demonstration of how strobing reduces the tracking-based motion blur independently of the GtG ghosting artifacts.
-- Human eye observations confirm the photos are accurately representative of normal vision.

Thanks for the link to RTINGS. Never saw that website before. I can also confirm that these shots match what I see when testing Sony Impulse mode TVs at the store.

My only criticism is that they might want to test with a variety of background/foreground colors. I've seen much worse examples of GtG trails/halos with some games. They could also add some higher frequency details to that image so you can detect when those fine details are lost/obscured by blur.

By the way, Conan might reply to your OLED questions in the owner thread here:

http://www.avsforum.com/t/1493578/lg-55ea9800-55-oled-owners-thread
Wizziwig is offline  
post #114 of 120 Old 05-18-2014, 04:14 PM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Quote:
Originally Posted by Wizziwig View Post

Thanks for the link to RTINGS. Never saw that website before. I can also confirm that these shots match what I see when testing Sony Impulse mode TVs at the store.
Thanks for the additional vision-versus-photo data point.
Quote:
Originally Posted by Wizziwig View Post

My only criticism is that they might want to test with a variety of background/foreground colors. I've seen much worse examples of GtG trails/halos with some games. They could also add some higher frequency details to that image so you can detect when those fine details are lost/obscured by blur.
Certainly a valid criticism. It is a simplification compromise; they told me they had to choose a single intuitive example for a non-advanced audience. An understandable choice, to be sure, but head and shoulders above the non-illustrated motion-blur comments on websites that only have stock photos of the TV. The good news is that RTings has made the motion-test MP4 video file freely available to the public (including my pursuit camera sync test pattern embedded in the video), so you can compare your own displays against the photographs on their site.

You can even set up your own pursuit camera with an existing $100 Amazon camera slider (Drylin W1080 slider) and your existing point-and-shoot (typically $100 nowadays), and use the color selectors at www.testufo.com/ghosting ... I am open to ideas on new pursuit camera test patterns to develop for the TestUFO site.

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #115 of 120 Old 05-18-2014, 04:41 PM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,559
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 203
Quote:
Originally Posted by Mark Rejhon View Post

Sony W800B, interpolation disabled: http://www.rtings.com/images/reviews/w800b/w800b-motion-blur-medium.jpg
Sony W800B in Motionflow Impulse Mode: http://www.rtings.com/images/reviews/w800b/w800b-impulse-medium.jpg
Thanks for the images Mark. To me, this makes it clear that the biggest problem with motion today is persistence.
Yes, response times still need to improve - but that would only remove the trailing from these images - the text would remain just as blurred.

Images taken from an OLED with full persistence would look similar to the first image, without the trailing on the left edge.
I would rather take the LCD with a mostly sharp low-persistence image than a display with high persistence.

I'd also say that if the W800 is anything like my panel, red moving over a green background like that is about the slowest pixel transition time there is. Trails like that are extremely rare in actual video content.

Of course, if someone were to release an OLED display which accepted a high framerate input (minimum 120fps, preferably 240+) and offered a low latency, low persistence mode (1ms) that would be the best of both.

What is the persistence of the Sony Impulse mode on the W800? 2ms? (my older HX900 is rated as being "480Hz")
Perhaps the balance would shift in favor of OLED at 4ms where you have more persistence-based blur, but none of the response time issues.
Chronoptimist is offline  
post #116 of 120 Old 05-18-2014, 05:48 PM - Thread Starter
AVS Special Member
 
fafrd's Avatar
 
Join Date: Jun 2002
Posts: 3,063
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 14 Post(s)
Liked: 418
Quote:
Originally Posted by Chronoptimist View Post

Quote:
Originally Posted by Mark Rejhon View Post

Sony W800B, interpolation disabled: http://www.rtings.com/images/reviews/w800b/w800b-motion-blur-medium.jpg
Sony W800B in Motionflow Impulse Mode: http://www.rtings.com/images/reviews/w800b/w800b-impulse-medium.jpg

What is the persistence of the Sony Impulse mode on the W800? 2ms? (my older HX900 is rated as being "480Hz")
Perhaps the balance would shift in favor of OLED at 4ms where you have more persistence-based blur, but none of the response time issues.

The W800 appears to be MotionFlow 480, which seems to be 240Hz refresh (triple-frame-interpolation) coupled with 50% BFI (meaning 2.1ms persistence).

So from this example, it's pretty clear that a 240Hz refresh OLED with 50% BFI would look much sharper (no ghosting), but without the 50% BFI (240Hz FI only) it would not look too much different from the LED/LCD.

Once OLED has a 120Hz refresh rate coupled with FI and 50% BFI (4.2ms), it should look slightly better than the equivalent LED/LCD (but not by much). Still, it looks like an LED/LCD with a 240Hz refresh rate (and BFI) is always going to deliver better motion performance than an OLED refreshing at 120Hz (to say nothing of 60Hz).
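
For what it's worth, the persistence figures above fall out of a one-line calculation (my own arithmetic as a sanity check, assuming an ideal BFI duty cycle):

Code:
# persistence = frame period x fraction of the frame that is lit
def persistence_ms(refresh_hz, lit_fraction):
    return (1000.0 / refresh_hz) * lit_fraction

print(persistence_ms(240, 0.5))  # 240Hz + 50% BFI -> ~2.08ms (the 2.1ms above)
print(persistence_ms(120, 0.5))  # 120Hz + 50% BFI -> ~4.17ms (the 4.2ms above)
print(persistence_ms(60, 1.0))   # plain 60Hz sample-and-hold -> ~16.7ms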
fafrd is offline  
post #117 of 120 Old 05-18-2014, 06:43 PM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Quote:
Originally Posted by Chronoptimist View Post

Images taken from an OLED with full persistence would look similar to the first image, without the trailing on the left edge.
I would rather take the LCD with a mostly sharp low-persistence image than a display with high persistence.
True -- myself too.

However, some people are MUCH more bothered by unbalanced/artificial-looking motion blur. Some people see rainbows more than others. Some see plasma contouring and DLP dithering artifacts more easily than others. Some see flicker more than others. The same goes for GtG effects like unnatural ghosting and blurring; these can stand out strongly above the natural persistence-based motion blur. Persistence blur looks quite natural, since it is actual motion blur (from eye tracking) that is fully balanced at all color levels. Persistence blur looks virtually identical to a slower camera shutter speed during panning.

1. HDTV camera with slow shutter during pan + high persistence display == natural looking motion blur
2. HDTV camera with fast shutter during pan + high persistence display == natural looking motion blur
3. HDTV camera with slow shutter during pan + ultralow persistence display == natural looking motion blur
4. HDTV camera with fast shutter during pan + ultralow persistence display == crystal-sharp CRT-style pans with no motion blur

So you see, once GtG effects are completely eliminated, the leftover blur is so natural that it is hard to distinguish case 1 from case 2 from case 3.
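
A minimal sketch of why cases 1 through 3 look alike (my own simplification: treating camera shutter and display persistence as two box blurs whose widths simply add):

Code:
# Total tracking blur is roughly (shutter time + persistence) x pan speed,
# since convolving two box filters yields a width equal to the sum of widths.
def total_blur_px(shutter_ms, persistence_ms, pan_speed_px_per_s):
    return (shutter_ms + persistence_ms) / 1000.0 * pan_speed_px_per_s

print(total_blur_px(16.7, 16.7, 960))  # slow shutter, full persistence: ~32 px
print(total_blur_px(16.7, 0.5, 960))   # slow shutter, ultralow persistence: ~17 px
print(total_blur_px(1.0, 0.5, 960))    # fast shutter + ultralow persistence: ~1.4 px

In cases 1 through 3, at least one large term keeps the total blur camera-like and natural; only case 4 drives both terms toward zero.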

Here are some other common LCD GtG artifacts:

---

l1400u-motion-blur-small.jpg
Severe ghosting: This GtG artifact often looks like this when you have little or no overdrive, and/or when the LCD panel is very cold (cold VA LCDs create much longer ghost trails than when warmed up). Also, low-quality strobing is occurring in this one, creating severe multi-image effects.

---

h7150-soap-opera-effect-small.jpg
Bright ghosting: This GtG artifact is called a corona, or inverse ghosting. The ghosting is brighter than the original pixels. This is caused by overdrive (response time acceleration) overshooting its final pixel color value.

---

h6350-soap-opera-effect-small.jpg
Inconsistent ghosting: This GtG artifact is caused by inconsistent GtG response speeds; some pixels are slower than others. The pixels at the upper-left and upper-right corners of the RTings logo are transitioning more slowly, because those pixels have been staying red longer than the pixels in the middle of the logo (where there are a lot of white pixels).

---

Lots of people don't understand the motion artifacts they're seeing as well as some of us do. There are many reasons why, but these two camps are among the most common:

The GtG haters
Unbalanced motion blur caused by GtG is hugely annoying to some people. They seek a different kind of immersion, wanting motion to look natural without any distortions caused by GtG inconsistencies. There are many reasons why some dislike GtG more than persistence, and vice versa, but as I have been saying, persistence blur is scientifically natural motion blur; true, real motion blur. A lot of people don't mind natural motion blur, which is what persistence during eye tracking produces (even if it is enforced blur).

The persistence haters
However, for a small niche of the audience (like me), GtG artifacts are the tolerable ones. When playing persistence-critical applications such as 60fps@60Hz or 120fps@120Hz FPS gaming, the "CRT sharp" effect hugely increases immersion when you don't have any motion blur during panning (strafing/turning/coasting/riding in Quake Live, Portal 2, Team Fortress, Bioshock, Call of Duty, Counter-Strike, etc.). It feels like the "arcade CRT" effect when you have zero motion blur (persistence so low that it's not noticed) at all panning speeds. Fine-detail preservation at all motion speeds becomes numero uno for some people (like me!).

Some of us spend thousands on "gaming theatres", including this Blur Busters-featured gaming setup (driven by >$4000 worth of GPUs).
More info below, from those of us willing to pay $1000+ for a 24" low-persistence rolling-scan OLED:
Within this niche, some of us want the CRT clarity effect without CRTs; for us, overall motion clarity (texture clarity) can massively outweigh any GtG artifacts. You have videophiles, you have audiophiles; I (Blur Busters) serve the audience of motion clarity nuts, myself included. Many of us are demanding OLED displays and we want them now. We would pay $1000+ for a 24" desktop 120Hz zero-latency (CRT-equivalent) OLED with adjustable persistence: a complete 0.5ms-through-8.3ms adjustable rolling scan, a fully adjustable continuum from full persistence all the way down to low persistence. If there is an incentive to do so, I think it can be done in less than 2-3 years.

The Oculus DK2 actually comes close to this already; it is the best video-game OLED I have ever seen in my lifetime (when running at 75fps@75Hz). But it is only a tiny smartphone display (in a VR headset) co-opted with a special custom persistence-lowering display-driver board. Lowering OLED persistence is simply driver electronics.

As long as we can get 400cd/m2 at full persistence, we can still get a usable 100cd/m2 at 2ms persistence: the same brightness as LightBoost (100cd/m2), but with much better color quality than almost all LCD monitors (without putting up with a LightBoost-color-degraded TN panel). But let the persistence keep being adjustable past that; some of us tolerate ~30cd/m2 at 0.5ms persistence on the BENQ Z-Series LCD by gaming in a totally dark room, which is not too different from some projection home theater brightnesses. 0.5ms matches CRT clarity (zero blur at all eye-tracking speeds); an OLED running a 0.5ms rolling scan looks nearly identical to a CRT with its brightness turned way down. That doesn't mean the lower brightness is worthless on the market; there are people in the Blur Busters audience ready to pay, and adjustable persistence lets you choose your own brightness-versus-motion-clarity tradeoff.

Full persistence adjustability is a trivial firmware modification for an OLED rolling scan: simply adjust the distance by which the OFF-pass rolling scan (switching each pixel's active-matrix transistors off) chases behind the ON-pass rolling scan (switching them on). Any AMOLED is capable of a rolling scan, including your Samsung Galaxy S3's OLED, when attached to a custom electronics board.

Let's see these low-persistence OLEDs arrive in 24" format for desktop monitors. Certain OLEDs are already starting to hit 400cd/m2, so we could have usable 2ms persistence now in the 120-impulses-per-second league (quarter persistence). That isn't all the way to CRT clarity, but it begins to enter that territory. Hopefully, when 50" 4K 120Hz-MCFI OLEDs come out and start falling to sub-$3000 levels, they can be profitably cut up into four $800-$1000 1080p elite gaming computer monitors drivable at native 120Hz. Then we can have our cake (virtually perfect GtG) and eat it too (great adjustable persistence).
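
A minimal sketch of the brightness-versus-persistence tradeoff claimed above (a simple duty-cycle model; it assumes emitted light scales linearly with on-time, which real panels only approximate):

Code:
# Strobed/rolling-scan luminance ~= full-persistence luminance x duty cycle
def strobed_luminance_cd_m2(full_cd_m2, persistence_ms, refresh_hz):
    duty = persistence_ms / (1000.0 / refresh_hz)
    return full_cd_m2 * duty

print(strobed_luminance_cd_m2(400, 2.0, 120))  # ~96 cd/m2: the usable ~100 figure
print(strobed_luminance_cd_m2(400, 0.5, 120))  # ~24 cd/m2: dark-room territory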

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #118 of 120 Old 05-18-2014, 09:46 PM
AVS Special Member
 
Chronoptimist's Avatar
 
Join Date: Sep 2009
Posts: 2,559
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 1 Post(s)
Liked: 203
Quote:
Originally Posted by Mark Rejhon View Post

However, some people are MUCH more bothered by unbalanced/artificial looking motion blur. There are people see rainbows more than others. There are people who see plasma contouring and DLP dithering artifacts more easily than others. There are people who see flicker more than others The same occurs for GtG effects like unnatural ghosting and blurring. These can stand out more strongly above the natural persistence-based motion blur. Persistence blur is pretty natural looking, since it's actually natural motion blur (from tracking) that is fully balanced at all color levels. Persistence blur is virtually identical-looking to slower shutter speeds during camera panning.
I typically see all of these artifacts to strong effect.
The worst for me is when you have the image rendered perfectly sharp, multiple times - similar to what you see with PDP or DLP, as the image is not temporally stable.
With LCD, unless the set has some very bad overshoot, artifacts are typically trailing behind the moving object, rather than being on the leading-edge. With plasma and DLP I see completely separate images, in their individual colors.
With my HX900, the worst-case scenario is usually something like this, or in extremely rare circumstances, some blue overshoot on black - though it's almost completely eliminated in the 480Hz "Clear Plus" mode.
With DLP I see separate R/G/B images, and with PDP, I see a bright blue leading edge, and a bright greenish-yellow trailing edge - both of which are perfectly sharp, bright, and obvious.
The "fading" trail of LCD response times on a set with backlight scanning are not very obtrusive - similar to the relatively unobtrusive trailing you would occasionally get on a CRT.
Quote:
Originally Posted by Mark Rejhon View Post

...an OLED running a 0.5ms rolling scan looks nearly identical to a CRT with its brightness setting adjusted way down...
Nice. Sounds like the biggest hurdle for OLED is primarily brightness.
I do have some concern about whether or not OLED response times can match a CRT though. A CRT has nearly instantaneous rise times, and a steep falloff. The rise time for this OLED is longer than the entire transition for the CRT.
Chronoptimist is offline  
post #119 of 120 Old 05-19-2014, 01:44 PM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,124
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 2 Post(s)
Liked: 96
Quote:
Originally Posted by Chronoptimist View Post

I do have some concern about whether or not OLED response times can match a CRT though. A CRT has nearly instantaneous rise times, and a steep falloff. The rise time for this OLED is longer than the entire transition for the CRT.
I have that graph in my article, Why Do Some OLEDs Have Motion Blur?.

The OLED GtG doesn't have to be squarewave or full-range. Previous OLED colors don't interfere with the GtG of subsequent colors the way they do on LCD, especially with the long black frame resetting the pixel state. The rise doesn't have to be fully complete for usability; the number of photons for darker colors just needs to be fewer than for brighter colors. You do lose luminance potential, though: you may not get the OLED to fully illuminate, but you will still get a pulse out. At 0.5ms, the pulse is roughly sharkfin-shaped (or sawtooth-shaped).

Grayscale linearity degrades a bit at the bottom end, but you can re-calibrate grayscale for a given OLED persistence, since the behavior is extremely stable and predictable (it doesn't vary from refresh to refresh). You can get gamma distortions, possibly nonlinear across the OLED substrate due to manufacturing imperfections, and these could get amplified at low persistence settings, where short persistence begins to "eat" into the risetime/falltime. However, motion quality is still quite comparable to CRT even with the truncated (sharkfin-waveform) rises and falls at lower persistence values. Technically, risetime affects artifacts at the leading edge of motion blur, and falltime affects artifacts at the trailing edge.

Note that CRT illumination isn't squarewave either; as you pointed out, it has a fast ramp-up and a slow decay (which sometimes lasts several refreshes). That actually creates a different type of ghosting artifact on CRTs, especially higher-persistence ones (e.g. green ghosting behind bright images on black backgrounds). Typically, OLED has none of that.

Faster risetimes/falltimes are still beneficial for efficiency, to take advantage of the full luminance potential of the OLED pixel. OLED pixel risetimes/falltimes vary from panel to panel, since they are heavily dependent on the active-matrix transistors and the electronics driving them; a higher voltage at the transistor gate of the active-matrix pixel (aka overdrive techniques) can speed them up. In theory, OLED drivers might someday use boost voltages, a special OLED-optimized kind of response time acceleration, if faster risetimes/falltimes and/or boosted brightness were needed for cleaner, brighter low-persistence operation, though overdrive might introduce problems of its own that outweigh not using it. With the firmware simplicity of an adjustable rolling scan, you can choose your own compromise/tradeoff. Doing 2ms persistence on an OLED isn't a problem, except as a light-output issue.
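
To make the adjustable rolling scan above concrete, here is a minimal sketch (hypothetical firmware pseudologic in Python, not any vendor's actual code) of how a persistence setting maps to the gap between the ON-pass and the trailing OFF-pass:

Code:
# Each row is lit when the ON-pass reaches it and extinguished when the
# OFF-pass reaches it; persistence is set by how many rows apart they are.
def off_pass_offset_rows(persistence_ms, total_rows, refresh_hz):
    rows_per_ms = total_rows * refresh_hz / 1000.0  # scanout speed
    return round(persistence_ms * rows_per_ms)

print(off_pass_offset_rows(2.0, 1080, 120))  # ~259 rows behind the ON-pass
print(off_pass_offset_rows(8.3, 1080, 120))  # ~1076 rows, i.e. near full persistence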

There is a lot of engineering creativity still untapped for OLED, while the LCD world has done crazy stuff: scanline-optimized overdrive algorithms in the fastest strobed LCD gaming monitors (to account for differing GtG freshness during the dark LCD scanout before the backlight strobe), variable-refresh-rate technology such as NVIDIA G-SYNC (the refresh rate can seamlessly change 100+ times a second to keep refreshes perfectly in sync with the varying frame rate of a PC game, allowing stutter-free fluctuation of framerate), and advanced motion interpolation combined with ultrahigh-granularity scanning backlights. OLED rolling scans are technically far simpler than any of that. I have no doubt that a lot of OLED problems are fixable (on a ten-year timescale). That said, the OLED burn-in problem is actually harder than achieving CRT motion clarity parity.

Manufacturer inertia is sometimes the bigger problem, even when a panel they already have is capable of clean, low-persistence operation. Manufacturers are often reluctant to give users a lot of adjustment range, e.g. adjustability of a rolling scan, or interpolation-free 60Hz strobing. At 60Hz there is a lot of flicker, sometimes worse than a 60Hz CRT (partially due to squarewave 60Hz LED backlight strobing versus softer 60Hz phosphor decay), and average everyday users complain more about flicker than they do about soap-opera effect (from interpolating to higher framerates to reduce low-persistence flicker). Even today, Sony Motionflow Impulse gets enough complaints that Sony displays a warning message whenever the feature is enabled. And some plasma manufacturers lengthen the persistence of their displays to reduce plasma flicker, and add motion interpolation.
markrubin and Chronoptimist like this.

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #120 of 120 Old 05-20-2014, 01:36 PM
AVS Special Member
 
Wizziwig's Avatar
 
Join Date: Jul 2001
Location: SoCal, USA
Posts: 1,161
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 81 Post(s)
Liked: 94
Quote:
Originally Posted by Chronoptimist View Post

I typically see all of these artifacts to strong effect.
The worst for me is when you have the image rendered perfectly sharp, multiple times - similar to what you see with PDP or DLP, as the image is not temporally stable.
With LCD, unless the set has some very bad overshoot, artifacts are typically trailing behind the moving object, rather than being on the leading-edge. With plasma and DLP I see completely separate images, in their individual colors.
With my HX900, the worst-case scenario is usually something like this, or in extremely rare circumstances, some blue overshoot on black - though it's almost completely eliminated in the 480Hz "Clear Plus" mode.
With DLP I see separate R/G/B images, and with PDP, I see a bright blue leading edge, and a bright greenish-yellow trailing edge - both of which are perfectly sharp, bright, and obvious.
The "fading" trail of LCD response times on a set with backlight scanning are not very obtrusive - similar to the relatively unobtrusive trailing you would occasionally get on a CRT.

Agreed about DLP and plasma rainbow artifacts. I even see them on LED-based DLP, which most people claim is fast enough to eliminate the problem. It's really frustrating that other people can enjoy these displays without seeing these problems. Neither of my parents sees them, so it can't be genetic. I wonder why most people can't see them? My only theory is that it's some kind of conditioning/training of the brain. I've spent most of my life using CRTs both at home and at work. Maybe they have warped my perception of motion. confused.gif

Regarding the LCD trailing/ghosting, the problem is not as minor as you describe. Think of a case where you have a high-detail textured surface; as an artificial example, a grid of alternating bright and dark gray lines. What happens is that the trails cover up the neighboring lines and effectively blend the surface into solid gray, losing all detail and making the individual lines invisible. It also produces the strange effect that objects appear to get darker while moving, then suddenly snap into brighter color and detail when they stop. I find this very annoying. The rtings.com pattern is too simple and low-resolution to illustrate it.
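
A quick toy simulation of that blending effect (my own model, treating slow GtG as an exponential low-pass on each pixel's value over successive refreshes; illustrative only):

Code:
import math

# A moving 1px grid makes each pixel alternate bright/dark every refresh.
def gtg_filtered(targets, tau_frames):
    alpha = 1.0 - math.exp(-1.0 / tau_frames)  # per-refresh settling fraction
    level, out = targets[0], []
    for target in targets:
        level += alpha * (target - level)  # pixel chases its target each refresh
        out.append(round(level, 2))
    return out

# With GtG slower than a frame (tau = 2 frames), the alternation collapses
# toward gray instead of resolving distinct lines:
print(gtg_filtered([1, 0] * 6, 2.0))

The output settles into a compressed band around mid-gray, which matches the described behavior: once the motion stops, the targets stop alternating and the pixels converge to their true values, so the detail suddenly snaps back.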
Wizziwig is offline  