30fps gaming at 60hz and the effects of frame doubling? - Page 2 - AVS Forum
post #31 of 72 Old 01-03-2012, 08:11 AM
skidawgz (Senior Member)
Quick question; it may sound silly, but only because I haven't been able to buy a plasma set (yet). How does the TV (particularly the OP's) perform in "Game Mode"?

AVR: Marantz 7008 Phono: Pro-Ject Debut III Speakers: BW CMC2 + 2xCM9 + Energy 2xCB-10 (rear)
TV: Panasonic 65ZT60, Samsung 60F5300
post #32 of 72 Old 01-30-2012, 01:11 AM
WaveBoy (AVS Special Member)
30fps games on any LCD or plasma HDTV are incredibly annoying. With my plasma, the doubling effect is just hideous...
On my LCD the blur looks even worse. When I pop in anything 60fps, the motion blur on my LCD is nowhere near as bad, and the doubling effect on my plasma is gone.

Now, I'm a massive Nintendo nut, and most of their exclusives (Super Mario Galaxy 2, Wario Land: Shake It!, Metroid Prime 3, SSB Brawl, etc.) run at 60fps on the Wii, or 95% of them do. It's the 3rd-party titles that usually push the graphical boundaries (Resident Evil 6, Uncharted 3, etc.) that sacrifice frame rate for detail; those get locked in at 30fps, and the end result mucks up the experience on any HDTV.

With 30fps, either you put up with the ugly and distracting image doubling as seen on a plasma, or you deal with the extra motion blur (by the looks of it, it seems unusual and smear-ish, as if it's covering up the doubling effect. It's pretty bad; the blur seems 2x worse, but it must be the combination of blur and frame doubling that creates that smeary, messy effect).

Either way the end result is nasty. Will this problem ever be resolved? How about developers make 60fps the bloody standard next gen?
My 2 CRTs NEVER had this problem. They displayed 30fps titles perfectly, with zero image doubling and no motion problems at all. The motion handling was 'perfect' and retained every single little detail no matter how fast you zipped the camera around. That tech still has the advantage: no 30fps image doubling and perfect motion. While plasma is great in the motion department, it's not up to snuff compared to CRT.
It's clearly evident in third- and first-person titles.
post #33 of 72 Old 01-30-2012, 03:22 AM
Chronoptimist (AVS Special Member)
Quote:
Originally Posted by WaveBoy View Post

Either way the end result is nasty. Will this problem ever be resolved? How about developers make 60fps the bloody standard next gen?

The only way to fix it is to have hardware so powerful that you cannot improve the visuals in any way other than by doubling the framerate.

I had thought that 3D might be the solution (where running a 3D title in 2D might run at 60fps), but it turns out that the framerate loss is not 50%, and developers are significantly compromising on image quality and framerate in 3D. It's not just that we have games running at 720p/30fps in 3D on consoles; they are often far below 720p with lower-resolution textures/effects, and average below 25fps even then.

Most developers are always going to choose better graphics, something that is obvious in screenshots (especially when most web videos are 30fps), over an improved framerate that you can't see unless you're playing the game. Rage gambled on framerate, and look at how that turned out for id.

Quote:
Originally Posted by WaveBoy View Post

My 2 CRTs NEVER had this problem. They displayed 30fps titles perfectly, with zero image doubling and no motion problems at all. The motion handling was 'perfect' and retained every single little detail no matter how fast you zipped the camera around. That tech still has the advantage: no 30fps image doubling and perfect motion. While plasma is great in the motion department, it's not up to snuff compared to CRT.

Go back to a CRT capable of natively displaying a 720p or 1080p image, and you will find that it's just as evident there. The problem is the framerate and the way that games are rendered, not the displays.

The only solution is to stick to Nintendo-developed games (and even they're not all 60fps), which may change once they have to render at HD resolutions, or get into PC gaming. A GTX 570 will play most games at 1920x1080 with the majority of graphics settings turned up as high as they can go (you may have to cut back on some antialiasing or ambient occlusion), essentially locked to 60fps. I don't mean 60fps with lots of slowdown every so often, as console games (except Nintendo titles) often have; I mean never dropping below 60fps. There have only been one or two titles in the last few years that have been exceptions to that, and there you can just turn a couple of settings down from maximum to smooth things out. (Most console versions of a game are equivalent to low/medium settings at 720p/30fps.)
post #34 of 72 Old 01-30-2012, 10:43 AM
Nielo TM (AVS Special Member)
It's much worse on CRT. Nearly impossible to play.

It's now bad on LCDs as well, thanks to the frequency mismatch.
post #35 of 72 Old 09-10-2012, 10:05 PM
Quic K Bunnie (Newbie)
Quote:
Originally Posted by id0l View Post

It appears, using this game as a testing ground for different framerates, that any game (I assume other content as well) displaying at 30fps effectively has "doubled frames" for each screen refresh. This is done by the display device itself, and not the game/content. This is evident when fast-moving images (full screen or otherwise)/objects are displayed on-screen at 30fps. I believe it is less noticeable on an LCD because of the inherent motion blur due to the nature of LCD technology. Since plasma can refresh much faster than LCD, the frame doubling is more apparent: each frame is drawn clearly, hence showing the separation of the image more clearly than on an LCD. This is not ghosting, because the "doubled image" is the same color as the original and is not a dark/fading silhouette of the image (clearly evident on Sony Bravia LCD TVs during fast-motion gaming, and some film/TV sources as well - this is due to the slower response time of these panels).

So here's where I don't think I agree with you (at least in the case of the Xbox): the scaling and frame rendering are done by the game console. I've checked this personally, and all games that run at 720p and 30 fps are still output over HDMI as 1080p at 60 Hz. The image is identical if the output is 720p at 60 Hz, and there is a drastic difference in cadence between 30 fps games like Gears of War 2 and 60 fps games like Forza 2, but the video source properties over HDMI are unchanged. If, somehow, the display were showing 2 sequential frames at the same time, it would show the double-image effect for BOTH 60 fps and 30 fps games, because the signal to the TV is identical.

My theory is that it's actually the Xbox that is doing this - perhaps a faulty scaling to 60 Hz, interlacing instead of progressively doubling each frame. Perhaps it is on purpose: by displaying 2 frames stacked together, they may be trying to mask the slower 30 fps cadence and give the illusion of motion (something that LCD motion blur will assist with).
It is possible that, by some weird quirk, World in Conflict somehow does the same thing, or perhaps the video card does. It's also possible that, since you see motion blur on the LCD at both 30 fps and 60 fps, what you feel is image doubling at 30 fps is actually exaggerated LCD "motion blur" - the effect of the same LCD pixel-clearing properties that cause motion blur. The extra distance traveled from one frame to the next at 30 fps (as compared to 60 fps) causes the ghost image of the previous frame to look more like a double image, purely as an optical effect of the greater gap between frame and ghost-frame edges. Compared to 60 fps motion blur, the edges are close enough together that it appears more like motion blur than image doubling.
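To put rough numbers on that gap argument, here's a quick sketch (the pan speed is an assumption picked purely for illustration):

```python
# Back-of-envelope version of the "gap" argument above. The pan speed is an
# assumed, illustrative number; only the ratio between the two cases matters.

PAN_SPEED_PX_PER_S = 1200  # hypothetical horizontal pan speed, pixels/second

for fps in (60, 30):
    gap_px = PAN_SPEED_PX_PER_S / fps  # distance between a frame edge and its ghost
    print(f"{fps} fps: edge-to-ghost gap ~ {gap_px:.0f} px")

# 60 fps: edge-to-ghost gap ~ 20 px  (close enough to read as blur)
# 30 fps: edge-to-ghost gap ~ 40 px  (wide enough to read as a double image)
```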

I wouldn't be surprised if this interlacing is the industry-standard way of handling 30 fps video game content, mainly to give the appearance of smoother motion - so the effect would be present on Xbox, PS3, Wii, etc.

As for retina persistence - I find that unlikely, since this is not a phenomenon I experience with 24p-cadence film sources. If someone could take some high-speed camera stills of this effect, that would be good information. The shutter speed would need to be at least 1/125s to accurately pinpoint any kind of image doubling over retina persistence.
post #36 of 72 Old 09-10-2012, 10:23 PM
specuvestor (AVS Special Member)
Quote:
Originally Posted by Quic K Bunnie View Post

So here's where I don't think I agree with you (at least in the case of the Xbox): the scaling and frame rendering are done by the game console. I've checked this personally, and all games that run at 720p and 30 fps are still output over HDMI as 1080p at 60 Hz. The image is identical if the output is 720p at 60 Hz, and there is a drastic difference in cadence between 30 fps games like Gears of War 2 and 60 fps games like Forza 2, but the video source properties over HDMI are unchanged. If, somehow, the display were showing 2 sequential frames at the same time, it would show the double-image effect for BOTH 60 fps and 30 fps games, because the signal to the TV is identical.

AFAIK there is no commercially available TV that can receive 1080p signal at 60Hz. Check again or let us know what model TV you have.
post #37 of 72 Old 09-11-2012, 06:22 AM
Chronoptimist (AVS Special Member)
Quote:
Originally Posted by Quic K Bunnie View Post

My theory is that it's actually the Xbox that is doing this - perhaps a faulty scaling to 60 Hz, interlacing instead of progressively doubling each frame. Perhaps it is on purpose: by displaying 2 frames stacked together, they may be trying to mask the slower 30 fps cadence and give the illusion of motion (something that LCD motion blur will assist with).
It is not doing this. The effect is simply caused by the fact that 30fps is not enough for smooth motion, and games do not have the natural motion blur that filmed content has to mask the artefact.
Quote:
Originally Posted by Quic K Bunnie View Post

As for retina persistence - I find that unlikely, since this is not a phenomenon I experience with 24p-cadence film sources. If someone could take some high-speed camera stills of this effect, that would be good information. The shutter speed would need to be at least 1/125s to accurately pinpoint any kind of image doubling over retina persistence.
Anything filmed with a real camera has a ton of motion blur in it. Film is generally shot at 24fps with a shutter speed of 1/48s. If you have ever tried to photograph anything moving at 1/50s on your camera, you will find that there is a lot of motion blur there. It’s this motion blur that smooths out the appearance of motion.
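To put rough numbers on this, here's a quick sketch (the pan speed is an assumed figure; the shutter and frame times are the ones above):

```python
# Illustrative arithmetic for the 180-degree film shutter described above.
# The pan speed is a made-up figure; the shutter and frame times are the
# post's numbers (24 fps, 1/48 s shutter).

pan_speed_px_s = 960   # assumed pan speed in pixels/second
shutter_s = 1 / 48     # time the shutter is open per frame
frame_s = 1 / 24       # total frame period at 24 fps

print(f"blur baked into each frame: {pan_speed_px_s * shutter_s:.0f} px")
print(f"jump between frames:        {pan_speed_px_s * frame_s:.0f} px")
# blur baked into each frame: 20 px
# jump between frames:        40 px
# Half of every frame-to-frame jump is already smeared inside the frame,
# which is what makes filmed 24 fps motion look smoother than a crisply
# rendered game frame at 30 fps.
```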

And I do not perceive 24fps content to be smooth at all. I need interpolation for it to be tolerable, even though interpolation brings its own problems. (though Sony at least avoids the sped-up look that most sets introduce)
Quote:
Originally Posted by specuvestor View Post

AFAIK there is no commercially available TV that can receive 1080p signal at 60Hz. Check again or let us know what model TV you have.
Just about every display on the market will currently accept a 1080p60 image.

Over HDMI, there is only 1080p24 and 1080p60. There is no 1080p30. Each frame of 30fps content is doubled and fit into a 60Hz signal.
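If it helps to picture it, here's a minimal sketch of that doubling (illustrative Python; on a real console the repetition happens in the output scaler):

```python
# Minimal sketch of 30 fps -> 60 Hz frame doubling. The frames are just
# labels; on a real console this repetition happens in the output scaler.

game_frames = ["F0", "F1", "F2", "F3"]                  # rendered at 30 fps
hdmi_frames = [f for f in game_frames for _ in (0, 1)]  # each sent twice at 60 Hz
print(hdmi_frames)
# ['F0', 'F0', 'F1', 'F1', 'F2', 'F2', 'F3', 'F3']
# The TV just sees a valid 1080p60 stream; it has no way to know that every
# other frame is a repeat.
```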
post #38 of 72 Old 09-11-2012, 10:53 AM
Quic K Bunnie (Newbie)
Quote:
Originally Posted by Chronoptimist View Post

It is not doing this. The effect is simply caused by the fact that 30fps is not enough for smooth motion, and games do not have the natural motion blur that filmed content has to mask the artefact.
There is a difference between smooth motion and double frames displayed simultaneously. I agree that 30 fps results in somewhat choppy motion - the clear detection of which is part of the reason I don't believe this is retina persistence. The choppy motion from the 30 fps framerate is an expected effect; the double image on a display that is very good at resolving motion is not.
Quote:
Anything filmed with a real camera has a ton of motion blur in it. Film is generally shot at 24fps with a shutter speed of 1/48s. If you have ever tried to photograph anything moving at 1/50s on your camera, you will find that there is a lot of motion blur there. It’s this motion blur that smooths out the appearance of motion.
I'm not sure where you are getting your 1/48s number from... Professional film cameras have variable shutter speeds. You are correct that a filmmaker may choose a longer shutter speed to introduce blur on the film frame (counter-intuitively, during high-motion shots) to produce a perception of smoother motion. However, working with homemade digital films where my friend and I have added special effects - 30 fps with no motion blurring, on purpose - most certainly did not result in the same image-doubling effect.
If your retina could not clear faster than 30 fps, you should be able to resolve 3 images in 60 fps footage when it is panned fast enough to create enough distance from one frame to the next - and this is not a phenomenon I think people are experiencing on high-end plasmas (but I don't have one to test).
Quote:
And I do not perceive 24fps content to be smooth at all. I need interpolation for it to be tolerable, even though interpolation brings its own problems. (though Sony at least avoids the sped-up look that most sets introduce)
I agree, film cadence is not smooth - nor is it intended to be smooth. You will see 2 problems with a native film cadence, because your retina can tell the difference from one frame to the next: on strobing displays you'll see flicker until you can get 4:4 @ 96 Hz (and even then some people see flicker), and motion will be choppy without interpolation. This is a film effect that some people prefer and others don't. The Hobbit is being filmed at 48 fps, and the trailer was not well received by everyone, though Peter Jackson feels the 48 fps effect is superior.
Personally, I like slow panning shots at a slower cadence like 24 or 30. It just feels more like a movie - which is totally subjective and maybe not even a valid reason to prefer the slower cadence. But for action shots I would heavily prefer a faster cadence - I do not like the Bourne movies' style of not knowing what the hell is going on during an action scene, something the 24 fps cadence contributes to (but it's mostly the shaky camera in the Bourne movies specifically).
Quote:
Just about every display on the market will currently accept a 1080p60 image.
Over HDMI, there is only 1080p24 and 1080p60. There is no 1080p30. Each frame of 30fps content is doubled and fit into a 60Hz signal.

No arguments here. Just FYI - I did my output testing on a Dell U2711 - the feed was definitely 1920x1080 @ 60 Hz.
post #39 of 72 Old 09-12-2012, 05:21 AM
specuvestor (AVS Special Member)
^^ I thought you were referring to TVs, not monitors, i.e. "because the signal is identical to the TV" :) Monitors can take many more varied signals than TVs. I didn't know TVs could accept a 1080p60 signal and had assumed 30fps games were just output to the TV as 1080i60, since it was not too long ago that TVs first started to accept 1080p24 - or at least it doesn't feel long ago, relative to my age :D Thanks Chronopt for the correction.
post #40 of 72 Old 09-12-2012, 10:40 AM
Quic K Bunnie (Newbie)
Quote:
Originally Posted by specuvestor View Post

^^ I thought you were referring to TVs, not monitors, i.e. "because the signal is identical to the TV" :) Monitors can take many more varied signals than TVs. I didn't know TVs could accept a 1080p60 signal and had assumed 30fps games were just output to the TV as 1080i60, since it was not too long ago that TVs first started to accept 1080p24 - or at least it doesn't feel long ago, relative to my age :D Thanks Chronopt for the correction.

I was referring to televisions, but I was doing my testing with a monitor because the on-screen display gives me more detailed information about the source (exact resolution and frequency). The settings are on the Xbox itself, so the output is the same regardless of whether it is connected to a TV or a monitor.

Basically, everything is at 60 Hz except for 1080p24, which came along later. It's actually a lower frequency, but it matches how film is captured even today, so it is considered by film purists to be a superior format (for sources captured at 24p - not video games).

You're correct that a monitor can take many more signal variations, but all the Xbox 360 output settings are really for televisions (1080p, 720p, 1080i, etc., all at 60 Hz). You're also correct that 1080p24 signals are a newer feature, but that is because they are part of the Blu-ray technology evolution. The ATSC signals used in broadcast television are 1080i or 720p, both at 60 Hz, and the growth of LCDs led to the eventual 1080p60 television (since LCDs are always progressive).

So 720p/1080i (60 Hz) were part of ATSC, which led to 1080p60 on LCDs first, then plasmas. Around this time, Blu-rays began offering the full 1080p24 signal (typically output at 60 Hz), but since all the information was already encoded on the disc, the 1080p24 standard was adopted to match the source film material.
So timeline-wise: 720p/1080i (both at 60 Hz) --> 1080p60 --> 1080p24

Hope this helps!
post #41 of 72 Old 09-13-2012, 01:32 PM
Lee Stewart
Quote:
Originally Posted by Quic K Bunnie View Post

Around this time, Blu-rays began offering the full 1080p signal (typically at 60 Hz), but since all the information was already encoded on the disc, the 1080p24 standard was adopted to match the source film material.
So timeline-wise: 720p/1080i (both at 60 Hz) --> 1080p60 --> 1080p24
Hope this helps!

Blu-rays were never offered at 1080/60p. That resolution is not in the Blu-ray spec. They are either 24p or 1080i. (USA)

Look just below CODECS about halfway down:

http://en.wikipedia.org/wiki/Blu-ray_Disc
post #42 of 72 Old 09-13-2012, 06:47 PM
Mark Rejhon (AVS Special Member)
Quote:
Originally Posted by id0l View Post

30fps on LCD @ 60hz: Moving images appear to double. Slight motion blur due to LCD response time.
60fps on LCD @ 60hz: Moving images are not doubled. Smooth movement. Slight motion blur due to LCD response time.
30fps on Plasma @ 60hz: Moving images appear to double. No motion blur detected.
60fps on Plasma @ 60hz: Moving images are not doubled. Smooth movement. No motion blur detected.
Your posting is quite accurate. As the eye tracks across the screen during video game motion, the repeated frame creates a doubling effect. I also see the doubling effect during fast pans in old 35mm films at the movie theater, since old 35mm projectors flashed the projector light at 48Hz. Pans across scenery have the same 'doubling' effect.

Anyway, my reply is to say that high-end LCD displays with scanning backlights (e.g. the Elite HDTV, Sony XR 960, Samsung CMR 960) greatly reduce the motion blur of LCD, although not to the level of those plasmas. These scanning backlights are dark only 75% of the time - sufficient to reduce only about 75% of the motion blur. I've observed tests of the Xbox 360 and PS3 on these expensive $3000 HDTVs, and it's rather impressive how much of the motion blur disappears from 60fps games. However, the input lag of motion interpolation is quite annoying for games.

...On a related note...
Today, it's finally possible to have the same effect without motion interpolation - and thus no added input lag! Recent LCD panel and LED electronics developments make high-speed scanning backlights possible, allowing LCD to have less motion blur than either CRT or plasma (e.g. simulating "960Hz" or "1920Hz" via brief illumination, to reduce 90% or 95% of motion blur respectively, without needing interpolation).

The backlight stays dark while waiting for the slow LCD response (2ms-5ms), so the human eye no longer sees the limitation of LCD response; after the LCD pixel has finished refreshing, unseen in the dark, the backlight is flashed briefly (0.5ms or 1ms). That way you bypass LCD response speed as the limiting factor for motion blur, which is possible as long as the LCD response is faster than a single frame. (3D LCDs made this necessary, paving the way for high-speed scanning backlights that allow LCDs to have less motion blur than both CRT and plasma.) Persistence of vision and flicker fusion do the rest, just like on CRT displays.

See this thread on accomplishing LCD with less motion blur than CRT and plasma. Scanning backlights that are dark 90% or 95% of the time (required to make LCD better than CRT in "motion resolution" while avoiding interpolation and the input lag it adds) require large numbers of extremely bright LEDs, but LED prices have finally fallen far enough to make this feasible.
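For anyone checking those percentages, here's the duty-cycle arithmetic as a minimal Python sketch (assuming the usual first-order model where perceived blur scales linearly with per-refresh persistence):

```python
# The duty-cycle arithmetic behind figures like "dark 75% of the time ->
# ~75% less motion blur". On an impulse-driven display, perceived blur
# scales with how long each refresh stays visible (its persistence).

frame_ms = 1000 / 60  # one 60 Hz refresh, ~16.7 ms

for dark_fraction in (0.75, 0.90, 0.95):
    lit_ms = frame_ms * (1 - dark_fraction)
    print(f"dark {dark_fraction:.0%} -> lit {lit_ms:.2f} ms/refresh "
          f"-> ~{dark_fraction:.0%} blur reduction")

# dark 75% -> lit 4.17 ms/refresh -> ~75% blur reduction
# dark 90% -> lit 1.67 ms/refresh -> ~90% blur reduction
# dark 95% -> lit 0.83 ms/refresh -> ~95% blur reduction
```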

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

post #43 of 72 Old 09-14-2012, 06:12 AM
Quic K Bunnie (Newbie)
Quote:
Originally Posted by Lee Stewart View Post

Blu-rays were never offered at 1080/60p. That resolution is not in the Blu-ray spec. They are either 24p or 1080i. (USA)
Look just below CODECS about halfway down:
http://en.wikipedia.org/wiki/Blu-ray_Disc

Yes, you're correct - it was 1080p24, but the output of most early Blu-ray players was at 60 Hz, via 3:2 pulldown or similar processing. The 24 Hz output came later, as a result of the Blu-ray 1080p24 spec, once TVs were able either to do the 3:2 pulldown processing themselves or to display the 24 Hz signal natively.

I see how what I wrote was confusing. I was trying to say that Blu-rays began offering 1080p24 source material, which was the closest thing to film, and until players started actually outputting 24 Hz, 1080p60 was the signal used to transmit the information over HDMI after the player scaled it (3:2 - leading to the infamous 'judder'). Thus it was the Blu-ray source material, and the detection of judder by film purists, that led to the natural evolution of 1080p24 sent over HDMI and TVs actually displaying 24 fps. In contrast, it was 1920x1080 LCD panels (which were available before Blu-ray) that led to players using a 1080p60 signal over HDMI.
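For reference, here's a tiny illustrative sketch of the 3:2 cadence described above (the real pulldown happens in the player's output stage):

```python
# Sketch of 3:2 pulldown: each pair of 24 fps film frames becomes 3 + 2
# repeats, so 4 film frames fill exactly 10 slots of a 60 Hz signal.

film_frames = ["A", "B", "C", "D"]
output = []
for i, frame in enumerate(film_frames):
    output.extend([frame] * (3 if i % 2 == 0 else 2))
print(output)
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
# 24 fps x 10/4 = 60 Hz, but the uneven 3-2-3-2 cadence is the judder
# film purists objected to.
```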

EDIT: Turns out I was incorrect on a couple of points - I've edited both my posts to reflect this. Some of the ATSC standards were different from what I remembered, but the only ATSC specs used in broadcast television are 1080i and 720p, because the cable/broadcast/satellite ecosystem all use MPEG-2 decoders in their hardware. I actually can't remember whether the earlier Blu-ray players had 1080p24 as an option, but I do remember a PS3 update that added the feature later on, so I'm still fairly certain that 1080p24 over HDMI wasn't established/available until well after 1080p60 over HDMI.
post #44 of 72 Old 09-14-2012, 09:55 AM
Lee Stewart
Quote:
Originally Posted by Quic K Bunnie View Post

Yes, you're correct - it was 1080p24, but the output of most early Blu-ray players was at 60 Hz, via 3:2 pulldown or similar processing. The 24 Hz output came later, as a result of the Blu-ray 1080p24 spec, once TVs were able either to do the 3:2 pulldown processing themselves or to display the 24 Hz signal natively.
I see how what I wrote was confusing. I was trying to say that Blu-rays began offering 1080p24 source material that was the closest thing to film, and until they started to actually output 24 Hz, 1080p60 was the signal used to transmit information via HDMI. Thus it was the Blu-ray source material that led to the natural evolution of the 1080p24 spec being in HDMI and TVs handling 24 fps, while it was LCD technology that led to the evolution from 1080i to 1080p60.

The first HDTV that could process BD's 24p signal was not an LCD. It was Pioneer's Kuro, which processed the 24p signal at 72Hz.
post #45 of 72 Old 09-14-2012, 10:27 AM
Quic K Bunnie (Newbie)
Quote:
Originally Posted by Lee Stewart View Post

The first HDTV that could process BD's 24p signal was not an LCD. It was Pioneer's Kuro, which processed the 24p signal at 72Hz.

The Pioneer Kuro was the first display that could actually display a 24p signal, which is different from saying that other sets couldn't process one. Besides, I wasn't trying to say anything about LCDs being able to handle OR display 24p - just that LCDs are the reason 1080p60 is a common signal when all source material is either 720p, 1080i, or 1080p24.

EDIT: Ah, I see. Poor writing on my part again - I've cleaned it up. "Thus it was the Blu-ray source material, and the detection of judder by film purists, that led to the natural evolution of 1080p24 sent over HDMI and TVs actually displaying 24 fps. In contrast, it was 1920x1080 LCD panels (which were available before Blu-ray) that led to players using a 1080p60 signal over HDMI."
Quote:
Originally Posted by Mark Rejhon View Post

Your posting is quite accurate. As the eye tracks across the screen during video game motion, the repeated frame creates a doubling effect. I also see the doubling effect during fast pans in old 35mm films at the movie theater, since old 35mm projectors flashed the projector light at 48Hz. Pans across scenery have the same 'doubling' effect.
Yeah man, I dunno, I really don't feel like I see this. It's subjective, for sure, which is why I'd like someone to take some high-speed images of higher-end plasmas and get something objective.

I've played PC games on my Sony Trinitron CRT monitor for years (I only got my U2711 last year). My PC games oftentimes rendered at 30 fps, because I keep V-sync on to prevent tearing, so anytime a game dropped below 60 fps the output would be 30 fps - which happened a lot with my mid-range PC (see the sketch below for why V-sync snaps the rate like that). I really don't ever recall seeing double images - I loved how fast my CRT resolved motion compared to the terrible TN LCD panels most gamers would blabber about. I know the display is smaller, but I sat much closer too. There is a definite increase in how smooth the video looks when switching from 30 fps to 60 fps, but that is expected. I never saw any image doubling, though.
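A toy model, assuming simple double buffering (illustrative only):

```python
# Toy model of double-buffered v-sync: a frame that misses a 16.7 ms refresh
# waits for the next one, so frame rates snap to 60/n (60, 30, 20, ...).
import math

REFRESH_MS = 1000 / 60

def vsync_fps(render_ms):
    refreshes = math.ceil(render_ms / REFRESH_MS)  # whole refreshes consumed
    return 1000 / (refreshes * REFRESH_MS)

for render_ms in (10, 17, 25, 33, 40):
    print(f"render time {render_ms:2d} ms -> displayed at {vsync_fps(render_ms):.0f} fps")

# render time 10 ms -> displayed at 60 fps
# render time 17 ms -> displayed at 30 fps
# render time 25 ms -> displayed at 30 fps
# render time 33 ms -> displayed at 30 fps
# render time 40 ms -> displayed at 20 fps
```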
Quote:
Anyway, my reply is to say that high-end LCD displays with scanning backlights (e.g. the Elite HDTV, Sony XR 960, Samsung CMR 960) greatly reduce the motion blur of LCD, although not to the level of those plasmas. These scanning backlights are dark only 75% of the time - sufficient to reduce only about 75% of the motion blur. I've observed tests of the Xbox 360 and PS3 on these expensive $3000 HDTVs, and it's rather impressive how much of the motion blur disappears from 60fps games. However, the input lag of motion interpolation is quite annoying for games.

I still find it more likely that the reduction of motion blur in 60 fps games is due to the Xbox/PS3 output having video-processed frame interlacing in 30 fps games - but obviously not in 60 fps games. I think they are trying to mask a slower cadence by inserting merged frames between each game-rendered frame - a much easier job that the GPU can do on the fly without impacting performance, compared to trying to render 60 fps. This should appear as a flickering double image on high-end plasmas and as substantial motion blur on most LCDs, possibly with some double image on better LCDs. 60 fps games will not show any of these artifacts on a solid plasma, and LCD will have what appears to be reduced motion blur.
Quote:
...On a related note...
Today, it's finally possible to have the same effect without motion interpolation - and thus no added input lag! Recent LCD panel and LED electronics developments make high-speed scanning backlights possible, allowing LCD to have less motion blur than either CRT or plasma (e.g. simulating "960Hz" or "1920Hz" via brief illumination, to reduce 90% or 95% of motion blur respectively, without needing interpolation).
The backlight stays dark while waiting for the slow LCD response (2ms-5ms), so the human eye no longer sees the limitation of LCD response; after the LCD pixel has finished refreshing, unseen in the dark, the backlight is flashed briefly (0.5ms or 1ms). That way you bypass LCD response speed as the limiting factor for motion blur, which is possible as long as the LCD response is faster than a single frame. (3D LCDs made this necessary, paving the way for high-speed scanning backlights that allow LCDs to have less motion blur than both CRT and plasma.) Persistence of vision and flicker fusion do the rest, just like on CRT displays.
See this thread on accomplishing LCD with less motion blur than CRT and plasma. Scanning backlights that are dark 90% or 95% of the time (required to make LCD better than CRT in "motion resolution" while avoiding interpolation and the input lag it adds) require large numbers of extremely bright LEDs, but LED prices have finally fallen far enough to make this feasible.

I know scanning backlights can do some good things, but I'm wondering whether this tech will catch on before or after OLED. With greater dark times, you need even more LEDs than current "full array" sets, and the peak brightness needs to be higher. The advantages of all this are supposedly built into OLED.
Either way, good things should be coming around.
post #46 of 72 Old 09-21-2012, 12:50 PM
Mark Rejhon (AVS Special Member)
Quote:
Originally Posted by Quic K Bunnie View Post

I know scanning backlights can do some good things, but I'm wondering whether this tech will catch on before or after OLED. With greater dark times, you need even more LEDs than current "full array" sets, and the peak brightness needs to be higher. The advantages of all this are supposedly built into OLED.
Either way, good things should be coming around.
Yes, that's true.

However, it's actually easy enough to do with factory-produced LED reels that have recently become cheap. A typical 24" computer monitor has approximately 20 watts of LEDs. One can put 200 watts (20 average) of LEDs behind a 24" panel for only $160 today, with common LED ribbons - $11 each off eBay, or $40 each for high-quality 6500K, 80+ CRI ribbon suitable for backlight use. I only need four reels for 200 watts' worth of 6500K. The economies of scale afforded by LED ribbons make it easy to install 800 watts (surge only; 80 watts average) of LEDs behind a 47" LCD, especially if the reels fall further in cost. I'm going to be purchasing several of the better reels for my homebrew scanning-backlight mod project for a 24" monitor, driven by an Arduino (a 24" monitor only needs 200 watts' worth (20 average) for 90% motion blur reduction via short strobes). These 8-millimeter-wide ribbons can be cut at designated intervals, and almost 40 rows of ribbon can be squeezed behind a 24" LCD.
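The peak-versus-average arithmetic, spelled out as a quick sketch (the 10% duty figure corresponds to the 90% dark time mentioned above):

```python
# The surge-vs-average arithmetic above, spelled out. A backlight that is lit
# for only 10% of each refresh needs ~10x the instantaneous output to keep
# the same average light output.

average_watts = 20        # typical 24" monitor backlight average
lit_fraction = 0.10       # strobe duty cycle (dark 90% of the time)

surge_watts = average_watts / lit_fraction
print(f"{surge_watts:.0f} W peak needed for {average_watts} W average "
      f"at {lit_fraction:.0%} duty")
# 200 W peak needed for 20 W average at 10% duty
```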

Also, OLED will not necessarily solve motion blur, because the OLEDs are not bright enough to be strobed very briefly. The first-generation OLEDs have more motion blur than plasma. It will be a long time before OLED has CRT-perfect motion, because pixel response is not the cause of the motion blur; the cause is the continuous hold illumination of the pixel while the eyes are moving. Even if pixels can turn on and off instantaneously, there's still motion blur on OLED, because OLED pixels are illuminated longer than the phosphor decay of a CRT. You need to strobe the OLED pixels briefly, for one short pulse per refresh (not multiple strobes spread over a refresh), to gain the "perfect blur-free motion" effect that CRTs are able to achieve. The advantage of OLED is that it can be strobed fast, but OLED is not as bright as CRT phosphors are (during the 1 millisecond illumination), which would be required for perfectly sharp motion on OLED. So manufacturers aren't going to give you CRT-perfect motion on OLED displays. Putting enough power into an OLED array is also a concern; some OLEDs even have to dim the whole screen (like some plasmas do) if too many bright OLED pixels are lit. I anticipate that OLED will not provide CRT-perfect motion this decade. Maybe in the next decade.

Thanks,
Mark Rejhon
www.BlurBusters.com

post #47 of 72 Old 09-22-2012, 12:25 PM
Quic K Bunnie (Newbie)
Quote:
Originally Posted by Mark Rejhon View Post

Also, OLED will not necessarily solve motion blur, because the OLEDs are not bright enough to be strobed very briefly. The first-generation OLEDs have more motion blur than plasma. It will be a long time before OLED has CRT-perfect motion, because pixel response is not the cause of the motion blur; the cause is the continuous hold illumination of the pixel while the eyes are moving. Even if pixels can turn on and off instantaneously, there's still motion blur on OLED, because OLED pixels are illuminated longer than the phosphor decay of a CRT. You need to strobe the OLED pixels briefly, for one short pulse per refresh (not multiple strobes spread over a refresh), to gain the "perfect blur-free motion" effect that CRTs are able to achieve. The advantage of OLED is that it can be strobed fast, but OLED is not as bright as CRT phosphors are (during the 1 millisecond illumination), which would be required for perfectly sharp motion on OLED. So manufacturers aren't going to give you CRT-perfect motion on OLED displays. Putting enough power into an OLED array is also a concern; some OLEDs even have to dim the whole screen (like some plasmas do) if too many bright OLED pixels are lit. I anticipate that OLED will not provide CRT-perfect motion this decade. Maybe in the next decade.

I'm sorry, but there are some fundamental inaccuracies in your post. Pixel response is the cause of blur. Despite LCDs quoting grey-to-grey pixel transitions of less than 2 ms, high-speed digital photography clearly shows that on some LCDs, pixels cannot complete a transition even in 1/30 sec (33 ms), showing up to 3 frames simultaneously. Scanning the backlight can improve the situation by illuminating less of the transition, but unless pixels can properly complete full transitions within the window of time the scanning backlight provides - say 16 ms for a 60 Hz scanning backlight - LCDs simply won't be able to fully remove ghosting/trailing/motion blur. Some LCDs have been getting at this, such as the BenQ MVA panels, which have fantastic 3000:1 native contrast ratios and very good pixel response times, showing basically no motion problems, but suffering from poor color reproduction and mediocre viewing angles (better than TN).
This phenomenon is well documented by Raymond Soneira and AnandTech - feel free to research their findings yourself.
Only one OLED TV has ever been on the market - a ridiculously overpriced Sony 11" TV. The Korean 50" OLED TVs are not on the market yet, and won't be for a while at a reasonable price point, but they have already been noted to have virtually no motion blur. With no manufacturer specs listing response times, I find that a very popular number quoted on the Internet is around 0.01 milliseconds. However, laboratories publishing OLED properties have quoted response times of around 500 ns (0.0005 ms) - I would guess that, similar to LCD pixels, a full transition takes longer than an optimal transition, so the labs are publishing optimal transitions and the number floating around the internet is a full transition - but I won't bother to hunt this down, as the numbers are a moot point. Anywhere in this range of pixel response, the human eye is quite incapable of perceiving the blur associated with the transition.
These results are backed up even by handheld devices that currently use OLED displays. Compare the original PSP, which was known for having just awful motion blur (worse than any LCD should have), to the PS Vita. The OLED display in the PS Vita has no motion blur - absolutely undetectable - when playing the same games as the original PSP.

Anyway, this thread is now completely derailed; I really only posted to see what people thought of my theory that the Xbox/PS3/Wii consoles were interlacing frames in games that run at 30 fps. Feel free to carry on the discussion, but I feel my points have been made and I rest my case. Take care everyone!
post #48 of 72 Old 09-22-2012, 06:22 PM
Mark Rejhon (AVS Special Member)
First, let me humbly apologize to the OP for bringing this thread off-track. However, I do want to set the record straight on something.
Quote:
Originally Posted by Quic K Bunnie View Post

I'm sorry, but there are some fundamental inaccuracies in your post. Pixel response is the cause of blur. Despite LCDs quoting grey-to-grey pixel transitions of less than 2 ms, high-speed digital photography clearly shows that on some LCDs, pixels cannot complete a transition even in 1/30 sec (33 ms), showing up to 3 frames simultaneously. Scanning the backlight can improve the situation by illuminating less of the transition
1. LCD pixel response speed used to be the main cause of blur, back when LCD pixel response exceeded half the length of a frame. But 2ms is less than 20% of an LCD refresh cycle at 60Hz. The REMAINING motion blur (14ms out of 1/60 sec) is CAUSED by eye-tracking motion smearing the already-refreshed LCD image across your retinas while you track moving objects.
True, confirmed academic fact: on today's modern LCDs, most motion blur is now caused by eye motion while tracking a moving object on the LCD.

2. See the academic citations - this academic paper explains the difference between LCD pixel response and eye-tracking blur, and shows that they can actually be independent variables. See the diagram on page 3, which shows how it is possible to 'bust' the LCD pixel-refresh barrier, because many LCD panels today already finish refreshing the frame well before the end of their frame-refresh cycle. See more citations - I can provide more if desired.

EXAMPLE: How an ultra-short-strobe scanning backlight can bust the LCD pixel response barrier (for the human eye)
Example of one 16.666ms refresh at 60Hz (1/60 = ~16.666ms)
T+0ms = LCD monitor begins refreshing pixel (unseen in the dark)
T+2.4ms = Average LCD pixel response (Samsung SyncMaster SA950 example) (unseen in the dark)
T+2.4ms = Approximate start of pixel ripple/bounce (e.g. response-time-compensation error recovery) (unseen in the dark)
T+15ms = Slowest grey-to-grey transitions are finished (unseen in the dark)
T+15.5ms = Strobe the backlight very brightly for only 1ms or 0.5ms (say, 1/960 or 1/1920 second) (seen by human eye)
T+16.666ms = Next LCD monitor pixel refresh begins (unseen in the dark) (and the cycle repeats)
Voilà - Pixel response no longer the motion-blur barrier

For this example, your eye only sees a 0.5ms strobe of the finished LCD refresh. Your eyes DO NOT see the slow 2.4+ms LCD pixel response.
Persistence of vision and flicker fusion, just as with CRT, see it as a solid image despite the flicker. Given a sufficiently high refresh rate (72Hz, 85Hz, even 120Hz), the flicker is not noticed by most people, except the flicker-sensitive ones. The example applies both to scanning backlights (a single strobe per refresh per backlight segment) and to full black-frame insertion (a single full-backlight strobe for the whole screen) - the effect is equivalent. However, many LCDs take a finite time to begin refreshing at the top and then gradually refresh one pixel row at a time toward the bottom, so scanning the backlight works best with that kind of LCD controller behavior.
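As a quick consistency check, the example timeline can be written out in a few lines of Python (all timings are the illustrative figures above, and the blur estimate is a simple duty-cycle bound, so treat it as approximate):

```python
# The example refresh timeline above, as a quick consistency check. All the
# timings are the post's illustrative figures, not measurements of any panel.

refresh_ms = 1000 / 60        # ~16.67 ms per 60 Hz refresh
settle_done_ms = 15.0         # slowest grey-to-grey transitions finished
strobe_start_ms = 15.5        # backlight flashed after pixels settle
strobe_ms = 1000 / 1920       # ~0.52 ms flash ("1/1920 second")

assert strobe_start_ms >= settle_done_ms          # eye never sees pixels mid-transition
assert strobe_start_ms + strobe_ms <= refresh_ms  # flash ends before next refresh

visible_fraction = strobe_ms / refresh_ms
print(f"eye sees {strobe_ms:.2f} ms of each {refresh_ms:.2f} ms refresh "
      f"({visible_fraction:.0%}) -> ~{1 - visible_fraction:.0%} less tracking blur")
# eye sees 0.52 ms of each 16.67 ms refresh (3%) -> ~97% less tracking blur
```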

Scanning backlights using ultra-short strobes (which require 10-20x brighter flashes to compensate for the long dark periods) are not yet on the market, but vision tests and academic papers prove they are possible, and a few professionals in the display industry have agreed, given the right backlight and sufficiently bright flashes (at least as bright as a CRT phosphor - phosphors are really bright during that short period before decay). Also, today's LCDs have to finish refreshing pixels well before the end of the refresh in order to be compatible with 3D; otherwise the images bleed into the other eye. This permits the design of ultra-short strobes with a duration far shorter than the LCD refresh - you simply strobe at a different point in the timeline of a single refresh cycle.
With a high-speed camera, an LCD with an ultrashort-strobe scanning backlight (90%:10% dark:bright) looks just like a scanning CRT, as in this YouTube video.

Thus, LCD pixel response is no longer the primary cause of motion blur on today's LCDs. Most of the motion blur is caused by eye motion tracking a moving object across the LCD. Each frame is static for a whopping 16.6ms (at 60Hz) on a continuously-lit LCD, and your eyes are in motion during this period. I can give you some more citations (click), and find even more if you'd like.

About the PS Vita:
Note: I'm talking about full-size displays. The PS Vita is wonderful; especially when dimmed, it has very short OLED pixel pulses, and that eliminates quite a lot of motion blur. Its pixels are not as fast as CRT phosphor decay, but at small sizes you don't need CRT speed, because eye-tracking blur is very small on small displays - there's not much distance for the eyes to track. So motion blur on handheld OLED is generally a non-issue; it's a really very good OLED display. But I was originally talking about full-size displays, which are a lot more challenging, as they need to be much brighter than a portable display, thus requiring longer OLED strobes and creating more opportunity for eye-tracking-based motion blur.

About OLED:
You're right that it's possible they've solved the problem after years of delays and trying with OLED. I'm not saying it can't be done with OLEDs, but the early OLEDs had a lot of problems when run at high brightness. It's possible they have solved the short-pixel-strobe problem, so on *that* part I *can* be proven wrong. Also, it takes very specific material to see motion blur at the extremes: (1) framerate equal to refresh rate; (2) fast pans over a wide field of view (big displays); and (3) individual frames without blur baked in (e.g. 3D games without GPU motion-blur effects, or video shot with a fast shutter speed). Many people don't know how to test for the 'limits' of motion blur. However, my comment stands: on modern LCD displays, most motion blur is caused by eye tracking, not by pixel response.

So let's get this thread back on topic. Further discussion of LCD pixel response no longer being the motion-blur barrier can move to this other AVS Forum thread. Again, my apologies to the OP!

Thanks,
Mark Rejhon
www.BlurBusters.com

post #49 of 72 Old 10-21-2012, 04:59 PM
bombaman (Newbie)
I am still confused now.

So are LCDs better for 30fps games than plasmas?

Are plasmas still suffering from judder, or has this issue been solved?

The only downside, as far as I can tell, is the motion blur + input lag of an LCD in comparison to a plasma.
post #50 of 72 Old 10-22-2012, 05:19 PM
bombaman (Newbie)
No one has any insight?

To make things clear, I am not talking about double images or motion blur. I mean that plasmas tend not to have the smooth movement that LCDs show with 30 fps games.

However, since better plasmas have come out, I am curious whether this problem still exists.
post #51 of 72 Old 10-23-2012, 04:50 PM
Chronoptimist (AVS Special Member)
This is not an LCD or plasma issue at all. The issue is that games do not have the natural blurring that filming content with a camera produces, so 30fps is not enough for smooth motion. Some games implement motion-blur effects to smooth the appearance of motion, but in my opinion it's still not enough for motion to be perceived as fluid.

Some people claim that the higher amount of motion blur that LCD exhibits vs plasma can smooth this over, but this is not the case in my experience, and most LCDs have reduced motion blur to the point where it's not going to make a noticeable difference. I would not choose a display technology on the hope that one may show 30fps more smoothly.

LCDs with interpolation can help smooth things out, but interpolation generally adds several frames of delay, which makes it unsuitable for gaming.


The only real fix is to build a PC that is capable of playing games at 60+fps. (A GTX 660 or higher should be able to run anything at 1080p60 with ease.)
In some cases, and for some people, even 60fps is not enough, which is why 120Hz monitors exist. Unfortunately I don't know of any high-end TVs that accept 120Hz inputs, though there are some 720p120 DLP projectors available.


That's all there is to it really, as it's not the display technology that's at fault. If you're into game consoles, the only solution is either to avoid 30fps games (which is most of them) or to put up with it. Even next generation, I expect games will be 1080p30, and few if any will be 1080p60.
post #52 of 72 Old 10-23-2012, 04:56 PM
Mark Rejhon (AVS Special Member)
bombaman,
Unfortunately, it's related.
(For the effects below, assume that motion interpolation is completely disabled. Interpolation generally should not be used for interactive content such as games and computers.)

From the science of the human vision system, motion blur and double images are actually closely related to motion smoothness. Double images occur with 30fps at 60Hz on impulse-driven displays such as CRT and plasma (or an LCD with a scanning backlight, e.g. the Elite LCD HDTV); these have less motion blur, but the double-image effect reads as "judder" (non-smooth motion), which some people hate.

When running 30fps on a 60Hz sample-and-hold display such as an LCD, the sample-and-hold nature masks the frame repeat, so you never see a double-image effect on these displays. The "judder" effect disappears, and 30fps motion looks smoother, even though there's somewhat more motion blur (caused by eye-tracking motion on a sample-and-hold display). The sketch below puts rough numbers on the two presentations.
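A minimal sketch, with an assumed pan speed (only the comparison between the two numbers matters):

```python
# Rough numbers for the two presentations of the same 30 fps content at 60 Hz.
# Pan speed is an assumed value; only the comparison matters.

pan_px_per_s = 1200
refresh_s = 1 / 60   # spacing between the two flashes of one content frame
content_s = 1 / 30   # how long one content frame is on screen in total

# Impulse display (plasma/CRT): two brief flashes per content frame,
# one refresh apart, so the tracking eye sees a crisp double image:
print(f"impulse: double image, ~{pan_px_per_s * refresh_s:.0f} px apart")

# Sample-and-hold display (LCD): the frame is lit for the full 33 ms,
# so eye tracking smears it into a single wide trail instead:
print(f"hold:    one smear,    ~{pan_px_per_s * content_s:.0f} px wide")

# impulse: double image, ~20 px apart
# hold:    one smear,    ~40 px wide
```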

Pick your poison. Bring a console to the store and test it out; find out which effect you prefer for 30fps.

Thanks,
Mark Rejhon
www.BlurBusters.com

post #53 of 72 Old 10-23-2012, 05:10 PM
Chronoptimist (AVS Special Member)
Quote:
Originally Posted by Mark Rejhon View Post

When running 30fps on a 60Hz sample-and-hold display such as an LCD, the sample-and-hold nature masks the frame repeat, so you never see a double-image effect on these displays. The "judder" effect disappears, and 30fps motion looks smoother, even though there's somewhat more motion blur (caused by eye-tracking motion on a sample-and-hold display).
Pick your poison. Bring a console to the store and test it out; find out which effect you prefer for 30fps.
I see this whether I have backlight scanning enabled or not on my LCD (and saw it on LCDs long before scanning backlights were a thing).

It's simply that 30fps is not a high enough framerate for smooth motion in games.
post #54 of 72 Old 10-23-2012, 10:33 PM
Mark Rejhon (AVS Special Member)
Quote:
Originally Posted by Chronoptimist View Post

I see this whether I have backlight scanning enabled or not on my LCD (and saw it on LCDs long before scanning backlights were a thing).
Yes, I agree. Most scanning backlights don't do enough motion blur reduction. You will see the image doubling (if you enable a scanning mode WHILE disabling interpolation), as I mentioned, but some motion blur remains.
Quote:
It's simply that 30fps is not a high enough framerate for smooth motion in games.
Agreed. :-)

....

A note: today's scanning backlights don't reduce motion blur enough, but hopefully that will change in the future. There are technical ways to reduce motion blur by 95% on an LCD (though it requires something like a 240-watt LED backlight in the space of a 24" monitor for the ultra-short strobes), something manufacturers have not done yet. Even the best scanning backlights from Samsung (Clear Motion Rate 960) and Sony (Motionflow XR 960), as well as the Elite LCD HDTV, only use a 75%:25% dark:bright scanning-backlight configuration at their most aggressive (flickery, but maximum motion blur reduction) setting. So you'll still see quite a bit of motion blur with those; the motion blur trails are reduced by only up to approximately 75%, and they don't 'disappear' the way they do when playing on a CRT at 60fps.

I'm aiming to create a prototype (for computer monitors/games) that reduces motion blur by 95% (see the FAQ about how this will be done). I have sourced components for building a 240-watt backlight into a 24-inch computer monitor for less than $120 in electronic components at 6500K, 80 CRI (BlurBusters Blog - scroll down to the entry about LED ribbons). The wattage is needed for the unprecedented super-short strobes (100W/sqft strobed, for ~5-10W/sqft actual average power consumption). Right now this is targeted at computer monitors, due to the PC's ability to hit 60fps and above.

Thanks,
Mark Rejhon
www.BlurBusters.com

post #55 of 72 Old 10-25-2012, 05:35 AM
bombaman (Newbie)
Quote:
Originally Posted by Mark Rejhon View Post

bombaman,
Unfortunately, it's related.
(For the effects below, assume that motion interpolation is completely disabled. Interpolation generally should not be used for interactive content such as games and computers.)
From the science of the human vision system, motion blur and double images are actually closely related to motion smoothness. Double images occur with 30fps at 60Hz on impulse-driven displays such as CRT and plasma (or an LCD with a scanning backlight, e.g. the Elite LCD HDTV); these have less motion blur, but the double-image effect reads as "judder" (non-smooth motion), which some people hate.
When running 30fps on a 60Hz sample-and-hold display such as an LCD, the sample-and-hold nature masks the frame repeat, so you never see a double-image effect on these displays. The "judder" effect disappears, and 30fps motion looks smoother, even though there's somewhat more motion blur (caused by eye-tracking motion on a sample-and-hold display).
Pick your poison. Bring a console to the store and test it out; find out which effect you prefer for 30fps.

Thank you for your detailed reply.
Unfortunately, I am not able to find anyone who owns a plasma, and the TV store is not letting me test...

I have a couple of questions.

1: Is the double-image problem of plasmas as bad as it is presented in this video? http://www.youtube.com/watch?v=X113cl9K_Oo&feature=related

2: I am using a 40" Sony Bravia HX753 for gaming. There is an impulse mode which seems to copy the way a plasma works.
If I use it, I can see double images as well. As far as I can tell, I would prefer the double images over the LCD blur, since they are more accurate.
However, my double images are not as bad as in the video I posted above. For my tests I used FF XIII-2. Is that a game-specific problem?

3: Sony has the interpolation mode Motionflow "smooth". This increases the apparent framerate to nearly 60fps or even more. I also notice zero motion blur or double images. There are, however, some weird picture problems when I move the camera. Why is that?
post #56 of 72 Old 10-25-2012, 06:09 AM
Chronoptimist (AVS Special Member)
Quote:
Originally Posted by bombaman View Post

1: Is the double-image problem of plasmas as bad as it is presented in this video? http://www.youtube.com/watch?v=X113cl9K_Oo&feature=related
Yes, though a big contributor to that is "phosphor lag", which can also affect 60fps content.
Quote:
Originally Posted by bombaman View Post

2: I am using a 40" Sony Bravia HX753 for gaming. There is an impulse mode which seems to copy the way a plasma works.
If I use it, I can see double images as well. As far as I can tell, I would prefer the double images over the LCD blur, since they are more accurate.
Impulse mode uses backlight scanning to almost eliminate LCD motion blur. There should be less of a double image with this than with a plasma, which exhibits "phosphor lag" in addition to low motion blur. Certain games will make the problem more obvious than others.
Quote:
Originally Posted by bombaman View Post

3: Sony has got the interpolation mode Motionflow "smooth". This increases FPS nearly to 60fps or even more. Also i am noticing zero motionblur or double images. There is however some weird Picture problems when i am moving the camera. Why is there such a problem?
All other MotionFlow options use interpolation to "guess" intermediate frames and smooth things out, creating the "60fps" look. Because they try to minimize delay (I think Sony has a processing delay of around 4-5 frames with interpolation), it's rarely perfect. The algorithms tend to work better with video content than with games. Actually running the game at 60fps (which you can do on a PC) looks even smoother and avoids these interpolation artefacts.
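A toy version of what the interpolation is doing (a naive sketch; real MotionFlow estimates motion vectors rather than simply blending, and those motion estimates are exactly where the artefacts come from):
Code:

# Toy sketch of frame interpolation: synthesize a frame halfway between
# two real frames. Real TV processors estimate per-block motion vectors;
# this naive blend only shows the idea (and why occlusions cause artefacts).

def interpolate_midframe(frame_a, frame_b):
    """Average two frames pixel by pixel (frames as 2D lists of grey levels)."""
    return [[(a + b) / 2.0 for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

# 30fps input -> 60fps output: real, guessed, real, guessed, ...
frames_30fps = [[[0, 0]], [[10, 20]], [[20, 40]]]   # tiny dummy "frames"
frames_60fps = []
for prev, nxt in zip(frames_30fps, frames_30fps[1:]):
    frames_60fps.append(prev)                             # real frame
    frames_60fps.append(interpolate_midframe(prev, nxt))  # guessed frame
frames_60fps.append(frames_30fps[-1])

# Note that each guessed frame needs the *next* real frame to exist first,
# which is why interpolation always adds input lag.
print(len(frames_30fps), "->", len(frames_60fps), "frames")   # 3 -> 5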
Chronoptimist is offline  
post #57 of 72 Old 10-25-2012, 12:07 PM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,125
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 3 Post(s)
Liked: 102
Quote:
Originally Posted by bombaman View Post

Thank you for your detailed reply.
Unfortunately I am not able to find anyone who owns a plasma, and the TV store is not letting me...
I have a couple of questions.
1: Is the double image problem of plasmas as bad as it is presented in this video? http://www.youtube.com/watch?v=X113cl9K_Oo&feature=related
2: I am using a 40" Sony Bravia HX753 for gaming. There is an impulse mode which seems to mimic the way a plasma works.
When I use it, I can see double images as well. As far as I can tell, I would prefer the double images over the LCD blur, since the motion is more accurate.
However, my double images are not as bad as in the video I posted above. For testing I used FF XIII-2. Is that a game-specific problem?
3: Sony has the interpolation mode Motionflow "Smooth". This increases the frame rate to nearly 60fps or even more. Also, I notice zero motion blur or double images. There are, however, some weird picture problems when I move the camera. Why is that?
Chronoptimist's explanations are correct.

The impulse mode on Sony displays is a scanning backlight mode. With it enabled (and interpolation disabled), a double-image effect for 30fps content now occurs on LCD too. Scanning/impulse backlights really make 60fps video games shine on LCD panels, provided the set is designed with low input lag. In this case, you generally want to seek out 60fps games.

For Motionflow Smooth, the interpolation causes artifact issues (and lag) in video games. Generally, I consider it unsuitable for twitch-action games, but for turn-based and other non-timing-critical games it may be a lesser evil for some people than low-framerate artifacts. Since interpolation requires both the previous and the next (few) frames, there is a mandatory input lag: the frames must be buffered, and that latency cannot be completely eliminated, which hurts its suitability for gaming.
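Rough numbers on that mandatory lag (a quick sketch; the 4-5 frame figure is Chronoptimist's estimate for Sony's processing, quoted above):
Code:

# Sketch: interpolation needs the *next* frame(s) before it can display
# the current one, so frame buffering sets a hard floor on input lag.

REFRESH_HZ = 60.0
frame_time_ms = 1000.0 / REFRESH_HZ   # ~16.7 ms per frame at 60Hz

for buffered_frames in (1, 4, 5):     # 4-5 frames is the estimate above
    lag_ms = buffered_frames * frame_time_ms
    print(f"{buffered_frames} buffered frame(s) -> at least {lag_ms:.0f} ms added lag")
# 1 -> 17 ms, 4 -> 67 ms, 5 -> 83 ms, on top of the panel's own lag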

Another alternative ("Plan B") is to connect a PC to the HDTV. If you can afford an LCD with a scanning backlight, consider a high-end gaming HTPC with a GPU capable of 60fps, or run a long HDMI cable from your computer room and use a low-latency wireless keyboard/mouse/gamepad. Buy the PC version of the same console game and forget about 30fps gaming. Your display choices then get more flexible.

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #58 of 72 Old 10-26-2012, 07:37 PM
Member
 
Mastperf's Avatar
 
Join Date: Feb 2010
Posts: 163
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 11
Quote:
Originally Posted by Mark Rejhon View Post

About PS Vita:
Note: I'm talking about full-size displays. The PS Vita is wonderful; especially when dimmed, it has very short OLED pixel pulses, which eliminates quite a lot of motion blur. Its pixels are not as fast as CRT phosphor decay, but at small sizes you don't need CRT speed, because eye-tracking blur is very small on small displays -- not much distance for the eyes to track. So motion blur on handheld OLED is generally a non-issue -- it's a really very good OLED display. But I was originally talking about full-size displays, which are a lot more challenging: they need to be much brighter than a portable display, thus requiring longer OLED strobes, creating more opportunity for eye-tracking motion blur.
I've compared Rayman Origins on Vita (60fps) with Rayman running on Xbox 360 (1080p @ 60fps), and there's more noticeable blur on the Vita than on my Panasonic ST30. I think your assessment of 1st-gen OLED motion blur is right on the mark. The Vita is my only experience with OLED, so I'm not sure how other small panels in other devices compare.
Mastperf is offline  
post #59 of 72 Old 10-26-2012, 08:18 PM
AVS Special Member
 
Mark Rejhon's Avatar
 
Join Date: Feb 1999
Location: North America
Posts: 8,125
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 3 Post(s)
Liked: 102
Quote:
Originally Posted by Mastperf View Post

I've compared Rayman Origins on Vita (60fps) with Rayman running on Xbox 360 (1080p @ 60fps), and there's more noticeable blur on the Vita than on my Panasonic ST30. I think your assessment of 1st-gen OLED motion blur is right on the mark. The Vita is my only experience with OLED, so I'm not sure how other small panels in other devices compare.
Yes, that's correct. 1st-gen OLEDs are sample-and-hold, while later-gen OLEDs are more impulse-driven. You need very bright OLED pixels for shorter impulses (see how bright the impulses are on a CRT display in high-speed video). OLED needs to strobe at ultra-high brightness to eliminate motion blur as well as a CRT does, without a dim picture.
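The brightness trade-off in rough numbers (a sketch; the luminance target is an assumed example):
Code:

# Sketch: to keep the same average picture brightness while flashing each
# pixel for only a fraction of the refresh, the flash itself has to be
# proportionally brighter (like a CRT phosphor spike).

TARGET_AVG_NITS = 200.0   # assumed target average luminance
REFRESH_HZ = 60.0
refresh_ms = 1000.0 / REFRESH_HZ

for strobe_ms in (refresh_ms, 4.0, 1.0):   # lit time per refresh
    duty = strobe_ms / refresh_ms          # fraction of the refresh lit
    peak_nits = TARGET_AVG_NITS / duty     # required instantaneous brightness
    print(f"{strobe_ms:5.1f} ms strobe -> {peak_nits:,.0f} nits peak")
# 16.7 ms (sample-and-hold) -> 200 nits; 1 ms strobe -> ~3,333 nits peak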

Fortunately, many (not all!) OLED displays use PWM dimming, sometimes at just one PWM flash per refresh -- that conveniently provides the impulse nature of a CRT display (or an equivalent of black frame insertion / a scanning mechanism). This is not applicable to all OLED displays, but lowering the brightness will often improve the motion resolution of an OLED because of shorter PWM impulses, especially if it strobes only once per refresh. On some cell phones, the OLED is equivalent to sample-and-hold at maximum brightness, but increasingly resembles an impulse-driven display at dimmed brightness settings due to PWM dimming (ideally one strobe per pixel per refresh, for the "CRT effect").
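And why dimming can sharpen motion on such a panel (a sketch; this only applies to panels that strobe once per refresh as described above, and the duty cycles are assumed):
Code:

# Sketch: on a PWM-dimmed OLED that strobes once per refresh, lowering
# the brightness setting shortens the lit portion of each refresh, which
# directly shortens eye-tracking motion blur.

REFRESH_HZ = 60.0
SPEED_PX_PER_SEC = 960.0   # assumed panning speed

for brightness in (1.00, 0.50, 0.25):     # brightness setting == PWM duty cycle
    lit_time_s = brightness / REFRESH_HZ  # lit time per refresh
    blur_px = SPEED_PX_PER_SEC * lit_time_s
    print(f"{brightness:.0%} brightness -> ~{blur_px:.0f} px of motion blur")
# 100% -> 16 px (sample-and-hold), 50% -> 8 px, 25% -> 4 px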

Thanks,
Mark Rejhon

www.BlurBusters.com

BlurBusters Blog -- Eliminating Motion Blur by 90%+ on LCD for games and computers

Rooting for upcoming low-persistence rolling-scan OLEDs too!

Mark Rejhon is offline  
post #60 of 72 Old 11-02-2012, 12:37 AM
AVS Special Member
 
Joe Bloggs's Avatar
 
Join Date: Apr 2007
Location: UK
Posts: 2,603
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 175 Post(s)
Liked: 60
Quote:
Originally Posted by Mark Rejhon View Post

Yes, that's correct. 1st-gen OLEDs are sample-and-hold, while later-gen OLEDs are more impulse-driven. You need very bright OLED pixels for shorter impulses (see how bright the impulses are on a CRT display in high-speed video). OLED needs to strobe at ultra-high brightness to eliminate motion blur as well as a CRT does, without a dim picture.
It's a shame. Just when James Cameron and Peter Jackson are trying to eliminate strobing, manufacturers start adding it to OLED, before big OLED TVs are in a reasonable price range. Hopefully there will still be an option to turn the strobing off.
Joe Bloggs is offline  