
·
Registered
Joined
·
229 Posts
Discussion Starter · #1 ·

As I understand it, Auto Motion Plus is supposed to help LCD/LED displays minimize blur in fast action sequences. Samsung TVs let you control blur reduction and judder reduction separately. I believe blur reduction is meant for 30 or 60 fps content and judder reduction for 24 fps content. So on my Samsung LED, I left blur reduction at its default (5) and turned judder reduction off (0). 

 

But I am having a problem when watching regular TV from my cable box. For instance, when I'm watching an NCAA game on ABC or whatever, sometimes I see a lot of unsteadiness or judder on the screen, especially after they show a replay. But when I switch Auto Motion Plus from Custom (Blur 5, Judder 0) to the Standard setting, I don't see any judder during the game or after the replays.

 

Why is this happening? I thought all broadcast content was at least 30 fps unless you are watching a movie or something. Is it because the replays are different from the regular footage of the game? Is that what causes the judder, or is Auto Motion Plus lagging?

 

So for regular TV viewing, should I leave it on Standard? What about for game consoles? And for movies from DVD/BD, do I turn it off?
 

·
Registered
Joined
·
229 Posts
Discussion Starter · #4 ·

Quote:
Originally Posted by airscapes  /t/1523682/auto-motion-plus#post_24513606


Have all the processing crap turned off on my Samsung. If it wasn't in the original signal, I don't want to see it.
I would turn everything off on plasmas, but on LCD/LED I think the technology should be turned on to minimize blur for, say, sporting events. LCD/LED are sample-and-hold technologies, and these features insert frames in between the actual frames to try to smooth out things that would otherwise look blurry. 
 

·
Registered
Joined
·
229 Posts
Discussion Starter · #5 ·

Quote:
Originally Posted by xvfx  /t/1523682/auto-motion-plus#post_24514230


ZKACAL, how much blur do you experience?
Since blur reduction is at its default, I wouldn't say it's blur. It looks more laggy and juddery, for lack of a better term. I'm thinking of leaving it on Standard for regular cable viewing but turning it off for movies to prevent the soap opera effect. Not sure about console gaming (probably off to decrease input lag?).
 

·
Registered
Joined
·
6,335 Posts

Quote:
Originally Posted by ZKACAL  /t/1523682/auto-motion-plus/0_100#post_24513536



Why is this happening? I thought all broadcast content was at least 30 fps unless you are watching a movie or something. Is it because the replays are different from the regular footage of the game? Is that what causes the judder, or is Auto Motion Plus lagging?

Just because the manufacturer puts a feature on a TV does not mean it works. If it looks better off, then the option should be off. Ask Samsung why the option does not work as you think it should, or as they say it does. The reality of life is that TVs have options for marketing reasons, not because they work well.
 

·
Banned
Joined
·
3,587 Posts
Blur - this is an inherent limitation of LCD and LCoS display technology. LC pixels cannot change state fast enough to reproduce motion without adding blur. So if the camera is NOT moving, objects that are moving within the picture will be blurred compared to what they look like when they are NOT moving. For example, a basketball moving across the court might look like a brown blur - part of that is because the video is being captured at 30 frames per second with fairly slow shutter speeds (indoor lighting), but the LCD display adds more blur. In fact, you can MEASURE the blur caused by LCDs, and most of the time the resolution of a 1080p image drops to about 300 lines on an LCD display operating at 60 Hz. LCD manufacturers have increased refresh rates to 120 Hz and even 240 Hz in an effort to mask the blur inherent in LCD displays. In addition, they are even using black frame insertion to "hide" blur, but that makes images darker when you turn it on. Blur affects EVERYTHING ALL THE TIME. If the camera pans across the court, EVERYTHING in the image will be blurred... except, perhaps, a player or ball that happens to be moving in sync with the camera.


Judder - judder happens when there is a mismatch between the frames per second used to capture the original program and the frames per second the video display is using to show those images. Movies for cinema are almost always captured and displayed at 24 frames per second. That applies to film and digital cinema. There is some discussion of increasing the frame rate to 48 frames per second or more. Video is captured at 30 frames per second.


TV programming can be 24p or 30p depending on how it was captured. Most TV studio work like news, game shows, talk shows, etc. is going to be 30p, but comedies or dramas could be 24p or 30p. What you do NOT want to do, EVER, is somehow force a source to send 24p when the source was shot at 30p. That causes VERY noticeable judder. Showing 24p original material at 30p requires inserting 6 duplicate frames per second. That causes a visible-but-not-horrible judder.
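To make that concrete, here is a rough sketch in Python (an illustration only, not anything the TV actually runs) of the frame-level version of that cadence, commonly called 3:2 pulldown, when 24p is padded out to a 60 Hz display: frames alternate between being held for 3 refreshes and 2 refreshes, and that uneven on-screen time is the judder being described. Real broadcast pulldown is usually done on interlaced fields, but the uneven cadence is the same idea.

```python
def three_two_pulldown(frames):
    """Repeat source frames in a 3, 2, 3, 2, ... cadence (24p -> 60 Hz)."""
    out = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2
        out.extend([frame] * repeats)
    return out

film = ["A", "B", "C", "D"]          # four 24p frames = 1/6 second of film
print(three_two_pulldown(film))      # ['A','A','A','B','B','C','C','C','D','D']
# "A" stays on screen for 3/60 s but "B" only 2/60 s, so motion advances
# unevenly -- that unevenness is the judder.
```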


ABC and FOX and their affiliates broadcast at 720p (over the air or from cable or satellite providers). All other networks broadcast at 1080i.


Presumably your display is compatible with 24p movies from Blu-ray or from a cable, satellite, or over-the-air channel... if so, the TV will duplicate frames to display the movie at 72p or 96p or 120p or 240p... at 240p, each of the 24 frames is flashed 10 times. At a 120 Hz refresh rate, each frame would be flashed 5 times (each one staying on the screen longer than on the 240 Hz display). If the display is 60 Hz only, the TV will add 3:2 pulldown to convert the 24p original to 30p, then flash each frame 2 times to get to 60 Hz. We are pretty much past those 60 Hz displays these days, though low-end TVs may still support only 60 Hz.
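Here is the same arithmetic as a quick sketch (again, just an illustration): a 24p source is only displayed cleanly when the panel's refresh rate is an even multiple of 24; anything else forces a pulldown cadence.

```python
def repeats_for_24p(refresh_hz):
    """How a 24p source maps onto a given panel refresh rate."""
    if refresh_hz % 24 == 0:
        return f"each frame flashed {refresh_hz // 24} times (even cadence)"
    return "not an even multiple of 24 -> needs 3:2 pulldown (judder)"

for hz in (60, 72, 96, 120, 240):
    print(f"{hz} Hz: {repeats_for_24p(hz)}")
# 240 Hz -> 10 flashes per frame, 120 Hz -> 5, 96 Hz -> 4, 72 Hz -> 3,
# but 60 Hz -> 3:2 pulldown.
```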


You want the options affecting frame rate to be set to Auto so that 24p sources are displayed at an even multiple of 24, and so that 30p sources (much TV, plus live music concert recordings and such) are displayed at an even multiple of 30. If you pick a "fixed" frame rate instead of using Auto, you will have times when you have bad judder problems and other times when images look great. "Auto" mode allows, say, a disc player to "talk" to your TV to find out its capabilities, so it sends the best frame rate and resolution your TV can handle. There is also often a "Film Mode" that can cause all kinds of trouble by trying to "force" everything to either 24p or 30p (or some even multiple). The only REAL use for Film Mode these days is DVD playback, where the 30p encoded on the DVD can be "fixed" by removing the duplicated movie frames, bringing the DVD back to 24p without judder. If you don't watch DVDs at all, or not very often, setting Film Mode to OFF is generally the best choice.
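As a toy illustration of what that DVD "fix" amounts to (a simplification of my own; real inverse telecine works on interlaced fields and has to detect the cadence first), Film Mode essentially drops the duplicated frames to recover the original 24p sequence:

```python
def inverse_telecine(displayed):
    """Collapse consecutive duplicate frames back to the unique originals."""
    recovered = []
    for frame in displayed:
        if not recovered or frame != recovered[-1]:
            recovered.append(frame)
    return recovered

padded = ["A", "A", "A", "B", "B", "C", "C", "C", "D", "D"]  # 3:2 pulldown output
print(inverse_telecine(padded))   # ['A', 'B', 'C', 'D'] -- back to a clean 24p cadence
```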


So judder happens DURING motion... if you were showing a still image, you'd never know you were in a mode that was going to cause judder. But as soon as there is visible motion, judder can come into play. You can see how duplicating frames can cause a sense of uneven motion if some frames are duplicated while others are NOT duplicated.


Blur is also only apparent when the image contains motion, but blur is ENTIRELY different from judder... it's a technology limitation, not an artificially induced motion issue like judder usually is. Blur is helped by higher refresh rates like 120 Hz or 240 Hz. Judder is sort of a "stop motion" problem caused by the duplicated frames (or that rare case where GOOD frames are removed when they should NOT be removed and motion jumps forward unevenly).


When you find the right settings for your sources and TV, you shouldn't ever see motion judder. Blur... it's always going to be there to some extent. But you can experiment with the various options the TV offers to minimize blur... those are typically higher refresh rates, black frame insertion (every manufacturer calls this something different, but the ultimate result is that images get darker when it's turned on), and frame interpolation. Frame interpolation actually ADDS FRAMES that never existed in the original, and moves objects to new locations within the newly created frame. So if Frame 1 has a car at the left edge of the frame and Frame 2 has the same car at the right edge of the frame, the new frame, call it 1.5, will have the car in the middle of the frame... and the more intense a setting you use for frame interpolation, the sharper the car will be in frame 1.5. In fact, frame interpolation can create a MUCH sharper car in frame 1.5 than the car that existed in frame 1 or frame 2.


This tends to make movies at 24p look more like video shot at 30p (or 30i), leading people to call that effect the "soap opera effect" because most soap operas are shot with video cameras and look sharper (motion-wise) than they would if shot at 24p/24 fps. People are going to have to let go of the slow 24-frame movie frame rate, because it is inevitable that movies will abandon 24 fps in the not-too-distant future, and we are going to get sharper motion in our video systems at home once higher frame rate source material becomes common and we have a new video system to support all the new capabilities (like UHD).
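A toy sketch of the car example described above (my own illustration, not Samsung's actual algorithm, which estimates motion per block of pixels): the set guesses that motion between the two real frames was smooth and places the object partway between its two captured positions.

```python
def interpolate_position(pos_a, pos_b, t=0.5):
    """Linearly estimate where an object sits between two captured frames."""
    return pos_a + (pos_b - pos_a) * t

car_frame1 = 0.0   # car at the left edge (normalized screen position)
car_frame2 = 1.0   # car at the right edge in the next captured frame
print(interpolate_position(car_frame1, car_frame2))   # 0.5 -> the synthesized "frame 1.5"
# The new frame never existed in the source; a more aggressive setting
# simply trusts and sharpens these synthesized frames more.
```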


I haven't seen a TV with the controls you describe in the original post, but they are going to do some version of what is described above. Your "homework" is to understand blur and judder as independent topics, then read the TV manual again to see if it makes any more sense... and to experiment with settings after researching a bit to find the ones that work best.


There is absolutely NOTHING WRONG with using frame interpolation, black frame insertion, higher refresh rates, or anything else the manufacturer may offer --- IF those options make the images you see more enjoyable for you.
 

·
Registered
Joined
·
1,372 Posts
What refresh rate is 1080i broadcast at in America? Am I right in reading that it's 60 Hz? I'm also wondering why British broadcasts seem to use 50 Hz.


It's a weird feeling after watching Blu-ray in 24p for hours and then coming back to 50 Hz broadcast. It almost feels like Motion Plus was on.
Quote:
Originally Posted by ZKACAL  /t/1523682/auto-motion-plus#post_24514701


Since blur reduction is at its default, I wouldn't say it's blur. It looks more laggy and juddery, for lack of a better term. I'm thinking of leaving it on Standard for regular cable viewing but turning it off for movies to prevent the soap opera effect. Not sure about console gaming (probably off to decrease input lag?).

I never use Motion Plus because as you said, the soap opera effect feels nasty with films. Even with Blur Reduction between 5 - 10 and Judder 0. It feels like a realistic dream. It's so false.


From one of Doug's posts over the years (found via Google), you should be on Auto1 Film Mode for 1080i and never have those gimmicks on for gaming. Ever.
 

·
Registered
Joined
·
8,864 Posts
"Blur" on modern LCDs used to watch broadcast/satellite/cable TV sources is *not* the fault of the set (at least as long as all the extra marketing "features" are turned off.)


"LCD Blur" is nothing more than a function of the mpeg encoders "hitting the wall" due to a lack of digital bandwidth availability.


You'd have to have a "pristine" 60p source (usually generated by a computer) to be able to actually *see* LCD blur occurring.


FOX/ABC/ESPN 720p sports ain't gonna do it. MPEG encoding is ultimately still the limiting factor. See: "Swimming Grass." (NFL broadcasts.)


For 24p sources like Blu-ray, LCD blur is a complete non-factor. (This doesn't mean that you won't see effects from the original slow shutter speed.)


Here's a test. If you think you are seeing "LCD Blur," hit pause on your DVR and/or BD player. If the "frozen frame" is completely free of *any* blur, then your LCD is to blame; otherwise, one way or another, the blur is baked into the source material. Simple.



PS: Plasmas actually *do* dither by design ... and you can spot that happening pretty quickly. Pick your poisons, then be happy.



PPS: I'm not an LCD Blur denier, I'm just saying it isn't really relevant compared to the typical digital video source based blur.
 

·
Registered
Joined
·
229 Posts
Discussion Starter · #10 ·

I'm getting the impression that most TV enthusiasts turn off Auto Motion Plus and Film Mode and consider them more or less marketing gimmicks. I think my Samsung has a 240 Hz refresh rate, so the blur isn't as significant. And I believe that if I were to turn off any sort of processing, the refresh rate would stay at 240 Hz. Is that correct?

 

My Sharp has a Fine Motion Advanced control where you can choose 240 Hz or any lower rate, as well as a Film Mode.

 

My Sony does not have a motion control, only a film mode.

 

My Panasonic Plasma has 3:2 pulldown and 24p direct in. 
 

·
Registered
Joined
·
12,660 Posts

Quote:
Originally Posted by HDTVChallenged  /t/1523682/auto-motion-plus#post_24519232


"Blur" on modern LCDs used to watch broadcast/satellite/cable TV sources is *not* the fault of the set (at least as long as all the extra marketing "features" are turned off.)


"LCD Blur" is nothing more than a function of the mpeg encoders "hitting the wall" due to a lack of digital bandwidth availability.


You'd have to have a "pristine" 60p source (usually generated by a computer) to be able to actually *see* LCD blur occurring.


FOX/ABC/ESPN 720p sports ain't gonna do it. MPEG encoding is ultimately still the limiting factor. See: "Swimming Grass." (NFL broadcasts.)


For 24p sources like Blu-ray, LCD blur is a complete non-factor. (This doesn't mean that you won't see effects from the original slow shutter speed.)


Here's a test. If you think you are seeing "LCD Blur," hit pause on your DVR and/or BD player. If the "frozen frame" is completely free of *any* blur, then your LCD is to blame; otherwise, one way or another, the blur is baked into the source material. Simple.



PS: Plasmas actually *do* dither by design ... and you can spot that happening pretty quickly. Pick your poisons, then be happy.



PPS: I'm not an LCD Blur denier, I'm just saying it isn't really relevant compared to the typical digital video source based blur.
Actually, LCDs blur more with all sources than plasma... just because a source includes blur doesn't mean LCDs don't add more blur on top of it.


Just put a plasma side by side with a 60 Hz LCD and feed them the same source. The LCD will be blurrier regardless of the source.
 

·
Banned
Joined
·
3,587 Posts

Quote:
Originally Posted by xvfx  /t/1523682/auto-motion-plus#post_24517458


What refresh rate is 1080i broadcast at in America? Am I right in reading that it's 60 Hz? I'm also wondering why British broadcasts seem to use 50 Hz.


It's a weird feeling after watching Blu-ray in 24p for hours and then coming back to 50 Hz broadcast. It almost feels like Motion Plus was on.

I never use Motion Plus because as you said, the soap opera effect feels nasty with films. Even with Blur Reduction between 5 - 10 and Judder 0. It feels like a realistic dream. It's so false.


From one of Doug's posts over the years (found via Google), you should be on Auto1 Film Mode for 1080i and never have those gimmicks on for gaming. Ever.

The HDTV standard is the same around the world for HD resolutions.


Broadcast HD (also carried on cable and satellite services) is typically 1080i30. The TV repeats each frame 2 times for a 60 Hz refresh rate.


The problem is, when they show a 24p source over HDTV, it usually has 3:2 pulldown added to get the frame rate back to 30. If the program was 24 fps and converted to PAL at 50 Hz, then shown on HDTV at 1080i... I'm not sure WHAT you get on an HDTV when you view the program, but it seems possible that you could have motion problems.


And if the program being shown is "legacy" PAL material with 576 resolution at 50 Hz upconverted to HDTV... you could EASILY have motion judder issues if the TV you purchased doesn't have a mode that's an even multiple of 50.
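A quick way to see why (my own arithmetic, not from any manual): none of the usual North American panel refresh rates divides evenly by 50, so 50 Hz material has to have frames repeated unevenly.

```python
# Check which common refresh rates are even multiples of 50
# (for PAL-derived 50 Hz / 25 fps material).
for panel_hz in (60, 120, 240, 100, 200):
    if panel_hz % 50 == 0:
        print(f"{panel_hz} Hz: even multiple of 50 -> smooth")
    else:
        print(f"{panel_hz} Hz: NOT a multiple of 50 -> uneven frame repeats (judder)")
# 100 and 200 Hz modes (common on European sets) handle 50 Hz cleanly;
# 60/120/240 Hz panels do not.
```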
 

·
Banned
Joined
·
3,587 Posts

Quote:
Originally Posted by ZKACAL  /t/1523682/auto-motion-plus#post_24519472


I'm getting the impression that most TV enthusiasts turn off Auto Motion Plus and Film Mode and consider them more or less marketing gimmicks. I think my Samsung has a 240 Hz refresh rate, so the blur isn't as significant. And I believe that if I were to turn off any sort of processing, the refresh rate would stay at 240 Hz. Is that correct?


My Sharp has a Fine Motion Advanced control where you can choose 240 Hz or any lower rate, as well as a Film Mode.


My Sony does not have a motion control, only a film mode.


My Panasonic Plasma has 3:2 pulldown and 24p direct in. 

That's not the right impression. What you get online is people who HATE HATE HATE something (soap opera effect is one of those things) and take every possible opportunity to make sure the world knows how evil that darn frame interpolation is.


Frankly, frame interpolation makes movies look like they will look once we have new digital cinema standards with higher frame rates and a new home video standard that goes beyond HDTV and may include, at least as an option, higher frame rates than the 24 or 30 fps we are using now (with each frame doubled or tripled, etc. to get higher refresh rates). If the TV will refresh at 240 Hz, why not have 240 frames per second (other than the hideously high data transfer rates required to send that much data that fast!)? Frankly, 24 frames per second pretty much guarantees blurred motion unless you shoot with a very high shutter speed, like 1/250th of a second, which doesn't happen often. As you increase frame rates, you MUST reduce shutter-open times (or frame capture times for digital cameras), and that will automatically reduce the captured blur in images, so that everything captured at higher rates will "look like video" whether it is on film or on a digital cinema camera. And once you get significantly faster than 30 fps, images begin to look more convincingly real.


Those who whine and complain about the "soap opera effect" are like people who rebelled against seat belts, padded dashboards, and stereo (mono is perfect, why do we need stereo?). We are, in not too many more years, going to have movie and video standards that produce MUCH better quality motion than we could EVER get from 24 or 30 fps, and it won't look anything like what we are suffering with today. Movies aren't made at 24 fps because it is the best possible frame rate. They are made at 24 fps because back in the early days of cinema, they did testing on the public, and 24 fps was the slowest frame rate the public would identify as acceptable. The slower the frame rate, the less film you use in the camera and the less film it takes to make a print to release to theaters. So people who rail against the "SOE" are essentially railing against higher quality motion. That said, movies shot at 48 fps or 60 fps don't have what I'd call the soap opera effect exactly... they look BETTER than that, but they do NOT look like 24 fps movies either. I see frame interpolation modes as alternate ways to view content... not as something evil to be avoided at all costs. A movie shot at 24 fps and shown with interpolated frames to get it up to 48 fps won't look as good as the same movie SHOT at 48 fps, mostly because the original 24 fps frames are going to have more blur in them while all frames in the 48 fps version will be clean and sharp (compared to 24 fps).
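Some rough numbers behind the shutter-speed point above (assuming a common 180-degree shutter, which is my assumption, not something stated in the post): doubling the capture frame rate halves the exposure per frame, so each frame has less motion blur baked in.

```python
def exposure_180_degree(fps):
    """Exposure per frame when the shutter is open for half of each frame period."""
    return 1.0 / (2 * fps)

for fps in (24, 30, 48, 60):
    exp = exposure_180_degree(fps)
    print(f"{fps} fps -> ~1/{round(1 / exp)} s per frame")
# 24 fps -> ~1/48 s, 48 fps -> ~1/96 s, 60 fps -> ~1/120 s: higher frame
# rates capture crisper individual frames, which is part of why they "look like video."
```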


The statement about your Panasonic having 24p input capability and 3:2 pulldown is contradictory. 3:2 pulldown is added to 24 frame per second material so that it runs at 30 frames per second (with 6 duplicated frames per second). If the TV has 24p input capability you would NOT use 3:2 pulldown. You would send 24p to the TV and the TV would convert that to some even multiple of 24 for display... like 48, 72, 96, 120 or 240. Duplicating EVERY frame the same number of times has nothing to do with 3:2 pulldown.


LCD blur at 240 Hz is still significant. You may THINK you can't see it, but it's still there. As stated earlier, images in motion on an LCD screen drop from 1080 lines of resolution down to about 300 lines... and fast refresh rates don't really stop that. All you need to do is drag a resolution target around the screen of a computer connected to your video display to see the effects of LCD blur on motion... it's seriously bad at 60 Hz, less obvious at higher refresh rates, but still not obviously better than 300 lines. To get legitimately higher resolution during motion on an LCD display, you have to enable frame interpolation at its highest (and sharpest) setting. In that case, the interpolated frames will be very sharp indeed, and that will create less of an obvious loss of resolution during motion.


Plasma tech has plenty of problems too. And everybody who thought OLED was going to be problem-free... HA! They are in for a big surprise... like pixels illuminated at 100% can't maintain that level of light output very long before the pixel dims to protect itself from heat damage. And there's the fact that anything organic begins decaying right after it comes into existence... nobody is saying how long we can expect OLED displays to look good before the organic (i.e. carbon-based) LEDs begin to drift as they decay.


There is no perfect or ideal video display tech, and there may never be one. DLP has perhaps the fewest compromises, but today the only way to get DLP is in a projector, and that's not practical in a room that gets a lot of light during the day.
 

·
Registered
Joined
·
8,864 Posts

Quote:
Originally Posted by PlasmaPZ80U  /t/1523682/auto-motion-plus#post_24519902


Actually, LCDs blur more with all sources than plasma... just because a source includes blur doesn't mean LCDs don't add more blur on top of it.


Just put a plasma side by side with a 60 Hz LCD and feed them the same source. The LCD will be blurrier regardless of the source.

Well, I actually still have a few CRTs in the house to compare to. If the source looks crappy on the LK450, it looks equally crappy on the CRT(s).



Seriously, even the folks who actually track this stuff and run blur test after test ad nauseam will admit that they have to use specially designed *computer*-generated (and never compressed) patterns running at least 60p to be able to detect "LCD blur." We had this very discussion last year ... do we really need to rehash it again?



Broadcast (and satellite/cable) TV never contains a full 60p worth of actual motion. BluRays are almost exclusively 24p. So the logical conclusion would be that the conditions necessary to see "LCD blur" will never be met with "normal" HD digital video sources. OTOH, computer and console games are entirely another story.


Hint: Apparently, one can get away with charging (a lot) more money for a 120 Hz or 240 Hz LCD with the kung-fu grip upgrades than for a lowly "blurry" 60 Hz display.
And so it goes ...
 

·
Registered
Joined
·
8,864 Posts
PS: One example of why I dislike "frame-interpolation," particularly on older movies.


1) Frame interpolation is based on the assumption that any motion is "constant" between frames A and B.


2) In the "olden days," humans actually pushed and pulled cameras around by hand, sometimes on a dolly/track, sometimes on booms/cranes.


3) It's very hard for a human to maintain a constant rate of motion while pushing said cameras about. This inevitably leads to wobbly camera tracking ... which is actually mitigated slightly by slower frame rates.


4) Frame interpolation:

a) can't really tell what the camera motion between frame A and B was supposed to be,

b) just makes the wobbliness worse,

c) which often results in motion sickness.


PPS: I didn't care for the NYPD Blue so-called "steady"-cam look either.
I don't want to even think about what that would look like with FI.



I'm not against higher native frame rates. I just don't see the point of trying to interpolate 24p into 120p. And while we're at it, 480i/p still looks better at its native resolution on an SD CRT than it does on a 1080p plasma.
 

·
Registered
Joined
·
8,864 Posts

Quote:
Originally Posted by xvfx  /t/1523682/auto-motion-plus#post_24521895


So many didn't like it in The Hobbit...

I've never had the chance to see the 48 fps version. OTOH, I did see the first movie in a commercial 24p DLP theater and marveled at how many sequences melted into a mess of blur and blocks (particularly most of the Goblin cave "romp"). Clearly there were no LCDs around to blame there.
 

·
Registered
Joined
·
8,391 Posts
Blur is a natural phenomenon - wave your hand in front of your face - so I don't see a problem with some minor blur on LCD tech.



I was surprised to see blur in fast motion on the LG Gallery OLED in a store. According to Samsung, OLED motion is not natural, which is why they had to use sample and hold. On both Samsung and LG OLEDs, the advice is to use the motion interpolation options (Samsung also has a black frame insertion alternative), which I find objectionable. Without motion enhancements, the Samsung's and LG's motion resolution is 300 lines. Motion problems are mentioned in most OLED reviews. Plasma is gone and "LCD 2" is on the rise.
 

·
Banned
Joined
·
3,587 Posts

Quote:
Originally Posted by HDTVChallenged  /t/1523682/auto-motion-plus#post_24523162


I've never had the chance to see the 48 fps version. OTOH, I did see the first movie in a commercial 24p DLP theater and marveled at how many sequences melted into a mess of blur and blocks (particularly most of the Goblin cave "romp"). Clearly there were no LCDs around to blame there.

All early digital projection sucked. The tech wasn't really "there" yet. I saw Star Wars 1 in a fairly-rare-at-the-time digital presentation at the Metreon theaters in San Francisco, and the curves of the A and W in the "Star Wars" crawl at the beginning of the movie were LOADED with stair-step aliasing because the resolution was WAY WAY WAY too low for the screen size and typical viewing distances in that theater. Today you can go to the same theater and see a digital presentation without obvious aliasing artifacts.
 