240hz, 120hz, 60hz Effect with Cable TV - AVS Forum
post #1 of 38 Old 02-23-2009, 12:25 PM - Thread Starter
jim946 (Newbie, joined Feb 2009)
I don't understand some things.

I purchased the Sony 52" W series with 120Hz. From what I'm hearing, my Comcast HD cable only transmits at 60Hz, but I can get 120Hz from my Samsung Blu-ray player. I have a 120Hz 1080p cable on the Blu-ray but only a 60Hz 1080p cable on the Comcast HD box.

I still get "jittery" motion on screen. I have set the Sony to the "High" setting under options. Why it doesn't ship that way I don't understand.

I think I have everything set correctly but still get some "jittery motion" - less on Blu-ray.

From reading reviews I can see that even 120Hz sets still have some "jittery motion" issues. Reviewers state that's why plasma still has the advantage on "motion", as it uses "gas" instead of "liquid" and runs at 240Hz.

My question... if Comcast only sends a 60Hz signal, and if there are only 120Hz cables available for even the faster Blu-ray, how can a 240Hz plasma or LCD show less "jittery motion" than the 120Hz LCD? Am I missing something?

Should I have a 120Hz cable on my Comcast digital cable box? How would that improve the "jitters" if they only transmit at 60Hz?

I just bought the TV, so I can still return it. I purchased the LCD over the plasma because of long-term reliability concerns that I read about; LCD seems to be the growing platform as well. The 240Hz Sony just came out, so I'm wondering if I should bring my 120Hz set back and get the 240Hz, but I still don't understand how it would improve my HD cable reception if Comcast is transmitting 1080i at 60Hz.

Any comments would be greatly appreciated.

Jim
post #2 of 38 Old 02-23-2009, 12:52 PM
Gary McCoy (AVS Special Member, San Jose, California)
Most 60Hz sources will not benefit from the judder reduction of the 120Hz or 240Hz sets. The benefit only comes with two sources:

1) Blu-ray and HD DVD players where the output has been specifically set on the player to 24Hz. The 120Hz set displays each frame 5 times, and motion is smoothed. However, if you leave your player set for 60Hz (the default setting for all of them), there is no improvement. Not all players will output 24Hz.

2) Broadcast video source material shot at 30fps or 60fps will be smooth on 60Hz, 120Hz, and 240Hz sets.

The motion smoothness comes about due to the frame rate of the display. This is independent of the setting on the Sony, which is for frame interpolation. I happen to like frame interpolation and use it on my Samsung. It improves the image clarity of moving images - but the High setting can introduce motion artifacts.
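Gary's two cases can be sketched numerically (a toy illustration in Python, not anything from an actual TV's firmware):

```python
# Sketch (hypothetical numbers): how many consecutive refreshes each 24 fps
# film frame occupies on a 60 Hz vs a 120 Hz display. 60/24 is not an
# integer, so the hold alternates between 2 and 3 refreshes (telecine
# pulldown) and motion judders; 120/24 = 5 exactly, so every frame is
# held for the same length of time.

def repeat_counts(film_fps, refresh_hz, n_frames):
    """Refreshes per film frame, found by slotting refresh ticks
    between successive film-frame boundaries."""
    return [int((i + 1) * refresh_hz / film_fps) - int(i * refresh_hz / film_fps)
            for i in range(n_frames)]

print(repeat_counts(24, 60, 4))   # -> [2, 3, 2, 3]  uneven: judder
print(repeat_counts(24, 120, 4))  # -> [5, 5, 5, 5]  even: smooth
```

The same function shows why 30fps and 60fps video is fine on all three sets: 60, 120, and 240 are all integer multiples of those rates, so every frame gets an equal hold.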

Gary McCoy
The United States Constitution ©1791. All Rights Reserved.

post #3 of 38 Old 02-23-2009, 01:57 PM - Thread Starter
jim946 (Newbie)
Still have my question... will a plasma have less jittery motion on my Comcast 60Hz signal than my 120Hz LCD?
post #4 of 38 Old 02-23-2009, 02:12 PM
Zivman (Advanced Member, MPLS/St Paul)
Quote:
Originally Posted by jim946 View Post

Still have my question... will a plasma have less jittery motion on my Comcast 60Hz signal than my 120Hz LCD?

You need to understand what LCD and 120Hz are all about... it's a marketing gimmick used to cover up LCD's inability to handle motion.
post #5 of 38 Old 02-23-2009, 02:28 PM
VarmintCong (AVS Special Member)
Forget about 120Hz for a moment.

My Samsung 650 has problems with Comcast's HD cable - I think it's a deinterlacing problem - Comcast sends your TV everything in 1080i, even if it's a 720p broadcast.

That means the cable box is interlacing 720p and upscaling to 1080i, then your TV is deinterlacing that 1080i to 1080p.

Somewhere along the way it gets screwed up, and the picture becomes jerky. I watch 24 on Comcast HD and it's jerky; watch the DVD and it's smooth.

The solution for me is to use motion interpolation for HD cable, that smooths it out, and it's fairly artifact free if the channel is high def.
post #6 of 38 Old 02-23-2009, 03:05 PM
DBLASS (Advanced Member)
120Hz = technological fake-out. It creates what isn't there. Therefore, it is never quite right.

Unless the source is outputting 120Hz (and most don't), and the display can take 4 pixels in parallel (and most can't), the rest is a bit of "smoke and mirrors".
post #7 of 38 Old 02-23-2009, 03:05 PM
HogPilot (AVS Club Gold)
There are a lot of "Hz" specs and claims out there. Don't waste money on cables that claim to be "120Hz" compatible just because you have a display capable of refreshing at 120Hz - it's just marketing hyperbole. Any HDMI 1.3 compatible cable can support the higher bandwidth made available by the HDMI 1.3 spec. That bandwidth can be used to carry expanded gamuts, higher bit-depth color, resolutions higher than 1920x1080, or a frame rate higher than 60Hz.

What's important to know is that there are currently NO sources recorded at greater than 60Hz, and to my knowledge no widely available commercial displays that will even accept a 120Hz signal. BD players will output a 24Hz signal, and a display will either show it at some integer multiple of that frame rate (Pioneer plasmas use 72Hz and JVC's RS-series projectors use 96Hz, for example) or use 3:2 pulldown to display it at 60Hz.
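The integer-multiple point is easy to check for the refresh rates mentioned in this thread (a one-line sketch, nothing vendor-specific):

```python
# A refresh rate can show 24 fps film with every frame held for the same
# number of refreshes (i.e. without pulldown judder) only if it is an
# integer multiple of 24. Of the rates discussed here, 60 Hz is the odd
# one out.

def judder_free(refresh_hz, source_fps=24):
    return refresh_hz % source_fps == 0

for hz in (60, 72, 96, 120, 240):
    print(hz, "even multiple of 24" if judder_free(hz) else "needs 3:2 pulldown")
```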

LCD touts a 120Hz refresh rate because of inherent differences in LCD technology which make it more vulnerable to display-induced motion blur (as opposed to motion blur that results from slow frame-rate recordings like film). Plasma doesn't have this issue because, in simple terms, its pixels can "switch" on and off much faster than older 60Hz LCDs (CRT is even faster).

To directly answer your question, you will see no difference between using the so-called "120Hz" cable with your Comcast box vs a "60Hz" cable because the Comcast box is limited to a 60Hz output.

There are 10 types of people: those who understand binary, and those who don't.

post #8 of 38 Old 02-23-2009, 03:38 PM
soundwatts (Member)
Quote:
Originally Posted by HogPilot View Post

There's a lot of "Hz" specs and claims out there. Don't even waste money on cables that claim to be "120Hz" compatible... Any HDMI 1.3 compatible cable can support the higher bandwidth made available by the HDMI 1.3 spec. ...

Are you saying HDMI 1.3 cables are different from regular HDMI cables?

I thought this was a scam. What would the advantages be?
post #9 of 38 Old 02-23-2009, 05:29 PM
borf (AVS Special Member)
Quote:
Originally Posted by jim946 View Post

I have a 120Hz 1080p cable on the Blu-ray but only a 60Hz 1080p cable on the Comcast HD.

Don't let them sucker you into buying a 120Hz cable - there are no 120Hz sources! A 120Hz TV converts standard sources from a Blu-ray player or cable box to 120/240Hz. This happens inside the TV, after the signal has been transmitted, so the bandwidth to the TV is the same as it has always been. Someone sure thought up a nice shtick with these 120Hz cables, though.

Quote:
Originally Posted by jim946 View Post

I still get "jittery" motion on screen. I have set the Sony to the "High" setting under options.

If 120Hz is working you should see much less judder with 24p material (barely any with 60Hz material, though, since 60p doesn't judder on a 60Hz TV). Game mode turns the motion processing off. If you're referring to occasional hiccups - that's normal, unfortunately.

Quote:
Originally Posted by jim946 View Post

Plasma still has the advantage on "motion" as it uses "gas" instead of "liquid" and runs at 240Hz.

what!

Quote:
Originally Posted by jim946 View Post

My question... if Comcast only sends a 60Hz signal, and there are only 120Hz cables available for even the faster Blu-ray, how can a 240Hz plasma or LCD show less jittery motion than the 120Hz LCD? Am I missing something?

The TV's motion processing does it.


Quote:
Originally Posted by jim946 View Post

Should I have a 120Hz cable on my Comcast digital cable? How will that improve the jitters if they only transmit at 60Hz?

It won't. The TV does it.
post #10 of 38 Old 02-23-2009, 06:15 PM
HogPilot (AVS Club Gold)
Quote:
Originally Posted by soundwatts View Post

Are you saying the HDMI 1.3 cables are different than regular HDMI cables?

I thought this was a scam. What would the advantages be?

The HDMI 1.3 spec doubled the bandwidth available for various possible future uses (greater-than-HD resolutions, greater color bit depth, larger color gamuts, frame rates greater than 60fps, etc). To my knowledge, there are no widely available commercial products that use any of that bandwidth. Some cables may already be capable of supporting it; many probably aren't. You certainly don't NEED "HDMI 1.3" cables to take advantage of what 1.3 currently offers - namely passing bitstreamed codecs instead of just PCM - but keep in mind that older cables may or may not have the bandwidth to support features like Deep Color and xvYCC down the road.

For the time being, it doesn't matter.

There are 10 types of people: those who understand binary, and those who don't.

post #11 of 38 Old 02-23-2009, 08:58 PM
Gary McCoy (AVS Special Member)
Read what I said. A little more detail:

Motion judder is introduced by the technique used to synchronize 24 frame-per-second film with a 60Hz display, called "telecine". The judder comes about because a 60Hz display shows one film frame for three video frames, the next for two, the next for three, and so forth. The judder results from the uneven lengths of time each original film frame is seen on screen. I don't care whether that display is an LCD, a plasma panel, or a CRT - if it refreshes at 60Hz, it has motion judder.

A 120Hz LCD, a 120Hz CRT, or a 72Hz plasma display can suppress this judder and restore the original film motion by displaying each film frame for an equal amount of time.

===> No 60Hz display offers smooth motion with film source material. None.

Frame interpolation is entirely separate, although AFAIK it is only implemented on LCD displays. Frame interpolation produces an image clarity that allows the 120Hz LCD to exceed the clarity of the film original. It does this by synthesizing a new video frame in between the original source frames. If you have ever noticed that live soap operas on TV look more real than film movies, you have noticed this effect - it is also called SOE, the Soap Opera Effect.
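A crude way to picture "synthesizing a new video frame in between original source frames" (a naive blend for illustration only - real sets use motion-compensated interpolation, which is far more sophisticated):

```python
# Naive frame interpolation: the in-between frame is a weighted average
# of its two neighbours. Frames here are just flat lists of pixel
# intensities; t=0.5 lands halfway between the originals.

def interpolate(frame_a, frame_b, t=0.5):
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

f0 = [0, 10, 20]   # original frame N
f1 = [10, 30, 20]  # original frame N+1
print(interpolate(f0, f1))  # -> [5.0, 20.0, 20.0]
```

A plain blend like this would actually look like a cross-fade; the "clarity" effect comes from the motion-compensated version, which shifts moving objects to their estimated in-between positions.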

Frame interpolation on a "High" setting can also introduce motion artifacts - double edges, moiré patterns, etc. - on moving images. The SOE is also maximized by frame interpolation on "High" - but artifacts are introduced.

I am one who believes that a few hard-to-see artifacts are a small price to pay for the additional image clarity of frame interpolation. I also don't have a philosophical issue with an image that offers improved clarity over the film original because some algorithmic processing was done in an image processing chip.

Feel free to retain your own opinion of frame interpolation; to each his own. But the motion-smoothing benefits of 120Hz displays have long been known from CRTs, well before plasmas or LCDs were available at consumer prices.

==> You can't get this motion smoothing on a 60Hz LCD, a 60Hz plasma, or a 60Hz CRT. It is specifically a benefit of a display that refreshes at 72Hz, 96Hz, 120Hz, or 240Hz - any integer multiple of 24fps. If that same refresh rate is also an integer multiple of the 30Hz and 60Hz video frame rates, the display also offers smooth motion on video source material.

Separate 120Hz from frame interpolation; understand that they are different techniques with different benefits.

Gary McCoy
The United States Constitution ©1791. All Rights Reserved.

post #12 of 38 Old 02-24-2009, 03:09 PM
QZ1 (AVS Special Member, S.E. PA)
Quote:
Originally Posted by VarmintCong View Post

My Samsung 650 has problems with Comcast's HD cable - I think it's a deinterlacing problem - Comcast sends your TV everything in 1080i, even if it's a 720p broadcast.
That means the cable box is interlacing 720p and upscaling to 1080i, then your TV is deinterlacing that 1080i to 1080p.

No, they don't. Comcast sends channels at the resolution at which they were received; it is the box that converts them to a certain resolution. One can set the box to output everything (or just HD channels) at either 1080i or 720p. There is also an optional 4:3 Override, which allows SD to be output at a different resolution, either 480i or 480p.
post #13 of 38 Old 02-27-2009, 08:37 AM - Thread Starter
jim946 (Newbie)
Quote:
Originally Posted by Gary McCoy View Post

Read what I said. A little more detail: ...

Separate 120Hz from frame interpolation; understand that they are different techniques with different benefits.

So what you are saying is, if I take frame interpolation from "High" to "Standard" or "Off", it will reduce the "blur" effect I am seeing but also reduce the image quality?

So, a 60Hz plasma... is the image quality reduced compared to an LCD with frame interpolation on "High"? What I'm asking is: can I get the same image quality on a plasma that I get from my LCD with frame interpolation on High, without the "motion blur"?

I notice that with frame interpolation set to "Standard" or "Off", a Blu-ray movie is "grainy". I love the super-sharp image of Blu-ray on my LCD with frame interpolation set to "High", but hate the motion blur... how can I have the best of BOTH worlds? Is plasma my answer?

Thanks.
post #14 of 38 Old 02-27-2009, 08:46 AM
VarmintCong (AVS Special Member)
Quote:
Originally Posted by QZ1 View Post

No, they don't. Comcast sends channels at the resolution at which they were received; it is the box that converts them. ...

I know it's fun to say "no" on the internet, but obviously I omitted the part about how you can choose 480p or 720p output from your cable box, because nobody would use that with a 1080p TV.

Some other cable boxes let you send native output, but not Comcast's boxes.
post #15 of 38 Old 02-28-2009, 09:33 PM
brentsg (AVS Special Member)
Quote:
Originally Posted by jim946 View Post

I notice that with frame interpolation set to "Standard" or "Off", a Blu-ray movie is "grainy". I love the super-sharp image of Blu-ray on my LCD with frame interpolation set to "High", but hate the motion blur... how can I have the best of BOTH worlds? Is plasma my answer?

I don't know why you would see grain with the enhancer off. Perhaps you are seeing the original film grain, which is in the source.
post #16 of 38 Old 05-05-2009, 09:20 AM
blurrysamsung (Newbie)
Quote:
Originally Posted by VarmintCong View Post

Forget about 120Hz for a moment. My Samsung 650 has problems with Comcast's HD cable - I think it's a deinterlacing problem. ... The solution for me is to use motion interpolation for HD cable; that smooths it out.


Two months ago I purchased a Samsung 650 and had Comcast HD service installed. Immediately I noticed a blurring/jittery problem - not just on fast motion, but on any motion. The screen skipped when a slow panning nature shot was shown. The problem was really apparent when watching ESPN or Headline News, where they have text/scores scrolling across the bottom of the screen; it was really blurry/skippy/jittery. My friends said it's so bad it makes people look like they have Parkinson's disease! I called Comcast and they sent out a tech. They found nothing wrong, but I was able to reproduce the problem. Since then I have had 6 Comcast technicians come out to the house (all of them seeing the problem) and none of them were able to fix it. They replaced splitters, boxes, and wires, switched component cables to HDMI, switched outlets, switched inputs. I even switched the output on the cable box, trying 720, 480, and 1080. I changed the AMP (Auto Motion Plus - 120Hz) setting between Off, Low, Med, and High. I tried all of the TV settings and all of the cable settings. Nothing fixed the problem!! I thought maybe it was the TV, so I returned it and got a Samsung 750. Hooked it up - SAME PROBLEM!! It's not the TV, and I don't know what else it could be. I'm losing sleep over spending this much money on a TV and crappy Comcast service. Any ideas to resolve this problem? Should I just switch from Comcast to DirecTV?
post #17 of 38 Old 05-05-2009, 09:59 AM
HoustonPerson (AVS Special Member, Dallas, TX)
Quote:
Originally Posted by jim946 View Post

I don't understand some things. I purchased the Sony 52" W series with 120Hz. ... Any comments would be greatly appreciated.

Jim

Jim, I know you are looking for simple answers in the very confusing world of HDTV. Unfortunately, even within your statement (as you have noticed from others trying to help), there are reams of conflicting information in your explanation of what you think is going on. So maybe this little bit of information can help.

First of all, there are numerous totally unrelated items in your question.

Comcast (and many cable services) quality is generally very poor in most parts of the USA. If you want the maximum quality picture, there is nothing like OTA, and today's TVs can easily capture ultra-high-quality HD in the top 250 markets in the USA. Also, Comcast's 60Hz signal is not related to the LCD's 60Hz, 120Hz, and 240Hz options.

The 60, 120, and 240Hz referred to in LCD displays is the backlighting and the switches in the front of the LCD panel itself, and the attempt to smooth out and "hide" the jerking, flickering, and popping that many see in an LCD panel. Combine that with the 60Hz electrical current that is supposed (??) to be fed to the TV, and you begin to see some of the problems to overcome (good internal power supplies should help?).

So let's take an average 60Hz LCD panel that is fed 60Hz electricity (we hope - that never really happens in the real world); this means the electricity, the backlight, and the front panel switches must all be in perfect sync so that most people will not see jerking and flickering. And that does work for a lot of people; the problem is, it is seldom perfect. So companies like Samsung came up with 120Hz, and that helps a lot - so much so that maybe another 30-40 percent of the population call it perfect. Then came the 240Hz LCD panel to attempt to make it smoother still - but only an ultra-small percentage can see any improvement; for many there is still jerking and flickering.

Now Samsung makes LED backlights (combined with other tricks with the Hz) to make it all appear smoother still.

But the underlying problem remains: getting the electricity, the backlight, and the front panel switches all in "perfect" sync. Truthfully, there is a long way to go to make it CRT-smooth.

Add to this complexity the human eye. Most people begin to see flicker at around 60Hz and below, and the lower you go the more flicker you will see. Different lighting conditions further compound the problem. Some people can almost go completely nuts with fluorescent lights in the room and a standard 60Hz LCD panel.

In other words, there are too many different lighting elements to get in sync.

Plasma completely eliminates one of the biggest components altogether - the backlight; there is no backlight to have to sync up!

So yes, the plasma can make most viewing smoother, with less jerkiness. Even with that said, there can still be problems with your Comcast source, but the plasma should make motion a little smoother - it is much more CRT-like. Plasma and CRT just fire up the phosphor to make a picture.

Much of the information others have given above is correct, even though it may sound conflicting.
post #18 of 38 Old 05-05-2009, 10:19 AM
HoustonPerson (AVS Special Member)
Quote:
Originally Posted by jim946 View Post

Still have my question... will a plasma have less jittery motion on my Comcast 60Hz signal than my 120Hz LCD?

The simple answer is "yes" in most cases.
post #19 of 38 Old 05-07-2009, 06:52 AM
VarmintCong (AVS Special Member)
Quote:
Originally Posted by blurrysamsung View Post

Two months ago I purchased a samsung 650. I had comcast HD service installed. Immediately I noticed a blurring/jittery problem. ... Any ideas to resolve this problem? Should I just switch from Comcast to DirecTV?

First question: is the TV OK with DVD or Blu-ray? You may just be sensitive to LCD blur.

Mine is nowhere near that bad; in fact it's fine with AMP on, just a bit soap-opera-like. But keep in mind the 750 is the same TV as the 650, so you might have better luck with a plasma.

My guess is it's a problem somewhere in Comcast's delivery, and they either won't fix it or don't know how. I'd get Verizon FiOS or DirecTV. Comcast is sucking these days anyway.
post #20 of 38 Old 05-07-2009, 12:49 PM
NuSoardGraphite (AVS Special Member, Tucson AZ)
jim946:

As others have pointed out, cable high-def feeds are crappy. The best TV out there can only do so much with them.

The "jittery motion" you've experienced is, as Gary pointed out, a problem with the 24fps film source being translated to 60Hz, creating an uneven frame conversion (3:2 pulldown, etc). The signal has already been converted to 60Hz by your cable box, so there isn't anything your TV can do to fix it. Even frame interpolation will only go so far.

To see whether the "jittery motion" issue is your TV or the source (cable), watch a live broadcast (or near-live), like American Idol or a concert on Palladia. If that is jittery, it's your TV. If it's smooth like butter, it's your cable (because of the 24Hz-to-60Hz conversion). Broadcasts like those are often done at 60Hz, so they should play very smoothly on your TV. When I watch concert footage on Palladia it's almost like I'm in the crowd, it's so smooth - and I only have a 60Hz TV, with no frame interpolation to smooth things even more.

Keep in mind that the jittery motion issue isn't an LCD weakness. It's present on ALL high-def TVs when the source material is 24fps and being converted to 60Hz.

Others have mentioned that 120Hz is a gimmick created to cover up the weakness of LCD TVs. This is incorrect. 120Hz was created to minimize the problems with converting 24fps material (most movies are shot at 24 frames per second) to 60Hz, which is the cause of the jerky motion. Since 120 is divisible by 24, it smooths the motion out. The "gimmick" others were talking about is frame interpolation (aka motion enhancement, like Samsung's AMP), which absolutely was created to cover up the weakness of LCD motion.

Stand tall and shake the heavens...
NuSoardGraphite is online now  
post #21 of 38 Old 05-07-2009, 01:48 PM
Advanced Member
 
maygit's Avatar
 
Join Date: Dec 2008
Location: Iowa
Posts: 619
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
This may have been answered above, and I apologize if it has, but I have one quick question.
I understand how Hz works, but I always hear people say that plasmas don't need higher refresh rates because they handle motion just fine. This makes no sense to me: I always thought keeping up with motion had to do with response time, not refresh rate, which would make that argument apples to oranges. Anyone want to shed some light on this for me?

maygit is offline  
post #22 of 38 Old 05-07-2009, 01:54 PM
AVS Special Member
 
xrox's Avatar
 
Join Date: Feb 2003
Posts: 3,169
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 50
Quote:
Originally Posted by maygit View Post

This may have been answered above, and I apologize if it has, but I have one quick question.
I understand how Hz works, but I always hear people say that plasmas don't need higher refresh rates because they handle motion just fine. This makes no sense to me: I always thought keeping up with motion had to do with response time, not refresh rate, which would make that argument apples to oranges. Anyone want to shed some light on this for me?

There are two types of blur produced by the display itself (not talking about signal blur here):
  • Slow pixel response
  • Hold-type blur (aka sample-and-hold, or SAH, blur)

Increasing the refresh rate enables manipulation of the sample-and-hold effect to reduce that second type of blur.

Please review the following POST

Cheers

Over thinking, over analyzing separates the body from the mind
xrox is offline  
post #23 of 38 Old 05-07-2009, 02:10 PM
Advanced Member
 
maygit's Avatar
 
Join Date: Dec 2008
Location: Iowa
Posts: 619
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
I appreciate the post, but if I'm not mistaken, that linked thread is basically about the benefits of frame interpolation (motion enhancers) in making motion tracking seem more lifelike, therefore reducing blur. I'm asking about this without factoring in motion enhancers or response time. Pretend for a second that LCDs had a 0.001ms response time like plasmas. Why do people say that 60Hz LCDs can't handle motion as well as 60Hz plasmas and therefore need 120Hz+ to do so? Once again, it makes no sense to me. I'll reread that link one more time to fully get what was stated, but I'm pretty sure I know what was there. Thanks.

EDIT: OK, I skipped the hold-time discussion toward the bottom of that post. Now a new question arises. Say you have a 60Hz plasma and a 60Hz LCD, both receiving a 30fps source, so each frame is displayed twice, effectively making it 60fps. Each frame is then displayed and held on screen for 16.7ms on the LCD but only 4-6ms on the plasma. What is the plasma doing for the other ~10ms? Black screen?

maygit is offline  
post #24 of 38 Old 05-07-2009, 02:19 PM
AVS Special Member
 
xrox's Avatar
 
Join Date: Feb 2003
Posts: 3,169
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 50
Quote:
Originally Posted by maygit View Post

I appreciate the post, but if I'm not mistaken, that linked thread is basically about the benefits of frame interpolation (motion enhancers) in making motion tracking seem more lifelike, therefore reducing blur. I'm asking about this without factoring in motion enhancers or response time. Pretend for a second that LCDs had a 0.001ms response time like plasmas. Why do people say that 60Hz LCDs can't handle motion as well as 60Hz plasmas and therefore need 120Hz+ to do so? Once again, it makes no sense to me. I'll reread that link one more time to fully get what was stated, but I'm pretty sure I know what was there. Thanks.

Even if the response time were infinitely small, the blur on a conventional LCD would still be severe because of its long hold time, where hold time is defined as how long each individual frame stays displayed. That link explains how and why long hold times create blur.

A 60Hz LCD displays each frame for the full 16.7ms, while by comparison a 60Hz CRT displays each frame for only 1-2ms.
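A rough way to see why hold time matters: when your eye tracks a moving object, the frame sits still on screen while the eye keeps moving, so the perceived smear is roughly object speed times hold time. This is my own back-of-envelope model, not something from the thread.

```python
# Rough sketch (assumed model): perceived smear on a sample-and-hold
# display ~ tracking speed * hold time, since the eye moves while the
# frame stays fixed.

def blur_width_px(speed_px_per_s, hold_time_ms):
    """Approximate perceived blur width in pixels."""
    return speed_px_per_s * hold_time_ms / 1000.0

speed = 960.0  # an object panning across a 1920px screen in 2 seconds

print(blur_width_px(speed, 16.7))  # 60Hz LCD, full-frame hold: ~16 px smear
print(blur_width_px(speed, 2.0))   # 60Hz CRT, 1-2ms impulse:   ~2 px smear
```

Same panel speed, same 60Hz refresh, yet an order of magnitude less smear for the short-hold display; that is the sample-and-hold effect in a nutshell.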

Over thinking, over analyzing separates the body from the mind
xrox is offline  
post #25 of 38 Old 05-07-2009, 02:23 PM
AVS Special Member
 
xrox's Avatar
 
Join Date: Feb 2003
Posts: 3,169
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 50
Quote:
Originally Posted by maygit View Post

EDIT: OK, I skipped the hold-time discussion toward the bottom of that post. Now a new question arises. Say you have a 60Hz plasma and a 60Hz LCD, both receiving a 30fps source, so each frame is displayed twice, effectively making it 60fps. Each frame is then displayed and held on screen for 16.7ms on the LCD but only 4-6ms on the plasma. What is the plasma doing for the other ~10ms? Black screen?

Plasma displays build up image information over time, essentially compiling several sub-images into one final image, and they repeat this process every frame. Within a 16.7ms frame they spend roughly 10ms compiling dark-detail sub-images and the remaining time compiling bright-detail sub-images. This creates a perceived dark period of ~10ms each and every frame.

CRTs and PMOLEDs use a single short pulse to create the image, and the screen is off the rest of the time. These are called "impulse"-type displays.
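The sub-image idea can be sketched with a toy subfield scheme. Real panels use more elaborate weightings and pulse orderings; the binary weights below are an assumption purely for illustration.

```python
# Illustrative sketch (assumed binary subfield weights; real plasma
# drive schemes vary): each frame is built from short light pulses
# ("subfields"), and a pixel fires during a subfield if the matching
# bit of its gray level is set.

WEIGHTS = [1, 2, 4, 8, 16, 32, 64, 128]  # 8 subfields -> 256 gray levels

def subfields_fired(gray_level):
    """Which subfield weights a pixel lights up for a given gray level."""
    return [w for w in WEIGHTS if gray_level & w]

print(subfields_fired(0))         # [] -> pixel dark for the whole frame
print(subfields_fired(200))       # [8, 64, 128]
print(sum(subfields_fired(200)))  # weights sum back to the gray level: 200
```

Between its fired subfields the pixel is dark, which is where the perceived ~10ms dark period per frame comes from.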

Over thinking, over analyzing separates the body from the mind
xrox is offline  
post #26 of 38 Old 05-07-2009, 02:25 PM
Advanced Member
 
maygit's Avatar
 
Join Date: Dec 2008
Location: Iowa
Posts: 619
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Ah, gotcha. Basically that's what the whole subfield drive system is, I'm assuming? Thanks for the post. That's a large and informative thread that I can't read right now at work, but I'm going to dig into it when I get home.

maygit is offline  
post #27 of 38 Old 05-08-2009, 07:48 AM
AVS Special Member
 
[Irishman]'s Avatar
 
Join Date: Nov 2007
Posts: 1,381
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 8 Post(s)
Liked: 51
Quote:
Originally Posted by jim946 View Post

I don't understand some things.

I purchased the Sony 52' W series with 120hz. From what I am hearing my Comcast HD cable only transmits in 60hz, but I can get 120hz from my Samsung Blueray. I have a 120hz 1080p cable on the blueray but only 60hz 1080p cable on the Comcast HD.

I still get "Jittery" motion on screen. I have set the Sony to "High" setting under options. Why it doesn't ship that way I don't understand.

I think I have everything set correctly but still get some "Jittery Motion". Less on Blueray.

From reading reviews I can see that even the 120hz still has some Jittery Motion" issues. Reviewers state that's why Plasma still has the advantage on "Motion" as it uses "Gas" instead of "Liquid" and runs at 240hz.

My question... if Comcast only sends a 60hz signal and if there are only 120hz cables available for even the faster blueray... how can Plasma or LCD 240hz show less "Jittery Motions" than the 120hz LCD? Am I missing something?

Should I have a 120hz cable on my Comcast Digital Cable? How will that improve the "Jitters" if they only transmit in 60hz?

I just bought the TV so I can still return it. I purchased the LCD over the Plasma for long-term reliability concerns that I read about. LCD seems to the be growing platform as well. The 240hz Sony just came out so I'm wondering if I should bring my 120hz back and get the 240hz, but I still don't understand how it will improve my HD Cable reception if Comcast is transmitting 1080i at 60hz.

Any comments would be greatly appreciated.

Jim

Jim, you've let yourself get carried away by the Hz mania that's sweeping through HDTV showrooms and LCD manufacturers.

Do not, I repeat, do not, get obsessed with Hz ratings, or contrast ratio ratings, etc.

Trust your eyes when you watch it at home. True, this means you can't make your "final" choice in the store, but in the long run, you'll be happier. Audition the TV in your home, in your light, connected to your gear.

Then decide whether or not to keep it.

[Irishman] is offline  
post #28 of 38 Old 05-08-2009, 08:24 AM
AVS Special Member
 
NuSoardGraphite's Avatar
 
Join Date: Jul 2007
Location: Tucson AZ
Posts: 1,356
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 24 Post(s)
Liked: 110
Quote:
Originally Posted by [Irishman] View Post

Jim, you've let yourself get carried away by the Hz mania that's sweeping through HDTV showrooms and LCD manufacturers.

Do not, I repeat, do not, get obsessed with Hz ratings, or contrast ratio ratings, etc.

Trust your eyes when you watch it at home. True, this means you can't make your "final" choice in the store, but in the long run, you'll be happier. Audition the TV in your home, in your light, connected to your gear.

Then decide whether or not to keep it.

120Hz is actually useful. It allows HDTVs to play 24fps sources (mostly movies) much more smoothly than a 60Hz TV because of the 3:2 pulldown "jitter" issue. A lot of plasma owners don't notice this effect because many plasma TVs refresh at 72Hz or 96Hz, which are divisible by 24, so they get smooth motion from 24fps sources.

Anything above 120Hz, though (240Hz, 480Hz), is pure marketing hype at this point and completely unnecessary. 120Hz is more than enough.

I think 120Hz is the "sweet spot" because it's divisible by 24 (film), 30, and 60. Most video sources, including videogames, run at one of those three rates, so at 120Hz no video source should experience jitter.
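The "sweet spot" claim is easy to check with simple divisibility. This is a quick sketch of my own, just formalizing the arithmetic in the post above.

```python
# Which refresh rates give an even (judder-free) cadence for the
# common source frame rates?

def even_cadence(source_fps, refresh_hz):
    """True if every source frame can be held the same number of ticks."""
    return refresh_hz % source_fps == 0

for hz in (60, 72, 96, 120):
    fits = [fps for fps in (24, 30, 60) if even_cadence(fps, hz)]
    print(hz, fits)
# 60  -> [30, 60]      (24fps needs uneven 3:2 pulldown)
# 72  -> [24]          (film-friendly plasma mode)
# 96  -> [24]          (likewise)
# 120 -> [24, 30, 60]  (handles all three evenly)
```

Only 120Hz divides evenly by all three common rates, which is exactly why it covers film, video, and games without cadence judder.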

Also, please note that the "jitter" caused by uneven frame conversion is different from motion blur (which is what xrox explained above), and from the original poster's description of his problem, it sounds like jitter is what he's experiencing.

Stand tall and shake the heavens...
NuSoardGraphite is online now  
post #29 of 38 Old 05-08-2009, 10:01 AM
AVS Special Member
 
[Irishman]'s Avatar
 
Join Date: Nov 2007
Posts: 1,381
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 8 Post(s)
Liked: 51
Quote:
Originally Posted by NuSoardGraphite View Post

120Hz is actually useful. It allows HDTVs to play 24fps sources (mostly movies) much more smoothly than a 60Hz TV because of the 3:2 pulldown "jitter" issue. A lot of plasma owners don't notice this effect because many plasma TVs refresh at 72Hz or 96Hz, which are divisible by 24, so they get smooth motion from 24fps sources.

Anything above 120Hz, though (240Hz, 480Hz), is pure marketing hype at this point and completely unnecessary. 120Hz is more than enough.

I think 120Hz is the "sweet spot" because it's divisible by 24 (film), 30, and 60. Most video sources, including videogames, run at one of those three rates, so at 120Hz no video source should experience jitter.

Also, please note that the "jitter" caused by uneven frame conversion is different from motion blur (which is what xrox explained above), and from the original poster's description of his problem, it sounds like jitter is what he's experiencing.

Then what we need to do is press plasma makers (Panasonic, Samsung, LG, Hitachi, I'm talking to you) to make their sets capable of properly doing 72Hz or 96Hz, as Pioneer does/did.

[Irishman] is offline  
post #30 of 38 Old 05-08-2009, 11:42 PM
AVS Special Member
 
NuSoardGraphite's Avatar
 
Join Date: Jul 2007
Location: Tucson AZ
Posts: 1,356
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 24 Post(s)
Liked: 110
Quote:
Originally Posted by [Irishman] View Post

Then what we need to do is press plasma makers (Panasonic, Samsung, LG, Hitachi, I'm talking to you) to make their sets capable of properly doing 72Hz or 96Hz, as Pioneer does/did.

If plasma survives that long, it needs to hit 120Hz eventually, since some videogame developers are considering attempting 120fps in the next generation or so.

Stand tall and shake the heavens...
NuSoardGraphite is online now  