1 - 6 of 6 Posts

Registered · 4 Posts · Discussion Starter · #1
Hi, I just bought my first HDTV, a Samsung PN43D450. It's a 720p plasma.


Anyway, I hooked it up to my laptop with an HDMI cable, and the laptop automatically recognized the TV and switched to 1024x768. Everything looked good until I played a video and noticed a weird offset effect along a horizontal line that scans from the top of the screen to the bottom very, very slowly (it takes at least 30 seconds to get from top to bottom). This only happens when the scene has a lot of movement.


I went online to do some research, and the closest phenomenon to what I'm seeing is judder. I downloaded a judder test that someone on AVS Forum posted (a set of vertical white bars moving left to right across a black background). I took a picture and attached it here; you can see a distinct offset in the image, and that offset runs from the top of the screen to the bottom.


Can anyone tell me why this is happening, and is there a way to fix it? Thanks.
 

Registered · 483 Posts
Just a wild guess, but did the laptop set the frame rate correctly for the TV as well? Most HDTVs want an ideal field/frame rate of 29.97 or 59.94 Hz, not 30 or 60 (or 72, etc.). The TV may accept and display a 1024 x 768 PC resolution as 720p, but it may not be able to display motion video gracefully without frame-rate-conversion artifacts.


Go into the display settings on the laptop and see whether one of the available options is 1024 x 768 @ 29.97 or 59.94 Hz. If so, force that setting and see whether the artifacts are reduced.


For years, computer video cards were designed to drive common computer monitor resolutions and frame rates, and only recently have "TV friendly" modes become available. Dedicated media players IMHO will always beat a PC in overall picture quality, even with an optimum frame rate feed, but you may find the results acceptable once you find the best settings.
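
If the laptop runs Windows, a rough Python sketch along these lines (this assumes the pywin32 package is installed, and the 1024 x 768 / 59 Hz values are just placeholders for whatever the list actually shows) can enumerate every mode the video card exposes and try to force one, which helps when the stock settings dialog hides the 59 Hz option:

Code:

import win32api
import win32con

# List every display mode the primary adapter reports, so you can see
# whether a ~59 Hz (i.e. 59.94) mode is actually available.
modes = set()
i = 0
while True:
    try:
        dm = win32api.EnumDisplaySettings(None, i)
    except win32api.error:
        break
    if dm is None:
        break
    modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
    i += 1

for w, h, hz in sorted(modes):
    print(f"{w} x {h} @ {hz} Hz")

# If a suitable mode shows up, try forcing it (edit the numbers to match
# one of the modes printed above).
dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
dm.PelsWidth = 1024
dm.PelsHeight = 768
dm.DisplayFrequency = 59          # Windows reports 59.94 Hz as 59
dm.Fields = (win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT
             | win32con.DM_DISPLAYFREQUENCY)
win32api.ChangeDisplaySettings(dm, 0)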
 

Registered · 325 Posts
What you're experiencing is "screen tearing". It's what happens when the laptop's video output isn't locked to the TV's refresh rate, and it's very common in video and in computer games that don't use vsync. Also, try setting the laptop to 1920x1080 if you can; the TV will handle that 16:9 resolution better than a 4:3 resolution like 1024x768. The laptop's video hardware is really going to be the determining factor in whether you get an acceptable picture on the TV. If it's too old, it probably just doesn't have the chops to handle HD video.
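
Strictly as an illustration of what vsync buys you (this toy sketch assumes the glfw Python package; a real video player does the equivalent through its renderer settings), here's the idea in a few lines:

Code:

import glfw

# Minimal GLFW window just to show the vsync switch itself.
if not glfw.init():
    raise RuntimeError("GLFW failed to initialize")

window = glfw.create_window(1280, 720, "vsync demo", None, None)
glfw.make_context_current(window)

# 1 = wait for the display's vertical blank before swapping buffers
# (vsync on, no tear line); 0 = swap immediately (tear-prone).
glfw.swap_interval(1)

while not glfw.window_should_close(window):
    # ...draw the frame here...
    glfw.swap_buffers(window)  # swap now lands on a refresh boundary
    glfw.poll_events()

glfw.terminate()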
 

Registered · 325 Posts

Quote:
Originally Posted by Satori84 /forum/post/20866830


Dedicated media players IMHO will always beat a PC in overall picture quality, even with an optimum frame rate feed, but you may find the results acceptable once you find the best settings.

I disagree. You'll never find a dedicated media player that handles full vector adaptive deinterlacing.
 

Registered · 4 Posts · Discussion Starter · #5
Hi, someone else told me it's a vsync issue as well. I looked through my laptop's display settings and don't see any option to change the frame rate to those decimal values; only the standard 60/75 Hz choices are there, and my laptop can't output 1920x1080 either.


But I did resolve the problem by setting the TV up as an extended monitor instead of mirroring the laptop's screen. That somehow fixed the issue.
 

Registered · 4,538 Posts
Your choice of video player software can make a difference, too. Media Player Classic Home Cinema, for example, has many more rendering choices than WMP. A different rendering mode just might do the trick for you.
 