Question about Decoding HDTV - AVS Forum
post #1 of 8 Old 01-31-2007, 11:40 AM - Thread Starter
Dukenightvision (Member)
I have a question about the decoding process when I am simultaneously recording and watching an HD stream.

I'm using SageTV to both record and watch HDTV on a Syntax Olevia 532H LCD HDTV. My computer is connected to the 532H via a DVI-HDMI cable. The native resolution of the 532H is 1366x768 (it supports 1080i when watching TV via the tuner).

Now my question: when I am watching an HD program that is being streamed at 1080i, is my computer's GPU (Nvidia 7800 Go) actually sending the stream to my HDTV at 1080i, or is it down-converting the 1080i stream to the native resolution of my 532H? I sometimes use the Overlay setting and at times I use VMR9.

Thanks.

Dell Inspiron E1705
Windows XP SP2
Dual Core T2400
1GB RAM
Nvidia Go 7800 (256MB)
Nvidia Purevideo Decoder
Artec T14A USB HD Tuner
post #2 of 8 Old 01-31-2007, 11:46 AM
 
Targus
What have you set the resolution to?
post #3 of 8 Old 01-31-2007, 12:09 PM
stanger89 (AVS Addicted Member)
Quote:
Originally Posted by Dukenightvision View Post

When I am watching an HD program that is being streamed at 1080i, is my computer's GPU (Nvidia 7800 Go) actually sending the stream to my HDTV at 1080i, or is it down-converting the 1080i stream to the native resolution of my 532H?

It will deinterlace (if necessary) and scale to whatever your desktop resolution is. This applies regardless of video resolution or display resolution.
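The pipeline described above can be sketched in a few lines (a hypothetical model, not actual SageTV or Nvidia code, using the resolutions from this thread): whatever the source is, the frame that reaches the display is at the desktop resolution.

```python
# Sketch of the playback pipeline: the GPU deinterlaces interlaced
# sources, then scales every frame to the desktop resolution.

def render_frame(src_w, src_h, interlaced, desktop_w, desktop_h):
    """Return (width, height, progressive) of the frame sent to the display."""
    # Deinterlacing (if needed) makes the frame progressive at the source
    # size; scaling then always targets the desktop resolution, so the
    # output size does not depend on the source size at all.
    return desktop_w, desktop_h, True

# 1080i broadcast, desktop set to the 532H's native 1366x768:
print(render_frame(1920, 1080, True, 1366, 768))   # (1366, 768, True)
# A 720p broadcast ends up at the same output size:
print(render_frame(1280, 720, False, 1366, 768))   # (1366, 768, True)
```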

See what an anamorphoscopic lens can do, see movies the way they were meant to be seen
post #4 of 8 Old 01-31-2007, 12:13 PM
umdivx (AVS Special Member)
Quote:
Originally Posted by Dukenightvision View Post

I have a question about the decoding process when I am simultaneously recording and watching an HD stream.

I'm using SageTV to both record and watch HDTV on a Syntax Olevia 532H LCD HDTV. My computer is connected to the 532H via a DVI-HDMI cable. The native resolution of the 532H is 1366x768 (it supports 1080i when watching TV via the tuner).

Now my question: when I am watching an HD program that is being streamed at 1080i, is my computer's GPU (Nvidia 7800 Go) actually sending the stream to my HDTV at 1080i, or is it down-converting the 1080i stream to the native resolution of my 532H? I sometimes use the Overlay setting and at times I use VMR9.

Your video card will down-convert, or down-rez, the 1080i video to whatever resolution the card is set to output.

So if your video card is set up to output 1366 x 768, that 1080i video is down-converted to 768p.

BTW, your TV basically does the same thing. When you're watching 1080i video with the TV's built-in tuner, an internal video scaling chip scales that 1080i video to the TV's native rez (i.e. 1366 x 768).

make sense?

- Josh
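Back-of-the-envelope, the resize Josh describes shrinks a 1080i frame by about 29% in each direction, and the ratios come out the same horizontally and vertically, so the 16:9 aspect is preserved (rough arithmetic with the numbers from this thread, not anything chip-specific):

```python
# Scale factors when a 1920x1080 frame is resized to the 532H's
# native 1366x768 panel.
src_w, src_h = 1920, 1080
native_w, native_h = 1366, 768

print(round(native_w / src_w, 3))  # 0.711 horizontal
print(round(native_h / src_h, 3))  # 0.711 vertical: aspect is preserved
```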
post #5 of 8 Old 01-31-2007, 12:51 PM - Thread Starter
Dukenightvision (Member)
Thanks for all of the quick replies. I really appreciate it.

My screen resolution is set to native (1366x768). When I set my GPU to output 1080i to my TV, I get a bad flickering effect. I've read on another thread that my Olevia 532H is not capable of displaying 1080i when connected to my computer via the DVI-HDMI cable, but will display 1080i over the TV's component connection. I've purchased a DVI-I to component cable from monoprice.com so I can output 1080i to my 532H.

So if I am able to change my TV input to 1080i, does this mean that my GPU won't have to deinterlace and scale down the 1080i HD stream?

I'm asking because at present, when watching a 1080i HD program, I get a little video stutter at times. I'm wondering if the stuttering is due to my GPU deinterlacing and scaling the 1080i stream to 1366x768. If the GPU has less work to do, maybe the occasional video stutter will stop.
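One rough way to size up that workload (back-of-the-envelope only, assuming 1080i at 60 fields per second, i.e. 30 full frames per second after deinterlacing):

```python
# Pixels the GPU handles per second: decoding and deinterlacing 1080i
# yields 30 full 1920x1080 frames/sec, which are then scaled down to
# the 1366x768 desktop.
frames_per_sec = 60 // 2                       # 60 fields -> 30 frames
deinterlaced = 1920 * 1080 * frames_per_sec    # ~62 Mpixels/s to deinterlace
output = 1366 * 768 * frames_per_sec           # ~31 Mpixels/s written out
print(deinterlaced, output)
```

So the deinterlacing step touches roughly twice as many pixels per second as the scaled-down output suggests, which is one plausible reason it is the heavy part of the job.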
post #6 of 8 Old 01-31-2007, 01:09 PM
umdivx (AVS Special Member)
Quote:
Originally Posted by Dukenightvision View Post

Thanks for all of the quick replies. I really appreciate it.

My screen resolution is set to native (1366x768). When I change the resolution on my 532H to output 1080i, I get a bad flickering effect. I've read on another thread that my Olevia 532H is not capable of outputting 1080i when connected to my computer via the DVI-HDMI cable, but will display 1080i over the TV's component connection. I've purchased a DVI-I to component cable from monoprice.com so I can output 1080i to my 532H.

Sorry to tell you, but that DVI-to-component cable won't work, at least not with Nvidia video cards.

Quote:


So if I am able to change my TV resolution to 1080i, does this mean that my GPU won't have to deinterlace and scale down the 1080i HD stream?

Either way, your TV won't do 1080i. Like I said before, your TV will accept a 1080i signal, but it won't display 1080 lines of resolution.

ANY 1080i video on your TV is down-converted to 1366 x 768.

So it's a question of whether you want your TV to down-convert the 1080i or your video card to do it.

Honestly, I have my PC do the scaling, meaning my PC converts all 1080i signals to 720p before they're sent to my projector.

I bet the PC does a better job of de-interlacing 1080i video than your TV does.

Quote:


I'm asking this question because at the present time when watching a 1080i HD program, I get a little video stutter at times. I'm wondering if the stuttering is due to my GPU deinterlacing and scaling the 1080i stream to 1366X768. If the GPU has less work to do, I'm wondering if the ocassional video stutter will stop.


What MPEG decoders are you using? You might get better performance by changing the de-interlacing and video settings in the decoder.

If you try feeding your TV a 1920 x 1080 signal, I bet you'll get more stuttering that way than you do now.

- Josh
post #7 of 8 Old 01-31-2007, 01:20 PM - Thread Starter
Dukenightvision (Member)
Thanks for the education, Josh. I'm using the Nvidia PureVideo decoder. I'll play with the settings and see if that helps.
post #8 of 8 Old 01-31-2007, 01:25 PM
umdivx (AVS Special Member)
Try changing the de-interlace mode to video or smart. Also, if you're using overlay for video, try VMR9, or vice versa: if it's on VMR9, try overlay.

- Josh