AVS Forum
1 - 12 of 12 Posts

·
Registered
Joined
·
54 Posts
Discussion Starter · #1 ·
I'm unconcerned with the funky rectangular pixels per se because I don't use the Samsung 4254 as a pc display. However, I am puzzled as to whether I should output 1080p mkv as 720p or 1080i. Or if I should even bother with 1080p streams at all given the Samsung's resolution.


If the Samsung's "native" resolution were actually 1280 X 720 then I wouldn't bother with 1080p mkv's. But since the hdtv must convert *all* HD formats does it make a difference if my video card outputs 1080i or 720p?


Or to put it another way:


1) Is there any advantage to 1080p files over 720p with this TV?


2) If there is, what resolution should the pc output?


Thanks!
 

·
Registered
Joined
·
527 Posts

Quote:
Originally Posted by ronross /forum/post/20863238


I'm unconcerned with the funky rectangular pixels per se because I don't use the Samsung 4254 as a pc display. However, I am puzzled as to whether I should output 1080p mkv as 720p or 1080i. Or if I should even bother with 1080p streams at all given the Samsung's resolution.


If the Samsung's "native" resolution were actually 1280 X 720 then I wouldn't bother with 1080p mkv's. But since the hdtv must convert *all* HD formats does it make a difference if my video card outputs 1080i or 720p?


Or to put it another way:


1) Is there any advantage to 1080p files over 720p with this TV?

Yes, more information is always better than less. Downsampling should always look better than upsampling.


2) If there is, what resolution should the pc output?

Since you are outputting to an LCD you should always output at the native resolution of the display, which in your case sounds like 720p. I'm referring to the setting of your video card.


The resolution of your movie as displayed on that screen has essentially nothing to do with the video card's output, since the displayed resolution of the movie is handled by the software application playing it back, full screen or in a window.


So:


1. Video card resolution to an LCD should always match the native resolution of the LCD.

2. Movie resolution should always be as high as you are willing to store it.
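The downsampling-vs-upsampling point can be illustrated with a rough numerical sketch. This uses NumPy's linear interpolation as a stand-in for a TV's scaler, which is a simplifying assumption, but the trend holds: one downscale from 1080 lines to the 768-line panel loses less detail than going through 720 lines first.

```python
import numpy as np

# Simulate one column of fine vertical detail at three line counts.
x_1080 = np.linspace(0, 1, 1080)
x_720 = np.linspace(0, 1, 720)
x_768 = np.linspace(0, 1, 768)   # the panel's actual line count

detail = lambda x: np.sin(2 * np.pi * 40 * x)  # high-frequency content

# Path A: scale the 1080-line source straight down to the 768-line panel.
direct = np.interp(x_768, x_1080, detail(x_1080))

# Path B: go 1080 -> 720 first (as a 720p file would), then 720 -> 768.
via_720 = np.interp(x_768, x_720, np.interp(x_720, x_1080, detail(x_1080)))

# Compare both paths against the ideal 768-line rendering of the detail.
err_direct = np.sqrt(np.mean((direct - detail(x_768)) ** 2))
err_via_720 = np.sqrt(np.mean((via_720 - detail(x_768)) ** 2))
print(err_direct < err_via_720)  # the single downscale keeps more detail
```

Real scalers are smarter than linear interpolation, but the principle is the same: every resampling step discards information, so start from the highest-resolution source and scale once.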



·
Registered
Joined
·
54 Posts
Discussion Starter · #3 ·
Thank you for your prompt and courteous reply. If I understand you correctly:


1) might as well use 1080p files given they have more data before down conversion.


2) video card should be set at 720p resolution since that is the "nominal" native resolution of my display.


As helpful as your answer was it really only addressed one part of my two part question. So I should use 1080p files in preference to 720p files. Understood.


But my second question was: if the actual resolution of the (plasma) display is 1024 x 768, then *everything* is being processed, so is there really a difference between 720p output and 1080i output on this particular display? I don't know if this analogy holds, but my cable tv set-top box outputs 720p or 1080i to the display depending on the channel, and I can't tell any difference. So is it "wrong" for the video card to output 1080i, given that 720p will also be "converted"?


I'm sure I'm splitting hairs here. Thank you for your patience.
 

·
Registered
Joined
·
16,749 Posts
What is the source of the content you are sending to the TV: TV programs, DVDs, PC applications? If possible you want only one scaling, so you want to send the resolution of your source and let the TV scale to its rectangular pixels, and as mentioned downscaling is preferable to upscaling.
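The "rectangular pixels" follow directly from the arithmetic: a 1024x768 panel has a 4:3 pixel grid, but it shows a 16:9 picture, so each pixel must be wider than it is tall. A quick illustrative sketch:

```python
from fractions import Fraction

def pixel_aspect_ratio(panel_w, panel_h, picture_aspect):
    """Width:height of each pixel so a panel_w x panel_h grid fills the picture."""
    return picture_aspect / Fraction(panel_w, panel_h)

# A 1024x768 plasma showing a 16:9 HD picture:
par = pixel_aspect_ratio(1024, 768, Fraction(16, 9))
print(par)  # 4/3 -- each pixel is a third wider than it is tall
```

That 4:3 pixel shape is why no HD broadcast format maps 1:1 onto this panel and the set must rescale everything it receives.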
 

·
Registered
Joined
·
54 Posts
Discussion Starter · #5 ·

Quote:
Originally Posted by walford /forum/post/20863659


What is the source of the content you are sending to the TV: TV programs, DVDs, PC applications? If possible you want only one scaling, so you want to send the resolution of your source and let the TV scale to its rectangular pixels, and as mentioned downscaling is preferable to upscaling.

For the sake of discussion, we are sending 1080p content to a display that has a 1024 x 768 resolution. Does it make a difference if the video card output is 720p or 1080i?


Thank you!
 

·
Registered
Joined
·
3,558 Posts
Honestly just download and try all resolutions and see what looks best. We could talk all day about this but your eyes are the ultimate judge.
 

·
Registered
Joined
·
527 Posts

Quote:
Originally Posted by ronross /forum/post/20863450


Thank you for your prompt and courteous reply. If I understand you correctly:


1) might as well use 1080p files given they have more data before down conversion.


2) video card should be set at 720p resolution since that is the "nominal" native resolution of my display.


As helpful as your answer was it really only addressed one part of my two part question. So I should use 1080p files in preference to 720p files. Understood.


But my second question was: if the actual resolution of the (plasma) display is 1024 x 768, then *everything* is being processed, so is there really a difference between 720p output and 1080i output on this particular display? I don't know if this analogy holds, but my cable tv set-top box outputs 720p or 1080i to the display depending on the channel, and I can't tell any difference. So is it "wrong" for the video card to output 1080i, given that 720p will also be "converted"?


I'm sure I'm splitting hairs here. Thank you for your patience.

In regards to "everything" getting processed... so is there really a difference... I look at it this way: will you always have a 1024-res screen? What happens if in a year you upgrade to a new 1080p display? You'll be wishing you had stored your HD content in native 1080p.



Now, let's talk about source content vs. stored resolution. If you are storing Blu-ray then definitely store in 1080p. It makes no sense to me to store in a lower resolution, thereby removing content. However, there are exceptions. If you are going to play that content back on, say, an iPad, then you'll need to convert it completely before playback. If this is the case then you should store the source in two resolutions: 1080p mkv and 720p mp4 (for the iPad).


If your source is DVD then it doesn't make any sense to store in 1080p, as your content is 480p (stored as 720x480, displayed at roughly 852x480 for 16:9).
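For what it's worth, the ~852x480 figure quoted for widescreen DVD comes from the 16:9 display aspect, since the disc itself stores 720x480 anamorphically. A quick check:

```python
from fractions import Fraction

stored_w, stored_h = 720, 480        # what an NTSC DVD actually stores
display_aspect = Fraction(16, 9)     # a widescreen (anamorphic) title

# Square-pixel width needed to show 480 lines at 16:9:
display_w = round(stored_h * display_aspect)
print(f"{display_w}x{stored_h}")  # 853x480 (often rounded down to 852x480)
```

Either way the point stands: the source has only 480 lines, so encoding it at 1080p adds file size without adding detail.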


If you ask someone in the media business they will always tell you to store your material at its native source resolution/quality. Then you are ready to use it at any lower quality you need, either by direct playback and down-scaling or by re-encoding to a new secondary file at the lower resolution.


It's like a haircut: you can't put it back if you cut too much off.
 

·
Registered
Joined
·
449 Posts
Don't ever output anything other than the native resolution of your TV (even if Windows shows that it's not the recommended resolution).

Phrases such as "trust your eyes" or "let your eyes judge" are wrong.

You will always have problems with non-native resolutions.

Sometimes you will not notice them, but sometimes you will. All of your display's problems will definitely be increased (e.g. tearing, ghosting, motion blur).
 

·
Registered
Joined
·
1,058 Posts

Quote:
Originally Posted by ronross /forum/post/20863664


Does it make a difference if the video card output is 720p or 1080i?

If you can only output 1080i, and not 1080p, then by all means go with 720p. Interlaced video is not native to fixed-pixel displays, which means the graphics card has to interlace the signal and the display has to deinterlace it again, causing far more image degradation than the resizing of 720p does.
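A minimal sketch of why interlaced input costs an extra processing step: a 1080i signal arrives as alternating half-frames (fields), which the display must weave or bob back together before it can even begin scaling. NumPy here is just standing in for the video pipeline:

```python
import numpy as np

frame = np.arange(1080 * 4).reshape(1080, 4)  # toy 1080-line progressive frame

# Interlacing transmits only every other line at a time:
top_field = frame[0::2]     # even-numbered lines
bottom_field = frame[1::2]  # odd-numbered lines

# Each field carries half the vertical detail, so the set has to
# deinterlace (recombine the fields) and then scale to its 1024x768
# panel, while a 720p input needs only the single scaling step.
print(top_field.shape, bottom_field.shape)  # (540, 4) (540, 4)
```

Every deinterlacer has to guess at the lines it wasn't sent for a given instant, which is where the extra degradation comes from.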
 

·
Registered
Joined
·
16,749 Posts
If the cable box can only output HD in 720p or 1080i, use 1080i. If it can do pass-through, use pass-through.

From the PC, set your resolution to either 720p or 1080p and send that resolution to the TV.
 

·
Registered
Joined
·
54 Posts
Discussion Starter · #12 ·
Just to clarify: this is a plasma, not an LCD. I don't know if that makes a difference. I totally understand why you would want to store movies at 1080p. I'm only concerned with playback here.


Subjectively, 1080i looks a little sharper to me and I keep going back to it for all inputs at all resolutions. I just wanted to know from more knowledgeable people if I'm just fooling myself.


It's no big deal either way so I'm perfectly happy to drop the subject. I appreciate all the input.
 