
Is it possible that 1080i setting looks better than native 720p into 720p Plasma

#1 ·
Just so I can get this straight: if I set my set-top box resolution to 1080i, my 720p Samsung plasma is just going to downscale it back to 720p. Is it possible that the TV does a better job of downscaling a 1080i signal back to 720p than it does with a native 720p setting from the set-top box? I think the 1080i setting looks better on my TV than the native 720p setting. Maybe it's just in my head, or is it all subjective to the viewer? Hope you can follow what I'm asking. Thanks.
 
#2 ·
I bet it differs by channel. Some channels broadcast in 1080i (CBS), and others in 720p (Fox). CBS will likely look better if you output 1080i from the cable box, but Fox should look better if you output 720p from the box (if you output 1080i, the box upscales it, and the plasma just downscales it back to 720p). That was my experience when I had a 720p Samsung DLP.
 
#3 ·

Quote:
Originally Posted by david0406 /forum/post/20510819


Just so I can get this straight: if I set my set-top box resolution to 1080i, my 720p Samsung plasma is just going to downscale it back to 720p. Is it possible that the TV does a better job of downscaling a 1080i signal back to 720p than it does with a native 720p setting from the set-top box? I think the 1080i setting looks better on my TV than the native 720p setting. Maybe it's just in my head, or is it all subjective to the viewer? Hope you can follow what I'm asking. Thanks.

I can't think of any good reason not to set the box to simply pass through the native resolution of the broadcast signal, be it 1080i or 720p. Just let the TV alone do whatever scaling it needs. I think the setting is "auto" on many set-top boxes.
 
#4 ·

Quote:
Originally Posted by david0406 /forum/post/20510819


Just so I can get this straight: if I set my set-top box resolution to 1080i, my 720p Samsung plasma is just going to downscale it back to 720p. Is it possible that the TV does a better job of downscaling a 1080i signal back to 720p than it does with a native 720p setting from the set-top box? I think the 1080i setting looks better on my TV than the native 720p setting. Maybe it's just in my head, or is it all subjective to the viewer? Hope you can follow what I'm asking. Thanks.

Well, first of all, your 720p Samsung isn't 720p; it's actually 768p. There are no native 720p plasmas on the market (the last were the old 37" Panasonics). Your TV is actually scaling everything to its 768p native resolution.


I don't know why this is, but with my HD TiVo and TWC DVR set to output everything at 1080i only, the 1080i and 720p channels on my 768p Panasonic all look crisper and more detailed than when I set the boxes to output everything at 720p only. The 720p setting makes all the channels look softer and less detailed. I have tried both settings back to back, and 1080i always yields a crisper, more detailed picture.


I suspect that scaling a fixed 1080i signal to 768p is better than scaling a fixed 720p signal to 768p.
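For anyone who wants a rough feel for why that might be, here's a minimal sketch that simulates the two chains for a 1080i channel with ordinary image resampling. Everything in it is my own stand-in: it assumes Pillow and NumPy are installed, uses a made-up test frame name and a generic 1366x768 panel, ignores interlacing entirely, and a real TV's scaler will of course behave differently.

Code:
# Minimal simulation -- NOT how a real TV scaler works. Assumes Pillow
# and NumPy are installed, and that "frame_1080.png" is any 1920x1080
# test image you supply yourself (a hypothetical stand-in for one
# deinterlaced frame of a 1080i broadcast).
import numpy as np
from PIL import Image

PANEL = (1366, 768)  # generic 768p panel size, for illustration

src = Image.open("frame_1080.png").convert("L")

# Chain A: box downscales 1080 -> 720, then the TV upscales 720 -> 768
chain_a = src.resize((1280, 720), Image.LANCZOS).resize(PANEL, Image.LANCZOS)

# Chain B: box passes 1080 through, TV scales once, 1080 -> 768
chain_b = src.resize(PANEL, Image.LANCZOS)

# Crude "detail" proxy: variance of the horizontal pixel gradient.
def gradient_energy(img):
    a = np.asarray(img, dtype=float)
    return np.diff(a, axis=1).var()

print("box 720p, TV to 768p:", gradient_energy(chain_a))
print("box 1080i, TV to 768p:", gradient_energy(chain_b))

On most test frames the double-scaled chain should score a bit lower, consistent with the "softer" 720p look described above.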
 
#6 ·

Quote:
Originally Posted by RandyWalters /forum/post/20510924


Well, first of all, your 720p Samsung isn't 720p; it's actually 768p. There are no native 720p plasmas on the market (the last were the old 37" Panasonics). Your TV is actually scaling everything to its 768p native resolution.


I don't know why this is, but with my HD TiVo and TWC DVR set to output everything at 1080i only, the 1080i and 720p channels on my 768p Panasonic all look crisper and more detailed than when I set the boxes to output everything at 720p only. The 720p setting makes all the channels look softer and less detailed. I have tried both settings back to back, and 1080i always yields a crisper, more detailed picture.


I suspect that scaling a fixed 1080i signal to 768p is better than scaling a fixed 720p signal to 768p.

I do have my boxes set to 1080i. The only minor issue is that the box I primarily watch, the one running HDMI through my AVR to my plasma, resets itself to 720p every time I power it off. The other two boxes, hooked directly to TVs, don't do that; apparently my AVR has something to do with it. I just have to change it manually every time, which is not really a big deal. I have read that it is better to leave the box set at the TV's resolution to get a better picture, but I contend that, at least on my TV, it looks better and crisper letting the TV downscale the 1080i signal to 768p. I guess it is subjective and depends on the TV. Thanks guys for the info.
 
#7 ·

Quote:
Originally Posted by david0406 /forum/post/20511698


I do have my boxes set to 1080i. The only minor issue is that the box I primarily watch, the one running HDMI through my AVR to my plasma, resets itself to 720p every time I power it off. The other two boxes, hooked directly to TVs, don't do that; apparently my AVR has something to do with it. I just have to change it manually every time, which is not really a big deal. I have read that it is better to leave the box set at the TV's resolution to get a better picture, but I contend that, at least on my TV, it looks better and crisper letting the TV downscale the 1080i signal to 768p. I guess it is subjective and depends on the TV. Thanks guys for the info.

As I think Mr. Walters was trying to explain, it makes no sense to re-scale a broadcast signal more than once; you can't possibly end up with a better result if you do. Since your television has to re-scale to 768p no matter what the signal is, why have your box doing any scaling at all? It should not be converting a 720p signal to 1080i just to convert it again to 768p. Obviously, for any native 1080i signal the box is not converting it, and maybe that's why you're finding good results at times?
 
#8 ·

Quote:
Originally Posted by Hudson1 /forum/post/20511706


As I think Mr. Walters was trying to explain, it makes no sense to re-scale a broadcast signal more than once; you can't possibly end up with a better result if you do. Since your television has to re-scale to 768p no matter what the signal is, why have your box doing any scaling at all? It should not be converting a 720p signal to 1080i just to convert it again to 768p. Obviously, for any native 1080i signal the box is not converting it, and maybe that's why you're finding good results at times?

It seems to me Mr. Walters stated that he does scale his signal more than once, by setting his box to 1080i and letting his Panasonic rescale it to 768p. He states, unless I am reading it wrong or misinterpreting, that his picture looks more detailed and crisp with his TiVo and his DVR set to 1080i. Maybe I have misread what he is stating; if I am wrong, I stand corrected. I never said I was not finding good results at times. What I stated is that I feel I get better results all the time, or just in general, with the box set at 1080i.
 
#9 ·

Quote:
Originally Posted by david0406 /forum/post/20511730


It seems to me Mr. Walters stated that he does scale his signal more than once, by setting his box to 1080i and letting his Panasonic rescale it to 768p. He states, unless I am reading it wrong or misinterpreting, that his picture looks more detailed and crisp with his TiVo and his DVR set to 1080i. Maybe I have misread what he is stating; if I am wrong, I stand corrected. I never said I was not finding good results at times. What I stated is that I feel I get better results all the time, or just in general, with the box set at 1080i.

I reread his post and I was wrong. What I gleaned is that if you have to select a fixed box output, it's better to fix it at 1080i than at 720p. The better option is not to allow the box to re-scale at all... only let the TV do it.
 
#10 ·

Quote:
Originally Posted by Hudson1 /forum/post/20510906


I can't think of any good reason not to set the box to simply pass through the native resolution of the broadcast signal, be it 1080i or 720p. Just let the TV alone do whatever scaling it needs. I think the setting is "auto" on many set-top boxes.

Most boxes, including mine, don't offer a pass-through option.
 
#11 ·

Quote:
Originally Posted by RandyWalters /forum/post/20510924


Well first of all, your 720p Samsung isn't 720p - it's actually 768p. There are no native 720p Plasmas on the market (the last was the old 37" Panasonics).

Even the old Panasonic 37" plasma was not a match to the broadcast standard of 1280x720p; rather, it was 1024x720p. So there would be scaling on the horizontal axis.


Quote:
Your TV is actually scaling everything to it's 768p Native Resolution.


I don't know why this is, but setting my HD Tivo and TWC DVR to output everything at 1080i only, the 1080i and 720p channels on my 768p Panasonic all look crisper and more detailed then when i set the boxes to output everything at 720p only. The 720p setting makes all the channels look softer and less detailed. I have tried both settings back-to-back and 1080i always yields a crisper more detailed picture.


I suspect that scaling a fixed 1080i signal to 768p is better than scaling a fixed 720p signal to 768p.

Yep, my experience as well; I have a 1366x768p LCD RPTV. The reason appears to be this: take a 1920x1080i signal, for instance, and downscale it to 1280x720p, which is less than the 51" TV's resolution (1360x768p) and less than the 43" TV's vertical resolution (the 768 in 1024x768p). Then, when you upscale to the TV's resolution, the 'thrown out' resolution cannot be properly recaptured.


Conversely, take a 1280x720p signal and upscale it to 1920x1080i, which is more than either TV's resolution, and then downscale to the TV's resolution: no signal resolution has been 'thrown out'; extra resolution is 'created' and then 'thrown out'.


Of course, matching the broadcast signal and HD box output res. is optimal.
EDIT: In order to do so, one should set the HD box to 'native' output, if available.


Again, for those that missed it, Samsung 'non-1080p' plasmas, (as I call them):


51" - 1360x768p

43" - 1024x768p
 
#12 ·

Quote:
Originally Posted by QZ1 /forum/post/20514761


Even the old Panasonic 37" plasma was not a match to the broadcast standard of 1280x720p; rather, it was 1024x720p. So there would be scaling on the horizontal axis.





Yep, my experience as well; I have a 1366x768p LCD RPTV. The reason appears to be this: take a 1920x1080i signal, for instance, and downscale it to 1280x720p, which is less than the 51" TV's resolution (1360x768p) and less than the 43" TV's vertical resolution (the 768 in 1024x768p). Then, when you upscale to the TV's resolution, the 'thrown out' resolution cannot be properly recaptured.


Conversely, take a 1280x720p signal and upscale it to 1920x1080i, which is more than either TV's resolution, and then downscale to the TV's resolution: no signal resolution has been 'thrown out'; extra resolution is 'created' and then 'thrown out'.


Of course, matching the broadcast signal and HD box output res. is optimal.



Again, for those that missed it, Samsung 'non-1080p' plasmas, (as I call them):


51" - 1360x768p

43" - 1024x768p

On my satellite system (DTV HD22-100 DVR) I've found that I get the best results by setting it to native and then enabling all of the available formats.


Ian
 
#14 ·

Quote:
Originally Posted by Mathesar /forum/post/20516042


My Motorola HD DVR cable box doesn't have a "bypass" or "auto" setting. After comparing both 720p and 1080i, I found the 1080i setting to look a little better; 720p had a "softer" output (Kuro 5080HD, 768p).

This is the box I have also and I get the same result.
 
#15 ·
Quote:
Originally Posted by QZ1
Even the old Panasonic 37" plasma was not a match to the broadcast standard of 1280x720p; rather, it was 1024x720p. So there would be scaling on the horizontal axis.





Yep, my experience as well; I have a 1366x768p LCD RPTV. The reason appears to be this: take a 1920x1080i signal, for instance, and downscale it to 1280x720p, which is less than the 51" TV's resolution (1360x768p) and less than the 43" TV's vertical resolution (the 768 in 1024x768p). Then, when you upscale to the TV's resolution, the 'thrown out' resolution cannot be properly recaptured.


Conversely, take a 1280x720p signal and upscale it to 1920x1080i, which is more than either TV's resolution, and then downscale to the TV's resolution: no signal resolution has been 'thrown out'; extra resolution is 'created' and then 'thrown out'.


Of course, matching the broadcast signal and HD box output res. is optimal.



Again, for those that missed it, Samsung 'non-1080p' plasmas, (as I call them):


51" - 1360x768p

43" - 1024x768p
Just one more thought on this. From what I gather, at least from the posters in this thread, those who have 720p TVs get a better result from letting their TVs downscale a 1080i signal to 768p than from upscaling a 720p signal to the TV's native 768p. This seems to go against the conventional wisdom of doing the least amount of scaling to get the best results from your panel. Or are we not really doing a different amount of scaling; are we just letting the TV downscale the signal instead of upscaling it? Do 720p panels do a better job of downscaling an HD signal than upscaling it? Thoughts?
 
#16 ·
Quote:
Originally Posted by QZ1
Of course, matching the broadcast signal and HD box output res. is optimal.
Quote:
Originally Posted by mailiang
On my satellite system (DTV HD22-100 DVR) I've found that I get the best results by setting it to native and then choosing the option of all the available formats.
Indeed, that is what that sentence I wrote implies.
I've edited it, adding, 'In order to do so, one should set the HD box to 'native' output, if available.'
 
#17 ·
Quote:
Originally Posted by david0406
Just one more thought on this. From what I gather, at least from the posters in this thread, those who have 720p TVs get a better result from letting their TVs downscale a 1080i signal to 768p than from upscaling a 720p signal to the TV's native 768p. This seems to go against the conventional wisdom of doing the least amount of scaling to get the best results from your panel. Or are we not really doing a different amount of scaling; are we just letting the TV downscale the signal instead of upscaling it? Do 720p panels do a better job of downscaling an HD signal than upscaling it? Thoughts?
Any given plasma might use a different scaler, and how good that scaler is, is the most important factor in scaling quality; scalers can vary quite a bit. Second, as you correctly understand, it's best not to downscale to less than the broadcast resolution. Third, in this case at least, is how much scaling is being done; relatively speaking, it isn't that much of a difference, apparently.
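For a concrete sense of "how much scaling," here are the per-axis factors for the two panels named earlier in the thread. This is just arithmetic; it says nothing about scaler quality, which is point one above.

Code:
# Per-axis scale factors from each box output into the two Samsung
# panels listed earlier in the thread. Pure arithmetic, no scaler model.
panels = {"51in (1360x768)": (1360, 768), "43in (1024x768)": (1024, 768)}
sources = {"720p out": (1280, 720), "1080i out": (1920, 1080)}

for pname, (pw, ph) in panels.items():
    for sname, (sw, sh) in sources.items():
        print(f"{sname} -> {pname}: x{pw/sw:.2f} horiz, x{ph/sh:.2f} vert")

# 720p out  -> 51in: x1.06 horiz, x1.07 vert  (mild upscale both axes)
# 1080i out -> 51in: x0.71 horiz, x0.71 vert  (uniform downscale)
# 720p out  -> 43in: x0.80 horiz, x1.07 vert  (down one axis, up the other)
# 1080i out -> 43in: x0.53 horiz, x0.71 vert  (uniform downscale)

One thing the numbers show: the 1080i setting gives a uniform downscale on both axes, while 720p into these panels mixes upscaling and downscaling, which may be part of why the 1080i path tends to look cleaner.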
 
#18 ·
Quote:
Originally Posted by QZ1
Indeed, that is what that sentence I wrote implies.
I've edited it, adding, 'In order to do so, one should set the HD box to 'native' output, if available.'


I know. That's the reason I wrote my reply.





Ian
 