Upscale or downscale? - AVS Forum
post #1 of 6 Old 12-14-2013, 05:20 AM - Thread Starter
si2504 - Newbie (Join Date: Dec 2013, Posts: 1)
I'll start off by saying hi to everybody. I've been using the forum in the background for a while, and the amount of knowledge in one place is immense. So for my first post, this is my question:

As I understand it, from reading different things on the net, a TV will either upscale or downscale to its display resolution depending on the signal being received... stop me there if I'm wrong.

So my TV is advertised as 720p, but its actual resolution is 1365x768. If I set a device to output a 1080p signal (which my TV does accept), will my TV downscale to the actual 768p? And if I set that same device to output a 720p signal, will my TV upscale to 768p?

Does either upscaling or downscaling have any benefits over the other, and would either be detrimental to my picture quality? Basically, would you rather your TV upscaled or downscaled?

Thanks
Simon
post #2 of 6 Old 12-14-2013, 02:30 PM
Dierkdr - Advanced Member (Join Date: Dec 2007, Posts: 849)
Quote:
Originally Posted by si2504 View Post

....So my TV is advertised as 720p, but its actual resolution is 1365x768. If I set a device to output a 1080p signal (which my TV does accept), will my TV downscale to the actual 768p? And if I set that same device to output a 720p signal, will my TV upscale to 768p?

Does either upscaling or downscaling have any benefits over the other, and would either be detrimental to my picture quality? Basically, would you rather your TV upscaled or downscaled?

Part 1: Yes. Your TV will always display its native resolution - which is probably 1366x768, and which suggests a 50" panel (but not a recent Panasonic, which cheapened out to 1024x768 several years ago).

Part 2: It depends. You can try setting your source to 720p and then 1080p and see whether or not you can tell the difference between them - presumably results would vary if the scaling in the TV were substantially better or worse than in your source.

We have a pair of older "720p" Panasonic plasmas, and have never been able to consistently distinguish between a 1080 and a 720 input.

OTOH, SD DVDs and most "HD TV" programs DO look consistently better on the 720p sets than they do on our 60" 1080 panel.
Though a lot of that is simply a function of size: the more you magnify a less-than-HD source, the more visible the inherent flaws become. I cannot help but wonder whether having to "invent" extra data for the 1080 panel (as compared to 720) also introduces more artifacts...
post #3 of 6 Old 12-15-2013, 01:51 AM
Mark12547 - Advanced Member (Join Date: Nov 2013, Location: Salem, Oregon, United States, Posts: 685)
Quote:
Originally Posted by si2504 View Post

As I understand it, from reading different things on the net, a TV will either upscale or downscale to its display resolution depending on the signal being received... stop me there if I'm wrong.

So my TV is advertised as 720p, but its actual resolution is 1365x768.

 

Yes, every signal you can feed the TV will be rescaled to the TV's native resolution (1365x768 in the case of your TV, or 1366x768 in the case of my bedroom TV). So 480i (typically SD stations, DVD players without progressive scan, old analog VCRs), 480p (e.g., DVD players with progressive scan enabled), and 720p (many HD channels) will be "upscaled"; 1080i (many other HD channels) and 1080p (a Blu-ray player at its highest picture resolution setting) will be "downscaled".

 

So, no matter what video resolution you set a device to send to the TV, the TV will always be doing some form of scaling.
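To put rough numbers on that (this is just back-of-the-envelope arithmetic for illustration, not a measurement of any particular set, and it assumes a 1366x768 native panel like my bedroom TV), here is a small Python sketch that prints how far each common input format sits from the panel:

Code:
# Common source formats versus an (assumed) 1366x768 native panel:
# every one of them needs rescaling by the TV.
NATIVE_W, NATIVE_H = 1366, 768

FORMATS = {
    "480i/480p": (720, 480),       # SD channels, DVD
    "720p": (1280, 720),           # many HD channels
    "1080i/1080p": (1920, 1080),   # other HD channels, Blu-ray
}

for name, (w, h) in FORMATS.items():
    direction = "upscaled" if h < NATIVE_H else "downscaled"
    factor = NATIVE_H / h
    print(f"{name:>11}: {w}x{h} ({w * h:,} px) -> {NATIVE_W}x{NATIVE_H} "
          f"({NATIVE_W * NATIVE_H:,} px), {direction} by {factor:.2f}x vertically")

Nothing lands exactly on 1366x768, which is the whole point: something in the chain is always resampling the picture.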

 

Quote:
Originally Posted by si2504 View Post


Does either upscaling or downscaling have any benefits over the other, and would either be detrimental to my picture quality? Basically, would you rather your TV upscaled or downscaled?

 

Short answer: the pragmatic answer is to try various settings and see.

 

Long answer:

 

Basically, I would rather have only one device doing the scaling, if possible, so there are as few scalers as possible working on the image. In the case of the HD DVR, I finally ended up having it send the signal to the TV in the same video format it receives the channel in over the cable, so only the TV rescales the image to its native resolution. This seems to produce the sharpest picture on my 768p TV. The downside is that there is a delay of a second or two when switching from content of one video resolution to content of another. (Actually, there is one exception: I configured the HD DVR to never send a 480i signal out its HDMI port, so SD stations are sent to the TV as a 16:9 480p signal instead of a 4:3 480i signal, and I don't have to mess with the TV to get the right aspect ratio when switching between SD and HD content. This seems to work quite well for my setup.)

 

With the HD DVR feeding the TV via HDMI, I discovered that setting the HDMI output to always be 1080p produced a better picture than always feeding the TV 720p. At 720p, some of the information from 1080i stations is lost, and the TV then has to scale that 720p signal up to its native 768p; at 1080p there is no data loss, but a bit of the sharpness of 720p stations is lost when the DVR scales 720p up to 1080p and the TV scales it back down to 768p. At "Native" (that DVR's term for passing through whatever video format the channel uses), the picture appears sharpest for both 720p and 1080i stations. I have tried the DVR at 720p, later at 1080p, and most recently at "Native", and I think "Native" produces the best results.
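To make the double-scaling point concrete, here is a minimal toy sketch of my own (plain linear interpolation as a crude stand-in for a real video scaler - real TVs and DVRs use much better filtering) that treats one column of the picture as a 1-D signal and compares the two chains: 1080 lines scaled once to 768 by the TV, versus 1080 lines first forced down to 720 by the source box and then stretched to 768 by the TV:

Code:
import numpy as np

def resample(signal, new_len):
    """Crude linear-interpolation scaler, standing in for a real video scaler."""
    old_pos = np.linspace(0.0, 1.0, len(signal))
    new_pos = np.linspace(0.0, 1.0, new_len)
    return np.interp(new_pos, old_pos, signal)

# Fine detail in one picture column of a 1080-line source.
source = np.random.default_rng(0).random(1080)

# Chain A: source device passes 1080 through; the TV scales once to 768.
chain_a = resample(source, 768)

# Chain B: source device first forces 720 out, then the TV scales 720 up to 768.
chain_b = resample(resample(source, 720), 768)

# Stretch both back to 1080 and see how much of the original detail survived.
err_a = np.sqrt(np.mean((resample(chain_a, 1080) - source) ** 2))
err_b = np.sqrt(np.mean((resample(chain_b, 1080) - source) ** 2))
print(f"single scale (1080 -> 768):        RMS detail loss {err_a:.3f}")
print(f"double scale (1080 -> 720 -> 768): RMS detail loss {err_b:.3f}")

On this toy signal the second chain should come out with a bit more detail loss because of the extra 720-line bottleneck, which matches the "no data loss at 1080p" reasoning above, though a good scaler would narrow the gap.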

 

If I hook up a Blu-ray player to that TV, I would set the Blu-ray player to output a 1080p signal so the TV would have as much information as possible when scaling the image down to its 768p resolution. Generally, it is easier to "discard" data to get to a lower resolution than it is to interpolate what the additional pixels should be.
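To illustrate what "discarding" versus "interpolating" means, here is a naive sketch (my own simplification, not what any real TV or player actually does): downscaling can simply average the real source pixels that fall on each output pixel, while upscaling has to invent pixels that were never transmitted by blending their neighbours.

Code:
import numpy as np

def box_downscale(img, out_h, out_w):
    """Naive downscale: average the block of source pixels that maps onto each
    output pixel (the "discard/merge data" direction)."""
    in_h, in_w = img.shape
    out = np.zeros((out_h, out_w))
    for y in range(out_h):
        y0, y1 = y * in_h // out_h, max((y + 1) * in_h // out_h, y * in_h // out_h + 1)
        for x in range(out_w):
            x0, x1 = x * in_w // out_w, max((x + 1) * in_w // out_w, x * in_w // out_w + 1)
            out[y, x] = img[y0:y1, x0:x1].mean()
    return out

def bilinear_upscale(img, out_h, out_w):
    """Naive upscale: invent new pixels by bilinear interpolation between existing ones."""
    in_h, in_w = img.shape
    ys, xs = np.linspace(0, in_h - 1, out_h), np.linspace(0, in_w - 1, out_w)
    out = np.zeros((out_h, out_w))
    for i, y in enumerate(ys):
        y0 = int(y); y1 = min(y0 + 1, in_h - 1); fy = y - y0
        for j, x in enumerate(xs):
            x0 = int(x); x1 = min(x0 + 1, in_w - 1); fx = x - x0
            top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
            bottom = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
            out[i, j] = (1 - fy) * top + fy * bottom
    return out

# Tiny demo on narrow strips so it runs quickly:
strip_1080 = np.random.default_rng(1).random((1080, 4))   # from a 1080p source
strip_720 = np.random.default_rng(2).random((720, 4))     # from a 720p source
print(box_downscale(strip_1080, 768, 4).shape)    # (768, 4): real pixels merged away
print(bilinear_upscale(strip_720, 768, 4).shape)  # (768, 4): new pixels made up

Both directions lose or guess something, but the downscale at least starts from real transmitted pixels, which is the intuition behind feeding the TV as much data as possible.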

 

If hooking up a DVD player that has an upscaler, I would probably test both 480p (progressive) and 1080p with several different discs and see which produced the better image. I am not sure whether 480p (letting just the TV do all the scaling) would win, or whether the DVD player might find a bit of additional information in the compressed video that makes its upscaling slightly more accurate than what the TV could achieve on its own - in which case the player would upscale to 1080p and the TV would downscale back to 768p.

 

It gets more complicated with other devices, especially streaming devices, because they may be handling content of many different video resolutions, so it becomes a question of whether the scaler in the streaming device is better, worse, or about the same quality as the scaler in the TV.

 

In general, though, if I could not send the TV the original video format and had to err, I would rather err on the side of throwing more data than necessary (i.e., 1080p) at the TV than not enough (720p and below).

 

But the real test is to try some likely candidate resolutions and see which generally looks best on your TV, with that specific piece of equipment, across the variety of content you typically watch.

 

By the way, I really understand Dierkdr's comments about SD material (I even started a thread on that): if there are video flaws - not just SD content, but old kinescopes, videotape artifacts, or films where the grain shows through or the emulsion has flaked excessively - the bigger the TV, the more those flaws are magnified. (I also noticed that on some TV shows watched on a 12-in screen you never saw the wires holding up Supercar or moving some of the monsters in Voyage to the Bottom of the Sea, but blown up to 50-in those strings become visible.) There have been times I wished I could zoom the picture out so the video flaws wouldn't be so apparent, but that option hasn't been on any of the TVs I have owned.


My very humble setup:
Man Cave: Vizio E500i-A1 "Smart TV" (50-in 1080p 120Hz LED/LCD, has Netflix app), Blu-ray players (Sony BDP-S3100, old LG BD390), Roku (the original model: N1000), PC (Windows 7), Comcast Internet (25Mbps/5Mbps).
Bedroom: LG 32LV3400-UA TV (32-in 768p 60Hz LED/LCD), HD DVR (Motorola RNG200N), Xfinity Comcast cable (Digital Starter package), DVD/VHS player.
post #4 of 6 Old 12-15-2013, 05:16 AM
Vic12345 - Advanced Member (Join Date: Mar 2012, Posts: 769)
Good explanation. I do Native on the box and allow 480p/720p/1080i. It seemed the clearest. 1080p is not available on my cable. My TV is 1024x768.

A lot of the crap on the screen is cable compression. I don't know how much is scaling and how much is compression, but it's pretty messy looking. I don't notice it when far enough back from the TV.

Seeing as this was explained, does anyone know if TV channels can change their resolution? I was sure I saw the box display 1080i on an ABC channel that is normally 720p.
post #5 of 6 Old 12-15-2013, 03:17 PM
Dierkdr - Advanced Member (Join Date: Dec 2007, Posts: 849)
Quote:
Seeing as this was explained, does anyone know if TV channels can change their resolution? I was sure I saw the box display 1080i on an ABC channel that is normally 720p.

With our U-verse STBs we must choose either 720p or 1080i output - which means that some channels will always be "wrong" (that is, either 720p-native channels output as 1080i, or 1080i channels output as 720p).

Apparently some STBs permit a "Native" output selection - so a 720p channel outputs 720p to the TV, while a 1080i channel outputs 1080i.

With U-verse we would need to change the output resolution setting each time we switched between a 720p and a 1080i channel - which, of course, presupposes that we know which channels broadcast at which resolution.

Somewhere - presumably within the past year or two - there was a fairly long discussion about STB output options and their likely impact on TV picture quality.
Hopefully a search would uncover that thread (or perhaps someone with a good memory will recall the location!)
post #6 of 6 Old 12-15-2013, 05:30 PM
Vic12345 - Advanced Member (Join Date: Mar 2012, Posts: 769)
For 1024x768 TVs the best box option is Native, then 1080i. There is talk about selecting resolutions for 1024x768 TVs in the Samsung PND450 thread.