1 - 13 of 13 Posts

kidicarus7

·
Registered
Joined
·
40 Posts
Discussion Starter
Does 16:9 SD content look better on a 1366x768 or 1920x1080 HDTV?

This is the question my brother and I asked ourselves recently. More specifically, we wanted to know which HDTV a Wii console would look better on, given that both HDTVs were exactly the same in every way except for resolution.

Both of us are aspiring mathematicians, so the first thing we did was look at the raw numbers of the problem. We first looked at the theory behind upscaling and used a basic theoretical example to better understand the problem. We assumed our source SD content had a resolution of 500x250 and that we had two theoretical displays, both physically the same size and both using the same upscaling circuitry, with the only difference between the two displays being that the first display has a resolution of 1000x500 and the other a resolution of 1500x750.

Displaying the source content on the first theoretical display, the problem becomes one of matching a 500x250 resolution to a 1000x500 display. In other words, our display resolution is 2 times that of our source material in both the horizontal and vertical direction and the display is tasked with inserting 100% more pixels in each direction. This is shown below in a single horizontal row of pixels:

First five pixels of one horizontal row from original source material:

{R}, {G}, {G}, {R}, {B},...

First five pixels (+ interpolated pixels) of the same horizontal row after upscaling:

{R}, {R#}, {G}, {G#}, {G}, {G#}, {R}, {R#}, {B}, {B#},...

- where {R}, {G}, {B} are the original pixels and {R#}, {G#}, {B#} are the generated pixels as a result of upscaling.

As you can see, the display can accomplish this by simply duplicating each horizontal pixel and displaying the image at a 2:1 resolution ratio. Although not shown, the same would be done in the vertical direction.
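The duplication described above can be sketched in a few lines of Python. This is a toy nearest-neighbour model for whole-number ratios only, not what any real TV's scaler does:

```python
# Nearest-neighbour upscaling at an exact whole-number ratio, sketched
# for one horizontal row of pixels. Pixel values are just colour labels.
def upscale_row(row, factor):
    """Repeat each source pixel `factor` times (integer ratio only)."""
    out = []
    for px in row:
        out.extend([px] * factor)
    return out

source = ["R", "G", "G", "R", "B"]
print(upscale_row(source, 2))
# → ['R', 'R', 'G', 'G', 'G', 'G', 'R', 'R', 'B', 'B']
# Every source pixel appears exactly twice, so no pixel carries more
# weight than any other and the image content is unchanged.
```

Calling `upscale_row(source, 3)` gives the 3:1 case discussed next in the same way.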

Similarly, displaying the source material on the second theoretical display can be done by simply displaying each horizontal and vertical pixel three times, resulting in a 3:1 resolution ratio.

Given that both displays are the same size and use the same upscaling circuitry, we can conclude that the source material will look exactly the same on both displays assuming that the viewing distance to the displays is far enough so that the screen door effect (SDE) is not noticeable on the lower resolution display.

As we know, however, the horizontal and/or vertical resolution of our display is rarely a whole-number multiple of our source resolution. As such, we need to investigate how the display handles these 'uneven' resolutions and when the worst case upscaling scenario occurs. Again, we use a basic example to illustrate the problem. We assume our source resolution is 600x300 and our theoretical display resolution is 750x375. As we can see, our display resolution is 1.25 times that of our source material, or 25% larger, in both the horizontal and vertical direction. In other words, the display is tasked with inserting 25% more pixels in each direction. This is shown below:

First five pixels of one horizontal row from original source material:

{R}, {G}, {G}, {R}, {B},...

First five pixels (+ interpolated pixels) of the same horizontal row after upscaling:

{R}, {G}, {G}, {R}, {#}, {B},...

- where {R}, {G}, {B} are the original pixels and {#} is the generated pixel as a result of upscaling. It's important to note that the location of {#} is one of five possible positions: it could instead have been located before the first {R} pixel, or between {R} and {G}, or between the two {G} pixels, or between {G} and {R}. If it was located after the first {R} pixel, a second {#}, say {#2}, would be located after {B}.

Whether {#} will be {R}, {G} or {B} will depend on the algorithm used during the upscaling process and the algorithm that is used will in turn depend on the resolution of the source material. Ultimately though, the way it is generated is not important because in all situations we are assuming both of our theoretical displays utilize the same upscaling circuitry. What is important is the distribution of the new pixels that are generated.

In the first example the new pixels were generated by simply duplicating each pixel. This is the ideal upscaling situation as no part of the original image has changed. Each pixel has the same 'weighting' in the upscaled image. In the above case however, we have one new pixel generated for every four original pixels. This is not ideal because certain pixels from the original image will have a greater influence on the image quality in the upscaled image than other pixels from the original image.
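The uneven weighting can be made concrete by counting how many output pixels each source pixel is responsible for. This is a toy nearest-neighbour mapping (real scalers blend rather than duplicate), but the distribution of weights is the point:

```python
# Toy model of how source pixels are weighted when the display/source
# ratio is not a whole number. Each output pixel is assigned the source
# pixel it maps back onto; counting assignments exposes the unevenness.
def source_weights(src_width, dst_width):
    counts = [0] * src_width
    for x in range(dst_width):
        # Map this output pixel back to a source pixel (floor division).
        counts[x * src_width // dst_width] += 1
    return counts

# 4 source pixels -> 5 output pixels (the 1.25x ratio of the example):
print(source_weights(4, 5))
# → [2, 1, 1, 1]  (one pixel in four carries double weight)
```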

So the next logical question to ask ourselves is: what is the worst case scenario for upscaling? When do we have an upscaled image that is least like the original? Well, we know that outputting to a display that has a resolution equal to that of the source material produces the best image quality, with no upscaling required. Similarly, we know that displaying the same source material on a display of the same size but with a resolution that is a whole-number multiple of the source resolution provides the same image quality (again, assuming that the viewing distance to the displays is far enough that the SDE is not noticeable on the lower resolution display). Therefore, by this logic, the worst image quality should occur when dividing the display resolution by the source resolution leaves a fractional remainder of one half. This corresponds to our display resolution being 1.5 times our source resolution in the horizontal or vertical direction.

Again, we use a basic example to test this. We assume our source resolution is 600x300 and our theoretical display resolution is 900x450. As we can see, our display resolution is 1.5 times that of our source material, or 50% larger, in both the horizontal and vertical direction. In other words, the display is now tasked with inserting 50% more pixels in each direction. This is shown below:

First six pixels of one horizontal row from original source material:

{R}, {G}, {G}, {R}, {B}, {G},...

First six pixels (+ interpolated pixels) of the same horizontal row after upscaling:

{R}, {G}, {G#}, {G}, {R}, {R#}, {B}, {G}, {G#},...

At this point the resulting upscaled image is the most different it will be from the source image. Fewer interpolated pixels bring the upscaled image closer to the original, and more interpolated pixels bring the upscaled image closer to a whole-number multiple of the source material's resolution, which has already been shown to look identical to the source image.
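The "worst at 1.5x" claim can be sanity-checked with the same toy nearest-neighbour model: measure how unevenly the source pixels are weighted at various ratios between 1x and 2x. The variance of the per-pixel weights is zero at whole-number ratios and peaks at 1.5x:

```python
# Toy check of the worst-case claim: for scale ratios between 1x and 2x,
# compute the variance of per-source-pixel weights under a
# nearest-neighbour assignment of output pixels to source pixels.
def weight_variance(src_width, dst_width):
    counts = [0] * src_width
    for x in range(dst_width):
        counts[x * src_width // dst_width] += 1
    mean = sum(counts) / src_width
    return sum((c - mean) ** 2 for c in counts) / src_width

src = 600
for dst in (600, 750, 900, 1050, 1200):
    print(f"{dst / src:.2f}x  variance = {weight_variance(src, dst):.3f}")
# Variance is 0 at 1.00x and 2.00x and peaks (0.25) at 1.50x.
```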

Using the knowledge that the Wii outputs at 854x480 (internally rendering at 640x480) when outputting in 16:9, it's now just a matter of dividing either the vertical or horizontal resolution of each HDTV by the corresponding output resolution of the Wii and seeing which calculation results in a remainder that is closer to a half. So,

1080/480 = 2.25

768/480 = 1.6
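These two divisions can be checked in a few lines (taking the OP's 480-line figure as given). The panel whose fractional part lands closer to the worst-case value of 0.5 is the one expected to fare worse:

```python
# Compare each panel's vertical resolution against the 480-line source:
# the fractional part of the ratio closer to 0.5 is the worse case.
SOURCE_LINES = 480
for panel in (1080, 768):
    ratio = panel / SOURCE_LINES
    frac = ratio % 1
    print(f"{panel}: ratio {ratio:.2f}, fractional part {frac:.2f}, "
          f"distance from 0.5 = {abs(frac - 0.5):.2f}")
# 1080 lines: fractional part 0.25, distance 0.25 from the worst case.
#  768 lines: fractional part 0.60, distance 0.10 from the worst case.
```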

In conclusion, assuming two HDTVs are identical in every way except for their native resolution, we believe that the Wii will actually produce a higher quality image on the Full HD 1080p display.

Discuss.

sneals2000

·
Registered
Joined
·
7,958 Posts
AIUI the better quality HDTVs use much better upscaling algorithms - using resampling techniques that treat the original signal as Gaussian samples rather than block-based pixels, and thus deliver a smoother result.

In those cases the quality of the upscaling may play as big a part as the panel resolution, if not bigger. (Good scaling with a lower panel res may look cleaner than a higher res panel with poorer scaling.)

However, this may matter more with video (where you don't want to see the basic 'pixel' structure) than with some games (where pixel visibility is less of an issue?).
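The resampling idea described above can be sketched roughly as follows. This is only an illustration of treating samples as Gaussians (the kernel width `sigma` is an arbitrary choice here, and real scaler hardware is far more sophisticated):

```python
import math

# Sketch of Gaussian resampling for one row of sample values: treat each
# source sample as a Gaussian centred on its position, reconstruct a
# continuous signal as a normalized weighted sum, then sample that
# signal at each output pixel position.
def gaussian_resample(samples, out_len, sigma=0.5):
    n = len(samples)
    out = []
    for i in range(out_len):
        x = i * n / out_len          # output position in source coords
        num = den = 0.0
        for j, s in enumerate(samples):
            w = math.exp(-((x - j) ** 2) / (2 * sigma ** 2))
            num += w * s
            den += w
        out.append(num / den)
    return out

# Upscaling a hard edge from 4 to 6 samples smooths the transition
# instead of duplicating pixels the way nearest-neighbour would:
print(gaussian_resample([0.0, 0.0, 1.0, 1.0], 6))
```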

walford

·
Registered
Joined
·
16,749 Posts
Given that the screen sizes and viewing distances are also the same, then IMHO the lower the percentage of total pixel content that has to be "invented", the better the PQ will be. So I would pick the 1366x768 TV if the unit is ONLY going to be used for SD source content.

Ken H

·
Registered
LG 55" C9 OLED, Yamaha RX-A660, Monoprice 5.1.2 Speakers, WMC HTPC, TiVo Bolt, X1
Joined
·
45,683 Posts
Quote:
Originally Posted by sneals2000 AIUI the better quality HDTVs use much better upscaling algorithms - using resampling techniques that treat the original signal as Gaussian samples rather than block-based pixels, and thus deliver a smoother result. In those cases the quality of the upscaling may play as big a part, if not bigger, than the panel resolution. (Good scaling with a lower panel res may look cleaner than higher res panel with poorer scaling)
Agree.

kidicarus7

·
Registered
Joined
·
40 Posts
Discussion Starter

Quote:
Originally Posted by sneals2000 /forum/post/19635345

AIUI the better quality HDTVs use much better upscaling algorithms - using resampling techniques that treat the original signal as Gaussian samples rather than block-based pixels, and thus deliver a smoother result.

In those cases the quality of the upscaling may play as big a part, if not bigger, than the panel resolution.

That's an interesting point, and one that I was not aware of until now. It also introduces more questions; namely: is Gaussian sampling employed only on some 1366x768 panels, only on some 1920x1080 panels, or both?

More importantly, and this is really the main question, does the fact that some upscalers employ Gaussian sampling upscaling techniques affect the results in my OP? Assuming both panels employed Gaussian sampling and all other things were equal, with the only difference between them being resolution, would 16:9 SD content look better on the 1366x768 display or the 1920x1080 display?

I'm confident that the results would be the same, even when Gaussian sampling is used. The reason I say this is because the fundamental problem remains the same - the display needs to interpolate pixels. How the color of these interpolated pixels is determined isn't important, because in both cases the same upscaler utilizing the same interpolating algorithm would be used.

Quote:
Originally Posted by walford /forum/post/19635632

Given that the screen sizes and viewing distances are also the same then IMHO the lower the % of total pixel content that has to be "invented" then the better the PQ will be. So I would pick the 1366x768 TV if the unit is ONLY going to be used for SD source content.

You'd think so, wouldn't you? I thought the same thing initially, and before this investigation I actually advised my brother to purchase a 1366x768 panel for SD content rather than a 1920x1080 panel. But the results of the investigation prove the contrary.

walford

·
Registered
Joined
·
16,749 Posts
What are the results of your investigation?

I believe that your investigation assumed that the same upscaling algorithm used to invent the 1.5 megapixels needed to go from a 400K-pixel SD image to a 2.1-megapixel 1080p display would also be used to go from 400K SD pixels to a 1.1-megapixel 1366x768 display. In fact, today they use very different algorithms for these two scenarios in order to try not to create washed-out images on the 1080p displays.

kidicarus7

·
Registered
Joined
·
40 Posts
Discussion Starter
That's correct. My investigation assumes everything except the display resolution is equal between the two displays, including the algorithm used during the upscaling process.

Ken H

·
Registered
LG 55" C9 OLED, Yamaha RX-A660, Monoprice 5.1.2 Speakers, WMC HTPC, TiVo Bolt, X1
Joined
·
45,683 Posts

Quote:
Originally Posted by kidicarus7 /forum/post/19637853

That's correct. My investigation assumes everything except the display resolution is equal between the two displays, including the algorithm used during the upscaling process.

That's an incorrect assumption.

kidicarus7

·
Registered
Joined
·
40 Posts
Discussion Starter

Quote:
Originally Posted by Ken H /forum/post/19638347

That's an incorrect assumption.

Indeed. But let's assume the same algorithm is used, even though it's not in reality. Is there any reason why my conclusion in the OP would be incorrect?

walford

·
Registered
Joined
·
16,749 Posts
Your conclusions are not correct, since the higher the percentage of "invented" pixels, the more washed out the image becomes, creating what some call "clay-faced" content.

No TVs use line duplication or even simple interpolation to upscale from one resolution to another.

The networks use dedicated upscaling video processing hardware, costing many thousands of dollars, containing very sophisticated upscaling algorithms.

sneals2000

·
Registered
Joined
·
7,958 Posts

Quote:
Originally Posted by walford /forum/post/19645414

The networks use dedicated upscaling video processing hardware, costing many thousands of dollars, containing very sophisticated upscaling algorithms.

Yep - and no. Depends on the content.

Many broadcasters upscale SD content to HD using relatively low-cost, though still high-quality, upconverters if you have permanently interlaced or permanently progressive sources.

Things get trickier at the network level - where you have to cope with much more complicated content (where you may have a mix of progressive and interlaced content on-screen simultaneously) and in the US there is also 3:2 content to contend with as well.

The trickier process is actually often DOWN-scaling from HD to SD. Get that wrong and the SD content looks awful. You need good filtering to avoid aliasing. There are a lot of lousy downconverters out there.

walford

·
Registered
Joined
·
16,749 Posts
Very interesting, another reason not to buy a 720p TV if you are going to watch any 1080i or 1080p content.

gbynum

·
Registered
Joined
·
903 Posts

Quote:
Originally Posted by walford /forum/post/19663426

Very interesting, another reason not to buy a 720p TV if you are going to watch any 1080i or 1080p content.

My Sony 720p RP does a great job with 1080i. I have the Blu-ray output at 1080i too, but when I installed it a year or so back, I saw no difference vs 720p. The set won't take 1080p in, so I didn't try that.
