
What really is the "unit" pq improvement of hd material?

1991 Views 34 Replies 14 Participants Last post by Robert SawyerIII
It seems to be commonly accepted that the pq of hd is certainly better than the pq of sd. We have also seen numerous topics which put the sd and hd versions of the same scene (both normalized to the larger size) back to back. Depending on the material, the comparison may be close with a slight edge to the hd sample, and other times the hd looks markedly better than the upsampled sd. This difference is fairly obvious and fairly representative of what happens when either kind of material is viewed on a real display of a given size.


Now here is a weird question... is the pq of hd really better than sd on a pixel-by-pixel basis, or is the pq better simply because there are more pixels? Does the sd look poor when upsampled simply because the upsampling stretches its pixel density too thin, rather than because the inherent pq of the material is poor relative to hd? If you took a small tile of the hd sample and upscaled it the same way you would for sd==>hd, would that tile not look similarly crappy? Now here's a confounding scenario... is it plausible that a 1080p image of hd material is not much more than a series of 480p "tiles" put together to make a larger picture? Could one distinguish between the two scenarios?


All of this would suggest that the "unit" pq has remained the same going from sd to hd. It's just that hd contains more pixels (yeah, I know, duh) to retain the same detail quality on larger screens. Looking at it another way, viewing sd on a small computer screen can be essentially an "hd quality" rendition similar to viewing hd on a large hdtv display. The former is just a smaller experience.


Where am I going with this? I don't know, really. I was just wondering if this would be of interest to anybody else to wrap their head around, and what are their thoughts...
Well, I'm not sure I follow your argument, but here's a simple (minded) test for determining PQ. Look at the end of a DVD where they roll the credits on an SD monitor and determine what you can read. Now get a blue-laser variant of that DVD and perform the same test on a 1080 HD monitor. Now what do you see?


--- CHAS
Is this a trick question? I once had a teacher ask the class what would have happened if they had had jets in WW1...


That said, SD DVD was MPEG-2...


Releases varied from poor all the way to "as good as it gets", but it is still 1/6th the resolution of HD, so the point is a bit moot...

Quote:
Originally Posted by Mr. Hanky /forum/post/0


It seems to be commonly accepted that the pq of hd is certainly better than the pq of sd. We have also seen numerous topics which put the sd and hd versions of the same scene (both normalized to the larger size) back to back. Depending on the material, the comparison may be close with a slight edge to the hd sample, and other times the hd looks markedly better than the upsampled sd. This difference is fairly obvious and fairly representative of what happens when either kind of material is viewed on a real display of a given size.


Now here is a weird question... is the pq of hd really better than sd on a pixel-by-pixel basis, or is the pq better simply because there are more pixels? Does the sd look poor when upsampled simply because the upsampling stretches its pixel density too thin, rather than because the inherent pq of the material is poor relative to hd? If you took a small tile of the hd sample and upscaled it the same way you would for sd==>hd, would that tile not look similarly crappy? Now here's a confounding scenario... is it plausible that a 1080p image of hd material is not much more than a series of 480p "tiles" put together to make a larger picture? Could one distinguish between the two scenarios?


All of this would suggest that the "unit" pq has remained the same going from sd to hd. It's just that hd contains more pixels (yeah, I know, duh) to retain the same detail quality on larger screens. Looking at it another way, viewing sd on a small computer screen can be essentially an "hd quality" rendition similar to viewing hd on a large hdtv display. The former is just a smaller experience.


Where am I going with this? I don't know, really. I was just wondering if this would be of interest to anybody else to wrap their head around, and what are their thoughts...


Your comments on per-unit improvement intrigued me. So I did a little research and math on the pixels per inch of some SD and HD sets. Needless to say, I was surprised at the results. Let me preface this reply with the disclaimer that I am far from an expert on HDTVs. But here's what I found regarding part of your post. These sizes and resolutions were chosen as they are the ones I am most familiar with and seem to be pretty common.


27" 4:3 SDTV screen viewing area = 349.92 Square inches

50" 16:9 HDTV screen viewing area = 1068.2 Square inches

60" 16:9 HDTV screen viewing area = 1537.62 Square inches



27" SDTV = 640x480 = 307200 pixels

50" HDTV = 1280X720 = 921600 pixels

50" HDTV = 1365x768 = 1048320 pixels

60" HDTV = 1365x768 = 1048320 pixels

60" HDTV = 1920x1080 = 2073600 pixels


27" 480p SDTV = 877 pixels per inch

50" 720p HDTV = 863 pixels per inch

50" 768p HDTV = 981 pixels per inch

60" 768p HDTV = 682 pixels per inch


60" 1080p HDTV = 1349 pixels per inch


From what I can tell a 50" 1280x720 set has about the same pixels per inch as a 27" SDTV. And a 60" 1365x768 actually has about 22% FEWER pixels per inch than a 27" SDTV! (That was a surprise.) Where I see an obvious improvement in the pixel-per-inch count is on the 1080i/p sets vs a 27" SD set: the 1080i/p sets have roughly 54% more ppi than the 27" set. So from this I would assume that you are right about SD on a smaller set possibly looking as good as HD (non-1080) on a larger set, because you're getting pretty similar or even more ppi on the smaller set.



So it would seem that on most non-1080 sets the improvement in PQ for HDTV is not so much due to an increase in pixels per inch as it is due to the increase in the size of the set while maintaining a reasonable ppi ratio. Obviously there are a lot of other factors that come into play regarding overall HD PQ, such as color space, pixel size, seating distance, display tech, etc. But this could explain why a lot of HD programming looks pretty comparable to clean SD on nice smaller SD sets, or why a small SD television might look like it is HD quality.


Of course my numbers could be messed up but it is interesting.
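
If anyone wants to sanity-check the arithmetic, here's a quick Python sketch of the same calculation (the sizes and resolutions are just the ones listed above):

Code:

import math

def ppsi(diagonal, aspect_w, aspect_h, h_pixels, v_pixels):
    # pixels per square inch for a screen of the given diagonal and aspect ratio
    d = math.hypot(aspect_w, aspect_h)
    width = diagonal * aspect_w / d      # viewable width in inches
    height = diagonal * aspect_h / d     # viewable height in inches
    return (h_pixels * v_pixels) / (width * height)

sets = [
    ('27in 4:3 480p',   27,  4, 3,  640,  480),
    ('50in 16:9 720p',  50, 16, 9, 1280,  720),
    ('50in 16:9 768p',  50, 16, 9, 1365,  768),
    ('60in 16:9 768p',  60, 16, 9, 1365,  768),
    ('60in 16:9 1080p', 60, 16, 9, 1920, 1080),
]

for name, diag, aw, ah, w, h in sets:
    print('%-16s %5.0f px per sq in' % (name, ppsi(diag, aw, ah, w, h)))

That prints roughly 878, 863, 981, 682, and 1348, which matches the figures above to within rounding.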


Cheers
Quite frankly, this question doesn't make any sense.

Quote:
Originally Posted by ChrisWiggles /forum/post/0


Quite frankly, this question doesn't make any sense.

I thought the basic gist of the question was whether HDTV actually has more detail per square inch, or whether the image is just bigger with the same detail per square inch, so that most of the added benefit comes from the larger screen size rather than from more resolution per inch compared to SDTV. I may have read into the question too much, though.
YOU, my friend, nailed it on this topic!
You get a big cookie with any fixin's you desire!!!


(just for clarity, the units on your final results should be "pixels per sq in", though)


The 720/768p spec does seem to be most vulnerable to the "unit pq" idea, by these metrics.


I'm guessing that the 1080p specs would become vulnerable too if you included 480p on a 20" computer monitor in the comparison.
1080p on medium-size screens (sub-50"?) does seem to be the sweet spot to really knock the pixel density spec out of the park. I won't comment on ultra-large projections...

Quote:
Originally Posted by paradigm20s /forum/post/0


I thought the basic gist of the question was whether HDTV actually has more detail per square inch, or whether the image is just bigger with the same detail per square inch, so that most of the added benefit comes from the larger screen size rather than from more resolution per inch compared to SDTV. I may have read into the question too much, though.

Well, sure, but that has no bearing on anything. SD and HD do not have size dimensions. Picking some small size for SD is arbitrary, as is picking some large size for HD. You could just as easily pick a 100" SD size and a 30" HD size, and the latter would have oodles more pixels per square inch.


Add to that the fact that this doesn't take viewing ratio into account, and it's really not a very useful query. I mean, it sort of makes sense if you don't think about it, but if you actually think about it, it's not a useful question because it has no meaning. One can easily choose sizes such that the SD and HD density per square inch are the same, or SD is higher, or HD is higher, and then we could also choose different viewing distances, which further complicates the question and makes it even less useful.
Simple answer: 1920x1080 has 6x as many pixels as 720x480.


Also, I'd say on average the pixels themselves in our VC-1 encodes are more accurate than DVD MPEG-2.

Quote:
Originally Posted by ChrisWiggles /forum/post/0


Well, sure, but that has no bearing on anything. SD and HD do not have size dimensions. Picking some small size for SD is arbitrary, as is picking some large size for HD. You could just as easily pick a 100" SD size and a 30" HD size, and the latter would have oodles more pixels per square inch.

I think we were kind of using average HDTV and SD sizes that a typical HDTV buyer might purchase. 100" isn't quite the average size HDTV that people are buying, as far as I know. (I could be wrong.)

Quote:
Originally Posted by ChrisWiggles /forum/post/0


Add to that the fact that this doesn't take viewing ratio into account, and it's really not a very useful query. I mean, it sort of makes sense if you don't think about it, but if you actually think about it, it's not a useful question because it has no meaning. One can easily choose sizes such that the SD and HD density per square inch are the same, or SD is higher, or HD is higher, and then we could also choose different viewing distances, which further complicates the question and makes it even less useful.

Well, here is what I was thinking. I have been trying to figure out why, when I watch HDTV on my new 50" set, it really only looks a little better (still better, just not night and day) than when I watch the same program in SD on my old SDTV. I don't see any reasonable explanation other than that, per inch, they are about the same resolution.


You obviously have more experience with this stuff than I do. Maybe you can answer this question for me:


Given a constant seating distance, what resolution increase do I need to have on a 50" HDTV to get the same PQ I am getting out of a 27" SD TV set?



Wouldn't there be a certain amount of total resolution increase needed in order to maintain a similar PQ to that of a smaller SD set as you move up in size, given a constant seating distance? I think that is what I am trying to figure out.



Like I said I am no HDTV expert. Just looking to discuss the impact of resolution increases in relation to screen size.

Cheers
The larger 1080p HDTVs (>60") may have less pixel density than 30" 480p SDTVs; however, since there are 6 times as many pixels, they allow for greater detail.


Angelina Jolie with six times as many pixels

Quote:
The larger 1080p HDTVs (>60") may have less pixel density than 30" 480p SDTVs; however, since there are 6 times as many pixels, they allow for greater detail.

Yeah, I don't think anyone is debating that. 1080p is sweet on a 60". Just trying to figure out how many pixels you would need to get the same perceived PQ (resolution-wise) on a 60" set that a smaller 27" SDTV has.

Quote:
Angelina Jolie with six times as many pixels

Yummmmmmmy!

Quote:
Originally Posted by paradigm20s /forum/post/0


Given a constant seating distance, what resolution increase do I need to have on a 50" HDTV to get the same PQ I am getting out of a 27" SD TV set?



Wouldn't there be a certain amount of total resolution increase needed in order to maintain a similar PQ to that of a smaller SD set as you move up in size, given a constant seating distance? I think that is what I am trying to figure out.



Like I said I am no HDTV expert. Just looking to discuss the impact of resolution increases in relation to screen size.

Cheers

Well of course, but you are assuming both an increase in absolute screen size, AND no change in viewing distance, or in other words you are changing the viewing ratio.


If you have a format with greater resolution, you can achieve two things: 1) greater detail, or 2) a larger image, or a combination of the two. Because HD has greater resolution, it will satisfy viewers at larger viewing ratios compared to lower-definition images. Now, you could do the math as someone did above, but I'm not sure what the purpose is, unless you are trying to determine what kind of viewing ratios you would like to view different content at.
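
If you literally wanted to run that math for the 27" vs 50" question above, here's a rough sketch. It treats "same PQ" as nothing more than "same pixels per square inch", which ignores viewing ratio, source quality, MTF, and everything else raised in this thread:

Code:

import math

def area(diagonal, aw, ah):
    # viewable screen area in square inches from diagonal and aspect ratio
    d = math.hypot(aw, ah)
    return (diagonal * aw / d) * (diagonal * ah / d)

sd_density = (640 * 480) / area(27, 4, 3)   # about 878 px per sq in
needed = sd_density * area(50, 16, 9)       # pixel count for the same density at 50in

height = math.sqrt(needed * 9 / 16)         # solve w*h = needed with w/h = 16/9
width = height * 16 / 9
print('roughly %.0f x %.0f' % (width, height))

That works out to roughly 1290 x 726, i.e. about 720p, which lines up with the pixel-density table earlier in the thread.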
Also keep in mind that while HD (say 1080p) has 6 times the pixels, it may or may not have contrast (MTF) at the highest frequencies those pixels can represent. If an HD picture has been poorly telecined, shot through a vaseline-covered lens, filtered, or compressed to excessively low bit rates, then most of the contrast at higher frequencies may have been lost. In those cases there will be little advantage to all the extra pixels; they are just there for advertising big numbers but add little to the picture that good upscaling couldn't.


Basically, HD is good HD only when care is taken all the way through the chain to make it good. Otherwise artifacts are added and high-frequency detail may be lost at each step, and the result is cumulatively yucky. (technical phrase)


- Tom
That's an excellent point, as the "6x" argument has been cited multiple times in this very topic, w/o any further consideration beyond that. It should be recognized by all those concerned here that the "6x" part is merely a theoretical spec, not an inherent guarantee of the performance of whatever material occupies that "6x" medium. I guess this is a way of saying that there can be a "quality" to the pixels, themselves, apart from whether or not they come in a pack of 480p, 720p, or 1080p.

Quote:
Originally Posted by Mr. Hanky /forum/post/0


All of this would suggest that the "unit" pq has remained the same going from sd to hd. It's just the hd contains more pixels (yeah, I know- duh) to retain the same detail quality on larger screens. Looking at it another way, viewing sd on a small computer screen can be essentially an "hd quality" rendition similar to viewing hd on a large hdtv display. The former is just a smaller experience.

Like most others here, I am utterly confounded by what you're trying to ask. You seem to be assuming that watching SD material is always going to be on a smaller screen than watching HD material, but there's no correlation at all in screen size between formats. A TV of any format can be any size you want to buy. You have to assume at least a base-line comparison of watching the two discs on the same screen.


With that said, the greater pixel count of HD material allows for improved rendering of fine object detail. For example, the skin pores or individual strands of hair on an actor's head can be sharply defined in HD while they look indistinct in SD.
Imagine taking a decent looking DVD and upscaling it to 1080p, then making a hidef DVD of it. It would now have 6 times the number of pixels but AT BEST ZERO extra visibility of skin pores. I realize this is a very contrived example but it does show what you get when you have lower resolution input.


At best, HD can indeed have 6x the effective resolution. But it can also be as bad as a bad DVD if you make it that way. And if you start with the same master as a DVD it cannot reach 6x effective resolution simply because no MTF curves are flat out past 720x480. That is, there are diminishing returns of extra detail (MTF, sharpness) per pixel unless you can start with a much higher resolution scan and properly downsample. (computer graphics excluded)


- Tom
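
To put a toy number on Tom's point, here is a contrived sketch that treats one scan line as a 1D signal and uses band-limited resampling (scipy) as a stand-in for a clean mastering chain; it is purely illustrative, not anyone's actual workflow. Fine detail at 400 cycles per picture width can be represented by 1080 samples but sits above the Nyquist limit of 480 samples, so once the line has been taken down to 480 the detail is gone for good, no matter how it is upscaled afterwards:

Code:

import numpy as np
from scipy.signal import resample

# one scan line: coarse detail (30 cycles) plus fine detail (400 cycles)
x = np.linspace(0, 1, 1080, endpoint=False)
scene_1080 = np.sin(2 * np.pi * 30 * x) + 0.3 * np.sin(2 * np.pi * 400 * x)

sd_480 = resample(scene_1080, 480)        # band-limited "SD master"
upscaled_1080 = resample(sd_480, 1080)    # the same master blown back up to 1080

def fine_detail(sig):
    # amplitude of the 400-cycle component (FFT bin 400)
    return 2 * abs(np.fft.rfft(sig)[400]) / len(sig)

print('native 1080 :', round(fine_detail(scene_1080), 3))     # ~0.3
print('upscaled 480:', round(fine_detail(upscaled_1080), 3))  # ~0.0

The upscaled line has the same number of samples as the native 1080 one but none of the fine detail; the extra pixels only help if the chain preserved contrast at those frequencies in the first place.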

Quote:
Imagine taking a decent looking DVD and upscaling it to 1080p, then making a hidef DVD of it. It would now have 6 times the number of pixels but AT BEST ZERO extra visibility of skin pores. I realize this is a very contrived example but it does show what you get when you have lower resolution input.

Put in layman's terms, Density vs Detail. Resolution increases density over a given screen size but not necessarily detail.


New encodes from the masters bring additional detail for the resolution to make visible. Then there is the color increase.
Again, one question:




Given a constant seating distance, what resolution increase do I need to have on a 50" HDTV to get the same PQ I am getting out of a 27" SD TV set?
480i blown up on most 50" screens looks worse than it does on a smaller SD set because of the increase in size with no increase in input signal resolution. I have to believe that you need an increase in resolution as the set increases in size in order to maintain the same PQ.


The reason I assume a fixed seating position is that most people don't rearrange their living rooms and seating positions when they buy an HDTV, so I am just trying to figure out at what broadcast resolution and HDTV resolution I would get the same perceived image quality that I get on a smaller SDTV with an SD signal.
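
One way to put a number on the constant-seating-distance version of this is pixels per degree of viewing angle. Here's a rough sketch, with an assumed 8 ft (96") couch distance purely as an example:

Code:

import math

def pixels_per_degree(h_pixels, diagonal, aw, ah, distance):
    # horizontal pixels per degree of viewing angle at the given seat distance
    width = diagonal * aw / math.hypot(aw, ah)
    fov = 2 * math.degrees(math.atan(width / 2 / distance))
    return h_pixels / fov

seat = 96  # inches; an assumed example distance, not a recommendation
print(round(pixels_per_degree( 640, 27,  4, 3, seat)))   # 27in 480p  -> about 50
print(round(pixels_per_degree(1280, 50, 16, 9, seat)))   # 50in 720p  -> about 50
print(round(pixels_per_degree(1920, 50, 16, 9, seat)))   # 50in 1080p -> about 75

By that crude measure, a 50" 720p set at the same couch distance delivers about the same angular pixel density as a 27" 480p set; it takes the 1080 formats to actually raise it.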