
# Fake 1080i - Page 6

It does look a hell of a lot better than SD though. I'm watching a football game broadcast in standard 480i and it looks like ass, lol. So there sure is a big difference between 480i and 480p. Going by that dramatic difference, one gets a pretty good idea of how much better 1080p would look than 1080i, provided that the source material is 1080p native of course.
Quote:
 Originally Posted by Wickerman1972 It does look a hell of a lot better than SD though. I'm watching a football game broadcast in standard 480i and it looks like ass, lol. So there sure is a big difference between 480i and 480p. Going by that dramatic difference, one gets a pretty good idea of how much better 1080p would look than 1080i, provided that the source material is 1080p native of course.
This is a pretty unfair comparison (480i football). They use SD cameras. The biggest difference between HD and SD is on the source equipment.

Watch the FOX NFL game each week that is not in HD, or MLB baseball on FOX this year. That is SD 480p, and to me there is not much difference between it and normal 480i.
I did a little bit of math to try and figure out exactly what my display is doing. I came up with these figures by assuming that the xbr970 is 853 x 1,080 interlaced, which seems to me to be the most accurate argument I've heard:

So, with a resolution of 853 x 1,080, this set has a total of 921,240 pixels. But because it is interlaced, only 460,620 pixels get refreshed each 1/60th of a second. A 480p signal has 345,600 pixels, all of them refreshing every 1/60th of a second. So 1080i on this TV would have 115,020 more refreshed pixels displayed per 1/60th of a second than 480p.

If you want to count all the pixels in both fields of the 1080i signal, it would have 575,640 more pixels than 480p. But I don't think that is an accurate way of looking at it. When I look at 480p and 1080i on this TV I certainly don't see over 500,000 pixels of difference, probably because a lot of those pixels are a field old and cause a blurring effect. The 115,020 difference seems more accurate to me and closer to what I'm physically seeing.

If this TV had the resolution it is supposed to, that being 1,920 x 1,080 interlaced, it would have 2,073,600 pixels total and 1,036,800 pixels refreshed every 1/60th of a second. That would be 691,200 more refreshed pixels than 480p per 1/60th of a second, which would be quite a difference and would look significantly better than 480p. But because this TV is only 853 x 1,080, it is missing 576,180 refreshed pixels per 1/60th of a second of a true 1080i signal. DAMN! It is missing more of a 1080i signal than what it is displaying, which is 460,620 refreshed pixels per 1/60th of a second.

And so do you want to know what the difference is between the xbr970 and a 1080p flat panel when it comes to sharpness? 1080p has 1,612,980 more pixels per picture refresh than the xbr970. If you want to count both interlaced fields of the xbr970's picture rather than just one, that would still be 1,152,360 more pixels for a 1080p HDTV than the xbr970. Wow, that is a huge difference imo.
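For anyone who wants to check the arithmetic above, here is a minimal sketch that re-derives every figure in the post. It assumes the set's effective raster is 853x1080 interlaced and that 480p means 720x480, as in the post; the variable names are just for illustration.

```python
# Re-derive the pixel counts from the post above.
# Assumptions: the xbr970 resolves 853x1080 interlaced, 480p is 720x480.

def pixels(w, h):
    return w * h

xbr970_total = pixels(853, 1080)   # 921,240 pixels on screen
xbr970_field = xbr970_total // 2   # 460,620 refreshed per 1/60th s
sd_480p      = pixels(720, 480)    # 345,600, all refreshed per 1/60th s
full_1080i   = pixels(1920, 1080)  # 2,073,600 for a true 1080i raster
full_field   = full_1080i // 2     # 1,036,800 refreshed per 1/60th s

print(xbr970_field - sd_480p)      # 115,020 extra refreshed pixels vs 480p
print(xbr970_total - sd_480p)      # 575,640 counting both fields
print(full_field - sd_480p)        # 691,200 for a true 1080i set vs 480p
print(full_field - xbr970_field)   # 576,180 "missing" from a true 1080i signal
print(full_1080i - xbr970_field)   # 1,612,980: 1080p frame vs one xbr970 field
print(full_1080i - xbr970_total)   # 1,152,360: 1080p frame vs both fields
```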
Quote:
 Originally Posted by Wickerman1972 So, with a resolution of 853 X 1,080 that would mean this set has a total of 921,240 pixels. But because it is interlaced a maximum of 460,620 pixels could be displayed at any given time.
No, you still have 921,240 pixels on the screen at any given time, half of them are just 1 field older.
Quote:
 Originally Posted by primetimeguy No, you still have 921,240 pixels on the screen at any given time, half of them are just 1 field older.
Hmmm, I'm not really sure. But that's why I presented numbers for both viewpoints. I'm not entirely certain exactly how the interlacing works. But there is no doubt in my mind that I'm not seeing a 500,000+ pixel difference between 480p and 1080i on this TV.
Here is a link with an example of interlacing. There are probably better ones, but look at the examples where an interlaced display is shown. Note how both the odd and even lines are always displayed, just 1/60th of a second apart in time.

http://neuron2.net/LVG/interlacing.html
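A toy sketch of what that page illustrates: each 1/60th-of-a-second field rewrites only the odd or even lines, so the full frame is always on screen but half the lines are one field older. The function and frame names here are made up for illustration, not taken from the linked page.

```python
# Toy model of interlaced scanning: a field update touches only
# every other line, leaving the remaining lines one field stale.

def update_field(screen, new_frame, field):
    """Overwrite only the even (field=0) or odd (field=1) lines."""
    for y in range(field, len(screen), 2):
        screen[y] = new_frame[y]
    return screen

lines = 6
screen = [("frame0", y) for y in range(lines)]   # what is on screen now
frame1 = [("frame1", y) for y in range(lines)]   # the next incoming frame

update_field(screen, frame1, 0)  # even lines now show frame1
print(screen)                    # odd lines still show the older frame0
```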
Quote:
 Originally Posted by primetimeguy Here is a link with an example of interlacing. There are probably better ones, but look at the examples where an interlaced display is shown. Note how both the odd and even lines are always displayed, just 1/60th of a second apart in time.
OK, I edited the post to account for your correction. Do you think it is more accurate now?
Quote:
 Originally Posted by Wickerman1972 But there is no doubt in my mind that I'm not seeing a 500,000+ pixel difference between 480p and 1080i on this TV.
That's because, as RWetmore enlightened us, you lose clarity due to interlaced scanning. These sets are 853x1080i, but due to native interlaced scanning we only see an effective 600-800 lines out of the nominal 1,080 lines of vertical resolution.
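That 600-800 figure is roughly what you'd expect from the commonly cited Kell/interlace factor of around 0.7, which discounts nominal line count for perceived resolution. The exact factor varies by display and source, so treat this as a rough sanity check rather than a measurement:

```python
# Rough sanity check of the "600-800 effective lines" figure using
# a Kell/interlace factor in the commonly cited 0.6-0.8 range.
nominal_lines = 1080
for factor in (0.6, 0.7, 0.8):
    print(factor, round(nominal_lines * factor))  # 648, 756, 864
```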
Quote:
 Originally Posted by Wickerman1972 Hmmm, I'm not really sure. But that's why I presented numbers for both viewpoints. I'm not entirely certain exactly how the interlacing works. But there is no doubt in my mind that I'm not seeing a 500,000+ pixel difference between 480p and 1080i on this TV.
Being able to see all of the pixels is a different matter. I was just stating that with interlaced signals you still have all 1,080 lines on the display at any given time; 540 of them are just 1/60th of a second older. Now, your TV may not be able to resolve all 1,080 by the time the signal gets to the screen, and there is some overlap of the lines, so you cannot see them all.
Quote:
 Originally Posted by vazel That's because, as RWetmore enlightened us, you lose clarity due to interlaced scanning. These sets are 853x1080i, but due to native interlaced scanning we only see an effective 600-800 lines out of the nominal 1,080 lines of vertical resolution.
I agree.
Another link on how interlacing works.

http://www.hometheaterhifi.com/volum...e-10-2000.html
Quote:
 Originally Posted by vazel That's because, as RWetmore enlightened us, you lose clarity due to interlaced scanning. These sets are 853x1080i, but due to native interlaced scanning we only see an effective 600-800 lines out of the nominal 1,080 lines of vertical resolution.
Yeah, I edited the post to make it more accurate. I wanted to do the math to see exactly what the sharpness difference between this TV and a 1080p flat panel is. I knew there was quite a difference, but the numbers actually kind of surprised me. The difference is huge.
Quote:
 Originally Posted by Wickerman1972 Yeah, I edited the post to make it more accurate. I wanted to do the math to see exactly what the sharpness difference between this TV and a 1080p flat panel is. I knew there was quite a difference, but the numbers actually kind of surprised me. The difference is huge.
But I think what you are finding is there is a lot more to a good picture than just resolution. For instance, I'd be willing to bet that a display with a 25% better contrast ratio but 25% less resolution would look better than a display with 25% worse contrast ratio and 25% more resolution.
Quote:
 Originally Posted by primetimeguy But I think what you are finding is there is a lot more to a good picture than just resolution. For instance, I'd be willing to bet that a display with a 25% better contrast ratio but 25% less resolution would look better than a display with 25% worse contrast ratio and 25% more resolution.
Perhaps. But I want it all, heh heh. I want the resolution, color, contrast, blacks...all of it! Looks like SED will be the only thing that will deliver that, but Toshiba and Canon are taking freaking forever developing it. :( First I heard they'd be out in 2006, then 2007, and now this article says we'll have to wait until 2008.

http://gear.ign.com/articles/736/736795p1.html
Wickerman,

Someone else asked a similar question here....