Originally Posted by HorrorScope
Then I come in here for the conclusive layman part of all of this...
1080i content output to a fixed 1080 panel equals 1080p.
The 1080p panel gets the data as 1080i, rids it of the "i" and makes it a "p", because it has to display its full rez at once; call it a perk of a fixed panel, magic if that makes you feel better.
In this case 1080i = 1080p and the rest is marketing trickery. Yes? So the A1's 1080i to your 1080p display is 1080p; no need to wait for a 1080p-output HD DVD player. Your Sammy, starting with 1080p, going to 1080i, then out, still in the end becomes 1080p on your 1080p display. Correct?
And to get deep into this (when talking to J6P), the only benefit of 1080p coming out of the player would be the player doing the progressive work instead of the display. But with that said, no one has any example of equipment where having the player do this instead of the display results in a better picture. Correct? The part the display does now, taking "i" and making it "p", is standard, and all of them do it. Correct?
I'm pretty sure that is what this whole thread boils down to. Same goes for broadcast HD coming in at 1080i to my 1080 fixed panel. I might as well just drop the "i" and say I have 1080 P(eriod) going to my display, because it is the full 1920x1080. Not 1920x540, not a fuzzy picture, not a flickering picture, just pure unadulterated 1080 to my display. Correct? 1080i content = 1080p content when you use a 1080 fixed-panel display.
I don't think you are correct... Easy to see why people get so confused on this topic.
First off, 1080i as a standard has only one resolution, 1920 x 1080 pixels, and that is exactly the same as 1080p.
Obviously, 1080i sends the picture in two halves, one half at a time. 1080p sends the whole picture in one hit.
Let's assume that the rate we're sending 1080i at is 60Hz; this means that 1080i can transmit 30 whole frames in one second, where a frame is composed of two halves (fields).
Now, movies shot on film have a native frame rate of 24 frames per second. A process called 3:2 pulldown ups this to 30 frames per second for 'easier' display. This can therefore be transmitted over 1080i at 60Hz by sending each half of each frame consecutively.
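To make the numbers concrete, here's a quick Python sketch of the 3:2 cadence (the function name and the way fields are represented are purely illustrative, not any real video API):

```python
# Sketch of 3:2 pulldown: 24 film frames/s spread across 60 fields/s.
# Alternate frames are held for 3 fields and 2 fields respectively,
# so every 4 film frames become 10 fields (24 fps * 10/4 = 60 fields/s).

def pulldown_32(frames):
    fields = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2  # the 3-2-3-2... cadence
        fields += [frame] * repeats
    return fields

film = ["A", "B", "C", "D"]   # 4 film frames = 1/6 s at 24 fps
print(pulldown_32(film))      # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

Note every field still comes from a complete film frame; that's what makes the reassembly below possible.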
The key point here is that as the picture you are sending is 30 fps, the two halves of the picture sent in interlaced format are two halves of the same frame, that frame having been recorded at one moment in time...
A good de-interlacer will recognise that in this instance you can put the two halves of the picture back together to build a full frame with virtually no loss of information or resolution. This only applies to sources recorded at 30fps or lower, one whole frame at a time, which is exactly what film does.
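As a toy illustration (a tiny made-up "frame" of numbers, not real video code), weaving two fields that came from the same progressive frame back together is lossless:

```python
# A tiny 4x4 "frame" of pixel values; a real frame would be 1920x1080.
frame = [[10 * y + x for x in range(4)] for y in range(4)]

top_field    = frame[0::2]   # even-numbered lines (half the picture)
bottom_field = frame[1::2]   # odd-numbered lines (the other half)

# Weave de-interlacing: interleave the two fields back into one frame.
rebuilt = [None] * len(frame)
rebuilt[0::2] = top_field
rebuilt[1::2] = bottom_field

print(rebuilt == frame)      # True: nothing was lost
```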
However, much broadcast HDTV at 1080i is captured by the camera in 1080i. What this means is that the camera captures 60 half-frames (fields) per second; it never captures whole frames. This is very important: unlike a film source, where halves 1 and 2 make up whole frame 1, captured at the same instant in time, fields 1 and 2 in an interlaced capture each represent half of a picture captured at two different instants in time...
To try and explain that better: the first half-frame represents half of what the camera saw at 1/60th of a second, and the second half-frame represents half of what the camera saw at 2/60ths of a second. If the object the camera was looking at has moved between 1/60th and 2/60ths of a second, then you can't simply put the two halves together to make one complete frame: you get funny lines known as 'combing', because they look like a comb.
So, when faced with this type of 1080i signal, the de-interlacer has a very different job to do to create a full frame to display. It then potentially gets very complicated.
Clearly, if you capture the same scene using 1080p at 60Hz, you get 60 full frames per second containing all the information. However, this generates twice the data and therefore requires much more bandwidth to transmit, so no one broadcasts it.
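A back-of-envelope comparison of the raw, uncompressed pixel rates makes the "twice the data" point obvious (the 1.5 bytes/pixel figure is an assumed rough value for 8-bit 4:2:0; the 2x ratio is what matters):

```python
W, H = 1920, 1080
BYTES_PER_PIXEL = 1.5   # assumed rough figure for 8-bit 4:2:0 video

# 1080i at 60Hz delivers 60 fields/s, i.e. 30 full frames' worth of pixels.
rate_1080i60 = W * H * BYTES_PER_PIXEL * 30
# 1080p at 60Hz delivers 60 full frames per second.
rate_1080p60 = W * H * BYTES_PER_PIXEL * 60

print(rate_1080p60 / rate_1080i60)   # 2.0 -- twice the raw data
```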
So, 1080i definitely does not equal 1080p. However, in those instances where the full frame rate of the source can be split in two and transmitted within the half-frame rate of 1080i, it is possible to get picture quality from 1080i that is virtually indistinguishable from 1080p... providing whatever does the de-interlacing recognises the incoming signal properly. Not always the case...
I have to disagree with Toke on his assertion that 1080p broadcast will be in Europe soon. Standards bodies have looked at it and done tests, but that doesn't mean it will happen anytime soon: the bandwidth doesn't yet exist, and none of the HDTV infrastructure that the likes of BSkyB in the UK are investing in could handle it, and they only started rolling out the service a few months ago...