Originally Posted by Ian_S
I don't think you are correct... Easy to see why people get so confused on this topic.
First off, 1080i as a standard has only one resolution: 1920 x 1080 pixels, which is exactly the same as 1080p.
Obviously, 1080i sends the picture in two halves, one half at a time. 1080p sends the whole picture in one hit.
Let's assume the rate we're sending 1080i at is 60Hz; this means 1080i can transmit 30 whole frames in one second, where each frame is composed of two half-frames (fields).
Now, movies shot on film have a native frame rate of 24 frames per second. Using a process called 3:2 pulldown (telecine), this gets upped to 30 frames per second for 'easier' display. This can therefore be transmitted over 1080i at 60Hz by sending each half of each frame consecutively.
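To make the 24-to-60 mapping concrete, here's a toy sketch of the 3:2 pulldown cadence: alternate film frames are held for 2 and 3 fields respectively, so 24 frames/s becomes exactly 60 fields/s. The frame names and the function are purely illustrative, not any real API.

```python
def pulldown_3_2(frames):
    """Map film frames to interlaced fields using the 2-3 cadence:
    1st frame -> 2 fields, 2nd frame -> 3 fields, repeating."""
    fields = []
    for i, frame in enumerate(frames):
        copies = 2 if i % 2 == 0 else 3
        for _ in range(copies):
            # alternate top/bottom field parity as fields are emitted
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

film = ["A", "B", "C", "D"]    # 4 film frames = 1/6 s at 24 fps
fields = pulldown_3_2(film)
print(len(fields))             # 10 fields = 1/6 s at 60 fields/s
```

Note the ratio: 4 frames in become 10 fields out, which is the same 24:60 ratio as the real cadence.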
The key point here is that as the picture you are sending is 30 fps, the two halves of the picture sent in interlaced format are two halves of the same frame, that frame having been recorded at one moment in time...
A good de-interlacer will recognise that in this instance you can put the two halves of the picture back together to build a full frame with virtually no loss of information or resolution. This only applies to sources recorded at 30fps or lower and in one whole frame at a time mode which is exactly what film does.
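That lossless reassembly is usually called "weaving": interleave the two fields back into one frame. A minimal sketch (scan lines modelled as strings, names my own):

```python
def weave(top_field, bottom_field):
    """Rebuild a full frame from two fields of the SAME source frame.
    top_field holds the even lines, bottom_field the odd lines."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.append(t)   # even line
        frame.append(b)   # odd line
    return frame

# Toy 4-line "frame" split into fields, then woven back losslessly.
original = ["line0", "line1", "line2", "line3"]
top = original[0::2]      # even lines -> top field
bottom = original[1::2]   # odd lines -> bottom field
print(weave(top, bottom) == original)   # True: no information lost
```

Because both fields came from the same instant, weaving recovers the original frame exactly, which is why film over 1080i can look as good as 1080p.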
However, much broadcast HDTV at 1080i is captured by the camera in 1080i. What this means is that the camera captures 60 half frames (fields) per second; it never captures whole frames. This is very important: unlike a film source, where fields 1 and 2 make up whole frame 1 and were captured at the same instant in time, fields 1 and 2 in an interlaced capture each represent half of a picture captured at two different instants in time...
To try and explain that better: the first field represents half of what the camera saw at 1/60th of a second, and the second field represents half of what the camera saw at 2/60ths of a second. If the object the camera was looking at has moved between 1/60th and 2/60ths of a second, then you can't simply put the two halves together to make one complete frame, because you get funny lines known as 'combing', so called because they look like the teeth of a comb.
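You can see the combing effect in a toy simulation: capture the two fields at different instants while an object moves, then naively weave them. Everything here (the ASCII rendering, the positions) is made up purely to illustrate.

```python
def render_line(width, obj_pos):
    """One scan line: '#' where the object is, '.' elsewhere."""
    return "".join("#" if x == obj_pos else "." for x in range(width))

def capture_field(width, height, parity, obj_pos):
    """Capture only the even ('top') or odd ('bottom') scan lines."""
    start = 0 if parity == "top" else 1
    return [render_line(width, obj_pos) for _ in range(start, height, 2)]

# Object at x=2 when the top field is grabbed (1/60 s),
# but at x=5 when the bottom field is grabbed (2/60 s).
top = capture_field(8, 4, "top", 2)
bottom = capture_field(8, 4, "bottom", 5)

# Naive weave: alternate lines come from different instants -> combing.
frame = []
for t, b in zip(top, bottom):
    frame.extend([t, b])
for line in frame:
    print(line)
```

The printed frame alternates between `..#.....` and `.....#..` lines: the object appears split into interleaved "teeth" at two positions, which is exactly the comb artifact described above.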
So, when faced with this type of 1080i signal, the de-interlacer has a very different job to do to create a full frame to display. It then potentially gets very complicated.
Clearly, if you capture the same scene using 1080p at 60Hz, you get 60 full frames per second containing all the information. However, this generates twice the data and therefore requires much more bandwidth to transmit, so no one broadcasts it.
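The "twice the data" claim is easy to check with back-of-envelope arithmetic on raw pixel counts (real broadcasts are heavily compressed, so treat this purely as a ratio, not an actual bitrate):

```python
# Raw pixel throughput per second, 1080p60 vs 1080i60.
width, height = 1920, 1080

pixels_1080p60 = width * height * 60          # 60 full frames/s
pixels_1080i60 = width * (height // 2) * 60   # 60 half-height fields/s

print(pixels_1080p60 / pixels_1080i60)        # -> 2.0
```

Whatever bytes-per-pixel figure or codec you assume, the uncompressed ratio stays 2:1, which is why broadcasters stop at 1080i (or 720p) rather than 1080p60.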
So, 1080i definitely does not equal 1080p. However, in those instances where each full frame of the source can be split in two and transmitted within the field rate of 1080i, it is possible to get picture quality from 1080i that is virtually indistinguishable from 1080p... providing whatever does the de-interlacing recognises the incoming signal properly. Not always the case...
I have to disagree with Toke on his assertion that 1080p broadcast will be in Europe soon. Standards bodies have looked at it and done tests, but that doesn't mean it will happen anytime soon: the bandwidth doesn't yet exist, and none of the HDTV infrastructure that the likes of BSkyB in the UK are investing in could handle it, and they only started rolling out the service a few months ago...