
1080i vs 1080p - Page 4

post #91 of 220
Quote:
Originally Posted by PeterS View Post

Again -

If the source is VIDEO shot at 1080i - then you are best looking at this on a 1080i (INTERLACED) display (generally - unless you have a good quality deinterlacer).

If the source is FILM shot at 24fps - then you are on equal footing looking at this transmitted in 1080i or 1080p to a DIGITAL DISPLAY. There will be no difference.

Since the primary focus of this forum is next-generation optical disc formats which are primarily FILM and stored on disc as 24fps - then there is no difference between 1080i and 1080p as a TRANSMISSION format.

And these same optical formats will undoubtedly be expected to deliver content recorded in 1080i to consumers in much the same way that DVD delivers interlaced content today - TV programmes, sports events, etc. that many people buy now and will want to buy in HD once available...

So it is an important distinction that 1080i only equals 1080p in certain circumstances, one of which is film. In other areas where people are likely to purchase discs, this may not hold true.

I do however agree that for movies, 1080p offers no significant advantage.
post #92 of 220
But wait, don't some people here have "golden eyes" so they can see the difference, just like those people with golden ears that can tell the difference between 48kHz and 96kHz audio? :P
post #93 of 220
Ok, so what if the set, namely the upcoming Hitachi 42HDX99 Director's Series, displays 1080 vertical lines and also has great internals as well? Does that change things at all? Will the difference between p and i become less obvious?
post #94 of 220
Quote:
Originally Posted by Ian_S View Post

I don't think you are correct... Easy to see why people get so confused on this topic.

First off, 1080i as a standard only has one resolution, 1920 x 1080 pixels, and that is exactly the same as 1080p.

Obviously, 1080i sends the picture in two halves, one half at a time. 1080p sends the whole picture in one hit.

Let's assume that the rate we're sending 1080i at is 60Hz; this means that 1080i can transmit 30 whole frames in one second, where each frame is composed of two halves.

Now, movies shot on film have a native frame rate of 24 frames per second. Using a process called 3:2 pulldown, this gets upped to 30 frames per second for 'easier' display. This can therefore be transmitted over 1080i at 60Hz by sending each half of each frame consecutively.

The key point here is that as the picture you are sending is 30 fps, the two halves of the picture sent in interlaced format are two halves of the same frame, that frame having been recorded at one moment in time...

A good de-interlacer will recognise that in this instance you can put the two halves of the picture back together to build a full frame with virtually no loss of information or resolution. This only applies to sources recorded at 30fps or lower and captured one whole frame at a time, which is exactly what film does.
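
To make that concrete, here is a minimal sketch of the weave step (Python/numpy, function names made up by me; an illustration, not any particular player's code): split a progressive frame into its two fields, weave them back, and confirm nothing was lost.

Code:
import numpy as np

def split_into_fields(frame):
    # Top field = even lines, bottom field = odd lines of a progressive frame.
    return frame[0::2], frame[1::2]

def weave(top_field, bottom_field):
    # Reassemble a full frame by interleaving the two fields line by line.
    height = top_field.shape[0] + bottom_field.shape[0]
    frame = np.empty((height,) + top_field.shape[1:], dtype=top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame

# A stand-in 1080p film frame (luma only; random values stand in for picture detail).
frame_1080p = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)

top, bottom = split_into_fields(frame_1080p)   # what a 1080i transmission carries
rebuilt = weave(top, bottom)                   # what a good deinterlacer does

assert np.array_equal(rebuilt, frame_1080p)    # lossless: both fields came from one instant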

However, much broadcast HDTV at 1080i is captured by the camera in 1080i. What this means is that the camera captures 60 half frames per second; it never captures whole frames. This is very important because, unlike a film source where halves 1 and 2 make up whole frame 1, captured at the same instant in time, fields 1 and 2 in an interlaced capture each represent only half of a whole picture, captured at two different instants in time...

To try and explain that better: the first half frame represents half of what the camera saw at 1/60th of a second, and the second half frame represents half of what the camera saw at 2/60ths of a second. If the object the camera was looking at has moved between 1/60th of a second and 2/60ths of a second, then you can't simply put the two halves together to make one complete frame; you get funny lines known as 'combing' because they look like a comb.

So, when faced with this type of 1080i signal, the de-interlacer has a very different job to do to create a full frame to display. It then potentially gets very complicated.
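
As a rough illustration of what that more complicated job looks like, here is a toy motion-adaptive approach (Python/numpy; the threshold and names are invented, and real deinterlacing chips are far more sophisticated): weave where the two fields agree, and fall back to line-doubling where they don't.

Code:
import numpy as np

def motion_adaptive_deinterlace(top_field, bottom_field, threshold=12):
    # Weave the two 540-line fields into a 1080-line frame...
    woven = np.empty((1080, top_field.shape[1]), dtype=np.float32)
    woven[0::2] = top_field
    woven[1::2] = bottom_field

    # ...and also build a 'bob' version from the top field alone (line-doubled).
    bobbed = np.repeat(top_field, 2, axis=0).astype(np.float32)

    # A large difference between the woven and bobbed pixels means the two fields
    # disagree at that spot (motion between the two capture instants => combing),
    # so use the interpolated value there; elsewhere keep the full-detail weave.
    combing = np.abs(woven - bobbed) > threshold
    return np.where(combing, bobbed, woven).astype(np.uint8)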

Clearly, if you capture the same scene using 1080p at 60Hz, you get 60 full frames per second containing all the information. However, this generates twice the data, requires much more bandwidth to transmit, and therefore no-one uses it.

So, 1080i definitely does not equal 1080p. However, in those instances where the full frame rate of the source can be split in two and transmitted within the half-frame rate of 1080i, it is possible to get virtually indistinguishable picture quality from 1080i as you would from 1080p... providing whatever does the de-interlacing recognises the incoming signal properly. Not always the case...

I have to disagree with Toke on his assertion that 1080p broadcast will be in Europe soon. Standards bodies have looked at it and done tests, but that doesn't mean it will happen anytime soon, as the bandwidth doesn't yet exist and none of the HDTV infrastructure that the likes of BSkyB in the UK are investing in could handle it - and they only started rolling out the service a few months ago...


Well, what I'm hearing now is that I am correct when talking about optical disc movies and possibly not correct when talking about broadcast 1080i. No?
post #95 of 220
Quote:
Originally Posted by Josh Z View Post

2,073,600

Actually it's 2,073,600j or 1,036,800s depending on whether they are standing or half are in mid-jump at any given time.
post #96 of 220
Quote:
Originally Posted by Ian_S View Post

....So, 1080i definitely does not equal 1080p. However in those instances where the full frame rate of the source can be spilt in two and transmitted within the half-frame rate of 1080i, then it is possible to get virtually indistinguishable picture quality from 1080i as you would from 1080p... providing whatever does the de-interlacing recognises the incoming signal properly. Not always the case...

The method of deinterlacing a 1080i signal would be different for a native 1080i source than for a 1080p source that was interlaced to 1080i.

When your set receives a 1080i signal, how exactly does it recognize whether the original source was native 1080i or 1080p which was interlaced?

In any event, within the very near future all new 1080p displays will accept 1080p input; and all BD and/or HD-DVD players will output 1080p. Whether or not it actually makes a difference, the "as long as your display deinterlaces properly" concern will be limited to the 1080p displays sold in the short time before 1080p inputs were available.
post #97 of 220
Quote:
Originally Posted by Ian_S View Post

So it is an important distinction that 1080i only equals 1080p in certain circumstances, one of which is film. In other areas where people are likely to purchase discs, this may not hold true.

Right, for video-based 1080i, 1080p transmission doesn't mean it is better or worse than 1080i. It just allows you to decide whether you like the deinterlacer in the player or your display better. In some cases you may like the player's deinterlacer better. In other cases you may like the one in your display better. It will depend on the equipment.
post #98 of 220
1080p rules!
post #99 of 220
Quote:
Originally Posted by Artwood View Post

1080p rules!

Can you contribute something other than being a shmuck?
post #100 of 220
Quote:
Originally Posted by Ian_S View Post

I have to disagree with Toke on his assertion that 1080p broadcast will be in Europe soon. Standards bodies have looked at it and done tests, but that doesn't mean it will happen anytime soon, as the bandwidth doesn't yet exist and none of the HDTV infrastructure that the likes of BSkyB in the UK are investing in could handle it - and they only started rolling out the service a few months ago...

My "soon" guess is that we will have 1080p transmissions from next olympics.
And if you're talking about broadcasting bandwidth, it's already there.
Both Sky and the BBC are already using MPEG-4 compression, and here in Europe we use 8 MHz channels, which can easily carry 22 Mbps. With that kind of bitrate and MPEG-4 there's no problem transmitting 1080p24/25/50/60.
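
Just to put rough numbers behind that (my own back-of-the-envelope arithmetic, not figures from any broadcaster):

Code:
# Rough arithmetic behind the bandwidth claim above (illustrative only).
width, height = 1920, 1080
bits_per_pixel = 12     # 8-bit 4:2:0 sampling averages 12 bits per pixel
fps = 60

raw_bps = width * height * bits_per_pixel * fps
print(f"Uncompressed 1080p60, 8-bit 4:2:0: {raw_bps / 1e6:.0f} Mbps")   # ~1493 Mbps

channel_mbps = 22       # the 8 MHz channel capacity quoted above
print(f"Compression needed to fit {channel_mbps} Mbps: roughly {raw_bps / (channel_mbps * 1e6):.0f}:1")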
post #101 of 220
Quote:
Originally Posted by dropKickMurphy View Post

When your set receives a 1080i signal, how exactly does it recognize whether the original source was native 1080i or 1080p which was interlaced?

It analyses the image to see if it can detect the pattern of a progressive source (i.e. repeated fields). If it can, it switches to "film" mode and you get 1080p.
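
A toy sketch of that detection (Python/numpy; the class name and threshold are invented, and real cadence detectors are much more robust): in 3:2 pulldown one field out of every five is a repeat, so comparing each incoming field against the previous field of the same parity exposes the pattern.

Code:
import numpy as np
from collections import deque

class FilmModeDetector:
    # Toy 3:2 cadence detector: looks for a near-duplicate field recurring every 5 fields.
    def __init__(self, threshold=1.0):
        self.threshold = threshold
        self.last_two = deque(maxlen=2)       # the two most recent fields (one per parity)
        self.repeat_flags = deque(maxlen=10)  # repeat/no-repeat history over the last 10 fields

    def push(self, field):
        is_repeat = False
        if len(self.last_two) == 2:
            prev_same_parity = self.last_two[0]   # the field two steps back (same parity)
            diff = np.mean(np.abs(field.astype(np.int16) - prev_same_parity.astype(np.int16)))
            is_repeat = diff < self.threshold
        self.last_two.append(field)
        self.repeat_flags.append(is_repeat)
        # 3:2 pulldown repeats two fields out of every ten; require a full window before deciding.
        return len(self.repeat_flags) == 10 and sum(self.repeat_flags) >= 2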
post #102 of 220
Quote:
Originally Posted by dropKickMurphy View Post

In any event, within the very near future all new 1080p displays will accept 1080p input; and all BD and/or HD-DVD players will output 1080p. Whether or not it actually makes a difference, the "as long as your display deinterlaces properly" concern will be limited to the 1080p displays sold in the short time before 1080p inputs were available.

I'll bet that next year every new full HD model will have 1080p input.

Btw, does bd or hd-dvd have 1080p50/60 in their specs?
post #103 of 220
Amir,

"It puts the lotion in the basket?"
post #104 of 220
Interesting read. Ok, here's my question: does the Home Theater Hifi Secrets article on progressive scan apply when discussing 1080i vs 1080p? I'm thinking it does, unless I'm mistaken, since the general concept is the same. If this article does apply, it might be worth reading by some since it gives a great explanation of what's going on (even though it focuses on 480i/p) and it even has neat animated GIFs.

Peace...
post #105 of 220
Quote:
Originally Posted by toke View Post

Btw, does bd or hd-dvd have 1080p50/60 in their specs?

Not as a storage format, but players are free to output it after processing.
post #106 of 220
Quote:
Originally Posted by toke View Post

Btw, does bd or hd-dvd have 1080p50/60 in their specs?

See post #49 (no).
post #107 of 220
Quote:
Originally Posted by tomdkat View Post

Interesting read. Ok, here's my question: does the Home Theater Hifi Secrets article on progressive scan apply when discussing 1080i vs 1080p? I'm thinking it does, unless I'm mistaken, since the general concept is the same. If this article does apply, it might be worth reading by some since it gives a great explanation of what's going on (even though it focuses on 480i/p) and it even has neat animated GIFs.

Peace...


That link was good and is basically what I've been asking/suggesting. You could have some judder, but then again, since no one is reporting it (and I don't see many reporting it with HD DVD), the formula I've been using seems correct.
post #108 of 220
I guess the only other issue people would have with your point is it isn't quite accurate to say "interlaced = progressive" (which is basically what the 1080i=1080p is saying) but more accurate to say "1080i when displayed on a 1080p display is the equivalent of a 1080p signal sent to a 1080p display". Of course, we're talking about HD-DVD/Blu-Ray only.

Peace...
post #109 of 220
Quote:
Originally Posted by toke View Post

My "soon" guess is that we will have 1080p transmissions from next olympics.

I assume you're talking about 1080p50 (or 60), because sports transmissions captured at 1080p25 (or p30) would be horrible to watch.
Even if such a signal could be broadcast, there's a major problem:
As far as I know, there's currently no way to shoot, switch, process, record or even encode live at 1080p50.
Only one studio camera from Sony can output a signal at 1080p50, but there's no hardware to connect it to. No controller, no switcher, nothing.
Sony demonstrated, a few months ago, a modified HDCAM-SR studio recorder that could record at 1080p50. It was, and still is, a prototype.
Sony stated that this technology is marketed for slow-motion purposes only. It's not intended to be broadcast as-is.
post #110 of 220
When I connect my computer to the LCD-TV - the TV menu screen shows resolution 1920x1080.

However, when I connect my cable box (from Comcast booooo) and watch Hi-Def television, the LCD screen menu shows resolution 1920x540.

The computer sends a 1920x1080 progressive signal, and the cable box sends 1920x1080 interlaced over component.

My LCD screen does not treat the 1080i and 1080p signals the same. Therefore there seems to be some difference between 1080p and 1080i. How do you explain that?
post #111 of 220
Quote:
Originally Posted by rambo2300 View Post

When I connect my computer to the LCD-TV - the TV menu screen shows resolution 1920x1080.

However, when I connect my cable box (from Comcast booooo) and watch Hi-Def television, the LCD screen menu shows resolution 1920x540.

The computer sends a 1920x1080 progressive signal, and the cable box sends 1920x1080 interlaced over component.

My LCD screen does not treat the 1080i and 1080p signals the same. Therefore there seems to be some difference between 1080p and 1080i. How do you explain that?


If you are saying that the computer is sending a 1080p signal to your screen, then I assume you have a display that can receive 1080p?
post #112 of 220
Quote:
Originally Posted by MarcMame View Post

I assume you're talking about 1080p50 (or 60), because sports transmissions captured at 1080p25 (or p30) would be horrible to watch.
Even if such a signal could be broadcast, there's a major problem:
As far as I know, there's currently no way to shoot, switch, process, record or even encode live at 1080p50.
Only one studio camera from Sony can output a signal at 1080p50, but there's no hardware to connect it to. No controller, no switcher, nothing.
Sony demonstrated, a few months ago, a modified HDCAM-SR studio recorder that could record at 1080p50. It was, and still is, a prototype.
Sony stated that this technology is marketed for slow-motion purposes only. It's not intended to be broadcast as-is.

1080p50/60 cameras are currently meant to be used for slow motion, because there are no 1080p50/60 broadcasts. When there are, they can be used as regular live cameras. The soccer World Cup had 6 OB vans, each with 4 1080p50 cameras (from Grass Valley/Thomson, not Sony). So there are starting to be a lot of cameras capable of 1080p50/60, and from different manufacturers.
And of course the slow-motion camera's signal is recorded, because you can't show slow motion in real time (unless you can alter time).

One reason to shoot the next Olympics in 1080p50 or 1080p60 is that it is much easier to make a high-quality conversion from a 1080p50/60 source to 1080i50/60 for broadcasting. Converting between 1080i50 and 1080i60 in high quality is really hard.
And since there would then be a 1080p source, why not broadcast it to those who have MPEG-4 receivers?
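
To illustrate why the p-to-i direction is the easy one (a hypothetical sketch in Python/numpy, names mine): each 1080i50 field is just alternate lines taken from the 1080p50 frame captured at the same instant, so no temporal interpolation is involved. There is no such one-to-one mapping between 50 and 60 fields per second.

Code:
import numpy as np

def p50_to_i50(frames_1080p50):
    # Field n = every other line of frame n, alternating top/bottom parity,
    # so each field still corresponds to exactly one captured instant.
    fields = []
    for n, frame in enumerate(frames_1080p50):
        parity = n % 2                # even frames give a top field, odd frames a bottom field
        fields.append(frame[parity::2])
    return fields

# 50 progressive frames per second in, 50 fields (540 lines each) per second out.
frames = [np.zeros((1080, 1920), dtype=np.uint8) for _ in range(50)]
fields = p50_to_i50(frames)
assert len(fields) == 50 and fields[0].shape == (540, 1920)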
post #113 of 220
Quote:
Originally Posted by sfhub View Post

Not as a storage format, but players are free to output it after processing.

So my guess is that it won't take long before we get an upgraded version 2 of the specs that includes 1080p50/60.
post #114 of 220
Quote:
Originally Posted by toke View Post

1080p50/60 cameras are currently meant to be used for slow motion, because there are no 1080p50/60 broadcasts. When there are, they can be used as regular live cameras. The soccer World Cup had 6 OB vans, each with 4 1080p50 cameras (from Grass Valley/Thomson, not Sony). So there are starting to be a lot of cameras capable of 1080p50/60, and from different manufacturers.
And of course the slow-motion camera's signal is recorded, because you can't show slow motion in real time (unless you can alter time).

As far as I know, Thomson's cameras (LDK6000) used at the World Cup only support 720p or 1080i at 50 or 60Hz. Not 1080p50, nor 1080p25.
The slow-mo cameras used there are only capable of capturing at double speed: 1080i100. The signal is output directly to a hard drive and then streamed at 1080i50 to be manipulated.


Quote:


And since there would then be a 1080p source, why not broadcast it to those who have MPEG-4 receivers?

Maybe because there's no MPEG-4 receiver on this planet capable of handling such a stream of data?
post #115 of 220
Quote:
Originally Posted by tomdkat View Post

I guess the only other issue people would have with your point is it isn't quite accurate to say "interlaced = progressive" (which is basically what the 1080i=1080p is saying) but more accurate to say "1080i when displayed on a 1080p display is the equivalent of a 1080p signal sent to a 1080p display". Of course, we're talking about HD-DVD/Blu-Ray only.

Peace...

Got ya. I either never said it that way, or I said it the other way so many times that I took the two as synonymous if I did shorten it. Basically I have been using: 1080i content = 1080p content when you use a 1080 fixed-panel display (and yes, I could have put in HD/BD). Still a bit fuzzy with broadcast, however.

The reason I care about this is that somewhere down the line a friend will ask, and I would like to save them money in a scenario like this: if they are being sold on needing a 1080p output device, I can tell them to save the money and we can go to the T T bar instead, or something like that. And also for myself, in case I ever have a 1080p fixed panel.
post #116 of 220
Quote:
Originally Posted by toke View Post

So my guess is that it won't take long before we get an upgraded version 2 of the specs that includes 1080p50/60.

Historically, in the States there is a lot of inertia against change unless you can justify the expense.
post #117 of 220
Quote:
Originally Posted by mboojigga View Post

If you are saying that the computer is sending a 1080p signal to your screen, then I assume you have a display that can receive 1080p?


My display has a resolution of 1920x1080. I connect my computer to it using a DVI cable. The LCD is a Westinghouse.
post #118 of 220
Quote:
Originally Posted by MarcMame View Post

As far as I know, Thomson's cameras (LDK6000) used at the World Cup only support 720p or 1080i at 50 or 60Hz. Not 1080p50, nor 1080p25.
The slow-mo cameras used there are only capable of capturing at double speed: 1080i100. The signal is output directly to a hard drive and then streamed at 1080i50 to be manipulated.

Ok,
actually they were LDK 6200s, but yes, only 1080i100/120.
But it uses the regular triax or fiber to connect, and tapeless recording is coming, so recording to HDD will be a mainstream thing as well.
When the imagers and the camera have sufficient bandwidth for 100/120i, it will take only minor modifications to convert that to 50/60p.

Quote:
Originally Posted by MarcMame View Post

Maybe because there's no MPEG-4 receiver on this planet capable of handling such a stream of data?

My guess is that there will be within two years.
post #119 of 220
Would someone please explain this to me:

A 1080i TV has the same number of pixels as a 1080P TV but can only refresh 1/2 of them per second?

Why is the PC resolution set at 1366x768? And why can I run 1920x1080i on the PC if the number of pixels is the same???
post #120 of 220
Here is a link to another article about 1080i and 1080p.
1080p source transported as 1080i to a 1080p display = perfect picture only if the display properly weaves the 1080i.
According to that article, not many sets reassemble the signal properly.
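
To show what is at stake in that "properly weaves" caveat, here is a small made-up test (Python/numpy, my own illustration, not from the article): a set that line-doubles a single field instead of weaving rebuilds each frame from only 540 captured lines, so fine vertical detail from a film source is simply thrown away.

Code:
import numpy as np

def weave(top, bottom):
    # Proper handling: interleave the two fields back into the original 1080 lines.
    out = np.empty((1080, top.shape[1]), dtype=top.dtype)
    out[0::2], out[1::2] = top, bottom
    return out

def bob(field):
    # Poor handling for film content: rebuild the frame from one 540-line field, line-doubled.
    return np.repeat(field, 2, axis=0)

# One-pixel-high horizontal stripes: the harshest test of vertical resolution.
frame = np.zeros((1080, 1920), dtype=np.uint8)
frame[1::2] = 255                                   # detail that lives only on the odd lines

top, bottom = frame[0::2], frame[1::2]
print(np.array_equal(weave(top, bottom), frame))    # True  - detail preserved
print(np.array_equal(bob(top), frame))              # False - the odd-line detail is gone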