AVS Forum

AVS Forum (http://www.avsforum.com/forum/)
-   Blu-ray Players (http://www.avsforum.com/forum/149-blu-ray-players/)
-   -   1080i vs 1080p (http://www.avsforum.com/forum/149-blu-ray-players/698612-1080i-vs-1080p.html)

Cain 07-13-2006 05:01 AM

Can someone explain this to me? I am asking a question here, not looking for an argument.

Why do most folks agree that 480p looks better than 480i, but many argue that 1080i looks exactly the same as 1080p?

--- Cain

Satmeister 07-13-2006 05:24 AM

Quote:
Originally Posted by Cain View Post

Can someone explain this to me? I am asking a question here, not looking for an argument.

Why do most folks agree that 480p looks better than 480i, but many argue that 1080i looks exactly the same as 1080p?

--- Cain

Because it's true, given quality source content and image projection equipment for both 1080i and 1080p. There comes a point where the return on investment diminishes sharply, and that's the borderline, IMHO. I've seen both side by side, using solid source material and projectors on the same screen - the difference doesn't begin to warrant the change or cost, again IMHO. That's just one vote.

PeterS 07-13-2006 06:11 AM

Actually, this is WRONG.

480p looks better than 480i because most traditional video material is encoded at 480i and designed to be displayed on 480i devices. 480p looks better only when the 480i source is properly deinterlaced with the correct "pull-down" and then displayed on a 480p display device.

I guess it is better to say that 480p display devices look better than 480i display devices as they actually do show you twice the apparent resolution at any given moment due to these techniques.

Now, this falls apart with 1080i and 1080p, as we are basically all using progressive (p) display devices. This is true even if your device only takes in a 1080i signal. All current technologies (with the exception of CRT) actually display a progressive image (DLP, LCD, SXRD, D-ILA, plasma, etc.). Since this is true, they must be grabbing the interlaced signal, holding one field, and combining it with the next field prior to display (this is what any progressive-output device previously did before outputting the image).

The issue here is that the technology is EXTREMELY robust and that, for very little cost, almost all displays can do this perfectly. For film it is trivial, as there is no "time offset" between two interlaced fields - they come from the same frame, so putting them back together is extremely simple.

You can view it as follows:

A Progressive Signal is sent as
Line 1
Line 2
Line 3
Line 4
etc.
Display

An Interlaced Signal is sent:
Line 1
Line 3
Line 5
etc.
Line 2
Line 4
Line 6
etc.
Display

In the end, the same information is in the frame buffer prior to display - ergo, no difference.
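To make that concrete, here is a rough Python sketch (function names made up, purely illustrative) of the "weave" that recombines the two fields of a film frame - and why the frame buffer comes out identical either way:

Code:

# Minimal sketch of "weave" deinterlacing: split a progressive frame
# into its two fields (how 1080i transmits it), then interleave them
# back. A frame is modeled as a list of rows for illustration.

def split_into_fields(frame):
    """Split a progressive frame into its two interlaced fields."""
    top_field = frame[0::2]     # lines 1, 3, 5, ...
    bottom_field = frame[1::2]  # lines 2, 4, 6, ...
    return top_field, bottom_field

def weave(top_field, bottom_field):
    """Recombine two fields into one progressive frame."""
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)
        frame.append(bottom_row)
    return frame

original = [f"line {n}" for n in range(1, 9)]   # an 8-line "frame"
top, bottom = split_into_fields(original)       # interlaced transmission
assert weave(top, bottom) == original           # identical frame buffer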

Another point is that people continually confuse the subject when talking about 1080i and 1080p. You MUST distinguish between TRANSMISSION and DISPLAY. In your context you are talking about how the signal is TRANSMITTED, as almost all of the 1080 monitors, etc., DISPLAY 1080p and cannot DISPLAY in an interlaced fashion anyway.

Hope this helps.

AV_Integrated 07-13-2006 06:29 AM

Film is shot (traditionally) at 24 frames per second.

Most digital displays - let's call it a 1080p display - show video at 60 frames per second (60 Hz).

With HD discs, if film is sent to the display at 1080i, it is actually 24 frames per second that is split into interlaced fields (48 of them), with fields then repeated in a 3:2 cadence to get up to 60 fields (no longer called frames) per second. Now, a 1080p display can ONLY display at 1080p, so it must deinterlace the fields and get the signal back to 1080p/60. If done correctly, this will properly put the 24 frames back into a progressive format that should look darn near identical to the original on the disc.
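If it helps, here is a rough Python sketch of that 3:2 cadence (the labels and names are invented for illustration - real players operate on actual picture fields):

Code:

# Sketch of 3:2 pulldown: 24 film frames/s become 60 fields/s by
# splitting each frame into two fields and repeating fields in an
# alternating 2, 3, 2, 3... pattern. "At" means frame A, top field.

def three_two_pulldown(frames):
    """Map progressive film frames to a 60i field sequence."""
    fields = []
    for i, frame in enumerate(frames):
        top, bottom = f"{frame}t", f"{frame}b"
        if i % 2 == 0:
            fields += [top, bottom]        # 2 fields from this frame
        else:
            fields += [top, bottom, top]   # 3 fields (one repeated)
    return fields

# Four film frames (1/6 s at 24 fps) become ten fields (1/6 s at 60i).
print(three_two_pulldown(["A", "B", "C", "D"]))
# ['At', 'Ab', 'Bt', 'Bb', 'Bt', 'Ct', 'Cb', 'Dt', 'Db', 'Dt']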

Now, if you have film that is shot at 60 frames per second progressive, then that should NEVER be interlaced, because you would lose film data.

As well, if you have a display that can properly show a 24 Hz source at 24 Hz (frames per second), then you probably shouldn't interlace that either. But even if it was interlaced, as long as proper deinterlacing takes place, the original 24fps should be able to be restored at full resolution.

The hard part is that properly identifying interlaced vs. progressive frame sequences can be difficult for some displays to do on their own, so it is typically better for the processing to be handled by the disc player.

This information is "as far as I know" and some generalizations were made - additional info/corrections are appreciated.

PeterS 07-13-2006 08:46 AM

Some corrections:

Quote:
Now, if you have film that is shot at 60 frames per second progressive, then that should NEVER be interlaced, because you would lose film data.

No film is ever shot at 60 frames per second - and most likely never will be. There is no benefit that justifies more than doubling the cost, and it is a significant cost.

Quote:
As well, if you have a display that can properly show a 24 Hz source at 24 Hz (frames per second), then you probably shouldn't interlace that either. But even if it was interlaced, as long as proper deinterlacing takes place, the original 24fps should be able to be restored at full resolution.

Really, with film there is NO DIFFERENCE, as it is simply how you are transmitting the signal. All the data is the same - PERIOD.

Quote:
The hard part is that properly identifying interlaced vs. progressive frame sequences can be difficult for some displays to do on their own, so it is typically better for the processing to be handled by the disc player.

Actually, it is extremely easy for the display to know if it is receiving an interlaced or progressive SIGNAL. The problem occurs with bad deinterlacing in a player outputting a progressive SIGNAL - thus turning off any processing in the display. If this is done improperly to an INTERLACED VIDEO signal, then there could be problems. If the source is FILM and either the source or display cannot put this back together properly - then you have a BROKEN device.

scowl 07-13-2006 12:11 PM

Quote:
Originally Posted by PeterS View Post

Actually, it is extremely easy for the display to know if it is receiving an interlaced or progressive SIGNAL.

He was saying that it's difficult for a display to identify progressive frame sequences that are being sent in an interlaced signal.

ADGrant 07-13-2006 12:16 PM

Quote:
Originally Posted by Cain View Post

Can someone explain to me, and I am asking a question here, not looking for an argument.

Why do most folks agree that 480p looks better than 480i, but many argue that 1080i looks exactly the same as 1080p ??

--- Cain

In both cases it depends on the source material, the display being used, and the DVD player. It is not a given (unless we are talking about native 480p and native 1080p to a progressive display; that should be better, or at least as good).

overcast 07-13-2006 12:17 PM

People are confused because of past displays that did no deinterlacing of their own. When progressive players came out, the results were apparent, because the TV was now being fed a progressive signal. Today, with all modern TVs having internal deinterlacers, they are always showing a progressive image regardless of what you are feeding them. Every time people spazz on about how the player or game system ONLY puts out 1080i or 480i: it doesn't matter. As long as the display has a proper deinterlacer, you are losing NO INFORMATION.

Either you are letting your player deinterlace, or you are letting your display deinterlace. The end result is the same, assuming the electronics are competent in doing so.

ChrisFB 07-13-2006 12:44 PM

Quote:
Originally Posted by overcast View Post

Either you are letting your player deinterlace, or you are letting your display deinterlace. The end result is the same, assuming the electronics are competent in doing so.

This is really the crux right here, and the point was also made further up in more detail. The deinterlacing will happen regardless, and as long as there isn't a significant disparity in capabilities (e.g. your television is horrendous at deinterlacing and you have a disc player that is much better), it just doesn't matter.

scowl 07-13-2006 12:46 PM

Quote:
Originally Posted by overcast View Post

As long as the display has a proper deinterlacer, you are losing NO INFORMATION.

Whether your display has a "proper" deinterlacer is a valid question sometimes. Most recent displays do a great job of deinterlacing. My three-year-old Aquos isn't one of those: 1080i looks better when my cable box is set to 720p (the box probably does simple bob deinterlacing) than with whatever my display is doing to deinterlace 1080i. I was very surprised by this.
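For what it's worth, here is a toy Python sketch of what a simple bob does (purely illustrative - a real bob would interpolate between lines rather than repeating them):

Code:

# "Bob" deinterlacing: each field is stretched into a full frame on
# its own, so no information from the other field is used and
# vertical detail is effectively halved.

def bob(field):
    """Line-double a single field into a full progressive frame."""
    frame = []
    for row in field:
        frame.append(row)
        frame.append(row)   # repeat the line (real bobs interpolate)
    return frame

top_field = ["line 1", "line 3", "line 5"]
print(bob(top_field))
# ['line 1', 'line 1', 'line 3', 'line 3', 'line 5', 'line 5']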

oshodi 07-13-2006 12:55 PM

Quote:
Originally Posted by PeterS View Post

Some corrections:

No film is ever shot at 60 frames per second - and most likely never will be. There is no benefit that justifies more than doubling the cost, and it is a significant cost.

Really, with film there is NO DIFFERENCE, as it is simply how you are transmitting the signal. All the data is the same - PERIOD.

Actually, it is extremely easy for the display to know if it is receiving an interlaced or progressive SIGNAL. The problem occurs with bad deinterlacing in a player outputting a progressive SIGNAL - thus turning off any processing in the display. If this is done improperly to an INTERLACED VIDEO signal, then there could be problems. If the source is FILM and either the source or display cannot put this back together properly - then you have a BROKEN device.

Very well put, thanks.

Artwood 07-13-2006 09:44 PM

P is always better than I and Big numbers are always better than little numbers. It's a FACT!

Mark0 07-13-2006 09:54 PM

Quote:
Originally Posted by Artwood View Post

P is always better than I and Big numbers are always better than little numbers. It's a FACT!

Not an informative statement. Read the posts above to get a better idea of what is going on. Sorry, but it's these generalizations that cause the questioning in the first place.

ChrisFB 07-14-2006 06:28 AM

Maybe Artwood was joking? It kind of seems like it, but forums sometimes don't convey that.

Artwood 07-14-2006 04:43 PM

Half joking half serious.

Look, a perfect interlaced display doesn't look as good as a perfect progressive display--just like dots on a CRT don't look as good as paint on the paper of a magazine.

Who's going to say that fewer, less perfect dots or pixels will look better than A LARGER NUMBER of perfect dots or pixels?

Of course the operative word is perfect. Since these things aren't perfect--yes, interlaced with fewer pixels could be better.

The bottom line of all of this, though, is that the closer one gets to perfection, the closer my generalization will be to universal truth!

Of course some of the discerning videophile fans here of classic AVS prose knew that already!

HorrorScope 07-14-2006 05:09 PM

To clarify this for myself: a 1080i cable/OTA signal, when going to my 1080P fixed panel, is for all intents and purposes 1080P?

That said, that shows me that there is just a wee difference in noticeable quality between 720P and 1080i/1080P with video, myself. 1080i, for all intents and purposes today, is 1080P; the whole "i" difference was due to CRTs. The difference between 720 and 1080 is definitely something many people in general wouldn't be all that excited about. That is why HD is not only about resolution but bit rate and color bits. The main reason I want 1080P is for my HTPC, so my desktop and games can run at 1080P resolutions, and, well, that is about it.

So, that being said, the A1 with 1080i output is nothing to hold against it, because it is the same as 1080P output: all 1080P devices will take care of the conversion themselves and there isn't any resolution or anything else lost. There is no reason to wait for a 1080P-outputting HD DVD player. It's a sales gimmick. All correct?

Artwood 07-14-2006 08:37 PM

No! Only TRUE 1080p is the REAL way!

DSKTexas 07-14-2006 08:53 PM

Below is Evan Powell's (Projector Central) appraisal of the 1080i vs 1080p controversy.

"The truth is this: The Toshiba HD-DVD player outputs 1080i, and the Samsung Blu-ray player outputs both 1080i and 1080p. What they fail to mention is that it makes absolutely no difference which transmission format you usefeeding 1080i or 1080p into your projector or HDTV will give you the exact same picture. Why? Both disc formats encode film material in progressive scan 1080p at 24 frames per second. It does not matter whether you output this data in 1080i or 1080p since all 1080 lines of information on the disc are fed into your video display either way. The only difference is the order in which they are transmitted. If they are fed in progressive order (1080p), the video display will process them in that order. If they are fed in interlaced format (1080i), the video display simply reassembles them into their original progressive scan order. Either way all 1080 lines per frame that are on the disc make it into the projector or TV. The fact is, if you happen to have the Samsung Blu-ray player and a video display that takes both 1080i and 1080p, you can switch the player back and forth between 1080i and 1080p output and see absolutely no difference in the picture. So this notion that the Blu-ray player is worth more money due to 1080p output is nonsense."

HorrorScope 07-14-2006 09:27 PM

Right, so stations that transmit 1080i HD really are transmitting 1080P to us, for all intents and purposes.

sfhub 07-14-2006 09:32 PM

Quote:
Originally Posted by HorrorScope View Post

Right, so stations that transmit 1080i HD really are transmitting 1080P to us, for all intents and purposes.

Stations that transmit film-based content as *telecined* 1080i are essentially transmitting 1080p, if your display or video processor can perform IVTC (inverse telecine) on the 1080i signal.

Otherwise it is not the same.
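For anyone curious, a rough Python sketch of what IVTC has to do, assuming a 3:2 cadence like the pulldown sketched earlier in the thread (a toy example - the frame labels stand in for the field matching a real video processor actually performs):

Code:

# Sketch of inverse telecine (IVTC): undo the 3:2 cadence by pairing
# fields that came from the same film frame and dropping the repeats.

def inverse_telecine(fields):
    """Recover the original 24p frames from a 3:2 pulldown field stream."""
    frames = []
    i = 0
    while i < len(fields):
        frame = fields[i][:-1]        # strip the 't'/'b' suffix
        if frames and frames[-1] == frame:
            i += 1                    # repeated field: drop it
            continue
        frames.append(frame)
        i += 2                        # skip this frame's other field
    return frames

fields = ['At', 'Ab', 'Bt', 'Bb', 'Bt', 'Ct', 'Cb', 'Dt', 'Db', 'Dt']
print(inverse_telecine(fields))       # ['A', 'B', 'C', 'D']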

DSKTexas 07-14-2006 09:35 PM

That's not what he said at all. HD DVD discs are encoded at 1080p, the same as BD. It doesn't matter if this data is TRANSMITTED to the display as 1080i or 1080p. The display still gets ALL the data, and if you have a 1080p display it will obviously easily reassemble ALL that data to its native resolution.

PeterS 07-14-2006 09:44 PM

Clarification:

First you MUST differentiate between TRANSMISSION and DISPLAY.

Second, if your display is DIGITAL (DLP, LCD, D-ILA, SXRD, plasma, etc.), then it is PROGRESSIVE in its display regardless of the signal it receives.

Now:

PROGRESSIVE SOURCE --------->Native 1080p------------->Native 1080p DISPLAY (Perfect 1080p Image)
PROGRESSIVE SOURCE --------->Interlaced 1080i---------->Deinterlaced 1080p DISPLAY (Perfect 1080p Image)

INTERLACED SOURCE ----------->Native 1080i------------->Deinterlaced and Processed to 1080p

The only difference is that the job of going from a PROGRESSIVE SOURCE to the final 1080p DISPLAY is simply putting the image back together by recombining the fields.

If the source is INTERLACED, then you also have to account for the fact that the fields are offset in time. This is where the quality of the processing is important, as this is not trivial (though the techniques for doing it well are well established).
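A toy Python sketch of the kind of per-pixel decision a motion-adaptive deinterlacer makes for true interlaced video (the names and the threshold are invented for illustration):

Code:

# Where the picture is static, weaving in the time-offset field is
# lossless; where there is motion it would "comb", so the missing
# line is interpolated from its neighbors instead.

def deinterlace_pixel(prev_same_parity, above, below, woven, threshold=10):
    """Choose weave vs. interpolation for one pixel of the missing field."""
    if abs(woven - prev_same_parity) < threshold:
        return woven                 # static area: weave is lossless
    return (above + below) // 2      # motion detected: interpolate

print(deinterlace_pixel(prev_same_parity=100, above=90, below=110, woven=102))  # 102 (weave)
print(deinterlace_pixel(prev_same_parity=100, above=90, below=110, woven=200))  # 100 (interpolate)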

DSKTexas 07-14-2006 09:55 PM

Quote:
Originally Posted by PeterS View Post

PROGRESSIVE SOURCE --------->Interlaced 1080i---------->Deinterlaced 1080p DISPLAY (Perfect 1080p Image)

And this is exactly what HD DVD does when connected to a 1080p display.
(Perfect 1080p image)

PeterS 07-14-2006 10:13 PM

Very true -

In fact, the 1080i output of the Toshiba HD-DVD is significantly better than the 1080p output of the Samsung BD-DVD.

This is because the Samsung actually converts 1080p stored on the disc to 1080i and then using a very cheap and inexpensive chip, converts it back to 1080p (along with some additional filtering which destroys the image). This was done because, when the player was first designed, it was designed to output 1080i. MARKETING showed them that people did not understand this, and 1080p was a BIG selling point (even though it makes little difference). Well, instead of properly taking the signal from the disc, they simply threw another part into the mix to output 1080p.

Try it - if you use the Samsung BDP-1000 and a good set, change the output to 1080i and the picture will improve!!! This is why some early reports mentioned that the image looked better over component than HDMI - simply because the component outputs only go to 1080i!

DSKTexas 07-14-2006 10:27 PM

If the source material is 1080p and you have a 1080p display, it is inconsequential whether the data is transmitted as 1080i or 1080p. The significance of transmitting 1080p over HDMI is a marketing ploy.

sfhub 07-14-2006 11:42 PM

Quote:
Originally Posted by PeterS View Post

This is because the Samsung actually converts 1080p stored on the disc to 1080i and then using a very cheap and inexpensive chip, converts it back to 1080p

Well I wouldn't necessarily call the Faroudja chip very cheap and inexpensive, but it does seem like their 1080p deinterlacers are working from a dated design.

PeterS 07-15-2006 06:50 AM

Compared to the $7K I paid for my first external Faroudja, I would say EXTREMELY inexpensive.

toke 07-15-2006 08:17 AM

Quote:
Originally Posted by PeterS View Post

Really, with film there is NO DIFFERENCE, as it is simply how you are transmitting the signal. All the data is the same - PERIOD.

Ever heard of "interlace flicker"?

PeterS 07-15-2006 08:29 AM

Only applicable on an interlaced display, where the persistence of the phosphor is used to generate the image. It is caused as one field is decaying while the other is being drawn.

Not relevant here.

toke 07-15-2006 08:48 AM

Nope.
"Interlace flicker" occurs when a one-pixel-high detail is shown on an interlaced display. That's because it is drawn only every second field. Even a progressive display with a bad deinterlacer can show the flicker if it just turns one field into one frame (60i->60p); the detail is then in every other frame.
This flickering is avoided in 1080i with low-pass filtering, which blurs the picture so that there are no one-pixel-high details.
So how is this relevant?
There are lots of people over there in the US with 1080i displays, and they want to watch a 1080i signal without any flickering on their über-expensive tubes.
Since not all 1080i displays have internal low-pass filtering, it has to be done before transmission, and therefore a 1080i signal has less real vertical resolution than 1080p could have - even with progressive source material.
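A toy Python sketch of that trade-off (the numbers are purely illustrative):

Code:

# A one-line-high detail lives in only one field, so an interlaced
# display draws it every other field and it flickers. A vertical
# low-pass filter spreads it across neighboring lines (both fields)
# at the cost of single-line sharpness - the resolution point above.

def vertical_lowpass(column):
    """Blur a 1-D column of pixel values with a simple 3-tap average."""
    out = []
    for i, v in enumerate(column):
        above = column[i - 1] if i > 0 else v
        below = column[i + 1] if i < len(column) - 1 else v
        out.append((above + v + below) // 3)
    return out

column = [0, 0, 0, 255, 0, 0, 0]   # a one-pixel-high white detail
print(vertical_lowpass(column))     # [0, 0, 85, 85, 85, 0, 0]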

