

Registered · 4,952 Posts · Discussion Starter · #1
Can someone explain something to me? I am asking a question here, not looking for an argument.


Why do most folks agree that 480p looks better than 480i, but many argue that 1080i looks exactly the same as 1080p?


--- Cain
 

Registered · 348 Posts

Quote:
Originally Posted by Cain /forum/post/0


Can someone explain something to me? I am asking a question here, not looking for an argument.


Why do most folks agree that 480p looks better than 480i, but many argue that 1080i looks exactly the same as 1080p?


--- Cain

Because it's true, given quality source content and projection equipment for both 1080i and 1080p. There comes a point where the return on investment diminishes a lot, and that's the borderline, IMHO. I've seen both side by side, using solid sources and projectors on the same screen - the difference doesn't begin to warrant the change or the cost, again IMHO. That's just one vote.
 

Registered · 2,363 Posts
Actually, this is WRONG.


480p looks better than 480i because most traditional video material is encoded at 480i and designed to be displayed on 480i devices. 480p only looks better when that material is properly deinterlaced with the correct pull-down and then displayed on a 480p display device.


It is probably more accurate to say that 480p display devices look better than 480i display devices, as they actually show you twice the apparent resolution at any given moment thanks to these techniques.


Now, this falls apart with 1080i and 1080p, because we are basically all using progressive (p) display devices. This is true even if your device only accepts a 1080i signal. All current technologies except CRT (DLP, LCD, SXRD, D-ILA, plasma, etc.) actually display a progressive image. Since this is true, they must be grabbing the interlaced signal, holding one field, and combining it with the next field prior to display (this is what any progressive-output device previously did before outputting the image).


The point here is that this technology is EXTREMELY robust, and for very little cost almost all displays can do it perfectly. For film it is trivial, as there is no "time offset" between two interlaced fields - they come from the same frame, so putting them back together is extremely simple.


You can view it as follows:

A progressive signal is sent as:
Line 1
Line 2
Line 3
Line 4
etc.
Display

An interlaced signal is sent as:
Line 1
Line 3
Line 5
etc.
Line 2
Line 4
Line 6
etc.
Display


In the end, the same information is in the frame buffer prior to display - ergo, no difference.
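
To make that concrete, here is a minimal sketch in plain Python (the helper names and the lines-as-strings representation are made up for illustration, not anyone's actual player code):

Code:
def transmit_interlaced(frame):
    """Send the odd-numbered lines first, then the even-numbered lines."""
    return frame[0::2] + frame[1::2]

def reassemble(lines):
    """Weave the two fields back into their original progressive order."""
    half = len(lines) // 2
    frame = [None] * len(lines)
    frame[0::2] = lines[:half]   # lines 1, 3, 5, ...
    frame[1::2] = lines[half:]   # lines 2, 4, 6, ...
    return frame

frame = [f"line {n}" for n in range(1, 1081)]           # one 1080-line frame
assert reassemble(transmit_interlaced(frame)) == frame  # same frame buffer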


Another point: people continually confuse the subject when talking about 1080i and 1080p. You MUST distinguish between TRANSMISSION and DISPLAY. In this context we are talking about how the signal is TRANSMITTED, as almost all of the 1080 monitors, etc. DISPLAY 1080p and cannot DISPLAY in an interlaced fashion anyway.


Hope this helps.
 

Registered · 7,036 Posts
Film is shot (traditionally) at 24 frames per second.


Most digital displays - let's call it a 1080p display - show video at 60 frames per second (60 Hz).


With HD discs, if film is sent to the display at 1080i, it is still 24 frames per second: each frame is split in half (48 interlaced fields per second), then fields are repeated in a 2:3 cadence to get up to 60 fields (no longer called frames) per second. Now, a 1080p display can ONLY display 1080p, so it must deinterlace the fields and get back to 1080p/60. If done correctly, this properly puts the 24 frames back into a progressive format that should look darn near identical to the original on the disc.
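
To picture that 2:3 cadence, here is a minimal sketch in plain Python (the telecine_2_3 name and the frames-as-integers representation are made up for illustration):

Code:
def telecine_2_3(frames):
    """Spread 24 progressive frames/s across 60 interlaced fields/s.

    Each frame is split into a (top, bottom) field pair; alternating
    frames contribute 2 and 3 fields, so every 4 frames become 10 fields.
    """
    fields = []
    for i, frame in enumerate(frames):
        top, bottom = (frame, "top"), (frame, "bottom")
        fields += [top, bottom] if i % 2 == 0 else [top, bottom, top]
    return fields

assert len(telecine_2_3(list(range(24)))) == 60  # 24 fps -> 60 fields/s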


Now, if you have film that is shot at 60 frames per second progressive, then that should NEVER be interlaced, because you would lose film data.


As well, if you have a display that can properly show a 24 Hz source at 24 Hz (frames per second), then you probably shouldn't interlace that either. But even if it were interlaced, as long as proper deinterlacing takes place, the original 24 fps can be restored at full resolution.


The hard part is that properly identifying interlaced vs. progressive frame sequences can be difficult for some displays to do on their own, so it is typically better for the processing to be handled by the disc player.


This information is 'as far as I know' and some generalizations were made - additional info/corrections are appreciated.
 

Registered · 2,363 Posts
Some corrections:

Quote:
Now, if you have film that is shot at 60 frames per second progressive, then that should NEVER be interlaced, because you would lose film data.

No film is ever shot at 60 frames per second - and most likely never will be. The benefit doesn't justify more than doubling the cost, and that cost is significant.

Quote:
As well, if you have a display that can properly show a 24 Hz source at 24 Hz (frames per second), then you probably shouldn't interlace that either. But even if it were interlaced, as long as proper deinterlacing takes place, the original 24 fps can be restored at full resolution.

Really, with film there is NO DIFFERENCE, as it is simply how you are transmitting the signal. All the data is the same - PERIOD.

Quote:
The hard part is that properly identifying interlaced vs. progressive frame sequences can be difficult for some displays to do on their own, so it is typically better for the processing to be handled by the disc player.

Actually, it is extremely easy for the display to know if it is receiving an interlaced or progressive SIGNAL. The problem occurs with bad deinterlacing in a player outputting a progressive SIGNAL - thus turning off any processing in the display. If this is done improperly to an INTERLACED VIDEO signal, then there could be problems. If the source is FILM and either the source or display cannot put this back together properly - then you have a BROKEN device.
 

Registered · 10,679 Posts

Quote:
Originally Posted by PeterS /forum/post/0


Actually, it is extremely easy for the display to know if it is receiving an interlaced or progressive SIGNAL.

He was saying that it's difficult for a display to identify progressive frame sequences that are being sent in an interlaced signal.
 

Registered · 1,534 Posts

Quote:
Originally Posted by Cain /forum/post/0


Can someone explain something to me? I am asking a question here, not looking for an argument.


Why do most folks agree that 480p looks better than 480i, but many argue that 1080i looks exactly the same as 1080p?


--- Cain

In both cases it depends on the source material, the display being used, and the DVD player. It is not a given (unless we are talking about native 480p or native 1080p to a progressive display, which should look better, or at least as good).
 

Registered · 786 Posts
People are confused because of past displays that did no deinterlacing of their own. When progressive players came out, the results were apparent, because the TV was now being fed a progressive signal. Today, with all modern TVs having internal deinterlacers, they are always showing a progressive image, regardless of what you feed them. Every time people spazz on about how a player or game system ONLY puts out 1080i or 480i, it doesn't matter. As long as the display has a proper deinterlacer, you are losing NO INFORMATION.


Either you are letting your player deinterlace, or you are letting your display deinterlace. The end result is the same, assuming the electronics are competent in doing so.
 

Registered · 2,387 Posts

Quote:
Originally Posted by overcast /forum/post/0


Either you are letting your player deinterlace, or you are letting your display deinterlace. The end result is the same, assuming the electronics are competent in doing so.

This is really the crux right here, and the point was also made further up in more detail. The deinterlacing will happen regardless, and as long as there isn't a significant disparity in capabilities (i.e., your television is horrendous at deinterlacing and your disc player is much better at it), it just doesn't matter.
 

Registered · 10,679 Posts

Quote:
Originally Posted by overcast /forum/post/0


As long as the display has a proper deinterlacer, you are losing NO INFORMATION.

Whether your display has a "proper" deinterlacer is a valid question sometimes. Most recent displays do a great job of deinterlacing. My three-year-old Aquos isn't one of them: 1080i looks better when my cable box is set to 720p (the box probably does simple bob deinterlacing) than when my display deinterlaces the 1080i itself. I was very surprised by this.
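
For reference, "bob" deinterlacing simply line-doubles each incoming field into a full-height frame, so it avoids weave artifacts on motion but throws away half the vertical detail. A minimal sketch in plain Python (made-up representation, fields as lists of line strings):

Code:
def bob_deinterlace(field):
    """Line-double one 540-line field into a 1080-line frame ("bob")."""
    frame = []
    for line in field:
        frame.append(line)  # the real field line
        frame.append(line)  # duplicate standing in for the missing line
    return frame

field = [f"line {n}" for n in range(1, 1080, 2)]  # one 540-line field
assert len(bob_deinterlace(field)) == 1080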
 

Banned · 1,500 Posts

Quote:
Originally Posted by PeterS /forum/post/0


Some corrections:


No film is ever shot at 60 frames per second - and most likely never will be. The benefit doesn't justify more than doubling the cost, and that cost is significant.


Really, with film there is NO DIFFERENCE, as it is simply how you are transmitting the signal. All the data is the same - PERIOD.


Actually, it is extremely easy for the display to know if it is receiving an interlaced or progressive SIGNAL. The problem occurs with bad deinterlacing in a player outputting a progressive SIGNAL - thus turning off any processing in the display. If this is done improperly to an INTERLACED VIDEO signal, then there could be problems. If the source is FILM and either the source or display cannot put this back together properly - then you have a BROKEN device.

Very well put, thanks.
 

Registered · 1,635 Posts

Quote:
Originally Posted by Artwood /forum/post/0


P is always better than I, and big numbers are always better than little numbers. It's a FACT!

Not an informative statement. Read the posts above to get a better idea of what is going on. Sorry, but it's these generalizations that cause the questioning in the first place.
 

Registered · 2,387 Posts
Maybe Artwood was joking? It kind of seems like it, but forums sometimes don't convey that.
 

Registered · 4,849 Posts
Half joking, half serious.


Look, perfect interlaced display doesn't look as good as perfect progressive display - just like dots on a CRT don't look as good as paint on the paper of a magazine.


Who's going to say that less perfect dots or pixels will look better than A LARGER NUMBER of perfect dots or pixels?


Of course, the operative word is perfect. Since these things aren't perfect - yes, interlaced with fewer pixels could be better.


The bottom line of all of this, though, is that the closer one gets to perfection, the closer my generalization will be to universal truth!


Of course, some of the discerning videophile fans of classic AVS prose here knew that already!
 

Registered · 4,273 Posts
To clarify this for myself: a 1080i cable/OTA signal going to my 1080p fixed panel is, for all intents and purposes, 1080p?


That said, this shows me there is just a wee difference in noticeable quality between 720p and 1080i/1080p with video, for me at least. 1080i today is, for all intents and purposes, 1080p - the whole "i" difference was due to CRTs. The difference between 720 and 1080 is not something many people would, in general, be all that excited about. That is why HD is not only about resolution but also bit rate and color bit depth. My main reason for wanting 1080p is my HTPC, so that my desktop and games can run at 1080p resolution - and that is about it.


So, that being said, the A1's 1080i output is nothing to hold against it, because it is the same as 1080p output: any 1080p device will take care of the conversion itself, and no resolution or anything else is lost. There is no reason to wait for an HD DVD player with 1080p output. It's a sales gimmick. All correct?
 

Registered · 135 Posts
Below is Evan Powell's (Projector Central) appraisal of the 1080i vs 1080p controversy.


"The truth is this: The Toshiba HD-DVD player outputs 1080i, and the Samsung Blu-ray player outputs both 1080i and 1080p. What they fail to mention is that it makes absolutely no difference which transmission format you usefeeding 1080i or 1080p into your projector or HDTV will give you the exact same picture. Why? Both disc formats encode film material in progressive scan 1080p at 24 frames per second. It does not matter whether you output this data in 1080i or 1080p since all 1080 lines of information on the disc are fed into your video display either way. The only difference is the order in which they are transmitted. If they are fed in progressive order (1080p), the video display will process them in that order. If they are fed in interlaced format (1080i), the video display simply reassembles them into their original progressive scan order. Either way all 1080 lines per frame that are on the disc make it into the projector or TV. The fact is, if you happen to have the Samsung Blu-ray player and a video display that takes both 1080i and 1080p, you can switch the player back and forth between 1080i and 1080p output and see absolutely no difference in the picture. So this notion that the Blu-ray player is worth more money due to 1080p output is nonsense."
 

Registered · 9,578 Posts

Quote:
Originally Posted by HorrorScope /forum/post/0


Right, so stations that transmit 1080i HD really are transmitting 1080p to us for all intents and purposes.

Stations that transmit film-based content as *telecined* 1080i are essentially transmitting 1080p if your display or video processor can perform IVTC on 1080i.


Otherwise it is not the same.
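
To illustrate what that IVTC (inverse telecine) step does, here is a rough sketch in plain Python. It assumes a clean 2:3 cadence that has already been detected; in practice the detection is the hard part, since broadcast edits can break the cadence mid-stream. Names and data representation are made up for illustration:

Code:
def inverse_telecine(fields):
    """Collapse a clean 2:3 pulldown field sequence back to 24p frames.

    In each group of 10 fields, the pairs (0,1), (2,3), (5,6), (7,8)
    are the four original film frames; fields 4 and 9 are the repeats
    that the telecine inserted.
    """
    frames = []
    for i in range(0, len(fields) - 9, 10):
        g = fields[i:i + 10]
        for top, bottom in ((0, 1), (2, 3), (5, 6), (7, 8)):
            frames.append((g[top], g[bottom]))
    return frames

# Build a 2:3 cadence from 4 numbered frames (10 fields), then recover them.
fields = []
for n in range(4):
    top, bottom = (n, "top"), (n, "bottom")
    fields += [top, bottom] if n % 2 == 0 else [top, bottom, top]
assert inverse_telecine(fields) == [((n, "top"), (n, "bottom")) for n in range(4)]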
 