Status
Not open for further replies.
1 - 20 of 36 Posts

·
Registered
Joined
·
493 Posts
Discussion Starter · #1 ·
The subject of 1080i vs. 720p came up in the AnandTech forums, with some claiming the latter superior to the former. This actually arose concerning an InFocus RP, but I suppose it is relevant to all displays, and since I only have plasmas, I thought I would post it here.


The following article was referenced by one person who claimed 720p to be better: 720p vs 1080i


So, is this true, or not? I must admit that I can detect a definite improvement with 1080i over 720p, whether passed through or upconverted. Thoughts?


Thanks for your input.
 

·
Banned
Joined
·
314 Posts
720p IS better


Clearly.


Ever watch football? They say it's broadcast in "720p, the world's finest" or something like that, lol


But aside from that... yes, 720p is better because it's progressive


I think that if 1080i were progressive, it would work out to something like 540p; 720p being higher than that, it's better


At least this is what I believe to be true
 

·
Registered
Joined
·
109 Posts
You're going to notice more flicker during motion, but your eyes are noticing the difference in resolution.


In spite of what the article's author says, 1080i displays do resolve more pixels than a 720p set, even if they "cheat" by only repainting half of the screen at one go: 1920 × 1080 > 1280 × 720, a mathematical fact.


Some people are going to notice flicker more than others. Something I read states that images persist for approximately 1/25th of a second on the human retina, or about 25 fps. Of course, everyone's biology is subtly different, and so there are differences in perception. To some people, 1080i is perfectly acceptable, while others are too distracted by the flickering to enjoy the picture.


Sounds like you are already aware of the newer sets that are capable of resolving 1080i and upconverting the signal to 1080p, which should reduce the amount of flicker that is noticeable. Of course, the amount of reduction will vary from person to person and will depend upon the quality of the set's internal deinterlacer.


If you haven't done so already, you will definitely want to check out the current cable/satellite service offerings for your area. For example, my cable provider (Cox) broadcasts exclusively in 1080i in my area, so I'm going to have to buy one of the 1080p-upconverting sets if I want to see the channels in full resolution (and due to my location, I'm ineligible for satellite, unfortunately).
 

·
Super Moderator
Joined
·
21,531 Posts
lots of misleading info floating around


one thing we should all agree on:


you should view the source material in its native resolution if you can: for example, if you are viewing ESPN HD, it's likely 720p: view it as 720p


if you are watching HDNet, it is likely 1080i: view it as 1080i


that is why set top boxes that have a 'pass through' mode are best


Myself, I prefer to view 1080i if I am upscaling: it is really a personal preference and depends on your gear :)


If you have a 1080i source, deinterlacing it to 1080p adds NOTHING to the picture quality
 

·
Registered
Joined
·
4,162 Posts
I can easily see the flickering and I can actually see combing with interlaced material on an interlaced display. (too bad I can't really afford anything more than an interlaced TV at the moment)
 

·
Super Moderator
Joined
·
21,531 Posts
Quote:
Originally Posted by DonoMan
I can easily see the flickering and I can actually see combing with interlaced material on an interlaced display. (too bad I can't really afford anything more than an interlaced TV at the moment)
this being the flat panel forum, my comments are related to fixed pixel progressive displays ;)
 

·
Registered
Joined
·
1,548 Posts
I used to think that 720p should be better due to progressive scan; now, after watching a lot of HD in both, I don't think so. To begin with, even half a frame of 1080i has more information than a full 720p frame (1920 × 540 = 1,036,800 vs. 1280 × 720 = 921,600).

Sorry, but I have not seen any flicker on my setup. Interlacing artifacts? Again, I haven't seen any, maybe because the pixels in 1080i are something like six times smaller than in standard video. The only time I could see some advantage in 720p would be freeze frames and the like, but even then I'm not so sure. For example, the HC1, being a 1080i camcorder, should have problems with freeze frames, but the CMOS sensor in it is progressive: the video is scanned progressively and only later interlaced for recording.

Either way, this is kind of academic right now, since very few people actually have displays capable of showing full 1080i resolution to begin with, and most opinions are based on preconceptions, or even worse, on watching some ED display not capable of properly displaying either HD format. So the problems you see could be totally unrelated to the video itself and down to the scaler implementation instead.
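A quick sanity check of the pixel arithmetic above (the resolutions are the standard HD formats; the script itself is just illustrative):

```python
# Pixel counts for the HD formats under discussion.
full_1080 = 1920 * 1080    # one full 1080-line progressive frame
field_1080i = 1920 * 540   # one 1080i field (half the lines)
full_720 = 1280 * 720      # one full 720p frame

print(f"1080p frame: {full_1080:,} px")    # 2,073,600
print(f"1080i field: {field_1080i:,} px")  # 1,036,800
print(f"720p frame:  {full_720:,} px")     # 921,600

# Even a single 1080i field carries more pixels than a full 720p frame.
assert field_1080i > full_720
```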
 

·
Registered
Joined
·
8,317 Posts
We need to distinguish between movies (24 fps) and video (60 fps).


MOVIES:

Let's say we have a 24fps 1080p movie. Broadcasting is always done at 60fps (I believe).

(a) Let's say we broadcast it as 720p. We have to do two things: (1) downscale 1080p to 720p, and (2) repeat some frames in order to get 60 fps from those 24 fps.

(b) Let's say we broadcast it as 1080i. The resolution is already fine; we just need to convert the 24p to 60i. That means we split the 24p frames into 48 fields, then repeat some fields to get from 48 to 60 fields per second.
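The field-repetition step in (b) is the standard 2:3 (telecine) pulldown: alternate film frames are held for two and then three video fields. A minimal sketch of that cadence (the frame labels are just illustrative):

```python
def telecine_2_3(frames):
    """Expand 24 film frames/sec into 60 fields/sec using the
    2:3 pulldown cadence: 2 fields, 3 fields, 2, 3, ..."""
    fields = []
    for i, frame in enumerate(frames):
        repeats = 2 if i % 2 == 0 else 3
        fields.extend([frame] * repeats)
    return fields

# Every 4 film frames become 10 fields, so 24 frames -> 60 fields.
print(telecine_2_3(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```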


VIDEO:

Let's say we have a 60fps 1080p video. Broadcasting is always done at 60fps (I believe).

(a) Let's say we broadcast it as 720p. We have to downscale 1080p to 720p. The frame rate is already ok, so we don't need to touch that this time.

(b) Let's say we broadcast it as 1080i. The resolution is ok; we don't need to change that. However, we can only send half of the pixels the original video contains. In order to send as much data as possible, we alternate between the odd and even lines of the original 1080p data with each field. This way, although we drop half of the pixels, in non-moving scenes we keep "almost" the full resolution. We can run into serious trouble with jaggies and combing in fast-moving scenes shown this way, though.
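The alternation described in (b) can be sketched as follows (a toy model that treats each frame as a list of lines; real encoders of course operate on full fields inside the MPEG stream):

```python
def interlace_60p_to_60i(frames):
    """Drop half the lines of each frame, alternating which half:
    even lines from even-numbered frames, odd lines from odd ones."""
    fields = []
    for i, frame in enumerate(frames):
        start = i % 2          # alternate the starting line each frame
        fields.append(frame[start::2])
    return fields

# Two 4-line progressive frames become two 2-line fields.
f0 = ["line0", "line1", "line2", "line3"]
f1 = ["line0", "line1", "line2", "line3"]
print(interlace_60p_to_60i([f0, f1]))
# [['line0', 'line2'], ['line1', 'line3']]
```

For a static scene the two fields together still cover all 1080 lines, which is why non-moving images keep nearly full resolution.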


Now what does all that mean?


When looking at MOVIES, we have to say that using 720p reduces the detail in the data stream by downscaling. In contrast, 1080i doesn't drop *any* information: a good deinterlacer will easily be able to reconstruct the original 1080p24 from the 1080i60 data stream. So when looking at MOVIES, 1080i is definitely superior, provided that our deinterlacer is not too bad.
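That reconstruction is inverse telecine: because both fields of a film frame are sampled at the same instant, the deinterlacer can simply weave matching field pairs back into full frames with no loss. A hedged sketch of the weave step (toy line lists again, not a real video pipeline):

```python
def weave(top_field, bottom_field):
    """Interleave a top field (even lines) and a bottom field
    (odd lines) back into one full progressive frame."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.extend([t, b])
    return frame

# Two 540-line fields from the same film frame weave losslessly
# back into the original 1080-line frame.
top = ["even0", "even2"]      # toy 2-line field
bottom = ["odd1", "odd3"]
print(weave(top, bottom))
# ['even0', 'odd1', 'even2', 'odd3']
```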


When looking at VIDEO, the situation is more difficult. Both 720p and 1080i drop some information. For non-moving images, 1080i is definitely better. But for fast-moving images it gets more complicated. 720p will produce a clean image without motion or jaggy artifacts, but with slightly reduced resolution. When using 1080i for fast-moving VIDEO material, the results depend *VERY MUCH* on the quality of the deinterlacer. E.g., the Faroudja deinterlacers are known to work relatively well on interlaced VIDEO signals*; Faroudja chips can generally deinterlace them without introducing too many motion artifacts and jaggies. Unfortunately, most deinterlacers are not so clever, which results in more or less obvious motion artifacts and jaggies. We can't really say whether 720p or 1080i is better for fast-moving VIDEO, because it depends so much on the quality of the deinterlacer. With a good deinterlacer, 1080i might even be superior for sports. However, because most current deinterlacers don't handle 1080i very well in fast-moving scenes, 720p is probably the better choice for sports right now.


--------


*) Most Faroudja chips can't deinterlace 1080i videos properly, though. They're limited to SD video deinterlacing. Only the most expensive Faroudja standalone device can properly do its magic on 1080i videos.
 

·
Registered
Joined
·
80 Posts
I am a bit confused here about progressive vs. interlaced. I know what these terms mean, but I don't see what difference it makes. A plasma display has a frame buffer in it, and I am assuming there is one memory location for each pixel on the screen. Progressive scan implies that the memory is updated in scan order, and interlaced implies alternate lines. BUT since the whole frame is loaded into RAM and then displayed on the plasma screen, I am wondering just what the difference is from the standpoint of PQ.


Don't progressive and interlaced go back to the CRT days, when there was no frame buffering and info was just spewed out of the video amp into the cathodes? What does it mean for modern PDPs?


Joe
 

·
Registered
Joined
·
493 Posts
Discussion Starter · #11 ·
Well I'm at work and in kind of a rush, but I just wanted to say thanks for the replies - very informative. :)
 

·
Registered
Joined
·
15,358 Posts
Ahh...the sweet smell of a 720p vs 1080i thread....
 

·
Registered
Joined
·
671 Posts
The day the Westinghouse 37" 1080p LCD sets hit the floor at Best Buy, the clock started running. Anything less than 1080p is just a waste of time and money.

Let's see how long it takes for the industries to catch up to the technology.
 

·
Registered
Joined
·
1,047 Posts
Quote:
Originally Posted by nameless33
Let's see how long it takes for the industries to catch up to the technology.
I think it's going to take a long, long time. I do agree that at some point in the future everything [new] will be 1080p but that could easily be a decade away.
 

·
Registered
Joined
·
61 Posts
Quote:
Originally Posted by renlopez
What types of displays can actually display a 1080i picture? Plasmas are all progressive displays.
CRT.
 

·
Registered
Joined
·
550 Posts
I should probably ask this in the Direct View Forum, but I'm so used to being in this one.


How does a direct-view CRT convert a 720p signal to its native resolution, which I assume is 1080i? And how does a 720p signal look on a CRT compared to a 1080i signal?


Ren
 

·
Registered
Joined
·
912 Posts
One other thing to consider is the MPEG-2 compression used for HD broadcasts. The broadcast bandwidth is not sufficient to encode 1080i cleanly, and you get noticeable compression artifacts in the image, particularly with high motion. With 720p, the compression artifacts are not as noticeable.
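A rough way to see the squeeze this poster describes: at a fixed broadcast bitrate (assuming roughly the ~19.39 Mbps ATSC payload; the raw pixel rates are exact, the bits-per-pixel figures are only a back-of-the-envelope comparison), 1080i leaves the encoder fewer bits per pixel than 720p:

```python
ATSC_MBPS = 19.39  # approximate ATSC broadcast payload, Mbps (assumption)

# Raw pixel rates: 1080i sends 60 fields of 1920x540 per second,
# 720p sends 60 full frames of 1280x720 per second.
rate_1080i = 1920 * 540 * 60   # 62,208,000 pixels/sec
rate_720p = 1280 * 720 * 60    # 55,296,000 pixels/sec

for name, rate in [("1080i", rate_1080i), ("720p", rate_720p)]:
    bpp = ATSC_MBPS * 1_000_000 / rate
    print(f"{name}: {rate:,} px/s, ~{bpp:.2f} bits/pixel budget")
```

1080i's higher raw pixel rate means a tighter per-pixel budget, which is one reason its artifacts show up sooner in high-motion scenes.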
 