Official "1080p Vs. 720p" Thread Discussion - Page 42

post #1231 of 1467
I don't care if it's not going anywhere soon, because conversion is working. In every test, 720p showed no artifacts and delivers more frames per second. The EBU ruled on it: 720p won.

Have a nice day everyone.
post #1232 of 1467
Quote:
Originally Posted by Sole_Survivor View Post

I don't care if it's not going anywhere soon, because conversion is working. In every test, 720p showed no artifacts and delivers more frames per second. The EBU ruled on it: 720p won.

Have a nice day everyone.

Ian
post #1233 of 1467
Quote:
Originally Posted by mr. wally View Post

If 720p provides the best resolution and handles motion better than 1080i,
then why do all the networks that broadcast sports in 720p have inferior PQ to the 1080i networks? FOX, ABC and ESPN sports broadcasts are definitely inferior to those from CBS and NBC.

I too agree with this opinion. But it is really a trade-off. 1080i does tend to have more artifacts than 720p, but 720p looks very soft to me compared to 1080i on stills or minimal motion. So I can't really say one is better, but I prefer the sharpness of 1080i and live with potentially more artifacts rather than have a picture that is always soft with 720p. And I think you have to throw FOX out of the equation, because most of the time their picture is soft and full of artifacts (on sports, anyway).
post #1234 of 1467
Quote:
Originally Posted by mailiang View Post

Like I posted earlier, it depends on the amount of bandwidth being used. This not only varies from station to station, but also by how you receive the signal (i.e. OTA, cable, satellite, FiOS...). The greater the bandwidth, the higher the visual resolution, which has a major impact on detail and PQ, especially when it comes to HD formats.

http://www.maxim-ic.com/app-notes/index.mvp/id/750


Ian

Is it bandwidth or bitrate that determines signal quality?
post #1235 of 1467
Quote:
Originally Posted by mr. wally View Post

Is it bandwidth or bitrate that determines signal quality?

FWIW, OTA is 19.4 Mbps (typically about 18.3 Mbps usable), and for our location the PQ is usually better than what our neighbors have on cable. It's mostly noticeable on Sunday afternoon football. Bandwidth, bitrate, hard to tell, but I don't care.
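For a rough sense of what those numbers buy, here is a back-of-envelope sketch in Python. It assumes the full ~18.3 Mbps payload goes to a single HD program with nothing reserved for audio, subchannels or overhead, which real multiplexes never quite manage:

```python
# Rough per-pixel bit budgets for an ATSC OTA channel.
# A sketch only: assumes ~18.3 Mbps of usable video payload and ignores
# audio, PSIP tables and subchannels.

PAYLOAD_BPS = 18.3e6

def bits_per_pixel(width, height, frames_per_sec, payload_bps=PAYLOAD_BPS):
    """Average coded bits available per delivered pixel."""
    return payload_bps / (width * height * frames_per_sec)

# 720p60: 60 full progressive frames per second.
print(f"720p60 : {bits_per_pixel(1280, 720, 60):.3f} bit/px")

# 1080i60: 60 fields, i.e. 30 full interlaced frames per second.
print(f"1080i60: {bits_per_pixel(1920, 1080, 30):.3f} bit/px")

# Both land around 0.3 bit/px, which is why broadcasters treat the two
# formats as roughly equivalent loads on the same channel.
```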
post #1236 of 1467
Quote:
Originally Posted by mr. wally View Post

Is it bandwidth or bitrate that determines signal quality?

I would say bandwidth. Bit rate refers to the amount of data that can be processed in a given time; however, like audio signals, video signals perform based on their frequency response.


Ian
post #1237 of 1467
Anyone who thinks an interlaced signal is better than a progressive picture should stop smelling farts!
post #1238 of 1467
Quote:
Originally Posted by Sole_Survivor View Post

Anyone who thinks an interlaced signal is better than a progressive picture should stop smelling farts!

Ian
post #1239 of 1467
Quote:
Originally Posted by Sole_Survivor View Post

Anyone who thinks an interlaced signal is better than a progressive picture should stop smelling farts!

Agreed, if you are comparing the same resolution, say 1080i vs. 1080p. But when you reduce resolution to get the p, it becomes a much more subjective test.

Like a previous poster mentioned, turn on an NFL broadcast and within seconds I can tell you if it is 720p or 1080i, because 1080i always looks sharper. It may have a few more artifacts depending on the channel, but I prefer it over the 'soft' 720p. Now, granted, there is a heck of a lot more that goes into transmitting the signal than just the resolution, but I can count on one hand the number of times I have said a 720p NFL game looks as good as a 1080i one. I'll take CBS and NBC over ABC, ESPN and FOX 99.9% of the time.
post #1240 of 1467
Quote:
Originally Posted by primetimeguy View Post

Agreed, if you are comparing the same resolution, say 1080i vs. 1080p. But when you reduce resolution to get the p, it becomes a much more subjective test.

Like a previous poster mentioned, turn on an NFL broadcast and within seconds I can tell you if it is 720p or 1080i, because 1080i always looks sharper. It may have a few more artifacts depending on the channel, but I prefer it over the 'soft' 720p. Now, granted, there is a heck of a lot more that goes into transmitting the signal than just the resolution, but I can count on one hand the number of times I have said a 720p NFL game looks as good as a 1080i one. I'll take CBS and NBC over ABC, ESPN and FOX 99.9% of the time.


I can't tell you how grainy FX and FOX usually look on my sets. However, ABC and ABC Family look very good, though not any better than my premium channels, which are all 1080i. If de-interlacing were such an important factor when it comes to PQ, I believe the majority of stations today would be broadcasting in 720p instead.


Ian
post #1241 of 1467
First off, I have not read the entire thread, and I'm not in the US to see the broadcast quality.

However, in relation to the recent posts, I just want to chime in that there are a few threads that discuss the technical issues as well:
http://www.avsforum.com/avs-vb/showt...7#post20084797
http://www.avsforum.com/avs-vb/showt...9#post20214289
http://www.avsforum.com/avs-vb/showt...2#post20904732

And IMHO the variables present in this recent discussion make comparison difficult, if not futile. The reason for 720p is that it is a bridge between SD and HD sources, so there is good reason to have a 720p TV if you watch a lot of SD content (for now). Broadcasters use 1080i60 and 720p60 because they take about the same bandwidth, even though interlacing is a legacy CRT issue. Until they can upgrade their existing infrastructure and bandwidth is no issue (unlikely, looking at the past 30 years), this will be a legacy problem for some. Bandwidth and bitrate are related. Just think of internet speed.

1) A 720p display with a 1080i source is different from a 1080p display with a 720p source
2) It depends on the broadcaster's lossy compression algorithm, and also on how they capture the video. Think Garbage In, Garbage Out (GIGO). Even 4K resolution will not fix that
3) A live broadcast with processing on the fly is different from recorded content that can be processed in post production
4) There is a big difference between scaling first then interlacing vs. interlacing then scaling (see the sketch after this post). Hence it is always better to use just one part of your device chain to do the processing. The worst case is to have your set-top box do some, your AVR do some, and your TV do some again. That's why NATIVE mode is important
5) Last but not least, how big the display is and how far away you sit

One thing certain is that a higher frame rate is better for motion. When the above points are no longer an issue, then BD and any storage medium will be redundant, because everything will be streamed with no loss in quality. I doubt we're there yet.
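To illustrate point 4, here is a minimal sketch. It assumes a toy 8-line frame and naive 2:1 vertical averaging as the "scaler" (real chains use far better filters, but the ordering problem is the same): weaving two temporally different fields before scaling mixes two instants into every output line, while deinterlacing a single field first keeps each output frame temporally consistent.

```python
# A toy illustration of point 4 (assumed scenario, not any broadcaster's
# actual chain). A 2-px-wide bright bar moves 4 px between fields; we
# compare "weave then scale" against "bob one field, then scale".
import numpy as np

H, W = 8, 16

def field(t, parity):
    """One half-height field of an 8-line frame: a bar at x = 2 + 4*t."""
    frame = np.zeros((H, W))
    x = 2 + 4 * t
    frame[:, x:x + 2] = 1.0
    return frame[parity::2]              # keep only even or odd lines

top = field(0, 0)                        # instant t=0, even lines
bottom = field(1, 1)                     # instant t=1, odd lines

# (a) Interlace first: weave the two fields, then scale 2:1 vertically.
woven = np.zeros((H, W))
woven[0::2], woven[1::2] = top, bottom
scaled_after_weave = woven.reshape(H // 2, 2, W).mean(axis=1)

# (b) Deinterlace first: line-double one field (one instant), then scale.
bobbed = np.repeat(top, 2, axis=0)
scaled_after_bob = bobbed.reshape(H // 2, 2, W).mean(axis=1)

print(scaled_after_weave[0])  # 0.5 at x=2..3 AND x=6..7: two instants blended
print(scaled_after_bob[0])    # 1.0 at x=2..3 only: temporally consistent
```

Each extra box in the chain that repeats this kind of step compounds the damage, which is the argument for native mode: let one device do the deinterlacing and scaling, once.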
post #1242 of 1467
Quote:
Originally Posted by Sole_Survivor View Post

I don't care if it's not going anywhere soon, because conversion is working. In every test, 720p showed no artifacts and delivers more frames per second. The EBU ruled on it: 720p won.

The number of output frames per second after de-interlacing 1080i (i.e. if both fields are from different points in time) is the same as 720p, i.e. the motion should look as smooth on both, not twice as smooth on 720p.
post #1243 of 1467
Quote:
Originally Posted by Joe Bloggs View Post

The number of output frames per second after de-interlacing 1080i (i.e. if both fields are from different points in time) is the same as 720p, i.e. the motion should look as smooth on both, not twice as smooth on 720p.

Are you saying 1080i60 and 720p60 have the same 60 fps?
post #1244 of 1467
Quote:
Originally Posted by specuvestor View Post

Are you saying 1080i60 and 720p60 have the same 60 fps?

I'm saying that if you shoot sport in interlaced mode, each field is from a different point in time. There are 60 fields being captured per second. When de-interlaced, the 60i signal should get converted to 60 frames (not fields) per second. The progressive TV will (or should, if correctly de-interlaced) show the same number of motion samples per second for content shot in interlaced mode as for content shot in 720p mode, so both should look equally smooth in terms of motion (as long as the interlaced footage was shot with each field from a different point in time, and is not 24 Hz or 30 Hz content).

Also, note that stations that broadcast in 720p60 will often be doing so from a 60i source, so the source the 720p60 version is being made from (in those cases) will have the same motion quality.

It's true that 60i isn't as good as 60p at the same resolution (e.g. 1080p60 is better than 1080/60i), but both should look the same in motion smoothness (if the 60i contains fields taken from different points in time).
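A minimal sketch of that counting argument, assuming the simplest possible deinterlacer (bob, where every field becomes one output frame); the timestamps and function names here are illustrative, not any real decoder's API:

```python
# 1080i60 carries 60 temporal samples (fields) per second, so a correct
# deinterlacer can emit 60 distinct output frames per second -- the same
# motion cadence as native 720p60. Illustrative only.

FIELD_RATE = 60  # "1080i60" = 60 fields/s = 30 interlaced frame pairs/s

def bob_deinterlace(field_times):
    """Bob deinterlacing: each field is line-doubled into one frame."""
    return [("frame", t) for t in field_times]

fields_1080i = [n / FIELD_RATE for n in range(FIELD_RATE)]  # one second
frames_1080i = bob_deinterlace(fields_1080i)
frames_720p = [("frame", n / 60) for n in range(60)]        # native 60p

# Same number of motion samples per second on both paths.
print(len(frames_1080i), len(frames_720p))  # -> 60 60
```

The trade-off is spatial rather than temporal: each of those 60 interlaced samples only carries half the vertical lines.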
post #1245 of 1467
Correct, if that's how the motion was captured, as per my point 2.

But I think motion is increasingly captured progressively now vs. interlaced years back. Even the Apple 4S does 1080p30 video recording.
post #1246 of 1467
Quote:
Originally Posted by specuvestor View Post

Correct, if that's how the motion was captured, as per my point 2.

But I think motion is increasingly captured progressively now vs. interlaced years back. Even the Apple 4S does 1080p30 video recording.

Then the Apple camera isn't very good (they're doing it to save bitrate/bandwidth/processing; anyone who uses it to shoot sport or wants it to look realistic will see it looks much worse than 60i/p). Other consumer video cameras can capture 1080p60.

Also, most live content seems to be shot at higher Hz (for motion smoothness it's not necessarily whether it's captured progressively or interlaced, but how often a new image is taken, and whether it gets delivered that way to the TV). And that's the way the newer cinema films seem to be going (The Hobbit, Avatar 2, etc.).
post #1247 of 1467
Quote:
Originally Posted by Joe Bloggs View Post

Then the Apple camera isn't very good. Other consumer video cameras can capture 1080p60.

Also, most live content seems to be shot at higher Hz (for motion smoothness it's not necessarily whether it's captured progressively or interlaced, but how often a new image is taken). And that's the way the newer cinema films seem to be going (The Hobbit, Avatar 2, etc.).

Maybe, but the point is that interlaced video capture is passé.

And you would have glaring interlace artifacts when each interlaced field makes up a frame. Combing should be obvious. It's not just about 'smoothness'.

Yes, higher fps is better for motion, but 1080p would be better than 1080i at the same fps, even assuming progressive capture. There are some who disagree, which is covered in one of the links I provided. In short, it would be the same if you interlaced and de-interlaced using the same device, but not when you push the fields through different devices, algorithms and pipes, as in broadcast.

Quote:
Originally Posted by specuvestor View Post

One thing certain is that a higher frame rate is better for motion.
post #1248 of 1467
Quote:
Originally Posted by specuvestor View Post

Yes, higher fps is better for motion, but 1080p would be better than 1080i at the same fps.

True (sort of; I think you mean at the same Hz or output frames per second. At the same number of full frames captured per second, 1080i might be better for motion: if in interlaced mode you could capture 60 full frames per second, you could have 120 fields per second, though in practice you can't do that).

But no broadcaster or producer (e.g. Blu-ray) or other content source (apart from consumer cameras etc., but no full films/TV programmes) is giving us 1080p50/60 content yet, so as of now, 1080i with 50/60 Hz content is better for realistic/smooth motion (or 720p50/60 if you don't mind the lower resolution on more static shots).
post #1249 of 1467
Quote:
Originally Posted by specuvestor View Post

1) A 720p display with a 1080i source is different from a 1080p display with a 720p source
2) It depends on the broadcaster's lossy compression algorithm, and also on how they capture the video. Think Garbage In, Garbage Out (GIGO). Even 4K resolution will not fix that
3) A live broadcast with processing on the fly is different from recorded content that can be processed in post production
4) There is a big difference between scaling first then interlacing vs. interlacing then scaling. Hence it is always better to use just one part of your device chain to do the processing. The worst case is to have your set-top box do some, your AVR do some, and your TV do some again. That's why NATIVE mode is important
5) Last but not least, how big the display is and how far away you sit

Good points. Thanks for posting. But as far as compression is concerned, other than OTA, doesn't it vary not only depending on the broadcaster, but also depending on the service provider (cable, satellite, FiOS...)? Also, when I was referring to bandwidth limitations in terms of FR before, I was referring to visual resolution (http://www.maxim-ic.com/app-notes/index.mvp/id/750), which varies depending on the source, such as 1080p BD versus 1080p broadcast (mostly PPV) versus 1080p internet streaming (i.e. VUDU).
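The rule of thumb behind that app note can be sketched quickly. This assumes NTSC-style numbers (about 52.6 µs of active line time, 4:3 aspect ratio) and is only the classic approximation, not Maxim's exact derivation:

```python
# Rough link between analog signal bandwidth and horizontal "visual
# resolution" in TV lines per picture height (TVL). Assumes ~52.6 us
# active line time and a 4:3 picture; a sketch of the classic rule of
# thumb, not an exact model.

ACTIVE_LINE_S = 52.6e-6
ASPECT = 4 / 3

def tvl_per_picture_height(bandwidth_hz):
    # One full cycle of the highest frequency resolves 2 vertical lines.
    lines_across_width = 2 * bandwidth_hz * ACTIVE_LINE_S
    return lines_across_width / ASPECT

print(round(tvl_per_picture_height(4.2e6)))  # ~330 TVL for 4.2 MHz NTSC
```

That's the sense in which more bandwidth translated directly into visible detail in the analog world; in digital chains, bitrate plays the analogous role.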



Ian
post #1250 of 1467
^^^ Yes, all along the value chain the source can be "meddled" with. That is why BD will still be superior in the near future, as it limits the number of nodes in the chain. Most of the imperfections in the real world over the past 50 years are a function of hardware and bandwidth limitations. Otherwise everything would have been lossless with enough bit depth and bit rate, rendering distinctions between mediums like BD or Flash and transports like broadcast, streaming or AVC irrelevant. They would all be the same.

And the source has to be the same if you want to have a meaningful discussion with the OP. For one, comparisons between different stations are a red herring.

PS: Maxim is a semiconductor company. They are talking about circuit bandwidth, but I am talking about transmission bandwidth. It's like comparing a PCI bus vs. internet bandwidth, not that I understand the maths.

Quote:
Originally Posted by Joe Bloggs View Post

True (sort of; I think you mean at the same Hz or output frames per second. At the same number of full frames captured per second, 1080i might be better for motion: if in interlaced mode you could capture 60 full frames per second, you could have 120 fields per second, though in practice you can't do that).

But no broadcaster or producer (e.g. Blu-ray) or other content source (apart from consumer cameras etc., but no full films/TV programmes) is giving us 1080p50/60 content yet, so as of now, 1080i with 50/60 Hz content is better for realistic/smooth motion (or 720p50/60 if you don't mind the lower resolution on more static shots).

Strictly speaking, Hz is the display/device frequency while fps is the source frame rate. That made clear, doubling Hz by interlacing fps WILL induce artifacts. Think about it this way: if your logic worked, why bother with complicated MCFI algorithms? Just interlace all the content to achieve 120 Hz or 48 Hz or whatever.

There are many arguments about whether 1080i has higher or lower resolution than 720p. But one thing certain is that it is not close to 1080p motion resolution, especially with interlaced capture, even for a static image.

Like you said, newer films are going to higher fps. The Red One can shoot 2K at 120 fps progressive. I'm not sure how advanced general broadcast is now, but the BBC's Olympics coverage should be progressive after their 1080p25 trial earlier this year. Certainly interlaced capture is passé, and interlaced transmission will be too in future. And I'm not mourning it. It's time to let go of the CRT legacy/baggage. It was useful while it lasted, but it has served its purpose (likewise the ingenious 2:3 telecine).
post #1251 of 1467
Quote:
Originally Posted by specuvestor View Post

Strictly speaking, Hz is the display/device frequency while fps is the source frame rate.

Yes, I was just trying to point out that at the same source fps, interlaced would have double the temporal resolution.
Quote:


That made clear, doubling Hz by interlacing fps WILL induce artifacts. Think about it this way: if your logic worked, why bother with complicated MCFI algorithms? Just interlace all the content to achieve 120 Hz or 48 Hz or whatever.

Yes, I know it won't be as good/accurate/as true a resolution per motion sample (which is why I think we should have 1080p50/60 instead). But couldn't that be sort of how some consumer camcorders/cameras currently achieve high fps (e.g. 200-1000 fps) for slow motion?
eg. recording 240 fps at 448x336 or 600 fps at 192x108

A bit like interlacing, but using different pixels in the sensor for different temporal samples, something like this:

http://www.technomaly.com/2010/02/18...h-speed-video/
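Quick arithmetic on those modes (a sketch; the fixed-readout-budget framing is my assumption about the hardware, not a published spec) shows they all fit within the pixel throughput of the camera's full-resolution mode, trading resolution for frame rate:

```python
# Pixel throughput of the quoted slow-motion modes vs. normal 1080p30.
# Illustrative numbers only.
modes = {
    "1080p30":     (1920, 1080, 30),
    "448x336@240": (448, 336, 240),
    "192x108@600": (192, 108, 600),
}
for name, (w, h, fps) in modes.items():
    print(f"{name:>12}: {w * h * fps / 1e6:5.1f} Mpx/s")
# 1080p30 reads ~62 Mpx/s; the high-fps modes stay well under that,
# so higher temporal sampling is bought with lower spatial resolution.
```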
Quote:


There are many arguments about whether 1080i has higher or lower resolution than 720p. But one thing certain is that it is not close to 1080p motion resolution, especially with interlaced capture, even for a static image.

I was talking mostly about motion quality (smoothness of motion), and we have no broadcast-to-home/Blu-ray 1080p50/60 sources yet, so interlaced is the better currently delivered format for motion smoothness/realism (or 720p50/60...).
Quote:


Like you said, newer films are going to higher fps. The Red One can shoot 2K at 120 fps progressive. I'm not sure how advanced general broadcast is now, but the BBC's Olympics coverage should be progressive after their 1080p25 trial earlier this year.

The Olympics will also have some events shot at 7680x4320p60, which is a better format (though not to consumers' homes).

The BBC is already broadcasting (in the UK anyway) in a sort of 1080p25 mode or 1080/50i depending on the GOP (I think it's all carried within 1080/50i, because that's how some set-top boxes identify it, but some parts must be flagged as 1080p25 somehow). Obviously, when the GOP is 1080/50i, that will be the one with twice the motion quality (motion smoothness), and when the motion is jerky/stroby/unrealistic, that will be the 1080p25 mode. So having an added mode for unrealistic/jerky/stroby motion doesn't seem very advanced to me, especially if it could make mistakes or, due to only being allowed to switch once per GOP, switch to that mode on content that is 50 Hz. You mention the Olympics: I doubt they would want the 1080p25 mode for the actual live events, it being half as good for motion quality/realism as the 50i mode.
post #1252 of 1467
Why did the EBU rule 720p was a better transmission format? Click the link in my profile. 720p won!
post #1253 of 1467
Quote:
Originally Posted by Sole_Survivor View Post

Why did the EBU rule 720p was a better transmission format? Click the link in my profile. 720p won!

No one's disputing that. All else remaining equal, compared to 1080i, 720p is probably the better format. However, there are other factors that have to be considered. I believe specuvestor did a great job of listing these points to help explain some of that.

Quote:
Originally Posted by specuvestor View Post


1) A 720p display with a 1080i source is different from a 1080p display with a 720p source
2) It depends on the broadcaster's lossy compression algorithm, and also on how they capture the video. Think Garbage In, Garbage Out (GIGO). Even 4K resolution will not fix that
3) A live broadcast with processing on the fly is different from recorded content that can be processed in post production
4) There is a big difference between scaling first then interlacing vs. interlacing then scaling. Hence it is always better to use just one part of your device chain to do the processing. The worst case is to have your set-top box do some, your AVR do some, and your TV do some again. That's why NATIVE mode is important
5) Last but not least, how big the display is and how far away you sit

One thing certain is that a higher frame rate is better for motion. When the above points are no longer an issue, then BD and any storage medium will be redundant, because everything will be streamed with no loss in quality. I doubt we're there yet.


Ian
post #1254 of 1467
Well, I respectfully disagree with point 4 of that list. It seems it should work that way, common-sense wise, but it is not the result I am getting.
post #1255 of 1467
Quote:
Originally Posted by Sole_Survivor View Post

Why did the EBU rule 720p was a better transmission format? Click the link in my profile. 720p won!

One thing to point out is that they tested using MPEG-4, which isn't something we can compare well, since OTA here is MPEG-2.
post #1256 of 1467
I don't understand it, guys. Picture it this way, like paint colors, shiny and flat: when I put the box in 1080 it looks flat; convert to 720 and it's shiny, glossy.
Why is that?
post #1257 of 1467
Quote:
Originally Posted by Sole_Survivor View Post

I don't understand it, guys. Picture it this way, like paint colors, shiny and flat: when I put the box in 1080 it looks flat; convert to 720 and it's shiny, glossy.
Why is that?

That sounds like a particular box or TV issue. It has nothing to do with the transmission resolution.
post #1258 of 1467
No, 3 different boxes, three different TVs; the only common factor is that it's DirecTV. Yeah, 1080i has its moments; like I said before, outdoor sunny scenes of golf, for some reason. Other than that, in regular scenes 1080i drops glossiness, if that's actually a word, but I'm just trying to describe it the best I can. Other cable boxes may have different results. I do not see this on Blu-ray's 1080i format.
post #1259 of 1467
Quote:
Originally Posted by Sole_Survivor View Post

No, 3 different boxes, three different TVs; the only common factor is that it's DirecTV. Yeah, 1080i has its moments; like I said before, outdoor sunny scenes of golf, for some reason. Other than that, in regular scenes 1080i drops glossiness, if that's actually a word, but I'm just trying to describe it the best I can. Other cable boxes may have different results. I do not see this on Blu-ray's 1080i format.

Are all 3 boxes the same model? As for glossiness, keep in mind live sports have a much different look than sitcoms and dramas that were probably shot on film.
post #1260 of 1467
Quote:
Originally Posted by Sole_Survivor View Post

No, 3 different boxes, three different TVs; the only common factor is that it's DirecTV. Yeah, 1080i has its moments; like I said before, outdoor sunny scenes of golf, for some reason. Other than that, in regular scenes 1080i drops glossiness, if that's actually a word, but I'm just trying to describe it the best I can. Other cable boxes may have different results. I do not see this on Blu-ray's 1080i format.

Blu-ray is a progressive format. You may have an issue with your LNBF or switch, since you have already tried several boxes. If you're not getting close to Blu-ray quality on stations like AMC HD, TNT HD, Showtime and Starz, with the box set to native, then it's 'we have a problem, Houston!'


Ian