
Official "1080p Vs. 720p" Thread Discussion - Page 48

post #1411 of 1467
Great to find this thread again after 6 years:

will make reference to it in the new threads discussing 4K resolution... which many say offers little improvement over 1080p...
post #1412 of 1467
Quote:
Originally Posted by Sole_Survivor View Post

Lol! Hey I got a guy arguing with me who posted a photo of 2 girls that look like they had hair as lime green as the aliens in the Lost In Space episode, trying to prove his point against all these facts.
Why do you keep calling them "photos" when you have been told that they are direct screen captures, where any colours in those images should be how they actually are in the source, and not how they look when photographed by a particular camera of a particular TV - which is what your pictures are?

Why do you keep ignoring the posts that say various TVs keep a memory of settings for each resolution (eg. 1080i and 720p) and that ask you to ensure your display is calibrated for each input resolution? If you want the best, most accurate TV picture, why not do that? Also, has your PC's monitor been calibrated (if that's what you are using to view the 'direct screen capture' image)?
Edited by Joe Bloggs - 8/3/12 at 8:08pm
post #1413 of 1467
Quote:
Originally Posted by markrubin View Post

Great to find this thread again after 6 years:
will make reference to it in the new threads discussing 4K resolution... which many say offers little improvement over 1080p...

+1!
post #1414 of 1467
Quote:
Originally Posted by HogPilot View Post

+1!
Though most of the recent posts are about 1080i vs 720p (not 1080p vs 720p like in the thread title). Should we have a "1080i vs 720p" thread?
post #1415 of 1467
Quote:
Originally Posted by Joe Bloggs View Post

Why do you keep

Fill in the blank after that... because he's here to find information that supports a conclusion rather than let complete information lead him to one. Notice the continued leaning on 6-7 year old material from the interlaced-to-progressive transition... anything newer would render this entire issue moot. Add to that the continued use of hyperbolic, inaccurate terms applied to the opposing side of the discussion, and you have a very emotional position backed by very little fact. Perhaps at this point we should jump into a discussion of Beta vs VHS or BD vs HD DVD... I'm sure they'd be equally relevant and fruitful...
post #1416 of 1467
Quote:
Originally Posted by Joe Bloggs View Post

Though most of the recent posts are about 1080i vs 720p (not 1080p vs 720p like in the thread title). Should we have a "1080i vs 720p" thread?

Very true...the 1080i vs 720p argument is an old one that really has little relevance or bearing upon anything...except for those who seem to have convinced themselves of imaginary boogeymen buried in one of the two.
post #1417 of 1467
Direct captures make my point even more... his source was 1080i, he claimed it was properly deinterlaced, and their hair had more lime in it than the aliens in Lost In Space. The interlaced signal is still reduced by 60% due to filtering/compression and still has all the same artifact problems it always had, and those horrible direct captures, as you call them, prove my point along with all the evidence I provided. Your side presented no evidence, other than a claim that, yeah, on paper 1080i has 2 virtual fields that make up approx 2 million pixels, but link after link I provided evidence that shows it is reduced by 60% due to filtering, compression, Kell factor, etc. You people are coming from the point of view of Blu-ray, maybe Blu-ray 1080i, but not broadcast. My evidence, as well as the artifacts in those girls' hair, proved my point, and I won this debate clearly, a slam dunk.

Now take that horrible broadcast signal, since it's reduced: if you take it out of 1080i and convert it to 480i, yeah, we know the TV still scales it to 1080p, but you sent fewer real pixels into the set, because your box, if it's properly working, took the signal out of 1080i, where it is no longer 1080i, it's now 480i. Click the info button on the TV remote, it will say 480i. OK... follow me now... When you convert it to 720p, it is no longer 1080i, it is now 720p/60... And yes... spare me... I do understand that the source material is not real 60 fps, but the box is still processing the image at twice the speed in a single painted frame when PROPERLY converted to 720p/60. Now you're feeding your TV a signal that cleaned up the artifacts BEFORE THEY HIT YOUR SET in a single progressive frame, rather than 2 F...D up broken fields reduced by 60% at half the frame rate.

I know exactly what is in your head... the TV is going to deinterlace it anyway... no. Does it deinterlace 480i to 1080p quality? No! Because you're not feeding it that signal... it tries to scale it to 1080p and it does, because it's a fixed pixel resolution, but it is not 1080p quality. When you drink dirt, your immune system can fight off infections, but notice that while all your cells are fighting this, your body feels like crap? Why? Because it's working harder. You're doing the same thing to your TV by feeding A PROGRESSIVE monitor 2 screwed-up fields reduced by 60% containing artifacts, making the TV work harder. The same way you filter contaminants out of water before it hits your body to make your body work less hard, filter the garbage out of that broken-up, artifacted signal into a single frame at twice the frame rate going into your set.

I'm really talking to the readers, because you people will defy all logic, tests, results and evidence presented. I was right since day one, and test after test I was able to verify it. Like the video stated, it makes it very difficult for interlaced signals on progressive monitors: STAY WITH 720p and 1080p IF AT ALL POSSIBLE "Tech Ev."... I know... yeah... they're wrong too. BTW, read the article in my profile link; the side-by-side test used monitors that deinterlaced 1080i to 1080p, Pioneer EX5000 plasmas, and the source material was a 1080p signal that was converted to 720p and 1080i. Here is a sample from 2006 of TVs that either passed or failed the 1080i deinterlace to 1080p test.

Make/Model Technology Pass/Fail

Epson LS65HD1 3LCD RP Pass
Mitsubishi WD-62627 DLP RP Fail
Mitsubishi WD-73927 DLP RP Fail
Mitsubishi WD-52627 DLP RP Fail
Mitsubishi LC-3780 LCD FP Pass
Samsung HL-R5668 DLP RP Pass
Sharp LC-45GD5U LCD FP Fail
Sony KDSR60XBR1 LCOS RP Pass
Sony KDSR50XBR1 LCOS RP Pass
Westinghouse LVM-37W1 LCD FP Pass

So even some of these cheap monitors passed converting 1080i to full 1080p by not using one field at a time... meaning a full 1080i signal, not one reduced by 60%... And BTW, Pioneer Plasma is #1.


All presented on the same 1080p monitors... Why? Again, another link verifying the same links I quoted earlier about BROADCAST COMPRESSION: "In the presentation of uncompressed sequences, the delegates reported difficulties in seeing differences between the three formats – even at a viewing distance of 3h. But when the compressed images were shown, the viewers did notice differences in the visibility of compression artefacts. Depending on the viewing distance and scene content, the artefacts became visible to a greater or lesser extent and, with few exceptions, the following were reported: The 720p format showed better image quality than the 1080i format for all sequences and for all bitrates;"
http://tech.ebu.ch/docs/techreview/trev_308-hdtv.pdf
Notice it said in certain scenes... Just what I said, they are not always there, I can show you dozens of 1080i shots with no artifacts, but I know where they show up.

Another link saying the same.."720p vs. 1080i

TV stations would normally broadcast in either 720p or 1080i but not both; the predominant format is 1080i. This in itself is not an issue; all present HDTV sets can display pictures in any HDTV format by up-converting or down-converting to the set native resolution, i.e. the one in which the set is designed to display the image.

From a pixel-count perspective, 1080i supports better spatial resolution than a 720p HDTV. In fact, 1080i supports a pixel count of over 2 million pixels as against the 0.92 million pixels supported by 720p HDTV. But in reality, the situation is somewhat different when it comes to an interlaced format.

As expressed earlier on, the differences between the two halves of an interlaced image lead to interlaced artifacts. To reduce the visibility of these artifacts, filtering is applied to the vertical resolution of an interlaced signal. This reduces the real image vertical resolution to some 60% of the number of scan lines supported by the 1080i interlaced format. Furthermore, 1080i material is limited to around 1440 pixels horizontally to reduce transmission bandwidth requirements; this reduces the overall effective resolution of the 1080i format to around 0.93 million pixels.

It is thus clear that the actual difference in effective resolution between 720p and 1080i is almost negligible. And this apart from the fact that a 720p display is capable of a better flicker-free picture when dealing with fast action TV content.

"Yet there is another issue against interlaced video, that of digital compression of images. Digital image compression is more efficient with progressive video at the source than interlaced video. High definition digital TV broadcast uses the same 6MHz maximum allocated broadcast bandwidth in the US as with standard definition analog TV. This means it is necessary to apply compression to make high definition images fit into the space allocated for a broadcasting TV channel. ". http://www.practical-home-theater-guide.com/1080p-hdtv.html

”720P, when compared with 1080i, provides better dynamic resolution, better motion rendition, the absence of interlace artifacts, and the absence of compression artifacts. It makes brighter pictures with a higher contrast ratio than 1080i. It is well matched to the resolution capability of consumer displays. It is a forward-looking technological choice that is compatible with computers, with advanced display technologies, and with the display of text and multiple windows as well as conventional television pictures. Given all this, the technological choice between 720P and 1080i is not a difficult one. The topic of subjective picture quality is complex, but the reasons ABC chose 720P HDTV may be distilled down to a simple truth: it gives the viewer better HDTV pictures." A real 720p/60 conversion will do this; that is why my curtain was lit up vs the 1080i shot.


http://www.bluesky-web.com/numbers-mean-little.htm

They have presented nothing other than lime-tinted hair they claim was deinterlaced properly, which proved my point, and presented no verifiable evidence other than that 1080i has more pixels (yeah, with Blu-ray or HD DVD 1080i), but I have shown from multiple sources that it is not true with broadcasts, that the resolution is REDUCED, and presented multiple facts that interlacing causes artifacting. They have nothing other than what they want to believe in their head.
So the 720p version beat 1080i in every manner. So, hey, I have the evidence that verifies my statements, I won this debate, over and out.

Sole.
Edited by Sole_Survivor - 8/6/12 at 3:33am
post #1418 of 1467
OK I would not argue the point over 720p Vs 1080i

But to be clear, based on the thread title, would you agree a Blu Ray disc looks better than an ABC broadcast in 720p?

post #1419 of 1467
Quote:
Originally Posted by markrubin View Post

^^^^
OK I would not argue the point over 720p Vs 1080i

Just my two cents, but this subject has been discussed numerous times on AVS and there are about as many opinions as to which is the best format as there are members. I realize I'm beating a dead horse here, but due to the greater variables of highly compressed broadcast signals, I believe that most people would be hard-pressed to see much difference in PQ between these different resolution formats.

Quote:
But to be clear, based on the thread title, would you agree a Blu Ray disc looks better than an ABC broadcast in 720p?


A Blu-ray disc, even with the same fixed resolution, will more often than not offer the best PQ due to its significant bandwidth, which is much greater than broadcast TV's. This is referred to as visual resolution, which is defined as the smallest detail that can be seen. This detail is related directly to the bandwidth of the signal, not just to the number of pixels that are displayed. The more bandwidth in the signal, the more potential for detail. Conversely, even with 1080p, the more the signal is compressed, the less detailed information will be seen. For that reason, even when I watch a 1080p movie on DTV, it still doesn't look as good as when I view it on Blu-ray.




Regards,
Ian
Edited by mailiang - 8/4/12 at 10:48am
post #1420 of 1467
Last Winter Olympics, NBC's 1080i broadcast did a very poor job with movement in the background. Speed skating comes to mind; the fast-panned backgrounds faded into a blur.

In the current NBC olympic broadcasts this seems to be less of an issue. I wonder if faster and improved processing on the production end has addressed this issue or if my observations are not accurate. I view these broadcasts on a 1080p Plasma set with a good reputation of accurate reproduction of 1080i and 720p broadcasts.
post #1421 of 1467
Quote:
Originally Posted by Feirstein View Post

Last Winter Olympics, NBC's 1080i broadcast did a very poor job with movement in the background. Speed skating comes to mind; the fast-panned backgrounds faded into a blur.
In the current NBC olympic broadcasts this seems to be less of an issue. I wonder if faster and improved processing on the production end has addressed this issue or if my observations are not accurate. I view these broadcasts on a 1080p Plasma set with a good reputation of accurate reproduction of 1080i and 720p broadcasts.

If you have a Panasonic 1080p plasma, it passed all 1080 lines of moving resolution. I like the 2010 models that did this without creating a soap opera effect like the new ones. If you keep it in the Cinema setting, and you convert the 1080i broadcast to a single painted frame of 720p @ 60 frames per second, you will never see any blur, because the TV will take the 720/60 image going into the set and will not deinterlace; the box did the dirty work, is processing the image at twice the frame rate going into the set, and will produce a full 1080p/60 motion image. Check the motion rating on TVs; Panasonic lists them, as does CNET. The scrolling info bars on mine have no smears and no jaggies - clear, fast and smooth with these settings.
Interlacing gets you every time, especially if you have an LCD monitor. Some people have no idea that LCDs fall between 300-700 lines of motion resolution on 1080p monitors. LED is better, but there are many complaints about the soap opera effect; I experienced it and it is horrible. I even returned a plasma for that. Some day people will realize all these 480 Hz sets bite the big one because they are not seeing reality. If anyone doesn't know the soap opera effect, it makes the image look like it was shot with a cell phone camera. Sounds like you're all set; most plasmas fall between 700-1080 motion. I doubt you're seeing any improvement unless you got a plasma monitor since. Plus you're not watching speed skating now.
Edited by Sole_Survivor - 8/5/12 at 8:34am
post #1422 of 1467
I well remember a conversation I had in the very early days of HD broadcasting with Bob Ross, Chief Engineer (at that time) of CBS. CBS was having issues with stereo and 5.1 broadcasts on some, but not all, HDTVs. I had mentioned to CBS engineering that my set was not receiving a proper stereo signal. Bob called me and asked if they could use my Zenith 64" RP HDTV as a guinea pig as they threw various switches in the control room to see if they could get the audio straightened out on my end. Those were the fun days of HD!

At any rate, in one conversation Bob mentioned some of the considerable research that went in to CBS's decision to go 1080i. Now of course you could call Bob's comments 'biased', but I thought they were very well reasoned. Despite the filtering, static or nearly static 1080i images provide considerably more information than 720p images. That's always been very easy for me to see when comparing various 'live' events on both good 1080i & 720p broadcasts.

Of course the caveat is how often do we see 'static or nearly static' images. Bob indicated that nearly 90-95% of the images we view on TV fall into the 'static or nearly static' category. So after doing considerable testing, they came to the conclusion that this was the better standard to go with to give the true pop of HD. It's no coincidence that networks that were more sports oriented (ESPN, ABC) went with 720p.

As to the issue of interlaced artifacts of 1080i, some people are still drawing their opinions on this from our old NTSC standard when interlaced artifacts were terrible. But it's living in the past to think that today's HD, 1080i interlaced artifacts on typical flat panel displays are anywhere near the level that they were in our old CRTs driven by our old NTSC standards.

I can see the advantages in both formats, but I honestly prefer the greater detail in 1080i. I discount the very opinionated on both sides that can only see cons and no pros in either of these formats.
post #1423 of 1467
On paper Ken...read my encyclopedia of evidence that filtering reduces the 1080i image by 60% causing the artifacts to show up in certain areas of the broadcast. What you're saying has relevance if, say, you had an HD DVD player that put out 720p or 1080i, like the Toshiba A2.
Then you could get a DVD or HD DVD and choose 1080i.
Then the image is not subject to the compression factors I presented that reduce the 1080i fields.
post #1424 of 1467
Sole, I believe the degree of filtering is quite variable, so I don't think you can quote a given % as a 'rule'. In my experience, my direct observations (to me far more reliable than a fluctuating number) on a number of HDTV displays I've owned, confirm to me with a high degree of consistency, the greater detail in 1080i broadcasts. I generally use live broadcasts as the guideline here. Even something like a golf telecast looks more detailed on wide shots on a CBS than it does on an ABC or ESPN. Since golf is a pretty static sport, it's a good one to use in my example. The same can be said for nature shows on video.

I've just seen this far too many times to discount it. Truth be told, even for sports like football or baseball (think YES Network), much of these broadcasts fits the 'static or nearly static' definition. That's a point that is missed by many IMO. This is not to say that 720p is not detailed, it certainly is, but numbers aside, my direct observations are consistent with the greater number of pixels (even with filtering).

With that said, one of the best taped pieces of footage I've seen, is an old ABC test video that used to be broadcast on their OTA HD channel when there was no telecast. They were different scenes shot in N.Y. and they were just stunning, as good as I've ever seen 'live' 720p look. Unfortunately, you just don't see that kind of quality around much these days. The sad truth is that OTA HD quality was much better years ago before multicasting began...and that applies to both 720p & 1080i.

Let's face it, this is a subjective thing that will never go away. Just wait until the 4K debates begin. As Mark mentioned above, some of the comments from guys like Joe Kane would tend to indicate that we'll see no real benefits from 4K at the beginning and a lot of material might actually look softer than it does today. Joe seems to feel that 4K should be bypassed and 8K be the focus of attention. Of course manufacturers would rather hit us up for 4K displays and once we're settled in, tell us how bad 4K really is and we need to move on to 8K. Never mind that we'll need enormous displays or absurdly close seating distances to see the difference. It never ends. smile.gif
post #1425 of 1467
That's all fine and dandy Sole, on paper that is. But I can easily tell when a program is broadcast (OTA) in 720p vs 1080i because the 1080i always looks a bit sharper and clearer to me, and that's all that matters, even with one or two sub-channels.
Edited by Otto Pylot - 8/5/12 at 9:56am
post #1426 of 1467
I have also seen some things look more detailed in both formats, depending on the broadcast, but it still doesn't take away from the artifacts and smearing, as posted by me as well as someone else here; they are not always there, and when they are, people don't know how to recognize them. 720p will always have a smoother, flicker-free, artifact-free image; even a 1080i converted to 720p will give you a smoother image by cleaning some of the garbage out of it. That is why, when it came to compression, the down-sampled 1080p looked better in 720p in every aspect over 1080i. See my big post yesterday. Remember, a lot of, if not most, boxes do not properly produce 720p, and most cable companies broadcast in 1080i - yes, they convert ESPN and other 720p channels from their native signal.
4K... I'm not even going to waste my time, if you haven't noticed, on marketing gimmicks that prey on people's ignorance.
Edited by Sole_Survivor - 8/5/12 at 11:54am
post #1427 of 1467
I guess some of us will agree to disagree with you, Sole. My box has a native pass-through, so I see 720p as 720p and 1080i as 1080i. There is absolutely no question in my mind that 1080i looks sharper on live telecasts (on average) than 720p. I also think you're overplaying 'flicker'. Again, this was far more common in our old CRTs & NTSC. It's just not a real issue with 1080i/ATSC/flat panels. Both are great formats and serve their intended purpose.
post #1428 of 1467
Not here. Are you sure your box passed the 1080i to 720p conversion test? Are you sure your service provider isn't broadcasting 720p in 1080i? Remember, you're still processing the image twice as fast at 60 fps even if it's a 30-frame broadcast, and sharpness is not the only issue; it's color rendering, artifacts, over-brightness and the texture of the image.
So there are a lot of variables for different providers, equipment and setup. I also see some extra detail in some 1080i broadcasts when I let them pass through native, but the smoothness, reduced artifacts and texture of 720p outweigh that. When the side-by-side tests were performed with the right equipment and proper conversion, 720p won in all sequences and bitrates. This is what I see: "In the presentation of uncompressed sequences, the delegates reported difficulties in seeing differences between the three formats – even at a viewing distance of 3h. But when the compressed images were shown, the viewers did notice differences in the visibility of compression artefacts. Depending on the viewing distance and scene content, the artefacts became visible to a greater or lesser extent and, with few exceptions, the following were reported: The 720p format showed better image quality than the 1080i format for all sequences and for all bitrates;"
http://tech.ebu.ch/docs/techreview/trev_308-hdtv.pdf
Edited by Sole_Survivor - 8/5/12 at 5:16pm
post #1429 of 1467
All I know is that with OTA (which is nowhere near as compressed as cable/sat for the most part) 1080i looks much better overall than 720p. Flicker, or whatever you want to call it, is non-existent as far as I'm concerned. But you're more than welcome to believe what you want. I just don't think that 6 year old data is as applicable to today's sets as it was then.
post #1430 of 1467
It's there; look at the jaggies on scroll bars. It is a fact, though, that you will receive a way better interlaced signal OTA!
Edited by Sole_Survivor - 8/5/12 at 6:54pm
post #1431 of 1467
Quote:
Originally Posted by Otto Pylot View Post

All I know is that with OTA (which is nowhere near as compressed as cable/sat for the most part) 1080i looks much better overall than 720p. Flicker, or whatever you want to call it, is non-existent as far as I'm concerned. But you're more than welcome to believe what you want. I just don't think that 6 year old data is as applicable to today's sets as it was then.

Same here. Despite all the psychotic posts in this thread, the reality is that when I tune local news broadcasts from my seven local OTA HD channels over my roof antenna on my TV's internal ATSC tuner (and on my Tivo's ATSC tuner via HDMI), the shots showing the in-studio anchor desks on the 1080i channels are crisper and more detailed and display much better facial detail than the in-studio shots on the 720p channels. The 720p channels are softer - facial detail is lacking, and text and the various anchor desks and studio furnishings are softer and a bit blurry.

I see this effect on all my TVs. 720p is clearly an inferior broadcast format compared to 1080i. I can't imagine 1080p being as bad as or worse than 720p.
post #1432 of 1467
Quote:
Originally Posted by Sole_Survivor View Post

On paper Ken...read my encyclopedia of evidence that filtering reduces the 1080i image by 60% causing the artifacts to show up in certain areas of the broadcast.
The wikipedia link said "to 70% of progressive"

Can you show a quote from an EBU article that says filtering reduces the 1080i image (or vertical detail) "by 60%"?
Quote:
causing the artifacts to show up in certain areas of the broadcast.
What type of artefacts are you saying show up because of filtering (do you have links saying that those artefacts are caused by filtering)? Surely, in general, with filtering you'd be lowering the resolution, reducing the appearance of artefacts (eg. interline twitter, aliasing or noise) as well as (possibly) making it easier to compress (so there could be fewer compression artefacts).
Quote:
differences in the visibility of compression artefacts
Wouldn't there be even fewer compression artefacts if they reduced the resolution to even lower than 720p? Wouldn't (with compressed video), at a given bitrate, the lowest resolution format be more likely to have the fewest compression artefacts (all other things being equal)?

Also, did the EBU tests have any "film-look" test sequences ie. 25Hz temporal resolution (if so, which test sequences?), and if not, were the tests relevant for broadcasts of most (eg. 24 fps) feature films or similar style TV programmes (except for any higher rate content in the latter)?
Edited by Joe Bloggs - 8/5/12 at 9:25pm
post #1433 of 1467
Look, they're still at it
."In the presentation of uncompressed sequences, the delegates reported difficulties in seeing differences between the three formats – even at a viewing distance of 3h. But when the compressed images were shown, the viewers did notice differences in the visibility of compression artefacts. Depending on the viewing distance and scene content, the artefacts became visible to a greater or lesser extent and, with few exceptions, the following were reported:The 720p format showed better image quality than the 1080i format for all sequences and for all bitrates;"
http://tech.ebu.ch/docs/techreview/trev_308-hdtv.pdf

"As expressed earlier on, the differences between the two halves of an interlaced image lead to interlaced artifacts. To reduce the visibility of these artifacts, filtering is applied to the vertical resolution of an interlaced signal. This reduces the real image vertical resolution to some 60% of the number of scan lines supported by the 1080i interlaced format. Furthermore, 1080i material is limited to around 1440 pixels horizontally to reduce transmission bandwidth requirements; this reduces the overall effective resolution of the 1080i format to around 0.93 million pixels.

It is thus clear that the actual difference in effective resolution between 720p and 1080i is almost negligible. And this apart from the fact that a 720p display is capable of a better flicker-free picture when dealing with fast action TV content.

"Yet there is another issue against interlaced video, that of digital compression of images. Digital image compression is more efficient with progressive video at the source than interlaced video. High definition digital TV broadcast uses the same 6MHz maximum allocated broadcast bandwidth in the US as with standard definition analog TV. This means it is necessary to apply compression to make high definition images fit into the space allocated for a broadcasting TV channel.". http://www.practical-home-theater-guide.com/1080p-hdtv.html

”720P, when compared with 1080i, provides better dynamic resolution, better motion rendition, the absence of interlace artifacts, and the absence of compression artifacts. It makes brighter pictures with a higher contrast ratio than 1080i. It is well matched to the resolution capability of consumer displays. It is a forward-looking technological choice that is compatible with computers, with advanced display technologies, and with the display of text and multiple windows as well as conventional television pictures. Given all this, the technological choice between 720P and 1080i is not a difficult one. The topic of subjective picture quality is complex, but the reasons ABC chose 720P HDTV may be distilled down to a simple truth: it gives the viewer better HDTV pictures.".


http://www.bluesky-web.com/numbers-mean-little.htm


Because one article didn't mention it, they throw the other one out and turn to an online, edit-it-yourself Wikipedia article... Hey, I went back in there and added stuff to it too.

Now here we go with lawyer tactics, spins and cross-exams, changing conditions and scenarios, looking for loopholes. What they will do to defend that signal is beyond anything I have ever witnessed. See, I'm creating a blog on this just for it to be a textbook example of how far people will go to defy the facts presented.

Did that article say 60% ...no... It said roughly HALF.... Make sure you spin that too

"Once interlacing is applied to an image format, vertical-temporal information is lost that can never be recreated"
Cont..."Although different interlacing techniques are possible, roughly half the vertical-temporal information compared to 1080p/50 is removed."

"the 1920 x 1080i format and the impact of interlace which results in a gradual reduction of vertical resolution with movement, caused by subdividing a single frame into two fields (interlaced). Fig. A2 (right) shows the 1080i/25 format with a Kell factor of Kv = 0.7 and in addition the interlace factor I = 0.7 caused by incomplete cancellation of the fields (interline twitter). Both factors further reduce the available vertical resolution of the format."

"Figure A3
1080i format compared and interlaced factor
to the 720p format with Kell"


"Kell and interlaced factor both “reduce” the available resolution while the interlaced factor reduces the vertical resolution"

Summary of results, conclusions and future work
720p showed better image quality than the 1080i for all sequences and for all bitrates;


http://tech.ebu.ch/docs/techreview/trev_308-hdtv.pdf

"720P delivers more real useful resolution, or sharpness, than 1080 interlaced source through a 1080i delivery channel. It’s counterintuitive but that is the reality"
http://www.philiphodgetts.com/wp-content/uploads/2011/03/What-is-HD.pdf

Roughly half would be an approx. 60%.

"The demonstration suggests that a progressive format for emission provides the best image quality / bitrate compromise with MPEG-4 AVC compression. EBU Members have already been advised in EBU Recommendation R-112 that the 720p/50 emission format is currently the best option. The demonstration has underlined this statement."

Hey, I saw a special on the 49ers on ESPN and they didn't mention Montana won 4 Super Bowls. Can you show me where in that show they say that? That means he may not have won 4, right? Just because some articles don't elaborate on every fact does not mean they are in conflict with the ones that do, unless you use an edit-it-yourself website. Did you know there's a group of people that will defend their belief that the Earth is flat? You people are just like them; the same way I won't waste my time with them, I'm seeing the same thing here. You guys lost this debate. Stop the spins... This is my cue, never thought it would get this bad.

Hey, learn some of their tactics they're still at it, join them. You cannot win this.

You'll relate to the "fighting the evidence section"
http://www.alaska.net/~clund/e_djublonskopf/Flatearthsociety.htm

Enjoy
Edited by Sole_Survivor - 8/6/12 at 4:18am
post #1434 of 1467
Quote:
Originally Posted by Sole_Survivor View Post

Look, they're still at it
."In the presentation of uncompressed sequences, the delegates reported difficulties in seeing differences between the three formats – even at a viewing distance of 3h. But when the compressed images were shown, the viewers did notice differences in the visibility of compression artefacts. Depending on the viewing distance and scene content, the artefacts became visible to a greater or lesser extent and, with few exceptions, the following were reported:The 720p format showed better image quality than the 1080i format for all sequences and for all bitrates;"
http://tech.ebu.ch/docs/techreview/trev_308-hdtv.pdf
This study compares a native 1080p50 source, 1080i25 and 720p50. Of course 1080i is going to lose if the comparison is done at half the framerate. If they were going to be comparing a 50fps source, they should have used 1920x540p50 for the 1080i signal, not 1080p25.

This test is not at all relevant to real-world broadcasts, and certainly not applicable to your examples of 24/30fps native content broadcast at 1080i, downconverted to 720p by your TiVo.

You should also note that there is no mention of colour differences between signal formats, nor are any shown in their photographs - because there are no differences between the three signal formats in that regard. The colour shifts you're seeing are a problem in your setup, not anything to do with 1080i as a signal format.

How you think making a comparison between your TiVo outputting the native 1080i signal, and downconverting to 720p has any relevance to 1080i and 720p as a signal format is beyond me.

If 1080i degrades the image as much as you seem to claim, then it has already been degraded by the time it gets to your TiVo. What you do by setting your TiVo to output 720p is further degrading that image by throwing away half the resolution, because 1080i60 deinterlaces to 1080p24/30 with the film-type content in your examples.
Quote:
Originally Posted by Sole_Survivor View Post

"As expressed earlier on, the differences between the two halves of an interlaced image lead to interlaced artifacts. To reduce the visibility of these artifacts, filtering is applied to the vertical resolution of an interlaced signal. This reduces the real image vertical resolution to some 60% of the number of scan lines supported by the 1080i interlaced format. Furthermore, 1080i material is limited to around 1440 pixels horizontally to reduce transmission bandwidth requirements; this reduces the overall effective resolution of the 1080i format to around 0.93 million pixels.
It is thus clear that the actual difference in effective resolution between 720p and 1080i is almost negligible. And this apart from the fact that a 720p display is capable of a better flicker-free picture when dealing with fast action TV content.
"Yet there is another issue against interlaced video, that of digital compression of images. Digital image compression is more efficient with progressive video at the source than interlaced video. High definition digital TV broadcast uses the same 6MHz maximum allocated broadcast bandwidth in the US as with standard definition analog TV. This means it is necessary to apply compression to make high definition images fit into the space allocated for a broadcasting TV channel.". http://www.practical-home-theater-guide.com/1080p-hdtv.html
Yet another source that doesn't seem to understand the fundamentals of deinterlacing. If your article discusses flickering or how 720p produces a "flicker-free picture" chances are the information is wrong.

Deinterlacing a 1080i60 signal to 1080p24 or 1080p30 does not have any flickering, combing or other "interlaced artefacts". (whatever that is supposed to be) It is visually identical to a native 1080p24 or 1080p30 source, it's just a different transmission format.

Deinterlacing a 1080i60 signal to 1920x540p60 will also not have any flickering, combing or other "interlaced artefacts" and is still higher resolution than 720p60. If you are talking about broadcast 1080i which is often (but not always) 1440x1080 rather than 1920x1080, then you do have an image that is roughly 20% lower resolution than 720p. However, I don't know of any broadcasts that use the 1080i format when showing 50/60fps content.

The only time there is a possibility of flickering, combing or other "interlaced artefacts" is when either your display/source device is incapable of properly deinterlacing 1080i, or when you are trying to deinterlace a high framerate 1920x540p50/60 source to 1920x1080p50/60. This is not possible without artefacts, because you are interpolating data to do this.
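For reference, a quick pixel-count check of the cases above in Python (my own sketch; the 1440-wide figure is an assumption about some broadcasts, not a rule):
Code:
# Quick pixel-count check of the cases discussed above. The 1440-wide
# figure is an assumption about some broadcasts, not a rule.
cases = {
    "720p60 (1280x720)":                         1280 * 720,   #   921,600
    "1080i60 shown as 540p60 (1920x540)":        1920 * 540,   # 1,036,800 - more than 720p
    "1440x1080i shown as 540p60 (1440x540)":     1440 * 540,   #   777,600 - about 16% less than 720p
    "1080i60 deinterlaced to 1080p (1920x1080)": 1920 * 1080,  # 2,073,600
}
for name, px in cases.items():
    print(f"{name}: {px:,} pixels")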
Quote:
Originally Posted by Sole_Survivor View Post

”720P, when compared with 1080i, provides better dynamic resolution, better motion rendition, the absence of interlace artifacts, and the absence of compression artifacts. It makes brighter pictures with a higher contrast ratio than 1080i. It is well matched to the resolution capability of consumer displays. It is a forward-looking technological choice that is compatible with computers, with advanced display technologies, and with the display of text and multiple windows as well as conventional television pictures. Given all this, the technological choice between 720P and 1080i is not a difficult one. The topic of subjective picture quality is complex, but the reasons ABC chose 720P HDTV may be distilled down to a simple truth: it gives the viewer better HDTV pictures.".
http://www.bluesky-web.com/numbers-mean-little.htm
"Color reproduction is identical in all HDTV scanning formats, and may thus be disregarded as a factor."

This article, funnily enough, is also very old because it talks about CRTs and how there is a brightness difference between progressive and interlaced. On a flat panel display, which cannot show a natively interlaced image, there is no difference in brightness between image formats because they have been deinterlaced.
post #1435 of 1467
Another spin... The original source was 1080p and got converted to 1080i and 720p, and then they displayed all of them... All they can come back with is "they should have..." That's not true... not for all broadcasts... Yeah, they're all lies... look, this from the guy who posted no verifiable evidence as I did, just his photo of girls with lime hair trying to prove something that actually proved everything I said. No evidence other than your twisted hail marys trying to overturn verifiable evidence presented from multiple sources. Your wishful twists hold no water. You got no evidence, you got your asses kicked, plain and simple.
Go join the flat earthers; even they had more of a case defending that the earth is flat than you did defending 1080i. Maybe you'll learn some of their tactics. Never in my life did I ever see a group of people that want to psychologically believe their set is converting a 1080i x 1920 signal to full 1080p. Live in fantasy land; that may be the only place you can go to get something to back what you want to believe.
Edited by Sole_Survivor - 8/6/12 at 5:57am
post #1436 of 1467
Sole, I was pointing out, as well as the EBU not saying it was reduced by 60%, that there is a big difference between "reduced by" (which you had said) and "reduced to"
eg.
1080 reduced to 70% would be 756
1080 reduced to 60% would be 648

1080 reduced by 60% (which the EBU haven't said) would be 1080-(60% of 1080)=432

The wikipedia article says reduced to (not by) 70%, and the EBU article in your link says to 0.7 (which is the same thing).
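To make the arithmetic explicit (a trivial Python check, nothing more):
Code:
lines = 1080
print(0.70 * lines)          # "reduced to 70%": 756.0 lines left
print(0.60 * lines)          # "reduced to 60%": 648.0 lines left
print(lines - 0.60 * lines)  # "reduced BY 60%": only 432.0 lines left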
---
Also, you still haven't said which EBU test sequences were from film-style sources (eg. shot at about 24p). As much as I want high motion films/content (eg. The Hobbit films/Avatar sequels, live TV), since 99.99% of films are shot at about 24 fps, shouldn't they have tested some content at about that rate with the different formats? Surely that would have been relevant to movie channels wanting to decide on 720p vs 1080i back when they wanted to start broadcasting in one of those formats.

If the EBU didn't test any surely the tests aren't really relevant to broadcasts of the vast majority of current films (which are nearly all about 24 fps ie. about 24p).
Edited by Joe Bloggs - 8/6/12 at 6:37am
post #1437 of 1467
Quote:
Originally Posted by Sole_Survivor View Post

Another spin... The original source was 1080p and got converted to 1080i and 720p, and then they displayed all of them... All they can come back with is "they should have..." That's not true... not for all broadcasts... Yeah, they're all lies... look, this from the guy who posted no verifiable evidence as I did, just his photo of girls with lime hair trying to prove something that actually proved everything I said. No evidence other than your twisted hail marys trying to overturn verifiable evidence presented from multiple sources. Your wishful twists hold no water. You got no evidence, you got your asses kicked, plain and simple.
Go join the flat earthers; even they had more of a case defending that the earth is flat than you did defending 1080i. Maybe you'll learn some of their tactics. Never in my life did I ever see a group of people that want to psychologically believe their set is converting a 1080i x 1920 signal to full 1080p. Live in fantasy land; that may be the only place you can go to get something to back what you want to believe.
OK, I give up, you "win". You clearly have no interest in actually learning anything or fixing your problem.

You have a foregone conclusion and are trying to find any "evidence" that fits, which results in your dredging up outdated information (referencing CRT-specific artefacts) and unknown sources (usually wrong) discarding anything else that does not agree with what you believe to be true, including direct captures showing exactly what happens when you downscale a 1080i image to 720p, how it looks when incorrectly deinterlaced as 540p, and how it should look when correctly deinterlaced to 1080p.

I have explained why the EBU tests that you have linked to have no relevance to actual broadcasts (no-one is broadcasting 1080i for 50/60fps content) and how they should have been comparing a 1920x540p50 signal to 720p50 rather than 1080p25 (or "1080i25" as they put it) yet that somehow means I'm making excuses. What the EBU test primarily shows is that 50/60fps native content looks better when shown at 50/60fps and not 25/30fps.

In the real-world, most, if not all 1080i broadcasts are of 24/25/30fps native content, not 50/60fps. This means that a 1080i50/60 broadcast should correctly deinterlace to 1080p24/25/30.

They did not even disable overscan with the 1080i/p sources which robs them of a lot of the sharpness advantage they have over 720p. (they still have more resolution, but the image is not as sharp) Furthermore, that display does not correctly deinterlace film-type content by default, you need to enable PureCinema mode. If they did not disable overscan, who knows whether they enabled PureCinema or not. And from owning several generations of Pioneer Plasma, I have first-hand experience with the fact that they do not do a very good job of deinterlacing, and particularly struggle with cadence detection with 50Hz sources. (up to and including the KRP-500M)

In fact, if you look over at the European "AV Forums" site, you will be able to find many posts over there telling people to disable PureCinema because it does such a poor job of cadence detection, introducing artefacts into the image that should not be there.


I have also explained several times why your 1080i/720p "comparison" is completely invalid, and I don't think you have even acknowledged those posts, either to admit that you are wrong, or try and argue the point. Instead you just try to discredit me.


And you even ignore information that contradicts your position from sources that you have posted here, such as this:
Quote:
Originally Posted by http://www.bluesky-web.com/numbers-mean-little.htm 
Color reproduction is identical in all HDTV scanning formats, and may thus be disregarded as a factor.
post #1438 of 1467
Quote:
In the real-world, most, if not all 1080i broadcasts are of 24/25/30fps native content, not 50/60fps. This means that a 1080i50/60 broadcast should correctly deinterlace to 1080p24/25/30.
Wouldn't it depend on type of content?

eg. if BBC1 HD is showing the Olympics (1080i) or most other live events (25 frames per sec, 50 fields per sec), it should get de-interlaced to 1080p50. (or by 50 "fps" are you including most live TV, credits, various TV soaps, etc)?

Also is there a simple way of finding out what the percentages are of the different types of content? eg. by channel.
ie. in theory couldn't there be an automated way of finding out (by analysing a video stream - has anyone done that and provided that info for particular channels?) - or is the only current way to do a mostly manual check (eg. by looking at some TV guide data)?
Edited by Joe Bloggs - 8/6/12 at 8:37am
post #1439 of 1467
Quote:
Originally Posted by Joe Bloggs View Post

Wouldn't it depend on type of content?
eg. if BBC1 HD is showing the Olympics (1080i) or most other live events (25 frames per sec, 50 fields per sec), it should get de-interlaced to 1080p50. (or by 50 "fps" are you including most live TV, credits, various TV soaps, etc)?
Most television shows now are shot for 24 or 30fps. (converted to 25fps in "PAL" regions, or potentially 25p native if shot there) And films are obviously 24fps native.
Sports are really the only possible exception, and in most cases they will be broadcast in 720p if they are shot at 50/60fps. I haven't been watching the Olympics, so I don't know what they're broadcasting.


If you have a 1080p24 source (technically 24/1.001, or 23.976...) you split each frame's information over two fields, and you now have the film as 1080i48 (or 1080/24PsF) without any quality loss.
1080i48 is not generally used, so a 3:2 cadence is applied to turn that into a 1080i60 signal (technically 60/1.001, or 59.94...) by repeating every fifth field.

When deinterlaced, what should happen, is the 3:2 cadence is detected, the duplicated fields are discarded, and the remaining fields are recombined to end up with the original 1080p24 source.

If you have a 1080p30 source, each frame is split into two fields and broadcast at 1080i60. These fields can be trivially re-combined to recreate the original 1080p30 source, using 2:2 cadence detection.
1080p25 is the same, only it's broadcast at 1080i50, again using a 2:2 cadence.
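If it helps, here's a rough Python sketch of that 3:2 cadence and its inverse (labels stand in for actual field data; an illustration of the cadence, not a real deinterlacer):
Code:
# Illustration of the cadences described above. Frames and fields are
# just labels here - real inverse telecine re-weaves actual field data.

def telecine_32(frames_24p):
    """3:2 pulldown: 24 frames/s become 60 fields/s (3 fields, then 2)."""
    fields = []
    for i, frame in enumerate(frames_24p):
        top, bottom = (frame, "top"), (frame, "bottom")
        fields += [top, bottom, top] if i % 2 == 0 else [bottom, top]
    return fields

def inverse_telecine(fields):
    """Recover the original frame order by dropping repeated fields.
    (A real deinterlacer would also weave each top/bottom pair back
    into one progressive frame.)"""
    frames, seen = [], set()
    for frame_id, _parity in fields:
        if frame_id not in seen:
            seen.add(frame_id)
            frames.append(frame_id)
    return frames

src = [f"frame{i}" for i in range(4)]   # four frames of "24p" film
i60 = telecine_32(src)                  # ten fields - the "1080i60" cadence
assert inverse_telecine(i60) == src     # original frames recovered intact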


If you have a 1080p60 source (or 1080p50) you have a problem though. You can do one of two things:
  1. You can halve the framerate to 30fps (or 25fps) and retain the original 1920x1080 resolution.
  2. You can halve the resolution, keeping the 50/60fps framerate and having a 1920x540 resolution image.

In the EBU test, they chose to go with option 1, which I feel is a far worse compromise than option 2.

There is no way to convert a 1080p60 image as 1080i without doing one of these two things. You have to compromise on either resolution or framerate. In my opinion, the best compromise would be resolution, because halving the framerate makes a massive difference to the smoothness of motion.
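A minimal sketch of those two compromises, with dummy row lists in place of real video (hypothetical helper names, just to show the shape of the trade-off):
Code:
# The two compromises described above, with dummy row lists standing in
# for real 1920x1080 video frames.

def option1_halve_framerate(frames_p60):
    """Keep full 1080-row frames, but only every other one (30/s)."""
    return frames_p60[::2]

def option2_halve_resolution(frames_p60):
    """Keep all 60 images/s, but send alternating 540-row fields."""
    return [frame[i % 2::2] for i, frame in enumerate(frames_p60)]

frames = [[f"f{i}_row{r}" for r in range(1080)] for i in range(4)]
assert len(option1_halve_framerate(frames)) == 2                      # half the frames
assert all(len(f) == 540 for f in option2_halve_resolution(frames))   # half the rows each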


Now the problem is, how does the display know if a 1080i60 signal originates from a 1080p24, 1080p30, or 1080p60 source?
1080p24 should be relatively easy to detect, as you have a 3:2 cadence with repeated fields, though there are still many sources that will not actually switch to 24Hz, and so you have 3:2 judder.

2:2 detection is quite a bit more complicated though, and there are still many displays that don't handle this correctly today.

If the display gets it wrong, you are either going to have combing artefacts/jitter with motion (trying to deinterlace a 60fps source as 1080p) or lose half the resolution. (deinterlacing a 30fps source as 540p) Many displays choose to simply discard half the resolution with an interlaced signal at all times (540p) because it's less obvious than using the wrong type of deinterlacing.


If your display, or whatever box you're using to deinterlace the image does things correctly, 1080i should be a lot better than 720p in most cases. The only exception would be a 50/60fps source, with a 1440x1080i broadcast, rather than a 1920x1080i broadcast.
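And a crude sketch of the kind of repeated-field check a cadence detector relies on (toy labelled fields; real hardware compares actual pixel data with noise thresholds and motion detection):
Code:
# Crude cadence guesser: with a 3:2 cadence every fifth field repeats,
# so a field tends to match the one two positions before it. Real
# deinterlacers compare actual field pixels, with noise thresholds and
# motion detection; here fields are just (frame_id, parity) labels.

def detect_cadence(fields):
    repeats = sum(1 for a, b in zip(fields, fields[2:]) if a == b)
    ratio = repeats / max(len(fields) - 2, 1)
    return "3:2 (film, ~24p original)" if ratio > 0.15 else "2:2 / video"

film_32  = [(0, "t"), (0, "b"), (0, "t"), (1, "b"), (1, "t"),
            (2, "t"), (2, "b"), (2, "t"), (3, "b"), (3, "t")]
video_22 = [(i // 2, "tb"[i % 2]) for i in range(10)]

print(detect_cadence(film_32))    # -> 3:2 (film, ~24p original)
print(detect_cadence(video_22))   # -> 2:2 / video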
post #1440 of 1467
Quote:
Sports are really the only possible exception, and in most cases they will be broadcast in 720p if they are shot at 50/60fps
Aren't most live TV and soaps exceptions too?

Also, the UK broadcasters use 1080i instead of 720p, including for sports.
Quote:
In the EBU test, they chose to go with option 1, which I feel is a far worse compromise than option 2.
They say "The 1080i/25 content was generated via box-filtering (line/pixel averaging) over two consecutive frames". If that's option 1, then I agree, it's not a very good method, and they should have used a better method.
In theory, I think they should do something like option 2 for the parts of the frame in motion, and something like option 1 for the static parts of the picture - or use another, better de-interlacing method than the one they used.
Edited by Joe Bloggs - 8/6/12 at 9:08am