Direct captures make my point even more. His source was 1080i, he claimed it was properly deinterlaced, and their hair had more lime in it than the aliens in Lost In Space. The interlaced signal is still reduced by 60% due to filtering/compression, still has all the same artifact problems it always had, and those horrible direct captures, as you call them, prove my point along with all the evidence I provided. Your side presented no evidence, other than a claim that yes, on paper, 1080i has two virtual fields that make up approximately 2 million pixels. But link after link, I provided evidence showing that it is reduced by 60% due to filtering, compression, Kell factor, etc. You people are coming from the point of Blu-ray, maybe Blu-ray 1080i, but not broadcast. My evidence, as well as the artifacts in those girls' hair, proved my point, and I won this debate clearly, a slam dunk.

Now, take that horrible broadcast signal. Since it's reduced, if your box takes it out of 1080i and converts it to 480i, yes, we know the TV still scales it to 1080p, but you sent fewer real pixels into the set, because when your box, if properly working, took the signal out of 1080i, it is no longer 1080i, it is now 480i. Click the info button on your TV remote; it will say 480i. OK, follow me now. When you convert it to 720p, it is no longer 1080i, it is now 720p/60. And yes, spare me, I do understand that the source material is not real 60 fps, but the box is still processing the image at twice the speed in a single painted frame when PROPERLY converted to 720p/60. Now you're feeding your TV a signal that cleaned up the artifacts BEFORE THEY HIT YOUR SET, in a single progressive frame, rather than two F'd-up broken fields reduced by 60% at half the frame rate. I know exactly what is in your head: the TV is going to deinterlace it anyway. No. Does it deinterlace 480i to 1080p quality? No!
Because you're not feeding it that signal. It tries to scale it to 1080p, and it does, because it's a fixed-pixel-resolution display, but it is not 1080p quality. When you drink dirty water, your immune system can fight off the infection, but notice that while all your cells are fighting it, your body feels like crap? Why? Because it's working harder. You're doing the same thing to your TV by feeding a PROGRESSIVE monitor two screwed-up fields reduced by 60% and full of artifacts, making the TV work harder. The same way you filter contaminants out of water before it hits your body so your body works less hard, filter the garbage out of that broken-up, artifacted signal into a single frame at twice the frame rate before it hits your set. I'm really talking to the readers here, because you people will defy all logic, tests, results, and evidence presented. I was right since day one, and test after test I was able to verify it. Like the video stated, interlaced signals are very difficult for progressive monitors: STAY WITH 720p AND 1080p IF AT ALL POSSIBLE, "Tech Ev." I know, yeah, they're wrong too. BTW, read the article in my profile link: the side-by-side test used monitors that deinterlaced 1080i to 1080p, Pioneer EX5000 plasmas, and the source material was a 1080p signal that was converted to 720p and 1080i. Here is a sample from 2006 of TVs that either passed or failed the 1080i-to-1080p deinterlace test.
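To make the "single painted frame at twice the rate" idea concrete, here is a minimal sketch of bob deinterlacing in Python/NumPy. This is an illustration of the simplest possible method, not what any particular box actually runs; real set-top boxes and TVs use more sophisticated motion-adaptive deinterlacers. It shows how each field becomes its own full-height progressive frame, so the frame rate doubles instead of two broken fields hitting the display.

```python
import numpy as np

def bob_deinterlace(interlaced):
    """Split one interlaced frame into two progressive frames.

    interlaced: 2-D array of shape (lines, width) holding both fields
    woven together (even rows = field 1, odd rows = field 2).
    Returns two line-doubled, full-height progressive frames.
    """
    top = interlaced[0::2]      # field 1: the even scan lines
    bottom = interlaced[1::2]   # field 2: the odd scan lines
    # Line-double each field back to full height. One output frame per
    # field means an interlaced stream at N frames/s becomes a
    # progressive stream at 2N frames/s.
    frame1 = np.repeat(top, 2, axis=0)
    frame2 = np.repeat(bottom, 2, axis=0)
    return frame1, frame2

# Toy 8-line "interlaced" frame standing in for a 1080-line one.
frame = np.arange(8 * 4).reshape(8, 4)
f1, f2 = bob_deinterlace(frame)
print(f1.shape, f2.shape)  # both frames are full height: (8, 4) (8, 4)
```

Two progressive frames come out for every interlaced frame that goes in, which is exactly the 720p/60 behavior described above: the conversion happens once, in the box, before the signal ever reaches the panel.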
Make/Model Technology Pass/Fail
Epson LS65HD1 3LCD RP Pass
Mitsubishi WD-62627 DLP RP Fail
Mitsubishi WD-73927 DLP RP Fail
Mitsubishi WD-52627 DLP RP Fail
Mitsubishi LC-3780 LCD FP Pass
Samsung HL-R5668 DLP RP Pass
Sharp LC-45GD5U LCD FP Fail
Sony KDSR60XBR1 LCOS RP Pass
Sony KDSR50XBR1 LCOS RP Pass
Westinghouse LVM-37W1 LCD FP Pass
So even these cheap monitors passed, converting 1080i to full 1080p rather than using one field at a time, meaning a full 1080i signal, not one reduced by 60%. And BTW, Pioneer plasma is #1.
All presented on the same 1080p monitors. Why? Again, another link verifying the same links I quoted earlier about BROADCAST COMPRESSION: "In the presentation of uncompressed sequences, the delegates reported difficulties in seeing differences between the three formats – even at a viewing distance of 3h. But when the compressed images were shown, the viewers did notice differences in the visibility of compression artefacts. Depending on the viewing distance and scene content, the artefacts became visible to a greater or lesser extent and, with few exceptions, the following were reported: The 720p format showed better image quality than the 1080i format for all sequences and for all bitrates;"
Notice it said in certain scenes. Just what I said: they are not always there. I can show you dozens of 1080i shots with no artifacts, but I know where they show up.
Another link saying the same: "720p vs. 1080i
TV stations would normally broadcast in either 720p or 1080i but not both; the predominant format is 1080i. This in itself is not an issue; all present HDTV sets can display pictures in any HDTV format by up-converting or down-converting to the set native resolution, i.e. the one in which the set is designed to display the image.
From a pixel-count perspective, 1080i supports better spatial resolution than a 720p HDTV. In fact, 1080i supports a pixel count of over 2 million pixels as against the 0.92 million pixels supported by 720p HDTV. But in reality, the situation is somewhat different when it comes to an interlaced format.
As expressed earlier on, the differences between the two halves of an interlaced image lead to interlaced artifacts. To reduce the visibility of these artifacts, filtering is applied to the vertical resolution of an interlaced signal. This reduces the real image vertical resolution to some 60% of the number of scan lines supported by the 1080i interlaced format. Furthermore, 1080i material is limited to around 1440 pixels horizontally to reduce transmission bandwidth requirements; this reduces the overall effective resolution of the 1080i format to around 0.93 million pixels.
It is thus clear that the actual difference in effective resolution between 720p and 1080i is almost negligible. And this apart from the fact that a 720p display is capable of a better flicker-free picture when dealing with fast action TV content.
"Yet there is another issue against interlaced video, that of digital compression of images. Digital image compression is more efficient with progressive video at the source than interlaced video. High definition digital TV broadcast uses the same 6MHz maximum allocated broadcast bandwidth in the US as with standard definition analog TV. This means it is necessary to apply compression to make high definition images fit into the space allocated for a broadcasting TV channel. ". http://www.practical-home-theater-guide.com/1080p-hdtv.html
"720P, when compared with 1080i, provides better dynamic resolution, better motion rendition, the absence of interlace artifacts, and the absence of compression artifacts. It makes brighter pictures with a higher contrast ratio than 1080i. It is well matched to the resolution capability of consumer displays. It is a forward-looking technological choice that is compatible with computers, with advanced display technologies, and with the display of text and multiple windows as well as conventional television pictures. Given all this, the technological choice between 720P and 1080i is not a difficult one. The topic of subjective picture quality is complex, but the reasons ABC chose 720P HDTV may be distilled down to a simple truth: it gives the viewer better HDTV pictures." A real 720p/60 conversion will do this; that is why my curtain was lit up vs. the 1080i shot.
They have presented nothing other than lime-tinted hair they claim was deinterlaced properly, which proved my point, and presented no verifiable evidence other than that 1080i has more pixels (yes, with Blu-ray or HD DVD 1080i). But I have shown from multiple sources that it is not true with broadcasts, that the resolution is REDUCED, and presented multiple facts showing that interlaced broadcasts cause artifacting. They have nothing other than what they want to believe in their heads.
So the 720p version beat 1080i in every manner. So, hey, I have the evidence that verifies my statements. I won this debate, over and out.
Sole. Edited by Sole_Survivor - 8/6/12 at 3:33am