Originally Posted by Ken Ross
Seriously though, a good HDTV with a good deinterlacer will make 1080i look very close to 1080p.
We've been through this many times. Yes, a decent TV can make crap look good, but why should a TV need a deinterlacer at all? "Some companies" (and I believe this was Sony) pushed 1080i into the HDTV spec because they already had 1080i equipment. That was a bad day for the whole HDTV business.*
Shoot and broadcast progressive, and any dumb TV panel will show it at the same quality; you don't need to care whether a particular TV set supports video-mode deinterlacing or film-mode pulldown removal. Also, there are other devices and apps besides a TV: hardware and software media players, editors, YouTube, etc. They are all much better off with progressive video.
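
To make the "why should a TV need a deinterlacer" point concrete, here is a minimal sketch of the crudest deinterlacing method ("bob") in Python with NumPy. The function name and frame layout are my own assumptions for illustration, not any particular TV's implementation. Each interlaced frame carries two half-height fields that have to be split apart and stretched back to full height, which is exactly the vertical-resolution loss the quote below complains about; a progressive frame goes to the panel untouched.

```python
# Minimal sketch of "bob" deinterlacing, assuming 8-bit grayscale frames
# stored as NumPy arrays of shape (height, width). Names are hypothetical.
import numpy as np

def bob_deinterlace(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split an interlaced frame into its two fields and line-double each
    field back to full height, yielding two output frames per input frame."""
    top_field = frame[0::2]      # even lines (first field)
    bottom_field = frame[1::2]   # odd lines (second field)
    # np.repeat duplicates every field line to restore full height; real
    # deinterlacers interpolate or use motion compensation instead, which
    # is precisely the extra hardware a progressive signal makes unnecessary.
    return (np.repeat(top_field, 2, axis=0),
            np.repeat(bottom_field, 2, axis=0))

interlaced = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
first, second = bob_deinterlace(interlaced)
assert first.shape == interlaced.shape  # full height, half the true vertical detail
```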
*The History and Politics of DTV
-- William F. Schreiber: "Let's be perfectly clear about this. Interlaced scanning is a really dumb thing to consider in any new video signal format. Interlaced scanning was not even a good idea when the US B&W standard was defined in 1941! As your readers are probably aware, interlaced scanning can generate very bad artifacts in a video image, things like 30 hertz interline flicker, motion errors, and reduced vertical resolution. This is why computer monitors are not interlaced -- the interline flicker would be unbearable. The reason why there are still proponents of interlacing escapes me, especially when considered in the context of digital broadcasting. Progressive scanning is simply more "digital friendly" than interlaced. When digitally coded, a progressive signal requires no higher data rate than an interlaced signal that has half the bandwidth. I have a hunch that the continued advocacy of interlaced equipment originates from foreign-owned consumer electronics companies that are trying to get back the substantial investments they foolishly made in this obsolete technology!"
The above was said in 1997.
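
To put rough numbers on Schreiber's bandwidth remark, here is a back-of-the-envelope calculation (my own arithmetic, not from the interview). Interlacing does halve the raw sample rate, which is the whole argument for it; his point is that a codec handles progressive material efficiently enough that the coded data rate ends up no higher anyway.

```python
# Raw (uncompressed) sample rates for 1080-line video, my own arithmetic.
width, height = 1920, 1080

rate_1080p60 = width * height * 60          # progressive: 60 full frames/s
rate_1080i60 = width * (height // 2) * 60   # interlaced: 60 half-height fields/s

print(f"1080p60 raw: {rate_1080p60 / 1e6:.1f} Mpixels/s")  # ~124.4
print(f"1080i60 raw: {rate_1080i60 / 1e6:.1f} Mpixels/s")  # ~62.2, half the bandwidth
```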