Originally Posted by HDMe2
I'm probably oversimplifying, but...
Essentially there were 2 camps when the HD broadcast standard was being discussed.
One group wanted the highest resolution possible in the allotted 6MHz-per-channel OTA bandwidth... while the other group was set on progressive (instead of interlaced) even at lower resolution.
So... after fighting for a while, they eventually compromised to just accept both 720p and 1080i in the broadcast spec and let stations choose.
IF they could have waited another couple of years... they might have been able to settle on MPEG4 instead of MPEG2, and then we might have had 1080p only and been done with it... but alas, that was not possible at the time all these things were being discussed.
They played with "standards" for over 30 years (that's three decades -- yikes!). And what we ended up with is 8VSB modulation -- aka multipath hell. Coulda had COFDM like everyone else, but no, we had to be different. Granted, there is a non-negligible power efficiency to 8VSB (IIRC it's something like 25%, as in you can cover the same area for 25% less power), but the cost is steep -- multipath problems effectively impede OTA reception (especially close in to the transmitter, the opposite of the analog days), and reception would be practically impossible on a moving platform such as a high-speed train. Oh, wait, we don't have any of those. Yet. So why learn from the Europeans or the Japanese and plan for it?
MPEG4 could have been implemented, if I remember the timetable correctly. Coulda and shoulda. MPEG4 showed up in the late 1990s IIRC, so it was available for a good nine or ten years before the great analog-to-digital switchover. I have no idea why this wasn't done.
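For a rough sense of what was left on the table, here's a back-of-the-envelope sketch. The 19.39 Mbps payload figure for a 6 MHz 8VSB channel is real; the MPEG2 bitrate and the "H.264 needs about half the bits" rule of thumb are my assumptions, not anything out of a spec:

```python
# Back-of-the-envelope bitrate arithmetic. ATSC_PAYLOAD_MBPS is the real
# 8VSB payload figure; the other two constants are my rough assumptions.

ATSC_PAYLOAD_MBPS = 19.39   # usable payload of a 6 MHz 8VSB channel
MPEG2_HD_MBPS = 15.0        # rough bitrate for decent-looking 1080i MPEG2
H264_EFFICIENCY = 0.5       # H.264 commonly cited at ~half MPEG2's bitrate

h264_hd_mbps = MPEG2_HD_MBPS * H264_EFFICIENCY

print(int(ATSC_PAYLOAD_MBPS // MPEG2_HD_MBPS))  # -> 1 MPEG2 HD stream/channel
print(int(ATSC_PAYLOAD_MBPS // h264_hd_mbps))   # -> 2 H.264 HD streams/channel

# Or spend the savings on 1080p60 instead: roughly double the pixel rate
# of 1080i60 at roughly half the bits per pixel lands you right back at
# the MPEG2 1080i budget -- which is the "1080p only" argument in a nutshell.
```

So by that crude math, MPEG4 buys you either a second HD stream or 1080p in the same channel. Take the exact numbers with salt, but the direction is clear.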
The fight over "i" vs. "p" was largely a turf war between TV and computer companies. TV wanted to keep "i" (as in: "interlaced is the way we've always done it") while computer companies had already shown that "p" was the better technology (as in: "we tried interlaced but found that progressive scan cured flicker and motion artifacts and gave better effective resolution"), especially for computer games. We recovered, such as it is, with Blu-ray, which shows just how good 1080p really is in comparison to either 1080i or 720p. 1080p trumps the others if you are interested in video quality, no question. If you are interested in protecting your turf, maybe not so much.
Perhaps we'll learn something from all this, and the next standard (long after I'm dead, I'm sure) will be 2048p with MPEG4+ on a COFDM-modulated carrier. I'm a-gonna hold my breath for that one.
BTW, can anyone tell me how we got 1080 instead of 1024? I'm thinking I knew once what the significance of 1080 was, but I've forgotten.
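If memory serves (and somebody correct me if not), 1080 just falls out of the 16:9 frame with square pixels: 1920 columns times 9/16 is exactly 1080 rows, whereas 1024 rows would give a non-integral width. The wrinkle is that MPEG2 codes in 16x16 macroblocks and 1080 isn't divisible by 16, so encoders actually code 1088 lines and crop 8. Quick sanity check:

```python
# Where 1080 comes from -- assumes square pixels and a 1920-wide 16:9 frame.

width = 1920
print(width * 9 // 16)      # -> 1080: 16:9 with square pixels, exactly

# 1024 rows would NOT have given an integral 16:9 width:
print(1024 * 16 / 9)        # -> 1820.44...: no clean square-pixel frame

# The macroblock wrinkle: MPEG2 works in 16x16 blocks, and 1080/16 isn't
# integral, so encoders code 1088 lines and throw 8 of them away.
print(1080 / 16, 1088 / 16) # -> 67.5 68.0
```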
And why 16:9 and not something that makes a modicum of sense, like maybe 16:10 (close to the golden ratio), or 2.24:1 (sqrt(5), related to the golden ratio), or maybe 1.85:1 ("flat" 3-perf pulldown for 35mm film), or even 2.39:1 (anamorphic 4-perf pulldown for 35mm film)? I'm sure it's an average of what was being used at the time (1980s maybe? -- decades before implementation, of course), but why do we always react to the past as opposed to planning for the future? I mean, think about it: how much longer do you think cinema will be using 35mm film anyway? And if you're going to go digital, why not pick an aspect ratio that makes a modicum of sense, or at least has some bit of logic in it?
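For what it's worth, my recollection of the rationale (the SMPTE "compromise" story -- again, recollection, not chapter and verse) is that 16:9 was picked because it sits near the geometric mean of the 4:3 and 2.39:1 extremes then in use. The arithmetic at least checks out:

```python
import math

# Candidate ratios mentioned above, plus the two extremes of the era.
ratios = {
    "4:3 (TV)":            4 / 3,                   # 1.333
    "16:10":               16 / 10,                 # 1.600
    "golden ratio":        (1 + math.sqrt(5)) / 2,  # 1.618
    "1.85:1 (flat)":       1.85,
    "sqrt(5)":             math.sqrt(5),            # 2.236
    "2.39:1 (anamorphic)": 2.39,
    "16:9 (HD)":           16 / 9,                  # 1.778
}
for name, r in ratios.items():
    print(f"{name:22s} {r:.3f}")

# 16:9 vs. the geometric mean of the two extremes:
print(f"geometric mean: {math.sqrt((4 / 3) * 2.39):.3f}")  # -> 1.785 vs 1.778
```

So there may be "some bit of logic in it" after all -- just logic anchored to the formats of the past, which was exactly my complaint.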
Feh. Please forgive me. I've been trapped inside all day and evidently it's made me somewhat cranky.