Originally Posted by bicker1
HD is anything with 720 lines of horizontal resolution without regard to transmission bit rate.
You're mixing apples and oranges. In this case, mixing horizontal resolution with the number of scan lines. Horizontal resolution is how well the image can be defined as the line scans across the screen. Or, put another way, how many vertical B&W line pairs can be drawn across the screen before it turns to mush, i.e., grey.
Obviously in the new digital world, resolution is defined by many factors. In theory, one should be able to send out 960 white vertical lines, alternating with 960 black vertical lines for 1920x1080, or 640 white and 640 black for 1280x720.
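If you wanted to actually generate that test pattern, it's trivial to do. Here's a quick sketch in Python using Pillow (my choice of tool, nothing special about it, and the file name is just an example):

```python
# A sketch of the test pattern described above: alternating single-pixel
# black and white vertical lines at full 1920x1080.

from PIL import Image

WIDTH, HEIGHT = 1920, 1080   # swap in 1280x720 for the 720p case

img = Image.new("L", (WIDTH, HEIGHT))   # 8-bit grayscale, starts all black
pixels = img.load()
for x in range(WIDTH):
    value = 255 if x % 2 == 0 else 0    # white column, black column, repeat
    for y in range(HEIGHT):
        pixels[x, y] = value

img.save("line_pairs_1920x1080.png")    # 960 white/black line pairs
```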
For the moment, let's set the display device aside; we'll get back to it.
Bitrate will not affect the static image of those vertical lines, since they aren't changing, so even the low bitrate being used to cram three HD channels into one channel will send out the complete picture. What reduces the horizontal resolution is the so-called HDlite, where the number of horizontal pixels is reduced to 1440, or even worse, 1280 (for 1080i). Those 960 vertical line pairs will now start to look like crap. The horizontal resolution, for lack of a better term, is now reduced to 720 line pairs or 640 line pairs, a 25% or 33% drop in resolution. Ouch.
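Just to spell out where those 25% and 33% figures come from, here's the back-of-the-envelope math (my own arithmetic, nothing official about it):

```python
# Max resolvable vertical line pairs is just horizontal pixel count / 2;
# the HDlite loss follows directly from that.

def line_pairs(width):
    """Best-case black/white vertical line pairs a given width can carry."""
    return width // 2

for width in (1920, 1440, 1280):
    pairs = line_pairs(width)
    loss = (1 - width / 1920) * 100
    print(f"{width} wide -> {pairs} line pairs ({loss:.0f}% loss vs. 1920)")

# 1920 wide -> 960 line pairs (0% loss vs. 1920)
# 1440 wide -> 720 line pairs (25% loss vs. 1920)
# 1280 wide -> 640 line pairs (33% loss vs. 1920)
```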
Now, take those same line pairs and alternate them every frame. At the low bitrates that Comcast is using, macroblocking should appear. That will turn those nice crisp line pairs (8 line pairs per MPEG-2 macroblock) into mush. The mush could be anything from gray blocks, if really bit-starved, to who knows what, depending on how many bits are available. That is another way horizontal resolution gets reduced in the digital world.
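To put some rough numbers on it: a 1080-line frame is 120 x 68 macroblocks, and when every frame changes, the encoder can't lean on the previous frame, so the per-block bit budget gets tiny. The 12 Mbit/s figure below is just an assumed ballpark for a squeezed cable HD channel, not a measured Comcast number:

```python
import math

WIDTH, HEIGHT = 1920, 1080
MB = 16                                   # MPEG-2 macroblocks are 16x16 pixels
mb_per_frame = (WIDTH // MB) * math.ceil(HEIGHT / MB)   # 120 * 68 = 8160

bitrate = 12_000_000                      # assumed bits per second (ballpark only)
fps = 29.97                               # 1080i frame rate
bits_per_mb = bitrate / fps / mb_per_frame

print(f"{mb_per_frame} macroblocks per frame, ~{bits_per_mb:.0f} bits each on average")
# -> 8160 macroblocks per frame, ~49 bits each on average
```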
Remember that display device? In the analog world, an analog TV display was pretty much limited to about 320 lines of horizontal resolution, no matter the size of the screen, since that was the broadcast limit.
In the digital world, we now have displays that also affect the resolution they can show. For example, 720p-native sets will not display the 1080i line pairs cleanly, since they can only display 640 crisp vertical line pairs. Of course, the vertical resolution is affected as well.
I'm assuming that 720p displays are pixel for pixel and do not have overscan (which is dumb in the digital world). But then there are those 1360 (or whatever it is) x 768 sets that are 720p native. What is up with that? It means the 1280x720 image is expanded and no longer maps 1:1 onto the panel's pixels (quick math below). They seem to do a good job of it, though, as my daughter has one of those.
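Here's the scaling math, assuming the common 1366x768 panel (1360x768 works out almost the same), showing why it can't be 1:1:

```python
# Why a 1280x720 signal can't map 1:1 onto one of those panels: the scale
# factor isn't a whole number, and it isn't even the same in both directions.

src_w, src_h = 1280, 720
panel_w, panel_h = 1366, 768   # assumed panel; 1360x768 is nearly identical

print(f"horizontal scale: {panel_w / src_w:.4f}")   # 1.0672
print(f"vertical scale:   {panel_h / src_h:.4f}")   # 1.0667
```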
I'm currently displaying 1080i on a 1280x1024 computer monitor, so the 720p is 1:1, but the 1080i is reduced, so I don't see the potential full 1080i resolution.
All these things affect horizontal resolution. So, where does HD stop being HD? That is, what is the minimum horizontal resolution for 1080 and 720? Is it 1440 or 1280 (for 1080i)? Is anyone reducing 720p's 1280 to something less? How much macroblocking can exist before it is no longer HD?
For me, the minimum is 1440x1080i/p. Sony's HDV is 1440x1080i. Many HD pieces are shot on Pro Sony cameras and recorded at 1440.
As for macroblocking, I personally feel there should be zero macroblocking in order to get the crystal-clear HDTV experience the industry says digital delivers. If there are macroblocks, it isn't crystal clear anymore. What Comcast is doing has definitely crossed the line of what can be considered HDTV.
The nature of 1080i over our current ATSC standard will not provide crystal clear images 100% of the time.
Thank goodness I don't have Comcast; otherwise they would be hearing from my lawyer regarding their non-HD video. It is beginning to look like someone should file a class-action lawsuit over this, as was done over D*'s HDlite.
If you haven't noticed, there is a lot of personal opinion in this posting.