Hmm. So many variables. Several years back I even encountered MPEG-2 breakup because of design problems with an earlier cable box, Scientific Atlanta's 2000HD, built by piggybacking an HD module onto a non-HD design. A new 3100HD corrected the 2000HD's overheating, image 'stuttering', and macroblock breakups. But the SA3100HD introduced a faint component-video 'fog' over images, spoiling contrast and diminishing fine detail, as well as degrading the inky blacks my CRT-based RPTV is capable of. Then on Tuesday I leased an add-on SA8000HD dual-tuner Digital Video Recorder cable box that banished the fogged images but, of course, brought its own problem: I can't use S-video for 480i/p viewing without walking to the converter and switching it from HD to SD mode.
Those capturing images on film or videotape at 24 frames per second usually pan their cameras slowly to minimize the motion artifacts (not pixelation or so-called macroblocking) caused by this traditional capture rate. Motion at 24 fps can be made worse still, introducing judder, when frames must be repeated with so-called 2:3 pulldown to produce the 60-field-per-second video rates needed by 480i and 1080i television. Images videotaped or televised directly at 60i appear smoother than 24-fps material because 60 images are captured each second. We see each 1/60-second field (half a frame), but the frame halves are also visually merged (1/60 + 1/60 = 2/60 = a 1/30-second TV frame, or 30i).
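A rough sketch of that 2:3 cadence in code (my own illustration of the field-repeat pattern, not any encoder's actual logic):

```python
# Sketch of 2:3 pulldown: mapping 24 film frames/sec onto 60 fields/sec.
# Every other film frame is held for 3 fields instead of 2, so 4 film
# frames become 10 fields (4 x 2.5 = 10), i.e. 24 fps -> 60 fields/sec.

def pulldown_fields(film_frames):
    """Yield (film_frame, field_parity) pairs in 2:3 cadence."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 2 if i % 2 == 0 else 3   # alternate 2, 3, 2, 3, ...
        for _ in range(repeats):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

# One second of film (24 frames) -> 60 video fields (30 interlaced frames).
fields = pulldown_fields(range(24))
print(len(fields))   # 60
# The 3-field frames are on screen for 3/60 s, the 2-field frames for
# 2/60 s; that uneven hold time is the judder mentioned above.
```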
When 1080/60i (1920x1080) is delivered with standard HD's MPEG-2 encoding, the digital bit rate used must be able to handle the detail and motion within the programming. Over-the-air (OTA) HD can use up to 19.39 million bits per second (Mbps) of payload, although about 32 Mbps is actually transmitted; the 'extra' bits are forward-error-correction overhead added by the 8VSB transmission system and consumed during demodulation, before MPEG-2 decoding. If a station is multicasting--putting one or more subchannels within its ~19-Mbps allocation--then fewer bits are available for the main HD channel. The main channel benefits, fidelity-wise, from using the full 17 Mbps available for the video-payload part of that ~19 Mbps. (An original HD video payload, before compression, is roughly 1.2 Gbps; giga = billion bps.) Most often, HD programming delivered to stations carries less than a 17-Mbps video payload, perhaps only 12 Mbps. MPEG-2 encoding can shrink the bit-rate requirement by using repeat-field flags that tell home decoders to generate the 2:3 pulldown. But 1920x1080 programming is typically quite 'diluted' too: you're receiving signals with 1920x1080 samples/lines/pixels, but telecined movies copied from film typically have only 800-1300 pixels maximum that can be resolved on each horizontal line. The 1920x1080 format can provide up to a ~1700-pixel maximum of resolvable detail -- unless oversampling and downconversion are employed.
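For a sense of scale, here's the back-of-envelope arithmetic in code form. The quoted figures are the ones above; the 4:2:2, 10-bit assumption for the uncompressed rate and the 4-Mbps subchannel figure are mine, for illustration only:

```python
# Back-of-envelope HD bit-rate arithmetic using the figures cited above.
# Assumption (mine): uncompressed 1080i counted as 4:2:2 sampling at
# 10 bits per component, i.e. 20 bits per pixel.

width, height, frames_per_sec = 1920, 1080, 30   # 1080/60i = 30 frames/s
bits_per_pixel = 20

uncompressed = width * height * frames_per_sec * bits_per_pixel
print(f"uncompressed: {uncompressed / 1e9:.2f} Gbps")   # ~1.24 Gbps

ota_payload = 19.39e6    # ATSC transport payload (bits/sec)
video_payload = 17e6     # roughly what's left for video after audio, PSIP, etc.

print(f"FEC overhead: {(32.28e6 - ota_payload) / 1e6:.1f} Mbps "
      f"of the ~32-Mbps 8VSB stream")                   # ~12.9 Mbps

print(f"compression at full payload: {uncompressed / video_payload:.0f}:1")  # ~73:1

# Multicasting: every subchannel comes out of the same 19.39-Mbps pipe.
# A ~4-Mbps SD subchannel (illustrative figure) leaves the main HD channel:
print(f"main HD video: {(video_payload - 4e6) / 1e6:.0f} Mbps")   # 13 Mbps
```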
Each of a 1080i signal's 120x68 macroblocks has four smaller 8x8 DCT blocks for luma (B&W) processing and two 8x8 blocks for color. If the bit rate available is too low for the amount of detail or motion in certain DCT blocks, the MPEG-2 encoder discards video details. Images then appear softer, lacking fine resolvable detail. Some older encoders used by stations toss out more detail because they're less efficient than newer designs. If motion is too great for an encoder/decoder chain, the DCT blocks degrade into large visible rectangles or create smaller artifacts. Because motion is fast with live sports, and because resolvable detail can be at or close to the maximum, some stations switch off subchannels, permitting up to 17 Mbps for the main HD channel.
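To see how thin that bit budget really is per macroblock, here's my quick arithmetic (assuming the full 17-Mbps video payload spread evenly across frames, which real encoders deliberately don't do):

```python
# Macroblock bookkeeping for 1080i MPEG-2 (4:2:0), using the numbers above.

mb_size = 16
mb_cols = 1920 // mb_size           # 120
mb_rows = 1088 // mb_size           # 68 (1080 is padded to 1088 for coding)
mb_per_frame = mb_cols * mb_rows    # 8,160 macroblocks
blocks_per_mb = 4 + 2               # four 8x8 luma blocks + two 8x8 chroma

print(f"{mb_cols}x{mb_rows} = {mb_per_frame} macroblocks/frame, "
      f"{blocks_per_mb} DCT blocks each")

# Average bit budget per macroblock at the full 17-Mbps video payload:
video_bps, fps = 17e6, 30
bits_per_mb = video_bps / fps / mb_per_frame
print(f"~{bits_per_mb:.0f} bits/macroblock on average")   # ~69 bits
# That tiny average is why encoders lean on temporal prediction; when fast
# motion defeats prediction, quantization coarsens and blocks become visible.
```

But all this is a small sampling of the variables influencing HD image quality. -- John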