The main reason people observe differences in HDTV quality has to do with the source of the program. The bulk of programming we have today was actually shot with a variety of film cameras and converted to videotape. Unfortunately, we see only the end product after much processing, including many compression and decompression schemes that add to the quality degradation. Don't forget that the GIGO (garbage in, garbage out) concept is at play too. If the movie was poorly lit or badly focused, then the transfer can be no better than the source. Additionally, any further processing will take its toll on the image quality as well.
To determine true HDTV, I choose to qualify the video in only these categories:
1. Scan frame size of 16:9. This will include any letterboxed movies.
2. 1080i or 720p scan rate.
3. Video source was maintained component digital on all tape formats.
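The three checks above can be sketched as a simple pass/fail test. This is just my illustration of the logic; the function name and parameters are hypothetical, not any industry standard API.

```python
# Hypothetical sketch of the three qualifying criteria above.
# All names here are my own illustration.
def is_true_hdtv(aspect_ratio, scan_format, kept_component_digital):
    """Return True only if the program meets all three criteria."""
    correct_frame = aspect_ratio == (16, 9)           # criterion 1: 16:9 scan frame
    correct_scan = scan_format in ("1080i", "720p")   # criterion 2: HDTV scan rate
    return correct_frame and correct_scan and kept_component_digital  # criterion 3

print(is_true_hdtv((16, 9), "1080i", True))   # qualifies
print(is_true_hdtv((4, 3), "480i", True))     # standard-def, fails
```

Note that all three must hold at once: a 16:9 letterboxed movie that went through an analog composite stage along the way would fail criterion 3.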
If the video was maintained component digital, then the color integrity will remain, with sharp edges on the chroma as well as more intensely saturated color, since the digital component color space is greater than that of the same picture in analog composite video.
Picture detail- The maximum horizontal resolution of HDTV is 1920 pixels; however, not all images will have this level of detail yet still be classified as "HDTV." The degree of detail loss in the HDTV video editing, dubbing, and tape formats used affects the PQ of one over the other, in addition to the source quality variance. For example, take a high-res film and transfer it to D5 videotape for the base master. This format will record and preserve 1920 pixels in the 1080i scan rate HDTV format. Then the distribution dub is done to HDCAM because that is what the cable operator or DBS service happens to request. Automatically, the horizontal resolution was lost down to 1280 pixels in the 1080i format. All else being equal, the station airing the D5 will have a slightly better PQ than the one with HDCAM.

This also assumes you have a way to see the difference. The reality is that most people have HDTV monitors that top out at between 1000 and 1300 pixels of resolving power, so you probably won't see the difference between HDCAM and D5. What you may be seeing instead is other signal loss caused by compression of the video signal. It is my opinion that comparing one movie to another and attempting to blame the poor focus or bad lighting of one on compression is just wrong. I have seen many do this.
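The weakest-link point above can be shown with a little arithmetic. The pixel counts come from my example; modeling the chain with min() is my own simplification and ignores compression losses entirely.

```python
# Rough sketch of the resolution-bottleneck argument: the viewer sees
# no more horizontal detail than the weakest link in the chain.
# Figures are from the D5 vs. HDCAM example; min() is a simplification.
D5_H_RES = 1920       # D5 master preserves full 1080i horizontal detail
HDCAM_H_RES = 1280    # horizontal detail after the HDCAM distribution dub
MONITOR_H_RES = 1200  # typical consumer HDTV monitor (1000-1300 pixel range)

def delivered_detail(tape_res, monitor_res):
    # The display caps whatever the tape format delivers.
    return min(tape_res, monitor_res)

print(delivered_detail(D5_H_RES, MONITOR_H_RES))     # 1200
print(delivered_detail(HDCAM_H_RES, MONITOR_H_RES))  # 1200 -- same as D5 on this display
```

On a 1200-pixel monitor both chains bottom out at the same number, which is why the D5 advantage is invisible to most viewers; only a display resolving more than 1280 pixels could reveal it.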
Sharpness adjustments- This picture adjustment has a precise calibration that can be made with a proper test pattern. However, no one should argue with your personal preference to see your picture with excess sharpness added. It is simply an artifact, as it is not the way the picture was designed. Heck, my father-in-law chooses to watch his TV with the color saturation all the way up. The picture is iridescent, but he likes it that way. Too much sharpness will add a white edge to black text on a gray background. The proper sharpness adjustment is to just back off to the point where the white edge disappears.
Home Theater Pics at: www.scubatech.com
Last updated 3/25/01
[This message has been edited by Don Landis (edited 05-24-2001).]