AVS Forum Special Member
Join Date: May 2008
Location: San Francisco - East Bay area
Digital Video is not like analog video.
When a program is sent over the air (or by cable or satellite), for the video to change from the original, something in the signal path would have to change the bits.
If a pixel began as 00001111 (8 bits for one color channel of a pixel; let's say it's a red pixel)... it goes through everything and comes out of the "box" as 00001111. It cannot be any other way. The only thing that can change that pixel to something else is a video processor (or a math error in a format conversion, or a problem when the video is compressed for transmission). You NEVER get random bit changes, because if random bits were flipping from 1 to 0 or vice versa, you would very likely get no image at all.
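A minimal sketch of that point (the pixel value comes from the example above; everything else here is my own illustration, not anything from an actual video pipeline):

```python
# Digital transmission either delivers the exact byte or fails outright.
red = 0b00001111          # the example red pixel value (decimal 15)

# Every hop in the chain (cable box, HDMI link, etc.) copies bits verbatim.
received = red            # bit-for-bit identical to what was sent
assert received == 0b00001111

# A single random bit flip does not "shift the color slightly" --
# it produces a completely different value.
flipped = red ^ (1 << 7)  # flip the most significant bit
print(received, flipped)  # 15 vs 143: a drastic jump, not a subtle tint
```

That is why random errors look like macroblocking or a dead stream rather than a gentle color drift.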
Early on there were some Blu-ray disc players with math errors or bad conversion formulas that would produce video that was not quite right, but it was damn near impossible to correct for it because the entire image was not affected the same way. For example, for the image to be noticeably green from corner to corner, something would have to add 00000010 or maybe 00000100 to all 2 million+ green pixels. You just don't get that sort of error by accident in digital video. If there is a video processor somewhere in the signal path and it is not disabled, or it changes video somehow even with all settings at ZERO, maybe you could do something about that, but there's not much else you can do with digital video on a source-by-source basis.
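To make the same point in code: a whole-image tint requires the same deliberate offset applied to every green sample, which is exactly what a video processor does on purpose and what noise never does by accident. (The sample values below are made up for illustration.)

```python
# Stand-ins for the 2 million+ green samples in a real frame.
pixels_green = [17, 200, 96, 45]

offset = 0b00000010   # +2, the post's example of a uniform green push
# Only a processor intentionally adding the same offset to every sample
# produces a uniform tint; clamp to the 8-bit range while doing it.
tinted = [min(g + offset, 255) for g in pixels_green]

print(tinted)         # every sample shifted identically: [19, 202, 98, 47]
```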
Then there is the issue that internet video is a mess to begin with... a 4-minute clip of a dog doing backflips could be totally different from a 10-minute video of motorcycle stunts. And if you use cable or satellite TV, all 200+ channels may have slightly different video, and there is no way you can compensate for that.
So for digital video, the best thing you can do for image quality is to make sure you purchase an accurate Blu-ray disc player. There was a thread on AVS somewhere where a number of people were measuring popular disc players and posting results showing which ones were accurate and which ones were... uh, less than accurate. But even an inaccurate Blu-ray player is usually close enough to right that it can be very difficult to tell the color isn't perfect.
So unless there is some obvious reason to do so, in the age of digital video, you calibrate the video display using test patterns from an accurate video pattern generator or from a disc played on an accurate Blu-ray disc player (for a long time the PS3 and Oppo disc players were the only products you could rely on to be accurate, but these days things are MUCH better and there are more and more disc players out there that are accurate).
If you manufacture a digital video box like a Popcorn Hour or a satellite box, and you change the 1s and 0s in some random fashion, you will never get video, because there is more data in the stream than just pixel data and all that "extra" data has to be perfect. Like a photo... you can send a photo around the world 50 times over the internet, and it will NOT turn green, or experience an increase in contrast, or get darker or lighter... it will be the same image. That is how digital transmission works... the bits don't change. In digital video, to make the images change, you would have to have a video processor making intelligent changes to all the pixels... remember, you cannot change all the extra data that comes along with the pixel data, or your video stream is no longer a video stream and you just won't get video at all. Analog video is SO DIFFERENT that it is difficult for people to realize you just don't get too much red from a voltage being 0.15 volts too high, like you might in the days of analog video where a voltage level controlled how bright each pixel was. In digital video, you make a pixel brighter by changing it from 00001111 to 00010000... that's one step brighter... but five of the eight bits in that binary word are different. That just cannot happen accidentally in digital video.
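You can see the brightness example above worked out in a few lines (this just reproduces the arithmetic from the post):

```python
# One brightness step in an 8-bit codeword changes many bits at once.
before = 0b00001111                 # 15
after = before + 1                  # 16 -> 0b00010000

print(format(before, '08b'))        # 00001111
print(format(after, '08b'))         # 00010000

# XOR marks every bit position that differs; count them.
changed = bin(before ^ after).count('1')
print(changed)                      # 5 of the 8 bits flip for a one-step change
```

Noise that happened to flip exactly those five bits, in the same direction, on millions of pixels per frame, simply does not occur; only deliberate processing does that.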
Calibration in recent years has not needed to focus on calibrating each source, because of the very nature of digital video. But if someone is still using a Laserdisc player... OK, you might get some benefit from a special calibration for that, because the analog video coming out of it is subject to minor voltage tweaks that CAN affect luminance or color, and the Laserdisc player might just be a little too dark or a bit too red. That was a fact of life for analog video... digital video works differently, and things that used to be critical are no longer much of an issue. There may be isolated cases with some specific devices, but the majority of the time, there is no compelling reason to calibrate digital source components separately.
"Movies is magic..." Van Dyke Parks
THX -- ISF -- HAA