Calibrating Projectors - AVS | Home Theater Discussions And Reviews



hussain's Avatar hussain
11:54 PM Liked: 20
post #1 of 8
03-24-2014 | Posts: 362
Joined: Jun 2007

I am going to build my first home theater and need a little help, as I have a couple of questions:

1. Is projector calibration different for different types of screens?
2. Does the projector need to be calibrated separately for each HDMI input? I plan to watch video from my Popcorn Hour, PS3, and satellite DVR. Would I need to calibrate the projector separately for each of these inputs? If so, I can use Disney WOW on the PS3, but how would I calibrate for my Popcorn Hour and DVR?

kkpro's Avatar kkpro
10:17 AM Liked: 24
post #2 of 8
03-25-2014 | Posts: 918
Joined: Jul 2004
As far as screens go, yes. Inputs may vary also. There are some big differences between inputs, especially going from a Blu-ray player to satellite/cable. With just a calibration disc, though, the most you will be changing is contrast, brightness, and color.
sjschaff's Avatar sjschaff
10:40 AM Liked: 15
post #3 of 8
03-25-2014 | Posts: 400
Joined: Apr 2004
For devices like the Popcorn Hour, you'd need to rip the Blu-ray to a hard drive and use it via a player that supports Blu-ray images. The same applies to a DVR (once you determine whether the DVR will let you either put an ISO or other files directly on its disk, or allow external attachment of a flash drive or other supported USB drive). This can get tricky depending on what formats are supported (both drive and video). Some DVRs can only attach, say, FAT32-formatted devices, which have limits on supported file sizes.
Doug Blackburn's Avatar Doug Blackburn
07:15 PM Liked: 232
post #4 of 8
03-25-2014 | Posts: 3,465
Joined: May 2008
Digital Video is not like analog video.

When a program is sent over the air (or by cable or satellite), for the video to change from the original, something in the signal path would have to change the bits.

If a pixel value began as 00001111 (8 bits for one color of one pixel; let's say it is a red pixel), it goes through everything and comes out of the "box" as 00001111. It cannot be any other way. The only thing that can change that pixel to something else is a video processor (or a math error in a format conversion, or a problem when the video is compressed for transmission). You NEVER get random bits changed, because you would very likely get no image at all if random bits were being changed from 1 to 0 or vice versa.
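A small sketch of that last point: flip even a single random bit in a compressed digital stream and the stream typically stops decoding at all, rather than coming out subtly wrong. Here zlib stands in for a real video codec, and the payload and byte offset are arbitrary choices for illustration.

```python
import zlib

# Stand-in for a compressed digital video stream (zlib, not a real codec).
payload = bytes(range(256)) * 8
stream = bytearray(zlib.compress(payload))

# Flip one bit mid-stream -- the digital analogue of a random
# transmission error.
stream[20] ^= 0b00000001

try:
    zlib.decompress(bytes(stream))
    print("stream still decodes")
except zlib.error:
    print("stream is no longer decodable at all")
```

The corruption is caught either while decoding or by the stream's built-in checksum; you get no picture, not a slightly wrong one.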

Early on, there were some Blu-ray disc players with math errors or bad conversion formulas that produced video that was not quite right, but it was nearly impossible to correct for because the entire image was not affected the same way. For example, for the image to be noticeably green overall, something would have to add 00000010 or maybe 00000100 to all 2 million+ green pixels. You just don't get that sort of error by accident in digital video. If there is a video processor somewhere in the signal path and it is not disabled, or it changes video somehow even with all settings at zero, maybe you could do something about that; but there's not much else you can do with digital video on a source-by-source basis.
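To make that concrete, here is a minimal sketch (hypothetical pixel layout, illustrative values) of what a processor would have to do on purpose to produce a green tint: deliberately add a small value to every green sample.

```python
# Hypothetical frame: a list of 8-bit (r, g, b) tuples. A green tint
# requires intelligently adding a value (00000010 here) to every green
# sample, clipped at the 8-bit maximum -- never a random bit error.
def add_green_tint(pixels, amount=0b00000010):
    return [(r, min(255, g + amount), b) for r, g, b in pixels]

frame = [(16, 16, 16), (200, 180, 150)]
print(add_green_tint(frame))  # [(16, 18, 16), (200, 182, 150)]
```

Every pixel is changed in a coordinated, intentional way, which is exactly what only a video processor can do.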

Then there is the issue that internet video is a mess to begin with: a 4-minute clip of a dog doing backflips could be totally different from a 10-minute video of motorcycle stunts. And if you use cable or satellite TV, all 200+ channels may have slightly different video, and there is no way to compensate for that.

So for digital video, the best thing you can do for image quality is make sure you purchase an accurate Blu-ray disc player. There was a thread on AVS somewhere where a number of people were measuring popular disc players and posting results showing which ones were accurate and which were, uh, less than accurate. But even an inaccurate Blu-ray player is usually close enough to right that it can be very difficult to tell the color isn't perfect.

So unless there is some obvious reason to do otherwise, in the age of digital video you calibrate the display using test patterns from an accurate video pattern generator, or from a disc played on an accurate Blu-ray disc player. (For a long time the PS3 and Oppo disc players were the only products you could rely on to be accurate, but these days things are MUCH better and more and more disc players out there are accurate.)

If you manufacture a digital video box like a Popcorn Hour or a satellite box, and the 1s and 0s change in some random fashion, you will never get video, because there is more data in the stream than just pixel data, and all that "extra" data has to be perfect. It's like a photo: you can send a photo around the world 50 times over the internet and it will NOT turn green, or gain contrast, or get darker or lighter; it will be the same image. That is how digital transmission works: the bits don't change. In digital video, to make the images change, you would have to have a video processor making intelligent changes to all the pixels. Remember, you cannot change the extra data that comes along with the pixel data, or your video stream is no longer a video stream and you just won't get video at all. Analog video is SO DIFFERENT that it is hard for people to realize you don't get too much red from a voltage being 0.15 volts too high, like you might in the days of analog video, when a voltage level controlled how bright each pixel was. In digital video, you make a pixel brighter by changing it from 00001111 to 00010000. That is one code value brighter, yet five of the eight bits in that binary word change at once. That just cannot happen accidentally in digital video.
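The arithmetic behind that example is easy to check: going one code value brighter from 00001111 flips five bits simultaneously, a coordinated change no random error could make.

```python
# One-step brightness increase of an 8-bit code value.
old = 0b00001111          # 15
new = old + 1             # 16 -> 0b00010000
print(format(old, "08b"), "->", format(new, "08b"))  # 00001111 -> 00010000
print(bin(old ^ new).count("1"), "bits flipped")     # 5 bits flipped
```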

Calibration in recent years has not needed to focus on calibrating each source, because of the very nature of digital video. But if someone is still using a Laserdisc player, OK, you might get some benefit from a special calibration for it, because the analog video coming out is subject to minor voltage tweaks that CAN affect luminance or color; the Laserdisc player might just be a little too dark or a bit too red. That was a fact of life for analog video. Digital video works differently, and things that used to be critical are no longer much of an issue. There may be isolated cases with specific devices, but the majority of the time there is no compelling reason to calibrate digital source components separately.
hussain's Avatar hussain
02:46 AM Liked: 20
post #5 of 8
03-28-2014 | Posts: 362
Joined: Jun 2007
Originally Posted by Doug Blackburn View Post

Digital Video is not like analog video...


Very, very thorough reply, thank you. What you say seems to be correct: if one calibrates one digital input, then calibration for other digital inputs shouldn't be required. However:

1. Some players have built-in settings for brightness, contrast, etc. Isn't it possible that the defaults for these settings differ between input devices?
2. In the case of projectors, isn't it possible that we will need to adjust things such as contrast and brightness to counterbalance the characteristics of the screen, such as gain?
Doug Blackburn's Avatar Doug Blackburn
12:26 PM Liked: 232
post #6 of 8
03-28-2014 | Posts: 3,465
Joined: May 2008
1 - If there are settings in the disc player for Brightness, Color, Saturation, Sharpening, Contrast, etc., the source has to have a video processor in it. Having an overall green tint to images (for example) can't be accidental; something has to INTELLIGENTLY change each pixel to add green or remove red and blue in order to make an image greener than it should be. If the "center" or "zero" position for each setting/control in the disc player or other source is not really zero, the image data will be changed. You can't tell whether that is happening until you have a calibrated video display.

Once you calibrate the video display, you can make measurements using a disc with test patterns in the disc player. If those measurements differ from the measurements you got with the accurate video signal generator, then the disc player is changing things. If you were to use the disc player to calibrate the monitor and the disc player had an internal error that changed images, all your other sources would be inaccurate. So it is better to calibrate the video display first, then see whether any of the sources don't look right, or, if you can make measurements, check whether each source is accurate.

If a source is not accurate, you have to decide what to do about it. Some video displays allow different calibration settings for the HDMI1 and HDMI2 inputs, for example. That makes it easy enough to put a custom calibration on HDMI1 for the disc player and a standard calibration on HDMI2 for something like a cable or satellite box. Of course, you can avoid the whole issue by locating the thread on AVS where people are measuring Blu-ray disc players and determining which ones produce accurate images and which ones don't. Simply avoid Blu-ray players that don't measure as accurate, and all you need on the video display is a calibration using a video signal generator as the source.
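The comparison described above (signal generator vs. disc player, measured through the same calibrated display) can be sketched roughly like this. The xy readings and the tolerance are made-up illustrative numbers, not real measurements.

```python
# Illustrative CIE xy readings for the same test patterns, measured once
# from an accurate signal generator and once from the disc player.
generator = {"100% white": (0.3127, 0.3290), "50% gray": (0.3127, 0.3290)}
player    = {"100% white": (0.3180, 0.3255), "50% gray": (0.3130, 0.3292)}

TOLERANCE = 0.005  # assumed allowable deviation in x or y

for pattern, (gx, gy) in generator.items():
    px, py = player[pattern]
    clean = abs(px - gx) <= TOLERANCE and abs(py - gy) <= TOLERANCE
    verdict = "matches" if clean else "player is altering the video"
    print(f"{pattern}: {verdict}")
```

If any pattern misses the tolerance, the player has a processor changing the video somewhere, and you can then decide whether to compensate on a dedicated HDMI input.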

The other thing to keep in mind: if you DO have an inaccurate Blu-ray player and you customize the calibration of HDMI1 to make that player's output accurate, then later replace it with an accurate model on HDMI1, you will need to copy the calibration settings from HDMI2 over to HDMI1 so you aren't changing the video from the new disc player.

2 - For projectors, yes, the screen affects calibration. I've measured screens that are too yellow, too blue, and too red. If the screen is the one you will use with the projector, calibrate the projector by making measurements off that screen (some meters allow you to measure light directly from the projector, or light reflected from the screen).

You also don't want projectors to be too dim OR too bright. The SMPTE specification for projected cinema presentations is 12-20 fL, with 16 fL being optimum. I've measured projection setups where the owner thought everything was fine, but the calibrated projector was much dimmer than the manufacturer's lumens specification and the gain of the screen was considerably lower than the manufacturer's specification. The end result was only 8 fL or even less for 100% white: the projector wasn't bright enough and/or the screen gain wasn't high enough to reach the desired 16 fL. Frankly, measuring with a fairly new lamp, you'd want the CALIBRATED projector to be capable of at least 24 fL so that when the lamp ages, you still have at least 12 fL. That said, I wouldn't leave the projector set to 24 fL for 100% white; I'd use low lamp mode and a lower contrast setting to get close to 16 fL, and give the customer instructions on how much to increase the Contrast setting as the lamp accumulates hours and dims. At some point, they may also have to change to "high" lamp mode to get enough light for satisfying images.

Reflections from nearby colored surfaces in the room will also change the measurements you get from a projector, which is why very dark gray or black is ideal for a home theater room with a projector: best for the walls, ceiling, and carpet/floor.
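The fL figures come from the standard projection relationship: foot-lamberts = (calibrated lumens × screen gain) / screen area in square feet. A quick sketch with illustrative numbers (the 500 calibrated lumens and 100-inch 1.0-gain screen are assumptions, not figures from the post):

```python
import math

def foot_lamberts(lumens, gain, diagonal_in, aspect=16 / 9):
    """fL = (calibrated lumens * screen gain) / screen area in sq ft."""
    width = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    height = width / aspect
    area_sqft = width * height / 144.0
    return lumens * gain / area_sqft

# ~500 calibrated lumens on a 100" 1.0-gain screen lands close to the
# 16 fL optimum; at half the light (an aging lamp) it falls below 12 fL.
print(round(foot_lamberts(500, 1.0, 100), 1))
print(round(foot_lamberts(250, 1.0, 100), 1))
```

This is why Doug suggests headroom of 24 fL calibrated when the lamp is new: lamp output can easily halve over its life.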
hussain's Avatar hussain
01:39 PM Liked: 20
post #7 of 8
03-28-2014 | Posts: 362
Joined: Jun 2007
Fantastic reply. The reviews suggest that the Sony HW55ES is the most accurately calibrated projector out of the box. What does this mean? Should different screens have different calibrations for the same projector?

Also, what is the cheapest way to calibrate a projector with tools? I mean, what do these probes and software cost?

Also, can somebody tell me how accurate the Popcorn Hour is for ripped Blu-rays?
Doug Blackburn's Avatar Doug Blackburn
06:27 PM Liked: 232
post #8 of 8
03-29-2014 | Posts: 3,465
Joined: May 2008
Ripping implies copying. Digital copies are always perfect, provided nothing is done wrong (like critical data being "dropped" because the ripping program isn't doing what it is SUPPOSED to do). You can't change digital video "intelligently" without a video processor. That means you either get the bits right or wrong during a conversion; if you get them wrong in any random way, you are just plain going to lose the video stream altogether.

If the Popcorn Hour compresses video, all bets are off. Compression algorithms tend to "average" nearby colors, so you may not get precise color after compression, but it won't be a big, obvious error. If the disc is 22 GB or 45 GB (single-layer Blu-ray holds 25 GB and dual-layer Blu-ray holds 50 GB) and it is the same size after ripping, you will have Blu-ray-quality playback (though multiple language soundtracks and special features can be difficult to account for if you aren't pulling all the contents onto the Popcorn Hour; I've never seen or read about the machine, so I don't know what it does).
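The "averaging" of nearby colors can be pictured with a toy version of chroma subsampling: lossy video compression commonly keeps one shared color value for a small block of pixels, so fine color detail blurs slightly instead of producing a gross error. (The numbers here are illustrative.)

```python
# Four chroma samples from a 2x2 pixel block. A 4:2:0-style encoder
# keeps one averaged value for the whole block instead of all four.
block = [100, 104, 96, 100]
shared = sum(block) // len(block)
reconstructed = [shared] * len(block)
print(reconstructed)  # [100, 100, 100, 100]
```

The result is a small, systematic softening of color detail, not the kind of gross tint or brightness shift that calibration would chase.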

There's really no such thing as a "best calibrated out of the box" projector. I've calibrated several samples of Sony's $25,000 1000ES projector, and while there was ONE combination of settings that was fairly good, I would not call it "well calibrated" with those settings; it just wasn't as bad as many other projectors. There were still lots of things to fix regarding calibration. And 3D mode was worse than 2D mode; projectors and TVs are often very different in 3D mode compared to 2D mode, but most people don't realize that.

There are many, many posts in the calibration forum where people have asked what equipment and software to purchase. I suggest you use the search feature to look for "what meter" or "what software". Just be careful: you want the meter you purchase to be supported by the software you decide to use. Paid software probably supports MANY meters, while free software may only support a few. You can pay less than $100 US for a poor meter and nearly $30,000 US for an excellent one.

Most hobbyists spend $200-$300 on a meter, but at those prices you must understand that the meter will change as it gets older, and the measurements will change with it. A new meter today will be very accurate for maybe 2 or 3 years before it begins to drift enough to affect the readings. But you will never know the readings are changing, because at the beginning the errors will be small and not easy to see with your eyes. Nobody really knows how long a moderately priced ($200-$300) meter will remain accurate, but the best estimates are 4-6 years. If the meter is kept cold and dry, it will drift more slowly; if it is hot and in high humidity, it will "age" and drift faster. We do know that. Some people store their meters in the freezer: they seal the meter in a heavy bag with desiccant to absorb the moisture from the air inside the bag for a couple of days, then put the bagged meter in the freezer.

It is POSSIBLE you might find two very different screens that would not need separate calibrations, but there is no way to know for sure without measuring the screens and comparing the measurements. It is not likely, though, that two screens would produce the same measurements and use the same projector calibration settings. So you should ASSUME that your two or three screens will measure differently and get a calibration customized for each screen.