Andy, I'm not ISF certified but I think I understand this stuff fairly well...
Originally Posted by MadMrH
In your question quoted above you mention "full field". As I understand it, a full field is not the correct test for CRT projectors; a windowed field is meant to be used. I am led to believe that a full field puts strain on the PSU, and usually the CRT projector is unable to sustain a full field correctly.
Basically correct. The CRT circuitry is current-limited: it can only pump so much juice through the circuits per frame. IRE100 takes a lot of current, and the amount of current required is directly proportional to the area of IRE100 you're displaying. A CRT should be able to display an IRE100 window without too much trouble, but a full field is more than a typical, properly set-up CRT can deliver. If you measure the ftL for an IRE100 window, then measure an IRE100 full field, you will see the light level drop when you try to light up the whole screen.
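If it helps to see it in numbers, here's a toy Python sketch of the current limiter. Everything in it (the cap, the per-area demand, the min() limiter) is made-up illustration, not a real CRT spec:

```python
# Toy model: total beam current is capped per frame, so light output
# per unit area falls once the lit IRE100 area demands more than the
# cap allows. All numbers are arbitrary, for illustration only.

current_cap   = 2.0   # max total beam current (arbitrary units)
demand_per_m2 = 1.0   # current an IRE100 patch wants per square meter
ftl_per_unit  = 30.0  # ftL produced per unit of current density

def measured_ftl(lit_area_m2):
    """ftL inside the lit patch, after the current limiter acts."""
    total_demand = demand_per_m2 * lit_area_m2
    scale = min(1.0, current_cap / total_demand)  # limiter kicks in
    return ftl_per_unit * demand_per_m2 * scale

print(f"IRE100 window (0.5 m^2):     {measured_ftl(0.5):.1f} ftL")
print(f"IRE100 full field (4.0 m^2): {measured_ftl(4.0):.1f} ftL")
```

The window measures full brightness; the full field measures dimmer, because once total demand exceeds the cap the limiter scales everything down.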
Since a full-field IRE100 screen is not a typical frame in movie material, it makes more sense to optimize the projector for a more typical usage scenario. IRE100 in a smaller area, like a window, is a more realistic approximation of actual video content, so that's what you use for calibration.
I don't believe that D6500 is actually a known calibration curve - though I do see many quote it.
D65 or 6500 are the options.
D65 is the same as 6504 - that's what I understand.
I believe D65, D6500, 6500, etc. are all common synonyms for a 6500 Kelvin "white." 6504 Kelvin is actually the exact figure (D65 is defined as a daylight illuminant whose correlated color temperature works out to roughly 6504K), but people use 6500 as a reasonable approximation.
ALSO I don't believe that you can only calibrate 80IRE to D65; it is one position on a calibration curve. Although you can get the correct co-ordinates at that one point, ONLY with a minimum of two other positions can you know you are on the curve - 9, 11 or 21 points on the curve are usually used for calibration.
You're correct that you can't calibrate with one point. But with a CRT alone you can't calibrate for 9 or however many points -- only 2. You can MEASURE your results at as many points as you want, but you can only SET it at two. (You may be able to control the levels at more points with an external video processor, but a basic standalone CRT has only those two control points. But see below.)
Calibrating a CRT is a matter of fitting its color-temp curve to a desired curve. The desired curve is normally 6500 at all IRE levels. A CRT has only two controls per color to match those curves: low IRE (bias) and high IRE (gain). You adjust the bias on 2 or 3 colors to get 6500 when displaying a low IRE level, and adjust the gain on 2 or 3 colors to get 6500 when displaying a high IRE level. Since the two adjustments interact, you have to go back & forth a few times until it stabilizes. By doing so, you level out the color-temp curve at the desired temp of 6500.
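Here's a minimal Python sketch of that back-and-forth, assuming a toy one-gun model (output = gain * signal^gamma + bias). The gamma, targets, and update rule are all invented for illustration; this is not how any real calibration tool works:

```python
GAMMA = 2.2  # assumed display power curve

def gun_output(signal, gain, bias):
    """Light output of one gun for a normalized signal in [0, 1]."""
    return gain * signal ** GAMMA + bias

def calibrate(target_low, target_high, low=0.30, high=0.80):
    """Alternate bias (low IRE) and gain (high IRE) until both settle."""
    gain, bias = 1.0, 0.0
    for _ in range(20):
        # Set bias so the low-IRE reading hits its target...
        bias += target_low - gun_output(low, gain, bias)
        # ...then set gain at high IRE. This disturbs the low end,
        # which is why you go back & forth until it stabilizes.
        gain += (target_high - gun_output(high, gain, bias)) / high ** GAMMA
    return gain, bias

gain, bias = calibrate(target_low=0.05, target_high=0.60)
print(f"gain={gain:.3f}, bias={bias:.3f}")
```

Each knob undoes a little of the other's work, so you loop until both readings sit on target; the loop settles quickly because the low-IRE reading is only weakly affected by the gain.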
But that assumes the CRT has a flat color-temp response once you adjust bias/gain correctly. That's not always the case, and in fact it is normally NOT the case without some additional effort. Typically people will see a high-temp "hump" in the middle ranges. Why? Because they set the proper color temp at a high IRE level, say IRE80, but they've pushed the blue CRT into saturation and it's not responding linearly.
Red and green are normally pretty linear. (It's actually a power curve, not linear, but let's use the term "linear" for simplicity.) If you increase the video signal X%, then red and green increase their output by X%. Blue, however, quickly overloads and flattens out, so that increasing the input by X% might only increase the light output by two-thirds of X%. If you set the color temp correctly at a high IRE where the blue is saturating, you end up with too much blue at lower levels -- and that gives you the high-temp hump.
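To see how that makes the hump, here's a toy Python model: red/green follow an ideal power law, while blue gets a soft clip (tanh) above a made-up knee. None of these numbers come from a real tube:

```python
import math

GAMMA = 2.2

def rg_output(signal):
    """Red/green: clean power law ('linear' in the loose sense above)."""
    return signal ** GAMMA

def blue_output(signal, knee=0.55):
    """Blue: follows the power law, then flattens as it saturates."""
    x = signal ** GAMMA
    return knee * math.tanh(x / knee)  # soft clip above the knee

# Scale blue so the mix is "correct" at IRE80...
blue_scale = rg_output(0.8) / blue_output(0.8)

# ...then look at the blue excess at lower levels.
for ire in (20, 40, 60, 80):
    s = ire / 100
    excess = blue_scale * blue_output(s) / rg_output(s)
    print(f"IRE{ire}: blue is {excess:.2f}x what it should be")
```

With blue pinned correct at IRE80, the printout shows it running roughly 25-40% hot below that. In practice the bias adjustment then pulls the very bottom back to 6500 as well, so what's left over is exactly that high-temp hump in the middle.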
This is why you defocus the blue. By **electrically** defocusing the blue, you spread out those electrons over a larger area of phosphor. In the area of a sharply-focused dot, let's say the blue light output starts to flatten out when you hit it with X electrons per second. But if those electrons hit a larger area, you've decreased the electrons-per-unit-area, and you can get more light output before you overdrive the phosphor and go into non-linear response. This gives you a lot more blue light output in the linear-response area, so the blue can keep up with the red & green across the IRE range, and you get a flat color-temp response.
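A quick back-of-the-envelope on the defocusing, assuming the phosphor stays linear below some current density and a simple disk-shaped spot (both numbers invented):

```python
import math

max_linear_density = 1.0  # current/area before saturation (arb. units)
beam_current = 3.0        # total electrons per second (arb. units)

def spot_density(diameter):
    """Current density for the beam spread over a disk of this size."""
    area = math.pi * (diameter / 2) ** 2
    return beam_current / area

for d in (1.0, 1.5, 2.0):
    dens = spot_density(d)
    state = "linear" if dens <= max_linear_density else "SATURATING"
    print(f"spot diameter {d:.1f}: density {dens:.2f} -> {state}")
```

Doubling the spot diameter quarters the electrons-per-unit-area, which is the whole trick: the same beam current now lands in the phosphor's linear range.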
Fortunately your eye can't focus on blue very sharply, so you don't notice the fuzzy blue (except maybe on test patterns).
By the way, you still have to choose the low & high IRE levels at which to take your measurements. Some people choose the lowest level they can read, and IRE100. I prefer to calibrate at IRE30 and 80. IRE80 is much more common in typical video material than IRE100, so I think it's more important to have 80 correct than 100. Plus, if you calibrate at, say, IRE10 and 100, you have a long stretch between the two where the color temp can diverge from the desired D65. For example, if you have a blue hump, you might be correct at IRE100 but 400K high at IRE50. If you calibrate at 30 and 80 instead, the 0-to-100 temp will stay closer to the target: the same 400K hump might be 200 high at IRE50 and 200 low at IRE100. You have the same hump, but the worst error is smaller, and you're dead-on around the key areas of IRE80 and 30.
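Here's a toy comparison of the two choices. The hump values are invented, and "calibration" is simplified to subtracting a straight line through the two chosen points, which is not real CRT physics, just enough to show the arithmetic:

```python
# Hypothetical uncorrected error (Kelvin above 6500) per IRE level.
hump = {10: 0, 30: 200, 50: 400, 80: 200, 100: 0}

def residual_errors(cal_low, cal_high):
    """Errors left after zeroing the curve at the two calibration points."""
    def baseline(ire):
        t = (ire - cal_low) / (cal_high - cal_low)
        return hump[cal_low] + t * (hump[cal_high] - hump[cal_low])
    return {ire: err - baseline(ire) for ire, err in hump.items()}

for points in ((10, 100), (30, 80)):
    errs = residual_errors(*points)
    worst = max(abs(e) for e in errs.values())
    print(f"calibrate at IRE{points[0]}/{points[1]}: "
          f"worst error {worst:.0f}K  residuals {errs}")
```

Calibrating at 10/100 leaves the whole 400K error sitting at IRE50; calibrating at 30/80 splits it into 200 high at 50 and 200 low at the extremes, with zero error at the two levels that matter most.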
Does that help, Andy?