Originally Posted by Gary Murrell
Blue can be got as good as the others. I prefer all defocusing off, as everyone knows, but you will need the ability of a scaler to flatten the greyscale out, which will in turn make things a bit harder on your blue tube
If you prefer the blue sharper, that's fine. But as Gary says, you have to use a carefully tuned gamma-like adjustment on the blue or you'll pay for it with a bad color-temp response. You can't make up for skipping the defocus just by boosting the blue signal in the source.
The reason for electrically (not optically) defocusing the blue is that the blue phosphor's light output maxes out before red and green do. Let's pretend the red and green have a linear response curve, so that an input signal of X results in an output of Y*X lumens. (It's actually not linear, but it makes the explanation simpler if we pretend.) Blue has a linear response curve too, but only until it gets into the overdriven part of the curve. That point depends on your contrast levels & other things; it may be at IRE 80, IRE 40, or somewhere else. Once the blue starts to saturate, the response curve is no longer linear. It flattens out. See Fig. A in the attached pic for an approximation.
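To make Fig. A concrete, here's a toy sketch of the two curves in Python. The knee at IRE 60 and the ceiling are numbers I made up for illustration; where the real knee sits depends on your contrast and drive levels, as above.

```python
# Toy model of the response curves in Fig. A -- illustrative numbers only.
import math

KNEE, LMAX = 60.0, 92.0   # assumed saturation knee and output ceiling

def linear_response(ire):
    """Idealized red/green: output proportional to input (Y*X with Y = 1)."""
    return ire

def blue_response(ire):
    """Blue: linear up to the knee, then flattening out toward LMAX."""
    if ire <= KNEE:
        return ire
    # Soft clip: slope is 1 right at the knee and falls toward 0 near LMAX.
    return LMAX - (LMAX - KNEE) * math.exp(-(ire - KNEE) / (LMAX - KNEE))

for ire in (10, 30, 50, 80, 100):
    print(f"IRE {ire:3d}: R/G {linear_response(ire):6.1f}  blue {blue_response(ire):6.1f}")
```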
If you set your grayscale by measuring your color temp at, say, IRE 30 & IRE 80, you adjust your blue bias to match the R/G levels at IRE 30, and your blue gain to match them at IRE 80. But since the blue response isn't linear, your blue levels end up TOO HIGH in the region between IRE 30 & 80. That's the "blue hump": color temps run high in the midrange and low at the high & low IREs. See Fig. B. In practice, midranges (e.g. skintones) look cold and bluish, while shadow details (and to a lesser extent bright areas) look reddish.
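Here's that two-point calibration run against the toy curve above: blue lands exactly on R/G at IRE 30 and IRE 80, and comes out several points hot everywhere in between, with both ends going red. That's Fig. B in miniature (again, all numbers invented for illustration):

```python
# Demonstrating the "blue hump": calibrate blue at IRE 30 & 80 only,
# then look at what happens in between. Toy numbers, same curve as above.
import math

KNEE, LMAX = 60.0, 92.0

def sat(drive):
    """Toy blue response: linear up to KNEE, soft-clipping above it."""
    if drive <= KNEE:
        return drive
    return LMAX - (LMAX - KNEE) * math.exp(-(drive - KNEE) / (LMAX - KNEE))

def sat_inverse(target):
    """Drive needed to hit a given output (only valid for target < LMAX)."""
    if target <= KNEE:
        return target
    return KNEE - (LMAX - KNEE) * math.log((LMAX - target) / (LMAX - KNEE))

# Two-point grayscale: pick bias & gain so blue output equals IRE at 30 and 80.
d30, d80 = sat_inverse(30), sat_inverse(80)
gain = (d80 - d30) / (80 - 30)
bias = d30 - gain * 30

for ire in (10, 30, 50, 65, 80, 100):
    blue = sat(bias + gain * ire)
    print(f"IRE {ire:3d}: blue {blue:6.2f}  (target {ire:3d}, error {blue - ire:+6.2f})")
```

The errors come out positive between the two calibration points (the hump, i.e. blueish midtones) and negative below IRE 30 and above IRE 80 (reddish shadows and highlights).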
If you just boost the blue (equally across the IRE range) in your scaler, you're doing the exact same thing as boosting the blue gain in your CRT -- you're increasing the input signal. That is still going to drive the blue into the nonlinear range. You're back to Fig. B and a blue hump. **IF** you can tweak a gain transfer function that perfectly compensates for the blue saturation, and produces the proper linear response, then you should be able to get correct color temps without defocusing the blue. Be aware, though, that this gain transfer function is contrast-dependent; change the contrast and you change the IRE level where the blue saturates, so your gain transfer function is no longer correct. And, as Gary said, this approach drives your blue really hard and will shorten its life.
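If your scaler really can apply an arbitrary curve to blue, the compensation amounts to running the signal through the inverse of the saturation curve, so that tube saturation cancels it back out to linear. Here's a sketch with the same toy model; shifting the knee from 60 down to 45 stands in for raising the contrast, and you can see the carefully built curve immediately stops being correct:

```python
# Pre-distort blue with the inverse of the tube's saturation curve, then
# show why the fix is contrast-dependent. Same toy model, invented numbers.
import math

LMAX = 92.0

def sat(drive, knee):
    """Toy blue response with an adjustable knee (the knee moves with contrast)."""
    if drive <= knee:
        return drive
    return LMAX - (LMAX - knee) * math.exp(-(drive - knee) / (LMAX - knee))

def precompensate(ire, knee):
    """Inverse of sat(): the drive the scaler must output for linear response.
    Only defined for ire < LMAX -- you can't undo a hard ceiling."""
    if ire <= knee:
        return ire
    return knee - (LMAX - knee) * math.log((LMAX - ire) / (LMAX - knee))

CAL_KNEE = 60.0   # the knee at the contrast setting the curve was built for

for ire in (30, 50, 80):
    ok = sat(precompensate(ire, CAL_KNEE), knee=CAL_KNEE)   # linear again
    wrong = sat(precompensate(ire, CAL_KNEE), knee=45.0)    # contrast raised
    print(f"IRE {ire:3d}: compensated {ok:6.2f}, after contrast change {wrong:6.2f}")
```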
The usual solution to this is to prevent the blue from saturating in the first place. One way to do that is to lower the gain or contrast so blue never gets into the overdriven range -- but unless you've got a screen with huge gain, that will result in such a dim picture that you probably wouldn't want to watch it.
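To put a rough number on "dim": with the toy knee at 60% of full drive, capping the drive there costs about 40% of your peak light, assuming output tracks drive linearly as in the model above.

```python
# Rough cost of the lower-the-contrast fix, with the same invented numbers.
KNEE = 60.0          # assumed drive level where blue starts to saturate
PEAK_DRIVE = 100.0   # drive at the original contrast setting

loss = 1.0 - KNEE / PEAK_DRIVE   # fraction of peak light given up
print(f"Peak white drops by about {loss:.0%}")   # -> about 40%
```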
The other way is to defocus the blue. The blue saturates because the phosphor can only handle a certain rate of electrons hitting it -- call it Bsat -- before it goes into saturation. Bsat is a count of the # of electrons hitting the phosphor per second, PER AREA. With a sharply-focused blue beam, you hit that Bsat level very quickly in a small focused dot. But if you defocus it slightly, so the electrons spread out over a larger area, you've got more phosphor contributing to the light output. You can get more lumens out before you hit saturation, so your blue response stays more linear across the IRE range, and you avoid the blue hump.
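The same idea in numbers: since Bsat caps electrons per second per unit area, spreading the beam over roughly twice the spot area lets you push roughly twice the beam current (and therefore light) before any of the phosphor saturates. The spot sizes and Bsat figure below are invented for illustration:

```python
# How defocus buys saturation headroom: Bsat limits flux PER AREA, so a
# bigger spot tolerates more total beam current. Invented numbers throughout.
import math

BSAT = 1.0e18   # assumed max electron flux density, electrons/s per mm^2

def current_at_saturation(spot_diameter_mm):
    """Beam current (electrons/s) at which the spot's flux density hits BSAT,
    treating the spot as a uniform disc (a simplification)."""
    area = math.pi * (spot_diameter_mm / 2) ** 2
    return BSAT * area

focused = current_at_saturation(0.30)     # sharply focused spot
defocused = current_at_saturation(0.42)   # slightly defocused, ~2x the area

print(f"focused:   saturates at {focused:.2e} electrons/s")
print(f"defocused: saturates at {defocused:.2e} electrons/s "
      f"({defocused / focused:.1f}x the headroom)")
```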
Since your eye can't focus on blue very well, most people won't notice the defocused blue, at least most of the time. It will show up most in test patterns, and in occasional things like the movie credits Gary mentioned. If that drives you insane, then keep the blue sharp. But realize you WILL NOT get proper grayscale if you do that, unless you do some very careful magic with your scaler.