Here's what a major CRT OEM has to say:

"RGB bandwidth determines how fast the electron beam can change state as it sweeps across the screen, setting an upper limit on horizontal resolution. The most torturous image for a CRT is an alternating series of one-pixel-wide vertical B&W lines. The input signal for such a pattern is a square wave, but unfortunately, bandwidth measures only sinusoidal frequency components. The sinusoidal bandwidth required to adequately approximate a square wave is roughly three times the binary frequency of the square wave. The additional bandwidth is required to capture the third harmonic, which is essential if a decent approximation of a square wave is to be obtained (most of the signal energy for a square wave is in the first and third harmonics of its Fourier series)."
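The Fourier claim in that quote is easy to check numerically. For an ideal ±1 square wave, the series contains only odd harmonics with amplitude 4/(nπ), so the power in harmonic n is (1/2)·(4/(nπ))². A quick sketch (this is just the textbook Fourier series, not anything OEM-specific):

```python
import math

# Power in each odd harmonic of an ideal unit-amplitude square wave.
# Fourier series: sq(t) = sum over odd n of (4/(n*pi)) * sin(n*w*t),
# so the power carried by harmonic n is (1/2) * (4/(n*pi))**2.
# Total power (mean-square value) of a +/-1 square wave is exactly 1.
total_power = 1.0

for n in (1, 3, 5, 7):
    p = 0.5 * (4 / (n * math.pi)) ** 2
    print(f"harmonic {n}: {100 * p / total_power:.1f}% of signal power")
```

The first harmonic alone carries about 81% of the power and the third another 9%, so keeping just those two captures roughly 90% of the signal, which is why the quote treats the third harmonic as the cutoff for a "decent approximation."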

What was that?

So just how important is bandwidth? Will a 100 MHz bandwidth be visually better than, say, 75 MHz (whose specs exceed HDTV requirements)?

If so, will 120, 135, 150, and yes, 180 (Cine 9) show any visual improvement over one another (all things being equal)?
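To put rough numbers on the question, here's a sketch of the third-harmonic rule applied to one display mode. The resolution, refresh rate, and blanking overhead below are assumed figures for illustration, not specs from any particular monitor:

```python
# Rough estimate of the RGB bandwidth needed to resolve alternating
# one-pixel-wide B&W vertical lines, per the "third harmonic" rule.
# All figures below are assumptions chosen for illustration.

h_pixels = 1600        # horizontal resolution (assumed)
v_lines  = 1200        # vertical resolution (assumed)
refresh  = 85          # refresh rate in Hz (assumed)
blanking = 1.3         # overhead factor for H/V blanking (assumed)

pixel_clock = h_pixels * v_lines * refresh * blanking  # pixels per second
fundamental = pixel_clock / 2   # one square-wave cycle spans 2 pixels
bandwidth   = 3 * fundamental   # include the third harmonic

print(f"pixel clock : {pixel_clock / 1e6:.0f} MHz")
print(f"fundamental : {fundamental / 1e6:.0f} MHz")
print(f"bandwidth   : {bandwidth / 1e6:.0f} MHz")
```

Under these assumptions the pixel clock lands around 212 MHz and the third-harmonic bandwidth well above it, which suggests that at high resolutions even a 100 MHz amplifier could soften a one-pixel line pattern, while the differences among 120/135/150/180 MHz would matter most at the highest resolutions and refresh rates.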