Originally Posted by Wickerman1972
I wish somebody else who knows the precise facts better than I do would come in here and argue this with you because I have no doubt that you are wrong. What you're saying makes no sense. The 1080i standard is 1920 X 1080. You're saying that the SFPT displays at 800 X 1400. That makes no sense! Why would they increase the 1080 part to 1400? Doing that would be pointless because that goes above and beyond the signal coming into it. What does make sense is getting the 1920 part as close to 1920 as possible. Other tubes do that at 800-900, the SFPT at 1400. That's still not quite 1920 but certainly a big improvement.
You are reversing horizontal and vertical numbers, and you are misunderstanding CRT technology, particularly Sony's.
Let's start with CRT basics, from the era of black and white TV. The tube is coated with layers of material including phosphors that, when excited by an electron beam, emit light. Analog TV signals send an analog waveform along a horizontal scan line. Thus, it would be correct to talk about how many scan lines of resolution there are, because it's a fixed number. But there is no actual limit to the gray-level precision (other than signal-to-noise ratio). The limit on horizontal resolution is also a bit hard to quantify, because once again it's analog. Here the best you can do is reason from the bandwidth allocated: NTSC allots about 4.2 MHz to luminance, and over the roughly 52.6 microseconds of active time in each scan line that works out to about 220 full up/down cycles. Each cycle is one black stripe plus one white stripe, so you get a pattern of about 220 stripe pairs across the screen, which is sort-of 440 pixels. As a practical matter, you can't really see all of that since the wave forms don't really have sharp edges.
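If you want to check that arithmetic yourself, it's a one-liner. The 4.2 MHz and 52.6 µs figures are the standard NTSC luminance bandwidth and active line time, not anything specific to one set:

```python
# Back-of-the-envelope NTSC horizontal resolution from luminance bandwidth.
# Standard NTSC figures, not measurements from any particular TV.

luma_bandwidth_hz = 4.2e6    # NTSC luminance bandwidth
active_line_s = 52.6e-6      # visible portion of each ~63.6 us scan line

cycles_per_line = luma_bandwidth_hz * active_line_s  # full light/dark cycles
pixels_per_line = 2 * cycles_per_line                # each cycle = 2 "pixels"

print(f"{cycles_per_line:.0f} cycles -> about {pixels_per_line:.0f} pixels per line")
# -> 221 cycles -> about 442 pixels per line
```

And remember that's a best case; real waveforms don't have sharp edges, so you resolve less than this in practice.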
Nothing in the foregoing said anything about physical dots, because there aren't any in a black-and-white set! So, with analog TV, the only actual "digital" resolution is the number of horizontal scan lines. Although this is 525 for NTSC, some lines are consumed by vertical blanking and sync, so it is treated as 480 interlaced visible lines per frame.
Physical dots appear when you move to colour. This is because you need to have different phosphors and/or colour filters to get light of different colours to be emitted when struck by the electron gun. You also need to make sure the gun hits the right spot on the screen for the colour it's rendering. Most CRTs use a shadow mask to accomplish this - it's a metal grid with lots of tiny holes in it. Sony is unique in using an aperture grille, which could be thought of as an outer metal frame with a series of vertical slits cut into it. Because the slits are vertical, Sony CRTs don't have a physical limit on vertical resolution imposed by the tube itself. (The limit would be imposed by the spot size of the electron gun, but this is irrelevant because as noted earlier the vertical dimension is already inherently "digital" because even an analog picture is broken up into a fixed number of scan lines.)
Colour CRTs still fundamentally work like black-and-white, in that an analog signal is put out across each horizontal scan line. The signal does not have sharp edges, so for example it does not go from full-on to full-off as it crosses each slit. In other words, the horizontal "pixels" do not exactly match up with the slits, but instead can straddle slits and/or share slits. This is the same bandwidth limitation described earlier, only now you have the slits to contend with.
Digital CRTs still ultimately convert each scan line to an analog signal for rendering. It is at this point that the pixels in the digital image lose some of their individual identity to the analog waveform. Of course, as mentioned above, the number of scan lines is still a discrete ("digital") number, so that continues to be relevant, as it was even in analog.
Unlike most fixed-pixel displays, CRTs can scan the electron gun in many different ways, all on the same physical tube, by changing the scan frequency. In the days of CRT computer monitors there were "multi-sync" monitors that would run at a range of frequencies. One thing that I have never seen, but that is theoretically possible, is a CRT that could switch between progressive and interlaced rendering. As a practical matter, the design of the CRT and the selection of phosphors will make it run natively one way or the other. TV CRTs are interlaced.
While an HD CRT could in theory show SD material at 480i by literally scanning 480 lines vertically, this would look hopelessly grainy. My understanding is that Sony will line-double the 480i up to 960i to avoid this. Note that this does not say anything about the horizontal resolution, because, once again, that's an analog signal (possibly one generated from digital content, but still analog when finally displayed). The tube has some fixed number of slits, but that doesn't mean you actually have that many discrete resolvable areas to see. The underlying signal may have less resolution than the number of slits (e.g. for SD), or it may have more resolution than the number of slits (e.g. for HD).
Some confusion seems to exist about horizontal resolution. Current Sony CRTs, including my HS series, use the "normal" tube, which has 852 or 853 vertical slits. The super-fine-pitch tube, no longer in production, found in the XBR960, the XS series, and some other sets, had 1400 vertical slits. I can't say this enough times, so I'll say it again: the slits run vertically, not horizontally - each slit could be thought of as what you'd get by taking a knife and cutting from the top of the metal sheet down to the bottom. It's important to remember that the signal going out horizontally is already analog by the time the electron gun is hitting the slits, so the fact that the number of slits is not exactly the same as any of the SD or HD resolutions is not critical.
With slits, vertical resolution could in theory be whatever you want, based on the electronics - 480i, 720p, 1080i. As a practical matter, HD CRTs want to run in the range of 1080i, give or take - they aren't designed to "multi-sync" like the computer CRTs. If given a 1080i HD signal, I believe you really do get 1080 interlaced scan lines. 720p is converted to 1080i by the scaler. 480i is either converted by the scaler or line-doubled to 960i - note that this cannot add resolution where none exists; it's still only 480 discrete lines of information content.
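To make the line-doubling point concrete, here's the naive "repeat each line" version in Python. This is just a toy - I don't know the details of Sony's actual deinterlacing/doubling algorithm - but it shows why doubling the line count doesn't add information:

```python
# Naive line doubling: each source scan line is simply emitted twice.
# The output has twice as many scan lines but identical information content.

def line_double(frame):
    """frame is a list of scan lines; return a frame with each line repeated."""
    doubled = []
    for line in frame:
        doubled.append(line)
        doubled.append(line)
    return doubled

sd_frame = [f"line {n}" for n in range(480)]  # stand-in for 480 lines of video
hd_frame = line_double(sd_frame)

print(len(hd_frame))       # 960 scan lines out...
print(len(set(hd_frame)))  # ...but still only 480 distinct lines of content
```

A real scaler does something smarter than repetition (interpolation, motion adaptation), but the information-content argument is the same.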
So, let's review: given a 1920x1080i signal, my CRT will display all 1080 interlaced scan lines, but the horizontal 1920 pixels will be converted to an analog waveform and somewhat degraded by the 853 slits they must pass through. Someone with an XBR960 would see somewhat less degradation from their 1400 slits. Given a 640x480i SD or analog signal, my set will line-double it up to 960i, but of course it still only has 480 actually different scan lines of information, right? The 640 pixels will be converted to an analog horizontal waveform and not greatly degraded by the 853 or 1400 slits they pass through.
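Here's a crude thought-experiment in Python that illustrates that asymmetry. It just averages a finest-detail test pattern (alternating black/white pixels) into slit-sized buckets - it ignores spot size, beam focus, and everything else about a real tube, so treat it as a cartoon, not a model:

```python
# Toy model: push an alternating black/white test pattern through a fixed
# number of "slits" by averaging the source pixels falling across each slit.
# Detail finer than the slit pitch washes out; detail coarser than it survives.

def through_slits(pixels, n_slits):
    width = len(pixels)
    out = []
    for s in range(n_slits):
        lo = s * width // n_slits
        hi = max(lo + 1, (s + 1) * width // n_slits)
        chunk = pixels[lo:hi]
        out.append(sum(chunk) / len(chunk))
    return out

def contrast(samples):
    return max(samples) - min(samples)  # 1.0 = full black-to-white swing

hd_line = [i % 2 for i in range(1920)]  # finest HD detail: 1920 alternating pixels
sd_line = [i % 2 for i in range(640)]   # finest SD detail: 640 alternating pixels

print(contrast(through_slits(hd_line, 853)))  # well below 1.0: heavily degraded
print(contrast(through_slits(sd_line, 853)))  # 1.0: survives intact
```

The 1920-pixel pattern averages out to gray-ish mush in the 853 buckets, while the 640-pixel pattern passes through untouched - which is the "more resolution than slits vs. less resolution than slits" distinction in miniature.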
It's subjective, but I think HD looks wonderfully detailed on my HS CRT. Some "loss" of resolution horizontally doesn't seem to matter; remember that the number of slits doesn't dictate what you can resolve, because the spot size of the electron gun also matters. I can see a dramatic difference between finely-detailed HD pictures and digital SD. Another subjective judgement: I think SD digital and analog NTSC look better on my CRT than on the 1400-slit super-fine-pitch tube; I think that's because the SFP tube makes the flaws of SD too readily apparent. Note that owners of 42" plasmas don't seem bothered by having only 1024 horizontal pixels, and in their case, with a fixed-pixel display, there is no ability to render partial pixels and/or share pixels, other than what the scaler can do.