One of the reasons that 48 Hz is so popular with those who try it is this...
Bandwidth is THE slipperiest number in video. All calculations, and most specs, are somewhat interpretive.
If a projector spec claims a given bandwidth, that means it can do a reasonable reproduction of a signal at that frequency. As you come closer to the maximum bandwidth, you DO NOT get a better picture; as you push harder and harder, the image starts to suffer.
My desktop computer monitor has the capability (and the specs) to display 1600 x 1200 @ 72 Hz. However, it looks like crap. Run it at 1280 x 1024 @ 70 Hz and it looks much better. (Go down to 1024 x 768 and it looks even better, but I hate that res.) The reason for this is easy and complex at the same time.
As the display runs at a higher and higher frequency, it has to "draw" pixels faster and faster. There is a point where it starts to blur them together. In essence, the rise time starts to fall off, and one pixel blends into the next. When that happens, the image loses "crispness." It is a simple matter of physics.
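A rough sketch of that pixel-time math, comparing the two modes I mentioned (the 25% blanking overhead is my assumption for a typical CRT mode, not a figure from anywhere above):

```python
BLANKING = 1.25  # assumed ~25% overhead for horizontal/vertical blanking

def pixel_clock_hz(h, v, refresh):
    """Approximate pixel clock needed to drive h x v at the given refresh rate."""
    return h * v * refresh * BLANKING

def pixel_time_ns(h, v, refresh):
    """Time available to 'draw' one pixel, in nanoseconds."""
    return 1e9 / pixel_clock_hz(h, v, refresh)

print(round(pixel_time_ns(1600, 1200, 72), 1))  # ~5.8 ns per pixel
print(round(pixel_time_ns(1280, 1024, 70), 1))  # ~8.7 ns per pixel
```

The exact numbers depend on the real blanking intervals, but the point survives any reasonable assumption: the higher mode leaves a lot less time per pixel, so rise time matters a lot more.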
The other (albeit smaller) part of the equation is the system as set up: cables, connectors (VGA, bad!), switchers, DAs. They all contribute, and again, as the frequency gets higher, they all start to noticeably affect the image.
Think of this analogy: remember the game where you whisper something to somebody and it gets passed on down the line? Video systems kind of work the same way. (In this example we are using the "quick brown fox" sentence.) If the sentence is too slow, we get frustrated waiting; this is like too low a signal frequency, where we can see pixels and we do not like it. If the sentence is too fast, everything blends together. At some point, though, there is a speed that is perfect for intelligibility, and everything is good. But we must not forget what happens when there is inconsistency in our line. Let's say one guy is a little slow: there is a bottleneck at that point, an error is introduced, and it continues down the line.
For bandwidth calculations go here...
http://www.extron.com/technology/archive.asp?id=vidband
http://www.extron.com/technology/arc...p?id=bandwidth
So if you were to take a signal at some resolution X...
X @ 72 Hz: BW = 100 MHz
X @ 48 Hz: BW = 66 MHz
So in essence, the device is drawing each individual pixel with 50% more time (the 48 Hz signal needs a third less bandwidth). The pixels are cleaner, MUCH sharper, and basically mo' betta.
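You can check that comparison with a couple of lines (taking the 100 MHz figure above as the given example value):

```python
bw_72 = 100e6            # Hz; example bandwidth at 72 Hz from above
bw_48 = bw_72 * 48 / 72  # same resolution at 48 Hz -> ~66.7 MHz
extra_time = bw_72 / bw_48 - 1  # fractional extra draw time per pixel -> ~0.5

print(f"48 Hz bandwidth: {bw_48 / 1e6:.1f} MHz, "
      f"extra draw time per pixel: {extra_time:.0%}")
```

Note the two numbers are different because bandwidth and time per pixel are reciprocals: a one-third drop in bandwidth is a 50% gain in per-pixel time.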
Of course, the final point on this is that you never have more resolution than you started with. So the real signal frequency is not really higher than about 7 MHz (in the case of NTSC/PAL).
So to test a system, do two things. First, get a test-pattern program on a computer (HTPC) and feed an alternating-pixel pattern to the display at your chosen frequency; if you CANNOT discern individual pixels, then you have passed the practical bandwidth of the display. Second, get VE or AVIA, put up a multiburst pattern, and see if the resolution is clear all the way out.
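If you don't have a test-pattern program handy, generating that alternating-pixel pattern yourself is trivial. A minimal sketch (the PGM output format, filename, and 1280 x 1024 size are arbitrary choices on my part; display the file unscaled and full-screen):

```python
def alternating_pattern(width, height):
    # One-pixel-on / one-pixel-off columns: the highest horizontal
    # frequency the resolution can carry.
    return [[255 if x % 2 == 0 else 0 for x in range(width)]
            for _ in range(height)]

def write_pgm(path, pixels):
    # Plain-text (P2) PGM: header is width, height, then max gray value.
    with open(path, "w") as f:
        f.write(f"P2\n{len(pixels[0])} {len(pixels)}\n255\n")
        for row in pixels:
            f.write(" ".join(str(p) for p in row) + "\n")

write_pgm("alternating_test.pgm", alternating_pattern(1280, 1024))
```

If that column pattern smears into uniform gray on screen, one pixel is bleeding into the next and you are past the display's practical bandwidth.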
Just trying to go to as high a res as possible will not always translate to the best possible picture. Sometimes it is best to back off a bit.
Of course, this is just my $.02 worth.