Originally posted by DanielSmi
I was under the impression that 720p took up more bandwidth than 1080i because, since it's progressive, it has twice as many fields. If 720p is equal to 1480i, then why would a projector have more trouble resolving 1080i yet be able to do 720p? Can you correct me if I'm wrong, please?
720p is 1280x720 at 60 Hz, progressive scan.
1080i is 1920x1080 at 30 Hz (30 full frames per second, delivered as 60 interlaced fields).
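To put rough numbers on the bandwidth question (counting only active pixels and ignoring blanking intervals and overhead):

720p: 1280 x 720 x 60 frames/sec = about 55.3 million pixels per second
1080i: 1920 x 1080 x 30 frames/sec = about 62.2 million pixels per second

So the raw pixel rates are actually quite close, with 1080i slightly higher - 720p is not double the bandwidth of 1080i.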
The problem is that while a 1280x720 chip can display the full bandwidth and resolution of 720p, it cannot display a 1080i signal at its full resolution - the 1920x1080 image has to be scaled down to fit the panel. Interlacing halves the effective refresh rate in order to increase resolution.
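To put numbers on that: one 1080i frame carries 1920 x 1080 = about 2.07 million pixels, while a 1280x720 panel has only about 0.92 million pixels, so roughly half the detail in the 1080i signal gets thrown away when it is scaled down to the chip.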
This is only a problem with fixed-pixel projectors - DLP, LCD, D-ILA. It is not a problem with CRT and GLV. Fixed-pixel devices cannot change their native resolution and therefore have a hard limit on their ability to resolve higher resolutions. Raster-based projectors do not have this limit, although a CRT does reach a point where the tube face cannot effectively resolve any further detail. That point isn't a hard-set number, but rather an approximate one, like asking "what is the resolution of 35mm film?"