Okay, if I'm understanding your question correctly, you're confusing the resolution the projector is outputting (1920x1080, its native resolution) with the resolution it is accepting (anything from 480i up to 1080p, typically). So even though the program you were watching was reported as 720p, that only means the incoming signal was detected as 720p; what the projector was actually putting on the screen HAD to be 1920x1080, since that's the only resolution its panel can display.

As I see it, one of two things is happening: either you didn't lock your cable box's output resolution at 480p, so it's passing each channel through at its native broadcast resolution, or your receiver is upscaling the incoming signal and sending it out at the projector's native resolution. Either way, the projector handles it: a 1080i signal only needs to be deinterlaced, while a 480 or 720 signal has to be upscaled to 1080 and, if the incoming signal is interlaced, deinterlaced to progressive as well.
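If it helps to see the logic spelled out, here's a rough, purely illustrative sketch (Python, not any actual projector's firmware) of the decision a fixed-panel display makes for every incoming signal; the resolutions and function name are just made up for the example:

# Illustrative sketch only: whatever comes in, the output is always
# the panel's native 1920x1080 progressive raster.
NATIVE_WIDTH, NATIVE_HEIGHT = 1920, 1080  # the panel is fixed at this

def process_signal(width: int, height: int, interlaced: bool) -> list[str]:
    """Return the processing steps applied to an incoming signal."""
    steps = []
    if interlaced:
        # 1080i, 480i, etc. must become progressive before display
        steps.append("deinterlace to progressive")
    if (width, height) != (NATIVE_WIDTH, NATIVE_HEIGHT):
        # 480p, 720p, etc. must be scaled up to fill the native panel
        steps.append(f"upscale {width}x{height} -> {NATIVE_WIDTH}x{NATIVE_HEIGHT}")
    return steps or ["display as-is (already native 1080p)"]

print(process_signal(1280, 720, False))   # 720p broadcast: upscale only
print(process_signal(1920, 1080, True))   # 1080i from a cable box: deinterlace only
print(process_signal(1920, 1080, False))  # 1080p source: nothing to do

The point is just that "720p" on the info screen describes the input, never the output.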
Does that help?