With 480i output, the projector does the deinterlacing to 480p and then the scaling to 1080p. With 480p output, the player does the deinterlacing and the projector only scales to 1080p. So the difference comes down to which device has the better deinterlacer. Bad deinterlacing is fairly obvious if you know what to look for.
IIRC, for 1080i output the player would deinterlace to 480p, scale to 1080p, and from that produce 1080i output. The projector then has to deinterlace that 1080i back to 1080p. That's extra work compared to outputting 480i or 480p, and the rule of thumb is that less video processing is better.
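To make the deinterlacing step concrete, here is a toy sketch (purely illustrative, not any player's or projector's actual algorithm) of the two basic strategies: "weave" combines both fields into one frame and looks perfect on static content but shows combing on motion, while "bob" line-doubles a single field, avoiding combing at the cost of vertical resolution. A bad deinterlacer is one that picks the wrong strategy for the content.

```python
# Lines of a frame, represented as simple labels. The top field holds the
# even lines (0, 2, ...) and the bottom field holds the odd lines (1, 3, ...).

def weave(top_field, bottom_field):
    """Interleave top- and bottom-field lines into one full frame."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.extend([t, b])
    return frame

def bob(field):
    """Double each line of a single field to fill a full frame."""
    frame = []
    for line in field:
        frame.extend([line, line])
    return frame

top    = ["T0", "T1"]   # frame lines 0 and 2
bottom = ["B0", "B1"]   # frame lines 1 and 3

print(weave(top, bottom))  # ['T0', 'B0', 'T1', 'B1']
print(bob(top))            # ['T0', 'T0', 'T1', 'T1']
```

Real deinterlacers do this adaptively, per pixel or per region, which is where the quality differences between devices come from.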
DVDs are encoded in YCbCr 4:2:0, a way of subsampling the color components to save space. The MPEG decoder in a DVD player converts the YCbCr 4:2:0 to YCbCr 4:2:2 via a process called chroma upsampling. Ultimately this YCbCr 4:2:2 has to be converted to RGB for display by the projector, and to get to RGB you first have to get to YCbCr 4:4:4 (full-resolution color components). Converting back and forth between YCbCr 4:4:4 and RGB is just simple math, but it can truncate some values in some circumstances. So, sometimes it does make a difference where these color space conversions take place in the video output chain.
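A minimal sketch of that "simple math," using the full-range BT.601 conversion formulas (the helper names are mine, and real players work in limited range with higher internal precision). The point it demonstrates is the truncation: rounding to 8-bit integers at each conversion step means some values don't survive a round trip exactly.

```python
# Full-range BT.601 RGB <-> YCbCr, rounded to 8-bit integers at each step
# to mimic the precision loss a real conversion stage can introduce.

def rgb_to_ycbcr(r, g, b):
    """RGB (0-255) -> YCbCr 4:4:4, rounded to integers."""
    y  = round( 0.299    * r + 0.587    * g + 0.114    * b)
    cb = round(-0.168736 * r - 0.331264 * g + 0.5      * b + 128)
    cr = round( 0.5      * r - 0.418688 * g - 0.081312 * b + 128)
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    """Inverse conversion, clamped and rounded back to 8-bit RGB."""
    clamp = lambda v: max(0, min(255, round(v)))
    r = clamp(y + 1.402    * (cr - 128))
    g = clamp(y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128))
    b = clamp(y + 1.772    * (cb - 128))
    return r, g, b

# Neutral gray round-trips exactly...
print(ycbcr_to_rgb(*rgb_to_ycbcr(128, 128, 128)))  # (128, 128, 128)
# ...but pure green picks up a one-step error in blue:
print(ycbcr_to_rgb(*rgb_to_ycbcr(0, 255, 0)))      # (0, 255, 1)
```

Each extra conversion in the chain is another chance for this kind of rounding error to accumulate, which is why it can matter whether the player or the projector does it.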
That's all I'm going to say because it gets somewhat complicated and requires test discs and such. Bottom line is that if you like what you see, use it.
If you want to investigate and learn more about this and display calibration, a good place to start is the Spears and Munsil website. Even though their test disc is a BD, some of their white papers describe things in general terms before using examples from their disc, and some of their test patterns also exist on SD calibration discs. http://www.spearsandmunsil.com/