Using 1080i cuts the bandwidth required for 1080p roughly in half, which makes 1080i's bitrate requirements comparable to those of 720p. Broadcasters prefer this approach because it makes 1920x1080 cheap enough that they can still carry sub-channels alongside that picture format. Using 1080p would consume most of the ~19 Mbps an ATSC channel offers, which is an inefficient use of bandwidth for most networks.
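The bandwidth comparison follows from the raw pixel rates: an interlaced field carries only half the rows of a full frame. Here's a small sketch of that arithmetic (`pixel_rate` is a hypothetical helper, and uncompressed pixel rate is only a rough proxy for compressed bitrate):

```python
# Raw luma pixel rates as a rough proxy for pre-compression bandwidth.
def pixel_rate(width, height, rate, interlaced=False):
    """Pixels delivered per second; each interlaced field carries half the rows."""
    rows_per_scan = height // 2 if interlaced else height
    return width * rows_per_scan * rate

rate_1080i60 = pixel_rate(1920, 1080, 60, interlaced=True)  # 60 fields/s
rate_720p60  = pixel_rate(1280, 720, 60)                    # 60 frames/s
rate_1080p60 = pixel_rate(1920, 1080, 60)                   # 60 frames/s

print(f"1080i60: {rate_1080i60 / 1e6:.1f} Mpx/s")  # 62.2
print(f"720p60:  {rate_720p60 / 1e6:.1f} Mpx/s")   # 55.3
print(f"1080p60: {rate_1080p60 / 1e6:.1f} Mpx/s")  # 124.4
```

1080i60 lands in the same ballpark as 720p60, while 1080p60 needs exactly twice the pixel rate of 1080i60, which is why full 1080p would crowd out sub-channels.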
As for the quality question, 1080i60 delivers 60 fields per second, not 60 frames per second. Because each field scans only every other row, no row of pixels is updated in every field; during fast motion the vertical resolution of 1080i is effectively halved to 1920x540, which is worse vertical resolution than 1280x720 offers. This is why 1080i is often downscaled to 720p: since 1080i lacks the full detail of 1080p, downscaling it costs less quality than downscaling true 1080p to 720p would (and keeping it at 1080 lines can give a false sense of how much detail is actually present).
It is true that a 1080i signal must be deinterlaced to be displayed on an LCD or plasma TV, as only CRTs are capable of showing interlaced content; however, the quality of deinterlaced video is lower than that of native progressive content, as deinterlacing often requires either interpolation or blending to remove combing artifacts from misaligned fields.
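The combing problem and the blending fix can be illustrated with a toy deinterlacer. This is a minimal sketch with hypothetical helpers (`weave`, `blend`), not any particular TV's algorithm: the two fields are woven into one frame, then vertically adjacent rows are averaged to soften the combing left by motion between field times.

```python
def weave(top_field, bottom_field):
    """Interleave field rows into one frame: top field on even rows, bottom on odd."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame.append(t)
        frame.append(b)
    return frame

def blend(frame):
    """Average each row with the next to suppress combing from misaligned fields."""
    out = []
    for i in range(len(frame)):
        nxt = frame[min(i + 1, len(frame) - 1)]  # last row has no successor
        out.append([(a + b) / 2 for a, b in zip(frame[i], nxt)])
    return out

# Two 2-row fields of a 4-row frame; the scene brightened between field times,
# so the woven frame alternates 10/30 rows -- that alternation is combing.
top    = [[10, 10], [10, 10]]  # sampled at time t
bottom = [[30, 30], [30, 30]]  # sampled at time t + 1/60 s
print(blend(weave(top, bottom)))
```

Blending removes the row-to-row alternation, but only by averaging samples taken 1/60 s apart, which is exactly the softness that makes deinterlaced video inferior to native progressive content.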