So would my statement be good enough for government work or would someone look at me and say:
wow, dude, you're a moron.
You are somewhere in between. I will take a shot at it.
Movies are shot at 24 fps. On these new BD/HD-DVD formats, they are usually stored on disc as 24fps 1080p. There are basically 3 ways to transmit that to a 1080p display:
1. 24fps. Available on only a few 1080p displays. This lets the display simply repeat each frame 3x or 5x to display at 72Hz/120Hz, eliminating the 3:2 judder.
2. 60 frames/sec, progressive. Every other frame is shown 3 times rather than 2, which converts from 24 to 60 frames per second.
3. 60 fields/sec, interlaced. Each of the original 24 frames is split into 2 fields. (It's not converted to 30 frames like you say above.) Then, for every 4 frames (8 fields), some of the fields are repeated in a 3:2 pattern to get 10 fields, which takes you from 48 fields/sec to 60 fields/sec. (There's a short sketch of the counting below the list.)
[edited, "10 fields", not "12 fields"]
Now, *if* the TV's deinterlacer is capable of detecting this pattern, it can tell which fields are repeats, discard them, and reassemble the original frames. It can then display them with whatever pattern it wants (3:2 to get 60 frames/sec, or some multiple of 24), and in that case the result should end up exactly the same as if you had sent progressive in the first place. But it's not a given that every TV's deinterlacer has this film cadence detection for 1080i sources. Some magazine articles (e.g. Merson's article in Home Theater Mag) suggest that a fair number of TVs aren't doing this properly and fall back to video-mode deinterlacing instead. In those cases progressive from the player would be better, although many people probably wouldn't be able to tell the difference on most scenes.
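To make "discard the repeats and reassemble the frames" concrete, here's a toy Python sketch of that inverse-telecine step, using the same made-up labels as above. A real TV obviously doesn't get frame labels; it has to spot repeated fields by comparing field contents, and that detection is exactly the part some deinterlacers get wrong.

```python
def inverse_telecine(fields):
    """Reassemble 24p frames from a 3:2 field sequence by discarding repeats.

    'fields' is a list of (frame_id, 't'/'b') pairs.  This sketch cheats by
    using the frame label to stand in for the pixel comparison a real
    deinterlacer would have to do.
    """
    frames = []
    seen = {}  # frame_id -> set of field parities already collected
    for frame_id, parity in fields:
        got = seen.setdefault(frame_id, set())
        if parity in got:
            continue                  # repeated field from the cadence: throw it away
        got.add(parity)
        if got == {"t", "b"}:
            frames.append(frame_id)   # both fields present: weave the original frame
    return frames


# 10-field 3:2 sequence for four frames A-D (same pattern as the sketch above)
fields_60i = [("A", "t"), ("A", "b"), ("A", "t"),
              ("B", "t"), ("B", "b"),
              ("C", "t"), ("C", "b"), ("C", "t"),
              ("D", "t"), ("D", "b")]
print(inverse_telecine(fields_60i))   # ['A', 'B', 'C', 'D'] -- back to the original frames
```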