You don't call 480i picture 240p, do you?
Actually, barring a few Genesis games, pretty much every console game made up until the PlayStation era was literally 240p: the consoles never told the TV to draw anything but the odd scanlines, which is why some digital TVs have trouble with old consoles. Just because the console transmitted the video in a format capable of 480 lines of resolution doesn't mean the games actually transmitted 480 lines of unique information.
Please don't take my word for it; even though the vast majority of my job is dealing with 1080i content, do the legwork yourself and understand why the claim "1080i = 540p" is garbage.
Converting from 720p at 60fps is one of the situations where 1080i does effectively equal 540p. When the source is 720p at 24fps or 30fps, it is possible to preserve the original 720 lines of resolution across 2 or more fields. Unfortunately, that is not possible with a 60fps source, since you are limited to one field per frame. Because you cannot use 2 fields to represent one frame, each original 720-line frame has to be squeezed into a single 540-line field, so you effectively lose 180 lines of vertical resolution when converting 720p 60fps to 1080i.
Now, if you want to argue that the additional resolution might not be noticeable while playing a game, you might have a point. But technically, converting 720p 60fps to 1080i causes 1080i to effectively become 540p.
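To make the line budget concrete, here's a minimal sketch of the arithmetic (the variable names are mine, not from any broadcast standard):

```python
# Hypothetical sketch: the vertical line budget when downconverting
# 720p60 to 1080i. 1080i delivers 60 fields per second, and each
# field carries half of the 1080 lines.
source_lines_per_frame = 720
field_lines = 1080 // 2            # 540 lines per 1080i field
frames_per_second = 60
fields_per_second = 60

# One field per source frame: the 720 source lines must be resampled
# into 540 field lines, so 180 lines are lost per frame.
lines_lost_per_frame = source_lines_per_frame - field_lines
fraction_lost = lines_lost_per_frame / source_lines_per_frame

print(lines_lost_per_frame)  # 180
print(fraction_lost)         # 0.25
```

With a 24fps or 30fps source, two or more fields are available per frame, so this per-field loss doesn't apply.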
To put it another way:
1280 x 720 x 24 frames/second = 22,118,400 pixels/second
1280 x 720 x 30 frames/second = 27,648,000 pixels/second
1280 x 720 x 60 frames/second = 55,296,000 pixels/second
1280 x 540 x 60 fields/second = 41,472,000 pixels/second
There are only 1280 horizontal pixels rendered in the original frame, which is why the 1080i line uses 1280 rather than 1920. As you can see, 1080i has enough pixel throughput to preserve a 24fps or 30fps source, but not enough to preserve 60fps. You lose 25% of your vertical resolution (180 of 720 lines) when converting 720p 60fps to 1080i.
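The figures above are easy to check; a quick sketch, keeping the 1280-pixel width of the source frame as the post does:

```python
# Recompute the pixel-rate table for a 1280-wide source.
def pixel_rate(width, height, rate):
    """Pixels delivered per second: width x height x frames (or fields) per second."""
    return width * height * rate

p720_24 = pixel_rate(1280, 720, 24)  # 720p at 24 frames/second
p720_30 = pixel_rate(1280, 720, 30)  # 720p at 30 frames/second
p720_60 = pixel_rate(1280, 720, 60)  # 720p at 60 frames/second
i1080   = pixel_rate(1280, 540, 60)  # 1080i: one 540-line field every 1/60 s

print(p720_24)  # 22118400
print(p720_30)  # 27648000
print(p720_60)  # 55296000
print(i1080)    # 41472000
```

Note that 1080i's 41.5M pixels/second comfortably exceeds the 24fps and 30fps rates, but falls about 25% short of the 55.3M pixels/second that 720p60 requires.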