"Effective resolution" always seems too vague IMO, so I prefer "resolvable detail," as outlined earlier. During mid-90s ATSC approval testing (based on table 2.3), using the best gear then available, 720p delivered 1139X550 resolvable detail for static B&W test-pattern images and 1068X420 for moving (5-rpm) B&W test patterns. In both cases the format resolution remained fixed at 1280 horizontal samples/pixels on each of the 720 scan (or row) lines. Since static test patterns require less compression than moving images, it's likely, IMO, that actual motion video has even less resolvable detail, because MPEG-2 discards high-frequency detail to achieve compression. ATSC compressed video (OTA) typically runs at less than ~17 million bits per second (Mbps), versus roughly 1,000 Mbps uncompressed for the original samples.
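To see where those two bitrate figures come from, here's a back-of-the-envelope sketch. The assumptions are mine, not from the test reports: 720p at 60 frames/s with 4:2:2 chroma subsampling (two samples per pixel on average) and 10 bits per sample, roughly matching studio practice for the 1280X720 format.

```python
# Rough arithmetic behind the "~1,000 Mbps uncompressed vs ~17 Mbps OTA" figures.
# Assumed parameters (not from the original post): 60 fps, 4:2:2, 10-bit samples.

width, height, fps = 1280, 720, 60
samples_per_pixel = 2      # 4:2:2 -> one luma + (on average) one chroma sample per pixel
bits_per_sample = 10

uncompressed_mbps = width * height * fps * samples_per_pixel * bits_per_sample / 1e6
print(f"Uncompressed: about {uncompressed_mbps:.0f} Mbps")   # about 1106 Mbps

ota_mbps = 17  # typical ATSC over-the-air ceiling cited above
print(f"Compression ratio: roughly {uncompressed_mbps / ota_mbps:.0f}:1")
```

A compression ratio on the order of 65:1 is why MPEG-2 has to throw away so much high-frequency detail.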
When sideconverted from 720p to 1080i, the resolvable detail can't exceed what was captured originally. That is, even though the format resolution has shifted to 1920X1080i (with resolvable detail still varying with motion within each image frame), the additional electronically generated pixels won't supply any more actual picture detail. Nevertheless, sideconverted images might appear "sharper," just as putting more pixels on your computer screen by boosting the format resolution makes images seem sharper. This recent article outlines the role of sharpness in HD viewing.
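A toy sketch of why those extra pixels can't add detail (my illustration, not from the article): when a scaler interpolates a 1280-pixel row up to 1920 pixels, every new pixel is just a weighted average of neighboring original pixels, so it carries no information that wasn't already in the source row.

```python
# Minimal linear-interpolation upscale of one row of pixel values.
# Every output sample is a blend of two input samples -- no new detail appears.

def upscale_row(row, new_len):
    """Linearly interpolate a row of pixel values up to new_len samples."""
    old_len = len(row)
    out = []
    for i in range(new_len):
        pos = i * (old_len - 1) / (new_len - 1)  # map output position onto source
        lo = int(pos)
        hi = min(lo + 1, old_len - 1)
        frac = pos - lo
        out.append(row[lo] * (1 - frac) + row[hi] * frac)
    return out

source = [10, 200, 10, 200]        # tiny stand-in for a 1280-pixel row
upscaled = upscale_row(source, 6)  # 4 -> 6 samples, the same 1280 -> 1920 ratio
print(upscaled)
```

Note that no interpolated value ever falls outside the range of the original samples; the "extra" pixels are fully determined by what was captured, which is why sideconversion can look sharper without resolving anything new.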
Sometimes sideconverted 720p-to-1080i appears more colorful on my 64" 1080i CRT RPTV screen. Just speculation, but that might be because three of the four ATSC approval tests (link above) showed 720p delivering more color resolution than 1080i, even though the latter format has roughly double the spatial format resolution. It seems reasonable that this enhanced color detail would carry over into the 1080i format during sideconversion. -- John