The 1080i your cable box sends out is the same number of pixels that your 1080p TV has.
There still seems to be some confusion about the difference between 1080i and 1080p. Both are 1,920x1,080 resolution. Both have 2,073,600 pixels. From one perspective, 1080i actually has an edge over Blu-ray: its 60 fields per second capture smoother motion than Blu-ray's 24 frames. And you can't even get a full 1080p/60 source (other than a PC). True, 1080i and 1080p aren't the same thing, but they are the same resolution. Let the argument commence...
Because our TV world is based around 60Hz, and because there's a limit to how much resolution can be transmitted over the air (due to bandwidth and MPEG compression), the two main HDTV resolutions are 1080i and 720p. Let's start with 720p, as it's the easier of the two to understand.
OK, 720p is 1,280x720 pixels, running at 60 frames per second (fps). This is the format used by ABC, Fox, and their various sister channels (like ESPN). I've seen some reader comments in response to other articles I've written ridiculing ABC/Fox for this "lower" resolution, but that's unfair in two big ways. First, in the late '90s when all this was happening, there were no 1080p TVs. Sure, we all knew they were coming, but it was years before they started shipping (now, almost all TVs are 1080p). Second, sports. Both ABC and Fox have big sports divisions, which played a big role in their decision to go with 720p. This is because, when it comes down to it, fast motion looks better at 60 fps (more on this later).
The 1080i designation is 1,920x1,080 pixels, running at 30 frames per second. This is what CBS, NBC, and just about every other broadcaster uses. The math is actually pretty simple: 1080 at 30 fps is roughly the same amount of data as 720 at 60.
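That "roughly the same" claim is easy to check with back-of-the-envelope arithmetic. Here's a quick sketch comparing raw pixel throughput of the two broadcast formats (illustrative only; real over-the-air bandwidth also depends heavily on MPEG compression):

```python
# Compare raw pixel throughput of the two broadcast HD formats.
# This is illustrative arithmetic; actual bandwidth depends on compression.

def pixels_per_second(width, height, fps):
    """Raw pixel throughput for a given resolution and frame rate."""
    return width * height * fps

hd_1080i = pixels_per_second(1920, 1080, 30)  # 30 full frames/sec (as 60 fields)
hd_720p = pixels_per_second(1280, 720, 60)    # 60 full frames/sec

print(f"1080i/30: {hd_1080i:,} pixels/sec")  # 62,208,000
print(f"720p/60:  {hd_720p:,} pixels/sec")   # 55,296,000
```

So the two formats move a comparable number of pixels per second, which is why both fit in a broadcast channel; they just trade spatial resolution against frame rate.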
How, you might ask, does this 30 fps work on TVs designed for 60? With modern video processing, the frame rate doesn't matter much. Back in the olden days of the '90s, however, we weren't so lucky. The 1080 image is actually "interlaced." That's where the "i" comes from. What this means is that even though there are 30 frames every second, they are sent as 60 fields. Each field is 1,920x540 pixels, sent every 60th of a second. Of the 1,080 lines of pixels, the first field has all the odd lines, the second field has all the even lines. Your TV combines these together to form a complete frame of video.
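The odd/even split described above can be sketched in a few lines. This is a minimal illustration, assuming a frame is simply a list of pixel rows (the row names are stand-ins, not real video data):

```python
# A minimal sketch of interlacing: one progressive frame is split into
# two fields, the first carrying the odd-numbered lines, the second the even.

def split_into_fields(frame):
    """Split one frame (a list of rows) into (odd-line field, even-line field)."""
    odd_field = frame[0::2]   # lines 1, 3, 5, ... (indices 0, 2, 4, ...)
    even_field = frame[1::2]  # lines 2, 4, 6, ...
    return odd_field, even_field

frame = [f"line {n}" for n in range(1, 1081)]  # a stand-in 1,080-line frame
odd, even = split_into_fields(frame)
print(len(odd), len(even))  # 540 540
```

Each field is half the height of the full frame, which is why a 1080i field is 1,920x540: half the lines, twice as often.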
What about 1080p?
Your 1080p TV accepts many different resolutions, and converts them all to 1,920x1,080 pixels. For most sources, this is from a process known as upconversion. Check out my article, appropriately called "What is upconversion?" for more info on that process.
When your TV is sent a 1080i signal, however, a different process occurs: deinterlacing. This is when the TV combines the two fields into frames. If it's done right, the TV repeats each full frame to create 60 "fps" from the original 30.
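One common deinterlacing approach is "weave": interleave the two fields back into a full frame, then repeat each frame to fill 60 refreshes per second. Here's a minimal sketch of that idea, again treating a field as a list of rows (real TVs use far more sophisticated, motion-adaptive processing):

```python
# A minimal "weave" deinterlacing sketch: two 540-line fields are
# interleaved back into one 1,080-line frame, and each frame is
# repeated to turn 30 frames/sec into 60 refreshes/sec.

def weave(odd_field, even_field):
    """Interleave an odd-line field and an even-line field into one frame."""
    frame = [None] * (len(odd_field) + len(even_field))
    frame[0::2] = odd_field   # odd lines go to indices 0, 2, 4, ...
    frame[1::2] = even_field  # even lines go to indices 1, 3, 5, ...
    return frame

def deinterlace_to_60(field_pairs):
    """Weave each (odd, even) field pair, then show every frame twice."""
    output = []
    for odd, even in field_pairs:
        full = weave(odd, even)
        output.extend([full, full])  # 30 woven frames become 60 displayed ones
    return output
```

Weave works perfectly for static scenes; for fast motion, the two fields were captured 1/60th of a second apart, which is where the trickier (and TV-specific) processing comes in.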
What's the bottom line?
While 1080i and 1080p have the same number of pixels, they do have different frame rates (and one is interlaced). The reality is, other than PC games, there isn't any commercially available "real" 1080p/60 content. It's either 1080i content deinterlaced by your TV, 1080p/24 content from Blu-ray, or upconverted content from console games.
That's not to say it wouldn't be great if we did have 1080p/60, but the slightly better motion detail would not be a huge, noticeable difference. In other words, you're not really missing out on anything with 1080i.