
Registered · 46 Posts · Discussion Starter · #1
On another gaming thread, the moderator says that 1080p really doesn't matter since there are no real sources for it. For games, the memory requirement is too high, so they will stay at 720p or 1080i. For movies, everything is shot at 24 fps, so the 60 fps of 1080p will not be discernible. And TV, because of bandwidth limits, will not really be transmitted in 1080p for a while.
 

Registered · 38 Posts
I have come to the same conclusion as telemike. Give me a good source (720p or 1080i) on my 768p set, and it looks absolutely beautiful. Sure, 1080p with a solid 1080p source looks a little better, but from my 8-9 foot viewing distance on my 42" TV, there is virtually no difference.
 

AVS Forum Special Member · 11,139 Posts
If you start with the assumption that 1080p means displays with 1920x1080 resolution, capable of deinterlacing 1080i to 1080p, then 1080p matters, because such displays, assuming they're really capable of full native resolution, can present the maximum theoretical resolution (which requires oversampling/downconversion, except for test patterns). But the assumption that 1080p means only progressive source material involves lots of limitations, at least currently. -- John
 

Registered · 6,194 Posts
My TV is a 720p set (DLP, 3.5 years old). It's certainly "good enough" for what I've used it for: Xbox 360, broadcast ATSC, D* HD-lite, and more recently digital cable HD, plus upconverted SD DVDs. However, since my primary source feeding the TV is a home theater PC, I think 1080p will be worthwhile in my next display. I'll wait until BD/HD-DVD drives are available as inexpensive computer drives (and not a workaround like the current options), and until other factors make upgrading the HTPC parts sensible. I'm sure 1080p will suit such an HTPC setup nicely.
 

Registered · 52 Posts
The only real source for 1080p at this time is HD movies, i.e., Blu-ray or HD-DVD.


1080p might not become a big deal for a long time as there is not much reason for it.
 

Registered · 433 Posts
Who really has 1080p anyway? The $3000+ Sony XBR2 series doesn't. It will accept a 1080p signal and then "downrez" it to 1080i. The dead giveaway is when the advertising says "full 1080p resolution". All that means is 1920 x 1080 pixels. My guess is less than 1 in 5 "1080p" sets will actually accept and display the progressive-scan 1080p format. One place that's relevant now: it becomes a waste of money to buy a DVD player that upconverts to 1080p. Why? So your TV can knock it back down to 1080i anyway?
 

Registered · 29,399 Posts
1080i and 1080p both contain the same resolution: 1920x1080 pixels per frame. The difference is in how they're delivered to the screen, either as 1920x540 interlaced fields or as whole 1920x1080 progressive frames. When viewing on a native 1080p display, the 1080i signal is deinterlaced to reconstruct the original frames. Thus, no matter how the signal is delivered, you wind up with the same 1080p picture in the end.
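Here's a minimal sketch of that field weave, assuming the two fields of a frame really come from the same original progressive image (film-sourced 1080i); the function name and NumPy usage are just illustrative:

```python
import numpy as np

def weave_fields(top_field, bottom_field):
    """Rebuild a 1920x1080 frame from two 1920x540 fields by interleaving rows.

    This is lossless only when both fields were taken from the same original
    progressive frame (film-sourced 1080i); video-sourced 1080i needs
    motion-adaptive deinterlacing instead.
    """
    frame = np.empty((top_field.shape[0] * 2, top_field.shape[1]),
                     dtype=top_field.dtype)
    frame[0::2] = top_field      # even rows come from the top field
    frame[1::2] = bottom_field   # odd rows come from the bottom field
    return frame

# Split a fake 1080p frame into its two fields, then weave it back together.
original = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
top, bottom = original[0::2], original[1::2]
assert np.array_equal(weave_fields(top, bottom), original)   # nothing lost
```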
 

Registered · 10,688 Posts
When my CBS affiliate broadcasts filmed material with the MPEG telecine flags, my HTPC will convert 1080i into 1080p at 24 frames per second. Some cable networks, like HBO, also send these flags during filmed content, giving receivers the ability to display the material at true 1080p.
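What the HTPC is doing there is inverse telecine. A rough sketch of the idea, using labelled dummy fields and a hard-coded 3:2 cadence instead of actually parsing the MPEG repeat_first_field flags (and glossing over field parity):

```python
def inverse_telecine(fields):
    """Rebuild 24 progressive frames per second from 60 fields per second.

    With a 3:2 cadence, every group of 5 fields carries 2 unique film frames
    plus one repeated field; drop the repeat and pair up the rest for weaving.
    """
    frames = []
    for i in range(0, len(fields) - 4, 5):
        a_top, a_bot, b_top, b_bot, _repeat = fields[i:i + 5]
        frames.append((a_top, a_bot))   # first film frame of the group
        frames.append((b_top, b_bot))   # second film frame of the group
    return frames

# 60 labelled fields (one second of 1080i) -> 24 reconstructed film frames.
one_second = [f"field{i}" for i in range(60)]
print(len(inverse_telecine(one_second)))   # 24
```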
 

Registered · 1,824 Posts

Quote:
It will accept a 1080p signal and then "downrez" to 1080i

Huh? Actually it's (kinda) the other way 'round - everything is upscaled to 1080p. The confusion might lie in the fact that many sets will not accept a 1080p signal - the Sony will, but only over the HDMI input (common).
 

Registered · 282 Posts

Quote:
Originally Posted by John Mason


If you start with the assumption that 1080p means displays with 1920x1080 resolution, capable of deinterlacing 1080i to 1080p, then 1080p matters, because such displays, assuming they're really capable of full native resolution, can present the maximum theoretical resolution (which requires oversampling/downconversion, except for test patterns). But the assumption that 1080p means only progressive source material involves lots of limitations, at least currently. -- John

Agreed. It does matter.


Whether one will actually be able to see the difference depends upon the size of the set and their viewing distance. With my 50" 768p set and 10 foot viewing distance, it will essentially make no practical difference to me; i.e., I can't resolve the difference in resolution. Were I to increase screen size and/or decrease viewing distance, it would be more of an issue for me.
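As a rough sanity check of that, here's a back-of-the-envelope calculation assuming the usual rule of thumb that 20/20 vision resolves about one arcminute; the function and numbers are illustrative, not anything from the posts above:

```python
import math

def pixel_arcmin(diagonal_in, horiz_pixels, distance_ft, aspect=16 / 9):
    """Angular width of one pixel, in arcminutes, for a 16:9 display."""
    screen_width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_width_in = screen_width_in / horiz_pixels
    return math.degrees(math.atan2(pixel_width_in, distance_ft * 12)) * 60

# 50" set viewed from 10 feet: how big is a pixel at each resolution?
for name, px in [("1366x768", 1366), ("1920x1080", 1920)]:
    print(f"{name}: {pixel_arcmin(50, px, 10):.2f} arcmin per pixel")

# If both values land near or below ~1 arcmin (roughly 20/20 acuity), the
# extra resolution of 1080p is hard to see at that size and distance.
```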
 

Registered · 433 Posts
The business about accepting 1080p signals and then "downrez" to 1080i wasn't original to me. I played tag with Sony tech support and the exact quote was: "It is true that your television is having the capability of receiving the 1080p signals. However the output would not be 1080p as it is capable of displaying till 1080i signals. Please be informed that the display would be 'downrez' (lower the resolution) on your television." The English was a little bad, but my interpretation is that the XBR2 was never designed to display progressive scan 1080p or 1080p/24.
 

Registered · 1,710 Posts

Quote:
Originally Posted by JCarls


The business about accepting 1080p signals and then "downrez" to 1080i wasn't original to me. I played tag with Sony tech support and the exact quote was: "It is true that your television is having the capability of receiving the 1080p signals. However the output would not be 1080p as it is capable of displaying till 1080i signals. Please be informed that the display would be 'downrez' (lower the resolution) on your television." The English was a little bad, but my interpretation is that the XBR2 was never designed to display progressive scan 1080p or 1080p/24.

I think more than the English was bad. Something was lost in translation.
 

Registered · 1,824 Posts
Agreed - what he may have meant was that it wasn't designed to display 1080p/24, but it definitely is/was designed (and built) to display 1080p/60.

I know this because that's exactly how my HTPC is hooked up -
 

Registered · 46 Posts · Discussion Starter · #16
Pay attention, class. There will be a test at the end of this class.


1080p does not matter. Here's why:


There are a number of facts that must be grasped first:

1. All digital displays are progressive scan by nature.

2. Virtually all film releases are shot at 24 frames per second and are progressive scan.

3. 1080i delivers 30 frames per second, and 1080p delivers 60 frames per second.

4. All HDTV broadcasts and virtually all games will be limited to 720p or 1080i for the foreseeable future.

Got all that? Good. Now let's go into the explanation.


Movies


Take a movie. It's 24 frames per second, progressive scan. This is the nature of how movies are shot on film today. Just about all movies are shot this way; the only exceptions are films where the director or producer wants to make an artistic statement. But if you saw it at your local multiplex, it's in 24fps progressive.


Now, let's put it onto a disc so we can sell it. First, we scan each individual frame of the movie, one by one, at a super high resolution (far higher than even 1080p). This gives us a digital negative of the film, from which every digital version of the film will be made (this means the HD, DVD, on-demand, PPV, digital download, digital cable and PSP versions are all made from this one digital negative). We'll only concern ourselves with the HD version for now.


Because it's HD, we'll take the digital negative and re-encode it in MPEG-2, H.264, or VC-1 at 1920x1080 and 24 frames per second to match the source material. And this is how it is on the disc when you get it from the store, whether it's Blu-ray or HD-DVD.


Once you put it in your disc player to view the film, a number of things happen.


1080i/1080p

Because the film is at 24fps and 1080i is 30fps, every second the player has to come up with 6 additional frames to make up the gap. It does this through a process called 3:2 pulldown, whereby 4 film frames (1/6th of a second of the film) are processed to create 5 video frames (1/6th of a second on your TV screen). Exactly how this is done is outside the scope of this post, but the important thing to realize is that none of the picture data is lost in the process; it's just re-formatted.
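For the curious, a toy version of that cadence (field parity and the real flag signalling are glossed over; the labels are illustrative):

```python
def three_two_pulldown(film_frames):
    """Spread 24 fps progressive frames across 60 interlaced fields per second.

    Each pair of film frames contributes 2 fields then 3 (the 2:3 cadence),
    so 4 film frames become 10 fields, i.e. 5 interlaced video frames.
    """
    fields = []
    for i, frame in enumerate(film_frames):
        fields += [f"{frame}-top", f"{frame}-bottom"]
        if i % 2 == 1:                     # every second film frame...
            fields.append(f"{frame}-top")  # ...repeats one of its fields
    return fields

print(three_two_pulldown(["A", "B", "C", "D"]))   # 10 fields = 5 video frames
```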


Now, here's the crucial difference between 1080i and 1080p as it relates to movies. With 1080i transmission, the player interlaces the frames during the pulldown and sends the interlaced fields to the TV set to be deinterlaced. With 1080p transmission, the player never interlaces the frames. Regardless, you will get the exact same result. The only exception is if you have a crap TV that doesn't deinterlace properly, but chances are that TV won't support 1080p anyway.


So 1080p doesn't matter for movies.


Television


Television is a little different. Television is typically not shot on film, it's shot on video which is a vastly different technique. While movies are almost always shot at 24fps, standard-def NTSC TV is shot at 30fps interlaced, and HDTV is shot at whatever the production company decides, usually 1080i at 30fps, or 720p at 60fps, depending on the network. What, no 1080p? Nope. Why? Bandwidth.


The American ATSC standard gives each broadcaster 19.4Mbps to transmit video for each broadcast channel. Broadcasters are free to transmit as many streams as they want as long as the total bandwidth for all the channels does not exceed 19.4Mbps. Consider that one 1080i stream compressed using MPEG2 at decent quality takes up about 12Mbps. Now consider that an equivalent 1080p stream will take up twice that bandwidth. You can see why nobody does 1080p, and this situation will not change until a new encoding standard arrives, which won't happen for at least another decade.
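That argument as quick arithmetic (the 12 Mbps figure is the one above; the 2x factor assumes the same codec and quality at twice the pixel rate):

```python
ATSC_CHANNEL_MBPS = 19.4        # total MPEG-2 payload per broadcast channel
MBPS_1080I = 12.0               # "decent quality" 1080i figure from the post
MBPS_1080P60 = 2 * MBPS_1080I   # ~2x the pixel rate, so roughly 2x the bits

print(f"1080i fits in a channel: {MBPS_1080I <= ATSC_CHANNEL_MBPS}")      # True
print(f"1080p60 fits in a channel: {MBPS_1080P60 <= ATSC_CHANNEL_MBPS}")  # False
print(f"room left beside 1080i: {ATSC_CHANNEL_MBPS - MBPS_1080I:.1f} Mbps "
      "for subchannels")
```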


So 1080p doesn't matter for television.


Games


Ah, now we come to the heart of the matter. Games. The reason why there will be very few 1080p games is a simple one: lack of memory. All graphics cards, including those found in Xbox 360 and PS3, have what's known as a frame-buffer. This is a chunk of memory set aside to store the color information of every pixel that makes up a frame that will be sent to the screen. Every single calculation the graphics card makes is designed to figure out how to fill up the frame-buffer so it can send the contents of the frame-buffer to the screen.


Time to break out the calculators, because we're doing some math.


A 720p frame is 1280 pixels wide by 720 pixels high. That means one 720p frame contains 921,600 pixels. Today's graphics cards use 32-bit color for the final frame. This means each pixel requires 32 bits - 4 bytes - to represent its color information. 921,600x4 = 3,686,400 bytes or a little over 3.5MB.


A 1080i frame is 1920 pixels wide by 540 high. That's 1,036,800 pixels, 4,147,200 bytes or a little less than 4MB.


Now, a 1080p frame. 1920 wide by 1080 high. 2,073,600 pixels, 8,294,400 bytes, a smidgen less than 8MB.
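The same arithmetic as a short sketch (sizes in MiB); the four-render-target figure at the end is a made-up count, just to set up the next paragraph's point about multiple buffers:

```python
BYTES_PER_PIXEL = 4   # 32-bit colour, as in the post

def framebuffer_mib(width, height, buffers=1):
    """Size of `buffers` colour buffers at the given resolution, in MiB."""
    return width * height * BYTES_PER_PIXEL * buffers / (1024 * 1024)

for name, w, h in [("720p", 1280, 720), ("1080i field", 1920, 540),
                   ("1080p", 1920, 1080)]:
    print(f"{name:12s}{framebuffer_mib(w, h):5.2f} MiB per buffer")

# With, say, 4 render targets (a hypothetical count), 1080p alone eats:
print(f"1080p x 4 targets: {framebuffer_mib(1920, 1080, buffers=4):.1f} MiB")
```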


Ooh, but the 360 has 512MB, and the PS3 has 256MB for graphics. How is 8MB going to hurt? Oh, it hurts. Graphics cards will have several internal frame-buffers to handle different rendering passes, and each one requires memory. And the textures and mapping surfaces all have to fit within that same memory space. In the case of the 360, there's also audio and game data fighting for the same space (though the "space" is twice as big on Xbox 360.) That's why GTHD looked like crap, because in order to get it running in 1080p, they sacrificed most of the rendering passes and other effects.


This is why the vast, vast majority of Xbox 360 and PS3 next-gen games will stick to 1080i or 720p.


So 1080p doesn't matter for games.


In conclusion, 1080p does not matter. Period. If you think it does, you're just buying in to Sony's marketing hype.


Class dismissed.
 

Registered · 13,531 Posts

Quote:
Television


Television is a little different. Television is typically not shot on film, it's shot on video which is a vastly different technique. While movies are almost always shot at 24fps, standard-def NTSC TV is shot at 30fps interlaced, and HDTV is shot at whatever the production company decides, usually 1080i at 30fps, or 720p at 60fps, depending on the network. What, no 1080p? Nope. Why? Bandwidth.


The American ATSC standard gives each broadcaster 19.4Mbps to transmit video for each broadcast channel. Broadcasters are free to transmit as many streams as they want as long as the total bandwidth for all the channels does not exceed 19.4Mbps. Consider that one 1080i stream compressed using MPEG2 at decent quality takes up about 12Mbps. Now consider that an equivalent 1080p stream will take up twice that bandwidth. You can see why nobody does 1080p, and this situation will not change until a new encoding standard arrives, which won't happen for at least another decade.


So 1080p doesn't matter for television.

The above is in error.


Only live content is actually acquired at 1080i60 and 720p60. The overwhelming majority of all other high-def content is 1080p24 (on 1080i channels) or 720p24 (on 720p60 channels) with appropriate repeat flags. Content acquired at 1080p24 and rebroadcast using 1080i transmission actually requires less bandwidth than native 1080i60 video.
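A back-of-the-envelope way to see why, counting only the pixels the encoder has to code fresh (real gains also come from progressive film frames compressing better than interlaced video):

```python
# Unique pixels per second the encoder actually has to spend bits on.
live_1080i60 = 60 * 1920 * 540     # 60 distinct fields/s of live video
film_1080p24 = 24 * 1920 * 1080    # 24 film frames/s; the 3:2 repeat fields
                                   # carry no new picture data (repeat flags)

print(f"film-sourced 1080i codes {film_1080p24 / live_1080i60:.0%} of the "
      "unique pixels of live 1080i60")   # 80%, before compression advantages
```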


Most episodic programming on CBS, NBC, and HBO is done in 1080p24. If you have a display with quality deinterlacing, it will reconstruct the full 1080p image for every frame and deliver output like you would get from an HD-DVD or Blu-ray disc, albeit with higher compression. When buying a 1080p TV, it is very important to get one with quality deinterlacing, as this video illustrates. In his tests, Gary Merson has found that a substantial percentage of available displays do not properly reconstruct the original 1080p source from 1080i transmission -- most displays cannot detect the 3/2 cadence and drop 50% of the resolution during movement.
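A toy version of the cadence detection Merson is testing for (real deinterlacers compare pixel data between fields, not labels; this just shows the pattern being looked for):

```python
def looks_like_film(fields, window=10):
    """Guess whether recent fields carry a 3:2 pulldown cadence.

    In film-sourced 1080i the repeated field shows up again two field-times
    after its first appearance, once every 5 fields. A deinterlacer that
    spots this can weave the original film frames back together; one that
    misses it falls back to interpolation and halves the vertical detail
    whenever there's movement.
    """
    recent = fields[-window:]
    repeats = sum(1 for i in range(2, len(recent)) if recent[i] == recent[i - 2])
    return repeats >= window // 5   # expect roughly one repeat per 5 fields

cadence = ["A1", "A2", "B1", "B2", "B1", "C2", "C1", "D2", "D1", "D2"]
print(looks_like_film(cadence))                          # True (film cadence)
print(looks_like_film([f"f{i}" for i in range(10)]))     # False (pure video)
```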
 

Registered · 433 Posts
So China, or BFDTV, or whoever... this may be a little beyond the bounds of this thread, but let me ask my ultimate practical question. If I have a very good 1080i setup - from cable box to DVD player to TV (which is still what I think the XBR2 is, but I don't want to go through that again) - are you saying that it's pretty much a waste of my time and emotional energy to be wishing that I had bought 720p, 1080p, 1080p/24, 922.3abc, or any other alphabet soup out there, at least for the near term?
 

Registered · 1,710 Posts
JC, I would not buy a 720p (native resolution) display today when 1080p displays are available. The standard is moving toward being able to accept 1080p. If a 1080p display can only take 1080i inputs, I see no great disadvantage. Sources will always accommodate (insert disclaimer here) outputs of 720p and 1080i for the foreseeable future, IMHO.
 

Registered · 1,297 Posts
I have a 1080i TV (CRT) now. It looks pretty good. So if I bought a 1080p TV (DLP or SXRD, etc.) at 50-52 inches and watched it from about 8 or 9 feet with 55-year-old eyes that aren't the best, I probably wouldn't be able to tell much, if any, difference? My set is older and has only one component input. I want to upgrade at some point to a TV with HDMI inputs. So should I just save money and get a non-1080p HDTV? (I mentioned 50-52 inches because that is about the largest I can put in the area where the TV will be.)
 