AVS › AVS Forum › Display Devices › Digital Hi-End Projectors - $3,000+ USD MSRP › The Truth about 1080p projectors/Displays.

The Truth about 1080p projectors/Displays.

post #1 of 44
Thread Starter 
I'd like some help in understanding just what a true 1080p projector is, especially in fixed pixel display technology.

I'm not new to the HDTV thing, and I have a technical background, which is partly why I'm having some trouble understanding just what a 1080p fixed pixel system actually is.

First, let me tell you what I know, and what I've observed.

480i = OK, 480i is the normal number of visible lines of picture drawn onto our normal TV screens in SD. While the American TV standard known as NTSC allows for 525 lines, only 480 of them actually get used for viewable video. The rest are used for sync, closed captions, and some other stuff.

Every SD DVD out there is natively encoded in the 480i format. When you use the yellow composite video out, or the S-Video out, you ARE using the 480i format.

Every TV show on over-the-air analog, analog cable, non-HD digital cable, and non-HD satellite also decodes and outputs 480i. And from what I can tell, on any CRT tube TV up to about 32 inches, the picture will be just spiffy.

When we get into larger screen TV sets, or projectors of any type, we start to notice all the problems with 480i analog video, and even digital video too, such as digital cable or satellite TV (in SD).

Problems on the analog side are pretty much due to noise on the line. On a 32 inch CRT tube set (which in my opinion gives FAR better PQ than any other technology out there, due to the non-pixel technology and the killer black level and killer contrast ratio) you probably won't notice analog PQ issues, but above that size you surely will.

Problems on the digital side are mostly due to compression. Most digital cable and satellite providers scale the video down from its normal rate of 720x480i to 480x480i, plus they cap the number of colors that can be sent. Analog over-the-air has a truly INFINITE number of colors, but digital cable and satellite providers of SD video limit it down to only thousands of colors. Again, on a 32 inch CRT tube TV set you probably won't even notice these problems; the PQ will look as good as anything you ever watch on that TV.
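
The downscale claim above is easy to put in rough numbers (a sketch using only the figures quoted in this post; real providers vary):

```python
# Rough arithmetic behind the 720x480 -> 480x480 downscale described
# above. Numbers are the ones quoted in the post, not measured rates.

full_sd = 720 * 480    # nominal SD active pixels per frame: 345,600
squeezed = 480 * 480   # what many providers actually send: 230,400

print(squeezed / full_sd)  # 0.666...: a third of the horizontal detail gone
```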

Big screen TVs bring out the true problem of 480i. Besides the analog noise problems, and the digital compression and lack-of-color problems, the "i" of 480i means interlaced video scanning...

Lines of video, 480 of them, are drawn onto your CRT tube starting at line 2, then 4, then 6, all the way down to the bottom of the screen; every other line is left blank, or black.

Then the scan beam flies back up to the top corner, and lines 1, 3, 5, 7 and so on are drawn.

On 32 inch or smaller CRT tube TV sets, the physical distance from the top of the screen to the bottom is small enough, and the time it takes to draw one screen short enough, that the normal human eye sees a full, solid picture. But on bigger screens, since there is more surface area to view, our eyes can more accurately pick up the problems in the display.

On a big screen TV running at 480i (big screen TVs up to the mid 1990s), you CAN see the black lines between each draw of the odd vs. the even screen scan. These black lines are known as scan lines.
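
The two-pass draw order described above can be sketched as a toy model (my line numbering; real NTSC timing includes extra blanking lines):

```python
# Toy model of interlaced scanning: one 480i frame is drawn as two
# passes ("fields"), each covering every other line of the screen.

def interlaced_passes(total_lines=480):
    first_pass = list(range(2, total_lines + 1, 2))   # lines 2, 4, 6, ...
    second_pass = list(range(1, total_lines + 1, 2))  # lines 1, 3, 5, ...
    return first_pass, second_pass

first, second = interlaced_passes()
print(len(first), len(second))                        # 240 lines per pass
print(sorted(first + second) == list(range(1, 481)))  # True: all 480 covered
```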

The very best picture quality you could get in SD on one of these older big screen TVs would probably be a good DVD player hooked up to its S-Video port. DVD quality is superb 480i quality, generally speaking.

The result will be a nice picture with very visible scan lines.

The only reason you don't see scan lines today on modern large screen TV sets is that modern sets have a built-in scaler that upconverts that 480i signal into something higher.

=====

Progressive Scan 480p

This is kind of tricky to explain. I think we all know that the only real difference between 480i and 480p is that in progressive scanning the lines are drawn all in one pass: 1, 2, 3, 4, 5, 6... all the way down to 480.

As far as I know, this jump from 480i to 480p has no effect on the resolution. On DVDs it's still 720x480 pixels.

What makes this tricky is that in order to view 480p, you need something better than your SD TV: an HDTV. The thing is, EVERY HDTV out there has a built-in scaler to convert 480i to something higher than 480i. The main reason is that most HDTV sets are bigger than 32 inches, and thus would show scan lines on 480i sources.

This raises an interesting question: if you need an HDTV to use a progressive scan DVD player, and every HDTV out there has a built-in scaler, then why are there progressive scan DVD players at all?

The answer to that question is quality. If the scaler in your DVD player is higher quality than the one built into your HDTV, then you will benefit from that higher-end DVD player.

Keep this in mind for later in this post: THERE ARE NO NATIVE 480P DVDs.

The only other technical difference (other than 2, 4, 6, 8 vs. 1, 2, 3, 4, 5, 6, 7, 8), as I understand it, between 480i and 480p is that the scan rate needs to be doubled in order to draw all of that info in one pass. By upping the scan rate you also increase the amount of info that can be delivered per unit of time. This is known as bandwidth.

On dialup internet you could only get so much info per unit of time; when you got broadband internet, you could get a lot more info in the same amount of time.

When you increase your video bandwidth, good things happen to the picture quality.

Keep this in mind for later in this post: THERE IS NOT MUCH DIFFERENCE BETWEEN 480i AND 480p
(considering that all 480i gets converted in your TV set to something higher anyway).
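
The doubled scan rate can be sketched with the numbers from this post (a toy calculation; real broadcast timing adds blanking lines on top):

```python
# Toy line-rate arithmetic for 480i vs 480p. Blanking lines are ignored;
# the point is only that drawing all lines per pass doubles the rate.

passes_per_second = 60  # NTSC-family field/frame rate, roughly

lines_per_sec_480i = 240 * passes_per_second  # half the lines each pass
lines_per_sec_480p = 480 * passes_per_second  # all the lines each pass

print(lines_per_sec_480i, lines_per_sec_480p)   # 14400 28800
print(lines_per_sec_480p / lines_per_sec_480i)  # 2.0: the scan rate doubles
```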

========

HDTV 720p

720p is the progressive 720-line HDTV standard. 720p vs. 480p doesn't sound like much of a jump, does it? So why does 720p HDTV look SOOOOOOOOOOOO much better than 480p DVDs?

First, remember that in 480i/p the actual resolution is 720x480, and that's if you're on a DVD player or over-the-air SDTV. Remember, digital cable and satellite providers typically don't even give us all of our already limited bandwidth, crushing it down to about 480x480.

720p HDTV native res is 1280x720 progressive. Look at the first number: 1280 dots of picture info per line vs. 720 dots, or more typically 480 dots per line.

1280x720 is WAY better than 720x480, and WAY WAY WAY better than 480x480.

Again, just as before, more info being delivered per unit of time means more video bandwidth. So it's a safe bet to say that 720p HDTV has higher bandwidth than 480p, and way higher bandwidth than 480i.
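
The per-frame pixel counts behind that comparison work out roughly like this (active picture only; broadcast blanking and timing details ignored):

```python
# Active pixels per frame for the formats compared above.

formats = {
    "480x480 (compressed SD)": 480 * 480,
    "720x480 (DVD/SD)":        720 * 480,
    "1280x720 (720p HDTV)":    1280 * 720,
}

for name, px in formats.items():
    print(f"{name}: {px:,} pixels per frame")
```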

=======

HDTV 1080i

Oh no! "i"? "i" is BAD, right? "p" is better, right?

Well, 1080i is 1920x1080 pixels. 720p is 1280x720 pixels. Do the math and this is what happens...

720p = 921,600 pixels, or in other words not quite 1 megapixel of res.
1080i = 2,073,600 pixels, or in other words about 2.1 megapixels of res.
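
Verifying the megapixel math above:

```python
# Pixel counts per frame for 720p vs the 1080-line formats.

p720 = 1280 * 720
p1080 = 1920 * 1080

print(p720)          # 921600: not quite 1 megapixel
print(p1080)         # 2073600: about 2.1 megapixels
print(p1080 / p720)  # 2.25: the 1080 formats carry 2.25x the pixels
```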

Again, more pixels per frame in the same amount of time means more bandwidth.

More bandwidth requires a higher scanning rate.

Remember how doubling the scan rate of 480i to get to 480p was enough to eliminate the scan lines, even on very large screens? Well, the scanning rate required to do 1080i is so high that unless you're using a MONSTER screen size with an ultra-high-res projection system, you won't see scan lines at 1080i.

=====================
HDTV 1080p

What's 1080p? Well, obviously it's progressively scanned 1080i. It does not have any higher resolution than 1080i; just as 480i and 480p have the same number of pixels, 1080i and 1080p also have the same number of pixels.
And since I already stated that the video bandwidth requirements of 1080i are so high as to not show any scan lines, what's the point of 1080p?

Just as there are NO NATIVE 480P DVDs, the same is true for 1080p content at this time. While some HD DVD players and Blu-ray players, and even some of the discs you can buy for them, say "1080p Beyond HDTV," from what I have been reading they only decode to 1080i, and use a built-in scaler to pump it to 1080p over the HDMI port.

Why pump to 1080p?

I can totally understand pumping 480i to 480p; it gets rid of the visible scan lines (on big screen non-HDTVs made before the late 1990s).

Again, since there are no scan lines visible at 1080i, what's the point of 1080p?

I don't have a great answer to that question yet. Some (very, very few) people might be lucky enough to have such a high-end video display, and a screen size so large, that they CAN see scan lines at 1080i; then 1080p makes sense for them. But I would guess you would need a screen at least 12 FEET wide or more, and if you're doing that, you're going to need a hell of a lot of brightness to light it up, and if you're talking high brightness, you're probably not talking true 1080 HD projection.

=======

My question about 1080p

Sorry it's taken me so long to get to the point, but I wanted you all to have a clear picture of my knowledge on the subject...

Scan lines are only generated on devices that actually SCAN. In other words, scanning means dropping pixels onto the display one at a time using a "laser-beam-like" technology. As far as I know, only CRT displays and CRT projectors do this.

LCD, LCoS, DLP, plasma, and even the new SXRD systems are all fixed pixel units, and are not scanning-type displays.

When these types of displays state that they can take signals from 480i to 1080p, the whole idea of "i" vs. "p" makes no sense to me anymore.

CRT-type displays actually have a physical part in them known as a "flyback transformer." It gets its name from its function of flying the electron beam back to start its next scan down the tube surface.

In interlaced scanning, the flyback zips the beam back to line 1 after it completes drawing every other line, then zips it back again to line 2 to draw the next set of lines. In two passes it will have completed the video frame.

In progressive scanning, the flyback zips the beam back only after it has drawn all the video lines progressively, one after the other. In just one pass, it will have completed the video frame.

There is no flyback transformer in any fixed pixel display, is there?
There is no beam drawing (scanning) pixels onto the screen, is there?
Thus there is NO DIFFERENCE BETWEEN 1080i and 1080p on a fixed pixel display, other than that it should be able to accept video signals at double the 1080i bandwidth. HOWEVER, there is ZERO PICTURE DIFFERENCE in quality between 1080i and 1080p on these types of display devices... Right?


The only way to get higher quality out of 1080p equipment is if it's based on CRT technology.

Right?


=======

I have a very unique display system. I was lucky enough to find a great deal on eBay for an extremely high-end CRT/hybrid system. The technology is known as ILA (not D-ILA). It uses R, G, and B CRTs that actually drive analog LCD video plates with no pixels. The result is CRT resolution without the brightness constraints of typical CRT projector systems. My current system uses a DVDO VP50 pushing 1080p over RGBHV at a true 1920x1080 progressive scan at just under 7000 lumens.

While I don't have any true 1080p content to feed it, true 1080i material scaled to 1080p looks better to my eye than if I fed it native 1080i.

screen shots at:

http://bbs.flagnet.org

Sorry, I dont have any 1080p shots on the web site yet.

Carey
post #2 of 44
First off, let me say that I didn't read your whole post. I'm not trying to be an ass, really, but I just don't have the patience for all that.

I do have a couple of observations to share. If they are redundant due to my lack of reading your complete post, I apologize.

First of all, in the world of digital projection, the relevant comparison is not 1080i to 1080p. The comparison is 720p to 1080p. This is because unless you are buying a CRT projector, 1080i is not available as a display resolution. In other words, all digital displays are natively progressive.

So you can buy a digital projector with panels that display at 480p, 720p, or 1080p. But not 1080i. 1080i/p contain roughly twice as many pixels as 720p, so there's the main benefit. If you feed a 1080i film-based source to a 720p projector, the projector's internal processor will have to scale it downward, and you will lose at least half your resolution. If you feed a 1080i film based source to a 1080p projector with a quality inboard VP, you will get all 1920x1080 pixels worth of information displayed progressively. Here, the main benefit is not that it's progressive (although that is a benefit), but rather that it can display twice the resolution.

On another note, major studio releases on HD-DVD and Blu-Ray are encoded on the disk at 1080p/24. Newer players will eventually offer (if they don't already -- one of the guys following the players could tell you) true 1080p/24 out. Due to the lack of 1080i/60 to 1080p/24 conversion support on most (maybe all?) inboard video processors, one of the biggest benefits of feeding 1080p/24 directly to your display is to eliminate judder, provided that your projector displays 1080p/24 at an even multiple of 24 (48, 96, etc.).
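
That "even multiple of 24" condition can be sketched like this (a toy check only, with assumed refresh rates; real players and displays have more constraints):

```python
# A display refresh rate avoids 24 fps film judder when it is an even
# multiple of the film rate, as described above.

def is_even_multiple_of_film_rate(refresh_hz, film_fps=24):
    return refresh_hz % film_fps == 0

for hz in (48, 60, 72, 96, 120):
    print(hz, is_even_multiple_of_film_rate(hz))
```

Note that 60 Hz fails the check, which is why 1080p/24 into a 60 Hz display still needs 3:2 pulldown and exhibits judder.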
post #3 of 44
Thread Starter 
So, in short, what you're saying is that any NON-CRT display device that can do 1080 will be advertised as 1080p. Feeding this display device 1080i or 1080p won't make any real difference in the output you see.

But on a CRT display/projector there WILL indeed be a technical difference in the output if you feed it 1080i vs. 1080p.

But don't assume that 1080i and 1080p are the exact same signal. The bandwidth of 1080p is twice that of 1080i, meaning that 1080p's signal is "better," but unless you're using a CRT you won't be able to see any difference.

Carey
post #4 of 44
Quote:
Originally Posted by ctreesh View Post

but unless you're using a CRT you won't be able to see any difference.

Carey

There is a difference on digital displays too.

1080i to a 1920 x 1080 digital display doubles/copies the scanned lines to make 1080p

1080p to a 1920 x 1080 digital display can show independent/unique fields or "lines" to deliver TRUE 1080p

It's doubtful most could tell the difference but there is a difference just the same
post #5 of 44
Thread Starter 
Tryg, thanks, that's the answer I was looking for. So indeed, even if you have a fixed pixel device, you can actually have two frames of 1080i on the screen at the same time, or double the info in the same time frame.

Good, glad to hear that.

Sadly, the next big problem (as far as I know) is that you'd need a pixel size so tiny that it would be very tough to get any real brightness out of it (I'm talking projectors here).

This would explain why ALL the fixed pixel 1080p displays are under 1000 lumens?
post #6 of 44
Quote:


analog over-the-air has a truly INFINITE number of colors

Sorry, this is just nonsense. Anyone with a scope can measure the bandwidth of an NTSC chroma signal. Same goes for horizontal resolution -- just because it's not sampled doesn't mean it's infinite. There are standard ways to measure these signals, and for broadcast RF, the answer isn't pretty: even the lowest common-denominator 480i digital systems are far superior.

That doesn't mean a 32" LCD will look better than a CRT -- and it has nothing to do with 1080p projectors -- but so long as you're posting pages & pages of intro you might want to check facts.
post #7 of 44
Quote:
Originally Posted by ctreesh View Post

Tryg, thanks, that's the answer I was looking for. So indeed, even if you have a fixed pixel device, you can actually have two frames of 1080i on the screen at the same time, or double the info in the same time frame.

Digital Displays are inherently progressive devices


Quote:
Originally Posted by ctreesh View Post

This would explain why ALL the fixed pixel 1080p displays are under 1000 lumens?

No, brightness of the bulb
post #8 of 44
Thread Starter 
Why can't they use the bulb from a 3000 lumen system in a 1080p projector? Or can they?
post #9 of 44
you are welcome to build one
post #10 of 44
Quote:
Originally Posted by ctreesh View Post

Tryg, thanks, that's the answer I was looking for. So indeed, even if you have a fixed pixel device, you can actually have two frames of 1080i on the screen at the same time, or double the info in the same time frame.

I'm not sure what you mean by "two frames of 1080i." A single frame of 1080i video is composed of two fields (an odd field and an even field), each consisting of 540 lines. The odd/even fields are displayed very fast, one after the other, with combing (visible or not) as a side effect.

In 1080p, the odd and even fields are displayed AT THE SAME TIME, so there is no combing. Both 1080i and 1080p contain 1920x1080 pixels worth of information and contain the same number of scanlines.

The benefit of going from 480i to 480p is very similar to the benefit of going from 1080i to 1080p, and it's not about scanlines. It's about combing artifacts, and sometimes the possible elimination of judder.
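
The combing described above can be illustrated with a toy weave (my own sketch; the field data is invented to show a moving edge, not real video):

```python
# Weaving two fields captured at different instants (video) misaligns
# moving edges; fields from the same instant (film) weave cleanly.

def weave(top, bottom):
    out = [None] * (len(top) + len(bottom))
    out[0::2] = top       # top field fills even-index rows
    out[1::2] = bottom    # bottom field fills odd-index rows
    return out

# a vertical edge at column 2 in the first field (time t) that has
# moved to column 3 by the second field (time t + 1/60 s)
top_field    = [[0, 0, 1, 1, 1]] * 3
bottom_field = [[0, 0, 0, 1, 1]] * 3

for row in weave(top_field, bottom_field):
    print(row)  # alternating rows disagree on the edge: a visible "comb"
```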

And to answer another one of your questions, no, it is not entirely true that the output will look the same on a 1080p projector regardless of whether or not you feed it 1080i or 1080p. It depends a lot upon the quality of the video processor and the source, and it also depends upon whether or not you're attempting to deinterlace film or video.

I'm oversimplifying this a lot.
post #11 of 44
Quote:
Originally Posted by Tryg View Post

you are welcome to build one

And we will beat a path to your door!!!
post #12 of 44
Quote:
Originally Posted by Tryg View Post

There is a difference on digital displays too.

I agree there is a difference.

Quote:


1080i to a 1920 x 1080 digital display doubles/copies the scanned lines to make 1080p

The below edited for clarity:
But a film encoded at 1920x1080/24p contains the same amount of information whether it is being transmitted as 1080i or 1080p, and also contains the same number of scanlines. The difference is that the data in a 1080i signal is formatted as two discrete fields (odd and even) while a 1080p signal is formatted to display all of the 1920x1080 frame progressively.
post #13 of 44
Quote:
Originally Posted by ctreesh View Post

So, in short, what you're saying is that any NON-CRT display device that can do 1080 will be advertised as 1080p.

Not only will it be advertised as 1080p, it will actually be 1080p. Digital displays are progressive devices.

Quote:


Feeding this display device 1080i or 1080p won't make any real difference in the output you see.

Now that's not entirely true. See my post above.

Quote:


But on a CRT display/projector there WILL indeed be a technical difference in the output if you feed it 1080i vs. 1080p.

I don't even know of any CRT display that can accept a 1080p signal, and I think you may be missing the point.
post #14 of 44
Quote:
Originally Posted by gremmy View Post

But Tryg, both 1080i and 1080p contain the same number of scanlines, and the same amount of data.

Assuming a true 1080p source, not an upscaled one, and identical scan rates, the amount of actual data is DOUBLE with 1080p. The 1080i line rate is roughly 33.5 kHz. The 1080p line rate is roughly 67 kHz.
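
Those figures fall out of standard HD timing (a sketch; I'm assuming the common 1125-total-line frame structure, which gives numbers close to the ones quoted):

```python
# Horizontal scan rate arithmetic for 1080i vs 1080p at 60 Hz.

TOTAL_LINES_PER_FRAME = 1125  # 1080 active lines plus blanking

h_rate_1080i_khz = TOTAL_LINES_PER_FRAME * 30 / 1000  # 30 frames/s, interlaced
h_rate_1080p_khz = TOTAL_LINES_PER_FRAME * 60 / 1000  # 60 frames/s, progressive

print(h_rate_1080i_khz, h_rate_1080p_khz)  # 33.75 67.5
```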

True 1080p sources are rare though...
post #15 of 44
Quote:
Originally Posted by ctreesh View Post

Why can't they use the bulb from a 3000 lumen system in a 1080p projector? Or can they?

A number of the new 3-chip 1080p DLP projectors now rolling out are in the 2500 ANSI range.
post #16 of 44
Quote:
Originally Posted by gremmy View Post

I don't even know of any CRT display that can accept a 1080p signal, and I think you may be missing the point.

Sony G90 CRT manages 1080p pretty well, as we showed for several years at trade shows...
post #17 of 44
Quote:
Originally Posted by wm View Post

Assuming a true 1080p source, not an upscaled one, and identical scan rates, the amount of actual data is DOUBLE with 1080p. The 1080i line rate is roughly 33.5 kHz. The 1080p line rate is roughly 67 kHz.

True 1080p sources are rare though...

I'm talking about the most common scenario where we are comparing a feature film (encoded onto Blu-Ray or HD-DVD, for example) in 1080p and the same film displayed at 1080i. Whether it's chopped up into interlaced fields or fed in its native format, the film only contains a certain amount of data.

William, I'm not talking about the bandwidth, I'm talking about the data. How is it double?

If you have 1920x1080 pixels worth of information, displayed progressively at 24 frames per second, how is that more information than the same film displayed as 1920x540 fields?
post #18 of 44
Quote:
Originally Posted by wm View Post

Sony G90 CRT manages 1080p pretty well, as we showed for several years at trade shows...

Forgive my ignorance, but are you saying that the G90 displays progressively? Or that it can accept a 1080p signal and then converts it via an internal VP to 1080i?
post #19 of 44
Quote:
Originally Posted by gremmy View Post

William, I'm not talking about the bandwidth, I'm talking about the data. How is it double?

If you have 1920x1080 pixels worth of information, displayed progressively at 24 frames per second, how is that more information that the same film displayed as 1920x540 fields?

Your original statement wasn't limited to 24 frame sources or displays, it simply referred to 1080i vs 1080p. A 1080p signal at a given sample rate has twice the data of 1080i at the same sample rate. Each field at 1080i is 540 lines. If the signal is 1080i at xx hz, it takes two fields to complete the frame. At 1080p each field is 1080 lines and each field is also a frame. Again, I'm talking about the source, not how it's treated or displayed. If the source isn't changing during that time then the additional information is redundant. But if it is, that's another story.

Note to original poster: 720p has a higher data rate than 1080i. It has fewer pixels per field than 1080i, but each sample is a full frame, not half of one. That's why it's preferred over 1080i for sports - fewer motion artifacts.
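
The temporal side of that point can be sketched with assumed 60 Hz delivery rates (active pixels only; this illustrates complete pictures per second, not raw signal bandwidth):

```python
# At 60 Hz, 720p sends 60 complete pictures a second, while 1080i sends
# 60 half-pictures, i.e. 30 complete frames a second.

delivery_rate_hz = 60

complete_pictures_720p = delivery_rate_hz        # every sample is a full frame
complete_pictures_1080i = delivery_rate_hz // 2  # two fields make one frame

print(complete_pictures_720p, complete_pictures_1080i)  # 60 30
```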
post #20 of 44
Quote:
Originally Posted by wm View Post

Your original statement wasn't limited to 24 frame sources or displays, it simply referred to 1080i vs 1080p. A 1080p signal at a given sample rate has twice the data of 1080i at the same sample rate. Each field at 1080i is 540 lines. If the signal is 1080i at xx hz, it takes two fields to complete the frame. At 1080p each field is 1080 lines and each field is also a frame. Again, I'm talking about the source, not how it's treated or displayed. If the source isn't changing during that time then the additional information is redundant. But if it is, that's another story.

I edited my original post for clarity. My comment was regarding movies encoded at 1920x1080/24p and was made in the context of feeding a 1080i signal to a 1080p device versus feeding the same signal as native 1080p.
post #21 of 44
Quote:
Originally Posted by gremmy View Post

Forgive my ignorance, but are you saying that the G90 displays progressively? Or that it can accept a 1080p signal and then converts it via an internal VP to 1080i?

A G90, like any CRT projector that I know of, displays scan lines as received. So yes, it displays a 1080P signal progressively - it can't display it any other way! It's really a very dumb device - the signal says "go here and display this" and it does.
post #22 of 44
Quote:
Originally Posted by gremmy View Post

Whether it's chopped up into interlaced fields or fed in its native format, the film only contains a certain amount of data.

That would depend on how the transfer is done. I've seen some amazing work where 24 frame telecine was converted to 60 hz with motion comp and interpolation. It's expensive, and it won't fit on an HD DVD, but it has been done.
post #23 of 44
Quote:
Originally Posted by wm View Post

A G90, like any CRT projector that I know of, displays scan lines as received. So yes, it displays a 1080P signal progressively - it can't display it any other way! It's really a very dumb device - the signal says "go here and display this" and it does.

I stand corrected. I had thought that CRTs were inherently interlaced devices. Learn something new every day.
post #24 of 44
480p = 960i, 540p = 1080i, 720p = 1440i, 1080p = 2160i

it's a lot better to scale 720p than 1080i in a 1080p set,
because not all displays are good at scaling.
post #25 of 44
Quote:
Originally Posted by ctreesh View Post

...
Keep this in mind for later in this post----THERE ARE NO NATIVE 480P DVD's.
...

Most modern commercial Region 1 DVD-Video discs are actually progressive, recorded as 480p/24. Flags in the video stream are used to create 29.97 interlaced frames per second on output.

I've recorded a few of these myself.
post #26 of 44
Quote:
Originally Posted by DESTURBED View Post

480p = 960i, 540p = 1080i, 720p = 1440i, 1080p = 2160i

it's a lot better to scale 720p than 1080i in a 1080p set,
because not all displays are good at scaling.

It depends on the source material. For sources with little motion, 1080i is almost always better.
post #27 of 44
Quote:
Originally Posted by DESTURBED View Post

480p = 960i, 540p = 1080i, 720p = 1440i, 1080p = 2160i

it's a lot better to scale 720p than 1080i in a 1080p set,
because not all displays are good at scaling.

Not all of them are good at deinterlacing, either. Six of one, half a dozen of the other.
post #28 of 44
Quote:
Originally Posted by DESTURBED View Post

it's a lot better to scale 720p than 1080i in a 1080p set,
because not all displays are good at scaling.

I've got to call BS on this one. It's better to learn the capabilities of your particular on board deinterlacer/scaler than to issue a sweeping declaration that "it's a lot better to scale 720p than 1080i in a 1080p set," to use your own words. First of all, converting 1080i to 1080p is called deinterlacing, not scaling. Secondly, 1080i film based content can be deinterlaced to produce true 1080p frames on any 1080p projector with a competent deinterlacer, maintaining all 1920x1080 pixels worth of information.
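
The film-mode deinterlacing described there can be sketched like this (a toy model with invented data; real deinterlacers also have to detect the cadence before pairing fields):

```python
# Toy film-mode ("weave") deinterlace: for film-originated 1080i, each
# pair of fields comes from the same film frame, so pairing them back
# up recovers the full frame with no resolution loss.

def split_fields(frame):
    return frame[0::2], frame[1::2]          # top field, bottom field

def film_deinterlace(fields):
    frames = []
    for top, bottom in zip(fields[0::2], fields[1::2]):
        frame = [None] * (len(top) + len(bottom))
        frame[0::2] = top                    # top field: even-index rows
        frame[1::2] = bottom                 # bottom field: odd-index rows
        frames.append(frame)
    return frames

# two fake 1080-line frames, chopped into fields, then reassembled
src = [list(range(n, n + 1080)) for n in (0, 5000)]
fields = [half for f in src for half in split_fields(f)]

print(film_deinterlace(fields) == src)  # True: nothing was lost
```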

Things get more complicated if we're talking about the deinterlacing of video based content.
post #29 of 44
What he said.
post #30 of 44
Your idea that you can't possibly see a difference between 1080i and 1080p unless the screen is 12 feet or larger is incorrect. Anyone can see the jagged edges on 1080i material even on a 50" screen if you look for them on the edges of movement or pans (provided the display device can halfway sharply display those resolutions without downscaling or losing detail).

Also, direct view CRTs and CRT monitors are very different from CRT projectors. The single tube displays have a pixel structure built into them: there are red, blue, and green segments of phosphor, like a pixel structure, and only CRT projectors based on three separate tubes have a seamless, non-pixel-limited display.

There are many, many CRT projector makes and models capable of accepting 1080p, and more than a few that can fully resolve (or nearly resolve) 1080p, not just G90s. Just about any EM-focus CRT projector will show an obvious difference between 1080p and 1080i, and even most ES-focus machines can, since they don't have to fully resolve the 1920 pixels per line; they only have to resolve enough of the 1080 lines (540 interlaced) to show the combing artifacts. You will see the jaggies if you look for them on any motion; only very slow movement and static images will look the same in both 1080i and 1080p.

Lastly (not trying to make you feel bad), I honestly don't think your display has the electronic bandwidth, optical capability, or beam spot capability to show as much detail as any 8" or 9" standard high-res CRT projector. You should compare your set to an NEC XG, G70, G90, Ampro 3600, 4600, Barco 1208, 1209, Cine 9, etc. Your set is based on a compromise: getting more brightness out of CRT technology, which unfortunately was not about getting the highest bandwidth or resolution. What ANSI resolution is it rated as being able to fully resolve?

I may be wrong, but I'd almost bet my Barco 708s could do better. (Not trying to be an elitist or whatever, just trying to set things straight.)

Anyway, I run all my CRT front projectors at around 72-80" wide, and 1080i combing artifacts are easily visible. If you can't obviously see them, then you're most definitely display limited.

Troy