AVS › AVS Forum › Display Devices › Digital Hi-End Projectors - $3,000+ USD MSRP › what happened to native 2.35 projectors?

what happened to native 2.35 projectors? - Page 5

post #121 of 154
No confusion...

Yes, if you measure non-deterministic patterns like ANSI and native on/off, which do not deterministically give you actual intrascene numbers. They are non-deterministic because the actual scenes we watch have a random nature to them, so you cannot determine the actual contrast from those measurements. 99% of the time when watching any movie, our eyes are receiving a large amount of white pollution (which gives us partial black-floor blindness). You would be shocked to find that the actual intrascene ratios are far lower than those measurements suggest (not the dynamic variation in illuminance from frame to frame, but the actual REAL intrascene contrast our eyes can pick up). Again, I do not doubt that with tiny white pixels, or small amounts of white light, we can see more than 1000:1. I also do not doubt that over more than 0.5 to 1 second we can see more than 1000:1; I made that clear. The point is, the majority of the time we cannot, AFAIK (from everything I have read and seen).

All the ANSI pattern does at extreme ranges above 1000:1 is allow the black floor to go a little lower without needing as much on/off. At some point, adding ANSI doesn't really help bright scenes any further and starts only helping dark scenes. The problem is it only helps dark scenes a TINY TINY TINY bit, so beyond a certain point it just doesn't help that much overall. Native on/off allows orders of magnitude lower illuminance. Plasmas have a superior intrascene ramp because of this extreme ANSI, but the illuminance level on many plasmas still suffers because of the orders-of-magnitude need for native on/off (and some plasmas have high ANSI and just so-so on/off; I am not speaking of the Kuro, which has both).
Edited by coderguy - 10/5/13 at 7:24am
post #122 of 154
The most misunderstood thing is illuminance level. The illuminance level affects the perception of contrast a lot more than people are accounting for; if it didn't, then even the BEST DIs would be absolutely useless.

So trying to determine how much intrascene contrast the eye is seeing at different illuminance levels is virtually impossible by eye (it definitely isn't scientific). I never said I was 100% sure, btw; it's just that when I run all the figures, 1000:1 seems to make sense. Perhaps it is a little higher, but if it is, it probably applies to a very, very minuscule amount of our viewing time.
post #123 of 154
Again, you're muddling the issues. Just because nobody makes popcorn to watch an ANSI checkerboard (well, maybe some of us do ;) ) does not imply any limitations about the human eye.
post #124 of 154
I am basing it on more than 100 pages of technical papers that I have read and then combining that with my understanding of what I see. Some of the papers were from TI, some were from radiologists, some about LED, some from those sites I posted, and some from Ophthalmology. As noted, there are exceptions, but most of the time our eyes have a limited range.

There are so many illusions in perceived contrast; it is just not as simple as you guys want to believe. The contrast range changes entirely between three main represented ranges (though some would say four) based on average illuminance level, from scotopic to mesopic to photopic. You cannot compare two scenes with different illuminance to judge intrascene contrast by eye unless you eliminate all kinds of variables; I promise I am 100% certain of that.
Edited by coderguy - 10/5/13 at 8:15am
post #125 of 154
Quote:
Originally Posted by coderguy View Post

I am basing it on more than 100 pages of technical papers that I have read...
...and conveniently not citing a single one!
post #126 of 154
That is totally false, wth...

I posted at least 3 in here, each of which has more than 10 sub-references, but no, I am not going to go digging up every single one. It's very easy for people to find papers on it themselves, since no one will believe anything anyhow, or it will just give them more information to try to prove me wrong. The more I argue, the more people argue back without any proof of their own. So what is the point of posting all the papers? Some are very long, drawn out, and very difficult to understand.
Edited by coderguy - 10/5/13 at 8:15am
post #127 of 154
I keep and read them in our bathrooms.
post #128 of 154
After this thread, I very much need to go to the b-room.
post #129 of 154
This is simply too easy. Ahem.

After this thread you will need a double dose of Lomotil.

In fact, I bet you need it now. :)
post #130 of 154
i'm going to ask a stupid question, but at least it's on topic to the original thread...


what would the advantage be to having a native 2.35:1 projector anyway? is there any 2.35:1 source content?

i mean, 1080p is the highest resolution you can get at home for 'normal' movies and content, and all 2.35:1 blurays are based on that 1920x1080 ratio, meaning you end up with 1920x800 (or whatever it is). so all 16:9 projectors show 100% of the detail contained in the source anyway. if you got a 2.35:1 native projector, then it couldn't handle a 16:9 1080p source at full resolution, or its 2.35:1 resolution wouldn't match the source anyway.

it just seems to me that you'd want the projector to match the specs of whatever the highest source resolution is, and that same aspect ratio. since bluray is 16:9, you'd want a native 16:9 projector. so, what's the point of a 2.35:1 projector?
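The "1920x800 (or whatever it is)" question can be sketched numerically. This is a hypothetical comparison, assuming a standard 1920x1080 16:9 panel versus a 2560x1080 native-scope panel (the `scope_pixels` helper is mine, for illustration only):

```python
def scope_pixels(panel_w, panel_h, ar=2.35):
    """Active pixels when a 2.35:1 image is fit onto a panel of the
    given size (letterboxed if the panel is taller than the image)."""
    used_h = min(panel_h, round(panel_w / ar))
    return panel_w, used_h, panel_w * used_h

# On a 16:9 panel, scope content only uses ~817 of the 1080 rows
# (actual Blu-ray encodes typically crop to 800 or 804).
print(scope_pixels(1920, 1080))

# On a native 2.35:1 panel, scope content uses the full panel.
print(scope_pixels(2560, 1080))
```

So a native-scope panel devotes roughly 75% more pixels to a scope frame than the letterboxed area of a 16:9 panel, which is the "higher pixel density" benefit mentioned later in the thread.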
post #131 of 154
Quote:
Originally Posted by fierce_gt View Post

i'm going to ask a stupid question, but at least it's on topic to the original thread...


what would the advantage be to having a native 2.35:1 projector anyway? is there any 2.35:1 source content?

I think what most people looking for scope projectors want is a (probably cheaper) alternative to using a lens. You can look at it as getting many or most of the benefits of a lens (better light efficiency, higher pixel density, quicker AR transitions, etc.) without the cost or compromises (e.g. pincushion).
Quote:
it just seems to me that you'd want the projector to match the specs of whatever the highest source resolution is, and that same aspect ratio. since bluray is 16:9, you'd want a native 16:9 projector. so, what's the point of a 2.35:1 projector?

I don't know where this idea comes from (maybe Joe Kane and the like, and their penchant for drawing the wrong conclusion from using computer test patterns to "test" scaling), but there's no technical reason to insist that source resolution match display resolution. Would you want to watch DVDs on a 480p projector? There are good reasons to use a higher resolution display than your source material, just like there are good reasons to use much "faster" DACs for audio than the highest sample rate of the audio you're playing.
post #132 of 154
Quote:
Originally Posted by stanger89 View Post

I think what most people looking for scope projectors want is a (probably cheaper) alternative to using a lens. You can look at it as getting many/most of the benefits of a lens, better light efficiency, higher pixel density, quicker AR transitions, etc, without the cost or compromises (eg pincushion).
I don't know where this idea comes from (maybe Joe Kane and the like, and their penchant for drawing the wrong conclusion from using computer test patterns to "test" scaling), but there's no technical reason to insist that source resolution match display resolution. Would you want to watch DVDs on a 480p projector? There are good reasons to use a higher resolution display than your source material, just like there are good reasons to use much "faster" DACs for audio than the highest sample rate of the audio you're playing.

i guess i should clarify that last statement. i don't think there's any benefit to using higher resolution than the source material. so with a 2.35:1 projector you are compromising native 16:9 content for no improvement in 2.35:1 content, aside from possibly the auxiliary benefit of increased brightness, which doesn't seem like the right way to fix that problem anyway.

i've side-by-side compared 720p and 1080p displays displaying 720p content, and i've often thought the 720p looked sharper. i've certainly never been blown away by how much better the 1080p looked. never thought of myself as a purist, but i just don't like the idea of one pixel of the image being displayed on something other than one pixel of the display.
post #133 of 154
Quote:
Originally Posted by fierce_gt View Post

i guess i should clarify that last statement. i don't think there's any benefit to using higher resolution than the source material.

Tell that to the people who have a VW1000 or a JVC eShift machine and report "significant" benefits to displaying 1080p at higher than 1080p.
Quote:
...so with a 2.35:1 projector you are compromising native 16:9 content, for no improvement in 2.35:1 content.

Not with any of the scope projectors available; they are all 1080 pixels tall, so 16:9 is still shown at full 1920x1080. Plus you get a much higher pixel density for scope content.
Quote:
i've side-by-side compared 720p and 1080p displays displaying 720p content, and i've often thought the 720p looked sharper.

You've got to be careful, was the content really sharper, or was it the SDE/more visible pixel structure creating the false appearance of sharpness?
Quote:
i've certainly never been blown away by how much better the 1080p looked. never thought of myself as a purist, but i just don't like the idea of one pixel of the image being displayed on something other than one pixel of the display.

Why? We do it with audio all the time. Sampling theorem has been around for almost 100 years now, it's pretty well understood. On properly sampled signals you can resample them to any other sample frequency (so long as the new sample frequency is greater than twice the signal bandwidth) without any loss.
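The lossless-resampling claim can be illustrated with a sketch (hypothetical signal and rates, using the Whittaker-Shannon sinc-interpolation formula): a tone bandlimited below both Nyquist limits survives a sample-rate change essentially losslessly, apart from truncation error near the edges of the finite window.

```python
import numpy as np

def sinc_resample(x, fs_in, fs_out, duration):
    """Ideal (sinc) reconstruction of a bandlimited signal at a new rate,
    per the Whittaker-Shannon interpolation formula."""
    t_in = np.arange(int(fs_in * duration)) / fs_in
    t_out = np.arange(int(fs_out * duration)) / fs_out
    return np.sum(x[None, :] * np.sinc((t_out[:, None] - t_in[None, :]) * fs_in),
                  axis=1)

fs_in, fs_out, dur = 48.0, 100.0, 20.0   # arbitrary units
f_sig = 10.0                             # below both Nyquist limits (24 and 50)
t_in = np.arange(int(fs_in * dur)) / fs_in
t_out = np.arange(int(fs_out * dur)) / fs_out

x = np.sin(2 * np.pi * f_sig * t_in)
y = sinc_resample(x, fs_in, fs_out, dur)

# Compare against sampling the true signal directly at the new rate,
# ignoring the edges where the finite-length sinc sum is truncated.
err = np.max(np.abs(y - np.sin(2 * np.pi * f_sig * t_out))[200:-200])
print(err)  # small: the resampled signal matches the original
```

The same sketch falls apart if `f_sig` is pushed above `fs_in / 2`, which is exactly the bandlimiting condition the theorem requires.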
post #134 of 154
Quote:
Originally Posted by stanger89 View Post

Why? We do it with audio all the time. Sampling theorem has been around for almost 100 years now, it's pretty well understood. On properly sampled signals you can resample them to any other sample frequency (so long as the new sample frequency is greater than twice the signal bandwidth) without any loss.

One can argue theory all day but when Joe Kane's test patterns turn to fuzz when scaled, clearly there is some loss going on. How are you so sure that the same loss doesn't happen with real content?
post #135 of 154
Quote:
Originally Posted by ScottJ View Post

One can argue theory all day but when Joe Kane's test patterns turn to fuzz when scaled, clearly there is some loss going on.

That's a flawed test. The test patterns he uses are not bandwidth-limited to half the sampling frequency, thus all theory goes out the window: they cannot be reconstructed uniquely without aliasing. A proper test would involve a scanned/photographed test pattern. The theory itself predicts that if you resample something like Joe's test pattern it will turn to mush; the problem is not the theory, it's the interpretation.
post #136 of 154
Quote:
Originally Posted by stanger89 View Post

That's a flawed test. The test patterns he uses are not bandwidth-limited to half the sampling frequency, thus all theory goes out the window: they cannot be reconstructed uniquely without aliasing. A proper test would involve a scanned/photographed test pattern. The theory itself predicts that if you resample something like Joe's test pattern it will turn to mush; the problem is not the theory, it's the interpretation.

That's my point: how do you know that the scanned/photographed image has been bandwidth limited to half the sampling frequency? Why would it be?
post #137 of 154
Because that's how sampling works, that's how anyone designing a sampling device would design it, and anyone performing sampling would do it. If you don't do that, you get aliasing/moire.

This is completely different from how you build a test pattern with single-pixel-dimension lines, which you do on a computer where you can specify any pixel to be any value you want, even if it's "impossible".
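The aliasing point above has a minimal numeric illustration (hypothetical numbers): sampled at rate `fs`, a tone above `fs/2` produces exactly the same samples as a lower-frequency tone, so once sampled, the two are indistinguishable and the original cannot be recovered.

```python
import numpy as np

fs = 100.0                 # sample rate (arbitrary units); Nyquist = 50
t = np.arange(100) / fs

x = np.cos(2 * np.pi * 70.0 * t)      # 70 is above Nyquist...
alias = np.cos(2 * np.pi * 30.0 * t)  # ...so it masquerades as 100 - 70 = 30

print(np.max(np.abs(x - alias)))      # ~0: sample-for-sample identical
```

This is why sampling devices low-pass filter before sampling, and why a computer-drawn pattern that skips that step behaves badly when rescaled.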
post #138 of 154
Quote:
Originally Posted by stanger89 View Post

Because that's how sampling works, that's how anyone designing a sampling device would design it, and anyone performing sampling would do it. If you don't do that, you get aliasing/moire.

This is completely different from how you build a test pattern with single pixel-dimension lines which you do on a computer and can specify any pixel be any value you want it to be, even it it's "impossible".

Interesting. Would this also be true for computer-generated animation like Pixar movies?
post #139 of 154
It could be, but likely isn't. They'd need to render them specifically for each resolution, and probably do some special work. Think of a CGI movie more like a video game: you might get aliasing (jaggies), but odds are they're rendered with some sort of antialiasing algorithm (often rendering at much higher resolution and scaling down, essentially simulating "analog" sampling).

You've basically got to go through some serious effort (like Joe Kane talks about in recreating all his test patterns for 4K) and do it on purpose, or be "negligent" or cut corners (like some DSLRs do with video), to get something that isn't well behaved (i.e. that doesn't at least act like a correctly sampled signal).
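The render-high-then-downscale idea can be sketched in one dimension (a toy example, not any studio's actual pipeline): sample a hard edge at 4x the target resolution, then box-filter down. The edge pixel comes out gray in proportion to coverage instead of snapping to full on/off, which is the antialiasing effect described above.

```python
import numpy as np

# 16 "hi-res" samples of a hard step edge that does not land on a
# target-pixel boundary: 6 dark samples followed by 10 bright ones.
hi = np.repeat([0.0, 1.0], [6, 10])

# 4x supersampling: average each group of 4 down to one target pixel.
lo = hi.reshape(-1, 4).mean(axis=1)
print(lo)  # [0.  0.5 1.  1. ] -- the straddled pixel gets 50% coverage
```

Without the averaging step, that middle pixel would be forced to 0 or 1, producing the jaggies the post mentions.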
post #140 of 154
Quote:
Originally Posted by stanger89 View Post

Tell that to the people who have a VW1000 or a JVC eShift machine and report "significant" benefits to displaying 1080p at higher than 1080p.
Not with any of the scope projectors available, they are all 1080p, so 16:9 is still full 1920x1080. Plus you get a much higher pixel density for scope content.
so what is the resolution of these panels? like 2400x1080 or whatever it works out to?
Quote:
You've got to be careful, was the content really sharper, or was it the SDE/more visible pixel structure creating the false appearance of sharpness?
Why? We do it with audio all the time. Sampling theorem has been around for almost 100 years now, it's pretty well understood. On properly sampled signals you can resample them to any other sample frequency (so long as the new sample frequency is greater than twice the signal bandwidth) without any loss.
i like to explain things as much as the next guy, but ultimately the math means nothing if reality doesn't back it up. at distances far enough away to not see SDE, the 720p looked better. the only benefit to 1080p was being able to sit closer and not see pixel structure, but i didn't see ANY more detail in the image. it looked like the 720p image was ever so slightly out of focus, which 'blurred' the pixel structure; that's the best way i could describe it. playing blurays, the complete opposite happened: the 1080p looked far sharper and 'focused', and did provide more detail, while the 720p display looked 'blurred'.
post #141 of 154
Quote:
Originally Posted by fierce_gt View Post

so what is the resolution of these panels? like 2400x1080 or whatever it works out to?

The ones I've heard of are all 2560x1080 (2560/1080 ≈ 2.37:1, or 1.33x wider than a 1920x1080 panel).
post #142 of 154
Quote:
Originally Posted by stanger89 View Post

The ones I've heard of are all 2560x1080 (2560/1080 ≈ 2.37:1, or 1.33x wider than a 1920x1080 panel).

Interestingly, their DLP chips have a larger native resolution, something like 2560x1400, but they block off the top and bottom portion. (At least, it is never used.)
post #143 of 154
Quote:
Originally Posted by stanger89 View Post

The ones I've heard of are all 2560x1080 (2560/1080 ≈ 2.37:1, or 1.33x wider than a 1920x1080 panel).

well, that at least takes care of the 16:9 concerns. guess there's really no downside to that then. i still question whether there's a noticeable enough upside to justify the extra costs associated with smaller production numbers.
post #144 of 154
Quote:
Originally Posted by darinp2 View Post

....claimed we couldn't get more than 219:1 with video (or 256:1 later).

If you are referring to the dynamic range per the Rec. 709 standard? Even 200:1 contrast would itself be very unlikely. The CIELUV and CIELAB models assume 100:1 contrast with a target peak of ~200 cd/m^2. Video is low dynamic range (LDR) by design, for good technical reason. The pending UHDTV format is addressing HDR, an enhanced color gamut, and higher frame rates, i.e. 120 fps.

http://www.telescope-optics.net/eye_intensity_response.htm

Excellent primer, coderguy :rolleyes:

Personally, my research supports that human perception is roughly logarithmic. That explains why, IMO, film and video are logarithmic: video is defined by a power function, but in reality video gamma is simply 0.45, or 1/2.2. A 'linear segment' delineates this 100:1 break-point.

post #145 of 154
Quote:
Originally Posted by bralas View Post

If you are referring to the dynamic range per the Rec. 709 standard? Even 200:1 contrast would itself be very unlikely. The CIELUV and CIELAB models assume 100:1 contrast with a target peak of ~200 cd/m^2. Video is low dynamic range (LDR) by design, for good technical reason. The pending UHDTV format is addressing HDR, an enhanced color gamut, and higher frame rates, i.e. 120 fps.

http://www.telescope-optics.net/eye_intensity_response.htm
Excellent primer, coderguy :rolleyes:

Personally, my research supports that human perception is roughly logarithmic. That explains why, IMO, film and video are logarithmic: video is defined by a power function, but in reality video gamma is simply 0.45, or 1/2.2. A 'linear segment' delineates this 100:1 break-point.
You wouldn't happen to be Tom Brunet, who posted some of the biggest nonsense on this site and was banned years ago, would you? This sure looks a lot like his style and position.

Do you know what contrast ratio with projectors is? It is the white level divided by the black level, such as in lux or ft-lamberts.

If the white level is 12 ft-lamberts then 200:1 would mean an absolute black level of 0.06 ft-lamberts. Only somebody who doesn't understand much about video would think that the absolute black level would be limited to 0.06 ft-lamberts with REC.709 video if the white level was 12 ft-lamberts. Is that what you believe?
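For concreteness, the division in question is trivial to check:

```python
# Contrast ratio with projectors is just white level divided by black
# level, so a hard 200:1 cap at 12 ft-lamberts of white would imply:
white_ftl = 12.0
black_ftl = white_ftl / 200.0
print(black_ftl)  # 0.06 ft-lamberts as the supposed black floor
```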

If you understand that vision is logarithmic, then you should understand why the actual contrast ratio of a display needs to be much higher when expressed as straight division. And if you did any actual measurements of real displays, you would see that the claim that contrast ratio can't go beyond 200:1 with REC.709 material is pure nonsense, even if thinking it through didn't already make that clear.

--Darin
post #146 of 154

Greetings,

I am simply referring to the current Rec. 709 color space. If you will, you may wish to research UHDTV interest in HDR and higher frame rates. There is little doubt that the legacy video format is LDR. Your post specifically mentioned "video".

post #147 of 154
Yep, sure sounds like Tom Brunet; instead of responding to straightforward refutation of what he's said, he points you to irrelevant references.
post #148 of 154
Even Dolby, in pushing their Dolby Vision system, states that the current system (REC 709) supports way more than 200:1 contrast.

On top of that, and I hesitate to restart this whole discussion, but REC709 assumes a gamma factor of 1/0.45 IIRC. If we just plug that in, assuming white is 220 (220 total steps), and the ratio between any level and white is (x/220)^(1/0.45), then the ratio between white and one step above black is (220/1)^(1/0.45) ≈ 160,466. So to reproduce a value one step above black at the correct ratio to white, it needs to be 1/160,466th as bright, i.e. you need a contrast ratio higher than that. And remember, that's a value *above* black; black itself is 0, so the white-to-black ratio is 1/0, which is unbounded.

It's even worse if we assume a higher gamma, since video is coded with a 0.45 factor but assumed to be displayed at something higher. If you assume a 2.4 display gamma, then it's almost 420,000:1 to reproduce one step above black at the correct level. And again, black is still 0, so the required white-to-black ratio is effectively infinite.

Neither REC709 nor any other standard I've seen defines a minimum value for black; it's always 1 is white and 0 is black, and the limit of 1/x as x approaches 0 is infinity.
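The arithmetic above is easy to reproduce (a sketch assuming, as in the post, that white sits 220 code values above black and that gamma is a pure power law; the `ratio_to_white` helper is mine):

```python
white_steps = 220  # code values between black and white, per the post

def ratio_to_white(code, gamma):
    """Linear-light ratio between full white and a given code value."""
    return 1.0 / ((code / white_steps) ** gamma)

# One step above black, with the REC709 1/0.45 coding gamma: ~160,466
print(round(ratio_to_white(1, 1 / 0.45)))

# Same step with a 2.4 display gamma: roughly 420,000
print(round(ratio_to_white(1, 2.4)))
```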

Dolby Vision, and HDR video generally, from everything I've read, is all about raising the reference white level much higher than the now-standard 12-16 ftL, not about making black blacker.
post #149 of 154

http://www.image-engineering.de/datasheets_manuals/testcharts/TE108_A_datasheet.pdf

This is an industry-standard logarithmic alignment chart. Its gray-scale contrast is ~50:1. Each gray sample correlates (-log) with % reflectance. These pre-selected samples produce equal, linear "video" step sizes. You seem to overlook the essence of the information itself. While there is perceptual log progression, the color space is fundamentally linear. If it were not, the steps would not be equal in size.


Edited by bralas - 2/4/14 at 4:21am
post #150 of 154

http://www.image-engineering.de/datasheets_manuals/testcharts/TE223_A_datasheet.pdf

-LOGARITHMIC GRAY SCALE TEST CHART-

Technical data for 200:1 contrast; note 13 linear steps.
