1 - 20 of 52 Posts

·
Registered
Joined
·
4,654 Posts
Discussion Starter · #1 ·
We are talking about all things being equal here. I'm pretty sure your eyes adjust to the overall lumen level of an image, and that a brighter average level causes your iris to stop down, making the blacks of the scene look blacker than they "really" are (whatever that means).


If so, which is more important: to change your black level from 1 Ft.L to .5 Ft.L (and double your contrast ratio---think about that for a second) or add some Ft.L to the whites?


It would seem to me that having bigger differences between the various light levels in a particular image would make those differences in light level easier to perceive (improving shadow detail and apparent contrast).


For example, imagine a 25-step gray scale shown on two projectors. One projector has a 1 Ft.L black and a 25 Ft.L white. The other pj has a 2 Ft.L black and 50 Ft.L whites. So although the contrast ratio is the same on both projectors, the difference in light level between the gray bars is twice as great on the latter projector. Wouldn't the apparent contrast look greater on the latter pj? So doesn't higher lumens create greater apparent contrast?
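To make that arithmetic concrete, here's a rough Python sketch of the two gray scales. (The linear spacing of the steps is my assumption, purely for illustration; real test patterns are usually spaced in video levels, not light output.)

```python
# Two projectors with the same 25:1 contrast ratio but different
# absolute light levels, showing a 25-step gray scale.

def gray_steps(black_ftl, white_ftl, n=25):
    """Return n light levels spaced evenly between black and white (ft-L)."""
    step = (white_ftl - black_ftl) / (n - 1)
    return [black_ftl + i * step for i in range(n)]

dim    = gray_steps(1.0, 25.0)   # 1 ft-L black, 25 ft-L white
bright = gray_steps(2.0, 50.0)   # 2 ft-L black, 50 ft-L white

# Same on/off contrast ratio on both...
print(dim[-1] / dim[0], bright[-1] / bright[0])   # 25.0 25.0
# ...but the absolute difference between adjacent bars doubles.
print(dim[1] - dim[0], bright[1] - bright[0])     # 1.0 2.0
```

Whether that doubled absolute step actually *looks* like more contrast is exactly the perceptual question being asked here.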


Anyone know how this works? I've been asking for years, and haven't heard a peep yet. (That I can remember, or that I understood).
 

·
Premium Member
Joined
·
4,708 Posts
I think if the average picture level (APL) from the source is always 70-100 IRE, you're correct: what's meant to appear as black would look black to most of us.

But... if the APL were only 20-50 IRE, the image would look like a foggy mess and would very obviously be horrible, because there won't (can't) be any convincing black in the scene.


edit:

It always comes back to the actual black level and contrast ratio (CR) of the display device.


I believe we can overcome a poor black level with a lot of lumens AND a high-contrast scene to get a convincing black, but movies aren't static. Dark scenes don't always have enough highlights (high IRE) to fool the eye, so we see this "false" black for what it is. It's no longer believable once the APL is low.
 

·
Registered
Joined
·
4,654 Posts
Discussion Starter · #3 ·
Thanks Jimmy, now all I need is for you to use enough words to make me understand what you are talking about. You clearly understand this, so PLEASE give me a little tutorial here. Apparently I need the basics. Thanks.
 

·
Registered
Joined
·
475 Posts
Well, at least in one aspect you are certainly correct, and that's with reference to existing ambient light.


If one projector goes from 0.1 to 100 lamberts, and another goes from 2 to 2,000 lamberts, both display a CR of 1,000:1.


But if the screen is already receiving 1 lambert due to ambient room light, the 0.1 lambert value would not be discernible over the ambient light. The 2 lambert value would still be noticeable, and this projector will have greater perceived contrast.
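A quick Python sketch of that point: ambient light simply adds to both the black and the white, so the dimmer projector's contrast collapses much faster.

```python
# Effective on/off contrast once ambient light is added to the screen.
# Units don't matter as long as they're consistent (lamberts here).

def effective_cr(black, white, ambient):
    """On/off contrast ratio with ambient light added to both levels."""
    return (white + ambient) / (black + ambient)

# Both projectors are natively 1,000:1...
print(round(effective_cr(0.1, 100, 0)))   # 1000
print(round(effective_cr(2, 2000, 0)))    # 1000

# ...but with 1 lambert of ambient light, the dim projector's black
# drowns in the room light while the bright one holds up far better.
print(round(effective_cr(0.1, 100, 1)))   # 92
print(round(effective_cr(2, 2000, 1)))    # 667
```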
 

·
Registered
Joined
·
23,188 Posts
Quote:
Originally posted by JHouse
If so, which is more important: to change your black level from 1 Ft.L to .5 Ft.L (and double your contrast ratio---think about that for a second) or add some Ft.L to the whites?
I think it depends a lot on how bright the whites are, the scenes, and the room.


A scene that is about 20 IRE at its brightest will be somewhere around 2% of the 100 IRE light level at a gamma close to 2.5 (or about 1 ft-lambert for the brightest parts if 100 IRE is 50 ft-lamberts). If your black level was already 1 ft-lambert, your 20 IRE content is going to have to sit a bit above that to differentiate itself from the high "black," and you will still probably see gray. For those scenes I don't think adding overall brightness would help your perceived CR much, and it would probably hurt it in a light-controlled room. Your irises are probably pretty far open if everything is under 2 ft-lamberts, and I don't know if they could open much more than that. So if you had your blacks at 0.5 ft-L and you doubled them to 1.0 ft-L (along with doubling all your levels), I'm not sure your irises would be in any different position for an image that is 20 IRE at its brightest part.
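The "2% at gamma 2.5" figure checks out with a simple power-law model (an idealization; real displays have a black-level offset this ignores):

```python
# Light output of a given video level under a pure power-law gamma.

def ftl_at_ire(ire, peak_ftl, gamma=2.5):
    """Approximate ft-L output for an IRE level, ignoring black-level offset."""
    return peak_ftl * (ire / 100) ** gamma

print(round(ftl_at_ire(100, 50), 2))  # 50.0  (peak white)
print(round(ftl_at_ire(20, 50), 2))   # 0.89  (~2% of a 50 ft-L peak)
```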


Where I think what you describe might help is in the brighter scenes, but I'm not even sure about that. With ambient light, making things brighter can help the actual CR as well as the perceived CR.


I know some people claim the effect you are talking about with the High Power, but I think this could just be due to the directionality (small viewing cone) helping reduce the effect of reflections off the walls and/or fighting ambient light. Or in other words, helping your ANSI CR even though you thought it was the brighter whites that made you think there was better contrast. Just one theory. I'm not sure if that is the case or the iris effect you mentioned is responsible, though.


--Darin
 

·
Premium Member
Joined
·
1,045 Posts
I agree with all the previous answers: it depends on the room.

If your room is light-controlled, the projector with the lowest absolute black will look better. Your reference point is the darkness of the room around you, so your whites, even if they aren't that bright, will look darn good, and your blacks will be OK, creating the illusion of great contrast. The brighter projector will have whites that look white, but a darker image will look washed out, creating the impression of lower contrast.

But if the room is bright, then your higher-lumen projector will look way better. Your blacks would look blacker and your whites better than on the low-lumen projector.



Bruno
 

·
Registered
Joined
·
475 Posts
One way to look at this is to think about the extremes. A very very very dim image (say 10 lumen projector) would look very low contrast even in a pitch black room because there aren't enough photons hitting the retina to give enough information.


And on the other end, a way too bright, billion lumen projector would also look low contrast; the overpowering light would overload your retina and you wouldn't be able to differentiate the highlights.


So this implies that the ideal light level is somewhere between the extremes. It's probably something like a bell curve where perceived contrast is lower at the extremes of brightness, and reaches a wide peak somewhere in the middle of the brightness range.


So where is that peak? What brightness level yields the best contrast?


One way to approach it is to think about doing some visually difficult task indoors, where you would want the best perceived contrast you could get. For instance, when a surgeon is operating on a patient, they shine a heck of a lot of light on the site. I would think that the intrinsic contrast ratio of a patient's organs is probably in the same range as that of a projector: there are no flat black parts, only dark grays, and no pure whites, only off-whites. If the surgeon finds he can differentiate things most effectively with lots of light, I think that points to lots of light being useful on our screens.


So the answer I think is that if your projected image is less bright than the surgeon's (or dentist's) light makes the patient, then making it brighter will yield an image with more perceived contrast. If your projected image is brighter than that, then you should lower it to improve contrast.


Of course this requires that you let go of the idea that your blacks will be imperceptible. And some people just can't get past that; if they see the black level of their projector they are just unhappy and turn it down, no matter how much it flattens the image on non-black scenes. I was in that category at first when I was transitioning from CRT to digital projectors.


Others, like myself now, tolerate a bit of grayness in the blacks in exchange for brighter scenes that really pop and look 3D.
 

·
Registered
Joined
·
17,711 Posts
I think it would, especially in the case you describe (i.e. bright scenes), but in darker scenes I think it might be the opposite.


Using 0.1 to 100 lamberts for one projector, and 2 to 2,000 for the other:


If a scene is dark (say 10% of peak at the brightest part and 0 at the darkest),


you end up with 0.1 to 10 and 2 to 200. The ratios are still the same, but the delta between the lightest and darkest parts of the image has become much smaller, so there should be less iris change between the two images and the iris effect becomes less important.
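As a sanity check on those numbers, here's a small Python sketch (the 10%-of-peak scene is the example above; the linear scaling is an idealization):

```python
# Scale both projectors' output to a dark scene whose brightest part
# is only a fraction of peak white.

def scene_range(black, white, scene_peak_fraction):
    """Light range for a scene topping out at a fraction of peak white."""
    return black, white * scene_peak_fraction

dim_lo, dim_hi       = scene_range(0.1, 100, 0.10)   # 0.1 to 10 lamberts
bright_lo, bright_hi = scene_range(2, 2000, 0.10)    # 2 to 200 lamberts

# Ratios within the scene are unchanged (100:1 for both)...
print(round(dim_hi / dim_lo), round(bright_hi / bright_lo))       # 100 100
# ...but the absolute spread shrinks tenfold versus a full-range image,
# so there is much less for the iris to react to.
print(round(dim_hi - dim_lo, 1), round(bright_hi - bright_lo, 1))  # 9.9 198.0
```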
 

·
Registered
Joined
·
10,200 Posts
I want 30ft-L whites and 0.001 ft-L (or lower) blacks. In low power mode.
 

·
Registered
Joined
·
10,200 Posts
Seriously, though, Joe, I'm not sure what the answer to your question is. I mean, our perception of luminance is roughly logarithmic. If it were exactly logarithmic, then no, there would be no difference in perceived contrast between your 25ft-L grey bar pattern and your 50ft-L pattern. But "roughly" does mean "roughly", so maybe there is a difference.
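The "exactly logarithmic" case is easy to demonstrate with a toy Weber-Fechner-style model (my simplification; real brightness perception only approximates this):

```python
import math

# If perception were exactly logarithmic, doubling every light level
# would leave every perceived step difference on the gray scale unchanged.

def perceived(levels):
    """Toy log-response model, up to a constant scale factor."""
    return [math.log(x) for x in levels]

steps_25 = [1 + i for i in range(25)]    # 1 .. 25 ft-L pattern
steps_50 = [2 * x for x in steps_25]     # 2 .. 50 ft-L pattern

p25, p50 = perceived(steps_25), perceived(steps_50)
diff_25 = [b - a for a, b in zip(p25, p25[1:])]
diff_50 = [b - a for a, b in zip(p50, p50[1:])]

# log(2*x2) - log(2*x1) == log(x2) - log(x1): the doubled pattern
# would look identical under this model.
print(all(abs(a - b) < 1e-12 for a, b in zip(diff_25, diff_50)))  # True
```

Any visible difference between the two patterns would come from perception deviating from this pure log model (and from iris adaptation), which is exactly the "roughly" in "roughly logarithmic."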


I really do wonder if there is such a thing as "too bright" for a projector. For a daylight scene, the answer is probably no, at least within practical limits of available projectors. But what about for a very dark scene? If my projector's white level is up at 100ft-L, are the caves in the LOTR:FOTR going to look like they're bathed in sunlight? (Even if my projector has super-high contrast?) If so, then the projector is too bright.
 

·
Registered
Joined
·
17,711 Posts
Quote:
are the caves in the LOTR:FOTR going to look like they're bathed in sunlight? (Even if my projector has super-high contrast?) If so, then the projector is too bright.
that's true Michael
 

·
Registered
Joined
·
4,188 Posts
With super high contrast (meaning tens of thousands to one, since it seems to be on the horizon now), your caves would still look dark even with a monster-bright projector, at least about as dark as they'd look in real life. In a real-life dark environment your eyes are adapting, and similarly with the projected scene.


Joe's (and my) setup probably has 50-60 ftL whites. And being LCD pjs, they have gray blacks (in absolute terms), yet the caves certainly don't look bathed in sunlight; moonlight, maybe. If you quadrupled the contrast of our pjs and doubled the brightness, you'd have your 100 ft-lambert whites and far better blacks. In fact, the whole preference for blacks and contrast aside, I'd be curious how many here would prefer a much brighter picture if it didn't come at the expense of anything else. Not for better performance in ambient light or any reason other than heightened realism in bright scenes. A few here have stated they get headaches from brighter setups, but I don't think that would apply to the majority.


I am firmly in Joe's camp on the "bright is good" issue. However, my eyes readjust to low and high brightness on the quicker side, so brightness is not something that would bother me either way. And he might be right; that would explain why a lot of us with the 20HD go for the HighPower in such a big way.


Incidentally, I recently saw ROTK at the Cinerama Dome in Hollywood, where the image was pretty bright for a theater, and I could see the 48 Hz gate flicker (and the damn red spots, as if ANYONE is going to camcord a movie off a curved Cinerama screen). Very telling as far as why the 25 ftL standard is so universal. And why it should not apply to us.


BB
 

·
Registered
Joined
·
77 Posts
This is from The Film & Video Institute....


How Bright?


This begs the question of how bright your picture should be. The SMPTE (Society of Motion Picture and Television Engineers) recommends 16 foot-lamberts for cinema use, but this is measured with a cine projector stopped and no film in the gate. In real life that probably translates to about 9 foot-lamberts during a show. You are lucky if most cinemas actually manage more than 5 foot-lamberts.


The formula is: brightness in foot-lamberts = (lumens / screen area in square feet) × gain.


A projector with a light output of 1,000 lumens is easily available now. Assuming a flat, matte white screen 12 feet by 9 feet, this translates to: brightness = (1,000/108) × 1, which is 9.3 foot-lamberts. So given a decent blackout you can achieve cinema standards on that size of screen.


If the shape of your hall is long and narrow you could go for a high-gain screen (between 1.3 and 3 gain), which raises the brightness to between about 12 and 28 foot-lamberts, so you could consider a larger screen.
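The Institute's formula is easy to turn into a quick calculator; here it is in Python, reproducing the numbers in the quote:

```python
# Brightness in foot-lamberts = (lumens / screen area in sq ft) * gain.

def screen_brightness(lumens, width_ft, height_ft, gain=1.0):
    """Screen brightness in foot-lamberts for a front-projection setup."""
    return lumens / (width_ft * height_ft) * gain

# The 1,000-lumen example on a 12 x 9 ft matte-white (gain 1.0) screen:
print(round(screen_brightness(1000, 12, 9), 1))        # 9.3 ft-L

# The same screen with gains of 1.3 and 3.0:
print(round(screen_brightness(1000, 12, 9, 1.3), 1))   # 12.0
print(round(screen_brightness(1000, 12, 9, 3.0), 1))   # 27.8
```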
 

·
Registered
Joined
·
4,188 Posts
But as has been pointed out many times before, what is the reason for that standard? As a minimum it makes sense. As a maximum, the only compelling argument IMO is to prevent flicker from the projector becoming noticeable.


Since that is not an issue for our setups, what other reason is there for the limitation?


BB
 

·
Registered
Joined
·
7,164 Posts
Quote:
Dark scenes don't always have enough highlight (high IRE) to fool the eye all the time so we can see this "false" Black. It's not believable anymore if the APL is low.
I think that's the point. Quite possibly a "brighter" PJ may look more dynamic with material that has a good overall balance of brights and darks, because the eye will "auto-calibrate" to perceive the dark areas as "black" relative to the bright areas.


But that sensation only lasts as long as the content stays bright with just a small portion of dark areas. Switch to an overall-dark scene (like most of Lord of the Rings) and suddenly you're confronted with the "gray" floor of the image. And as bright as Gandalf's torch may be, those grays still look gray.


p.s. and you don't have to be a finger-puppet-testing neurotic to notice!
 

·
Registered
Joined
·
23,188 Posts
Quote:
Originally posted by Michael Grant
I really do wonder if there is such a thing as "too bright" for a projector.
One of my limits for "too bright" has to do with making the artifacts much more obvious. This is one reason that I am more comfortable making video based HD brighter than film based DVDs, in general.


Also, I recall somebody here mentioning that above 50 ft-lamberts your eyes will just adjust to the brighter whites and they won't really look much brighter at all. This would mean that going from 50 ft-lamberts to 100 ft-lamberts without raising CR wouldn't have much effect on the brightest scenes, but would double the "black" level.


--Darin
 

·
Registered
Joined
·
17,711 Posts
Quote:
With super high contrast (meaning tens of thousands to one, since it seems to be on the horizon now), your caves would still look dark even with a monster bright projector, at least similar to as dark as they'd look in real life. In a real life dark environment, your eyes are adapting, and similarly with the projected scene.
But that is the initial question: what if the CR is the same? I don't think anyone will argue against brighter plus more CR being better, but what if it is just brighter?
 

·
Registered
Joined
·
10,200 Posts
Quote:
Also, I recall somebody here mentioning that above 50 ft-lamberts your eyes will just adjust to the brighter whites and they won't really look much brighter at all.
Oh, right, I forgot about this. There is some point where the irises of our eyes start constricting, and at that point our eyes are counteracting the range of brightnesses our projector is trying to produce. Dark scenes will still look dark, but bright scenes will no longer look proportionally brighter. (Or will they? Do our brains know how much our irises constrict, and take this into account?) This is not unlike the "midnight mode" volume compression that surround-sound receivers provide to reduce the dynamic range of audio.


So wherever that "iris" threshold is, it's probably a good idea to keep our projector's brightness under it---for critical cinema viewing, at least. When watching football, who cares? :)
 