Originally Posted by imbloodyskint
So a lot of these quoted 20-50-100-300,000:1 contrast ratios is a lot of BS from the companies.
Now why doesn't that surprise me.
I wouldn't quite say that, but you do have to take specifications with the appropriate grain of salt. For example, JVC projectors actually do hit their rated specs, even those in excess of 100,000:1. That's why it's always a good idea to check out objective reviews to see what the reality is.
Originally Posted by motorman45
Well, maybe 4000:1 is just OK, and you're correct that cinema has some ambient light to deal with to some degree, but the human eye cannot see contrasting images much beyond 10,000:1 ratios. If a white part of the screen image is 100% and a black image in the same shot is 10,000 times less, that is 0.01%; much more contrast than that is only theoretical.
Rather than go into this in this thread I'll direct you to an older thread about contrast:
I could summarize by saying that, based on the NASA data, something on the order of 10 million:1 would be usable/visible in a home theater.
I build spectral instruments and made one for testing my 3D system during development, and getting it to measure the extinction ratio of my 3D system was a real challenge at 0.1%. Now, I admit our eyes are a lot better than instruments, but when I hear 50,000:1 and higher contrast ratios I know it can't be measured, yet some people "see" a difference. It's a perception issue sometimes as well.
It's definitely tricky to measure, but it's possible. The easiest way I know of is to reduce the size of the screen area you measure off of, or to measure directly from the projector; this increases the minimum measured light level, which is usually the harder part. There's also the option of measuring in steps: if you measure 100 IRE:50 IRE and then 50 IRE:0 IRE, you can multiply the two ratios to get the total contrast.
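Just to illustrate the stepped-measurement arithmetic, here's a minimal sketch. The luminance readings are hypothetical example numbers (not real meter data); the point is that each step only asks the meter to resolve a modest ratio, while the product recovers the full on/off contrast.

```python
# Stepped contrast measurement: multiply the intermediate ratios.
# All luminance values below are hypothetical readings in arbitrary units.

lum_100_ire = 50.0   # hypothetical full-white (100 IRE) reading
lum_50_ire = 2.0     # hypothetical mid-gray (50 IRE) reading
lum_0_ire = 0.001    # hypothetical full-black (0 IRE) reading

step1 = lum_100_ire / lum_50_ire  # 100 IRE : 50 IRE ratio
step2 = lum_50_ire / lum_0_ire    # 50 IRE : 0 IRE ratio

# The mid-gray term cancels, so the product equals the full ratio:
total = step1 * step2             # same as lum_100_ire / lum_0_ire
print(f"{total:.0f}:1")           # → 50000:1
```

The meter never has to span the whole dynamic range in one reading, which is exactly why this trick helps with very high contrast ratios.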
But this is all way OT for this thread.