This is just a question out of curiosity, seeing as I have never witnessed a variable iris projector in action. If the contrast was measured at complete black and complete white, wouldn't the image fall apart for material that doesn't completely fill the screen, such as 2.35:1 or 4:3?
I have access to a DLP projector (NEC LT260K, so not geared for HT) and get rather annoyed that the black bars on widescreen material are noticeable in a darkened room. Imagine a bright scene on 2.35:1 material: surely the black bars at the top and bottom would be a greyish colour, since the iris has to stay open for the bright picture area? Similarly, for 4:3 material the side bars would also be grey whenever the main image is bright, which is common for a lot of TV material.
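To put rough numbers on why a global iris can't fix this, here's a small sketch. The figures (2000:1 native panel contrast, 200 nits peak white) are illustrative assumptions, not measurements of any particular projector; the point is just that an iris scales the whole image, so the bar brightness is pinned to the brightest thing on screen divided by the panel's native contrast.

```python
# Why a global (whole-image) iris can't darken letterbox bars in a
# bright scene. All numbers below are illustrative assumptions.

NATIVE_CONTRAST = 2000     # assumed native on/off contrast of the panel
PEAK_WHITE_NITS = 200.0    # assumed peak white luminance at the screen

def bar_luminance(scene_peak_fraction: float) -> float:
    """Luminance of the 'black' letterbox bars for a given scene.

    A global iris can only close as far as the brightest scene element
    allows, so it scales the whole image by scene_peak_fraction. The
    bars then sit at that scaled white level divided by the panel's
    native contrast ratio.
    """
    iris_white = PEAK_WHITE_NITS * scene_peak_fraction
    return iris_white / NATIVE_CONTRAST

# Bright 2.35:1 scene: the iris must stay wide open, so the bars glow
# at the panel's full native black level.
bright_bars = bar_luminance(1.0)   # 0.1 nits of visible grey
# Dark scene: the iris can close, and the bars drop with it.
dark_bars = bar_luminance(0.1)     # 0.01 nits

print(f"bars in bright scene: {bright_bars:.3f} nits")
print(f"bars in dark scene:   {dark_bars:.3f} nits")
```

So the dynamic iris only buys you deeper blacks when the whole scene is dark; in a bright scene with black bars you're stuck at the native contrast, which matches what you see on the LT260K.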
I am guessing that the Sony demonstrations use carefully selected demo material so as not to draw attention to the iris's deficiencies under challenging and demanding conditions. Please correct me if I am way off base.
What would be even better is an iris that could adjust on a per-pixel basis, but at that point you may as well just focus on creating a new technology altogether, something HT-oriented.
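For what it's worth, the per-pixel iris idea is essentially dual modulation: a second light modulator in series with the panel. If each stage attenuates independently, their contrast ratios multiply rather than add. Again, the numbers here are purely hypothetical assumptions for illustration:

```python
# Hypothetical per-pixel iris (dual modulation) sketch: two modulators
# in series attenuate the light multiplicatively, so their contrast
# ratios multiply. All figures are illustrative assumptions.

PANEL_CONTRAST = 2000   # assumed native contrast of the imaging panel
IRIS_CONTRAST = 100     # assumed dynamic range of a per-pixel iris stage

def effective_contrast(panel_cr: int, iris_cr: int) -> int:
    """Combined on/off contrast of two independent modulation stages.

    Darkest output = 1/(panel_cr * iris_cr) of brightest output,
    because each stage attenuates the light passing through it.
    """
    return panel_cr * iris_cr

combined = effective_contrast(PANEL_CONTRAST, IRIS_CONTRAST)
print(f"effective contrast: {combined}:1")  # 200000:1
```

Crucially, because the second stage works per pixel, the bars in a bright 2.35:1 scene get the full combined contrast instead of being held hostage by the brightest pixel on screen.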
It'd be great if one day there were one definitive technology with in excess of 15000:1 CR, long bulb life, no screen burn, no focus issues, no convergence issues, automatic adjustment for ambient lighting conditions, properly calibrated colour, multiple sync frequencies for those of us running both PAL and NTSC material, multiple resolutions... the list goes on. One can only dream.