Is that "85%" figure the amount of light lost through the LCD panel in its open state, or the average light loss measured over time across the on/off transitions? I ask because if it's the latter, it's really saying that on top of the 50% of the total light lost while the shutter is completely closed, the panel itself eats another 35% of the total light, so over any given second you only get 15% of the light through that goggle.
This distinction, between the light lost when the LCD panel is "open" and the average light measured over time as it alternates on/off, is important.
By definition, with dual-image stereo each shutter can only be open 50% of the time, because the two channels have to share the time slice. So even if we developed LCD eyewear with perfect open-state transparency, you'd still have a literal 50% loss; better and better LCD eyewear design can only get us closer to that number.

But here's the most important point of what I'm suggesting and what I've read: I'm not just talking about perceived brightness in terms of the way our eyes respond logarithmically. I'm talking about a different mechanism of perceived brightness, because *one eye* is seeing a full-on image while the other eye is darkened.
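To make the arithmetic above concrete, here's a minimal sketch; the transmittance numbers are illustrative assumptions, not measurements of any particular eyewear:

```python
# Time-averaged transmission of active shutter eyewear:
# each eye's shutter is open at most half the time (duty cycle 0.5),
# and the LCD panel absorbs some light even in its open state.

def average_transmission(open_transmittance: float, duty_cycle: float = 0.5) -> float:
    """Fraction of scene light reaching one eye, averaged over time."""
    return open_transmittance * duty_cycle

# A hypothetical "ideal" shutter that is perfectly clear when open
# still passes only 50% of the light on average:
print(average_transmission(1.00))   # 0.5

# If the measured time-averaged figure is 15% (an 85% loss), the
# open-state panel must itself be passing only 30% of the light:
print(average_transmission(0.30))   # 0.15
```

This is just the two readings of the "85%" figure side by side: if 85% is the time-averaged loss, the open-state panel loss has to be much worse than the headline number suggests.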
Example: if you put on sunshades that filter out 50% of the light onto both eyes, you see a dimmer world.
Now, without sunshades, close one eye. Do you see a "dimmer" world?
Yet in both cases the average light reaching your combined two-eye vision is cut by exactly 50%.
Your brain doesn't say "well, one eye is dark and the other eye is bright, so let me average the perceived brightness between the two." It simply trusts the eye that sees the active image. And this isn't wired to the physical act of closing your eyelid: if you leave both eyes open but cover one with an eye patch, your brain still sees the world as "bright" as the eye that has an image.
That's what I mean by *alternating* left/right stereoscopic images perhaps actually having the chance to look brighter than the measured fact that you're missing 50% of the light over any given span of time... because when the image *is* on, it's on at full brightness for one eye, and the brain may use that, rather than the on/off average, to gauge its perception of how bright the image is.
That's most likely why folks say they perceive less light loss with active shutter eyewear: aside from the light lost in transmitting through the LCD panel when it's open, the brain is able to see the image as "bright" as it appears in the on state.
Polarized is a different story because both eyes are, by definition, only getting 50% of the light and never more (and of course probably less with real-world filter designs).
The biggest issue re light loss is how long each shutter is open, time-wise. This is not a limitation of the shutter itself but is directly related to the hold time for each flash on the screen. I'm not sure how refresh rate enters into this, but I suspect it does. A few posts earlier there was a link concerning this (the link was indicated by bolded text in the post); it showed how little time a shutter is actually open. Obviously the transmissivity of LCD glasses can be improved, but there is little that can be done about how long a shutter is open.
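A rough way to model the point above: on a frame-sequential display each eye nominally gets half of the refresh period, but the shutter can only open once the correct frame is fully drawn and stable, which shrinks the usable window. This is a sketch under assumed numbers (the 2 ms settle time is hypothetical, not a spec for any real panel):

```python
# Rough model of how display hold/settle time limits the shutter-open
# window on a frame-sequential display. All numbers are illustrative.

def open_fraction(refresh_hz: float, settle_ms: float) -> float:
    """Fraction of total time one eye's shutter can stay open, given
    the time (settle_ms) the panel needs before the image is stable."""
    frame_ms = 1000.0 / refresh_hz
    half_frame_ms = frame_ms / 2.0            # time slice per eye
    usable_ms = max(half_frame_ms - settle_ms, 0.0)
    return usable_ms / frame_ms

# At 120 Hz each eye gets ~4.17 ms; a hypothetical 2 ms settle time
# leaves the shutter open only ~26% of the time, not the ideal 50%:
print(round(open_fraction(120.0, 2.0), 2))   # 0.26
```

Under this model the shutter's duty cycle is capped by the display's hold time, not by the eyewear, which matches the point made in the post above.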
It would be interesting to see how the speed of the back-and-forth on/off switching affects perceived brightness. Does anyone have any objective data?