With my motorized electric drop-down screen, I'm finding it fine to just retract it a bit when I have a 2.35:1 movie.
Most of my watching is in 16:9. I watch movies too, sure, but way more TV shows and video games (which aren't likely to support 2.35:1, at least not on consoles; trust me, I know, I do that for a living, and it's always shot down when I mention supporting a variable aspect ratio or custom resolutions).
I used to want an A-lens. Then, after a deal to get one for free fell through, I decided that when I get a 4K projector, instead of one expensive unit with an A-lens, I'll get two inexpensive ones and use them for passive 4K x 120 Hz 3D plus the zoom method.
The benefits of this strategy are:
a) if one of them needs servicing, or a bulb blows and I'm out of spares, I'm not without a unit (I use my BenQ W1070 as my main monitor);
b) HDMI 2.0 will not support the bandwidth requirements for 4K/120, so that'll have to be some kind of DisplayPort-to-demuxer setup, unless I can do it from the video card with two HDMI 2.0 outputs, one for each eye. An A-lens would never work with that, and I shudder at the idea of having two projectors and aligning two A-lenses. I know some setups have that, but they're laser-aligned and found in very top-end cinemas like IMAX and whatnot.
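For what it's worth, the bandwidth math backs that up. A rough back-of-the-envelope check (assuming 8-bit RGB and ignoring blanking overhead, which only makes things worse):

```python
# Rough check: does 4K @ 120 Hz fit in HDMI 2.0?
# Assumptions: 3840x2160, 8-bit RGB (24 bits/pixel), no blanking overhead.
width, height, refresh, bits_per_pixel = 3840, 2160, 120, 24

required_gbps = width * height * refresh * bits_per_pixel / 1e9

# HDMI 2.0 carries 18 Gbit/s on the wire, but 8b/10b TMDS encoding
# leaves roughly 14.4 Gbit/s for actual pixel data.
hdmi20_effective_gbps = 18 * 8 / 10

print(f"needed: {required_gbps:.1f} Gbit/s, available: {hdmi20_effective_gbps:.1f} Gbit/s")
# prints "needed: 23.9 Gbit/s, available: 14.4 Gbit/s"
```

Even before blanking intervals, 4K/120 needs roughly 24 Gbit/s of pixel data, well past what HDMI 2.0 can deliver, so two separate links (one per eye) really is the workaround.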
The only thing that'd get me to change my mind for 4K is if anamorphic Blu-rays come out with that new encoding technique from Panamorph. If that gets standardized, I suspect not only will anamorphic lenses make a big splash, but their company stock will go through the roof. And at that point, you'll probably see some investment in native 2.35:1 projectors too.
Can anyone tell me what's the best processor to take a 1080p 120 Hz 3D signal from a computer and split it into two 1080p/60 2D streams, one for each eye? Can it be done in software or via the video card?
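On the software side, the core operation is just de-interleaving: a frame-sequential 120 Hz 3D stream alternates left/right frames, so splitting it into two 60 Hz streams means routing even-numbered frames to one output and odd-numbered frames to the other. A toy sketch of the idea (labels stand in for frames here; a real tool would do this per-frame on the GPU or in a video pipeline):

```python
def split_frame_sequential(frames):
    """Split a frame-sequential 3D stream into (left, right) streams.

    Assumes the stream starts with a left-eye frame and alternates
    L, R, L, R, ... at 120 Hz, yielding two 60 Hz streams.
    """
    left = frames[0::2]   # even-indexed frames -> left eye
    right = frames[1::2]  # odd-indexed frames -> right eye
    return left, right

# Toy usage with labeled stand-in frames:
stream = ["L0", "R0", "L1", "R1", "L2", "R2"]
left, right = split_frame_sequential(stream)
print(left)   # prints ['L0', 'L1', 'L2']
print(right)  # prints ['R0', 'R1', 'R2']
```

Whether the first frame is left or right eye depends on the source's signaling, so a real splitter would also need a way to flip the parity if the eyes come out swapped.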