Originally Posted by img eL
Shouldn't Sony have the lens-making process down by now, so it wouldn't be such a high expense?
My understanding is the process of making lens elements has been "down" for many, many years now, so there's not a lot of process improvement to be had. You've got to grind glass smooth, and that takes time, lots of time for really high quality optics.
Originally Posted by mark haflich
My problem is with reviewer variance.
Seriously, a lens can be slower (a higher effective f-stop at both ends) than a faster lens of similar resolving power. Going just one stop slower, say from f/2.8 to f/4, can yield something like a ten-fold savings in cost. The larger the diameter of the lens, the larger the sweet spot too, allowing the use of more lens shift without getting into the area of curvature and CA. Decreasing the number of ED or other named low-dispersion glass elements can save a bundle as well, and so can using plastic elements. Any lens on a 4K projector should be capable of resolving the grid; the question is how clearly each pixel will be resolved across the screen, something akin to pixel sharpness, the MTF rating of the lens, which is another name for contrast resolution. But once you sit even a small distance away from the screen, your eyes wouldn't be able to detect much, if any, difference. It's not like the lens has to be so sharp as to resolve a fine CH for measurement. Before one asks, CH means Canine Hair. Shut up and clean your mind out.
No way will the lens be the same, and you can take that statement to the bank.
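Quick aside on that f/2.8 vs f/4 cost point: the entrance pupil diameter is focal length divided by f-number, so going one stop slower shrinks the aperture diameter by a factor of about 1.4 and halves its area, meaning noticeably less large-diameter glass to figure and polish. Here's a minimal sketch of the arithmetic; the 50 mm focal length is an assumption purely for illustration, and the ten-fold cost figure is Mark's estimate, not something this math proves.

```python
# Aperture arithmetic for one f-stop: f/2.8 vs f/4.
# The focal length is an assumed illustrative value, not a real projector spec.
import math

focal_length_mm = 50.0  # hypothetical focal length

for f_number in (2.8, 4.0):
    diameter = focal_length_mm / f_number     # entrance pupil diameter
    area = math.pi * (diameter / 2.0) ** 2    # aperture area
    print(f"f/{f_number}: diameter = {diameter:.1f} mm, area = {area:.0f} mm^2")

# One stop slower roughly halves the aperture area:
ratio = (2.8 / 4.0) ** 2
print(f"area ratio f/4 vs f/2.8: {ratio:.2f}")  # ~0.49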
That's an interesting thought... Even the worst (er, least expensive) 1080p projectors today can resolve the pixel grid (as far as I know), but today we worry about just how well that grid is resolved, since we sit close enough that we can really see each pixel (even if we can't see the space between them). We care that the edges are sharp and that the contrast between pixels is high. On the other hand, 4K will be so dense, pixel-wise, and the pixels will be so small (each of your 1080p pixels gets cut into quarters), that you have to ask: will it really matter how clearly each individual pixel is resolved?
I've gone through the math before: we can see more detail than 1080p delivers. Another way to look at it is that 1080p pixels, at "normal" seating distances like 3 picture heights, are just big enough that we can actually see individual pixels (the spaces in between are too small, though, which is why nobody complains), but 4K pixels will be beyond that threshold unless you sit really close.
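For anyone who wants to check that math, here's a minimal sketch. It assumes the usual 20/20 figure of roughly one arcminute of resolvable detail; the acuity threshold and the seating distances are illustrative assumptions, not measurements.

```python
# Angular size of one pixel at a given seating distance, compared against
# ~1 arcminute of 20/20 visual acuity (an assumed threshold for illustration).
import math

ACUITY_ARCMIN = 1.0  # rough 20/20 resolvable detail

def pixel_arcmin(rows, picture_heights):
    """Angular height of one pixel, in arcminutes, when seated at
    `picture_heights` times the screen height (screen height cancels out)."""
    angle_rad = math.atan((1.0 / rows) / picture_heights)
    return math.degrees(angle_rad) * 60.0

for rows in (1080, 2160):
    for ph in (3.0, 1.5):
        a = pixel_arcmin(rows, ph)
        visible = "visible" if a >= ACUITY_ARCMIN else "below acuity"
        print(f"{rows}p at {ph} picture heights: {a:.2f} arcmin ({visible})")
```

At 3 picture heights a 1080p pixel subtends just over one arcminute, right at the threshold, while a 2160p pixel subtends half that; you'd have to move in to around 1.5 picture heights before 4K pixels become individually visible again.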
This opens up some interesting possibilities. You could argue the current "high end" machines (JVC, VW95, LSx) have optics good enough for 4K already; take a look at this from Mark P's investigation:
The newer JVCs resolve pixels very well too (RS35 here)
Just imagine going in and dividing each pixel into four: the edges at 1080p are so sharply defined that even at 2160p you'd still have very well defined pixels, and the CA on these machines is so low that even at 2160p it would be essentially negligible.
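One way to picture why sharp 1080p edges should carry over: model the lens blur as a Gaussian point-spread function and compare the contrast between adjacent on/off pixels at a 1080p pixel pitch versus half that pitch. This is a toy model, not a measurement of any actual projector; the pixel pitch and blur width below are assumed values for illustration.

```python
# Toy model: Michelson contrast of an alternating on/off pixel pattern
# after Gaussian lens blur, at a 1080p-like pitch vs half that pitch (2160p).
# Pitch and blur sigma are assumed illustrative values, not measured specs.
import numpy as np

def contrast(pixel_pitch_um, blur_sigma_um, samples_per_um=10):
    n = int(20 * pixel_pitch_um * samples_per_um)   # 20 pixels of pattern
    x = np.arange(n) / samples_per_um
    pattern = ((x // pixel_pitch_um).astype(int) % 2).astype(float)  # on/off pixels
    # Gaussian blur kernel standing in for the lens PSF
    k = np.arange(-5 * blur_sigma_um, 5 * blur_sigma_um, 1 / samples_per_um)
    kernel = np.exp(-0.5 * (k / blur_sigma_um) ** 2)
    kernel /= kernel.sum()
    blurred = np.convolve(pattern, kernel, mode="same")
    mid = blurred[n // 4 : 3 * n // 4]              # ignore window edges
    return (mid.max() - mid.min()) / (mid.max() + mid.min())

sigma = 1.5  # assumed blur width, in microns at the panel
for pitch, label in ((7.0, "1080p-like pitch"), (3.5, "2160p-like pitch")):
    print(f"{label} ({pitch} um): contrast = {contrast(pitch, sigma):.2f}")
```

Contrast drops at the finer pitch but doesn't vanish, and how much survives depends entirely on the PSF you plug in, which is exactly the per-pixel sharpness question Mark raised.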
Of course, the pixel density also opens up some other intriguing possibilities. Effectively, a 4K machine is a 1080p machine with sub-pixels, meaning you can now do real half-pixel alignment without electronic tricks: full-pixel convergence on an HW100ES would be like being able to do half-pixel convergence on a 95ES.
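To make the sub-pixel idea concrete, here's a minimal sketch: if you treat each 1080p pixel as a 2x2 block of 4K pixels, a one-pixel shift on the 4K raster is exactly a half-pixel shift in 1080p terms, no interpolation required. The tiny array is just a stand-in for a frame.

```python
# Sketch: a one-pixel shift on a 4K panel is a half-pixel shift in 1080p terms.
import numpy as np

hd = np.arange(6).reshape(2, 3)          # stand-in for a 1080p frame (tiny for demo)
uhd = np.kron(hd, np.ones((2, 2), int))  # each 1080p pixel becomes a 2x2 sub-pixel block

# Shifting the 4K raster by one native pixel moves the image by exactly
# half a 1080p pixel, with no resampling or "electronic tricks" needed.
shifted = np.roll(uhd, 1, axis=1)

print(uhd)
print(shifted)
```

So converging a 4K panel to within one native pixel already gets you half-pixel accuracy relative to the 1080p image it's carrying.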
Backing up a bit to summarize: not having actually seen anything in person, I'd guess there's a high probability that, since I'd be putting a 4K machine on the same size screen at the same seating distance as my current 1080p machine, I would not be disappointed with a 4K machine that had the same optics I've got today.