Originally Posted by darinp
If they could somehow have say 90% fill ratio on the chip and focus each pixel so that the fill ratio of each flash was less than 50% at the screen I could see that working, but cannot imagine an optical system that would achieve that.
With RGB scanning lasers you could re-introduce the concept of interlaced (native) scan, rather than progressive scan, and then draw 50% of the lines one frame then 50% the next.
You could double the framerate this way, but it would cut vertical resolution AND lumens in half, which is bad (especially the lumens part). Interlacing was a smart idea at the time (CRTs), but, at least at the signal level, if you need to cut your bandwidth requirements in half today, you would do it via YCbCr 420 instead and only lose chroma resolution, not RGB* resolution.
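To make the comparison concrete, here's a back-of-envelope sketch (my numbers, assuming 1080p at 8 bits per sample) showing that interlacing and 4:2:0 subsampling both halve the bandwidth, but give up different things:

```python
# Bits per frame for three 1080p encodings (assumed 8-bit samples).
W, H, BITS = 1920, 1080, 8

full_444 = W * H * 3 * BITS                # progressive, full RGB / 4:4:4
interlaced = W * (H // 2) * 3 * BITS       # one field: half the lines, full color
sub_420 = (W * H * BITS                    # full-resolution luma plane
           + 2 * (W // 2) * (H // 2) * BITS)  # two quarter-resolution chroma planes

print(full_444 / interlaced)  # 2.0 -- but you lost half the vertical resolution
print(full_444 / sub_420)     # 2.0 -- you only lost chroma resolution
```

Both paths free up exactly half the bandwidth; 4:2:0 just spends the loss where the eye is least sensitive.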
Any time you drop fill ratio you lose lumens, so it's not a good idea to do that deliberately in a projector, especially when you're also trying to boost peak brightness to achieve higher HDR peaks, which is right now their biggest flaw compared to TVs (I'm talking LCoS projectors here), IMO.
*speaking of tricks to double refresh rate w/o dropping luma res in half: this is IT. They could have done this for 1080p / 60Hz 3D modes: instead of losing half the vertical or horizontal RGB resolution to SBS or O/U modes, you could simply cut chroma res to 420 and pack two frames into an over-under layout. At the time, nobody thought 420 was worth pursuing, which is unfortunate. 420 only became officially supported with HDMI 2.0 hardware, although it's certainly possible to output it over HDMI 1.4 hardware (Kepler GPUs, for instance, added 420 via a driver update to allow 4K60 over HDMI 1.4).
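The samples-per-pixel arithmetic behind that over-under idea is simple enough to write down (my sketch, not any shipped signaling scheme):

```python
# Samples per pixel for each pixel format.
SAMPLES_444 = 3.0               # Y, Cb, Cr (or R, G, B) all at full resolution
SAMPLES_420 = 1.0 + 2 * 0.25    # full-res luma + two quarter-res chroma = 1.5

# How many 4:2:0 frames fit in the bandwidth of one 4:4:4 frame?
frames_per_container = SAMPLES_444 / SAMPLES_420
print(frames_per_container)  # 2.0 -- one full-luma-resolution frame per eye
```

So a single 1080p60 4:4:4 link could carry both eyes at full luma resolution, trading away only chroma detail instead of half the spatial resolution like SBS/O-U frame packing does.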
All this talk of achieving perfect 4K for data-grade projection has been discussed to death. The problem with these 4K DLPs isn't a lack of static resolution for movies / games, it's the low-ish contrast, lack of 3D, no WCG filter, still using lamps, size, noise, no 120Hz mode even in 1080p, etc. A WCG filter could enhance perceptual contrast ratio by trading lumens for a wider gamut, and as we all (should) know, more saturated primaries appear brighter, so the black floor would drop more than the (perceived) white point.
But from what I could see, the UHD65 doesn't appear to have a WCG filter. Also, they dropped their CES lumens rating from 3000 to 2400, anyone notice that? I guess they realized it was a stretch pretending it was that high (likely not even 2400 either). I'm still hopeful some of my wishlist might make it through, and if the Pixelworks FI is decent I might grab one of these so I can start watching HDR content. Then again, it seems foolish to buy an HDR projector without WCG, or even to make one; what is up with that? Epsons have had cinema filters for many years now, and it costs literally twenty dollars to add one.