Originally Posted by Mark Rejhon
How's the persistence -- aka MPRT measurements?
Full-array (2D) DLP chips have higher effective persistence (motion blur) than directly scanning-based lasers.
Is the DLP chip a full array, or a line-array (scanline)? If scanline, and a prism moves the scanline for a rolling scan, that should create CRT-like zero motion blur, with even less motion blur than a CRT. If full array, the effective motion blur is probably worse than a CRT's. So I'm curious about the exact nature of the DLP engine.
Note: MPRT in milliseconds is different from GtG pixel response in milliseconds. MPRT is pixel visibility time in milliseconds, and GtG is pixel transition time. They are totally different millisecond numbers.
Mark, nice to see ya here on AVS.
None of these DLPs are low-persistence; low persistence would cut lumens unless they used raster-scanning tech, as you say (which they don't).
DLP is fundamentally incompatible with raster-scanning approaches anyway, since it achieves a given pixel intensity only in aggregate, over time, by duty-cycling the light digitally on/off: the lower the persistence, the lower the effective bit depth. Scanning a single row at a time would give 1/1628th the persistence of a full-frame DMD, which means the mirrors would have to switch 1,628× faster than they currently do to modulate the light in time, and even then a scanned DMD would smear a gradient across the screen.
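To put rough numbers on that (purely illustrative figures, not TI specs): in a binary-weighted PWM scheme, the mirrors have to resolve the least-significant bit plane within whatever time slot each pixel is lit, so shrinking the slot by 1/rows shrinks the LSB budget by the same factor:

```python
# Back-of-envelope: why a row-at-a-time "scanning" DMD is implausible.
# All numbers are illustrative assumptions, not TI specs.

ROWS = 1628          # rows on a hypothetical DMD (per the 1/1628th figure above)
REFRESH_HZ = 120     # internal refresh rate
BITS = 10            # target bit depth

frame_time = 1.0 / REFRESH_HZ        # time to build one frame (s)
row_slot = frame_time / ROWS         # scanline DMD: each row lit only 1/ROWS of the frame

# Shortest bit plane the mirrors must resolve (LSB of binary-weighted PWM):
lsb_full = frame_time / (2**BITS - 1)   # full-array DMD
lsb_row = row_slot / (2**BITS - 1)      # scanline DMD

# Prints an LSB budget of ~8 us full-frame vs ~5 ns per-row: a ~1,628x speed-up.
print(f"full-array LSB time: {lsb_full * 1e6:.2f} us")
print(f"scanline   LSB time: {lsb_row * 1e9:.2f} ns")
print(f"mirror speed-up needed: {lsb_full / lsb_row:.0f}x")
```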
There was even some question about whether these new TI 4K chips had mirrors fast enough to achieve 10-bit colour at 120 Hz with a single chip, since in the past that was only achievable with 3-chip designs (single chips could do only 8-bit at 120/144 Hz). I asked TI about it a while back on Twitter and they confirmed that 4K60 at 10-bit works just fine (meaning internally 120 Hz at 10-bit, with XPR on).
The good thing about DLP is that pixel transition times are effectively nil as a result of how the mirrors operate. Well, not quite: it's more complicated than an LCD cell's transition time, but it's very fast. A bigger issue for some people is that the colours aren't shown at the same time, i.e. these are all grayscale displays, time-multiplexed to fool us into seeing a fused colour image. At 120 Hz I don't think that matters much, though (it would actually be interesting to see whether people who see rainbows on current DLPs can see any on these Faux-K XPR DLPs).
In the era of HDR, lasers, and WCG, projectors need all the lumens they can get, so manufacturers are very unlikely to ever ship one with BFI or rolling-scan low persistence, and both would be problematic for DLP anyway, for the reasons above. On 3LCD or LCoS it could work, especially 3LCD with an LED or laser light source and lumens to spare, which Epsons typically have. But if you're going to implement 120 Hz in order to do 60 Hz BFI, you might as well run 120 Hz native and get double the refresh rate, thus double the smoothness, with no lumens loss either (rather important for HDR10, which has been estimated to need 13,000 lumens on a 120-inch screen to hit 1,000 nits peak brightness). By the persistence formula you came up with on Blurbusters.com, 120 Hz native cuts persistence in half just as well as 60 Hz + BFI (about 8 ms), and without flicker.

These DLPs' strength is how fast they can switch frames, so if someone were enterprising enough to make a 240 Hz or 480 Hz version, that would be great. Although they'd run into an economic problem: who besides ultra gamer geeks would value >120 Hz operation? It's hard enough to get PJ manufacturers to care about 120 Hz native input, let alone 240 or 480.
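The sample-and-hold arithmetic above can be sketched like this (a toy model of your MPRT-as-pixel-visibility-time formula; function names are mine, not from Blur Busters):

```python
# Toy model: for a sample-and-hold display, persistence (MPRT) is roughly the
# pixel visibility time, i.e. (1000 / refresh_hz) * duty_cycle milliseconds.
# Illustrative only; names are invented for this sketch.

def mprt_ms(refresh_hz: float, duty_cycle: float = 1.0) -> float:
    """Pixel visibility time in ms (duty_cycle < 1 models BFI/strobing)."""
    return 1000.0 / refresh_hz * duty_cycle

def relative_lumens(duty_cycle: float) -> float:
    """Light output scales with on-time for a strobed/BFI light source."""
    return duty_cycle

for label, hz, duty in [("60 Hz sample-and-hold", 60, 1.0),
                        ("60 Hz + 50% BFI",       60, 0.5),
                        ("120 Hz native",         120, 1.0)]:
    print(f"{label}: MPRT ~{mprt_ms(hz, duty):.1f} ms, "
          f"lumens {relative_lumens(duty) * 100:.0f}%")
```

Both 60 Hz + 50% BFI and 120 Hz native land at ~8.3 ms of persistence, but only the BFI route pays the 50% lumens penalty.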
The first and biggest problem with all these projectors is getting native 120 Hz at 1080p minimum working at all. If you really cared about BFI on a laser or LED projector, you could probably hack in a switch to cut the light output for half of each frame, with a phase knob to sync it to the display. Some LED DLPs are dirt cheap these days and would be ripe for such experimentation. Although that'd cut the lumens in half, and those are precious indeed.
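A minimal sketch of that hack's logic, assuming a light source you can gate and a vsync reference to phase-lock to (all names and numbers here are invented for illustration):

```python
# Hypothetical light-gating logic for DIY BFI on an LED/laser projector:
# blank the light source for half of each refresh period, with a phase
# knob to line the dark interval up with the panel's frame flip.
# All names and numbers are invented for illustration.

REFRESH_HZ = 60
PERIOD = 1.0 / REFRESH_HZ   # one refresh period in seconds (~16.7 ms)
DUTY = 0.5                  # light on 50% of the time: half persistence, half lumens
PHASE = 0.002               # the "knob": delay (s) from vsync edge to light-on window

def light_enabled(t: float) -> bool:
    """True while the light source should be on; t = seconds since last vsync."""
    pos = (t - PHASE) % PERIOD
    return pos < DUTY * PERIOD
```

Average light output is just DUTY, which is the lumens-halving tradeoff mentioned above.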