These TI chips are quite good / powerful.
It's confirmed they can do 10-bit at 4K60, not just 4K24 / 4K30. Why does this matter? Because the wobulation works by showing two subframes in 1/60th of a second, which means they are effectively delivering 2.7K x 120Hz x 10-bit. 10-bit at 120Hz is a big deal, and normally 3-chip-DLP-only territory (i.e. very expensive). On DLPs, the higher the framerate, the lower the effective bit depth: digital projection is binary and uses time-slicing / duty-cycling to achieve its bit depth, so the effective bit depth on a DLP mostly comes down to mirror speed vs. framerate. Older single-chip DLPs can't do 10-bit at 120Hz or in 3D. But these new chips can...because wobulation at 60Hz is equivalent to no wobulation at 120Hz.
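The pixel math behind that claim can be sanity-checked in a few lines. This sketch assumes TI's 0.66" 4K UHD DMD, whose published mirror array is 2716 x 1528 (the "2.7K"); the numbers are illustrative back-of-envelope arithmetic, not a measurement:

```python
# Back-of-envelope check of the wobulation math.
# Assumes TI's 0.66" 4K DMD: 2716 x 1528 mirrors.
dmd_w, dmd_h = 2716, 1528
uhd_w, uhd_h = 3840, 2160

mirrors_per_subframe = dmd_w * dmd_h        # native "2.7K" mirror count
subframes_per_frame = 2                     # wobulation: two shifted subframes
addressed_per_frame = mirrors_per_subframe * subframes_per_frame

print(mirrors_per_subframe)                 # 4150048
print(addressed_per_frame)                  # 8300096
print(uhd_w * uhd_h)                        # 8294400 -- slightly fewer than addressed

# Flip it around: 4K60 via 2 subframes == 2.7K at 120 subframes per second.
internal_refresh = 60 * subframes_per_frame
print(internal_refresh)                     # 120
```

So two 2.7K subframes per 1/60 s address slightly more pixel positions than one true UHD frame, which is why 4K60 wobulation and 2.7K@120Hz are the same mirror workload.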
The UHD65 seems like it may be the winner for 3D enthusiasts, if it can actually scale up 1080p and then feed the wobulation tech to render 120Hz 3D at 4K (upscaled). And if that doesn't work, the fact that these run at 120Hz internally and deliver 1:1 pixel mapping (overlapped) means they might even be able to accept 120Hz in 2D OR 3D (with a video processor for LCD glasses sync) through my pixel packing shader.
This will require a bit of work on my end to write a virtual display driver which Windows would see as a native 2.7K 120Hz screen (or dual 2.7K 60Hz screens for 3D, or over-under or side-by-side, for games / media players that support those modes). Then you can do PC-based FI for 1080p 3D sources at 60Hz per eye in SBS without losing resolution.
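To make the packing idea concrete, here's a toy sketch of the SBS-to-frame-sequential split: one side-by-side 3D frame in, two sequential eye frames out for a 120Hz frame-sequential display. The function name and shapes are purely illustrative, not any real driver or shader API:

```python
import numpy as np

def sbs_to_frame_sequential(sbs_frame):
    """Split an H x (2*W) x 3 side-by-side frame into left/right eye frames.

    The left half would be shown on even 120Hz ticks, the right half on
    odd ticks (glasses sync handled elsewhere, e.g. by a video processor).
    """
    h, w2, c = sbs_frame.shape
    w = w2 // 2
    left = sbs_frame[:, :w]     # left-eye subframe
    right = sbs_frame[:, w:]    # right-eye subframe
    return left, right

# Full-resolution SBS input: 1080p per eye, 3840 wide total
frame = np.zeros((1080, 3840, 3), dtype=np.uint8)
left, right = sbs_to_frame_sequential(frame)
print(left.shape, right.shape)  # (1080, 1920, 3) (1080, 1920, 3)
```

The point is simply that full-width SBS preserves the full 1920x1080 per eye, which is what makes PC-side FI on the unpacked streams lossless resolution-wise.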
I'm super excited about these TI chips, for hackability.
Although for superior 3D, one might consider the new 1080p Dell 7760 laser DLP projector with 3D AND 5400 lumens. It's under $3K. It probably has no built-in FI, but that's OK for 1080p sources since one can always rip 2D / 3D Blu-rays to disk and do FI on the PC instead. It's a hassle, but...5400 lumens for 3D is killer. Plus it's laser.
If my W1070 contrast mod works, I might even consider buying the Dell instead and doing the same procedure to get awesome HDR (these DLPs all natively accept 10-bit SDR signals, but it's possible to do tone mapping and temporal dithering to get HDR).
HDR is just a display curve, yo. The important thing is native contrast and peak lumens. And even crappy on/off DLPs can apparently deliver stunning HDR10 due to high ANSI contrast. But imagine 5400 lumens of laser powa...awesome.
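A minimal sketch of the tone-map + temporal-dither idea: map HDR linear light into the panel's 10-bit range, then spread the quantization error across frames so the eye averages it out into extra effective bit depth. The Reinhard curve and per-frame random dither here are illustrative choices of mine, not any product's actual pipeline:

```python
import numpy as np

def hdr_to_10bit(linear_hdr, frame_idx):
    """Tone-map HDR linear light to 10-bit codes with temporal dithering.

    linear_hdr: float array, scene-referred linear light (illustrative units).
    frame_idx:  frame counter; seeds a different dither pattern per frame.
    """
    tone_mapped = linear_hdr / (linear_hdr + 1.0)   # simple Reinhard, 0..1
    levels = tone_mapped * 1023.0                   # scale to 10-bit range
    rng = np.random.default_rng(frame_idx)          # fresh noise each frame
    # Adding uniform [0,1) noise before floor() rounds up with probability
    # equal to the fractional part -- the temporal average recovers it.
    return np.floor(levels + rng.random(levels.shape)).astype(np.uint16)

# A flat patch at linear 0.4 maps to ~292.29 -- between two 10-bit codes.
frames = [hdr_to_10bit(np.full((4, 4), 0.4), i) for i in range(200)]
print(float(np.mean(frames)))   # hovers near 292.29, the un-quantized level
```

Each individual frame only holds codes 292 or 293, but averaged over time the display reproduces the in-between value, which is exactly the trick for wringing HDR-ish gradation out of a 10-bit SDR input path.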