240hz is well past the flicker fusion threshold, so it's not unreasonable to say "at once". Though technically it's not true, it's kind of true from the perspective of human perception, since we aren't robots that see the world at 1000hz, or at any exact framerate (our vision is framerate-less, but still temporally limited in terms of the frequencies it can resolve).
Tuan, please, if you talk to the engineers there, get them to release a 1080p 120hz version of this projector with HDMI 2.0a inputs. All DC3 projectors can technically do 120hz, including this one, since 720p120 is supported, and that content has to be upscaled internally. So in reality, this projector can definitely display 1920 x 1080 pixels at 120hz; it's just held back by a needlessly limited signal path. The 0.47 inch DLPs that do 240hz internally have dual DMD controllers adding up to a 600 MHz pixel clock (300 MHz each), so if this particular projector had even one of those same modern controllers, it could accept 300 MHz content, which is basically what 1080p 120hz in HDR10 is.
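To sanity-check that ~300 MHz figure, here's a rough sketch of the pixel-clock arithmetic, using the standard CTA-861 timing for 1080p120 (a 2200 x 1125 total raster, the same timing as 1080p60 with the refresh rate doubled):

```python
# Pixel clock for 1080p120 per the CTA-861 timing:
# total raster = 2200 x 1125, i.e. 1920 + 280 horizontal blanking,
# 1080 + 45 vertical blanking.
h_total = 2200
v_total = 1125
refresh_hz = 120

pixel_clock_hz = h_total * v_total * refresh_hz
print(pixel_clock_hz / 1e6)  # 297.0 MHz, right at that ~300 MHz budget
```

(10-bit HDR10 raises the link bandwidth on the wire, but the pixel clock of the raster itself stays at 297 MHz.)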
This is all academic, and I did recommend this projector in a blog post I wrote on Blurbusters.com, for console or PC gamers on a budget. It's a toss-up whether the Acer G550 is a better buy, given that one doesn't support HDR (AFAIK
...does it?). I'm not sure whether 8.3ms is that much better than 16ms, but 120hz is clearly a massive improvement over 60hz, and for fast action videogames it's far more noticeable and important than simulated 4K, any day.
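For the motion clarity side of that argument, here's a minimal sketch of sample-and-hold persistence blur (the panning speed is just an illustrative number, not a measurement from either projector):

```python
def persistence_blur_px(tracking_speed_px_per_s, refresh_hz):
    # On a full-persistence (sample-and-hold) display, the eye tracks a
    # moving object while each frame is held static, so the perceived
    # blur width is roughly tracking speed x frame duration.
    return tracking_speed_px_per_s / refresh_hz

speed = 960  # px/s, an illustrative panning speed
print(persistence_blur_px(speed, 60))   # 16.0 px of smear at 60hz
print(persistence_blur_px(speed, 120))  # 8.0 px at 120hz, half the blur
```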
Still, I love HDR too, so it bugs me that after waiting all this time to upgrade my 1080p projector, I'm still forced to choose between low contrast faux-K with 50-80ms lag, 120hz support without HDR, or HDR support without 120hz. Only the latter two are worthwhile upgrades, but I think I'll wait for a projector that offers 1080p 120hz HDR10 with low lag (16ms or less).
As an engineer, I would be remiss if I didn't at least say it's worth nudging the engineers at Optoma to support 120hz somehow, and Acer to add HDR10 decoding somehow. Seriously, all these projectors support SDR in 10-bit and have enough bandwidth to pass it, even at 1080p 120hz I believe, and the main difference between SDR and HDR on a projector is the dithering pattern and the lookup table. DLPs are linear light modulators, so adding a new dithering table and EDID mode should be about as simple as adding a new gamma decoding value, like 2.7, to the menu. I've seen firmware updates on my W1070 add extra refresh rates and 3D modes even years after the initial launch, so I think it's in the realm of possibility. If the Acer adds an HDR10 decoding mode, I'm buying it. I'd much rather have 1080p 120hz with 2000:1 native contrast and 8.3ms lag than faux-K at 700:1 and 50ms lag, any day of the week. For one thing, it's easier to drive 1080p 120hz at a constant FPS, and for fast action games it will definitely look sharper than 4K60 due to less motion blur and persistence.
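To illustrate the "it's mostly a lookup table" point: HDR10 decodes with the SMPTE ST 2084 (PQ) transfer function rather than a power-law gamma, and the decode table could be generated along these lines (a sketch of the standard PQ EOTF math, not Optoma's or Acer's actual firmware):

```python
def pq_eotf_nits(v):
    # SMPTE ST 2084 (PQ) EOTF: maps a normalized code value v in [0, 1]
    # to absolute luminance in nits (cd/m^2), peaking at 10,000 nits.
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    p = v ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

# Build a 1024-entry decode LUT, one entry per 10-bit code value:
lut = [pq_eotf_nits(code / 1023) for code in range(1024)]
print(round(lut[1023]))  # 10000 nits at the maximum code value
```

That's the same shape of change as swapping in a new gamma curve: precompute a table once, then index into it per pixel.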
Anyway, thank you Optoma for pushing the envelope a bit. I do think this one has good value given its price and HDR10 decoding on a DC3 chip. Just for next year's iteration, please consider adding the features we gamers are asking for, plus laser or HLD light sources and a P3 gamut. 1080p is fine for many of us. Low prices, great gaming performance, and decent native contrast would make me upgrade in a jiffy.
This projector at least accepts 4K HDR signals and downscales them internally though, right? I don't see how HDR10 is going to be used otherwise, except for streaming sources like Netflix or games. And for consoles, I wonder if they'll output HDR at 1080p. Anyone know?