Adding my quick input, as I just got one.
First, reasoning for choosing it:
My prior projector is a Sony VPL-HW40ES. By most criteria the Sony is unquestionably a better projector, so keep that in mind, but a few things led me to try the HD27HDR instead:
First, probably 70% of my usage is gaming, primarily PC. The Sony is already really low-lag, but the reported numbers for the Optoma are better, and 120hz is a huge bonus for gaming - honestly something I'd basically given up on ever seeing in a projector, but here it is.
Second, I live in a small apartment with no dedicated room, and the living room is almost entirely glass on two walls. I have automated blackout blinds, but with so many windows that still means limited light control, so I'm betting the boosted light levels will help more than the Sony's better color accuracy.
Next, HDR is nice and should add some good pop to newer games, and if downscaling 4k to 1080p doesn't add too much lag, then for less time-sensitive games rendering at 4k and downscaling is effectively supersampling AA (I already do this some of the time using AMD Virtual Super Resolution and it works really well at jaggy elimination, but a lot of games aren't compatible).
Also, the apartment has a projector mount, but being an apartment, it is where it is - no option to move it.
Finally, I use it a LOT, and the Sony's bulb life isn't as great as advertised - I've been through 5-6 bulbs, and after about 2000-2500 hours the flicker has gotten intolerable every time. And yes, that includes direct-from-Sony OEM bulbs, which were no better in my experience. Since the OEM ones didn't help, I dropped down to cheap-o bulbs in the $100 range, so honestly it wasn't prohibitively expensive, but replacement is a pain... With this one the bulbs should be cheap even from the OEM, and the bottom-facing bulb replacement door will simplify that a LOT.
So with all those criteria, my real unicorn is a 4k laser in the 2500-actual-lumen range, 120hz capable, with sub-20ms input lag. That doesn't exist right now - even though all those specs exist independently, 4k low-lag is exceedingly rare, and none I've found are lasers. Luckily this checks a LOT of those boxes already, is quite inexpensive, and fits my needs for positioning. So my expectation is for this projector to be an inexpensive stopgap that serves until the unicorn comes into being.
Installation was easy since I already had wiring, mount, screen, etc. - mostly just adjusting the mount plate for the screws on the Optoma and it was done.
It just works, no problem, no firmware update needed. For reference, system FW is listed as "C03" and MCU as "M04". For validation, I set the PC to output 1080p@120hz, then checked using the frame skipping checker ( https://www.testufo.com/frameskipping ) - and can confirm, no gaps or dotted lines, so it's projecting a true 120hz signal. Yay!
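As an aside, if anyone wants a rough software-side sanity check of the refresh rate, here's a minimal sketch, assuming the Python glfw bindings (pip install glfw). Note this only confirms what rate the GPU is actually vsyncing at - it can't catch the projector silently dropping frames, which is exactly what the testufo page is for:

    import time
    import glfw

    # Open a GL window with vsync on, then time buffer swaps: with vsync,
    # each swap blocks until the next refresh, so the average interval
    # approximates the real refresh period.
    glfw.init()
    window = glfw.create_window(640, 480, "refresh check", None, None)
    glfw.make_context_current(window)
    glfw.swap_interval(1)  # enable vsync

    intervals = []
    prev = time.perf_counter()
    for _ in range(240):
        glfw.swap_buffers(window)
        glfw.poll_events()
        now = time.perf_counter()
        intervals.append(now - prev)
        prev = now
    glfw.terminate()

    steady = intervals[20:]  # skip warm-up frames
    avg = sum(steady) / len(steady)
    print(f"avg frame interval {avg * 1000:.2f} ms -> ~{1 / avg:.0f} Hz")

At 120hz you'd expect to see something near 8.33ms / ~120 Hz.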
I see lots of questions in this thread about whether 120hz 1080p 3d works - I'm happy to try stuff if anyone wants to point out what to try. I have a few pairs of DLP-Link glasses coming Monday, plus Stereoscopic Player and a few games that support HD3D, etc., but if the interest is in Nvidia's proprietary 3d, I'm... just not an Nvidia guy, let's say, so I can't help there. What I can say is this: I'm using AMD HD3D (which IIRC is frame-packed). If I set the desktop to 1080p@24hz and open the 3d THX demo vid in Stereoscopic Player 2.4.3, set to HD3D and full-screened, the projector shows the expected overlapped images and the res/refresh tag shows 38-something by 20-something at 24 (I don't recall the specific res, but I'm pretty sure it's the normal one for 1080p frame-packed). If I set desktop refresh to 60, it still bumps to 24 when I full-screen in Stereoscopic Player, and when I set desktop to 120 it just goes into 'searching' mode. Probably not very meaningful; I'm guessing this has more to do with how Stereoscopic Player decides what refresh to use, which in turn is likely a factor of HD3D specifically.
- Starting with a baseline: the old projector was reviewed at 24ms lag ( https://www.projectorcentral.com/son...ge=Performance ). I'm testing against an Asus VS247 monitor with a published 11ms lag (per displaylag.com). I set the Sony into game mode, turned off all the bells and whistles, and took a series of measurements ( using: http://www.lagom.nl/lcd-test/response_time.php ), throwing out any where the full numbers were not visible or were muddled. Differences were: .013, .019, .014, .015, .014 - an average of 15ms, meaning my measurement comes out to 26ms against the reviewed 24ms. Close enough.
Now let's try this on the Optoma - 1080p@60hz non-HDR (because we need to clone with a 60hz display), 'enhanced gaming' on: .027, .014, .000, .000, .014, .014, .000, .014 - an average of .010, so a total measured lag of .021 (and since my old measurement read .002 high, more like .019)... Of course, at this low a level, screen-comparison lag measurement variance goes through the roof, since it depends where in the screen cycle the pic is taken. I guess the takeaway here is that I can confirm it's fast!
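For anyone who wants to sanity-check my math, here's the arithmetic above as a quick Python sketch (the sample values are my photographed stopwatch differences in seconds, and the 11ms is the reference monitor's published lag - treat it as a sketch of my method, not a rigorous tool):

    # Each sample is (projector stopwatch reading - monitor stopwatch reading)
    # from one photo of the cloned displays, in seconds.
    def measured_lag_ms(samples, reference_lag_ms=11):
        avg_ms = sum(samples) / len(samples) * 1000
        return avg_ms + reference_lag_ms

    sony = [0.013, 0.019, 0.014, 0.015, 0.014]
    optoma = [0.027, 0.014, 0.000, 0.000, 0.014, 0.014, 0.000, 0.014]

    print(measured_lag_ms(sony))    # ~26 ms (vs. the reviewed 24 ms, so my method reads ~2 ms high)
    print(measured_lag_ms(optoma))  # ~21 ms, call it ~19 ms after that correction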
One thing I really wish I could test is the 120hz lag, but since my weaksauce lag-testing method involves cloning, I could only do that with a second 120hz monitor, which I don't have. So hopefully it's even better? By logic it certainly should be, since a significant component of lag is the time it takes to actually refresh the physical screen - at 60hz that's between 0ms at the top of the screen and ~17ms at the bottom, while at 120hz that bottom number drops into the 8ms range. So I guess it depends on how you measure lag. From a tech-spec perspective, the meaningful part of lag is the time between receipt of the top of the frame and the beginning of the frame draw, which is independent of refresh rate, but every means of measuring has some component of draw time. It's interesting that the more I learn about screen lag, the fuzzier the definition of the spec gets!
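To put quick numbers on that scan-out point (just the arithmetic, assuming the image is drawn top-to-bottom over one refresh interval):

    # Worst-case scan-out contribution to a lag reading taken at the
    # bottom of the screen, for each refresh rate.
    for hz in (60, 120):
        print(f"{hz} Hz: 0 ms at the top of the screen, up to {1000 / hz:.1f} ms at the bottom")
    # 60 Hz -> up to ~16.7 ms; 120 Hz -> up to ~8.3 ms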
In any case, actual impactful lag, to a user, is phenomenally low for a projector (competitive with many gaming monitors), and even better (from a user impact perspective) if you bump to 120hz.
With all that done, here's another interesting test, though I'm going to preface this with a big, bold, I'M NOT CONFIDENT OF THE ACCURACY OF THE FOLLOWING TEST!
So another thing I'd be interested in is how much lag the downscaling from 4k adds. Again, the trouble is that I don't have a 4k monitor to use as a cloned reference. But I do have an AMD card with Virtual Super Resolution (the video card renders at a set resolution, then scales down to match the display). So I set VSR on for the Asus monitor, set the desktop to 1440p (for some reason cloning would fail when set to 4k, but 1440p should still be downscaled, so it's worth a try), and tried again - knowing that this introduces an unknown: how much time, if any, the VSR scaling adds. I believe it is nil-to-negligible, but I don't know that for certain, so unless that gets confirmed, take these results with a grain of salt. What we're comparing here is the display lag of the projector, with the projector scaling, vs. the known display lag of the monitor (11ms), with the video card scaling (assumed 0 but not certain). Yes, this is a messy, messy test.
But the results are: .029, .029, .014, .028, .028 - so this looks more like the 40ms range once you add in the 11ms lag of the reference monitor. Note on the unreliability of this test: if scaling for the reference monitor *does* introduce additional time, it would mean instead of 40ms it's some higher number, adjusted by the time taken by the VSR scaling process. Subjectively, I did some test gaming in 4k (Forza Horizon 3 and Doom 2016) - the lag was just enough that, explicitly trying to feel it, I think I could tell in Doom, but I probably wouldn't have noticed otherwise; in Forza I couldn't tell at all. So the number seems consistent with the experience - in other words, for twitchy gaming and competitive online play you'd want to stick with 1080p (especially 1080p@120hz), but 4k is very usable for single-player non-twitch gaming if you want to maximize visual appeal.
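Same arithmetic as the earlier test, for reference - and again, this assumes VSR's own downscale adds essentially nothing, which is the shaky part:

    # Photographed differences (seconds): the projector scaling the 1440p signal
    # itself vs. the reference monitor showing VSR's downscale of the same desktop.
    vsr_samples = [0.029, 0.029, 0.014, 0.028, 0.028]
    avg_ms = sum(vsr_samples) / len(vsr_samples) * 1000  # ~25.6 ms
    print(avg_ms + 11)  # ~37 ms total once the monitor's 11 ms is added back in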
Next up, I tried some 4k and HDR - I'm lumping those together because they're all kinda tainted by HDCP. Setting to 4k and playing back video worked fine, but the short of it is I could not, under any circumstances, get Netflix to play back in 4k (or HDR). I tried with both Edge and the Netflix app, directly connected with a new high-bandwidth HDMI cable from the PC to the projector - no luck. Windows does recognize HDR and opens up all the HDR options (in Display Settings, 'Play HDR games and apps' is on, and under 'Windows HD Color Settings', 'Display Capabilities' shows 'Yes' on all 3: 'Stream HDR video', 'Play HDR games and apps', and 'Use WCG apps'). I'm on the top-tier Netflix subscription. Hardware is compliant (Ryzen 2700X & Vega 64 - some old web articles claim you must have an Intel processor, but AMD states Ryzen has supported UHD Netflix since May, and all drivers are fully up to date). My internet is gigabit and tests at 3ms ping with up and down speeds over 400mbit... Yet the Netflix stats overlay (ctrl-alt-shift-d) shows the resolution at 1080p, always. I'm sure it's some software issue (thanks, DRM asshats!), but whatever the reason, I've been unable to test any protected content, which includes most 4k and HDR video. I have an Xbox One S coming in a bit and will try from there, likely with better luck.
Others have also mentioned washed-out color. And yeah... coming from the Sony, which had amazing color, it's definitely a significant step back. However, out of the box it was *MUCH, MUCH* worse than it had to be, for two reasons: Windows, and BrilliantColor. BrilliantColor is probably the biggest culprit here. Such a piss-poor name for what it actually is - it's brilliant*WHITE*, really. I don't claim to know in depth how it works, but what it appears to do is extend the time on the white segment of the color wheel to give additional brightness. But frankly, it looks like ARSE.
With my limited light control, I found it useful to have it turned up to 2 or 3 to counteract the ambient light (accepting some loss of color fidelity), but stock it was 5, I think?? WTF, Optoma!
To any of you who don't understand what the others mean by washed-out color: go here ( https://upload.wikimedia.org/wikiped...-blue_flag.svg ), then while looking at that image, turn BrilliantColor up and watch it turn UGLY.
Plus, another thing that contributes to the impression of washout is how Windows handles HDR. When you enable HDR in Windows, it does some weird things to standard-dynamic-range content. I was tempted to turn off HDR altogether because I couldn't get SDR content looking good for a while. That was also before I figured out how much BrilliantColor was jacking with things... Once I got that set, and got the SDR brightness readjusted, the colors were OK. Still a disappointment coming from the Sony, but these are very different projectors targeted at very different usage, and this is a disappointment I pretty much expected (like being annoyed that your new WRX isn't as smooth a ride as your old Cadillac). Also, note that yesterday a Windows Update triggered that significantly changed what was in my HDR options. That was right in the middle of testing, so frankly I'm not sure whether it was just a matter of MS moving things around or a real change to Windows' handling of HDR.
A few other minor complaints:
One thing that's a mild but consistent annoyance: this projector is really slow to lock in a resolution. This was often annoying in testing, because each time you change res or refresh, Windows asks to confirm, and if you don't within a certain time it reverts - often the confirmation was about to expire before the screen finally got sync'd, leaving only a second or two to click. Not a major thing, admittedly, but a small annoyance.
Also, others have mentioned audible color wheel whine, and yep, it's definitely audible - a constant drone probably in the 16kHz range, so high-pitched it's right at the edge of human hearing. It's not loud enough that I can't tune it out, but it's definitely there. When you first start the projector it 'spins up' and is much more audible in the first few seconds, which draws attention to it, so I notice it more as a result.
Finally, brightness uniformity is definitely down from what I'm used to - on a white screen the left side is noticeably darker than the center. But while it's noticeable on a white screen, even then it's not severe, and on actual content I don't notice it.
On the other hand, the fan is audible but still *almost* as quiet as the Sony's (which is a freaking DREAM from a fan-loudness perspective), and the unit is much smaller - which, given the size of my apartment, is nice for making the room feel more open, tall ceilings or not. Plus, with relatively cheap OEM bulbs and a bulb door that's accessible without un-mounting the projector, I expect to be happy keeping this in the living room until my unicorn springs forth from Zeus's forehead or whatever.
So yeah, this is never going to be competitive with projectors that are more focused on picture quality, but it's got some MAJOR advantages as a gaming projector, and still looks pretty great.
Oh, and if anyone's interested in buying the Sony, PM me (especially if you're in Seattle and can save me figuring out how to ship this behemoth!)