Originally Posted by ttn333
The only thing I can remember is it was just dimmer. I thought it was the projector, as this is my first experience with a projector. When I switched to the Xbox One X, it was the same, which only reinforced my initial thoughts. I was actually looking into a higher-gain screen when I picked up the Panny. The resolution wasn't the problem; it was the brightness, and hence the colors were severely hindered.
Originally Posted by (CinemaScope)
I've been running some tests with the HDR mode enabled and disabled on the Sony Player.
A few things I noticed.
1. Leave all video settings on Auto, except for Resolution, which I set to 4K. If you choose a color format like 4:4:4, you will see a slightly darker image. Combine this with HDR and it looks like crap. It's extra dark. Makes you not even want to enable HDR at all. Kills the effect completely.
2. Disable HDR, and let the UHD60/65 do the HDR. This is extremely huge; the difference it makes is crazy. Colors pop, and the brightness and contrast are there. But I feel like some bright areas of movies look too bright, to the point where some details get washed out.
I have tested all the HDR modes with this and couldn't find one I feel confident using throughout all my movies. Some scenes look great, some look just OK, and some make you want to turn HDR off. I'm still testing, along with adjusting some modes on the projector.
3. If you set HDR on the Sony player to Auto and turn your contrast all the way up to the +40ish range, the brightness for HDR content looks decent. But why would anyone want to set their contrast that high?
I think the problem sits with the player. All my 1080p content looks bright and the colors pop. If you turn HDR off on the player and the projector, UHD movies look bland and have no color pop to them. I'm sure some of these things can be adjusted, but should you have to?
Overall, the Sony player is just not a good idea for HDR content out of the box. It seems like you need to keep adjusting things and resorting to reference discs like Disney WOW or AVS Rec709 to get an idea of what to calibrate when HDR is enabled on the player.
I'm still giving the Sony a few more tries. Ideally I would like to leave the HDR mode on Auto. Choosing SDR to HDR on the UHD60/65 feels unnatural.
Originally Posted by (CinemaScope)
Upon further testing, I realize I will need to use the SDR to HDR option when playing 4K discs. It's the only way HDR will look nice enough to watch.
I've tried adjusting brightness/contrast with various YouTube videos and some calibration discs I have. Nothing helps when HDR is enabled on the Sony player. All content just looks too dark to actually enjoy the movie. The colors are where I want them, but it's just too dark.
I picked up The Patriot on 4K UHD and compared the 4K version with the Blu-ray version. It's comical to see the Blu-ray version beat the 4K when it comes to brightness and color pop. There's seriously something wrong with the UBP-X800.
I will try a player or two from Best Buy to hold me over until I see some better players come out. I cannot justify $500 for the Panny right now.
Also waiting for a 4K calibration disc. Nobody makes one yet.
The Sony (and Xbox) is actually doing it correctly. The reason it's dim is that HDR is mastered for displays like flat panels, which can do much higher brightness (600, 1,000+ nits) than projectors. What the player is doing is cramming, say for simplicity's sake, a 1,000-nit mastered UHD HDR Blu-ray down into a projector's 100-nit space. So what would be 100 nits on a 1,000-nit display would only be 10 nits on a projector with a 100-nit peak capability. See how much dimmer that would look?
What has to happen is something called "tone mapping". A tone map takes that 1,000-nit HDR Blu-ray and maps it to what the display (a projector, in this case) is actually capable of, so brightness levels land at their proper places for that device. This brings the "brightness" of each scene back up to the same relative level it has on the master, or even elevates it in some instances (this is up to the creator of the tone map) to better serve low-nit displays.
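To put some rough numbers behind those two paragraphs, here's a toy Python sketch (my own illustration; the constants, the Reinhard-style curve, and the function names are assumptions for demonstration, not what any player's firmware actually computes) comparing the naive linear squeeze against a simple tone map:

```python
MASTER_PEAK = 1000.0   # nits the disc is graded for (for simplicity's sake)
DISPLAY_PEAK = 100.0   # roughly what a projector can put on screen

def naive_scale(nits):
    """No tone mapping: squeeze the whole 1,000-nit range linearly into
    100 nits. A 100-nit scene on the master ends up at just 10 nits."""
    return nits * DISPLAY_PEAK / MASTER_PEAK

def tone_map(nits):
    """Reinhard-style curve (one of many possible tone maps): keep dark
    and mid-tones near their mastered brightness and roll off only the
    highlights the projector can't reproduce anyway."""
    x = nits / DISPLAY_PEAK              # luminance in display-peak units
    w = MASTER_PEAK / DISPLAY_PEAK       # master peak in those same units
    y = x * (1 + x / (w * w)) / (1 + x)  # extended Reinhard operator
    return y * DISPLAY_PEAK

for scene_nits in (10, 100, 500, 1000):
    print(f"{scene_nits:>5}-nit master -> "
          f"naive {naive_scale(scene_nits):6.1f} nits, "
          f"tone-mapped {tone_map(scene_nits):6.1f} nits")
```

Note how the naive scale drops a 100-nit scene to 10 nits, while the tone map keeps it around 50 nits and still lands the 1,000-nit peak exactly at the projector's 100-nit ceiling. Real tone maps in players and projectors use more sophisticated curves, but the idea is the same.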
Tone mapping can happen in the source or the display, but you shouldn't do it in both, and you DEFINITELY should NOT turn off HDR in the source, turn on fake, simulated HDR in the display, and then think you're actually getting real "HDR"! I mentioned this previously somewhere, but a good analogy would be taking a nice Dolby 9.1 surround audio source, down-converting it to stereo PCM, outputting that to your Dolby Surround AVR via HDMI, TOSLINK optical or whatever, and then re-upconverting that stereo audio back into simulated Dolby 9.1, 7.1, etc. surround sound for your theater's speaker setup! If you do that, what you're hearing isn't true discrete audio channels, right? That's the same thing you're doing, but with video: you're not getting the true HDR image that's mastered in the source. See the issue there?
As I said, tone mapping can be done in the source in some instances, like the Oppo 203 and 205 UHD Blu-ray players or the Lumagen Radiance Pro video processor, or it can be done at the display. This is why you see owners of the JVC D-ILA and Sony SXRD projectors creating custom gamma curves and settings, along with the manufacturers' own attempts at doing this (Sony's HDR Contrast slider, for example, which the Panasonic UB900 UHD Blu-ray player has too). Epson has also released a few firmware updates for its Home Cinema 4000 and 5040/6040UB lines, which give you brighter options to select for HDR-sourced video. I like to think I had a hand in that with the custom "tone mapping" I created a while ago, initially for the 5040UB, since many complained it was too dark, just like folks are saying here. I called it "HarperVision", and you can search that model's owners thread for details. I then expanded it to play full HDR even on so-called "non-HDR" displays like the Epson LS10000 laser projector, as well as on other HDR-capable ones like the LS10500.
The best results so far have by far been with the 5040UB (HDR capable) and my latest experiment, the BenQ LK970 laser XPR DLP, which is marketed as "non-HDR" but whose results after HarperVision are absolutely incredible! You just need a device that can force the HDR signal to be sent (Oppo 203, etc.) or one that tricks the source into thinking it's connected to a fully HDR-compliant display, like the HDFury Vertex or Linker devices.
The two best UHD Blu-ray players that I know of are still the Oppo 203/205 (great customizable HDR-to-SDR tone mapping) and the Panasonic UB900 (HDR contrast slider). Panasonic is coming out with a new one soon though, the UB820 I think it's called. Oppo is going out of the hardware business, unfortunately.
Originally Posted by cliptags
Just looking for some advice. I currently have a BenQ W1070+ and am considering an upgrade to the UHD60. I've got a bit of a squashed room; I currently get about 120-125 inches diagonal on the BenQ at a distance of about 11 feet. Optoma's website seems to indicate I'd only get about 110 inches diagonal. I just wanted to check if that's correct (Optoma's tool isn't the most clear!) and, if so, is it worth losing 10-15 inches (and 1,600 pounds!) for the upgraded picture? I'm currently using a 65-inch LG OLED for my 4K content and it looks phenomenal! Can only imagine what it would look like blown up to that size!
Thanks in advance!
I recommend you get the new UHD51A instead. It has a much shorter throw distance, so you'll be able to get a larger image in that same 11'. It also uses an RGBRGB color wheel, which delivers much better contrast and color than the UHD60's RGBY wheel; the RGBY wheel is designed to be brighter for high-ambient-light rooms, but gives up contrast and color saturation to get there. The UHD51A can also do 3D from sources like Blu-ray players, which the UHD60/65 can't. It's also Amazon Alexa enabled, if that matters, and has a totally retooled GUI that looks more modern and up to date.
Shoot me a PM if you want more info so we don't get off the topic of this thread.