Originally Posted by CaptinCrunch
Sony has gotten a lot of negative feedback about how it implemented Dolby Vision on their TVs. The preset DV modes could be added to the X900F, since all they are is preset modes. No different than if you were to write down the settings off one of those 950G's and add them to your custom profile.
Does anyone have both a 900F and a 950G for comparison? I'm curious how much brighter this Dolby Vision Bright setting is, and whether it's much different from just raising the Gamma on the 900F while in DV mode (the clipping of blacks over HDMI is a separate issue). After spending several months with this set and watching various types of content (which matters a great deal, since film colorists and editors currently differ widely in how they implement HDR), I've come to really appreciate how it's handled on the 900F. While I understand that there are definitely brighter TVs out there that allow more punch with the HDR effect, I think the 900F is actually very capable. To appreciate it, you've got to get over the initial impulse to play your SDR content at eye-searingly bright levels (which is hard to resist, because it just looks SO much bolder than what we've been used to before). It wasn't until I embraced a slightly dimmer SDR experience that the picture really locked in for me.
Two stories related to this:
I initially had the Picture set to Brightness: 25, then eventually settled at 20 and stayed there for many months. After seeing several posted configurations that dialed that WAY down, I finally landed on a compromise of 13. I watch TV primarily in a bright room, so this seems appropriate. Once I got used to the way this looked on the TV (which doesn't take long, as long as you stop flipping back and forth between settings), I happened to visit my in-laws, who have my old Panasonic plasma in their house with my original calibration intact. I was struck by how much brighter the 900F at 13 seemed compared to what I had been used to for so many years! It was almost literally night and day.
Then, to drive the point home further: a couple of months ago, when my wife and I went to see Captain Marvel at our newly renovated local theater, which has a high-quality projector installed, I was amazed at just how dim THAT screen was (it was a 2D showing, just for reference), even compared to Dolby Vision content at home. This is the experience Dolby Vision is attempting to replicate in your home, and in my experience the theater was still far dimmer than what I am getting on my 900F. And note, in a dark movie theater with a huge screen, your eyes adjust to the picture in no time and the brightest whites read like a very bright day outside. It's really all relative.
If any of you are familiar with performing or recording classical music, I would compare the way Dolby Vision works to how dynamics should be appreciated. Older audio recordings seem quieter to our ears, but really they just respected dynamics: when a classical piece is supposed to be very quiet, you can only barely hear it, but as it rises to its climax, it gets so loud that you might reach for the volume knob. Newer recordings have flattened out a lot of this dynamic range so you can hear all of the parts without adjusting the volume on your earbuds. But if you've ever been to a live orchestra concert, you know that pianissimo can be so quiet that everyone holds their breath, and fortissimo can be as loud as a rock concert!

Basically, DV sets a brightness range for the film you are watching based on the capabilities of the TV. With HDR, the brightest light displays at the brightest nits the set can produce, and the darkest blacks at the darkest it can reach; everything else is then placed on a scale in between. If it artificially pushed up the brightness of mid-range colors, it would kill contrast and you would essentially end up with clipping; it would look like SDR content with the brightness jacked up. The whole point of HDR is to see those extra-bright peaks that have never before been possible in home entertainment because of the limited brightness range of older screens. Now you can have a bright sky, but the sun can poke through it as it does in real life. In a rainy scene, the headlights of a car or a flashlight can beam through the haze with clarity. And lightning and fire appear hotter and more dangerous than ever!
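To make the idea above concrete, here's a toy Python sketch of that kind of mapping: mid-range luminance passes through untouched (preserving contrast), and only the highlights above a "knee" get compressed into the panel's remaining headroom instead of clipping. This is NOT the actual Dolby Vision algorithm (which is proprietary and metadata-driven); the function name, knee ratio, and nit values here are all illustrative assumptions.

```python
def tone_map(scene_nits, display_max=1000.0, content_max=4000.0, knee_ratio=0.75):
    """Toy HDR tone-mapping curve (illustrative only, not real Dolby Vision).

    Luminance up to the 'knee' passes through unchanged, preserving
    mid-range contrast; highlights between the knee and the content's
    mastering peak are linearly compressed into the panel's remaining
    headroom instead of being clipped.
    """
    knee = knee_ratio * display_max            # e.g. 750 nits on a 1000-nit panel
    scene_nits = min(scene_nits, content_max)  # nothing exceeds the mastering peak
    if scene_nits <= knee:
        return scene_nits                      # mid-range left intact
    # Compress [knee, content_max] linearly into [knee, display_max]
    t = (scene_nits - knee) / (content_max - knee)
    return knee + (display_max - knee) * t
```

So a 100-nit well-lit interior stays at 100 nits, while a 4000-nit sun specular rolls off to the panel's 1000-nit peak. Jacking up the mid-range instead would be like adding more terms to the pass-through branch: everything gets brighter, and the headroom that makes highlights pop disappears.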
That's not to say that I think the 900F has the PERFECT implementation of HDR and Dolby Vision. I certainly think that if Sony bothered to tweak a couple of things, it would come close, though.
Here's an experiment to try if you're thus far underwhelmed by HDR and Dolby Vision: take a movie you have available in Dolby Vision and find a section of middle brightness, maybe a well-lit indoor scene. Take a picture of the screen, or just flip back and forth between SDR and HDR if possible (this is easy on an Apple TV: just disable Match Dynamic Range; if that's not an option, try a movie you own in 4K but can also stream in HD elsewhere, though that will be a less exact match). Then adjust the brightness in your SDR picture settings until the mid-ranges roughly match the Dolby Vision copy; this will likely mean a much lower Brightness setting. Watch SDR content with those settings for a few days, and you will probably adjust fairly easily. Then watch something in Dolby Vision, and be wowed by the extra-bright highlights that occasionally appear, bringing the image to life!
Sorry for the ramble-y post; I guess I have opinions on this!