
Registered · 641 Posts
Not following. I don't see an HDR Video Vivid picture mode on my display.
How Sony did the Vivid option is that you have to go to the reset screen to set it to Vivid mode while you are viewing HDR content. It uses the cool color temp and the Color setting at 60, I think. If you want to revert to your original settings, do not do a reset; just change your color temp back to Expert and bring the Color back to 50.

1. Make sure you are playing HDR video.
2. Go to Advanced Settings.
3. Go to the Reset menu.
4. Select Reset (HDR Video Vivid).

Do not select the regular Reset (it will erase your regular Expert1 or Expert2 color settings back to default). HDR Vivid mode will only show up while HDR video is playing.




I found it in a review; if I had not read the review, I would never have known that the TV had two modes. However, I tweaked it a bit and removed the Black Adjust, etc. I go between your color temp and a Vivid setting bright enough to burn my eyes. I also tweaked it to remove edge enhancement. It is very bright: 880 nits. Most of the time I leave it on default using your settings, but once in a blue moon I use the Vivid mode; the colors really pop.




"On our Bravia KD75XD9405BU review unit, peak brightness in [HDR Video] mode actually hit 796 cd/m2 out of the box, but because [G-Gain] needed to be adjusted downwards to neutralise greyscale to D65, peak brightness dropped slightly to 740 cd/m2 after calibration. If you don’t mind sacrificing colour accuracy and video fidelity, peak brightness could actually be boosted to 880 cd/m2 by engaging [HDR Video Vivid] mode. Of course, this entailed [Black Adjust] “High”, [Adv. Contrast Enhancer] “High“, [Colour Temperature] “Cool” and [Live Colour] “High“, not to mention excessive edge enhancement brought about by [Reality Creation], all of which constituted an eyesore to our videophilic brain. Full-field peak white settled at 360 cd/m2 in [HDR Video] mode on our review sample."

Sorry for the bad quality pic

http://www.hdtvtest.co.uk/news/kd75xd9405-201606164301.htm
 


Registered · 1,587 Posts
I have the DIRECTV 4K setup. Yes, you need both the HR54 and a Mini client to view the 4K. As was stated earlier, the HR54 will receive 4K, but can't stream it to the TV. There are a couple of things I disagree with, though. In my setup, I have the HR54 on my bedroom TV and the Genie Mini client on the 940D. I can access my DVR recordings with full functionality from both locations. There is a very slight lag on the client, but it only shows up when you use the 30-second FF button to skip commercials...and the lag isn't dramatic...it just isn't as smooth and seemingly instantaneous as it is on the HR54 directly. The only thing I've found that I can't do with the client is the split-screen picture-in-picture. That only works with the HR54. As far as content, I did have to upgrade my programming package, but only one level. It costs me $5 more/month. I did NOT have to go to the top-tier package...I'm still a couple of levels down from there. There are only 3 channels currently available in 4K. One is always on with demo-type material and short documentaries, one is PPV movies, and the last is only on sometimes with live events like Notre Dame football and, most recently, Garth Brooks' concert from Yankee Stadium. Hope this helps your decision.
To clarify from my earlier post, I wasn't commenting on lag in any way. I was specifically pointing to not having full functionality of your DVR on what is almost certainly your "main" TV. For me, it would be irritating beyond belief to A) have to move the HR54 to another location (the other locations are really not roomy enough to accommodate it) and B) have to operate it from that secondary location when I want to do things like change default record settings, remove or update series recordings, cancel items from the To Do List, or order PPV movies.

When I inquired about 4K programming (somewhat recently), I was told you had to have the top tier programming. I just looked it up and it seems that 4K is available on any package from XTRA up (Ultimate, Premier).

Still not interested in going through all of the hassle only to have ONE channel to watch in 4K.

What's more, I think that the picture quality of the set is fantastic. Any quality 1080 recording looks amazing. I've pointedly watched a couple of UHD DVDs, and aside from the blacks being fantastic, I do not observe a "marked" increase in PQ.
 

Registered · 211 Posts
It's 10/100 which is ridiculous! That said, I've been able to stream 4K content from YouTube, Netflix, and Amazon over a wired connection.
It does seem ridiculous that, in this day and age and at the price of this set, the Ethernet connection is only 10/100. But 100Mbps should still be more than enough for any 4K streaming.
 

Registered · 1,587 Posts
It's 10/100 which is ridiculous! That said, I've been able to stream 4K content from YouTube, Netflix, and Amazon over a wired connection.
It does seem ridiculous that, in this day and age and at the price of this set, the Ethernet connection is only 10/100. But 100Mbps should still be more than enough for any 4K streaming.
It's not ridiculous at all.

There's absolutely no need to over-engineer the network interface and provide 10/100/1000 since that would ALSO mean that owners would need to have Cat5e/Cat6 cabling in the house in order to ensure it works properly. The bandwidth requirements for 4K content are well within the 100Mbps throughput numbers that the interface supports and there is no "risk" of incorrect performance if someone is trying to use Cat5 cabling to carry gigabit signaling.

Putting a 10/100/1000 NIC in these sets would create problems for a lot of owners (and for Sony) and it will do NOTHING to enhance your experience for streamed content.
 

Registered · 211 Posts
It's not ridiculous at all.

There's absolutely no need to over-engineer the network interface and provide 10/100/1000 since that would ALSO mean that owners would need to have Cat5e/Cat6 cabling in the house in order to ensure it works properly. The bandwidth requirements for 4K content are well within the 100Mbps throughput numbers that the interface supports and there is no "risk" of incorrect performance if someone is trying to use Cat5 cabling to carry gigabit signaling.

Putting a 10/100/1000 NIC in these sets would create problems for a lot of owners (and for Sony) and it will do NOTHING to enhance your experience for streamed content.
That's only true if the stream coming in is one that a Cat5 cable couldn't handle, which would mean over 100Mbps...maybe. The Cat5 spec is capable of handling 1Gbps under good conditions, i.e., not too many turns, a short enough run, and good cabling. If this TV had a 10/100/1000 NIC connected to a 10/100/1000 router with a Cat5 cable only capable of physically passing 100Mbps, then yes, they would most likely negotiate the link to 1Gbps. But it's still only going to pass the speed of the incoming stream from the net. IOW, a 30Mbps stream from Netflix is going to be passed at the same speed all the way through the chain, from the modem to the router to the TV. A router is not going to take a 30Mbps stream from the modem and speed it up to 1Gbps between it and the end device. It's only going to pass along the speed it receives; it can't pass along more info than it receives.

But yes 100Mbps is more than enough for streaming.
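Just to put rough numbers on why 100Mbps is plenty for internet streams, here's a quick back-of-the-envelope sketch in Python. The bitrates are typical published figures (e.g., Netflix recommends about 25Mbps for UHD), not measurements from this set, and the overhead allowance is an assumption:

LINK_MBPS = 100          # Fast Ethernet line rate
USABLE_FRACTION = 0.94   # rough allowance for Ethernet/IP/TCP overhead (assumption)

streams_mbps = {
    "1080p stream (typical)": 8,
    "4K/UHD stream (Netflix recommendation)": 25,
    "4K/UHD stream, high-bitrate peak (assumed)": 40,
}

usable = LINK_MBPS * USABLE_FRACTION
for name, rate in streams_mbps.items():
    headroom = usable - rate
    print(f"{name}: {rate} Mbps -> about {headroom:.0f} Mbps of headroom on a 100Mbps port")

Even the assumed worst case leaves well over half the port's usable throughput free.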
 

Registered · 166 Posts
Depends on how far your seating position is. I have mine above our fireplace and use a full motion articulating mount that's angled down and to the right just a bit b/c of furniture placement. We sit about 10 feet away and the angle seems just right. If you sit closer you'd really need it to be angled or the picture would degrade.
Seating is about 12 feet away from the TV, with the bottom of the TV starting at about 5'10" from the floor. I just need to know if there is any degradation in PQ at all when looking up at the TV at any angle, whether it be slight or severe.
 

Registered · 73 Posts
Do not connect the ps4 pro to the receiver. The ps4 pro can only output HDR when it's connected directly to the tv
Are you sure about that? Got a link?
 

Registered · 73 Posts
You keep referring to my 4K settings, but to be clear, I have not posted any 4K/HDR settings. What settings are you talking about? I use the Default settings under the HDR Picture Mode. My white balance settings are in use, and I configure the motion settings to the Custom settings discussed recently. Otherwise, I don't change anything from the default.

As for turning off HDR, that setting on the Oppo is to accommodate older displays that don't support HDR. I don't know why you would want to turn off HDR on the 940D. It sounds like there is something about the default HDR Picture Mode settings that you don't like, and I'm not sure I understand why. I haven't watched that much HDR content with the Oppo yet, but what I have watched is pretty stunning. Streamed HDR content is somewhat disappointing, but that is not the fault of the 940D, IMO.
All good, I'm well aware that you haven't posted 4K/HDR settings, only 1080p and 4K settings.

Regarding turning off HDR, as I mentioned I too don't know why someone would want to. I was just asking as it is an option on the Oppo.

I've got no issue with HDR at all - I was just correcting the mistake I initially made where I used your 4K settings for 4K/HDR.

As mentioned, I've now reset my HDR picture mode (which is where I first discovered the standard + vivid reset options).

The 940D has two HDR modes: Default (Expert, or Austin Jerry's color settings etc. if you used them) or HDR Vivid (cool color temp mode).

Under Picture Mode (HDR Video) -> Advanced Settings -> Reset (out of box/default): 796 nits (if you reset this mode, you will have to redo JT's color settings for regular UHD/HDR viewing),

or Reset (HDR Video Vivid): this mode resets the TV to Vivid Mode at 880 nits. However, this mode uses the cool color temp and some edge enhancement. It is certainly bright, you can kill some of the edge enhancement by tweaking the settings, and you can get it even brighter with the gamma settings; the hdtvtest.co.uk review has the details.
I have no idea how you select this HDR Vivid mode, but I stumbled across it as mentioned above, when I went to reset my HDR picture mode to default.

Initially I selected the HDR Vivid reset and it then loaded the vivid view, which looked cold and horrible. I then reset the main HDR picture setting and it was back to factory.

I have to double-check whether that affected any of the 'Custom' picture mode configuration, which I based on AJ's settings.

Not following. I don't see an HDR Video Vivid picture mode on my display.
How Sony did the Vivid option is that you have to go to the reset screen to set it to Vivid mode while you are viewing HDR content. It uses the cool color temp and the Color setting at 60, I think. If you want to revert to your original settings, do not do a reset; just change your color temp back to Expert and bring the Color back to 50.

1. Make sure you are playing HDR video.
2. Go to Advanced Settings.
3. Go to the Reset menu.
4. Select Reset (HDR Video Vivid).

Do not select the regular Reset (it will erase your regular Expert1 or Expert2 color settings back to default). HDR Vivid mode will only show up while HDR video is playing.

I haven't seen it as an available option. HDR picture mode is greyed out for me when HDR content is playing.

I only came across Vivid when you go to reset the HDR mode.
 

Registered · 211 Posts
Are you sure about that? Got a link?


Hi,

That's not true. As long as the receiver supports HDMI 2.0a connectivity and is HDCP 2.2 compatible, you can connect the PS4 to the receiver and still get the HDR option on the console. The Marantz site says the NR1606 supports HDMI 2.0a and HDCP 2.2. Connect it straight to the Marantz, make sure the HDMI port you are using on the 940D is set to Enhanced, and use a certified 18Gbps HDMI cable.

Connecting it to the Marantz will also allow you to bitstream your audio. Straight to the TV, you would have to use the TV speakers or run a TOSLINK cable to the receiver to get audio from it.


On 940D: HDMI Enhanced

On PS4:
Settings -> Sound and Screen -> Video Output Settings

Resolution: Automatic or 2160p
RGB Range: Automatic
HDR: Automatic
Deep Color Output: Automatic

Also, you can do a quick check on the PS4 after you are done:

Settings -> Sound and Screen -> Video Output Settings
Video Output Information (will give you output info)

Also, when you run your first HDR game on the PS4, it will pop up with a message saying the game is in HDR mode.

Unless this is something specific to his model of receiver, this is not true. I have my PS4 Pro connected to my Denon receiver and then to my KS9000, and when I play a game with HDR (FFXV in this case) my TV gives me the "HDR is playing" message when I start the game. If the receiver is compliant in passing through the HDR spec, it shouldn't matter.
If the receiver is HDR capable, the PS4 Pro passes it through with no issue.
 

Registered · 2,123 Posts
It's 10/100 which is ridiculous! That said, I've been able to stream 4K content from YouTube, Netflix, and Amazon over a wired connection.
Which is exactly why it isn't ridiculous. There is nothing the set is capable of that would benefit from faster than 10/100 (except maybe downloading apps/system updates faster) so why bother?
 

Registered · 572 Posts
Thanks! Did you set the HDMI port that you're using to "Enhanced" on the 940D?



For Video/Output Settings on the Marantz I'm using:

Video Mode - Auto

Video Conversion - On (my understanding is this just converts non-HDMI sources to HDMI before passing the signal to the TV, no additional processing)

i/p Scaler - Off (this one would handle upscaling, but I'm letting the TV do that)


What's your reasoning for having the TV do the upscaling? My SR6010 does a better job upscaling than the TV. Did you notice something different?




 

Registered · 56 Posts
To clarify from my earlier post, I wasn't commenting on lag in any way. I was specifically pointing to not having full functionality of your DVR on what is almost certainly your "main" TV. For me, it would be irritating beyond belief to A) have to move the HR54 to another location (the other locations are really not roomy enough to accommodate it) and B) have to operate it from that secondary location when I want to do things like change default record settings, remove or update series recordings, cancel items from the To Do List, or order PPV movies.



When I inquired about 4K programming (somewhat recently), I was told you had to have the top tier programming. I just looked it up and it seems that 4K is available on any package from XTRA up (Ultimate, Premier).



Still not interested in going through all of the hassle only to have ONE channel to watch in 4K.



What's more, I think that the picture quality of the set is fantastic. Any quality 1080 recording looks amazing. I've pointedly watched a couple of UHD DVDs, and aside from the blacks being fantastic, I do not observe a "marked" increase in PQ.


I certainly agree that it's likely not worth all that to have one and a half 4K channels. In my system, I am able to do all of those things you mentioned with the DVR settings/recordings (update series recordings, to-do list, etc) from the mini client on my main TV. The only function I've found that I lose on the client, so far, has been split screen/ picture in picture. I'm hoping that they will expand their 4K offerings in the near future. The tech that did my setup said that a one-box solution is in development. Who knows if/when we will see that?
 

Registered · 50 Posts
There's absolutely no need to over-engineer the network interface and provide 10/100/1000 since that would ALSO mean that owners would need to have Cat5e/Cat6 cabling in the house in order to ensure it works properly.
Speaking as a software engineer, and former IT guy, this is absolutely wrong. Gigabit NICs will step down without any issues at all. Also, the spec says you need CAT 5E cabling to get gigabit, but short runs of CAT 5 will also work at times. I have CAT 6 in my house and my CAT5/5E devices work just fine. I've also hooked plenty of Gigabit devices into 10/100 hubs/switches in my day and they also work fine.

In addition, most modern routers ship with Gigabit switches built in. A number of consumers have their routers right next to their TVs, so this makes it even less of an issue.

Putting a 10/100/1000 NIC in these sets would create problems for a lot of owners (and for Sony) and it will do NOTHING to enhance your experience for streamed content.
"Problems for Sony"? What kind of problems? There's no engineering challenge, but there is a small financial one.

Which is exactly why it isn't ridiculous. There is nothing the set is capable of that would benefit from faster than 10/100 (except maybe downloading apps/system updates faster) so why bother?
The set is capable of running PLEX and doing DLNA streaming. Both of those can take advantage of the faster connection.

Also, just because today's bitrates and codecs don't take advantage of speeds over 100Mbps doesn't mean "tomorrow's" won't. The number of homes with 300-500Mbps internet service grows by the day. I'm paying less today for 150Mbps service than I paid for 75Mbps less than two years ago.

There's really no reason why an $8K (at the time of release) TV can't include a Gigabit NIC.
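To put some rough numbers on the Plex/DLNA point: internet streams sit comfortably under 100Mbps, but a local UHD Blu-ray remux can peak at or above it (the UHD-BD spec allows video bitrates up to roughly 128Mbps on the largest discs), which is exactly where a Fast Ethernet port becomes the bottleneck. A small Python sketch; the bitrates are ballpark assumptions for illustration, not measurements:

def usable_mbps(link_mbps, overhead=0.06):
    # Line rate minus a rough allowance for protocol overhead (assumption)
    return link_mbps * (1 - overhead)

sources_mbps = {
    "Netflix/Amazon UHD stream": 25,                 # published recommendation
    "1080p Blu-ray remux (typical)": 30,
    "UHD Blu-ray remux, peak scene (assumed)": 110,
}

for link in (100, 1000):
    print(f"--- {link}Mbps port, ~{usable_mbps(link):.0f}Mbps usable ---")
    for name, rate in sources_mbps.items():
        verdict = "fits" if rate <= usable_mbps(link) else "will stutter/buffer"
        print(f"{name}: {rate} Mbps -> {verdict}")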
 

Registered · 2,292 Posts
Vudu

Does this TV have VUDU in UHD like the Nvidia Shield? Right now, my Shield can play UHD movies from VUDU but they just aren't HDR. Is the Sony app the same way?
 

Registered · 1,512 Posts
Seating is about 12 feet away from the TV, with the bottom of the TV starting at about 5'10" from the floor. I just need to know if there is any degradation in PQ at all when looking up at the TV at any angle, whether it be slight or severe.

Your question is impossible to answer precisely. There is always an angle at which the picture will degrade, but an adult in a normal seating position shouldn't have much of an issue at 12 feet away. I suggest you hedge by going to a dealer and viewing the set from the position you have in mind. If you are going to wall mount, you should choose a mount that tilts.


At 12 feet, only someone with better than 20/20 vision can fully resolve a 2K image. To see full 4K detail you'd have to sit a lot closer.
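For anyone wondering where numbers like that come from, it's the standard visual-acuity math: 20/20 vision resolves roughly 1 arcminute, so you can work out how far away a given pixel pitch stops being distinguishable. A small Python sketch, assuming a 75" 16:9 screen (the 940D's size, if I'm not mistaken); it's the generic formula, nothing specific to this TV:

import math

DIAGONAL_IN = 75.0                          # assumed screen size
ARCMIN_RAD = math.radians(1.0 / 60.0)       # ~20/20 visual acuity

def max_benefit_distance_ft(h_pixels):
    # Farthest distance at which one pixel still subtends a full arcminute
    width_in = DIAGONAL_IN * 16 / math.hypot(16, 9)
    pixel_pitch_in = width_in / h_pixels
    return (pixel_pitch_in / math.tan(ARCMIN_RAD)) / 12.0

for label, h in (("1080p", 1920), ("4K/UHD", 3840)):
    print(f"{label}: full benefit only inside ~{max_benefit_distance_ft(h):.1f} ft")

That works out to roughly 10 ft for 1080p and 5 ft for 4K on a 75" screen, so at 12 ft a viewer with 20/20 vision is already past the point of fully resolving even 2K, which is the point above.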
 

** Man of Leisure ** · 22,879 Posts
I believe @AustinJerry said that the TV does a better job. Or perhaps he just said it was a preference.
I believe I said the Bravia scaler does an excellent job and, when in doubt, it should be left to do the up-scaling. There are test patterns to assess a scaler's performance, but many times the differences in the measurements are subtle.
 

Registered · 335 Posts
There's really no reason why an $8K (at the time of release) TV can't include a Gigabit NIC.
Agree with everything you said. Especially when you consider that most consumer-grade equipment (routers, etc.) usually runs at maybe 50-70% of its max rated capacity in a typical home network, you can start to get into trouble with a 100Mbps interface if you're trying to stream high-bitrate files.
 