
Owner's Thread for the Sony X900H (No Price Talk)

7M views · 41K replies · 2K participants · last post by MeowWoof
#1 · (Edited)






MSRP
55X900H: $1,199
65X900H: $1,599
75X900H: $2,499
85X900H: $3,499





4K UHD 3840 x 2160 Full Array With Local Dimming LED Panel
HDR10, HLG, Dolby Vision HDR Support
HDMI 2.1|HDCP 2.3
4K/120/eARC

VRR/ALLM (update coming later)

120 Hz Refresh Rate /X-Motion Clarity
Works with Alexa & the Google Assistant
Acoustic Multi-Audio speakers with S-Force Front Surround
ATSC 3.0 tuner




Official PlayStation 5 TV.
 
#8,065 ·
OK guys, I'm using my 75-inch 900H with a 16 ft Zeskit HDMI 2.1 cable hooked up to a Club 3D DisplayPort 1.4 to HDMI 2.1 active adapter from my EVGA 2080 Super Hybrid.

4K 120Hz and 2K 120Hz PC inputs are available up to RGB 10-bit Full. Text looks great, as you can see in the screenshots.

Even COD Modern Warfare recognizes the 2K and 4K 120Hz capabilities of the TV. I played a quick match, and the Nvidia Performance center was reporting "Render Latency" around 8 ms (7.3 ms at the lowest) and "Render Present Latency" around 10 ms. That's pretty dang good.

There are two caveats I noticed...
1. The 2K 120Hz active signal is still 4K 120Hz, so I guess Nvidia is scaling it up rather than outputting it natively. I'm not sure about that.
2. The adapter and eARC from the TV didn't handshake well with my Denon S950H. I got it to work, but it was quirky.

The most impressive aspect of this setup is Sony's motion handling. We all know it's probably the best in the business, but I can confirm a 120Hz signal performs incredibly well regardless of the resolution. This setup plays incredibly well with an RTX card, with low latency and basically zero screen tearing, even without a true VRR implementation right now. The Nvidia console didn't report a G-Sync option either.

So in conclusion, the 900H is fully capable of 4K 120Hz and 2K 120Hz without the blurriness, as you can see in my various screenshots. I strongly believe the culprit of the blurriness is the RTX 3000 series graphics cards. We will definitely know more when the consoles arrive. I will have both, so I will definitely report back.

I just wish this TV had about 500 more nits of brightness and it would be perfect. Watch the 2021 Sony 900 series have this fixed lol.

And I'm buying nothing but these Zeskit HDMI 2.1 cables because...they work!

View attachment 3050053 View attachment 3050105

View attachment 3050111
That's extremely good latency. Mine is consistently 50-70 ms. I don't know how to get it lower. I'm hardwired to my router. Speed is only 100-110 Mbps. 5GHz channel. If you know of a way to get lower latency, please share.


Sent from my iPhone using Tapatalk
 
#8,066 ·
I doubt that means anything other than they're making it expressly clear for your average punter that these features work with PS5. I highly doubt it will be fine with PS5 but not other devices.
Yes, I wouldn't read too much into it. It's probably just branding and protecting themselves from compatibility issues with other devices.

I still think they will eventually iron out most, if not all, of the issues to make the HDMI 2.1 features as well rounded as they can be.

Sent from my SM-G975U using Tapatalk
 
#8,067 ·
Guys, I have a really bad feeling about this 4K 120Hz issue. When I manually updated my TV a few weeks back, the patch notes said:

"activates support for 4K 120Hz"

Now it says

"activates support for 4K 120Hz signals FOR PS5"
View attachment 3050396
I mean, it could mean they've tested it with the PS5 and have no issues, and they're now aware of issues with the GTX cards. So hopefully they're working on a fix for it.
 
#8,068 ·
Even COD Modern Warfare recognizes the 2K and 4K 120Hz capabilities of the TV. I played a quick match, and the Nvidia Performance center was reporting "Render Latency" around 8 ms (7.3 ms at the lowest) and "Render Present Latency" around 10 ms. That's pretty dang good.
What you are measuring here is the input latency of the game engine/graphics drivers. This does not include the input latency of the TV.

To measure the input latency of the TV you need either:
.) LDAT: a hardware tool provided by Nvidia that is mounted on the TV and checks for changes on the screen.
.) A mouse with an LED that lights up when a click happens, plus a high-speed camera, to check input latency manually.

If you don't need very precise values you can also do it with the slow-motion mode of a smartphone:
.) Activate slow-motion mode
.) Start filming your mouse and the mouse cursor
.) Slap the mouse

Then check how long it took from the start of the mouse movement until the cursor starts moving.
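
If you want to turn that frame count into a number, the math is just frames elapsed divided by the capture rate. Here is a minimal sketch in Python, assuming a 240 fps slow-motion recording and made-up frame numbers (substitute whatever your own footage gives you):

```python
# Rough latency math for the slow-motion phone method (example values only).
CAPTURE_FPS = 240  # assumed slow-motion capture rate of the phone

def latency_ms(mouse_moves_frame: int, cursor_moves_frame: int,
               fps: int = CAPTURE_FPS) -> float:
    """Latency = frames elapsed between mouse moving and cursor moving, divided by capture rate."""
    frames_elapsed = cursor_moves_frame - mouse_moves_frame
    return frames_elapsed / fps * 1000.0

# Hypothetical reading: mouse starts moving in frame 112, cursor first moves in frame 126.
print(f"{latency_ms(112, 126):.1f} ms")  # -> 58.3 ms
```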

That's extremely good latency. Mine is consistently 50-70 ms. I don't know how to get it lower. I'm hardwired to my router. Speed is only 100-110 Mbps. 5GHz channel. If you know of a way to get lower latency, please share.
We are talking about input latency here (the time from moving the mouse until you see the change on screen).

What you are referring to is network latency, or ping. That is the response time of the server and is relevant for the actions of other players.
 
#8,069 ·
That's extremely good latency. Mine is consistently 50-70 ms. I don't know how to get it lower. I'm hardwired to my router. Speed is only 100-110 Mbps. 5GHz channel. If you know of a way to get lower latency, please share.


Sent from my iPhone using Tapatalk
Well, this varies depending on many things, such as the GPU, CPU, and even drivers.

Sounds like you have Vsync turned on. That's the #1 latency culprit. Also set Nvidia's Low Latency Mode to Ultra if you have an Nvidia 2000 series or better. Set the TV to Game or Graphics mode as well. Also, I bypassed my receiver and have everything hooked up to Port 4.

As far as network latency, I use an Asus AC5300 hardwired. COD usually gets me games with less than 50 ms ping, depending on the available players.


Sent from my iPhone using Tapatalk
 
#8,070 ·
What you are measuring here is the input latency of the game engine/graphics drivers. This does not include the input latency of the TV.

To measure the input latency of the TV you need either:
.) LDAT: a hardware tool provided by Nvidia that is mounted on the TV and checks for changes on the screen.
.) A mouse with an LED that lights up when a click happens, plus a high-speed camera, to check input latency manually.

If you don't need very precise values you can also do it with the slow-motion mode of a smartphone:
.) Activate slow-motion mode
.) Start filming your mouse and the mouse cursor
.) Slap the mouse

Then check how long it took from the start of the mouse movement until the cursor starts moving.



We are talking about input latency here (the time from moving the mouse until you see the change on screen).

What you are referring to is network latency, or ping. That is the response time of the server and is relevant for the actions of other players.
Welp, I'll leave that to you pros. I'm not going that far in testing without making a profit somehow, lol. I can only report on how it feels and what the software tells me based on my hardware. You pros gotta take it to the next level.


Sent from my iPhone using Tapatalk
 
#8,074 ·
So can anyone confirm whether Auto Local Dimming set to High is bugged, or is it actually working as intended? Basically, with it on High, if there is a small bright box in the center of the screen, it is actually dimmer than with it on Medium. Not only that, but setting it to High also seems to mess up the white balance and clip some whites! It's very obvious on certain images if you switch back and forth between the settings.
It doesn't seem to be a big issue leaving it on Medium, but then the screen definitely doesn't get as inky black as it can on High (hard to notice in normal usage).
I noticed a couple of posts in this thread referred to this same issue, but the discussion never went anywhere other than "Medium is the proper setting".
So, in summary: firmware bug or working as intended?
 
#8,075 ·
Sony has had this USB idle bug for years. They haven't fixed it, and there's a high chance they won't.
You can plug your bias lighting system into a power outlet instead.
Hope this isn't something that stays unfixed. I might just have to unplug it from the USB 3.0 port when not in use. I'll probably only use the bias lighting for lights-out movie watching anyway. Still better than overhead lighting.
 
#8,076 ·
Well, this varies depending on many things, such as the GPU, CPU, and even drivers.

Sounds like you have Vsync turned on. That's the #1 latency culprit. Also set Nvidia's Low Latency Mode to Ultra if you have an Nvidia 2000 series or better. Set the TV to Game or Graphics mode as well. Also, I bypassed my receiver and have everything hooked up to Port 4.

As far as network latency, I use an Asus AC5300 hardwired. COD usually gets me games with less than 50 ms ping, depending on the available players.


Sent from my iPhone using Tapatalk
I'm just using an Xbox One S.


Sent from my iPad using Tapatalk
 
#8,078 ·
So can anyone confirm whether Auto Local Dimming set to High is bugged, or is it actually working as intended? Basically, with it on High, if there is a small bright box in the center of the screen, it is actually dimmer than with it on Medium. Not only that, but setting it to High also seems to mess up the white balance and clip some whites! It's very obvious on certain images if you switch back and forth between the settings.
It doesn't seem to be a big issue leaving it on Medium, but then the screen definitely doesn't get as inky black as it can on High (hard to notice in normal usage).
I noticed a couple of posts in this thread referred to this same issue, but the discussion never went anywhere other than "Medium is the proper setting".
So, in summary: firmware bug or working as intended?
This is working as designed. For accuracy of image, you set it to Medium. If you care more about blooming, you want it set to High.
 
#8,079 ·
This is working as designed. For accuracy of image, you set it to Medium. If you care more about blooming, you want it set to High.
Luckily, I don't notice blooming even on Medium; just sometimes larger black patches exhibit that ever-so-slight backlight glow, but even that's only noticeable in a completely dark room. I just find it strange that the High setting screws with the white balance and not just the brightness...

P.S. I'm just being nit-picky. I still love this set and think it's one of the best you can get at this price!
 