Can the EDID on the Q90R be read to determine whether it supports the TV variant of 4K120, and perhaps point to an Nvidia driver bug?
Amazing work! I laboriously parsed through the EDID data by hand, because I couldn't find a piece of software that reliably decoded all of it, and I figured in doing so I'd understand the standards a bit better.
I compared the EDID I got reading from both a straight HDMI 2.0b connection and from the Club3D HDMI 2.1 active adapter. They were identical, which is how it should be. The way Nvidia presents the results in its control panel is confusing, though; more on that below. Much of the info for parsing comes from CTA-861-G, which defines the standard profile for Digital TVs. Among other things, it defines extension data that appears within the overall EDID data; the base EDID data is defined by VESA. A small amount of info comes from the HDMI specification.
Here are the formats that the Q90R advertises using a data block within the CTA extension data, with various restrictions:
With one exception, noted below, all of these formats are specifically defined in CTA-861-G.
In another EDID extension block, this one defined in the HDMI specification, the Q90R advertises that it supports 4:4:4, 10-bit, and 12-bit deep color modes. In yet another EDID extension block, also defined in the HDMI specification, the Q90R specifies that it only supports 4:2:0 for 10-bit color, not for 12-bit. The HDMI specification requires that 8-bit 4:2:0 also be supported for any format that supports any 4:2:0 deep color, so that's simply implied.
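A hedged sketch of digging those deep color bits out, assuming the published layouts of the HDMI 1.4 VSDB (OUI 00-0C-03, deep color flags in its byte 6) and the HDMI Forum VSDB (OUI C4-5D-D8, 4:2:0 deep color bits in its byte 7); double-check the bit positions against the specs before relying on them:

```python
# Pull deep-color capability bits from the two HDMI vendor-specific data
# blocks inside a CTA-861 extension. OUIs are stored LSB-first in the EDID.
HDMI14_OUI = bytes([0x03, 0x0C, 0x00])
HDMI20_OUI = bytes([0xD8, 0x5D, 0xC4])

def iter_cta_data_blocks(cta_block: bytes):
    """Yield (tag, payload) for each data block in a CTA-861 extension."""
    dtd_offset = cta_block[2]          # where detailed timings start
    pos = 4                            # data blocks begin after the header
    while pos < dtd_offset:
        tag = cta_block[pos] >> 5      # top 3 bits: tag code
        length = cta_block[pos] & 0x1F # low 5 bits: payload length
        yield tag, cta_block[pos + 1:pos + 1 + length]
        pos += 1 + length

def deep_color_caps(cta_block: bytes) -> dict:
    caps = {}
    for tag, payload in iter_cta_data_blocks(cta_block):
        if tag != 3:                   # 3 = vendor-specific data block
            continue
        if payload[:3] == HDMI14_OUI and len(payload) >= 6:
            flags = payload[5]         # byte 6 of the VSDB
            caps["dc_30bit"] = bool(flags & 0x10)
            caps["dc_36bit"] = bool(flags & 0x20)
            caps["dc_48bit"] = bool(flags & 0x40)
            caps["dc_y444"] = bool(flags & 0x08)
        elif payload[:3] == HDMI20_OUI and len(payload) >= 7:
            flags = payload[6]         # byte 7 of the HF-VSDB
            caps["dc_30bit_420"] = bool(flags & 0x01)
            caps["dc_36bit_420"] = bool(flags & 0x02)
            caps["dc_48bit_420"] = bool(flags & 0x04)
    return caps
```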
Just to make 4:2:0 even more complex, there are two more CTA data blocks that come into play. In one data block, the Q90R specifies that only 4 formats, the ones with a single * above, support 4:2:0 in addition to the color formats defined elsewhere. In the second data block, the Q90R specifies that 4 formats, the ones with a double ** above, only support 4:2:0, and nothing else.
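Those two blocks are, as far as I can tell, the YCbCr 4:2:0 Video Data Block (CTA extended tag 14: formats that support only 4:2:0) and the YCbCr 4:2:0 Capability Map Data Block (extended tag 15: a bitmap over the ordinary Video Data Block marking which formats also support 4:2:0). A rough decoder sketch; the VIC numbers in the test are illustrative assumptions:

```python
# Decode the two CTA-861 4:2:0 data blocks from (tag, payload) pairs.
# Tag 2 is the Video Data Block; tag 7 means "extended tag in payload[0]".
def parse_420_blocks(data_blocks):
    """Return (svd_vics, only_420_vics, also_420_vics)."""
    svd_vics, only_420, cap_map = [], [], b""
    for tag, payload in data_blocks:
        if tag == 2:                       # Video Data Block: list of SVDs
            svd_vics = [b & 0x7F for b in payload]  # strip the native bit
        elif tag == 7 and payload:
            if payload[0] == 14:           # 4:2:0 Video Data Block
                only_420 = [b & 0x7F for b in payload[1:]]
            elif payload[0] == 15:         # 4:2:0 Capability Map
                cap_map = payload[1:]
    # Each bit of the capability map flags the SVD at the same index.
    also_420 = [vic for i, vic in enumerate(svd_vics)
                if i // 8 < len(cap_map) and cap_map[i // 8] & (1 << (i % 8))]
    return svd_vics, only_420, also_420
```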
So, put all that together, and the Q90R says 3840/4096x2160p100/120 can only be done with 8/10-bit 4:2:0, and 3840x2160p50/60 are the only other formats that support 8/10-bit 4:2:0.
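The bandwidth arithmetic explains why it shakes out that way. Rough numbers of my own, assuming the standard CTA timing totals for 3840x2160p120 (4400x2250 total at a 1188 MHz pixel clock), HDMI 2.0's 18 Gbps TMDS limit, and the rule that 4:2:0 halves the TMDS clock relative to RGB/4:4:4:

```python
# Approximate TMDS link rate for a format: 3 channels, 10 bits transmitted
# per 8-bit character (8b/10b), 4:2:0 halving the clock, deep color
# scaling it back up.
def tmds_gbps(pixel_clock_mhz: float, bits_per_component: int,
              subsampling: str) -> float:
    clock = pixel_clock_mhz
    if subsampling == "4:2:0":
        clock /= 2
    clock *= bits_per_component / 8
    return clock * 3 * 10 / 1000

rgb8 = tmds_gbps(1188, 8, "4:4:4")        # ~35.6 Gbps: far beyond 18
yuv420_8 = tmds_gbps(1188, 8, "4:2:0")    # ~17.8 Gbps: just squeezes in
yuv420_10 = tmds_gbps(1188, 10, "4:2:0")  # ~22.3 Gbps: beyond TMDS
```

So 8-bit 4:2:0 is the only 2160p120 flavor that fits an 18 Gbps TMDS link, which is presumably why the 4:2:0-only restriction exists; the 10-bit variant would need the HDMI 2.1 FRL signaling the adapter discussion is about.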
There's one further wrinkle, which is the 2560x1440p120 format, marked with triple *** above. This format is not actually defined in CTA-861-G. The CTA extension block allows optional data blocks that can define arbitrary additional supported formats, and the Q90R uses that to advertise this format. You may ask where 2560x1440p60 is defined; it turns out that one lives in the base VESA EDID data, so it's a somewhat peculiar split.
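Those arbitrary extra formats ride in as ordinary 18-byte Detailed Timing Descriptors, the same structure the base VESA block uses. A sketch of decoding the geometry fields of one (offsets per my reading of the VESA EDID spec; verify before trusting):

```python
# Decode resolution and refresh rate from an 18-byte Detailed Timing
# Descriptor: pixel clock in 10 kHz units, then split low-byte/high-nibble
# fields for the horizontal and vertical active/blanking counts.
def parse_dtd(d: bytes) -> dict:
    if len(d) != 18 or (d[0] == 0 and d[1] == 0):
        raise ValueError("not a detailed timing descriptor")
    pixel_clock_khz = int.from_bytes(d[0:2], "little") * 10
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    h_blank = d[3] | ((d[4] & 0x0F) << 8)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    v_blank = d[6] | ((d[7] & 0x0F) << 8)
    refresh = pixel_clock_khz * 1000 / ((h_active + h_blank)
                                        * (v_active + v_blank))
    return {"h": h_active, "v": v_active, "refresh_hz": round(refresh, 2)}
```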
Now, how to reconcile this with what you see in the Nvidia control panel? Beats me. Nvidia is categorizing some CTA resolutions as "PC" resolutions, and conversely categorizing some VESA resolutions (like 1680x1050) as "Ultra HD, HD, SD" resolutions. Seems somewhat arbitrary to me, but in the end it's just a categorization thing, not a protocol thing.
So, assuming I have understood things correctly this time, my take is that if the new consoles do turn out to support 4K@120Hz 10-bit 4:2:0, then they should work with the Q90R. And apologies for any confusion I've caused before.
The consoles seem to always favour an 'auto' option by default, so I see no reason why they wouldn't work with 4:2:0. The Nvidia driver also has an 'auto' option, but it's dumb and will send 8-bit regardless of whether you have HDR turned on or off in Windows; you have to manually select 10-bit. Would a 4:2:2 option work for 4:2:0?
So if you don't get VRR, what does this get someone who has the Xbox hooked directly into the sound bar for Atmos? I got tired of waiting for the eARC update and caved in and got the HDFury AVR Key (Amazon.com: HDFury AVR Key 18Gbps with Enlarged Heatsink | HDMI Audio Extractor | 4K HDR Splitter). I hate that we have to buy stuff like this to get the most out of a TV that costs so much. The AVR Key works great for my Xbox One X with my Q90R and my Pio SC-LX904, almost like it's directly connected to both. No more HDR stripping and no more lack of Atmos, though the refresh rate (defaults to 4K@60Hz), ALLM, and VRR are not selectable on the Xbox.
My AVR was not allowing 4K HDR for games and movies at the higher bit rates. The Xbox's TV capabilities screen showed mostly red going through the AVR; with the AVR Key I now have all green checks (like when I connect directly to the TV) except Dolby Vision (because our TV doesn't support DV).
Does other 4K HDR channel content have the same issue? Hi everyone ... I have the Q90 65 and overall I'm pleased, except for DirecTV 4K live sports. For some reason football, golf, etc. on one of their dedicated 4K channels is so dark. I've toggled settings, including turning HDR on/off, with very little change. The only thing that made a small difference was changing Color Space from Native to Auto. All other content on DirecTV is fine ... streaming via the Nvidia Shield is fine ... and Blu-rays are fine! Just DirecTV live sports are dark.
I opened a thread over on DirecTV/AT&T to try and get help on that front.
Thanks in advance!
I should add, though, that to get 4:4:4 to display correctly you need to be in PC mode, and for HDR that results in a fairly significant drop in peak brightness, so using 10-bit RGB vs. 10-bit YCbCr 4:2:2 is a tradeoff. I am just wondering whether the Q90R supports 4K 60Hz 10-bit RGB.