OK, let me clear up a few things.
I have always maintained that you can tell when chroma subsampling is being used on 1-pixel-wide text and lines. That fact is not in dispute.
If you go back and read what I wrote, I said that when I was looking for a UHD TV to use as a PC monitor, I wouldn't get one without 4:4:4 at 60 Hz.
HOWEVER, games do NOT render aliased 1-pixel-wide text and lines the way your Windows desktop does, especially not at UHD resolutions. Even the text-heavy genres like MMOs use scaled, antialiased fonts which, at UHD at least, are probably 3+ pixels wide per stroke just to be legible at a normal seating distance. That confines the chroma "bleed" to roughly one pixel at each edge, at most 25% of the stroke, and even that gets smoothed over if you enable edge sharpening on your TV.
Tia, in your own post you even state that at UHD the increased resolution means that if you don't sit closer, each pixel is four times less visible, meaning your chroma bleed is under 1/4 of what it would be at 1080p comparing 4:2:0 to 4:4:4. Yes, it's visible, but you have to get FOUR TIMES closer.
You guys and gals can stick to 4:4:4 all you want at UHD, but over HDMI 2.0 that means you are giving up 10-bit color, which means no HDR and no WCG for you.
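To put numbers on why you have to choose, here's a back-of-the-envelope sketch. The 4400x2250 total timing and 600 MHz limit are the standard CTA-861 / HDMI 2.0 figures, and HDMI carries 4:2:2 at up to 12-bit without raising the clock; treat the output as approximate:

[CODE]
// Back-of-the-envelope HDMI 2.0 bandwidth check (numbers approximate).
// 4K60 uses the CTA-861 timing of 4400x2250 total pixels -> 594 MHz
// pixel clock. HDMI 2.0 tops out at a 600 MHz TMDS character rate.
#include <cstdio>

int main() {
    const double kPixelClockMHz = 4400.0 * 2250.0 * 60.0 / 1e6; // 594 MHz
    const double kHdmi20LimitMHz = 600.0;                       // TMDS limit

    // RGB/4:4:4: the TMDS clock scales with bit depth (10-bit = 1.25x).
    printf("4:4:4  8-bit: %.1f MHz %s\n", kPixelClockMHz,
           kPixelClockMHz <= kHdmi20LimitMHz ? "(fits)" : "(too fast)");
    printf("4:4:4 10-bit: %.1f MHz %s\n", kPixelClockMHz * 1.25,
           kPixelClockMHz * 1.25 <= kHdmi20LimitMHz ? "(fits)" : "(too fast)");

    // 4:2:2 on HDMI packs the halved chroma so that even 12-bit rides
    // the same clock as 8-bit 4:4:4 -- which is why 10-bit HDR needs
    // 4:2:2 on this link.
    printf("4:2:2 12-bit: %.1f MHz (fits)\n", kPixelClockMHz);
    return 0;
}
[/CODE]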
You cannot see chroma subsampling in anything but 1-pixel-wide, high-contrast content like text or lines, unless you walk right up to your TV. The resolution charts that tell you what your visual acuity resolves at a given distance refer to luma resolution, not chroma. You are far less sensitive to chroma resolution, and that's a fact. You are entitled to your own opinions but not your own facts.
If you are sitting far enough from your screen that you can't resolve individual pixels, then you definitely can't see the color bleed from chroma subsampling at that same distance. Again, per the science I cited: you have roughly 120 million rods and 7 million cones, about 17 times the static resolution in luma information versus chroma, and that's being generous, because your sensitivity to small absolute changes in chroma, not just chroma resolution, is worse still compared to luma.
Going from a 1080p TV to a UHD TV means that, in practice, chroma subsampling becomes four times less of the non-issue it already was, for games at least.
Most (not all) games do not use 1-pixel-wide text. If you're talking about an MMO with lots of scrolling text, then maybe it's worth it to you to give up 10 bits for 4:4:4 (on HDMI 2.0 TVs you can't have both; you have to choose), but you can easily get around that by simply increasing your font size slightly and turning up edge enhancement. Giving up HDR and WCG for 4:4:4 is foolish if you ask me.
If you are asserting that you can tell whether chroma subsampling is ON or OFF in a moving game scene, I will call BS on that. You can't. Period. No way.
First off, most LCD TVs have poor motion resolution, around 600 lines or so, even with BFI enabled. Forget about UHD; most LCDs can't even resolve full 1080p detail while frames are in motion, especially not at 60 Hz. 120 Hz would be a big improvement in sharpness and clarity during motion, but even then you couldn't tell that chroma subsampling was in use.
Next, even on static frames, with decent chroma upscaling filters and a high enough resolution, I'll bet you any money you can't pick the better one double-blind, especially not at UHD.
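When I say "decent chroma upscaling filters", I mean something like this, a minimal sketch assuming 4:2:2 with co-sited chroma on even pixels; real scalers use longer, sharper kernels than plain linear interpolation:

[CODE]
// Minimal sketch of display-side 4:2:2 -> 4:4:4 chroma reconstruction.
// Assumes co-sited chroma at even pixel positions and non-empty input;
// real scalers use fancier kernels than this linear interpolation.
#include <algorithm>
#include <utility>
#include <vector>

struct Ycbcr { float y, cb, cr; };

// 'chroma' holds one (cb, cr) pair per two luma pixels.
std::vector<Ycbcr> Upsample422(
    const std::vector<float>& luma,
    const std::vector<std::pair<float, float>>& chroma) {
    std::vector<Ycbcr> out(luma.size());
    for (size_t x = 0; x < luma.size(); ++x) {
        size_t c0 = x / 2;                               // left chroma sample
        size_t c1 = std::min(c0 + 1, chroma.size() - 1); // right chroma sample
        float t = (x % 2) ? 0.5f : 0.0f;                 // odd pixels sit between
        out[x].y  = luma[x];                             // luma is untouched
        out[x].cb = chroma[c0].first  * (1 - t) + chroma[c1].first  * t;
        out[x].cr = chroma[c0].second * (1 - t) + chroma[c1].second * t;
    }
    return out;
}
[/CODE]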
Human visual acuity is simply not there. Your assertion is as baseless and unfounded as the idea that people can hear above 20 kHz or tell 16-bit from 24-bit audio in a double-blind test. If you're doing color correction, zooming in real close and editing per pixel, then yes, it matters. For lossy delivery? Not so much. Sure, it might have been better if they'd used JPEG-style compression instead of chroma subsampling, but they didn't go that way.
At UHD, how close do you have to sit to see individual pixels? If you aren't sitting that close, you CANNOT tell that chroma subsampling is in use, because if you don't have the visual acuity to tell one pixel from its neighbour in LUMINANCE, then you definitely can't tell the difference in chroma.
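If you want to check that distance yourself, the math is simple. Here's a sketch using the usual one-arcminute 20/20 acuity benchmark; the 55-inch screen size is just an assumed example you can change:

[CODE]
// Distance at which one UHD pixel subtends one arcminute (the usual
// 20/20 acuity benchmark). The screen size is an assumption.
#include <cmath>
#include <cstdio>

int main() {
    const double kPi = 3.14159265358979323846;
    const double kDiagonalInches = 55.0;   // assumed example screen
    const double kAspect = 16.0 / 9.0;
    const double widthIn = kDiagonalInches * kAspect /
                           std::sqrt(1.0 + kAspect * kAspect);
    const double pixelIn = widthIn / 3840.0;            // UHD pixel pitch
    const double arcmin  = (1.0 / 60.0) * kPi / 180.0;  // 1' in radians
    const double dist    = pixelIn / std::tan(arcmin);
    // For a 55" UHD panel this lands around 43 inches (~3.6 feet).
    printf("Pixels blur together beyond ~%.1f inches (~%.1f feet).\n",
           dist, dist / 12.0);
    return 0;
}
[/CODE]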
I make games for a living, and let me tell you, the era of static UIs with 1-pixel-wide elements and unscaled, non-antialiased text is behind us. UIs are going in-game now, rendered just like the rest of the scene. We've done tests internally with a 4:2:2-style YCoCg compact frame buffer, and people can't tell that chroma subsampling is in use.
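For anyone curious, here's a rough sketch of the flavor of that technique, full-resolution luma plus checkerboarded chroma, in the spirit of Mavridis and Papaioannou's "The Compact YCoCg Frame Buffer" paper. This is illustrative, not our shipping code:

[CODE]
// Sketch of a compact YCoCg frame buffer: store full-res luma plus one
// chroma channel per pixel in a checkerboard (effectively 4:2:2-ish),
// then reconstruct the missing channel from horizontal neighbours.
struct Rgb   { float r, g, b; };
struct YCoCg { float y, co, cg; };

// Standard RGB -> YCoCg transform.
YCoCg ToYCoCg(Rgb c) {
    return {  0.25f * c.r + 0.5f * c.g + 0.25f * c.b,
              0.5f  * c.r              - 0.5f  * c.b,
             -0.25f * c.r + 0.5f * c.g - 0.25f * c.b };
}

// Each pixel stores (y, chroma): Co on even columns, Cg on odd ones.
// The missing channel is taken from the left/right neighbours; the
// paper uses an edge-aware filter rather than this plain average.
float ReconstructMissingChroma(float left, float right) {
    return 0.5f * (left + right);
}
[/CODE]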
Anyway, you guys and gals do what you want. If you want to buy an HDMI 2.0 TV with HDR and WCG and then not use those features because you think the 4:2:2 chroma subsampling needed to get 10 bits to your TV is the bane of your existence (it's not, scientifically speaking), then you are free to do that.
Meanwhile, all your HDMI 2.0 cards and TVs are subject to this bandwidth limitation, and 8-bit color SUCKS for games. It's got tons of banding and other horrible artifacts, the worst of which is that you have to quantize a fully HDR pipeline down into 8-bit LDR through tone mapping, unless your TV supports 10 bits, in which case you can leave the firehose of beautiful dynamics open.
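The banding is easy to demonstrate: quantize a dim gradient, the kind you get in a night sky, to 8 and 10 bits and count the distinct steps. A toy sketch; the 5% signal range is just an assumed example:

[CODE]
// Toy illustration of why 8-bit banding shows up in games: quantize a
// shallow luminance ramp and count the distinct output levels.
#include <cmath>
#include <cstdio>
#include <set>

int levelsAcrossRamp(int bits, double lo, double hi) {
    const double maxCode = std::pow(2.0, bits) - 1.0;
    std::set<long> codes;
    for (int i = 0; i <= 1000; ++i) {
        double v = lo + (hi - lo) * i / 1000.0;  // normalized 0..1 signal
        codes.insert(std::lround(v * maxCode));
    }
    return static_cast<int>(codes.size());
}

int main() {
    // A dim gradient spanning 5% of the signal range, e.g. a night sky.
    printf(" 8-bit: %d steps\n", levelsAcrossRamp(8,  0.10, 0.15)); // ~14
    printf("10-bit: %d steps\n", levelsAcrossRamp(10, 0.10, 0.15)); // ~52
    return 0;
}
[/CODE]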
Trust me, when you see HDR in Unreal IV, you are going to eat your words. Currently the only way to get that to a UHD TV is 4:2:2 chroma subsampling, and that's a fact. Refusing HDR because it requires subsampling is like entering your Honda Civic in a Grand Prix because it gets better mileage on the drive to the corner store. Different speeds, different needs.
Yes, desktop use should use 4:4:4, but games? Heck no. That's what you're not getting: we already don't render many internal buffers at full resolution. There are TONS of compromises running all the time in rendering engines to keep things fast.
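To make that concrete, here's the kind of thing every engine does routinely. The buffer names and ratios are made-up examples, not any particular engine's setup:

[CODE]
// Illustrative only: engines routinely render expensive passes into
// reduced-resolution targets and upsample them -- the same spirit as
// letting chroma ride at half resolution. Names are made up.
struct RenderTarget { int width, height; };

struct EffectTargets {
    RenderTarget bloom;        // quarter the pixels of the back buffer
    RenderTarget ssao;         // ambient occlusion at half res per axis
    RenderTarget volumetrics;  // fog / light shafts at 1/16th the pixels
};

EffectTargets MakeEffectTargets(int w, int h) {
    // Each pass is later composited with a bilateral or bicubic
    // upsample that viewers never notice in motion.
    return { { w / 2, h / 2 },
             { w / 2, h / 2 },
             { w / 4, h / 4 } };
}
[/CODE]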