Official 4:4:4 / Chroma Subsampling Thread - Page 16 - AVS | Home Theater Discussions And Reviews


post #451 of 462 Old 05-29-2015, 08:30 AM
RLBURNSIDE (AVS Special Member)

If you think a 1080p TV showing a UHD Bluray is going to be sharper, you are seriously deluding yourself.

It is in fact you who has no idea how the eye works.

http://hyperphysics.phy-astr.gsu.edu...n/rodcone.html

"Current understanding is that the 6 to 7 million cones can be divided into "red" cones (64%), "green" cones (32%), and "blue" cones (2%) based on measured response curves. They provide the eye's color sensitivity. The green and red cones are concentrated in the fovea centralis . The "blue" cones have the highest sensitivity and are mostly found outside the fovea, leading to some distinctions in the eye's blue perception."

"The rods are the most numerous of the photoreceptors, some 120 million, and are the more sensitive than the cones. However, they are not sensitive to color. They are responsible for our dark-adapted, or scotopic, vision. The rods are incredibly efficient photoreceptors. More than one thousand times as sensitive as the cones, they can reportedly be triggered by individual photons under optimal conditions."

So: roughly 7 million color-sensitive cones vs. 120 million rods.

There are SEVENTEEN TIMES as many luma-sensitive receptors as chroma-sensitive ones. And EACH rod is in itself FAR more sensitive to light than each cone is. Do the math.

This is why chroma subsampling works well and is based on sound science.

444 is foolish for anything except the Windows desktop.

Increasing the luma resolution from 1080p to 2160p is going to increase perceptual detail FAR more than increasing chroma resolution from 540p to 1080p.

I'm not going to debate this further; the science is not on your side. The industry is not on your side. Facts are not on your side, and neither is the marketplace for televisions, or the games industry.

The games industry is going to embrace HDR and wide color in a major way in the coming years, and they will have to do it through HDMI 2.0, since the chance of most TVs getting DisplayPort 1.3 is very low. And doing HDR through HDMI 2.0 is going to require dropping to 422 chroma subsampling to allow 12 bits to fit into 18 Gbps. That is just a fact. Yes, it would have very slightly better quality at 444, 12-bit, but you are going to have to choose.
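
If anyone wants to check the arithmetic, here's a rough Python sketch (it ignores the HDMI spec's exact pixel-packing rules, assumes the standard 594 MHz CTA-861 timing for 4K60, and treats 4:2:2 as riding in HDMI's fixed 24-bit container, which is why 12-bit 422 costs no more link bandwidth than 8-bit 444):

Code:
# Which 4K60 formats fit through HDMI 2.0's 18 Gbps link?
PIXEL_CLOCK = 594e6            # CTA-861 4K60 timing (4400 x 2250 totals at 60 Hz)
PAYLOAD     = 18e9 * 8 / 10    # 18 Gbps raw TMDS, 8b/10b coding -> 14.4 Gbps usable

def bits_per_pixel(subsampling, depth):
    if subsampling == "422":   # HDMI carries 4:2:2 (up to 12-bit) in a 24-bit container
        return 24
    return {"444": 3, "420": 1.5}[subsampling] * depth

for fmt, depth in [("444", 8), ("444", 10), ("444", 12), ("422", 12), ("420", 10)]:
    rate = PIXEL_CLOCK * bits_per_pixel(fmt, depth)
    print(f"{fmt} {depth}-bit: {rate / 1e9:5.2f} Gbps -> "
          f"{'fits' if rate <= PAYLOAD else 'too much'}")

Only 8-bit 444, 12-bit 422 and the 420 modes come in under the roughly 14.4 Gbps of usable payload; 10-bit or 12-bit 444 at 4K60 does not.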

You're entitled to play games forever on an 8-bit 1080p TV. Meanwhile, people are going to be jumping on HDR and wanting to take advantage of all the resolution UHD has to offer as well. And that means chroma subsampling.

HDR boosts sharpness in a major way for the EXACT same reason as above: it takes advantage not only of the greater dynamic range of our rods, but also of the resolution and sensitivity advantage we have in luma detail, due to the fact that we have 17 times as many of them.

Say it with me. 17 times more rods than cones. Let the truth sink in. Science. 'Nuff said.

Last edited by RLBURNSIDE; 05-29-2015 at 08:38 AM.
post #452 of 462 Old 05-29-2015, 08:54 AM
Chase Payne (Senior Member)

Quote:
Originally Posted by RLBURNSIDE View Post
If you think a 1080p TV showing a UHD Bluray is going to be sharper, you are seriously deluding yourself. [...] Say it with me. 17 times more rods than cones. Let the truth sink in. Science. 'Nuff said.
I don't think you're understanding me; how many times do I have to repeat myself?

When you move closer to an object, its apparent size increases: it takes up more of your visual field.
Our 20/20 vision covers only a limited amount of that field.
It just so happens that the 1080p-1440p range fits comfortably within that space.
4K goes outside the boundaries of our vision, and since the screen is flat and none of the content properly adjusts its field of view, the overall resolution is effectively lower: the image that used to fit at 1080p is now stretched beyond the ideal viewing area.


So what happens? Playing video games with collectibles becomes difficult, because your overall vision is worse; you'll tend to miss a lot of details that you could see plainly before.

But wait there's more to it.

4K content is unlikely to change at all, because changing the viewing angle effectively changes the aspect ratio, making it incompatible with older displays, which shrinks the market.
The only hope of fixing this issue is the Oculus Rift, because it requires extensive field-of-view changes; once this is implemented, games will provide a field-of-view option that will make the 4K experience significantly better and blow 1080p out of the water.



Does this make sense?


You can't compare chroma subsampling on video to video games; chroma subsampling is much more noticeable in rendered graphics, particularly around anti-aliasing. The difference isn't that noticeable at 4K because of the field-of-view issues: your own mind is blurring the differences in your peripheral vision.
post #453 of 462 Old 05-29-2015, 11:26 AM
NintendoManiac64 (Advanced Member)

Quote:
Originally Posted by Chase Payne View Post
I would just buy cards with the best power-per-dollar ratio; the 970s give you the most graphics power for your money, much more than other cards have in the past.
But then you have to deal with the silly 3.5/0.5GB memory partition thing.
post #454 of 462 Old 05-29-2015, 06:19 PM
Chase Payne (Senior Member)

Quote:
Originally Posted by NintendoManiac64 View Post
But then you have to deal with the silly 3.5/0.5GB memory partition thing.
Even then, it is still the best card you can get for your money right now. It has the highest power-per-dollar ratio; I have two of them and have been very satisfied with them.
post #455 of 462 Old 05-29-2015, 08:01 PM
NintendoManiac64 (Advanced Member)

Ehhh, SLI 970s for 4k is one of the main cases where the 3.5/0.5GB partition silliness can cause issues, so personally I'd still be very hesitant with that set-up for such a use-case.
post #456 of 462 Old 05-31-2015, 01:24 PM
Chase Payne (Senior Member)

Quote:
Originally Posted by NintendoManiac64 View Post
Ehhh, SLI 970s for 4k is one of the main cases where the 3.5/0.5GB partition silliness can cause issues, so personally I'd still be very hesitant with that set-up for such a use-case.
Never really had much of a problem; just make sure you're not using the last 500 MB of memory. SLI isn't even using the full potential of the memory anyway. When DirectX 12 goes mainstream, it will use the memory of both cards instead of just duplicating the load. Considering most games can run with less than 4 GB of memory even at 4K on medium-high settings, this won't be an issue at all.
post #457 of 462 Old 06-20-2015, 08:20 PM
Squishy Tia (Advanced Member)

Quote:
Originally Posted by Chase Payne View Post
So what happens? Playing video games with collectibles becomes difficult, because your overall vision is worse; you'll tend to miss a lot of details that you could see plainly before.

But wait there's more to it.

4K content is unlikely to change at all, because changing the viewing angle effectively changes the aspect ratio, making it incompatible with older displays, which shrinks the market.
The only hope of fixing this issue is the Oculus Rift, because it requires extensive field-of-view changes; once this is implemented, games will provide a field-of-view option that will make the 4K experience significantly better and blow 1080p out of the water.

You can't compare chroma subsampling on video to video games; chroma subsampling is much more noticeable in rendered graphics, particularly around anti-aliasing. The difference isn't that noticeable at 4K because of the field-of-view issues: your own mind is blurring the differences in your peripheral vision.
Video games on a computer at 4K are rendered directly without any chroma subsampling, assuming the TV is in PC mode or whatever mode is required to defeat CS. Depending on the allowable texture size, your VRAM, and system RAM, you can get more detail out of 4K than you can at 1080p. Whether or not you can sustain good framerates is not part of the discussion and has no impact on the actual visible content. You'll often find small objects easier to pick out at 4K than at 1080p, unless you're nearsighted, because they tend to have better contrast and/or are separated more clearly from the background and surrounding objects.

@RLBURNSIDE : For computer use, you definitely want to be able to defeat chroma subsampling. All you have to do is visit this page and check the last two lines in the first major image to see what I mean. The text will be very easy to read with CS disabled vs. CS enabled. In fact, CS will actually cause blurring on computer-generated images that render in 4:4:4:4/8:8:8:8 colorspace. I can easily see the difference with forum text and in-game chat text by hitting a keyboard shortcut that changes my resolution from 1080p60 to 1080p24. At 1080p60 the TV is in PC Mode and CS is disabled, so text is very easy to read. At 1080p24 PC Mode is disabled and the TV's processing (and CS) are enabled and cannot be defeated even in Game Mode. Same image, same resolution, chroma subsampling is the only difference, and BAM, unreadable text because everything is blurred. You get color bleed like crazy with CS enabled.

To both of you: Sitting at the proper distance from a display of any resolution (1080p, 1440p, 2160p), such that you have at least 53 PPD (pixels per degree), will give you the proper FoV for that resolution. That is why you have to sit further back at 1080p than at 2160p to not notice individual pixels: the fewer pixels there are in any given space, assuming equal screen sizes, the easier it is to see the differences. For current content, 4K simply makes it so that individual pixels are less easily discerned. For computer use, more detail can be implemented at 4K than at 1080p, but that's because it is being directly rendered by the GPU, not upscaled the way 1080p content would be on the same 4K TV. However, at a proper ≥53 PPD viewing distance, 4K will appear sharper at equal screen sizes because you can't see individual pixels easily, or at all.
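
For anyone who wants to plug in their own numbers, here's a quick Python sketch of that PPD math (the 55" screen size and the two resolutions are just examples):

Code:
import math

def ppd(diag_in, res_w, res_h, distance_in):
    """Pixels per degree at the centre of a flat panel."""
    pitch = diag_in / math.hypot(res_w, res_h)              # inches per pixel
    return 1 / math.degrees(2 * math.atan(pitch / (2 * distance_in)))

def distance_for_ppd(diag_in, res_w, res_h, target=53):
    """Closest viewing distance (inches) that still gives the target PPD."""
    pitch = diag_in / math.hypot(res_w, res_h)
    return pitch / (2 * math.tan(math.radians(1 / target) / 2))

for w, h in [(1920, 1080), (3840, 2160)]:
    d = distance_for_ppd(55, w, h)
    print(f'55" {w}x{h}: at least {d / 12:.1f} ft away for 53 PPD '
          f'({ppd(55, w, h, d):.0f} PPD check)')

That works out to roughly 6.3 ft for a 55" 1080p panel and roughly 3.2 ft for the same size at 2160p, which is exactly the "sit further back at 1080p" point above.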

If you don't use your TV as a computer display, CS won't matter much, if at all. But if you do, it's definitely going to affect far more than just the "windows desktop".

Edit: Because the "at" symbol is forcibly recognized as "mention" and screws up poster references. Bleh.

When a Priest says they're going to Flash you, it isn't for healing.

post #458 of 462 Old 06-21-2015, 08:52 AM
RLBURNSIDE (AVS Special Member)

Ok let me clear up a few things.

I always maintained that you can tell chroma subsampling is being used for 1-pixel wide text and lines. This fact is not in dispute.

If you go back and read what I wrote, I said that when I was looking for a UHD TV to use as a PC monitor, I wouldn't get one without 4:4:4 at 60 Hz.

HOWEVER, games and such do NOT render aliased 1-pixel-wide text and lines, especially not at UHD resolutions, the way your Windows desktop does. The main text-heavy games, like MMOs, have scaled, antialiased text which, at UHD at least, is probably more than 3 pixels wide in order to be visible at a normal seating distance, meaning the chroma "bleed" is at most 25%, and even that is "antialiased" if you enable edge sharpening on your TV.

Tia, in your own post you even state that at UHD the increased resolution means that if you don't sit closer, you have 4x less pixel visibility, meaning your chroma bleed is under 1/4 of what it would be at 1080p, 420 vs 444. Yes, it's visible, but you have to get FOUR TIMES closer.

You guys and gals can stick to 444 all you want at UHD, but that means you are giving up 10-bit color, which means no HDR and no WCG for you.

You cannot tell the difference with chroma subsampling being used in anything but 1-pixel-wide, high-contrast content like text or lines, unless you're walking right up next to your TV. The resolution charts that tell you what your visual acuity is in terms of resolution refer to the luma resolution, not the chroma. You are much less sensitive to chroma resolution, and that's a fact. You are entitled to your own opinions but not your own facts.

If you are sitting at a sufficient distance from your screen that you can't see the pixels, then you definitely can't see the color bleed from chroma subsampling at that same distance. Again, because of the science I cited, you have 120 million rods and 7 million cones, and therefore 17 times the static resolution in luma information compared to chroma, and that's being generous, because sensitivity to minute changes in chroma magnitude, not just chroma resolution, makes the comparison even worse for chroma than for luma.

Going from a 1080p to a UHD TV means that, in practice, chroma subsampling becomes 4x less of the non-issue it already is, for games at least.

Most (not all) games do not use 1-pixel-wide text, and if you're talking about an MMO with lots of scrolling text, then maybe it's worth it to you to give up 10-bit for 444 (on HDMI 2.0 TVs you can't have both; you have to choose), but you can easily get around that by simply increasing your font size slightly and turning up edge enhancement. Giving up HDR and WCG for 444 is foolish if you ask me.

If you are asserting that you can tell the difference between chroma subsampling ON and OFF in a moving game scene, I will call BS on that. You can't. Period. No way.

First off, most LCD TVs have poor motion resolution, around 600 lines or so, even with BFI enabled. Forget about UHD; most LCDs can't even do FHD properly when frames are in motion, especially not at 60 Hz. 120 Hz would be a big improvement in sharpness and clarity during motion. Even then, you couldn't tell if chroma subsampling was being used.

Next, even on static frames, with decent chroma upscaling filters and a high enough resolution, I bet you any money that you can't pick out which is better in a double-blind test, especially not at UHD.

Human visual acuity is simply not there. Your assertion is as baseless and unfounded as the idea that people can hear above 20 kHz or tell the difference between 16-bit and 24-bit audio in a double-blind test. If you're doing color correction, zooming in real close and editing per pixel, then yes, it matters. For lossy compression? Not so much. Sure, it would have been better if they had used JPEG instead of chroma subsampling for compression, but they didn't go that way.

At UHD, how close do you have to sit to see individual pixels? If you aren't sitting that close, you CANNOT tell that chroma subsampling is used. Because if you don't have the visual acuity to tell 1 pixel from its neighbour in LUMINANCE, then you definitely cannot tell the difference in chroma.

I make games for a living, and let me tell you, the era of static UIs with 1-pixel-wide content and unscaled, non-antialiased text is behind us. UIs are going in-game now, rendered just like the rest of the scene. We've done tests internally, and with a 422 YCoCg compact frame buffer people can't tell that chroma subsampling is being used.
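
For anyone wondering what that looks like, here's a rough numpy sketch of the general idea (this is an illustration, not our actual shader code; the published compact YCoCg frame buffer technique reconstructs the missing chroma with an edge-aware filter, whereas this just averages the four neighbours):

Code:
import numpy as np

def rgb_to_ycocg(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.25 * r + 0.5 * g + 0.25 * b
    co =  0.50 * r            - 0.50 * b
    cg = -0.25 * r + 0.5 * g - 0.25 * b
    return y, co, cg

def ycocg_to_rgb(y, co, cg):
    return np.stack([y + co - cg, y + cg, y - co - cg], axis=-1)

def compact_roundtrip(rgb):
    """Keep Y per pixel plus Co or Cg on a checkerboard, then reconstruct."""
    h, w, _ = rgb.shape
    y, co, cg = rgb_to_ycocg(rgb.astype(np.float32))
    ii, jj = np.indices((h, w))
    keeps_co = (ii + jj) % 2 == 0          # even pixels store Co, odd pixels store Cg
    stored = np.where(keeps_co, co, cg)    # the single chroma plane that survives

    # On a checkerboard, every 4-neighbour carries the *other* chroma component,
    # so the missing one is rebuilt by averaging the neighbours.
    p = np.pad(stored, 1, mode="edge")
    avg = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4.0

    co_rec = np.where(keeps_co, stored, avg)
    cg_rec = np.where(keeps_co, avg, stored)
    return np.clip(ycocg_to_rgb(y, co_rec, cg_rec), 0, 255).astype(np.uint8)

Feed it any (H, W, 3) uint8 screenshot and diff the result against the original: the error concentrates on hard chroma edges, which is exactly (and only) where subsampling can show.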

Anyway, you guys and gals do what you want. If you want to buy a TV with HDMI 2.0 on it that has HDR and WCG, and then not use those features because you think the 422 chroma subsampling used to get 10-bit to your TV is the bane of your existence (it's not, scientifically speaking), then you are free to do that.

Meanwhile, all your HDMI 2.0 cards and TVs are subject to this bandwidth limitation. 8-bit color SUCKS for games. It's got tons of banding and other horrible artifacts, the worst of which is the fact that you need to quantize a fully HDR pipeline into 8-bit LDR through tone mapping, unless your TV supports 10-bit, in which case you can leave the firehose of beautiful dynamics open.
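
To make the banding point concrete, here's a tiny illustration (nothing engine-specific, just counting how many distinct output steps a dim gradient survives quantization with):

Code:
import numpy as np

gradient = np.linspace(0.0, 0.1, 3840)      # a dim 0-10% ramp across one 4K scanline
for bits in (8, 10):
    levels = np.unique(np.round(gradient * (2 ** bits - 1)))
    print(f"{bits}-bit: {levels.size} distinct steps across the ramp")

Roughly 27 steps at 8 bits versus roughly 103 at 10 bits over the same ramp; those big 8-bit steps are where the visible banding comes from.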

Trust me, when you see HDR in Unreal Engine 4, you are going to eat your words. Currently the only way to get that to a UHD TV is to use 422 chroma subsampling, and that's a fact. Not using HDR because it requires subsampling is like racing your Honda Civic at a Grand Prix because it gets better mileage when you drive to the corner store. Different speeds, different needs.

Yes, desktop use should use 444, but games? Heck no. We already don't render many internal buffers at full resolution. That's what you're not getting. There are TONS of compromises being made all the time in rendering engines to keep things running fast.

Last edited by RLBURNSIDE; 06-21-2015 at 09:08 AM.
post #459 of 462 Old 06-21-2015, 01:16 PM
NintendoManiac64 (Advanced Member)

There are 10-bit 4K monitors... and with the standard reduced timings (full timings are only needed for CRTs), DisplayPort 1.2 at least has enough bandwidth to do this with 4:4:4 chroma at 60 Hz.
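
Rough numbers, assuming the commonly quoted ~533 MHz CVT-R2 pixel clock for 3840x2160 @ 60 Hz with reduced blanking:

Code:
PAYLOAD     = 4 * 5.4e9 * 8 / 10   # DP 1.2 HBR2: 4 lanes x 5.4 Gbps, 8b/10b -> 17.28 Gbps
PIXEL_CLOCK = 533.25e6             # 3840x2160 @ 60 Hz with CVT-R2 reduced blanking

needed = PIXEL_CLOCK * 3 * 10      # 10-bit 4:4:4 / RGB = 30 bits per pixel
print(f"4K60 10-bit 4:4:4 needs {needed / 1e9:.1f} of {PAYLOAD / 1e9:.2f} Gbps available")
# roughly 16.0 of 17.28 Gbps -- tight, but it fits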
post #460 of 462 Old 06-21-2015, 01:43 PM
RLBURNSIDE (AVS Special Member)

Oh yeah, for sure. If you can get a TV with a DisplayPort input, that's a smart move for PC use. Unfortunately that's not on the menu for most TVs in 2015.

DisplayPort 1.3 would be a great thing to have on a UHD TV that would do 120 Hz and 12-bit color and HDR and FreeSync, all at 4:4:4. I just think it's absurd to give up HDR and wide color gamut for 444 in a game; it's a dumb thing to do. The numerous benefits 10-bit offers would massively outweigh the negatives of 422 vs 444.

I collected some lossless PNG screenshots at 2560x1440 of various games and wallpapers, including some that are super high detail and others that are more natural looking, and used GIMP to export them to 422 and 420 to compare.

The only time I could see the difference between the original (RGB / 444) and 420 was when I put my eye closer than 6 inches to my LCD monitor. And that's on an image with super fine detail and no AA, like a worst-case scenario. I'm using an Acer 1440p G-Sync monitor, and yes, once you look really, really close, 420 does lose sharpness. But on more natural outdoor scenes there was no perceptible difference. I defy people to be able to tell any difference at UHD, even on static images, except in the very worst-case scenario and only from very, very close up to the screen.
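
If anyone wants to repeat the comparison without GIMP, something like this Pillow sketch gets close enough (the bilinear chroma resize is a stand-in for a real encoder's filters, and the filename is just an example):

Code:
from PIL import Image

def simulate_420(path):
    """Halve the Cb/Cr planes and scale them back up: a rough 4:2:0 round-trip."""
    img = Image.open(path).convert("RGB")
    y, cb, cr = img.convert("YCbCr").split()
    half = (img.width // 2, img.height // 2)
    cb = cb.resize(half, Image.BILINEAR).resize(img.size, Image.BILINEAR)
    cr = cr.resize(half, Image.BILINEAR).resize(img.size, Image.BILINEAR)
    Image.merge("YCbCr", (y, cb, cr)).convert("RGB").save(
        path.rsplit(".", 1)[0] + "_420.png")

simulate_420("screenshot_1440p.png")   # any lossless PNG screenshot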

However, chroma subsampling actually looks like FXAA to me, or maybe TXAA. Some people don't like any AA, but that's foolish. Fine edge detail in games is more likely to come from aliasing than from any discernible or important high-frequency detail. The best way I can describe 420 is as a very mild full-scene antialiasing which is totally free. Not "costs only 1 ms" free, I mean 0 ms of extra cost, since the TV does the upscaling and filtering.

At UHD resolution, you can probably sit even closer than I am to my monitor and not see the difference.

It's all about DPI at that point. But most of the detail your eyes see comes from high-contrast changes in luminance information; that's why FXAA operates only on luma (at least in the compute shaders I've seen recently that do FXAA quite well and cheaply). Although to be fair, 420 subsampling is probably closer to the TXAA in Unreal Engine 4. I think it would be the height of ridiculousness to even attempt to rationalize the need for hard 1-pixel edges in a game that uses any kind of edge AA. In between edges, where there are no glaring luminance gradients, you definitely can't tell that chroma is being subsampled, even very close up. So we're only talking about high-contrast edge detail, detail which is probably already antialiased, since it's rare for lines to be perfectly vertical or horizontal in a game in anything but menu items, which are probably not 1 pixel wide anyway.

I'd like to write up a script to spider Google Images for 1440p and UHD PNGs, auto-generate 422 and 420 versions of those, and flip the originals vs. the subsampled versions (still losslessly encoded in the end, at the same final resolution), and get people to swipe left or right if they think there's subsampling going on. I bet you that people can't reliably tell the difference, and even that many people would prefer the subsampled versions.
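
The flip test itself would only take a few lines (leaving out the Google Images spidering); this sketch assumes the original/subsampled pairs are already on disk under a made-up naming scheme, e.g. generated with the snippet above:

Code:
import glob, random
from PIL import Image

correct = total = 0
for original in sorted(glob.glob("samples/*_orig.png")):
    subsampled = original.replace("_orig.png", "_420.png")
    pair = [("original", original), ("subsampled", subsampled)]
    random.shuffle(pair)
    for n, (_, path) in enumerate(pair, 1):
        Image.open(path).show(title=f"image {n}")   # opens in the default viewer
    guess = int(input("Which one was subsampled, 1 or 2? "))
    correct += pair[guess - 1][0] == "subsampled"
    total += 1

print(f"{correct}/{total} correct; around 50% means you were just guessing")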

During moving scenes though, there is ZERO chance that chroma subsampling is detectable by human vision at UHD resolution, even from up close. There is just no way. Probably not even at FHD either. Static images of real game scenes do exhibit some loss of high-contrast, very fine detail, but it's minor and totally undetectable once you are 12+ inches from the monitor, or in more natural scenes that already involve AA.

Any time you have AA operating on your pixels, you are blurring the edges. Edges are where high-contrast detail is visible, hence MSAA and FXAA focusing on the edges of geometry or on edges in the final raster image. 420 to me looks like a form of TXAA. Lots of people don't like it. Anyway, it's up to the TVs now.

But 422 + 10-bit WCG + HDR is going to look insanely better than 8-bit SDR. Anyone arguing against that has not seen the Unreal Engine 4 Dolby Vision demo. It is STUNNING. That's why I stand by my assertion: if you want a good TV, get one with HDR and 444, but when HDR is available in-game, lose the 444 and pick HDR instead.
post #461 of 462 Old Today, 11:14 PM
AgentHEX (Member)

Quote:
Again, because of the science I cited, you have 120 million rods and 7 million cones, and therefore 17 times the static resolution in luma information compared to chroma, and that's being generous, because sensitivity to minute changes in chroma magnitude, not just chroma resolution, makes the comparison even worse for chroma than for luma.
This is irrelevant, because the central part of the retina that does most of the seeing (i.e. not peripheral vision) has high cone density. Also, human eye resolution is limited by the diffraction limit imposed by the pupil, not by the number of "sensors".

Quote:
During moving scenes though, there is ZERO chance that chroma subsampling is detectable by human vision at UHD resolution, even from up close. There is just no way.
This is true in general for both resolution and color. The visual system (notice I don't say "eye") tends to work at a higher level anyway, and even the pixel peepers are more sensitive to source problems like encoding and various other artifacts.
post #462 of 462 Old Today, 11:19 PM
AgentHEX (Member)

BTW, are the current Samsungs and maybe LGs still the main feasible desktop-monitor replacements (i.e. 40-50 inch sizes, 444/full color)? I was looking at the JU6700 model, but the low PWM backlight frequency gives me pause for someone who is going to spend a lot of time in front of it.