What bit depth (color depth) should I expect on 2012 TV with new HDMI cables? - AVS Forum
post #1 of 17 Old 02-05-2013, 05:46 PM - Thread Starter
basil lambri - Member
I have a question about what kind of bit depth (color depth) I should expect from my TV's picture with new HDMI cables that I just bought. I was hoping that some of you tech-savvy guys might have an answer.
I have a 2012 LED TV made by Samsung. I bought 2 new NXG HDMI cables the other day from Radio Shack, and they were inexpensive. I use one HDMI cable to connect the TV to the HD cable TV converter box and one to connect the TV to the Blu-ray player. Those NXG HDMI cables say they can even be used for 4K TV, so I guess they are HDMI 1.3 type, or even 1.4. They also say they support "Deep Color," that is, color depth of up to 48-bit.

My question is: what kind of bit depth should I expect in the picture on my 2012 Samsung TV using those new HDMI cables when watching HD cable TV or BD discs, etc.? Could it be 32-bit, or is what actually comes out on the TV more like 10-bit? In any case, I would say that the picture I get from the cables is nice and vibrant.

Also, by the way, what bit depth (color depth) would you say one gets on a new LED TV like mine when using a simple RF coaxial cable connected to an outdoor antenna and watching over-the-air TV?

Edit: My thread was posted twice (instead of once) by mistake, but I stand by what I said! :)
post #2 of 17 Old 02-19-2013, 03:11 PM
TomCat - AVS Special Member
There is no good reason to be concerned about this. Consumer HD as delivered by OTA, cable, satellite, or the internet is never more than 8-bit 4:2:0. Simply transporting 8-bit 4:2:0 video through a chain that can handle higher bit depths or richer color schemes imparts no additional quality; the quality level is fixed at 8-bit 4:2:0.

The only need for a higher bit depth is in acquisition or the production chain, and the reason there is the potential for generational losses during editing and compositing. There are no generational losses in consumer delivery and display because there is no actual binary math done that would introduce them. Once consumer HD is received, it is not manipulated inside the digital domain at all; it is simply decoded, delivered to the light engine, converted to analog, and displayed.

There is consumer-level "digital processing", but that is a misnomer. There are digitally-controlled parameters in how digital or analog signals are moved through the display chain, which, if you bend the hype sufficiently, can be referred to as "digital processing", but there is no actual processing that happens to the signal itself within the digital domain. True digital processing inside the digital domain is strictly avoided because, with 8-bit 4:2:0 video at consumer-level bit rates, the losses to PQ would be apparent.

The dirty little secret is that uncompressed HDMI HD video is regularly converted to analog for processing immediately as it enters the HDTV, usually in a second stage (DAC stage) of the same HDMI RX chip that it enters on. It can then be moved through the signal chain by digitally-controlled circuitry, but since the signal by then is either analog, or digital with no actual processing happening within the digital domain, it is really a misdirect to refer to that as "digital processing". It's more of a bald-faced lie, actually, but they want to sell you on it being digital because that gives the false impression of state-of-the-art technology that consumers crave.

Analog processing in a closed environment such as this has virtually no artifacts and is cleaner than digital processing (manipulation by performing a mathematical operation), so it is the approach of choice. Manufacturers do not want to reveal this to you because the general public has the incorrect perception that analog means lower quality. In this particular application it is actually much better.

There's no place like 127.0.0.1
post #3 of 17 Old 02-19-2013, 03:12 PM
Otto Pylot - AVS Special Member
Blu-ray discs are encoded in 24-bit color (8 bits for each color, R-G-B). Certified High Speed HDMI cables are capable of carrying higher bit depths, but nothing, so far, is encoded at higher bit depths. Up-converting 24-bit color to 30, 36, or 48 will only introduce contouring with a resultant loss of PQ. Deep Color is a nice idea, and most Blu-ray players and TVs can support it, but there is nothing commercially available that uses it. It's sort of a spec for the future that really hasn't taken off. It's best to leave Deep Color disabled because you will have issues.
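
To put rough numbers on those labels, here is a quick illustrative sketch (plain Python, nothing specific to any particular player or display) of how per-channel bit depth relates to levels per channel and total colors per pixel:

Code:
# Per-channel bit depth -> levels per channel -> total colors per pixel.
for bits_per_channel in (8, 10, 12, 16):
    levels = 2 ** bits_per_channel       # e.g. 256 levels at 8 bits
    total_bits = bits_per_channel * 3    # R + G + B ("24-bit", "30-bit", ...)
    colors = levels ** 3                 # distinct RGB combinations
    print(f"{bits_per_channel:>2}-bit/channel = {total_bits}-bit color: "
          f"{levels} levels per channel, {colors:,} colors")
# 8-bit/channel  = 24-bit color: 256 levels per channel, 16,777,216 colors
# 16-bit/channel = 48-bit color: 65,536 levels per channel, 281,474,976,710,656 colors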

HDMI cables are either Standard or High Speed. The numbered designation (1.3, 1.4a, etc) is a hardware spec, not a cable spec.
post #4 of 17 Old 02-19-2013, 05:07 PM
TomCat - AVS Special Member
Quote:
Originally Posted by Otto Pylot View Post

Blu-ray discs are encoded in 24 bit color (8 bits for each color, R-G-B)...
That is then a misleading misnomer. It really means that each pixel value is represented by an 8-bit digital word, which is exactly what the definition of 8-bit video is. The Y, Cb, and Cr in Y Cb Cr are also each represented by 8-bit digital words for each pixel, and no pixel in either format is represented by a single 24-bit word, which is what "24-bit" really implies. I guess we can chalk that up to the conventions of how video is termed compared to how PC monitors are described, but since Blu-ray is really a video format, it should conform to the "8-bit" designation, because that is what it truly is.

The difference is that RGB is a true 4:4:4 colorspace, and actual colors are represented by those coefficients, which are added together to produce an actual color (usually in the human visual system). Y Cb Cr works much the same way, except for two differences: first, the RGB colors are matrixed together to derive a luminance signal and two color-difference signals, which are later dematrixed back to the original colors, a completely lossless conversion.

The second difference is what happens once they are matrixed to Y Cb Cr: the colorspace is subsampled from 4:4:4 to, in this case, 4:2:0, which essentially means the chroma resolution is reduced by a factor of four (one Cb or Cr coefficient represents a group of 4 pixels rather than each pixel having its own coefficient) while the luminance resolution stays the same. For Cb or Cr, pixels 1 and 2 of line 1 are averaged, and that value is also designated as the value for pixels 1 and 2 of line 2; that repeats throughout the entire pixel map.
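
As a rough illustration of that averaging, here is a toy Python/NumPy sketch (the function names are made up for illustration; real encoders use specific chroma siting and filtering, so this shows only the resolution reduction, not the exact MPEG math):

Code:
import numpy as np

def subsample_420(chroma):
    """Reduce a full-resolution chroma plane (4:4:4) toward 4:2:0 by
    averaging each 2x2 block of pixels into a single stored value."""
    h, w = chroma.shape
    blocks = chroma[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2)
    return blocks.mean(axis=(1, 3))      # one Cb (or Cr) value per 4 pixels

def upsample_420(chroma_sub):
    """Nearest-neighbour reconstruction: repeat each stored value back
    over its 2x2 block. The luma plane would be left untouched."""
    return np.repeat(np.repeat(chroma_sub, 2, axis=0), 2, axis=1)

cb = np.random.randint(0, 256, (4, 4)).astype(float)    # toy 4x4 Cb plane
print(upsample_420(subsample_420(cb)))   # 16 pixels, only 4 distinct chroma values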

Obviously, that is a degradation of color representation not seen in true RGB, but because human vision is far less able to perceive fine detail in color than in luminance, it is not at all perceptually significant, which is exactly why they do it.

Still, both are truly 8-bit systems, regardless of what they are branding it as.

There's no place like 127.0.0.1
post #5 of 17 Old 02-20-2013, 08:37 AM
Otto Pylot - AVS Special Member
^^ So do we agree or disagree on the basics? Your explanation is obviously much more detailed than mine, but I can't quite figure out if you're correcting me or just giving a more precise definition of my simplified response.
post #6 of 17 Old 02-20-2013, 08:57 AM
Chronoptimist - AVS Special Member
Blu-ray and basically all MPEG content is 8-bit 4:2:0 YCC.

Outputting that as 10-bit, 12-bit, or 16-bit should not hurt image quality in any way. In fact, because HDMI does not support 4:2:0 transmission, the content has to be upsampled to 4:2:2 or 4:4:4, or converted to RGB, and there are clear benefits to doing that with more than 8 bits of precision.
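
As a rough illustration of why the extra precision helps, here is a toy Python sketch using approximate full-range BT.709 conversion coefficients (a real TV pipeline also handles limited range, chroma upsampling, and so on). The only point is that rounding the converted result to 8 bits lands further from the exact value than rounding to 10 or 12 bits:

Code:
import numpy as np

# Approximate full-range BT.709 YCbCr -> RGB matrix (illustration only).
M = np.array([[1.0,  0.0,     1.5748],
              [1.0, -0.1873, -0.4681],
              [1.0,  1.8556,  0.0]])

def ycbcr_to_rgb(y, cb, cr, out_bits):
    """Convert one 8-bit YCbCr sample to RGB (0..1 range), quantizing
    the result to out_bits per channel."""
    rgb = M @ np.array([y / 255.0, cb / 255.0 - 0.5, cr / 255.0 - 0.5])
    levels = 2 ** out_bits - 1
    return np.round(np.clip(rgb, 0, 1) * levels) / levels

y, cb, cr = 180, 90, 140                 # an arbitrary 8-bit YCbCr sample
exact = M @ np.array([y / 255.0, cb / 255.0 - 0.5, cr / 255.0 - 0.5])
for bits in (8, 10, 12):
    err = np.abs(ycbcr_to_rgb(y, cb, cr, bits) - exact).max()
    print(f"{bits}-bit output: max rounding error = {err:.6f}")
# The 10- and 12-bit outputs land measurably closer to the exact result.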

So I would recommend outputting 4:4:4 or RGB if you have the option, and using the highest bit-depth available to you. Even though most displays are 10-bit native or less, most will be performing at least some kind of processing on the image before it is displayed, so you want to give it as much information to work with as possible.

I can't think of any reason it would be detrimental to send a high bit-depth signal, but I can definitely think of reasons why it might be detrimental to only be outputting 8-bits.
post #7 of 17 Old 02-25-2013, 04:14 AM
Joe Bloggs - AVS Special Member
Quote:
Originally Posted by Otto Pylot View Post

Blu-ray discs are encoded in 24 bit color (8 bits for each color, R-G-B). Certified High Speed HDMI cables have the capability of sending higher bit rates but there in nothing, so far, that is encoded in higher bit rates. Up-converting 24 bit color to 30, 32, or 48 will only introduce contouring with a resultant loss of pq.
Why would properly up-converting 24-bit colour to, e.g., 48-bit colour introduce contouring? Surely if the TV were capable of properly showing 48-bit colour (16 bits per channel), it should lead to less contouring (banding). Contouring is when you can see the steps from one colour to the next; with up-conversion you could create in-between values, so there would be smaller steps and less banding. Though I suppose you'd have to try to differentiate between what was supposed to be a step change (which shouldn't be interpolated) and what was supposed to be a smooth change without steps (which could be interpolated).
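
Here is a toy Python sketch of that distinction (purely illustrative; no real TV works exactly this way): a simple scale-up to 16 bits creates no new in-between shades, while filtering the banded gradient at the higher precision does, at the cost of also softening edges that were meant to be hard steps:

Code:
import numpy as np

# An 8-bit "banded" gradient: the value steps up once every 4 pixels.
ramp8 = np.repeat(np.arange(100, 110, dtype=np.uint16), 4)

# Naive up-conversion to 16 bits: scale by 257 so 255 maps to 65535.
# No new in-between shades appear; the steps stay just as far apart, relatively.
naive16 = ramp8 * 257

# Smoothing at 16-bit precision *can* create intermediate values, which is
# the "interpolate where a hard step wasn't intended" idea above.
kernel = np.ones(5) / 5
smoothed16 = np.convolve(naive16, kernel, mode="same")

print(len(np.unique(naive16)), "distinct levels before smoothing")        # 10
print(len(np.unique(np.round(smoothed16))), "distinct levels after smoothing")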
post #8 of 17 Old 02-25-2013, 10:04 AM
Tazishere - Senior Member
When I set my Panasonic GT55 to "wide color", the banding on sunrises/sunsets disappears. Is this an example of consumer level increased bit depth?
post #9 of 17 Old 02-25-2013, 12:25 PM
8mile13 - AVS Special Member
Quote:
Originally Posted by Tazishere 
When I set my Panasonic GT55 to "wide color", the banding on sunrises/sunsets disappears. Is this an example of consumer level increased bit depth?
"Wide Color" probably has nothing to do with xvYCC (which expands the overall color gamut) or Deep Color (which increases the available bit depth for each color component).

On Sony TVs, Wide Color Space gets you more vivid colors. Wide Color Space does to color what motion interpolation does to frames, kind of. Normal Color Space produces the most accurate, natural-looking result. It seems to me that "Wide Color" is Panasonic's version of Wide Color Space.
post #10 of 17 Old 02-26-2013, 11:46 AM
Tazishere - Senior Member
So what is causing the banding to disappear in that setting?
post #11 of 17 Old 02-26-2013, 04:13 PM
8mile13 - AVS Special Member
Whatever it is, there is no increased bit depth when set to Wide Color.
post #12 of 17 Old 02-26-2013, 07:18 PM
Chronoptimist - AVS Special Member
Quote:
Originally Posted by 8mile13 View Post

Whatever it is there is no increased bit depth when set to Wide Color.
There is no (or minimal) CMS being used when wide color is selected; the panel is just running at its native gamut.
To constrain the gamut to spec, you are effectively throwing away bits to implement the CMS.

So if you disable the CMS, you have better gradation (reduced banding) but less accurate color.

Here is the simplest way I can explain it:

[Image: gamutwzp66.jpg - a CIE xy chromaticity diagram showing the full visible gamut, the BT.2020 triangle (dotted), the BT.709 triangle (solid), and an added white reference line]

To be clear about what this image is showing: the outer "triangle" that is colored in covers the entire gamut that the human eye can see. The dotted triangle is the BT.2020 spec (UHDTV) and the solid inner triangle is the BT.709 spec (HDTV). That doesn't actually matter, but it saves me having to explain it.
I have added a white line for illustrative purposes. This image is only a 2D representation of color, but the reality is that it is three dimensional. That doesn't matter so much for this explanation either.

And as a note of caution, I am using the terms gradation, precision and bit-depth interchangeably.


If we are working in 8-bit RGB we have signals that can range from 0 to 255 for each red, green, and blue component.
If you were to send a 000,000,000 signal, it would be in the center of the triangle.
If you were to send it 000,255,000 that is 100% green, and would be right at the very green corner of the triangle.
If you were to send it 000,128,000 that is 50% green and it would be halfway between the center and the corner.


Now to keep things simple, imagine that you have a panel that is capable of 12-bit gradation, with a native gamut that covers the BT.2020 gamut exactly - the dotted triangle in this illustration.
But you are not watching Ultra HD content on it, you are watching regular HD (BT.709) content on it - the smaller solid triangle in the middle.

Now look at where the inner triangle intersects with that white line - it's roughly 1/3 of the way along it. By having a very wide native gamut, and using a CMS to bring it to spec, you have effectively turned a 12-bit panel into a 4-bit one along that axis.
If you look at 100% green for the BT.709 triangle (the corner) it's closer to halfway along the line, so you have the equivalent of maybe a 6-bit panel there.

Red and blue are a lot closer, so you won't lose so much precision from constraining the gamut with them - though this is a CIE xy chart rather than a CIE uv chart, so the difference is actually bigger than it appears here.

(Technical note: a 12-bit panel would actually have 4096 shades of gradation, and 1/3 of that would be 1365, which is actually still more than 10-bit (1024) but hopefully you see the point I am trying to make - you're losing a lot of precision)
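
If you want to put rough numbers on that, here is a back-of-the-envelope Python sketch; the per-axis fractions are only eyeballed from the chart for illustration, not measured values for any real panel:

Code:
import math

def effective_gradation(panel_bits, fraction_of_native_axis_used):
    """Rough effective gradation left after a CMS maps the target gamut
    onto only a fraction of the panel's native range along one axis."""
    usable_levels = (2 ** panel_bits) * fraction_of_native_axis_used
    return usable_levels, math.log2(usable_levels)

# Illustrative fractions only (read off the chart by eye):
for axis, frac in (("green", 1 / 3), ("red", 0.8), ("blue", 0.9)):
    levels, bits = effective_gradation(12, frac)
    print(f"{axis}: ~{levels:.0f} usable levels, about {bits:.1f} effective bits")
# green: ~1365 usable levels, about 10.4 effective bits (matches the note above)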


Now today's displays are a lot closer to the BT.709 spec than that, but the best we have on the consumer side is only a 10-bit panel with the best LCDs and SXRD displays. Depending on how you want to count it, technically you could say that a Plasma has more gradation than that (just look at Panasonic's numbers - 10-bit only has 1024 steps of gradation but Panasonic advertise thousands) though in reality, I would argue that a Plasma typically shows less than 8-bit - and that is at its native gamut. Once you start using a CMS to reduce the gamut further, you are losing even more precision.

This is why it's actually bad for displays to have wide native gamuts - at least when they are intended for displaying BT.709 content. Once you go beyond the BT.709 gamut, you need more and more precision to make up for what you have to throw away to bring the panel back into spec.

That's why LCDs using white LEDs in their backlight, rather than RGB LEDs, can actually put out a better picture, as long as they cover the BT.709 gamut in its entirety. White-LED-backlit sets have a native gamut that is very close to BT.709 and don't need much correction, so they should have very good gradation. RGB LEDs cover an extremely wide gamut, so you are losing a lot of precision to bring them into spec, and a 10-bit RGB-LED-backlit display could potentially have worse gradation than an 8-bit white-LED-backlit display.

It's also why a 10-bit OLED set with its very wide gamut can still show banding when displaying BT.709 content.
post #13 of 17 Old 02-27-2013, 07:38 AM
8mile13 - AVS Special Member
Quote:
Originally Posted by Chronoptimist 
There is no/minimal CMS being used when wide color is selected, the panel is just running at its native gamut.
To constrain the gamut to spec, you are effectively throwing away bits to implement the CMS.

So if you disable the CMS, you have better gradation (reduced banding) but less accurate color.
So there are more bits, better gradation, but less accurate color on the GT55 when using the "Wide Color" option.
Quote:
Originally Posted by Chronoptimist View Post

That's why LCDs using white LEDs in their backlight, rather than RGB LEDs, can actually put out a better picture, as long as they cover the BT.709 gamut in its entirety. White-LED-backlit sets have a native gamut that is very close to BT.709 and don't need much correction, so they should have very good gradation. RGB LEDs cover an extremely wide gamut, so you are losing a lot of precision to bring them into spec, and a 10-bit RGB-LED-backlit display could potentially have worse gradation than an 8-bit white-LED-backlit display.

It's also why a 10-bit OLED set with its very wide gamut can still show banding when displaying BT.709 content.
There are just a few RGB-LED LCDs (Sony XBR8, Sharp Aquos LC-xxXS1E). The 10-bit RGB XBR8 LED LCD has been the top LED set for several years - best overall picture. I am not aware of many comments about XBR8 banding problems (when displaying BT.709 content) in the official thread.

Lately I have been reading about banding problems that seem to be related to 55"+ TVs (size related?).
post #14 of 17 Old 02-27-2013, 08:49 AM
Chronoptimist - AVS Special Member
Quote:
Originally Posted by 8mile13 View Post

So there are more bits, better gradation, but less accurate color on the GT55 when using the "Wide Color" option.
That's the most likely reason for it, yes.
Quote:
Originally Posted by 8mile13 View Post

There are just a few RGB LCd's (sony XBR8, Sharp Aquos LC-xxXS1E).
Yes, but this is part of the reason why there aren't more of them. The upcoming 4K wide gamut displays from Sony may suffer from the same problem - poor gradation with BT.709 content. Or they might use sufficient precision so that it's not a problem - Sony tend to be good about that.
Quote:
Originally Posted by 8mile13 View Post

Lately i have been reading about banding problems which seems to be related to 55"+ TVs (size related?) .
Well it's more obvious the bigger you go, and more displays are now implementing a full CMS to output "perfect" BT.709 color, without increasing the bit-depth of the display.
post #15 of 17 Old 08-12-2013, 06:09 PM
faunus - Newbie
Hi!
I'm new to this particular forum, but I am hoping that someone here can help me with this.

I have just purchased a new Epson 1080p projector that advertises 10-bit color support; it is clearly mentioned in the specs. Now, I assume that the 10-bit color is carried over HDMI (as it is the only data link available on the projector that has the bandwidth for 10-bit color). The laptop that I use for my photography work is an HP Elitebook 8740W, and yes, it has a 10-bit display (DreamColor). I want to use the projector to display my beautiful native 10-bit photo images, but after reading your discussion here, I am concerned that this will not be possible and that Epson has misled me. There is a BIG difference between native 10-bit images and 8-bit images, and I wish to demonstrate this with my projector. So... my question: will the projector really allow me to display these images, or is Epson misleading me?
post #16 of 17 Old 08-12-2013, 07:36 PM
Chronoptimist - AVS Special Member
The projector probably does support 10-bit color over HDMI, and will likely do all its color processing with at least 10 bits of precision internally.
I don't know that your Quadro graphics card supports outputting 10-bit over HDMI, though; I think 10-bit support on PCs is limited to DisplayPort.
Some AMD cards report "10-bit" via HDMI, but I think that's just the video card LUT rather than actual support for 10-bit content via HDMI.
post #17 of 17 Old 08-12-2013, 07:42 PM
faunus - Newbie
Yes... I will connect via a DisplayPort-to-HDMI adapter from my Quadro graphics card. I wish I could find someone with direct experience of this.