Samsung LCD: 10-bit vs. 16-bit processing - AVS Forum
post #1 of 28 Old 04-14-2007, 12:28 PM - Thread Starter
Newbie
 
jsizzz's Avatar
 
Join Date: Apr 2007
Location: San Francisco, CA
Posts: 7
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
I'm trying to decide between two Samsung 40" LCD models, the LN-T4061 and the slightly more expensive LN-T4065. There are a few differences, most notably that the 4065 has a higher contrast ratio of 15000:1 vs. the 4061's contrast ratio of 10000:1.

The one I don't really understand is the color processing. In reading the PDF spec sheets from Samsung's website it appears that the 4061 is much better in this area:
- 4061: "... the endless color range of 16-bit processing ..."
- 4065: "10 bit processor"


If someone could explain to me what this means in terms of color depth and PQ I would really appreciate it.
jsizzz is offline  
post #2 of 28 Old 04-15-2007, 06:27 AM
One-Man Content Creator
 
wmcclain's Avatar
 
Join Date: May 2006
Posts: 17,486
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 517 Post(s)
Liked: 365
Quote:
Originally Posted by jsizzz View Post

I'm trying to decide between two Samsung 40" LCD models, the LN-T4061 and the slightly more expensive LN-T4065. There are a few differences, most notably that the 4065 has a higher contrast ratio of 15000:1 vs. the 4061's contrast ratio of 10000:1.

The one I don't really understand is the color processing. In reading the PDF spec sheets from Samsung's website it appears that the 4061 is much better in this area:
- 4061: "... the endless color range of 16-bit processing ..."
- 4065: "10 bit processor"


If someone could explain to me what this means in terms of color depth and PQ I would really appreciate it.

I don't know anything about those specific models, but:

(1) Advertised contrast ratios are usually considered to be bogus. Independent testers find static contrast ratios in the low to mid hundreds.

(2) Almost all current signal sources are limited to 8-bit color: broadcast/cable/satellite, SD-DVD, HD-DVD, Blu-Ray. This is not going to change; maybe some future disc format, still years away, will have deep color.

There are camcorders that have deep color, and maybe some upcoming games and video cards. I'm told the PS3 can do it.

It is possible that extra color bits in the display circuitry would improve 8-bit color signals, but I would have to see it before I could comment.

-Bill
wmcclain is offline  
post #3 of 28 Old 04-15-2007, 06:48 AM
AVS Special Member
 
irkuck's Avatar
 
Join Date: Dec 2001
Location: cyberspace
Posts: 3,519
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 51 Post(s)
Liked: 65
Quote:
Originally Posted by wmcclain View Post

(2) Almost all current signal sources are limited to 8-bit color: broadcast/cable/satellite, SD-DVD, HD-DVD, Blu-Ray. This is not going to change; maybe some future disc format, still years away, will have deep color.
-Bill

With HD discs this is certainly not years away.

Manufacturers are just in the process of solving the chicken-and-egg game: they cannot announce DC (Deep Color) players and discs when there are no panels to view them on. So they are now introducing panels supporting HDMI 1.3 but keeping silent about the discs and players, so as not to spoil the current HD market. Once the new panels are available in shops, they will introduce players and discs. That might be expected sometime in 2008. It is still a very early stage, since the first HDMI 1.3 panels will hit after the summer.

irkuck
irkuck is offline  
post #4 of 28 Old 04-15-2007, 07:45 AM
One-Man Content Creator
 
wmcclain's Avatar
 
Join Date: May 2006
Posts: 17,486
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 517 Post(s)
Liked: 365
Quote:
Originally Posted by irkuck View Post

With HD discs this is certainly not years away.

Manufacturers are just in the process of solving the chicken-and-egg game: they cannot announce DC (Deep Color) players and discs when there are no panels to view them on. So they are now introducing panels supporting HDMI 1.3 but keeping silent about the discs and players, so as not to spoil the current HD market. Once the new panels are available in shops, they will introduce players and discs. That might be expected sometime in 2008. It is still a very early stage, since the first HDMI 1.3 panels will hit after the summer.

I've seen nothing about this. Do you have any reference links?

Deep color stresses both the disc capacity and bandwidth of HD players. And would future deep color discs be usable on current players? If not, it's another new format.

-Bill
wmcclain is offline  
post #5 of 28 Old 04-15-2007, 09:02 AM
AVS Special Member
 
doug_k's Avatar
 
Join Date: Jan 2003
Location: New York City
Posts: 1,415
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Quote:
Originally Posted by wmcclain View Post

It is possible that extra color bits in the display circuitry would improve 8-bit color signals, but I would have to see it before I could comment.
-Bill

These new panels have gone a long way to removing banding. If those 10/16 specs aren't some kind of error, it could be that the pixels are dithered from 8->10 bits in the drive electronics, and the actual video processing (scaling, enhancement, etc) is done at 16 bits.

Just speculating.
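A toy sketch of what that 8->10-bit dithering could look like, purely for illustration -- the 2x2 pattern, the bit depths, and the function name are assumptions, not Samsung's actual drive electronics:

Code:
# Ordered (spatial) dithering: a 10-bit value is shown on an 8-bit panel by
# varying the low bit across a 2x2 block so the average approximates the
# extra precision. Illustrative only; not Samsung's actual circuitry.
BAYER_2X2 = [[0, 2],
             [3, 1]]              # thresholds 0..3 cover the 2 dropped bits

def dither_10bit_to_8bit(value_10bit, x, y):
    """Map a 10-bit code (0..1023) to an 8-bit code for pixel (x, y)."""
    base = value_10bit >> 2       # truncate to 8 bits
    frac = value_10bit & 0b11     # the 2 bits that were dropped (0..3)
    bump = 1 if frac > BAYER_2X2[y % 2][x % 2] else 0
    return min(base + bump, 255)

# A 10-bit value of 514 sits between 8-bit codes 128 and 129.
block = [[dither_10bit_to_8bit(514, x, y) for x in range(2)] for y in range(2)]
print(block)                                # [[129, 128], [128, 129]]
print(sum(sum(row) for row in block) / 4)   # 128.5, i.e. 514 in 10-bit terms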
doug_k is offline  
post #6 of 28 Old 04-15-2007, 09:28 AM
Senior Member
 
Blackraven's Avatar
 
Join Date: Oct 2005
Posts: 281
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
From what I know, 8-bit vs. 10-bit processing is about the maximum number of colors an HDTV set can produce. The Samsung sets with 10-bit processing usually advertise around 10-12 billion colors (of course, the human eye can't come close to distinguishing a billion colors unless you have god-like vision).

Well that's only from what I've heard about it.

Typically, higher bit processing = better.
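
As a rough illustration of where those headline color counts come from (plain arithmetic only; the marketing figures may be counted differently):

Code:
# Representable colors for N bits per R/G/B channel: (2**N) ** 3.
for bits_per_channel in (6, 8, 10, 12):
    colors = (2 ** bits_per_channel) ** 3
    print(f"{bits_per_channel} bits/channel ({3 * bits_per_channel}-bit color): {colors:,}")

# 6 bits/channel  (18-bit color):        262,144
# 8 bits/channel  (24-bit color):     16,777,216   (~16.7 million)
# 10 bits/channel (30-bit color):  1,073,741,824   (~1.07 billion)
# 12 bits/channel (36-bit color): 68,719,476,736   (~68.7 billion)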

To the common layman, how do you explain this "bit processing"?
Blackraven is offline  
post #7 of 28 Old 04-15-2007, 12:09 PM
Member
 
Idjiit's Avatar
 
Join Date: Oct 2006
Posts: 191
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Quote:
Originally Posted by irkuck View Post

Once the new panels are available in shops, they will introduce players and discs.

Doesn't the Toshiba HD-XA2 already support 1.3/Deep Color?
Idjiit is offline  
post #8 of 28 Old 04-15-2007, 12:17 PM
One-Man Content Creator
 
wmcclain's Avatar
 
Join Date: May 2006
Posts: 17,486
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 517 Post(s)
Liked: 365
Quote:
Originally Posted by Idjiit View Post

Doesn't the Toshiba HD-XA2 already support 1.3/Deep Color?

Neither HD-DVD nor Blu-Ray support Deep Color.

-Bill
wmcclain is offline  
post #9 of 28 Old 04-15-2007, 12:33 PM
Member
 
Idjiit's Avatar
 
Join Date: Oct 2006
Posts: 191
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Quote:
Originally Posted by wmcclain View Post

Neither HD-DVD nor Blu-Ray support Deep Color.

So what is this supposed to mean?

Quote:
from http://www.tacp.toshiba.com/dvd/prod...p?model=hd-xa2
The HD-XA2 is our flagship HD DVD player and incorporates our latest technologies. With a chassis reinforced by solid brushed aluminum panels, the HD-XA2 uses advanced digital and analog video processing technology and includes 1080p with Deep Color support specified in HDMI 1.3a.

Is the connection support there, but the media support not there?
Idjiit is offline  
post #10 of 28 Old 04-15-2007, 12:47 PM
One-Man Content Creator
 
wmcclain's Avatar
 
Join Date: May 2006
Posts: 17,486
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 517 Post(s)
Liked: 365
Quote:
Originally Posted by Idjiit View Post

So what is this supposed to mean?



Is the connection support there, but the media support not there?

HDMI 1.3 supports Deep Color, and the XA2 has HDMI 1.3. But HDMI 1.3 has lots of features that not all devices will provide. The marketeer who wrote that passage is being deceptive.

I don't believe the existing HDTV standards provide for Deep Color either. The standards will presumably be extended and published in the future.

-Bill
wmcclain is offline  
post #11 of 28 Old 04-15-2007, 01:35 PM
AVS Addicted Member
 
bfdtv's Avatar
 
Join Date: Nov 2002
Posts: 13,484
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Quote:
Originally Posted by irkuck View Post

With HD discs this is certainly not years away.

Manufacturers are just in the process of solving the chicken-and-egg game: they cannot announce DC (Deep Color) players and discs when there are no panels to view them on.

I wish this were so, but it is not.

The Blu-ray specification for movies (BD-ROM) does not support Deep Color or the new xvYCC color space. In fact, both Blu-ray and HD-DVD movie formats are limited to 8-bit 4:2:0 YCbCr. Nor does any formal proposal exist to add expanded color capability to the BD-ROM or HD-DVD ROM specifications.

The Blu-ray insiders have told us that there are no plans to support xvYCC anytime soon, if ever. Blu-ray and HD-DVD are still using 8-bit 4:2:0. Today, Hollywood films are telecined to digital, with the masters stored on D5 tape as 10-bit 4:2:2. That's a nice jump from the 8-bit 4:2:0 we have now, but it's not Deep Color or xvYCC. At the moment, there exists no equipment to master and store films with the xvYCC color space, so any serious talk about that format is just putting the cart before the horse.

The next step for Blu-ray and HD-DVD is 10-bit 4:2:0 or 10-bit 4:2:2, both of which are supported by the existing Hi10P and Hi422P profiles in AVC, but neither of which is currently supported by VC-1. With those AVC profiles, studios could encode their movies with the full color resolution of the original master.

Quote:
Originally Posted by jsizzz View Post

If someone could explain to me what this means in terms of color depth and PQ I would really appreciate it.

The claims of 12-bit, 14-bit, 18-bit, etc. have very little to do with the number of colors you see on the screen. Aside from Sony's AVCHD camcorder format, all sources today -- cable, satellite, DVD, Blu-ray, HD-DVD, Xbox 360, and PS3 -- are 8 bits per color component, i.e. 24-bit RGB. If you display the original source directly on the screen, without any degradation, you'll never have more than 8 bits per component on the screen. Some extra precision is obviously required to eliminate rounding errors in processing, but 10-bit 4:4:4 through the entire data path is more than sufficient to do that.

Unless you are going to apply some artificial color expansion** -- such that the colors on the screen no longer correspond to the colors in the original source content -- there is no practical reason to have processing with precision greater than 10-bit 4:4:4. Few LCD panels can even display 10-bit color, regardless of their processing; most displays use 8-bit panels. The only reason you even need 10-bit is to eliminate rounding errors. But 12-bit, 14-bit, and 18-bit are completely unnecessary to eliminate rounding errors with 8-bit sources.
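
A toy model of the rounding-error point: push every 8-bit code through a made-up gamma adjustment and its inverse, quantizing the intermediate result, and count how many codes fail to come back exactly. The gamma value and the two-step chain are assumptions for illustration, not any particular chip's pipeline:

Code:
GAMMA = 1.2   # an arbitrary, mild adjustment chosen for the example

def damaged_codes(intermediate_bits):
    scale = (1 << intermediate_bits) - 1
    bad = 0
    for code in range(256):                       # every 8-bit input level
        x = code / 255.0
        y = round(x ** GAMMA * scale) / scale     # step 1, quantized
        z = round(y ** (1.0 / GAMMA) * 255.0)     # step 2, back to 8-bit
        if z != code:
            bad += 1
    return bad

print("codes damaged,  8-bit intermediate:", damaged_codes(8))
print("codes damaged, 10-bit intermediate:", damaged_codes(10))
# Expect noticeably more damage with the 8-bit intermediate; the extra bits
# are there to absorb quantization error, not to add colors.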

Bitness alone is no indication of performance. In fact, to me, it suggests the opposite. The larger the width of the datapath, the more costly and complex it is to design silicon capable of quality scaling and deinterlace. If you do all your processing at 18 bits per pixel, then chances are good you aren't doing much, because doing everything at that precision would introduce significant cost. Take the example of per-pixel, motion-adaptive deinterlace for high-definition. To do this with a high degree of quality, the display processor must analyze 4-5 different 1920x540 fields simultaneously. It must compare adjacent fields to determine which pixels have moved in the past (and next) 1/60 of a second, and which pixels have not. Once it determines what pixels are in motion, it must look at the 4-5 fields to determine the properties of that motion so it knows how to best interpolate the difference, and where it should apply certain filters.

The bandwidth required to compare all those pixels at 18-bit precision is substantially higher than that required to do the same thing at 10-bit precision. Maintaining such high levels of precision throughout the entire pipeline makes it more complex and costly to implement motion-adaptive deinterlace and inverse telecine, which determines the source resolution output to your screen. Hence, in many cases, higher "bitness" equates to lower source resolution on your screen.
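
Back-of-the-envelope numbers for that comparison, assuming the 5-field, 1920x540 figure above with three components per pixel at 60 fields per second (real chips differ in sampling, caching, and compression):

Code:
def field_compare_bandwidth(bits_per_component, fields=5, width=1920, height=540):
    """Gbit/s just to read the fields being compared, 60 times per second."""
    bits_per_field = width * height * 3 * bits_per_component
    return fields * bits_per_field * 60 / 1e9

for bits in (10, 18):
    print(f"{bits}-bit components: {field_compare_bandwidth(bits):.1f} Gbit/s")
# 10-bit components: 9.3 Gbit/s
# 18-bit components: 16.8 Gbit/s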

** Many LCDs and plasmas do offer the option for artificial color and black level expansion. This feature has various names, but a few among them include "Dynamic Contrast" and "DNIE." Most members recommend that you disable these features for optimum picture quality. However, if you do like these artificial expansion modes, then higher color precision does have some practical use.
bfdtv is offline  
post #12 of 28 Old 04-15-2007, 02:09 PM
AVS Addicted Member
 
bfdtv's Avatar
 
Join Date: Nov 2002
Posts: 13,484
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Quote:
Originally Posted by Idjiit View Post

So what is this supposed to mean?

Is the connection support there, but the media support not there?

As with most other source devices, the "Deep Color" claim on the Toshiba HD-XA2 is bogus.

The NEC decoder in the Toshiba HD-XA2 decodes and processes signals at a maximum of 10-bit 4:4:4. The same is also true of the highly-regarded Silicon Optix ReonVX video processor in the Toshiba HD-XA2. The only part on the Toshiba HD-XA2 that can actually handle "Deep Color" output is the Silicon Image HDMI 1.3 transmitter. But such content cannot actually make it through to the transmitter for output.

The PS3 uses the same HDMI 1.3 transmitter as the Toshiba HD-XA2, although the current software on the PS3 is limited to HDMI 1.1 functionality. I'm not intimately familiar with the RSX or the Cell, so I don't know what the technical limitations of that console are. As far as I know, no Nvidia graphics processor supports more than 10-bit per-pixel precision throughout the entire pipeline (incl. frame buffer).
bfdtv is offline  
post #13 of 28 Old 04-15-2007, 02:10 PM
Member
 
Idjiit's Avatar
 
Join Date: Oct 2006
Posts: 191
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Thanks for the detailed explanation, bfdtv - very helpful!
Idjiit is offline  
post #14 of 28 Old 04-15-2007, 02:18 PM - Thread Starter
Newbie
 
jsizzz's Avatar
 
Join Date: Apr 2007
Location: San Francisco, CA
Posts: 7
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
@ bfdtv: thanks for the detailed response, that's some good info.
jsizzz is offline  
post #15 of 28 Old 04-15-2007, 03:48 PM
AVS Special Member
 
swifty7's Avatar
 
Join Date: Aug 2005
Posts: 1,662
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
I second that! great info...THANKS!!!
swifty7 is offline  
post #16 of 28 Old 04-15-2007, 05:58 PM
AVS Special Member
 
Richard Paul's Avatar
 
Join Date: Sep 2004
Posts: 6,959
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 4 Post(s)
Liked: 29
Quote:
Originally Posted by bfdtv View Post

The PS3 uses the same HDMI 1.3 transmitter in the Toshiba HD-XA2, although the current software on the PS3 is limited to HDMI 1.1 functionality. I'm not intimately familiar with the RSX or the Cell, so I don't know what the technical limitations are of that console. As far as I know, no Nvidia graphics processor supports more than 10-bit per pixel precision throughout the entire pipeline (incl. frame buffer).

Just to point this out, but most modern graphics cards are capable of up to 128-bit-per-pixel rendering; you can see this in the NVIDIA 7 series of GPUs. In fact, high-bit RGB rendering is one of the ways HDR can be done, and here is one game engine that makes use of it. Personally I don't know whether the RSX output is limited to 30-bit or 36-bit RGB (both of which are supported by Deep Color), but even 30-bit RGB would be a good improvement over 24-bit RGB on a Deep Color display. Also, some of the HDMI companies have actually shown Deep Color off using a PS3 at a demonstration at CES.
Richard Paul is offline  
post #17 of 28 Old 04-15-2007, 07:29 PM
AVS Addicted Member
 
bfdtv's Avatar
 
Join Date: Nov 2002
Posts: 13,484
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Quote:
Originally Posted by Richard Paul View Post

Just to point this out, but most modern graphics cards are capable of up to 128-bit-per-pixel rendering; you can see this in the NVIDIA 7 series of GPUs. In fact, high-bit RGB rendering is one of the ways HDR can be done, and here is one game engine that makes use of it. Personally I don't know whether the RSX output is limited to 30-bit or 36-bit RGB (both of which are supported by Deep Color), but even 30-bit RGB would be a good improvement over 24-bit RGB on a Deep Color display.

Just make sure you don't confuse the internal precision of the vertex shaders, pixel shaders, etc. with the precision of the frame buffer, which is what ultimately limits what you see on your screen. I don't know the specifics of the RSX in the PS3, so I don't know what it can and cannot do.
bfdtv is offline  
post #18 of 28 Old 04-15-2007, 09:33 PM
AVS Special Member
 
doug_k's Avatar
 
Join Date: Jan 2003
Location: New York City
Posts: 1,415
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
I think the reason for extra bit depth is the interaction of the different *picture* processing algorithms being applied, and how they may reveal limitations in the source material when shown on the screen.

Gamma/brightness/contrast maps may truncate the source material or leave available luminance extremes unused. So right off the bat you've lost some of your 1024 luminance levels at 10 bits; say you're down to 800. Those nonlinear ramps have also stretched out the resolution in some spots, and you could start to see false contouring there. Or you could see crushed blacks/whites/colors, or limited ranges that don't take advantage of what the screen has to offer (should you choose to disregard ISF).
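
To put rough numbers on that, here is a toy count of how many input levels survive a plain power-law ramp as distinct outputs (illustrative only; real sets use their own gamma/contrast maps):

Code:
def surviving_levels(gamma, in_bits=10, out_bits=10):
    in_max = (1 << in_bits) - 1
    out_max = (1 << out_bits) - 1
    outputs = {round((i / in_max) ** gamma * out_max) for i in range(in_max + 1)}
    return len(outputs)

print(surviving_levels(2.2))   # well under 1024: the dark end piles up
print(surviving_levels(1.0))   # identity map: all 1024 levels survive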

Now, the old 256-levels-is-enough adage wasn't 'proved' on these super-high contrast displays. Some of the plasma panels go way beyond film/ISF ranges. When you stretch all this stuff out again, what are you left with? Where are the weak points? And what's displayed on those weak points, and how could they be made worse by further processing?

What about the fact that the human eye can distinguish finer variations in some colors (esp green) than others?

(just a subset of the big picture)
doug_k is offline  
post #19 of 28 Old 04-19-2007, 04:49 PM - Thread Starter
Newbie
 
jsizzz's Avatar
 
Join Date: Apr 2007
Location: San Francisco, CA
Posts: 7
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
So I just found this info on a c|net forum. It was posted by a member who reportedly works for Samsung but is not himself a tech/HD guy. Thought it was worth sharing.



-------------------------------------
10 bit or 16 bit processing...
by Mr_Samsung - 4/11/07 9:45 AM
In reply to: Mr. Samsung by gaslaugh

gaslaugh -

Thanks for asking me this question. No one ever talks about this so I never really learned that much about it. I just sat down with one of our HDTV gurus (you would love this guy - he's a fountain of knowledge) and he broke it all down for me.

First off, the 4061 and the 4065 both have the same processing capability. The difference in the specs is based on what part of the processing cycle is measured. For the 4061, we provided the measurement of the processor. On the 4065, we provided the measurement of the panel. We should have been uniform in our listing so it didn't confuse the customer. The TV team tips their hat to you for noticing. Few people have brought this up. However -both TVs (4061 & 4065) have the same internal systems in regards to processing.

Now - let me explain what I learned about processing and HDTVs. There are four parts to the puzzle. SOURCE, INPUT, PROCESSING and OUTPUT. Each piece has different levels of processing capability. Sources such as DVD have 8 bit. Cable is 10-12 bit and Blu-ray has the potential to be 16 bit. How the source is inputted into the TV also has levels of processing. HDMI 2A (on the 4061) is 12 bit. I believe HDMI CEC is 16 bit but I have to check on that. Then there is the capability of the actual processor - which in both the 4061 and 4065 is 16 bit. And at the end of the puzzle is the Output or Panel. I believe most LCD panels are 10 bit.

What does all this mean? Well - it seems like the processing number is one of those specs that sounds a lot more important that it currently is. I'm sure it does impact the picture but not as much as you might think. And since the 4 pieces to this puzzle all have a variety of processing capability, it adds to the confusion. From my quick conversation with one of our gurus, it seems like HDTV sources need to improve before we can truly feel the power of a 16 bit processing experience.

Does this help? Let me know.

Mr. Samsung
jsizzz is offline  
post #20 of 28 Old 04-19-2007, 05:27 PM
One-Man Content Creator
 
wmcclain's Avatar
 
Join Date: May 2006
Posts: 17,486
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 517 Post(s)
Liked: 365
Quote:
Originally Posted by jsizzz View Post

Cable is 10-12 bit and Blu-ray has the potential to be 16 bit.

I've never heard that before, and at the moment I don't believe it.

-Bill
wmcclain is offline  
post #21 of 28 Old 04-19-2007, 07:49 PM
AVS Addicted Member
 
bfdtv's Avatar
 
Join Date: Nov 2002
Posts: 13,484
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
As you suggest, that poster is not a "tech guy." He was misled or is confused.
bfdtv is offline  
post #22 of 28 Old 04-22-2007, 02:18 PM
AVS Special Member
 
Gary McCoy's Avatar
 
Join Date: Jul 1999
Location: San Jose, California, USA
Posts: 6,261
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 18 Post(s)
Liked: 43
You guys are all over the map on this one. The question the OP asked concerned 10-bit processor versus 16-bit color processing.

The precise answer is that the two specifications concern different but related aspects of the display and should not be directly compared. Since the models mentioned are two different displays within the same product family, it is most likely the case that both share 10-bit processors (referring to the width of the CPU pipeline used to manipulate the color bits) and both displays also utilize 16-bit color processing (referring to the internal colorspace depth), while the two differ only in other features and in the marketing literature. With any CPU/GPU design, the pipeline must be wider than the data being processed, or multiple processors must be used to process the wider data using narrower pipelines. In the case being discussed, 16-bit color data is broken into two 8-bit bytes, then run through two separate 10-bit processors, and the extra bits are called the "carry" bits. We design the chips this way because two 10-bit processor "slices" are much simpler/cheaper than a single, much more complex 18-bit chip design.

Such misunderstandings are not uncommon when attempting to interpret the marketing jargon. Indeed, after working as an EE on new computer and graphic products for over 20 years, one of the tasks I dread in each new product program is reviewing the marketing literature and trying to keep the marketing professionals honest (at least on paper).

Which is not to say that color depth should not concern you. There are two common color depths used in consumer displays today, 8-bit and 6-bit, referring to the shades of the Red/Green/Blue video signals. More commonly we call these "24-bit color" and "18-bit color". These color depths distinguish video displays from graphics displays, which is something that needs to be understood, especially by HTPC aficionados.

Computer graphics displays that must refresh quickly use 6-bit RGB, for a total of 256K colors represented by those 18 bits. The reduced video signal bandwidth makes possible faster screen updates. The total number of colors possible is actually 16.2M; a technique called "dithering" generates the extra shades between the ones in the video signal. When these 6-bit monitor displays or GPU designs are used for video source material, the visible result is often referred to as "color banding" but is more properly called "posterization". The color depth actually got compressed before input to the GPU, the output got dithered in the display, and one heck of a lot of number crunching was avoided entirely. Such very fast displays are ideal for text, video gaming, and general-purpose flicker-free high resolution graphics, often at 150Hz or faster refresh. Desktop images are intended to be synthesized in the GPU at extremely high frame rates limited only by processing power. Still, such displays have long been used for simulations and gaming.

Video displays, including the ones used for HD, use 8-bit RGB (24 bits total) and therefore display 16.7M colors natively without dithering. More video signal bandwidth is consumed, and therefore the refresh rate must be dropped to cram the extra color information into the data stream. Since film-source video originates as 24 frames per second and video source as 60 fields per second, the tradeoff is a good one, avoiding the posterization artifact while maintaining a refresh rate that is generally considered adequate for video. The result is an excellent approximation of the infinite number of colors in nature as only 16.7M recorded colors.

Now, while it is true that the human eye can distinguish more than 16.7M colors during leisurely inspection of static images, in truth 24-bit color is more than sufficient to represent moving images. (In fact we steal two bits and master NTSC DVDs with 22 bits and hardly anyone notices.)(Those few that do notice call it a "Chroma Bug" but we did it deliberately.)(We are NOT stealing two bits on the HD media.) The 32-bit color depth of professional graphics displays simply allows pixel manipulations of the 24-bit video source that avoid loss of color information when rendered down to 24-bit color for distribution on HD media.

By contrast the prior generation of game developers worked with 24-bit color displays and rendered their games at 18-bit color depths for distribution - but technology marches on, and games intended for today's HD displays originate at 24 bit color depths and must be manipulated on a 32-bit graphics system. Likewise CGI manipulations of real video images intended for that same 24-bit HD distribution media must be manipulated at 32 bit depth, to avoid that somewhat "unreal" appearance that marred recent movies like The Chronicles of Narnia. (They manipulated 24-bit video source on yesterday's 24-bit systems.)

==> There are no plans to extend 32-bit color depth to consumer displays or consumer HD media. Such additional color depth may or may not be a feature of the next generation of consumer products beyond HD-DVD and Blu-Ray, probably 10+ years away, and therefore of ZERO CONCERN to today's consumers. Some few relatively expensive displays are being designed for the HD camcorder crowd to allow the same manipulation of video at 32-bit color depth as is practiced today, at lower entry pricing with modern PCs. The interface specifications for connections like HDMI are also being expanded for the same reasons, and for the same semi-pro applications. Such displays will have NO ADVANTAGE WHATSOEVER after the manipulated HD camcorder video is rendered for the 24-bit distribution media.

In case you are wondering, manipulation of 32-bit color requires implementation of FOUR existing processor slices or very much more expensive chips with wider internal pipes - but silicon is getting cheaper all the time (Moore's Law). The most advanced GPU designs now use entire arrays of smaller processors in parallel, a technique borrowed from yesterday's scientific supercomputers.

Here's a non-technical discussion of color depth as applied to consumer displays: http://compreviews.about.com/od/mult...a/LCDColor.htm

...and for those who have it bad, the gory detail on Chroma Bug: http://www.hometheaterhifi.com/volum...ug-4-2001.html

Gary

Gary McCoy
The United States Constitution ©1791. All Rights Reserved.

Gary McCoy is offline  
post #23 of 28 Old 04-27-2007, 01:16 PM
AVS Special Member
 
Richard Paul's Avatar
 
Join Date: Sep 2004
Posts: 6,959
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 4 Post(s)
Liked: 29
Quote:
Originally Posted by Gary McCoy View Post

Which is not to say that color depth should not concern you. There are two common color depths used in consumer displays today, which are 8-bit and 6-bit, referring to the shades of the Red/Green/Blue video signals. More commonly we call these "24-bit color" and "18-bit color".

I think you may be confused about this, since I personally know of no modern LCDs that use 6-bit processing; in fact, most modern LCDs use at least 10-bit processing.


Quote:
Originally Posted by Gary McCoy View Post

Video displays including the ones used for HD use 8-bit RGB (total 24 bits) and therefore display 16.7M colors natively without dithering.

It is 16.7 million color combinations, but only if you use full-range RGB. All the HDTV formats, though, are actually based on video-range RGB, which only allows for about 10.5 million color combinations.
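
The arithmetic behind those two figures, assuming the usual 16-235 range per 8-bit channel:

Code:
full_range  = 256 ** 3              # 16,777,216 (~16.7 million)
video_range = (235 - 16 + 1) ** 3   # 220 levels/channel -> 10,648,000
print(f"{full_range:,} vs. {video_range:,}")
# Counting 219 steps instead of 220 levels gives ~10.5 million, the figure
# quoted above; either way it is well short of 16.7 million.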


Quote:
Originally Posted by Gary McCoy View Post

Now, while it is true that the human eye can distinguish more than 16.7M colors during leisurely inspection of static images, in truth 24-bit color is more than sufficient to represent moving images.

Instead of saying that it is sufficient, I personally would say that it is more like "good enough," just as 480i60 was good enough for 40+ years. That doesn't mean it was ideal, and for low-light scenes 24-bit RGB is actually rather bad.


Quote:
Originally Posted by Gary McCoy View Post

(In fact we steal two bits and master NTSC DVDs with 22 bits and hardly anyone notices.)(Those few that do notice call it a "Chroma Bug" but we did it deliberately.)(We are NOT stealing two bits on the HD media.)

Actually, the use of 4:2:0 YCbCr encoding steals far more than 2 bits: it reduces the average bits per pixel from 24 to 12. That reduces not only the number of effective color combinations but also greatly reduces the effective color resolution of the image. Also, 4:2:0 YCbCr continues to be used for encoding HDTV video.
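
The 24-to-12 figure falls straight out of the subsampling arithmetic (standard 4:2:0 layout assumed):

Code:
# 4:2:0 stores one 8-bit Y sample per pixel, but one Cb and one Cr sample
# are shared by each 2x2 block of pixels.
luma_bits   = 8              # per pixel
chroma_bits = 2 * 8 / 4      # two 8-bit chroma samples spread over 4 pixels
print(luma_bits + chroma_bits)   # 12.0 bits per pixel, vs. 24 for 8-bit 4:4:4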


Quote:
Originally Posted by Gary McCoy View Post

The 32-bit color depth of professional graphics displays simply allows pixel manipulations of the 24-bit video source that avoids loss of color information when rendered down to 24-bit color for distribution on HD media.

Actually, your average home computer, and game consoles since the Dreamcast, run using at least 32-bit RGBA; the extra 8 bits are used for the alpha channel, which allows for translucency effects. Just to make this clear, the alpha channel is only used for processing.


Quote:
Originally Posted by Gary McCoy View Post

By contrast the prior generation of game developers worked with 24-bit color displays and rendered their games at 18-bit color depths for distribution - but technology marches on, and games intended for today's HD displays originate at 24 bit color depths and must be manipulated on a 32-bit graphics system.

Just to point this out, but it was actually 16-bit RGB (high color) that used to be used in computer games. Today most games run at 32-bit RGBA, but DirectX 9 video cards are actually capable of running at up to 128-bit linear RGBA, which allows for graphical effects like HDR. One rather impressive game engine that shows off the advantages of 64-bit linear RGBA is the Unreal Engine 3.
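
For reference, 16-bit high color was typically packed as 5-6-5 bits per channel (green getting the extra bit); a minimal sketch:

Code:
def pack_rgb565(r8, g8, b8):
    """Pack 8-bit-per-channel RGB into one 16-bit RGB565 word."""
    return ((r8 >> 3) << 11) | ((g8 >> 2) << 5) | (b8 >> 3)

print(hex(pack_rgb565(255, 255, 255)))   # 0xffff
print((2 ** 5) * (2 ** 6) * (2 ** 5))    # 65,536 possible colors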


Quote:
Originally Posted by Gary McCoy View Post

Likewise CGI manipulations of real video images intended for that same 24-bit HD distribution media must be manipulated at 32 bit depth, to avoid that somewhat "unreal" appearance that marred recent movies like The Chronicles of Narnia. (They manipulated 24-bit video source on yesterday's 24-bit systems.)

Many of the all-CGI studios like Pixar make their movies at 128-bit linear RGB. I believe the same is true for many of the professional companies that create CGI for live-action movies.


Quote:
Originally Posted by Gary McCoy View Post

==> There are no plans to extend 32-bit color depth to consumer displays or consumer HD media.

True, though it is actually 30-bit or 36-bit RGB that will be possible with certain consumer displays this year. As for current consumer HD media, it won't change anytime soon from 8-bit 4:2:0 YCbCr, but game consoles and computers are a whole different ballgame. For instance, HDMI companies have actually shown off Deep Color using a PS3 at a CES demonstration, with a 24-bit RGB display and a 30-bit RGB display.


Quote:
Originally Posted by Gary McCoy View Post

Here's a non-technical discussion of color depth as applied to consumer displays: http://compreviews.about.com/od/mult...a/LCDColor.htm

That article is so old, in terms of the computer world, that it almost belongs in a museum. I don't even know if the 6-bit-processing LCD computer monitors he mentioned in that article are even produced any more. All of the LCDs I know of in the CE world, though, use at least 8-bit processing, and in fact some of the new ones actually use up to 14-bit processing (such as the new Toshiba LCDs).
Richard Paul is offline  
post #24 of 28 Old 08-31-2007, 03:34 PM
AVS Special Member
 
BIG ED's Avatar
 
Join Date: Jan 2003
Location: California Wine Country
Posts: 3,290
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 12
Good thread for what I was looking for.
Some GREAT stuff, some not so great stuff.
Anyway...

What I wanted to know was about 10 bit panels vs. 12 bit (or above) panels.
Is it the panel itself? Or when you see 12 or 16 used is that the processor?
Are all panels 8 or 10 bit today?

BIG thanks.

"I wonder if any of the releases had slipcovers though."
"Are these comfirmed to have slipcovers?"
"They look nice in those slips."
"This slipcover looks too good to pass up."
BIG ED is offline  
post #25 of 28 Old 09-16-2007, 09:56 AM
Senior Member
 
El Espectro's Avatar
 
Join Date: Aug 2007
Posts: 217
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
I've been dealing with this somewhat in the owners' thread for the set, but this thread looks like it could have someone with a very precise answer reading it. I am getting some posterization on my new Samsung lnt-4061f LCD with my PS3. Unfortunately I have no other source for testing except for an old DVD player which will be prone to artifacts anyway.

So my question is this: Can some posterization be expected in modern LCD HDTV displays with the PS3, or is there something up with my TV? Is plasma a solution?

Thanks for the great thread so far.
El Espectro is offline  
post #26 of 28 Old 04-17-2008, 08:58 PM
Member
 
sunwaterpool's Avatar
 
Join Date: Sep 2006
Posts: 185
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
What if I bought a cheaper-brand LCD, but ran everything through a separate Realta HQV processor? Would I get a similar picture to buying a TV that includes the processor?

What are some of the highest bit lcd sets that are out for 2008?
sunwaterpool is offline  
post #27 of 28 Old 06-28-2008, 04:11 PM
AVS Special Member
 
skibum5000's Avatar
 
Join Date: Jun 2006
Location: Boston, MA
Posts: 3,587
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 6 Post(s)
Liked: 26
Quote:
Originally Posted by wmcclain View Post

I don't know anything about those specific models, but:

(1) Advertised contrast ratios are usually considered to be bogus. Independent testers find static contrast ratios in the low to mid hundreds.

(2) Almost all current signal sources are limited to 8-bit color: broadcast/cable/satelite, SD-DVD, HD-DVD, Blu-Ray. This is not going to change; maybe some future disc format, still years away, will have deep color.

There are camcoders that have deep color, maybe some upcoming games and video cards. I'm told the PS3 can do it.

It is possible that extra color bits in the display circuitry would improve 8-bit color signals, but I would have to see it before I could comment.

-Bill

While the listed ratios are garbage, they surely are better than the low to mid hundreds! Maybe 5-10 years ago. Top LCDs are easily in the 1500-2000:1 range these days.

It doesn't matter that most sources are only 8 bits and the display itself can only show 8 bits; you DO NOT want 8-bit processing, because then any calibration and post-processing will create banding. You certainly want at least 10 bits for processing.
skibum5000 is offline  
post #28 of 28 Old 06-28-2008, 04:59 PM
AVS Special Member
 
8IronBob's Avatar
 
Join Date: Nov 2005
Location: Suburbs of Cleveland, Ohio
Posts: 2,735
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
IIRC, the A750 actually has both a 10-bit panel AND 16-bit processing, so in that regard you pretty much don't have to choose one over the other. Wonder how futureproof that will wind up being.

AVS Forum...The Final Frontier. My continuing mission. To explore strange, new equipment, to seek out new movies and technology. Boldly going where no enthusiast had gone before.
8IronBob is offline  