You guys are all over the map on this one. The question the OP asked concerned 10-bit processors versus 16-bit color processing.
The precise answer is that the two specifications describe different but related aspects of the display and should not be compared directly. Since the models mentioned are two different displays within the same product family, it is most likely that both share 10-bit processors (referring to the width of the pipeline used to manipulate the color bits) and both use 16-bit color processing (referring to the internal colorspace depth), while the two differ only in other features and in the marketing literature. With any CPU/GPU design, the pipeline must be wider than the data being processed, or multiple processors must be used to process the wider data through narrower pipelines. In the case being discussed, the 16-bit color data is broken into two 8-bit bytes and run through two separate 10-bit processors, with the extra bits serving as the "carry" bits. We design the chips this way because two 10-bit processor "slices" are much simpler and cheaper than a single, much more complex 18-bit chip design.
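Here is a toy sketch in C of that slicing idea (my own illustration, not any particular display chip's design): a 16-bit add performed as two 8-bit slice operations, with the carry bit passed from the low slice to the high slice.

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative bit-slice arithmetic: a 16-bit add built from two
 * narrower slice operations. The "extra" bit of each slice carries
 * the overflow into the next slice, as described above. */
static uint16_t sliced_add16(uint16_t a, uint16_t b)
{
    uint16_t lo    = (a & 0xFF) + (b & 0xFF);     /* low slice: 8 data bits */
    uint16_t carry = lo >> 8;                     /* carry out of the low slice */
    uint16_t hi    = (a >> 8) + (b >> 8) + carry; /* high slice consumes it */
    return (uint16_t)((hi << 8) | (lo & 0xFF));
}

int main(void)
{
    uint16_t x = 0x1234, y = 0x0FCC;
    printf("0x%04X + 0x%04X = 0x%04X\n", x, y, sliced_add16(x, y));
    /* prints 0x1234 + 0x0FCC = 0x2200, matching a native 16-bit add */
    return 0;
}
```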
Such misunderstandings are not uncommon when attempting to interpret marketing jargon. Indeed, in over 20 years working as an EE on new computer and graphics products, one of the tasks I have come to dread in each new product program is reviewing the marketing literature and trying to keep the marketing professionals honest (at least on paper).
Which is not to say that color depth should not concern you. There are two common color depths used in consumer displays today, 8-bit and 6-bit, referring to the number of bits per shade of the Red/Green/Blue video signals. More commonly we call these "24-bit color" and "18-bit color". These color depths distinguish video displays from graphics displays, a distinction that needs to be understood, especially by HTPC aficionados.
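The arithmetic behind those labels is simple: total colors = 2^(3 x bits per channel). A quick C check:

```c
#include <stdio.h>

/* Total displayable colors implied by the per-channel bit depth:
 * colors = 2^(3 * bits_per_channel). */
int main(void)
{
    for (int bits = 6; bits <= 8; bits += 2) {
        long colors = 1L << (3 * bits);
        printf("%d bits/channel = %d-bit color = %ld colors\n",
               bits, 3 * bits, colors);
    }
    return 0;
}
/* 6 bits/channel = 18-bit color = 262144 colors   (the "256K")
 * 8 bits/channel = 24-bit color = 16777216 colors (the "16.7M") */
```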
Computer graphics displays that must refresh quickly use 6-bit RGB, for a total of 256K colors represented by those 18 bits. The reduced video signal bandwidth makes faster screen updates possible. The total number of colors possible is actually 16.2M: a technique called "dithering" generates the extra shades between the ones in the video signal. When these 6-bit monitor displays or GPU designs are used for video source material, the visible result is often referred to as "color banding" but is more properly called "posterization". The color depth was compressed before input to the GPU, the output was dithered in the display, and one heck of a lot of number crunching was avoided entirely. Such very fast displays are ideal for text, video gaming, and general-purpose flicker-free high-resolution graphics, often at 150Hz or faster refresh. Desktop images are intended to be synthesized in the GPU at extremely high frame rates limited only by processing power. Still, such displays have long been used for simulations and gaming.
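To make "dithering" concrete, here is a minimal ordered-dither sketch in C (the 2x2 Bayer matrix and function names are my own illustration; real panels use larger matrices and often temporal dithering): an 8-bit channel value is shown on a 6-bit panel by rounding up at some pixel positions and down at others, so the eye averages in the missing shade.

```c
#include <stdint.h>
#include <stdio.h>

/* 2x2 ordered-dither thresholds for the two bits the panel cannot show. */
static const uint8_t bayer2x2[2][2] = { {0, 2}, {3, 1} };

static uint8_t dither_8to6(uint8_t value, int x, int y)
{
    uint8_t low = value & 0x03;   /* the two truncated bits, 0..3 */
    uint8_t hi  = value >> 2;     /* plain 6-bit truncation, 0..63 */
    if (low > bayer2x2[y & 1][x & 1] && hi < 63)
        hi++;                     /* round up at this pixel position */
    return hi;                    /* the panel's 6-bit value */
}

int main(void)
{
    /* An 8-bit level of 130 (32.5 in 6-bit terms) alternates between
     * 32 and 33 across the screen; the eye averages the half step. */
    for (int y = 0; y < 2; y++) {
        for (int x = 0; x < 4; x++)
            printf("%2d ", dither_8to6(130, x, y));
        printf("\n");
    }
    return 0;
}
```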
Video displays, including the ones used for HD, use 8-bit RGB (24 bits total) and therefore display 16.7M colors natively, without dithering. More video signal bandwidth is consumed, and therefore the refresh rate must be dropped to cram the extra color information into the data stream. Since film-source video originates at 24 frames per second and video source at 60 fields per second, the tradeoff is a good one: it avoids the posterization artifact while maintaining a refresh rate that is generally considered adequate for video. The result is an excellent approximation of the infinite number of colors in nature using only 16.7M recorded colors.
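A back-of-the-envelope calculation shows the tradeoff (the link budget and resolution here are my own assumptions, and blanking intervals are ignored): for a fixed pixel-data budget, fewer bits per pixel buys a faster refresh.

```c
#include <stdio.h>

/* Illustrative only: maximum refresh rate supported by a hypothetical
 * ~5.6 Gbit/s pixel-data budget at 1920x1080, for two color depths. */
int main(void)
{
    const double link_bps = 5.6e9;
    const double pixels   = 1920.0 * 1080.0;
    int depths[] = {18, 24};
    for (int i = 0; i < 2; i++) {
        double max_hz = link_bps / (pixels * depths[i]);
        printf("%d-bit pixels: up to %.0f Hz refresh\n", depths[i], max_hz);
    }
    return 0;
}
```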
Now, while it is true that the human eye can distinguish more than 16.7M colors during leisurely inspection of static images, in truth 24-bit color is more than sufficient to represent moving images. (In fact, we steal two bits and master NTSC DVDs with 22 bits, and hardly anyone notices. The few who do notice call it the "Chroma Bug", but we did it deliberately. We are NOT stealing two bits on the HD media.) The 32-bit color depth of professional graphics displays simply allows pixel manipulations of the 24-bit video source that avoid loss of color information when rendered down to 24-bit color for distribution on HD media.
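A trivial C demonstration of why that extra working depth matters (the halve-then-double operation is my own stand-in for a real editing chain): math done in 8-bit intermediates destroys levels, while the same math done with wider intermediates round-trips losslessly.

```c
#include <stdint.h>
#include <stdio.h>

/* Darken every 8-bit level by 50%, then brighten it back, using two
 * different intermediate precisions. Narrow intermediates collapse
 * all odd levels (visible as banding); wide ones preserve them. */
int main(void)
{
    int lost_narrow = 0, lost_wide = 0;
    for (int v = 0; v < 256; v++) {
        uint8_t narrow = (uint8_t)((v / 2) * 2);              /* 8-bit math  */
        uint8_t wide   = (uint8_t)((((v << 4) / 2) * 2) >> 4);/* 12-bit math */
        if (narrow != (uint8_t)v) lost_narrow++;
        if (wide   != (uint8_t)v) lost_wide++;
    }
    printf("8-bit intermediates corrupted %d of 256 levels\n", lost_narrow);
    printf("12-bit intermediates corrupted %d of 256 levels\n", lost_wide);
    /* prints 128 and 0 */
    return 0;
}
```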
By contrast, the prior generation of game developers worked with 24-bit color displays and rendered their games at 18-bit color depths for distribution. But technology marches on, and games intended for today's HD displays originate at 24-bit color depths and must be manipulated on a 32-bit graphics system. Likewise, CGI manipulations of real video images intended for that same 24-bit HD distribution media must be manipulated at 32-bit depth, to avoid the somewhat "unreal" appearance that marred recent movies like The Chronicles of Narnia. (They manipulated 24-bit video source on yesterday's 24-bit systems.)
==> There are no plans to extend 32-bit color depth to consumer displays or consumer HD media. Such additional color depth may or may not be a feature of the next generation of consumer products beyond HD-DVD and Blu-Ray, probably 10+ years away, and is therefore of ZERO CONCERN to today's consumers. A few relatively expensive displays are being designed for the HD camcorder crowd, to allow the same manipulation of video at 32-bit color depth as is practiced today, at a lower entry price on modern PCs. The interface specifications for connections like HDMI are also being expanded for the same reasons and the same semi-pro applications. Such displays will have NO ADVANTAGE WHATSOEVER once the manipulated HD camcorder video is rendered for the 24-bit distribution media.
In case you are wondering, manipulation of 32-bit color requires either FOUR of the existing processor slices or very much more expensive chips with wider internal pipes - but silicon is getting cheaper all the time (Moore's Law). The most advanced GPU designs now use entire arrays of smaller processors in parallel, a technique borrowed from yesterday's scientific supercomputers.
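Generalizing the earlier two-slice sketch (again, my own illustration rather than any shipping GPU's design), four 8-bit slices chained by their carry bits handle a full 32-bit operation:

```c
#include <stdint.h>
#include <stdio.h>

/* A 32-bit add built from four 8-bit slices; each slice's carry bit
 * feeds the next slice up the chain. */
static uint32_t sliced_add32(uint32_t a, uint32_t b)
{
    uint32_t result = 0, carry = 0;
    for (int s = 0; s < 4; s++) {
        uint32_t sum = ((a >> (8 * s)) & 0xFF)
                     + ((b >> (8 * s)) & 0xFF) + carry;
        result |= (sum & 0xFF) << (8 * s);
        carry = sum >> 8;
    }
    return result;
}

int main(void)
{
    printf("0x%08X\n", sliced_add32(0x89ABCDEFu, 0x01234567u));
    /* prints 0x8ACF1356, matching a native 32-bit add */
    return 0;
}
```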
Here's a non-technical discussion of color depth as applied to consumer displays: http://compreviews.about.com/od/mult...a/LCDColor.htm
...and for those who have it bad, the gory detail on Chroma Bug: http://www.hometheaterhifi.com/volum...ug-4-2001.html