Originally Posted by discopaul
Well, as msmith198025 stated, your analysis is really apples and oranges. The original discussion is about whether there is a benefit to using higher (18-bit) processing for color reproduction.
It's not apples and oranges. Once again, current content sources are 8 bits per color channel. If you display the original source directly on the screen, without any degradation, you'll never have more than 8 bits per channel on the screen. In fact, you'll have far, far less, because 8 bits per channel represents the entire spectrum of colors available to content, and no single frame outside of a test pattern comes close to using them all.
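As a quick sanity check on the "never in a single frame" point, the arithmetic is simple (this is just back-of-envelope math, nothing display-specific):

```python
# A single 1080p frame cannot even contain every 8-bit-per-channel
# color, because it has fewer pixels than there are possible colors.
pixels = 1920 * 1080    # 2,073,600 pixels in a 1080p frame
colors = 2 ** 24        # 16,777,216 possible 8-bit RGB colors
print(pixels, colors, pixels < colors)
```

So even a frame in which every single pixel is a unique color would use only about an eighth of the 8-bit palette.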
Unless you are going to apply some artificial color expansion -- such that the colors on the screen no longer correspond to the colors in the original source content -- there is no practical reason for processing with precision greater than 10-bit 4:4:4. Few LCD panels can even display 10-bit color, regardless of their processing; most displays use 8-bit panels. The only reason you need 10-bit at all is to eliminate rounding errors in intermediate calculations. But 12-bit, 14-bit, and 18-bit processing is completely unnecessary for eliminating rounding errors with 8-bit sources.
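To make the rounding-error point concrete, here's a toy sketch in Python (my own illustration, not how any particular display's pipeline is built): an 8-bit value pushed through a halving gain stage and then a doubling gain stage is damaged when every stage rounds back to 8 bits, but survives intact when the intermediate math carries two extra bits.

```python
# Hypothetical sketch: why ~10-bit intermediates are enough to avoid
# rounding errors when the source is 8-bit.
# Pipeline: apply a 0.5x gain stage, then a 2.0x gain stage.

def gain_chain_8bit(v):
    # 8-bit internal precision: each stage rounds back to an integer.
    half = round(v * 0.5)     # rounding error introduced here for odd v
    return round(half * 2.0)  # the error is doubled, not recovered

def gain_chain_10bit(v):
    # 10-bit internal precision: promote by 2 fraction bits first,
    # round only once at the very end.
    v10 = v * 4               # 8-bit value -> 10-bit scale
    half = round(v10 * 0.5)   # exact: the extra bits absorb the halving
    out10 = half * 2.0
    return round(out10 / 4)   # single rounding back to 8-bit

errors_8 = sum(gain_chain_8bit(v) != v for v in range(256))
errors_10 = sum(gain_chain_10bit(v) != v for v in range(256))
print(errors_8, errors_10)  # every odd code value is damaged at 8-bit
```

The 8-bit chain mangles all 128 odd code values; the 10-bit chain returns every value unchanged. That's the entire argument for 10-bit processing -- and also why piling on more bits beyond that buys nothing for an 8-bit source.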
As a sidenote to the deinterlacing topic mentioned: I'm not sure how much emphasis should be placed on this. I'm not saying it's not important, but I've read and linked Ultimate AV's test of the Elite 940, and it only tested fair in this area. How many of us would conclude from that that its picture must suck and we should buy a Vizio because it passed the deinterlacing test?
Deinterlacing determines how much resolution your display can actually resolve and show from interlaced sources. Resolution is only one of many attributes that determine the quality you perceive on the screen. If resolution were all that mattered, 1080p displays would always look better than 720p displays, and 720p displays would always look better than 480p displays. But that is not the case.
Contrast, color-decoder accuracy, and display uniformity are other attributes that impact the quality you perceive on the screen. Resolution is still important, though, and as stated before, maintaining such high levels of precision throughout the pipeline makes it more complex and costly to implement motion-adaptive deinterlacing and inverse telecine, which determine the effective source resolution output to your screen.