It's tough to say there will be NO advantage from "deep color", but it will certainly not result in the dramatic improvement some people seem to expect.
First of all, there's no mass market deep color CONTENT out there. Consider:
* There will NEVER be deep color content on ANY future HD-DVD or Blu-ray discs. EVER. The disc formats don't support it.
* There will never be deep color content on standard DVD discs. Same reason.
* Deep color content on disc will require adoption of an entirely new disc format that is not even on the horizon yet. Think many years. Think how long it took HD-DVD and Blu-ray to get launched.
* Adding deep color content to broadcast TV, regardless of delivery method (off air, cable, or satellite) will require the adoption of a new HDTV standard. The current one doesn't support it. Think of how long it is taking to get HDTV into local stations right now. Again, we are talking many years. Even an outfit that controls both transmission and receivers, such as DirecTV, is going to be hard pressed to do anything because there'll be nobody producing "deep color" content for them to transmit (since OTA HDTV doesn't support it).
* The industry doesn't yet have the tools to digitize traditional, film stock based movies at deep color bit depths. So even if there WERE a way to get the content to the consumer, the production of the content would be dependent upon digitally originated and produced live action films (a technology just getting going) or computer based animation rendering. And of course that would only work for NEW films -- not existing libraries.
So since the content isn't there, where is the mass market deep color going to come from? The answer is that it can only come as the result of various processing algorithms on regular old 8 bit content. I.e., the extra bits are used to eliminate "rounding errors" during processing inside of some device.
[NOTE: Exclude from consideration experimental stuff that might be traded around the internet, or specialty formats such as, say, a new digital tape format (which is also not on the horizon yet). These are not mass market sources. "Deep Color" WILL, on the other hand, appear in games -- basically as a gimmick.]
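To make the rounding-error point above concrete, here's a little Python sketch. It round-trips an 8 bit gray ramp through a gamma encode/decode, quantizing the intermediate result to various bit depths. The 2.2 gamma and the single-stage pipeline are just assumptions for the demo, not any particular device's design:

```python
# Illustrative sketch: round-trip an 8 bit value through a gamma decode and
# re-encode, quantizing the intermediate result to a given bit depth. The
# more intermediate bits, the fewer codes come back wrong.

GAMMA = 2.2  # assumed gamma, for illustration only

def roundtrip(value_8bit, internal_bits):
    scale = (1 << internal_bits) - 1
    # Convert to linear light and quantize to the internal bit depth.
    linear = (value_8bit / 255.0) ** GAMMA
    intermediate = round(linear * scale)
    # Convert back to gamma space and round to 8 bits again.
    restored = (intermediate / scale) ** (1.0 / GAMMA)
    return round(restored * 255)

for bits in (8, 10, 12, 16):
    errors = sum(1 for v in range(256) if roundtrip(v, bits) != v)
    print(f"{bits:2d} bit intermediate: {errors:3d} of 256 codes damaged")
```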
Now there are lots of good reasons to do digital processing with extra bit depth. But that's INSIDE each device doing the processing. Better video devices ALREADY do this sort of thing today -- think HDTVs that tout "10 bit video processing" for example. This sort of internal processing reduces the need for the device to be too clever about how it does things -- over-reliance on dithering, for example.
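For reference, dithering is the cheap trick that extra internal bits let a device lean on less. A toy illustration (the uniform noise and coarse quantizer are chosen just for the demo):

```python
import random

# Toy dithering demo: quantizing a smooth ramp to a coarse step produces
# banding (long runs of identical values); adding noise before rounding
# trades that banding for grain whose average still tracks the ramp.

def quantize(value, step):
    return round(value / step) * step

def dither_quantize(value, step):
    # Inject up to +/- half a quantization step of noise before rounding.
    return quantize(value + random.uniform(-step / 2, step / 2), step)

random.seed(1)
ramp = [i / 8 for i in range(32)]               # smooth ramp, 0.000 .. 3.875
print([quantize(v, 1.0) for v in ramp])         # banded: 0, 0, 0, 0, 1, 1, ...
print([dither_quantize(v, 1.0) for v in ramp])  # grainy, but unbiased
```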
But rounding the final result down before transmission to the NEXT device in the video chain is just not that big of a deal, because the original content didn't have that fine level of information content to begin with.
What most people are (in my opinion) actually seeing, when they THINK they are seeing rounding errors that deep color might eliminate, is improper gamma correction and/or improper setup of gray and color ramps in the first place.
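Here's what I mean, in numbers. If a display decodes with the wrong gamma exponent, the whole gray ramp skews -- which on screen reads as crushed or lifted shadows and banding that people then blame on bit depth. (The 2.2 and 2.8 figures below are just example values:)

```python
# Example: a ramp encoded for gamma 2.2 but displayed at 2.8. The errors are
# systematic ramp distortion, not quantization noise -- extra bit depth on
# the link would not fix them.

ENCODE_GAMMA = 2.2

for display_gamma in (2.2, 2.8):   # 2.8 stands in for a mis-set display
    out = [round(((v / 255) ** (1 / ENCODE_GAMMA)) ** display_gamma * 255)
           for v in (32, 64, 128, 192)]
    print(f"display gamma {display_gamma}: {[32, 64, 128, 192]} -> {out}")
```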
Keep in mind that YCbCr 4:2:2 in HDMI V1.1 (for example) already allows for up to 12 bit gray scale and color samples at the expense of reducing horizontal color pixel counts by half. That's the SAME max bit depth for gray scale and color info as is in "deep color" YCbCr 4:4:4 for HDMI V1.3. And yet there's been little eagerness to go that route up to now.
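The arithmetic behind that claim, sketched in Python (just the sampling math, ignoring HDMI's actual encoding overhead):

```python
# Bits per pixel on the link for different sampling/bit-depth combos: one
# luma (or G) sample per pixel, plus two chroma (or R/B) samples scaled by
# the horizontal subsampling factor.

def bits_per_pixel(bit_depth, chroma_x_factor):
    return bit_depth * (1 + 2 * chroma_x_factor)

print("4:4:4  8 bit:", bits_per_pixel(8, 1.0))    # 24 -- baseline HDMI
print("4:2:2 12 bit:", bits_per_pixel(12, 0.5))   # 24 -- fits the same pipe
print("4:4:4 12 bit:", bits_per_pixel(12, 1.0))   # 36 -- needs "deep color"
```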
But there WILL be other VERY SIGNIFICANT improvements in mass market video processing technology over the next few years. For example, really high quality de-interlacing and scaling solutions will work their way into affordable products. And so better engineered, new devices touting "deep color" *WILL* likely look better! It's just not "deep color" that's doing it. I.e., you could get the same improvement in HDMI V1.1 products, just as people are seeing it today in more exotic high end products like the Anthem Statement D2.
The key missing element in video rates right now is not the frame rate of the content transmission -- although 24 frames per second for film stock based stuff really is too slow for some of the things movie makers would like to do. It is, instead, the REFRESH rate IN DISPLAYS. Understand that this is ONLY an issue in the display technology itself. You can move /24Hz and /48Hz digital video around just fine with HDMI V1.1 today.
And what will likely make the real impact here is the wide availability of 120Hz refresh rate TVs in the near future.
120 is a multiple of 24 and is also a multiple of 30. So it can be used for judder free display of both film and video based content (in NTSC markets where this is a problem today).
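A quick sketch of the cadence arithmetic -- all that matters is whether each source frame maps to a whole number of refresh cycles:

```python
# Frame-repeat cadence for common source rates on 60Hz vs 120Hz panels.
# A fractional repeat count means uneven pulldown (e.g. 3:2), i.e. judder.

for panel_hz in (60, 120):
    for source_fps in (24, 30, 60):
        repeats = panel_hz / source_fps
        verdict = ("even cadence" if repeats == int(repeats)
                   else "uneven pulldown -> judder")
        print(f"{source_fps:2d} fps on {panel_hz:3d}Hz panel: "
              f"x{repeats:g} ({verdict})")
```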
So a well engineered display with a "native" 1080p display matrix, that can accept 1080p/24Hz (or /48Hz) and 1080p/60Hz and display both of them at a refresh rate of 120Hz -- i.e., without having to change refresh rates to eliminate judder -- is a cool thing.
But the INPUT rate to such a TV need not exceed 1080p/60Hz! And again that is already covered today by HDMI V1.1.
Try to go beyond that data rate and you are back in the realm of having to wait for years to see any practical advantage. Once again you are content limited.
That leaves auto lip sync and improved robustness.
Auto lip sync is, in my opinion, hype. Lip sync is only a problem today for improperly re-transmitted content and devices with bugs. I.e., you are correcting for bugs.
Adding auto lip sync in HDMI V1.3 is NOT GOING TO HELP if the content is wrong before it gets to you, or if your devices still have bugs. Nor is auto lip sync going to free video processing designs to do more elaborate, time consuming stuff any time soon. Any design that ships with severe video delay built in is going to fail in the marketplace, because MOST people won't have HDMI V1.3 along their complete chain of devices. I.e., you'd only be able to sell such a device to the more limited, "enthusiast" market.
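For what it's worth, the mechanism itself is trivial: the display reports its video processing latency and an upstream device delays the audio to match. A sketch (the function and the numbers are mine for illustration, not the spec's):

```python
# Sketch of the auto lip sync idea: delay audio by however much longer the
# reported video path takes. It can only compensate for latency a device
# actually reports -- content that arrives already out of sync, or a buggy
# device that misreports, is beyond its reach.

def audio_delay_ms(reported_video_latency_ms, audio_path_latency_ms=0):
    return max(0, reported_video_latency_ms - audio_path_latency_ms)

print(audio_delay_ms(80))      # TV reports 80 ms of video work -> delay audio 80 ms
print(audio_delay_ms(80, 20))  # audio path already takes 20 ms -> delay 60 ms
```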
Improved robustness IS an important factor. The HDMI V1.3 program comes with new criteria and labelling for HDMI cables and that will help *ALL* HDMI users (even non V1.3 users) buy cables with more confidence. It will also raise the COST of cables, but since better cables are already retail priced up in the exosphere, this may not make much difference.
[Folks with current in wall cables may be in for a shock however. Their cables may not do well with the higher bandwidth signal HDMI V1.3 may try to send across them. There are more exotic versions of the HDMI V1.3 driver chips which will help here -- extra equalization that is more robust against signal problems. But that means consumers are going to have to try to figure out which style of HDMI V1.3 chip happens to be used in the devices they want to buy.]
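To put rough numbers on "higher bandwidth": HDMI scales its TMDS clock with bit depth, so a 1080p/60 signal that runs the link at 148.5 MHz at 8 bits runs it half again as fast at 12 bits. A back-of-the-envelope sketch:

```python
# TMDS clock scaling with deep color, using the standard 1080p/60 pixel
# clock. Just the proportional arithmetic -- no blanking or protocol detail.

PIXEL_CLOCK_MHZ = 148.5   # standard 1080p/60 pixel clock

for bits in (8, 10, 12):
    print(f"{bits:2d} bit/component: ~{PIXEL_CLOCK_MHZ * bits / 8:.2f} MHz TMDS clock")
```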
It is also evident that manufacturers are finally taking HDMI seriously with V1.3. And thus they are likely going to expend more resources on design, testing, and manufacturing quality control of HDMI than has been the case up to now.
But the problem is that most HDMI failures are actually due to poor implementation in source devices. And legacy source devices (think cable TV boxes for example) are going to be out there making life miserable for folks for a long time. There's only so much a new HDMI V1.3 receiver, for example, can do to try to cover up the problems in these older, shoddy products.
And whatever increased effort goes into making HDMI V1.3 designs more robust ALSO has to cross a higher bar. That is, HDMI V1.3 is TOUGHER to implement than prior versions. Particularly if you are going to tout its optional features.
Finally, despite efforts to try to get some interoperability testing going, the bottom line is that companies with a history of doing shoddy HDMI are more than likely STILL going to keep screwing it up. Which means some new HDMI V1.3 devices will also still cause problems.
That is, HDMI V1.3 is also not likely to be the panacea people are hoping for to eliminate HDMI system integration problems.
Now the industry has a lot invested in making HDMI V1.3 desirable. If they can only convince people of that, then they win big, because even folks with current HDMI stuff will feel the need to replace it with new HDMI V1.3 stuff much sooner than they otherwise would.
Marketing guys are doing the heavy lifting right now. They are in full spin mode.
But engineering WILL likely come up with real advantages tagged to HDMI V1.3 (even if they actually come from some other piece of the technology like better scaling solutions).
So what's a consumer to do?
My advice is to approach this stuff cautiously. I assert there is no reason to wait for HDMI V1.3 nor to pay a premium for it.
But there is also no reason NOT to get it -- potential cable issues aside.
So if you find a device which is otherwise attractive to you and happens to come with HDMI V1.3 then go for it! This will be particularly attractive for folks who like to buy on the bleeding edge of new technology. Just be sure the price you are paying is justified by THE OTHER STUFF the device does.
But another approach could also work for folks. Right now is a great time to be looking at HDMI V1.1 and V1.2 products. The best such products there will ever be (more or less) are out there now, and you can buy with added confidence because the devices have been put through their paces by reviewers and enthusiasts. Let the dust settle a bit on V1.3. You really won't be missing out on anything dramatic anytime soon.