Well, the actual issue could be a trivial one.
In an Intel GMA thread here it was mentioned that Intel's driver engineers had a hard time in the past simply because their lab only had a few AVRs to test the HDMI parts of the driver against. And, to be honest, given the market situation (vital parts of graphics cards produced and sold at penny margins), there is no way ATI, Intel, or nVidia would put enough money into their driver R&D departments to buy a few dozen quality AVRs and TV sets to test enough permutations.
What I have observed is that many of you have above-average configurations, and our displays certainly should be capable of accepting RGB 4:4:4 as well as YCbCr 4:4:4 and YCbCr 4:2:2. What if ATI only tested with a few displays and actually tried to compensate for their deficiencies, not "knowing" that as a result they mess up other displays, such as those used in this forum?
Don't attribute to incompetence that which could simply come from lack of funds.
Also, and this is purely anecdotal:
A few days ago I downloaded one of tulli's pre-made .INF files (for a smaller version of my Panasonic plasma) and replaced the default "Plug&Play Monitor" driver with it. I have also applied an .ICC profile (generated by SwitchResX). This is under Windows 7 x64, with a Radeon 4650 and the 10.1 driver.
It *seems* to me that desktop colors got "better" after this change. Could it be that ATI's driver actually reads and interprets the EDID from the display "driver", instead of sniffing it from the display on its own?
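For anyone curious what the driver would actually see: the color formats a display claims to accept are advertised in its EDID. Here is a minimal sketch (mine, not anything from ATI's driver) of decoding the feature-support byte of a 128-byte EDID base block, assuming EDID 1.4 semantics where, for a digital input, bits 4-3 of byte 24 advertise YCbCr support. The function name and the sample bytes are my own, purely for illustration.

```python
def supported_formats(edid: bytes) -> list[str]:
    """Decode the color encodings a display advertises in its EDID
    base block (EDID 1.4, digital input assumed)."""
    if len(edid) < 128:
        raise ValueError("need the full 128-byte EDID base block")
    if sum(edid[:128]) % 256 != 0:
        raise ValueError("EDID checksum failed")
    digital = bool(edid[20] & 0x80)      # byte 20, bit 7: digital input
    formats = ["RGB 4:4:4"]              # always supported
    if digital:
        bits = (edid[24] >> 3) & 0b11    # byte 24, bits 4-3: color formats
        if bits in (0b01, 0b11):
            formats.append("YCbCr 4:4:4")
        if bits in (0b10, 0b11):
            formats.append("YCbCr 4:2:2")
    return formats
```

If a driver really does read the overridden EDID from the .INF "driver" instead of the wire, a block like this is roughly the information it would be working from when it picks an output format.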