UPDATE: this is quite a bit out of date, and some things have been fixed. With 8.3-8.5, I find the only absolutely necessary tweak is standardising levels with UseBT601CSC=1, plus the acceleration fixes if you use dual screens. Use the others only to correct specific problems - it does vary with different cards and OSes.
UPDATE2: if you have an AGP card, even this lot of tweaks might not fix things. Unfortunately there are a bundle of problems with AGP cards, and it's pure luck whether they work or not.
UPDATE3: with 8.8, a 2600 + dual monitors can be made to accelerate everything again. Set VforceUVDVC1 & VforceUVDh264 to 1, and dxva_nohddecode to 0 (see the dual-display entry near the bottom of the list).
Useful app to apply tweaks: http://exdeus.home.comcast.net/ati-hd2x00/
Quick way to apply individual ones in the right place: http://bluesky23.hp.infoseek.co.jp/DXVAChecker_1810.zip
(run the app, right-click any of the listings in the top box, pick driver settings, change stuff, hit OK, and reboot if you're on XP)
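If you'd rather apply one by hand, the tweaks are just driver registry strings. Here's a minimal sketch of a .reg import - the path is my assumption (the usual display-class driver key; on multi-card systems the subkey may be 0001 etc.), so check which subkey holds your ATI values, or just let DXVAChecker write it for you:

Windows Registry Editor Version 5.00

; assumed location of the ATI display driver settings - verify on your system
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000]
; UseBT601CSC is explained in the list below
"UseBT601CSC"="1"

Double-click the file to import, then reboot if you're on XP, as above.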
LIST OF REG TWEAKS:
The "UseBT601CSC"="1" driver expands SD levels, the same way as HD is expanded automatically by the drivers, giving you 0-255, aka PC levels. If you want the original 16-235 (video levels) then go into CCC/avivo video/basic colour and turn "use application settings" off, brightness to 16, contrast 86 - pretty much exactly reversing the expansion. Note that in Vista EVR will always expand, and so will VMR9 with hardware acceleration. However, VMR9 without acceleration will not expand at start, and neither will the CCC colour controls work, until you manually move a colour control via the likes of procamp.
(this isn't a fix though, because when it doesn't expand it gets moody and instead resorts to screwing the colours).
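For the curious, the brightness 16/contrast 86 figures above fall straight out of the levels math: expansion maps 16-235 onto 0-255, ie a gain of 255/219 (~1.16); undoing it needs a gain of 219/255 (~0.86, hence contrast 86) plus a +16 black offset (hence brightness 16).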
Note that this can create a problem with HD 720p material that has an aspect ratio wider than 16:9 and a vertical resolution of less than 720 lines (eg an mkv 720p at 2.4:1 gives 1280*533 if encoded without black bars). These are encoded with standard HD bt709, but the card treats anything under 720 lines as SD, so assumes bt601. So, bt601 processing on bt709 material = unsurprisingly wonky colours. I think the drivers possibly respect colourspace flags in the material though, so this might not be an issue - needs testing. If needed you could use the Haali renderer set to auto, and it'll select the correct one (breaks acceleration though).
More advanced solution to levels: use ffdshow-tryouts to output RGB32 high quality (with fullrange set under RGB conversion - the default "standard" setting, confusingly, actually expands). This breaks acceleration, but means no expansion. Note you'll need to select between bt601/709 for SD/HD, or you can make autoswitching profiles based on resolution (I use horizontal res above/below 1024 for this). Note this also disables all driver-based postprocessing, from sharpening/denoising to deinterlacing, as all of these require YUY2/YV12 to be fed to the renderer (and deinterlacing requires NV12 for anything better than basic bob). If this is all gibberish, it's probably not a good idea to use it, as you need to be fairly familiar with ffdshow for this.
Quick guide here: http://www.avsforum.com/avs-vb/showp...postcount=6237
I don't know how XP behaves in software mode with expansion, so you'll have to experiment a little. A good way is to printscreen some video and paste into irfanview, then you can click any pixel and it'll tell you in the window header what the RGB values are. If it's 0/0/0 in the blackest bits, or 255/255/255 in the whitest bits, that means it's expanded. Non-expanded, ie the original, will be around 16 and 235 respectively.
If you're changing drivers, it's a good idea to run the complete ATI self-destruct sequence to rid your machine of driver remnants: in control panel/programs, uninstall ATI Catalyst Install Manager and tell it to delete all ATI software. Reboot, then delete any ATI directories in c:\program files, and if you have them, run drivercleaner or xdc.
Edit: quick update, I find uninstall/reboot/drivercleaner completely wipes all traces, both files and in the registry.
These tweaks will work with any driver version - some of the key names changed so this covers them all.
THE REG STRINGS (yes, they're all apparently reg strings, even in XP)
trdenoise 0 = turns off the forced temporal denoise, which can just blur things. Even when on (as it is by default), it appears to only work for interlaced content, and only when motion-adaptive or vector-adaptive deinterlacing is in use. It also seems to be broken in the Vista 7.7 drivers, but works in 7.10. When on, it has a hefty hit on the shaders - 15% on a 2600XT. It probably defaults to off for the 2400, as it would probably equate to 30-40% there (if anyone can test, please let me know: just turn it on and off with VA deinterlacing in use, and watch the GPU % in rivatuner).
UPDATE: in 8.3 ATI added a CCC slider for denoise, which defaults to 64 (too much imo, try 30ish). Note it only works when MA or VA deinterlacing is engaged - correct behaviour imo, as film-origin material doesn't need denoising. If you only want SD denoise, set vforcehddenoise=0. To enable the sliders in XP, set Denoise_NA=0.
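As a .reg sketch (same assumed key as the first example - pick only the lines that apply to your driver version, they're alternatives rather than a set):

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000]
; pre-8.3: kill the forced temporal denoise entirely
"trdenoise"="0"
; 8.3+: restrict denoise to SD only
"vforcehddenoise"="0"
; 8.3+: make the denoise slider visible in XP's CCC
"Denoise_NA"="0"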
dxva_detailenhance 0 = turns off forced sharpener.
UPDATE: in 8.3 ATI added sliders for detail and denoise, so the above tweak is no longer necessary - just untick it in CCC. Note the detail enhance appears to be totally disabled for HD; it does work for all SD though, progressive & interlaced. I recommend putting the slider at 50. To enable the sliders in XP, set Detail_NA=0.
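Same pattern for the sharpener (same assumed key as before):

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000]
; pre-8.3: turn the forced sharpener off
"dxva_detailenhance"="0"
; 8.3+: make the detail slider visible in XP's CCC
"Detail_NA"="0"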
VForceMaxResSize 2800000 (aka SORTOverrideVidSizeCaps in Cat7.7 and earlier) = sets the PDVD HDDVD/Bluray max render size in pixels, but the default setting is so small that it can't fill a 1080p screen. The formula appears to be screen width squared * 0.75. On a 2600 the default value when the key doesn't exist appears to be around 2.3 million, which is enough for a 1600*1200 screen but not for 1920 pixels wide. For that the 2.8 million value above is enough, but a 2500*1600 screen needs 4687500 or greater. Since there may be a performance reason for the limit, it's probably not worth setting it higher than you need.
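To make the formula concrete: 1920^2 * 0.75 = 2764800, so the 2800000 below covers any 1920-wide screen, while a 2500-wide screen needs 2500^2 * 0.75 = 4687500. As a .reg (same assumed key):

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000]
; max render size in pixels: screen width squared * 0.75
"VForceMaxResSize"="2800000"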
DXVA_WMV_NA set to 0. Makes the checkbox that controls WMV acceleration visible in CCC. Still no WMV acceleration in Vista regardless though :-(
Fleshtone and ColorVibrance minimums/defaults set to 0. The standard 2600 install has these on and at 25, but if you disable them in CCC they actually carry on working at their default, ie 25. Setting the mins and defaults to 0 means that when you deselect them in CCC, they actually turn off - your choice of course, but several people have reported that vibrance in particular results in posterisation.
UPDATE: with newer drivers this no longer seems to be an issue. Not sure exactly when it was fixed though, so if you're using older drivers it's worth checking.
ColorVibrance_NA and Fleshtone_NA to 0. This makes the sliders visible in XP. Does nothing in Vista - they're visible there already.
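The CCC visibility strings together, if you want them in one import (same assumed key):

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000]
; show the WMV acceleration checkbox in CCC
"DXVA_WMV_NA"="0"
; show the vibrance/fleshtone sliders in XP's CCC
"ColorVibrance_NA"="0"
"Fleshtone_NA"="0"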
UseBT601CSC set to 1. This causes SD to expand to 0-255, in the same manner as HD (which expands regardless of what you do), for a single consistent calibration between the two.
(2600 only) default deinterlacing mode to vector adaptive. In Vista there's a bug with 7.9 where auto mode in CCC just selects bob for 1080i; this reg change forces the issue. For the integrated/2400/3400 cards, see VForceDeint below.
VForceDeint = 2/3/6. These force the availability of various deinterlacing modes, which are restricted on the lower models due to limited shader power. 6 allows all to be available, and you can then manually select which you want in CCC. Watch your gpu %.....
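For example, to expose every mode on a lower-end card (same assumed key):

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000]
; 2/3/6 force various deinterlacing modes; 6 = all modes selectable in CCC
"VForceDeint"="6"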
VForce24FPS1080MPEG2, VForce24FPS1080H264, VForce24FPS1080VC1 all to 0 (dxva_only24fps etc for Cat7.7 and earlier). Allows the 2400 to run European 1080i50 accelerated. The 2600 handles these already, so it makes no difference there.
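The trio as a .reg (same assumed key):

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000]
; lift the 24fps-only limit so European 1080i50 accelerates on a 2400
"VForce24FPS1080MPEG2"="0"
"VForce24FPS1080H264"="0"
"VForce24FPS1080VC1"="0"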
dxva_nohddecode set to 0. Enables mpeg2 HD acceleration for the 2400; does nothing for the 2600 (except in official Cat7.10). This can potentially create problems - if your GPU (see rivatuner) is being maxed out by 1080i mpeg2 after this, set it back to 1 and use PowerDVD with hardware deinterlacing in options/video/advanced forced to 3C (vector-adaptive, the best) or 55 (motion-adaptive, next best). This avoids the enormous hit of mpeg2 HD decoding - around 50% GPU on a 2400pro, or 35% on a 2600pro - but still gets you hardware deinterlacing. As above, in Vista the 2400 drivers will happily ignore whatever deinterlacing mode you select in CCC. XP is better behaved and will obey the CCC mode, but outside PDVD the only way (afaik) to get deinterlacing without acceleration is the Bitcontrol mpeg2 decoder, or ffdshow set to decode mpeg2 (in codecs; you also have to go to the ffdshow output tab, tick "set interlace flag" and set it to bob, then above that tick only NV12 of the colour types). Unfortunately with these methods you get no choice of deinterlacing mode, unlike PDVD, so Vista 2400 users are a bit screwed.
VforceUVDVC1/VforceUVDh264 = 1. From 8.1 onwards, ATI broke acceleration with dual displays on the lower-end models (which for some reason includes the fairly powerful 2600XT, but not the 3800 cards) if the second display is HD (ie >719 lines vertical, thus pretty much every PC monitor). The above tweaks get acceleration working again. Quite why they chose to do this I have no idea, since every card from the 2400 up is capable of dual-screen acceleration.
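And the dual-display fix from UPDATE3 at the top, as one import (same assumed key):

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000]
; 8.8+: re-enable UVD acceleration when a second HD display is attached
"VforceUVDVC1"="1"
"VforceUVDh264"="1"
"dxva_nohddecode"="0"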
Other known bugs:
* mpeg2 decoding runs on the shaders, not in UVD
* wmv acceleration broken in Vista
* 2400 Bluray mpeg2 not accelerated (and in 7.10 2600 also)
* 50hz displays have no overscan correction
* (cat7.8-7.10) Vista EVR+cyberlink mpeg2 results in juddervision deinterlacing errors (this includes Vista PDVD itself, ati avivo codec+EVR is fine though, vmr9 is unaffected)
* (cat7.8-7.10) deinterlacing set to auto in CCC bobs for 1080i h264. Forcing vector-adaptive works, at least in Vista.