1. Theories
Movies are normally encoded in YUV 4:2:0 with a luma range of 16-235, but most graphics cards work in the RGB 4:4:4 colour space with a range of 0-255, so both chroma and luma need to be converted.
1.1 Luma resampling - why WTW and BTB are important
Luma resampling refers to expanding video levels (16-235) to PC levels (0-255). As the table below shows, the expansion produces fractional values, which even HDMI 1.4 cannot carry. In order to send luma over HDMI, the fractional values are rounded to integers:
Orig Expanded Rounded
16 0 0
17 1.163636364 1
18 2.327272727 2
19 3.490909091 3
20 4.654545455 5
21 5.818181818 6
22 6.981818182 7
23 8.145454545 8
24 9.309090909 9
25 10.47272727 10
26 11.63636364 12
There are several problems with this approach:
1) BTB and WTW are cut off before the luma expansion.
2) Banding is introduced. In the example above, the transition from 19 to 20 is mapped to 3 to 5, skipping 4 (see the sketch below).
3) Some graphics cards later convert 0-255 back to 16-235 on the HDMI output, which potentially causes further information loss.
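To make this concrete, here is a minimal Python sketch (my own illustration, not any renderer's or driver's actual code) that reproduces the expansion in the table above - the 256/220 scale factor matches the values in the table, while 255/219 is the other commonly quoted factor - and flags the places where plain rounding skips an output code:

[code]
# Minimal sketch of 16-235 -> 0-255 luma expansion with plain rounding.
# The 256/220 factor reproduces the table above; 255/219 is also commonly used.
SCALE = 256.0 / 220.0

prev = None
for y in range(16, 27):
    expanded = (y - 16) * SCALE
    rounded = int(round(expanded))
    note = ""
    if prev is not None and rounded - prev > 1:
        note = "  <- skips a code: source of banding"
    print(f"{y:3d}  {expanded:12.9f}  {rounded:3d}{note}")
    prev = rounded
[/code]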
The luma information outside the video range (0-16 and 235-255, i.e. BTB and WTW) is also important, even though you don't normally see it in a movie. IMO it provides the following benefits:
1) It gives you a baseline when you calibrate the brightness and contrast of your display.
2) It shows that the video luma is not being cut off or expanded somewhere along the video pipeline.
3) Some movies contain information above 235, so it's important to preserve whites up to at least 240.
Finally, some suggestions:
1) Avoid luma expansion/compression if possible. It's fine for your HTPC to output 0-255 without expanding luma and to let your display clip BTB and WTW.
2) If you have to do a luma conversion, make sure dithering is enabled to reduce banding. You can use ffdshow with dithering enabled or MadVR to achieve this (see the sketch below).
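As a rough illustration of why dithering helps (a simple random-dither sketch of my own - not how ffdshow or MadVR actually implement it), adding a little noise before truncating makes the average output track the fractional value, so a smooth gradient stays smooth instead of turning into steps:

[code]
# Minimal sketch: quantising an expanded luma value with and without dither.
# Real renderers use more sophisticated (ordered/error-diffusion) dithering.
import random

value = 4.654545455        # expanded luma for source code 20 (see table above)
samples = 100000

plain = round(value)       # always 5 -> the fractional part is simply lost
dithered = sum(int(value + random.random())   # add 0..1 noise, then truncate
               for _ in range(samples)) / samples

print("plain rounding:   ", plain)
print("dithered average: ", round(dithered, 3))   # ~4.65 over many pixels
[/code]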
1.2 Chroma upsampling - why more bits are good
Chroma upsampling applies when converting YUV 4:2:0 to RGB/YUV 4:4:4. You can find more information about how it works below:
http://en.wikipedia.org/wiki/Chroma_subsampling
Chroma upsampling generates colour information that doesn't exist in the original video, so the results differ from renderer to renderer. Some chroma upsampling algorithms are compared by Madshi here:
http://forum.doom9.org/showthread.php?t=146228
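To make the idea concrete, here is a minimal sketch of 4:2:0 -> 4:4:4 chroma upsampling (my own simplified illustration; real renderers use better filters and handle chroma siting properly). The chroma planes are only half the luma resolution in each direction, so they have to be interpolated up, and the choice of interpolation filter is exactly what the comparison above is about:

[code]
# Minimal sketch: upsample a 4:2:0 chroma plane to 4:4:4 resolution.
# Nearest neighbour vs. bilinear; good renderers use better filters still.
import numpy as np

# A tiny 2x2 chroma plane (it would accompany a 4x4 luma plane in 4:2:0).
chroma = np.array([[100, 140],
                   [180, 220]], dtype=np.float32)

# Nearest neighbour: just repeat every chroma sample 2x in each direction.
nearest = chroma.repeat(2, axis=0).repeat(2, axis=1)

# Bilinear: linearly interpolate between the chroma samples.
def bilinear_2x(plane):
    h, w = plane.shape
    ys = np.clip((np.arange(2 * h) + 0.5) / 2 - 0.5, 0, h - 1)
    xs = np.clip((np.arange(2 * w) + 0.5) / 2 - 0.5, 0, w - 1)
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    fy, fx = (ys - y0)[:, None], (xs - x0)[None, :]
    return ((1 - fy) * (1 - fx) * plane[np.ix_(y0, x0)]
            + (1 - fy) * fx * plane[np.ix_(y0, x1)]
            + fy * (1 - fx) * plane[np.ix_(y1, x0)]
            + fy * fx * plane[np.ix_(y1, x1)])

print(nearest)              # blocky: each value just repeated
print(bilinear_2x(chroma))  # smoother: new in-between values are generated
[/code]

Note that the bilinear result already contains values that don't exist in the 8-bit source, which leads directly to the bitdepth discussion below.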
MadVR is unique in that it uses 16-bit processing; the other renderers only use 10 bits or even 8 bits (ATI's internal video processing pipeline uses 10 bits, for example). Below is Madshi's explanation of why more bits are important:
"I've seen many comments about HDMI 1.3 DeepColor being useless, about 8bit being enough (since even Blu-Ray is only 8bit to start with), about dithering not being worth the effort etc. Is all of that true?
It depends. If a source device (e.g. a Blu-Ray player) decodes the YCbCr source data and then passes it to the TV/projector without any further processing, HDMI 1.3 DeepColor is mostly useless. Not totally, though, because the Blu-Ray data is YCbCr 4:2:0 which HDMI cannot transport (not even HDMI 1.3). We can transport YCbCr 4:2:2 or 4:4:4 via HDMI, so the source device has to upsample the chroma information before it can send the data via HDMI. It can either upsample it in only one direction (then we get 4:2:2) or into both directions (then we get 4:4:4). Now a really good chroma upsampling algorithm outputs a higher bitdepth than what you feed it. So the 8bit source suddenly becomes more than 8bit. Do you still think passing YCbCr in 8bit is good enough? Fortunately even HDMI 1.0 supports sending YCbCr in up to 12bit, as long as you use 4:2:2 and not 4:4:4. So no problem.
But here comes the big problem: Most good video processsing algorithms produce a higher bitdepth than you feed them. So if you actually change the luma (brightness) information or if you even convert the YCbCr data to RGB, the original 8bit YCbCr 4:2:0 mutates into a higher bitdepth data stream. Of course we can still transport that via HDMI 1.0-1.2, but we will have to dumb it down to the max HDMI 1.0-1.2 supports.
For us HTPC users it's even worse: The graphics cards do not offer any way for us developers to output untouched YCbCr data. Instead we have to use RGB. Ok, e.g. in ATI's control panel with some graphics cards and driver versions you can activate YCbCr output, *but* it's rather obvious that internally the data is converted to RGB first and then later back to YCbCr, which is a usually not a good idea if you care about max image quality. So the only true choice for us HTPC users is to go RGB. But converting YCbCr to RGB increases bitdepth. Not only from 8bit to maybe 9bit or 10bit. Actually YCbCr -> RGB conversion gives us floating point data! And not even HDMI 1.3 can transport that. So we have to convert the data down to some integer bitdepth, e.g. 16bit or 10bit or 8bit. The problem is that doing that means that our precious video data is violated in some way. It loses precision. And that is where dithering comes for rescue. Dithering allows to "simulate" a higher bitdepth than we really have. Using dithering means that we can go down to even 8bit without losing too much precision. However, dithering is not magic, it works by adding noise to the source. So the preserved precision comes at the cost of increased noise. Fortunately thanks to film grain we're not too sensitive to fine image noise. Furthermore the amount of noise added by dithering is so low that the noise itself is not really visible. But the added precision *is* visible, at least in specific test patterns (see image comparisons above).
So does dithering help in real life situations? Does it help with normal movie watching?
Well, that is a good question. I can say for sure that in most movies in most scenes dithering will not make any visible difference. However, I believe that in some scenes in some movies there will be a noticeable difference. Test patterns may exaggerate, but they rarely lie. Furthermore, preserving the maximum possible precision of the original source data is for sure a good thing, so there's not really any good reason to not use dithering.
So what purpose/benefit does HDMI DeepColor have? It will allow us to lower (or even totally eliminate) the amount of dithering noise added without losing any precision. So it's a good thing. But the benefit of DeepColor over using 8bit RGB output with proper dithering will be rather small."
The MPC-HC internal YV12 chroma upsampling shader also produces a very similar result; to make it work you need to feed NV12 to MPC-HC and select the EVR renderer.
1.3 Resizing algorithms
Different renderers also offer different resizing/scaling algorithms, for example bicubic in EVR or VMR9 and nearest neighbor in overlay and VMR7. In general, bicubic provides better quality than simpler algorithms such as nearest neighbor, which is one advantage of using the EVR renderer over VMR/overlay.
A comparison of different scaling algorithms can be found here:
http://audio.rightmark.org/lukin/gra...house_more.htm
Since EVR, Haali and MadVR provide superior scaling algorithms, I would suggest using these renderers instead of the others. If for some reason you have to stick with overlay, you can use ffdshow to do the scaling instead.
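For completeness, here is a tiny sketch (my own, using Pillow purely as an illustration - it is not what any of these renderers use internally) comparing nearest-neighbor and bicubic upscaling of a small synthetic gradient. The nearest-neighbor result just repeats samples, while bicubic interpolates new in-between values:

[code]
# Minimal sketch: nearest-neighbour vs. bicubic upscaling (requires Pillow + numpy).
from PIL import Image
import numpy as np

# A tiny 8x8 diagonal gradient as a stand-in for a video frame.
src = Image.fromarray((np.add.outer(np.arange(8), np.arange(8)) * 16).astype(np.uint8))

nn = src.resize((64, 64), resample=Image.NEAREST)   # blocky (overlay/VMR7 style)
bc = src.resize((64, 64), resample=Image.BICUBIC)   # smoother (EVR/VMR9 style)

print(np.asarray(nn)[0, :8])  # repeated values -> visible stair-stepping
print(np.asarray(bc)[0, :8])  # interpolated values -> smooth ramp
[/code]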
1.4 De-Interlacing
I don't have much knowledge in this area; ATI's hardware de-interlacing satisfies my requirements.
If someone knows more about this, please help me out here - I think it's also quite an important factor.
1.5 Smooth 24P playback - avoid 3:2 pulldown
The FPS (frames per second) of different video files is not always the same; for example, TV broadcast content (e.g. from the BBC) is normally 25P. However, most movies are 24P, which in practice means 23.976 frames per second (23.976 = 24/1.001). In order to play such video at 60Hz, the ATI driver applies 3:2 pulldown: the first frame is repeated 3 times, then the second frame 2 times, so every 2 source frames generate 5 output frames (24/60 = 2/5). The potential problem with 3:2 pulldown is that some frames stay on the screen longer than others, which can cause noticeable judder (see the sketch at the end of this section). True 24P playback doesn't need 3:2 pulldown and is much smoother. In order to enable true 24P playback you need to make sure that:
1) Your display accepts 1080/24P input
2) You can choose either a 23Hz or 24Hz refresh rate in the display setup. Be aware that you should choose 23Hz if both 23 and 24 exist in your display settings (the 23Hz mode is actually 23.976Hz, which matches the 23.976fps content).
When your TV or projector receives a 24P signal, it normally either does 5:5 pulldown (displaying each frame 5 times) or uses frame creation to generate intermediate frames (generating 4 new frames between every 2 original frames). Personally I prefer frame creation, which is available in my Panasonic projector, but nevertheless both options should give you smooth playback (compared to 3:2 pulldown). If your display doesn't support a 24Hz refresh rate, then try 48Hz (2:2 pulldown) or 72Hz (3:3 pulldown) instead.
I choose 23.976Hz (23Hz in CCC) when I play 24P, 50Hz when I play 25P, and 59.94Hz (59Hz in CCC) when I play 30P or 30i.
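Here is a small sketch (my own illustration) of why 3:2 pulldown judders while native 24Hz output, 3:3 and 5:5 pulldown don't: with 3:2 the on-screen duration of the source frames alternates between 3 and 2 refreshes, whereas the alternatives give every frame the same duration.

[code]
# Minimal sketch: how long each 24p source frame stays on screen (in ms)
# for different refresh rates and pulldown cadences.
def frame_durations_ms(refresh_hz, cadence):
    tick = 1000.0 / refresh_hz                 # duration of one refresh
    return [round(repeats * tick, 1) for repeats in cadence]

print(" 60Hz, 3:2 pulldown:", frame_durations_ms(60, [3, 2, 3, 2]))    # uneven -> judder
print(" 24Hz, native 24P:  ", frame_durations_ms(24, [1, 1, 1, 1]))    # even
print(" 48Hz, 2:2 pulldown:", frame_durations_ms(48, [2, 2, 2, 2]))    # even
print(" 72Hz, 3:3 pulldown:", frame_durations_ms(72, [3, 3, 3, 3]))    # even
print("120Hz, 5:5 pulldown:", frame_durations_ms(120, [5, 5, 5, 5]))   # even
[/code]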
End of theory. Please leave your feedback so that I know whether I should continue.
Thanks for reading and bearing with my poor English.