Well, I picked up a Kill-a-Watt meter and finally got around to taking some measurements. I guess you would classify my HTPC as a Premium one...
Intel Core i5-2500K
GIGABYTE GA-Z68MX-UD2H-B3 motherboard
Sapphire Nvidia GT 440 card
Seasonic X-560 PSU
LG WH12LS30 Blu-ray burner (reads/writes DVD and CD as well)
AVerMedia Duo TV tuner card
Hitachi Deskstar 5K3000 2TB (x2)
Crucial Technology 64GB M4 SSD (boot drive)
OCZ Platinum Extreme 4GB DDR3-1600 CL7 1.35V memory
2x Noiseblocker MF12-S2 multiframe case fans
Noiseblocker PFM fan for CPU cooling
4-port USB 3.0 expansion card
HDPlex IR receiver
Kill-a-Watt readings (all in watts):
112 – Blu-ray (TMT5 via WMC)
103 – WMC playing live TV
100 – WMC playing recorded TV
99 – AVCHD 720p video via WMC (sometimes drops to ~73)
94 – MKV playback, 1080p with DTS-HD & madVR
72 – WMC Netflix plugin
68 – WMC 10-megapixel picture slideshow
68 – WMC Zinc internet TV / Netflix
64 – streaming music via MOG web app
2.6 – sleep
(Pictures here: http://www.avsforum.com/avs-vb/showt...0#post20555940)
I'm running Lucidlogix Virtu as well, but with the Nvidia card as the default display device, so it has no major effect unless I'm transcoding. (I tried it the other way around, but TMT5 playback is unacceptably jumpy running on the IGP.)
Here are a couple of quick measurements with Virtu running and the IGP as the default display:
64 – MOG streaming
78 – Blu-ray playback as above
I'm sure I could burn some serious watts playing video games, but that's not my thing. 60–100 watts for typical use seems pretty reasonable to me. Whatever I could save by removing the video card is dwarfed by the energy I consume for hot water, heating, refrigeration, AC, and dehumidification anyway.
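To put a rough number on that, here's a quick back-of-the-envelope sketch. The 20 W savings, 8 hours/day of use, and $0.12/kWh rate are all my own assumptions for illustration, not measured values:

```python
# Rough annual cost of an extra ~20 W draw (e.g. a discrete video card).
# All three inputs below are assumed figures, not measurements.
def annual_cost(watts, hours_per_day, dollars_per_kwh):
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * dollars_per_kwh

print(round(annual_cost(20, 8, 0.12), 2))  # about $7 a year
```

A few dollars a year, which is why the card stays in.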
I'm still not 100% sure whether the perceived improvement in madVR playback quality is mainly a placebo effect, but I'm happy with it.