Originally Posted by gorthocar
To get accurate numbers, somebody would have to measure their system with a kill-a-watt, then change only the CPU & measure again, etc, for all those CPUs.
That's exactly the kind of comparison I was thinking of. Further, I'd say that for HTPCs that are media clients (so no transcoding, etc. - just watching HD video, music, photos, TV tuner, etc.) the interesting comparison isn't really the full-load draw but power draw at idle and while watching HD video -- the "common use case".
That leads to a problem with the benchmarks: measuring true energy usage is made harder by the fact that even if the power draw is higher, a CPU that does the job in half the time could use less energy overall (kWh -- power multiplied by time, not kW/h). That matters for ripping and transcoding tasks -- things that can be run faster. However, it doesn't come into play in the idle and watching-HD-video use cases. Idle is, well, idle -- idle doesn't get done faster on a faster processor. Likewise, a video takes however long the video is to watch, no matter the processor speed. Those tasks can't be made more efficient in the time dimension, only in the power dimension. Does that make sense?
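To make the power-vs-energy point concrete, here's a quick sketch with made-up wattage and runtime numbers (not measurements from any real CPU) showing how a chip that draws more power can still use less total energy on a fixed-size job:

```python
def energy_wh(power_watts: float, hours: float) -> float:
    """Energy in watt-hours = average power draw * time."""
    return power_watts * hours

# Hypothetical transcode job:
# CPU A draws 65 W under load and takes 2 hours.
# CPU B draws 95 W under load but finishes in 1 hour.
cpu_a = energy_wh(65, 2.0)  # 130 Wh
cpu_b = energy_wh(95, 1.0)  # 95 Wh

print(f"CPU A: {cpu_a} Wh, CPU B: {cpu_b} Wh")
# CPU B draws ~46% more power but uses less total energy,
# because the job finishes in half the time.
```

For idle and video playback, the `hours` term is fixed by the user, so only the wattage term can be reduced -- which is exactly why idle/playback draw is the number to compare.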
Sorry for the rambling -- just thinking out loud about how to truly measure energy usage for the common HTPC use cases.