Originally Posted by impetigo
Cool story, but frankly I take anything that CNET puts out with a large grain of salt ever since the CES "best in show" award debacle. Of course, that probably applies to any mainstream tech blog/site since they are mostly owned by corporations that have some vested interest in the products and industries they cover. Most of them also receive direct advertising revenue from the very companies they review. I don't know how anyone can take any tech blog seriously these days.
Honestly, no one would be questioning any of it had it not been for that stupidity with the best in show award.
Quite frankly, that doesn't give me pause at all when it comes to actual testing and reviews, though.
An award for "best this" or "best that" is almost always subjective. It's a preference made all the worse by corporate interests. It's like creating a best of list for movies or TV shows: there's no right answer, though there are clearly ones that are the most acceptable to the majority.
Testing processes are less opinion and more fact because the conclusions are backed by hard testing data. Either the product meets the requirements or it doesn't.
Sure, certain preferences can come into play, but if the black levels are no good, the edges leak light, or the speakers suck, you're not going to get away with showering the product with love for very long before you're called on it.
Further, there are other outfits doing roughly the same tests, so it's easy enough to compare testing reviews, unlike event awards for best use of LEDs or best cable management innovation.
Clearly, corporate interests are always a potential issue when it comes to rating a product, but you can have the opposite problem with independent reviewers, too: there are plenty of individual reviewers who are clearly fans of particular products or companies and will always rate them better, despite not having any corporate overlords.
Even an outlet like Consumer Reports can have biases. For example, they used to consistently mark down audio receivers with poor radio tuners, despite the fact that most people never use that function. There would occasionally be contradictions, too, like marking down one car for having a dash that's too busy while marking down another for being too barren.
The key is to use the data to assist in your research, not as the be-all and end-all of what to buy.