AVS › AVS Forum › Video Components › DVD Players (Standard Def) › Double Blind Testing considered bad?

Double Blind Testing considered bad?  

post #1 of 8
Thread Starter 
I've been reading several threads where it's the subjectivists vs. the objectivists. I keep seeing that people don't like double-blind tests, or that reviewers won't do them because they're supposedly bad. This completely baffles me. DBT is an incredibly valuable tool for evaluating qualitative differences in the subjective domain. I've used the technique to significant benefit in product design and development where the end result was highly subjective, and we needed help creating the best possible product (i.e., one that more people like).

Could someone enlighten me on this topic?

Phil
post #2 of 8
I think double-blind testing is great as long as the judges are experienced enough (especially to overcome their own biases) and as long as the goal or endpoint of the testing is clearly defined. The most interesting debate is the Dolby Digital vs. DTS issue. The Dolby people seem to favor subjective evidence, while the DTS people (i.e., the DTS company) seem to favor objective evidence. I tend to favor the DTS argument, but who's to say that those who prefer the sound of a Dolby Digital soundtrack are "wrong"? Two questions could lead to two completely different answers even with the same double-blind test. If the question is which soundtrack consumers prefer, and the test pits a Dolby Digital soundtrack against a DTS soundtrack, then with common consumers as judges there's a good chance the Dolby Digital soundtrack would win. With well-trained audiophile movie enthusiasts as judges, the DTS soundtrack might win ("more transparent," "better imaging"). On the other hand, if the question were which soundtrack sounds more like the original master, and the test included a Dolby Digital soundtrack, a DTS soundtrack, AND the original master, the DTS soundtrack might win even with the common consumer audience.
post #3 of 8
Thread Starter 
Good point about which method favors your product. DTS has a bigger "bit budget," so of course it would be perceived as better in a DBT.

By the way, I was referring to comments about reviewers not liking DBT. That still mystifies me. Wouldn't they want to tell their readers the truth? Oh, yeah... advertising and sponsorship. Sigh. Hmmm... I'm starting to get paranoid.
post #4 of 8
Quote:
Originally posted by philba
That still mystifies me. Wouldn't they want to tell their readers the truth? Oh, yeah... advertising and sponsorship. Sigh. Hmmm... I'm starting to get paranoid.
Perhaps some people fear DBT because they're afraid that if they had to choose between two "things", they might pick the one they think sounds better rather than the one that actually IS better. ;)
post #5 of 8
Quote:
Originally posted by csv96
I think double-blind testing is great as long as the judges are experienced enough (especially to overcome their own biases) and as long as the goal or endpoint of the testing is clearly defined. The most interesting debate is the Dolby Digital vs. DTS issue. The Dolby people seem to favor subjective evidence, while the DTS people (i.e., the DTS company) seem to favor objective evidence. I tend to favor the DTS argument, but who's to say that those who prefer the sound of a Dolby Digital soundtrack are "wrong"? Two questions could lead to two completely different answers even with the same double-blind test. If the question is which soundtrack consumers prefer, and the test pits a Dolby Digital soundtrack against a DTS soundtrack, then with common consumers as judges there's a good chance the Dolby Digital soundtrack would win. With well-trained audiophile movie enthusiasts as judges, the DTS soundtrack might win ("more transparent," "better imaging"). On the other hand, if the question were which soundtrack sounds more like the original master, and the test included a Dolby Digital soundtrack, a DTS soundtrack, AND the original master, the DTS soundtrack might win even with the common consumer audience.
The correct way to do this sort of comparison is with reference to an uncompressed original. That is, you take various types of material and encode it in both formats. Then you carefully match levels, randomize trials, and so forth and determine how often experienced (and preferably trained) listeners with unimpaired hearing can actually tell the difference between the original and the compressed version and how they grade the severity of the degradation observed when there is an actual detection. At no time does anyone involved in the test know which is which, and it changes randomly from trial to trial. Good codecs are actually very hard to catch out; there are even libraries of sounds that are known to be difficult for perceptual coders that are used for evaluation by folks who are serious about this stuff.
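The trial loop described above can be sketched as a toy simulation. This is just an illustration in plain Python, not any standard test harness: the ABX-style framing and the `respond` callback standing in for a listener's judgment are my assumptions.

```python
import random
from math import comb

def abx_session(respond, n_trials=16, seed=0):
    """Run a simplified ABX-style blind session.

    A = the uncompressed original, B = the coded version; on each
    trial, X is randomly assigned to one of the two. `respond(x_is_original)`
    stands in for the listener's judgment and returns True if they
    answer "X is the original". The assignment is drawn fresh per
    trial, so nobody running the loop knows which is which in advance.
    """
    rng = random.Random(seed)
    correct = 0
    for _ in range(n_trials):
        x_is_original = rng.random() < 0.5   # random, hidden assignment
        if respond(x_is_original) == x_is_original:
            correct += 1
    # One-sided binomial p-value: the chance that pure guessing
    # scores at least this well over n_trials.
    p = sum(comb(n_trials, k) for k in range(correct, n_trials + 1)) / 2 ** n_trials
    return correct, p

# A (hypothetical) listener who reliably hears the coding artifacts:
print(abx_session(lambda x_is_original: x_is_original))  # (16, ~1.5e-05)
```

A score near half of `n_trials` is exactly what guessing produces; only a score whose p-value is small is evidence of an actual audible difference, which is why hard-to-code test material matters so much.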

This is really the only way to evaluate perceptual coders like Dolby Digital, DTS, MPEG, etc., because there is no set of measurements that can evaluate the masking effectiveness directly. People often fall back to arguing about data rate, but that's useless when trying to compare different codecs; it is very easy to design a poor codec that causes more impairment at a high data rate than a good codec at a much lower rate.

Finally, it's pretty much impossible to do a really meaningful comparison of Dolby Digital and DTS using commercially available material--partly because you don't have access to the originals and partly because the original may not even be the same for the two versions, or there may be other differences in the processing applied. Too many uncontrolled variables. One, some, or even all people may prefer one over the other in any given case, but there's often no way to determine exactly why--and the one preferred may be the less accurate one.
post #6 of 8
Quote:
Originally posted by philba
By the way, I was refering to comments about reviewers not liking DBT. That still mystifies me. wouldn't they want to tell their readers the truth? Oh, yeah, advertising and sponsorship. sigh Hmmm.... I'm starting to get paranoid.
A lot of the reason is that it tends to give unwanted results. :) Years ago essentially everyone agreed that the best way to compare two pieces of audio equipment by ear was via level-matched, instantaneous switching, to overcome the problem presented by our very short memory for fine sonic details. But as the equipment--the electronics, at least--got better and better, people gradually stopped hearing differences this way. It was around that point that "long-term listening" came into vogue as a comparison tool and audio reviewing started down the road to the very colorful state we find it in today!

I think this mostly has to do with some people feeling that the disappearance of differences takes the fun out of the hobby aspect. And some people seem to have a lot of their own self-image tied up in feeling that they can hear stuff that other people can't. There is a commercial angle insofar as some manufacturers are selling based on professed sound quality advantages, and insofar as some magazines and Web sites make their livings describing said advantages to their readers, which is a problem if you acknowledge that many of the professed advantages are not real. I think that's a secondary effect, however. Unfortunately, it does muddy up the waters considerably and makes it very hard to pick out the useful observations from the detritus. Not to mention making a lot of reviews excruciatingly long-winded, opaque, pompous, and boring.
post #7 of 8
Thread Starter 
MDRiggs, I'm afraid you have a lot of valid points there. Sigh. In the PC world, reviews are very much lab-based and numerical-score driven. That makes it easy as a buyer to look at the reviews and pick out the features and performance that matter. Yeah, some people just go with the editor's choice(s), but it's great to be able to get objective evaluations of the products. I've found that sorely lacking in the AV space. Granted, it's harder to do there, particularly for audio.

What amazes me is the incredible price range of similar gear and how little objective justification there is for the price differential. I wonder how many features of a given product actually get used by the typical customer. It reminds me of the old-line auto salesman's maxim that cars aren't bought, they're sold.
post #8 of 8
Thread Starter 
As to your point on original source material: this is very true for evaluating encoders. It's even more complex because you can skew the results by boosting different octaves or, worse, doing cheesy things like adding a little high-band noise to make the audio sound brighter so that more people like it. A certain very large software company does this. Is it cheating? I dunno; it feels slimy.

I was thinking of DBT for other things like receivers, DVD players, speakers, interconnects, power cords (yes, there are boutique power cords), and so on. It would be easy to summarize the results ("72% of our panel preferred the Denoy over the Pannon product"). It shouldn't be a substitute for personal evaluation of the products, but I am so strapped for time to evaluate this stuff that I need all the help I can get. I'm pretty certain I'm not alone.
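A summary percentage like that only means much with the panel size attached. A quick sanity check, sketched in plain Python with made-up panel sizes, shows how far a stated preference is from a coin flip:

```python
from math import comb

def preference_pvalue(prefer_a, n):
    """Two-sided binomial test against a 50/50 null: the probability
    that a panel of n coin-flippers splits at least this unevenly."""
    k = max(prefer_a, n - prefer_a)
    tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# "72% preferred A" from a 25-person panel (18 of 25):
print(round(preference_pvalue(18, 25), 3))   # ~0.043, borderline
# The same 72% from a 100-person panel (72 of 100) is far stronger evidence:
print(preference_pvalue(72, 100) < 0.001)    # True
```

The upshot: "72% preferred X" from a handful of listeners is barely distinguishable from chance, while the same split from a large panel is solid evidence of a real preference.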