Originally Posted by andyc56
A few years back, Bob Cordell did some demos at RMAF and HE2007 regarding the peak and average power of actual music. He built a meter containing a true RMS-to-DC chip of the type used in true RMS voltmeters. The output of the chip was processed and calibrated to show the computed average power, assuming an 8 Ohm load, on a display. The circuit also contained a peak detector that held the peaks for 1 second and displayed them on a meter, also calibrated to read Watts into 8 Ohms. He hooked it up to the output of an amp while it was playing music for the demo, so you could see the average and peak power in real time.
He did a writeup of the demo here, in the section titled "The Peak Power Demands of Well-recorded Music". The most extreme recording in terms of peak-to-average power was a Rickie Lee Jones track that averaged 1-2 Watts with 250 Watt peaks. The meter design is described here as well.
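For scale, those numbers imply a crest factor of roughly 21-24 dB. A minimal check of the arithmetic (the 1-2 W and 250 W figures are taken from the demo writeup cited above):

```python
import math

# Rickie Lee Jones track from the demo: ~1-2 W average, ~250 W peaks
PEAK_W = 250.0
for avg_w in (1.0, 2.0):
    crest_db = 10 * math.log10(PEAK_W / avg_w)
    print(f"{avg_w:.0f} W average -> {crest_db:.1f} dB crest factor")
# 1 W average  -> 24.0 dB crest factor
# 2 W average  -> 21.0 dB crest factor
```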
No special meters are required to perform this analysis. The waveforms in question are present in the original recordings and can be obtained from the prerecorded media by various means, including a computer. The same computer can run the recording software used on PCs by both professionals and amateurs, and most such software can perform the same kind of analysis.
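As a sketch of what that software-based analysis looks like: given the sample values scaled to the amplifier's output voltage, average power follows from the true RMS value and peak power from the largest instantaneous sample, both into an assumed 8 Ohm resistive load as in the demo. This is a minimal illustration, not the meter circuit itself; the 8 Ohm load and the test signal are assumptions.

```python
import numpy as np

LOAD_OHMS = 8.0  # assumed resistive load, matching the demo

def power_stats(volts):
    """Average and instantaneous peak power of a voltage waveform
    into an assumed 8-ohm resistive load, plus the crest factor in dB."""
    volts = np.asarray(volts, dtype=np.float64)
    avg_power = np.mean(volts ** 2) / LOAD_OHMS   # true RMS squared over R
    peak_power = np.max(volts ** 2) / LOAD_OHMS   # instantaneous peak
    crest_db = 10 * np.log10(peak_power / avg_power)
    return avg_power, peak_power, crest_db

# Example: a 4 V peak, 100 Hz sine at 48 kHz
# (1 W average, 2 W peak into 8 ohms, ~3 dB crest factor)
t = np.linspace(0, 1, 48000, endpoint=False)
sine = 4.0 * np.sin(2 * np.pi * 100 * t)
avg_w, peak_w, crest = power_stats(sine)
print(f"avg {avg_w:.2f} W, peak {peak_w:.2f} W, crest {crest:.1f} dB")
```

Real music would show a far larger gap between the two figures than a sine wave does, which is exactly what the demo meter made visible.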
Don't confuse measured results with differences that are actually heard. The ear is very tolerant of short-term clipping, provided it is clean.
The most significant comment I found in the paper you cited above, but may not have read thoroughly, was:
"There were no "night and day" results. Indeed, for many attendees the differences were difficult to hear. Moreover, those who perceived a difference were just as often wrong in selecting which amplifier they thought was the tube amplifier. This shocked all of us."