Originally Posted by ironhead1230
I hope this isn't too far off topic, but I've wondered for a while about the accuracy of clipping indicators on amps. It's been discussed in this thread how some meters are too "slow" to capture short-duration peaks. Do clipping indicators have similar limitations? Is there a certain duration or "amount" of clipping before clipping is indicated? Could an amp have very brief clipping without it being indicated?
The answer is "it depends".
Some are simple voltage indicators, which means if they are set for the voltage where the amp clips into 8 ohms, then there will be more clipping than the light shows when used with lower impedances. The supply rails sag under the higher current draw, so the amp actually clips at a lower output voltage into the lower load, before the LED threshold is ever reached.
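To put rough numbers on that, here is a minimal sketch. The wattage figures are made up for illustration (not the specs of any particular amp), and the math just assumes a sine wave and V = sqrt(P x R):

```python
import math

# Hypothetical amp: 300 W into 8 ohms at clipping, but only 450 W into 4 ohms
# because the supply sags under the higher current draw (not the ideal 600 W).
rated_power = {8: 300.0, 4: 450.0}   # ohms -> watts at the onset of clipping (made up)

# RMS output voltage at clipping for each load: V = sqrt(P * R)
clip_voltage = {z: math.sqrt(p * z) for z, p in rated_power.items()}

# A simple indicator set for the 8-ohm clip point fires at a fixed voltage
led_threshold = clip_voltage[8]

for z, v in sorted(clip_voltage.items()):
    print(f"{z} ohm load clips at {v:.1f} V RMS; LED set to trip at {led_threshold:.1f} V RMS")

# 4 ohm load clips at ~42.4 V, but the LED waits for ~49.0 V, so the amp is
# already clipping over that last stretch of drive before the light comes on.
```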
HOWEVER, our ears can "tolerate" a good bit of distortion before it becomes noticeable. Some people are more sensitive than others.
An old story from decades ago involves the Crown DC300, THE serious amp back in the 70s for PA and hi-fi alike.
The original amps had no indicators. Then they came out with a model that would detect any difference (distortion) between the input and the output signal and flash an LED.
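The real Crown circuit was analog, but the idea is simple enough to sketch digitally. This is just an illustration of the concept; the gain, threshold, and function name here are my own assumptions, not anything from Crown:

```python
import numpy as np

def distortion_flag(inp, out, gain, threshold=0.005):
    """Scale the output back down by the amp's gain, subtract the input,
    and flag if what is left (whatever the amp added or removed) is too big."""
    residual = out / gain - inp
    return np.max(np.abs(residual)) > threshold * np.max(np.abs(inp))

# Illustrative test: a clean 1 kHz sine vs. the same signal clipped at 90% of full swing
fs = 48_000
t = np.arange(fs) / fs
inp = np.sin(2 * np.pi * 1000 * t)
gain = 20.0                                   # hypothetical voltage gain
clean_out = gain * inp
clipped_out = np.clip(gain * inp, -0.9 * gain, 0.9 * gain)

print(distortion_flag(inp, clean_out, gain))    # False - output matches the input
print(distortion_flag(inp, clipped_out, gain))  # True  - clipping shows up as a difference
```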
People started saying that the new amps were not as loud as the old amps.
The joke was, "How do you make a new Crown louder? Put a piece of electrical tape over the LEDs." People would look at the lights instead of using their ears.
I have done some blind testing where distortion was introduced without changing the levels. It was actually embarrassing how much went unnoticed.
It is real easy to get hung up on some things and completely miss the big picture.
Such as looking at the distortion specs on a piece of electronics and thinking they will make a difference in the sound, yet never looking at (or even being able to find) the distortion specs for the loudspeakers.
Loudspeakers have distortion several orders of magnitude higher (think thousands of times higher) than the electronics in front of them.
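For a ballpark illustration using typical (not measured) figures: an amp spec'd at 0.005% THD feeding a loudspeaker sitting at 5% THD when pushed hard is a difference of a factor of a thousand.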
So the thing with the highest level of distortion is simply ignored.
Does that make sense? Yet people "brag" about how low the distortion is. YEAH RIGHT! Let's talk about the WHOLE system, not just one piece of gear.