Originally Posted by Bigus
Impact as in reduce/eliminate hum, or impact as in deepen the soundstage and add more space between instruments in the recording, improving bass definition and locking in the timing and rhythm...
Just to make sure we're on the same page, and keep people from inadvertently wandering too deep into the forest.
No prob., good question.
With pro equipment, Bill's primary focus is on hum and noise. In one of his demonstrations he shows a clamp-on ammeter monitoring the interconnect current between pieces of equipment. He reads 58 milliamps, which I believe he indicated or implied to be either 60 or 120 Hz. That is on a line that should, in essence, have NO net current. So the equipment in his demonstration has to be able to ignore 58 milliamps into pin 1 and remain hum and noise free for the intended signal.
That is, of course, one of the reasons for differential inputs: to reject the common mode hum and noise. The nefarious aspect of the pin 1 problem is that the path that current takes into the amplifier must not be allowed to affect the amplifier. Bill demos and derives how IR drop within the pin 1 path can work its way into the signal stream. Given his focus on 60 cycle, he didn't consider the impact of inductive coupling and its frequency dependence. He did note somewhere that the coupling in some cases increases with frequency, but never attributed that to inductive reactance and coupling. That is why my test goes 20 to 20k. I'd go higher, but I used a QSC RMX1450 as the drive amp, so I band limited the test to the amp's capability.
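To put rough numbers on that frequency dependence: the IR term is flat with frequency, while an inductive coupling term rises linearly with it. A minimal sketch follows; the 58 mA is the figure from the clamp-on demo, but R and M are assumed values purely for scale, not anything Bill or I measured:

```python
import math

I = 58e-3   # shield current from the clamp-on ammeter demo, amps
R = 0.05    # ASSUMED resistance of the pin 1 ground path, ohms
M = 100e-9  # ASSUMED mutual inductance, shield path to signal path, henries

def coupled_voltage(f_hz):
    """IR drop is flat with frequency; the inductive term rises as 2*pi*f*M*I."""
    v_resistive = I * R
    v_inductive = 2 * math.pi * f_hz * M * I
    return v_resistive, v_inductive

for f in (60, 1_000, 20_000):
    vr, vl = coupled_voltage(f)
    print(f"{f:>6} Hz: IR drop = {vr * 1e3:.2f} mV, inductive = {vl * 1e6:.1f} uV")
```

With these made-up values the inductive term is negligible at 60 cycle but climbs to within an order of magnitude of the IR drop by 20 kHz, which is exactly why a full 20 to 20k sweep matters.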
Bill NEVER considers the impact of the power amplifier itself on the powerline currents. For switchers such as the QSC PowerLight series, the audio signal is really well decoupled from the line. As well, pro balanced differential inputs reject ground loop currents several orders of magnitude better than unbalanced consumer RCA driven equipment.
And yet, the pro world STILL has problems as Giz acknowledged..."inherent flaws" as it were.
In consumer unbalanced gear, the issues are much larger by reason of design limitations. Little attention is paid to how the supply pulls current from the line cord, to the power modulated haversines of the diodes in the bridge, to the mutual coupling between the rail currents and the transformer secondary, or even to current driven magnetic fields within the chassis. Indeed, with poor layout, it is easy to make the externally formed ground loop act as the secondary winding for the 60 cycle input current from the PC. (Always have line and supply runs twisted so that all bundles have net zero current and the loops are as small as possible.)
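To see why loop area is the thing to minimize, here's a back-of-envelope sketch using Faraday's law for a sinusoidal field; the stray field strength and both loop areas are assumptions chosen only for scale:

```python
import math

def loop_emf_rms(f_hz, b_rms_tesla, area_m2):
    """Faraday's law for a sinusoidal field: V_rms = 2*pi*f * B_rms * A."""
    return 2 * math.pi * f_hz * b_rms_tesla * area_m2

# ASSUMED: ~1 microtesla of 60 Hz stray field near the equipment
big_loop = loop_emf_rms(60, 1e-6, 0.5)    # sloppy dress: half a square meter
tight    = loop_emf_rms(60, 1e-6, 0.005)  # twisted, bundled runs
print(f"sloppy loop: {big_loop * 1e6:.0f} uV, twisted runs: {tight * 1e6:.2f} uV")
```

Cutting the enclosed area by a factor of 100 cuts the induced 60 cycle voltage by exactly that factor, which is all the twisting is doing.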
The inputs of unbalanced equipment typically reference the chassis and safety ground. Since the amplifier input needs a reference ground, and its input ground is heavily bonded back to the wall outlet and the source's bonded ground, at the lower frequencies of audio it will in essence leave the IC out in the wind, so to speak: a core signal wire whose effective ground return current flows through the safety ground instead of the intended shield of the IC.
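At low frequencies this is just a resistive current divider between the IC shield and the safety ground bond. A quick sketch with assumed resistances and return current (all three numbers are hypothetical, chosen only to show the split):

```python
# ASSUMED values for illustration only
r_shield = 0.10   # RCA cable shield resistance, ohms
r_safety = 0.02   # chassis-to-chassis path via the safety ground, ohms
i_return = 10e-3  # signal ground return current, amps

# Current divider: each path takes current in proportion to the OTHER path's R
i_shield = i_return * r_safety / (r_shield + r_safety)
i_ground = i_return * r_shield / (r_shield + r_safety)
print(f"via shield: {i_shield * 1e3:.2f} mA, via safety ground: {i_ground * 1e3:.2f} mA")
```

With a stout safety ground bond, the bulk of the return current deserts the shield, which is the "out in the wind" condition.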
The net result of all this is that the ground loop which compromises the input shielding may be carrying not only 60 cycle from external fields, but modulated odd harmonics of the power amp's line draw, modulated even harmonics of the currents after the bridge and within the supply rails, as well as actual HF music content which bleeds back into the line cord (I have measured this in the past).
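The odd-harmonic character of the line draw falls out of the rectifier conduction pattern. An idealized sketch (not a measured supply; the 0.95 conduction threshold is an assumption standing in for a capacitor-input bridge that conducts only near each line-voltage peak):

```python
import numpy as np

fs, f0 = 60_000, 60
t = np.arange(0, 1.0, 1 / fs)                 # one-second record -> 1 Hz bins
line_v = np.sin(2 * np.pi * f0 * t)
# Conduct only near the peaks; the pulses alternate sign with the line voltage,
# so the line-cord waveform has half-wave symmetry -> odd harmonics only.
i_line = np.where(np.abs(line_v) > 0.95, np.sign(line_v), 0.0)

spec = np.abs(np.fft.rfft(i_line)) / len(i_line)
for k in (1, 2, 3, 4, 5):
    print(f"harmonic {k} ({k * f0} Hz): {spec[k * f0]:.4f}")
```

The even harmonics vanish in the line cord; it is the 120 Hz charging pulses after the bridge that carry the even ones onto the rails.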
If you consider this unintended "out to in" coupling, you can easily see that there are coupling paths which can alter the system's response. My tests are designed to spot and measure that occurrence for equipment designers.
As to alteration of soundstage? Amazingly unlikely for any well designed equipment with fully differential balanced inputs.
In consumer equipment...an entirely different ballgame. It may happen in some cases, it may not in others.
Without actually designing the equipment to properly ignore the problem, who's to say? My floating source test lets one see the residual of the loop that the amplifier would see while driven to power. I designed it that way because if you simply use the source and monitor its drive to the amp, you have already included the induced errors of the system; if the amp reproduces that with fidelity, the difference will be zero and the error invisible. Mind you, the error can easily be a phase shift of the signal, which would be lost if you analyzed only the FFT magnitude.
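To show what the magnitude FFT hides, here's a small illustration (the 1 kHz tone and the 0.05 radian phase error are arbitrary numbers, not from my test rig): two signals with identical magnitude spectra still leave a clear time-domain residual.

```python
import numpy as np

fs = 48_000
t = np.arange(0, 0.1, 1 / fs)                 # 0.1 s -> exactly 100 cycles of 1 kHz
clean   = np.sin(2 * np.pi * 1000 * t)
shifted = np.sin(2 * np.pi * 1000 * t + 0.05)  # 0.05 rad of phase error

mag_a = np.abs(np.fft.rfft(clean))
mag_b = np.abs(np.fft.rfft(shifted))
print("max magnitude delta:", np.max(np.abs(mag_a - mag_b)))   # ~0
print("peak time-domain residual:", np.max(np.abs(clean - shifted)))  # clearly not 0
```

The magnitude spectra match to numerical precision, yet the difference signal, which is what the floating source residual exposes, is plainly nonzero.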
Your initial question includes lots of descriptors, and I believe it is also important to clarify definitions. For example, bass definition / locking in timing / rhythm?? I can guess, but have no idea what that means with respect to what is happening to the signal in engineering terms. I do know that it's not really possible to delay the signal enough to actually hear a timing change, so I am at a loss with respect to your question.
My apologies for the lengthy response.