Help understanding amplifier difference (150W bridged vs 200W)
I'm trying to wrap my head around whether a few things matter and how they interact. Mainly, I'm curious how a bridged stereo amplifier compares to a non-bridged one in actual output and overall sound quality. Not at the nitpicking golden-ears level, but at least worth mentioning in case bridged quality is notably poor or something.
I currently have a couple of the well known AudioSource AMP100 amplifiers that I use with my towers & center. I run them as bridged monoblocks, where they're supposed to do 150 watts at 8 ohms, and I've also used them in stereo mode, which is 50 watts per channel at 8 ohms. I realize going from 50 watts to 150 watts is only a 3x increase in power, so maybe +4~5 dB SPL is gained there. But I'm more interested in whether bridging provides the current and power needed to drive the speakers fully.

The speakers in question are budget Polk Monitor 70 II's. These amps drive the M70's to ear-splitting levels, but I'm not necessarily looking for super loud. I want to know whether they're being fully driven, or whether they're unable to output everything they should be capable of in a given room. Polk rates them to 275 watts, which seems nuts to me, but maybe that's meant for a very large room? As a reference point for what's needed, I'm thinking of a normal room, something like 20x16 up to 24x20, give or take, basically a typical living room in a house. I can't measure their SPL right now, but I will soon when my microphone arrives. In the meantime, I'd like to know whether it matters that the speakers are receiving 50 watts or 150 watts (or whatever is really delivered).
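For what it's worth, the +4~5 dB estimate can be sanity-checked with the standard power-ratio formula (SPL gain = 10·log10(P_new/P_old)). This is just the textbook relationship, nothing amp-specific:

```python
import math

def spl_gain_db(p_new_watts: float, p_old_watts: float) -> float:
    """dB of SPL gained from an amplifier power increase, all else equal."""
    return 10 * math.log10(p_new_watts / p_old_watts)

# 50 W stereo mode -> 150 W bridged: a 3x power increase
print(f"{spl_gain_db(150, 50):.2f} dB")  # about +4.77 dB
# Doubling power, for comparison, buys only +3 dB
print(f"{spl_gain_db(100, 50):.2f} dB")  # about +3.01 dB
```

So the "+4~5 dB" figure above is right on: tripling power is a clearly audible but not dramatic loudness increase.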
So then, assuming the 150 watts is real in bridged output on the above AMP100 amps, I'm curious how that compares, both power-wise and sound-quality-wise, to an amplifier that is not bridged, such as the Monolith 200 watt x 3 channel amplifier, an Outlaw 200 watt monoblock, or basically any fairly affordable 200 watt amplifier that isn't bridged. I realize a better amplifier should sound better, but I don't want to fall into the trap of assuming better brand and higher price automatically mean better sound, since room treatment and correction make a big difference. So before moving to better speakers, I'd like to get a handle on whether it matters that I'm using a bridged 150 watt amp with my Monitor 70 II's, or whether a 200 watt unbridged amp would be a significantly different experience and worth the cost. If that's the case, and I treat and correct the room appropriately so I could truly benefit from better speakers (such as RTi A7's maybe), then that opens doors. But I don't want to buy more gear without knowing it will matter in the first place. I assume it would matter, but assuming doesn't always match reality. And I'd love to avoid the hyperbole of "more expensive is automatically better" from people who have never even heard nor measured anything, but I digress...
So is there a real difference between a bridged 150 watt amplifier and a non-bridged 150 watt amplifier? Any references or links to information? I'd even like to measure this myself with a speaker and both amplifiers, using REW, once I get my microphone.
Will a Monolith 200 watt, Emotiva 200 watt, or Outlaw 200 watt amplifier truly be better in every way than a 150 watt bridged amplifier? Is there a way to know that other than listening? I realize this is difficult to answer, and that it's an automatic "yes" for most everyone reading because they're better brands, more expensive, with likely better components and more power output. But that doesn't necessarily mean they're better in a room without treatment & correction.
I'm truly curious whether a sub-$150, 50 watt amp that can be bridged to 150 watts at 8 ohms is essentially junk, or a useful budget approach, before dropping $400~$1000 for 50 more watts and whatever else comes with that cost increase. I'm trying to keep the context on my current speakers, the Monitor 70's, and then extrapolate from there to better speakers in the future, A7's maybe, if anything can truly be figured out from this.
Will the additional power merely increase bass response while not really doing much for mids and treble?
Will the quality of the amplifier make a serious difference in the quality of the perceived sound under the same room treatment & correction?
@MalVeauX First, yes: better speakers on a simple little 85 W Denon X1400 should sound nicer than OK speakers on a badass 140 W receiver like an Anthem 1120. I always suggest speaker upgrades first.
Second, I own old Integra 2 channel amps for zone 2, Atmos, etc., and they're similar to the AMP100. They're great for simple zone 2 or Atmos duty, or maybe a small LR 2 channel setup, but the Monolith amps are really in another league. They are well built and are meant for cranking LR speakers in an audiophile-type 2 channel setup, or even the LCR in a badass theater setup.
my $0.02, hope that helps?
(Series) bridging is generally used to put out the most voltage possible, often because the speaker is high impedance or the power supply is relatively low. That also drives the most power, so bridging is used to squeeze the most power from an amp. The catch is that the high voltage will try to drive high currents into less-than-high impedance loads, and that's where things tend to fall apart.
When two channels are series bridged into an 8R load, each channel effectively experiences a 4R load. Heat, distortion, and the chance of triggering protection circuitry all increase with decreasing load impedance. But it won't be much worse bridging into an 8R load than running both channels into two 4R loads.
The AMP100 bridged can put 150W into an 8R load, but a monoblock that does 200W into 8R is usually rated around 300W into 4R. So I'd expect the monoblock to be able to put out more current (or to do so with less complaint).
Neither the Monitor 70ii's nor the RTi A7's should be difficult loads; they're both rated 8R at around 89-90 dB sensitivity. You're already at "ear-splitting levels", so what do you think you're missing?
I'd suggest borrowing 1kW/channel amplification, and seeing if that impresses you. You'll get the ability to play even louder (at least for a minute, until power compression and hearing loss kick in). Turn it up more, and a tweeter or crossover cap might be the first to die.
If you really want it louder, more efficient speakers will get you there with the same amps.
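To illustrate why efficiency matters more than watts (using the common approximation SPL ≈ sensitivity + 10·log10(P), ignoring room effects and power compression, and treating the 95 dB figure as a hypothetical comparison speaker):

```python
import math

def spl_at_power(sensitivity_db_1w_1m: float, p_watts: float) -> float:
    # Approximate on-axis SPL at 1 m from 1 W/1 m sensitivity and input power
    return sensitivity_db_1w_1m + 10 * math.log10(p_watts)

# ~89 dB speaker (roughly the Monitor 70's class) on 150 W
print(f"{spl_at_power(89, 150):.1f} dB")   # ~110.8 dB
# A hypothetical 95 dB high-efficiency speaker on the same 150 W
print(f"{spl_at_power(95, 150):.1f} dB")   # ~116.8 dB
# Power the 89 dB speaker would need to match that: 150 W x 10^(6/10)
print(f"{150 * 10**(6 / 10):.0f} W")       # ~597 W
```

A 6 dB sensitivity advantage is worth roughly a 4x power increase, which is why "more efficient speakers with the same amps" gets you to louder faster than amp upgrades do.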