
Registered · 39 Posts · Discussion Starter · #1
I'm tired of trying to follow this amusing debate through the thread that was started for MFW owners. So, let's do the intelligent thing here and separate the two so that neither is interfering with the other.


First, my background:

I've been building computers for ~17 years and have experimented with quite a few methods of cooling. I've done water, phase change, and many air-based setups. My brother was also the research and development project manager for Velocity Micro (a boutique computer company similar to, but much smaller than, Alienware), and he often had the opportunity to talk to Intel/Nvidia engineers about certain aspects of their hardware, including operating temperatures, a LOT. His job was to custom-build new computers to overclock and optimize for sale to consumers, so I learned a ton through him over the past few years. Now, I'll throw my opinion in here for it to get disputed.


First things first, electronics without a doubt run better (and overall, longer) cooler than hotter. This is fact and is not worth disputing. However, there are also things called design tolerances. When an engineer sits down and designs any piece of electronic gear, temperature is one of the major concerns, because engineers know that temperature severely affects electronics. Knowing the temperature concerns, they factor in things like intended use, operating environment, ambient temperature, etc. Anything about the design that is particularly important is usually printed in the owner's manual (things like x inches/feet of clearance for airflow or whatnot).


Electronics get hot as a byproduct of current flow. The more current flowing through the system, the hotter it gets. Engineers know this when they design their products, and the good ones build in a sufficient safety margin to still hit their target of x years before failure. The reason heat is a bad thing is that as temperature goes up, resistance goes up, and more current/voltage is required to do the same work.
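
To put a rough number on the resistance part of that, here's a minimal sketch - and only for a plain copper conductor; the ~0.0039/°C coefficient is the standard figure for copper, while semiconductors behave differently (leakage current matters more than trace resistance there).

Code:
# Illustrative only: how a copper conductor's resistance rises with temperature.
# R(T) = R0 * (1 + alpha * (T - T0)) is the usual linear approximation.

ALPHA_COPPER = 0.0039   # per degree C, approximate temperature coefficient of copper
R0 = 0.10               # ohms at the reference temperature (hypothetical trace)
T0 = 25.0               # reference temperature in degrees C

def resistance_at(temp_c):
    """Linear approximation of conductor resistance at temp_c."""
    return R0 * (1 + ALPHA_COPPER * (temp_c - T0))

for temp in (25, 50, 75, 100):
    r = resistance_at(temp)
    print(f"{temp:>3} C -> {r:.4f} ohm ({(r / R0 - 1) * 100:.1f}% above reference)")

Run it and you see roughly a 30% rise in resistance by 100°C - which is the direction the argument needs, even though real circuits are messier than a single trace.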


Back in the real world, away from design, most electronics nowadays have a very large thermal margin. You, as the consumer, generally have no idea what the design tolerances or the thermal margin of your electronics are. This has driven some people to do what was described in the previous thread: increase the cooling of their electronics to stay within whatever their "safe" zone is. This is not a bad thing, and if that's what floats your boat, then that's great. Most of the time it isn't necessary, though, and what you gain from it is difficult to quantify.


I used to fall into this camp and was a huge proponent of "more cooling is better cooling," but then I bought a pair of Radeon HD 4850 video cards for my computer. When I plugged them in, booted up, and checked temperatures, they were running in the mid-60°C range. My heart almost stopped; I was sure I was about to get a blue screen any second and my computer would be useless for a while. That didn't happen, and after a few minutes I decided to go ahead and run a few benchmarks to see what happened. The hottest temperature I ever saw on them was 87°C after overclocking (stable, no artifacts, working properly). I was amazed. I put in a call to my bro to see if this was normal, and he told me it was, and that the thermal tolerance of that particular card was somewhere around 115-125°C. I was shocked, but I've learned since then that most things are like that now.


It was different back in the day, when processors didn't put out much heat and there was no such thing as a graphics card. Heat wasn't a big problem then. Now that it has become one, it is being taken into account in the design of new materials, and the result is that things are going to run hotter - but you're getting more performance out of them AND they were meant to run hotter, so the only real problem is that you're now creating more heat in your home.


Let me reiterate that more cooling is not a bad thing, just not necessary in most cases (with modern equipment), in my opinion. Unfortunately, with computers and home theater equipment (mainly receivers), some people seem to live under the assumption that they can stick it in the smallest space they can find, enclose it, never think about heat again, and it will be fine. At the other end of the spectrum are the people who use liquid nitrogen to cool their equipment. As long as you are using any given electronic component within its prescribed operating environment, the chances of it failing are very low.
 

Registered · 643 Posts
I agree with much of your post... however, those ATI cards (you could call them waffle irons) could easily raise your CASE temps way too high for your other components to be happy, especially a heavily overclocked CPU.


I run a GTX 280, and in my case I idle at 38°C in IE, while most people's run far hotter.


A lot of cards today dump the heat into the case instead of exhausting it EXTERNALLY.


If you put your hand where my EVGA dumps its heat outside the case during Crysis, it feels like a hair dryer... but my case stays cool inside.


Unless of course your case looks like this (my homemade test bench)...



The FIRST thing I questioned about my MFW, when it was running hot, was the fact that it had no heatsink finning on the backplane...


It smelled hot, then burnt up.


Coincidence?


I think not.
 

Registered · 206 Posts
I run dual GTX 285s in SLI and have 6 fans in my case (it runs surprisingly quietly, though, because they are all very expensive fans). I too am amazed these GPUs are designed to run at the boiling point of water with no ill effects. However, I do take care that my other components are not affected (one fan each blowing over my hard drive cages keeps them all at 33°C).
 

Registered · 206 Posts
Now on the topic of subwoofers, because of power inrush problems, I'm forced to have my dual A7S-450s on continuously (each has an LT1300 amp). Kill-A-Watt tells me they use 20 watts each doing nothing, and the amps get quite hot. Is this the same sort of thing people are complaining about with the MFWs?
 

Registered · 15,246 Posts
One thing to bear in mind when looking at the amps in the MFW, at least the original ones for now, is who picked the electrical components used in the amps, as opposed to, say, video cards or hard drives. In the latter situations, you can reasonably assume you're dealing with competent engineers and companies that screen their suppliers, perform statistical QC on parts and finished products, and even do accelerated longevity testing. If AV123 has taught consumers anything, it's that they simply don't have this capability. Hence, I can completely sympathize with those users who are looking for inexpensive ways to drop the temperature a few degrees.
 

Registered · 450 Posts

Quote:
Originally Posted by merzbow /forum/post/16851446


Now on the topic of subwoofers, because of power inrush problems, I'm forced to have my dual A7S-450s on continuously (each has an LT1300 amp). Kill-A-Watt tells me they use 20 watts each doing nothing, and the amps get quite hot. Is this the same sort of thing people are complaining about with the MFWs?

Yes that's exactly it.


One point I was trying to make is that a hot heatsink isn't bad. It's actually good. Bad contact with the component you are hoping to cool usually results in a cool heatsink (no heat transfer) - which can make you think the system is working well (because it doesn't feel hot) - and then a few minutes later the magic blue smoke comes out of your CPU, or whatever...


The key is not whether someone THINKS a heatsink is too hot (or some arbitrary touch test). The key is whether the system is exceeding the temperature limits a properly functioning device was designed to operate at.


One thing the Google study shows is that things that are going to fail are going to do it quickly. Look at year 1 in the study: failure rates were high regardless of temperature.


That first line shows us that bad stuff is going to go bad, and it usually goes bad fast, regardless of temperature. elviscerator's MFW blew in its first year. It was hot before it blew. It's very hard to say heat caused it to go bad, because most electronics that are going to fail do so in the first year - regardless of heat.
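
To picture that "fails early if it's going to fail" shape, here's a toy sketch - not the Google data, just a hypothetical Weibull failure-rate curve with a shape parameter below 1, which is the textbook way to model early failures.

Code:
# Hypothetical illustration of infant mortality: a Weibull hazard with shape k < 1
# starts high and falls with age. Parameters are made up, not from the Google study.

k = 0.5      # shape < 1 -> decreasing failure rate (early failures dominate)
lam = 10.0   # scale, in years (hypothetical)

def hazard(t_years):
    """Instantaneous failure rate h(t) for a Weibull(k, lam) lifetime."""
    return (k / lam) * (t_years / lam) ** (k - 1)

for year in (0.25, 1, 2, 5, 10):
    print(f"year {year:>5}: relative failure rate {hazard(year):.3f}")

The rate at a few months comes out several times the rate at year five - a unit with a latent defect tends to show itself on that early, steep part of the curve whatever the temperature.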


The point is you really need to know the specifications of the amp components before you can determine whether or not it is running "too hot". One key thing to note is that I have yet to see an amp that DID NOT run hot relative to many other components - and yet amps tend to run for decades.


Another point I intended to make there is that running cooler is better - but how much cooler, and how much better?


Large decreases in temperature can extend the lifespan of electronic components - and increases in temperature can shorten it.


In the case of CPUs, I know that I am running mine hot on air to reach the speeds I want to get out of it. I also know that if I halve the lifespan of the component due to this treatment, it will STILL be obsolete before I can burn it out. Now, I could run water cooling or phase change, with all the hassle and extra electricity involved - but I know that a $50 air cooler can keep my chip alive for as long as I'll need it, so I'm not going to worry or burn up all that extra cash on a $300 water cooler or $100 TEC, plus a lot more electricity for both.


The question that has yet to be answered is how much of a decrease you actually see from running a fan behind an MFW amp. Then you'd need to know the lifespan of the components (20 years? 30?) and how higher temperatures impact them - which their design will dictate - to determine what you're actually buying by running the fan behind it and paying for the electricity to run it.


If the fan is dropping the temps by 5°C or so, I would say it isn't worth it. I would think you'd want to knock them down by 10°C or more to have a significant impact on the lifespan. Even then - what are you talking about? An amp that lasts for 23 years instead of 21? I know it's expensive equipment now, but who runs 20-year-old subwoofers?
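
For a rough sense of scale, here's a sketch using the common rule of thumb that electrolytic capacitor life roughly doubles for every 10°C drop in operating temperature (an Arrhenius-style approximation; the 21-year baseline is just borrowed from the example above and is hypothetical).

Code:
# Rule-of-thumb sketch: capacitor life roughly doubles per 10 C drop in temperature.
# The 21-year baseline is hypothetical, chosen only to mirror the numbers above.

BASELINE_LIFE_YEARS = 21.0   # assumed life at the current operating temperature

def projected_life(temp_drop_c):
    """Projected life after lowering the operating temperature by temp_drop_c."""
    return BASELINE_LIFE_YEARS * 2 ** (temp_drop_c / 10.0)

for drop in (2, 5, 10, 20):
    print(f"drop {drop:>2} C -> roughly {projected_life(drop):.0f} years")

The rule only applies cleanly to the capacitors, and whatever the paper numbers say, the "obsolete before it wears out" point below is what actually decides whether the fan earns its keep.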


A key characteristic of the vast majority of electronics is that they are obsolete long before they wear out. I suspect the MFW amps (the ones without faulty hardware) will reach that point along with everybody else's subs and amps. Time will tell.
 

Registered · 450 Posts

Quote:
Originally Posted by shenlon /forum/post/16850219


First things first, electronics without a doubt run better (and overall, longer) cooler than hotter. This is fact and is not worth disputing.

I might actually dispute that for the purposes of this application - outside of the longevity debate which I addressed in my previous post.


Considering most of the components in the amp are passive - a few C lower isn't going to have much of an impact on their performance.


Maybe a few degrees above absolute zero you could find measurable changes in the electrical properties of the components (like capacitance and resistance), but we're talking about pointing a consumer-level fan at the back of the plate.


Do you really think the amp is going to perform better at cooler temperatures - as you imply here?
 

Registered · 41 Posts

Quote:
Originally Posted by Chu Gai /forum/post/16854193


One thing to bear in mind when looking at the amps in the MFW, at least the original ones for now, is who picked the electrical components used in the amps, as opposed to, say, video cards or hard drives. In the latter situations, you can reasonably assume you're dealing with competent engineers and companies that screen their suppliers, perform statistical QC on parts and finished products, and even do accelerated longevity testing. If AV123 has taught consumers anything, it's that they simply don't have this capability. Hence, I can completely sympathize with those users who are looking for inexpensive ways to drop the temperature a few degrees.

I've managed to ignore most of your posts lately, Chu Gai - but this one is just unbelievable.


The OP wants to discuss HEAT SINKS and clearly states he's trying to create a separate discussion, and you still find a way to make it all about bashing av123.
 

Registered · 15,246 Posts
That's great Daryl. I can't seem to recall a single one of yours.

This thread came out of another thread where people were looking to cool their MFW amps - old and new. I happen to think there's merit in it, both generally and specifically. Noting that the engineering competence of a Seagate or an ATI is substantially greater than AV123's seems like a no-brainer to me. They didn't get into this mess because of engineering excellence. One day their forums will reopen and you can once again join the thronging masses tossing rose petals and singing their praises. Soon.

In the future Daryl, why don't you just put me on ignore. That way you won't have to manage to ignore most of my posts. You can ignore them all.
 

Registered · 41 Posts

Quote:
Originally Posted by Chu Gai /forum/post/16860680


In the future Daryl, why don't you just put me on ignore. That way you won't have to manage to ignore most of my posts. You can ignore them all.

It appears that I can safely put you "on ignore" - the likelihood of you contributing anything useful or new seems pretty low at this point.
 

Registered · 15,246 Posts
I'm sure that'll work Daryl. My posts, I'm sure, pale in comparison to your prolific and stellar contributions here. Cheer up though. Someday the other forum place will reopen.
 

Registered · 1,130 Posts

Quote:
Originally Posted by Noubourne /forum/post/16856633


I might actually dispute that for the purposes of this application - outside of the longevity debate which I addressed in my previous post.


Considering most of the components in the amp are passive - a few C lower isn't going to have much of an impact on their performance.


Maybe a few degrees above absolute zero you could find measurable changes in the electrical properties of the components (like capacitance and resistance), but we're talking about pointing a consumer-level fan at the back of the plate.


Do you really think the amp is going to perform better at cooler temperatures - as you imply here?

I don't think anybody is saying they will increase their performance by running cooler, but they will stand a better chance of not being cut off by their overheat circuits by running cooler.


Talk of video cards, chipsets, hard drives, and CPUs is a completely different thing from a subwoofer amp, because they will slow down as heat builds up rather than cut off like stereo equipment - unless you have that ability turned off in the BIOS or in software/the OS.


To me, however, keeping your computer, stereo, subwoofer, et al. cooler can only help them live longer. Basic, simple, and it works.



Continue with your argument that hot is good, I'm out.
 

Registered · 41 Posts

Quote:
Originally Posted by Chu Gai /forum/post/16863243


I'm sure that'll work Daryl. My posts, I'm sure, pale in comparison to your prolific and stellar contributions here. Cheer up though. Someday the other forum place will reopen.

I don't post much here, Chu. I usually come here to learn - which is why it will be so easy to ignore you.
 

Registered · 450 Posts

Quote:
Originally Posted by KlipschHead281 /forum/post/16863337


I don't think anybody is saying they will increase their performance by running cooler, but they will stand a better chance of not being cut off by their overheat circuits by running cooler.


Talk of video cards, chipsets, hard drives, and CPUs is a completely different thing from a subwoofer amp, because they will slow down as heat builds up rather than cut off like stereo equipment - unless you have that ability turned off in the BIOS or in software/the OS.


To me, however, keeping your computer, stereo, subwoofer, et al. cooler can only help them live longer. Basic, simple, and it works.



Continue with your argument that hot is good, I'm out.

That is what I consider to be a strawman argument, but I suspect it is more a symptom of your understanding of electronics than anything else. It is not accurate to state that electronic devices "slow down as heat builds up".


I assume you are referring to clock throttling. This is a feature of some CPUs and GPUs - not hard drives or chipsets. It is generally employed for very expensive components that have a real likelihood of overheating.


Clock throttling is triggered based on the estimated thermal limits of the processor design. It is not a case of the processor gradually slowing down as heat increases; it is a sudden drop to a fraction of normal speed once a critical temperature threshold is reached, where continued operation at normal speed is expected to damage the processor. You basically have to install the heatsink wrong to trigger it, and usually it is possible to exceed these temperatures for a fairly long time before any real damage occurs, because of how conservatively the limits are set. Of course, that varies from processor to processor, because each one has unique characteristics due to tiny variations in production.
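
As a toy illustration of that behavior - not any particular CPU's algorithm, just the "hard threshold, then a fixed fraction of normal speed" idea described above (the 100°C limit, 3000 MHz clock, and 50% factor are made-up numbers).

Code:
# Toy model of thermal throttling as described above: nothing happens until a
# critical threshold is crossed, then the clock drops to a fixed fraction of normal.
# The 100 C threshold, 3000 MHz clock, and 0.5 factor are made-up numbers.

CRITICAL_TEMP_C = 100.0    # hypothetical throttle threshold
NORMAL_CLOCK_MHZ = 3000.0
THROTTLE_FACTOR = 0.5      # fraction of normal speed while over the threshold

def effective_clock(temp_c):
    """Clock speed under this toy throttling policy."""
    if temp_c >= CRITICAL_TEMP_C:
        return NORMAL_CLOCK_MHZ * THROTTLE_FACTOR
    return NORMAL_CLOCK_MHZ   # no gradual slowdown below the threshold

for temp in (60, 85, 99, 100, 110):
    print(f"{temp:>3} C -> {effective_clock(temp):.0f} MHz")

Note there is no gradual slope: 99°C runs at full speed and 100°C does not, which is exactly the distinction being drawn with "slow down as heat builds up".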


Pointing a crappy fan at the back of the sub amp will lower the temperature, sure. My point is that it is important to consider what benefits you could expect before you go ahead and produce all that extra noise and consume all that extra electricity.


If we are talking about the risk of hitting a thermal limit of the amp where it may shut down - then in my opinion the fan is useless.


If you have an amp that is running that close to the thermal shutdown threshold - then you either have a faulty amplifier or you live in a black tent in the desert. I don't see many people claiming that they are constantly triggering a heat protection circuit and are forced to wait some extended period of time to cool it down before it will work again...


Let's say thermal protection kicks in at 150°C to prevent damage to the electronics. You're at 85°C before the fan and 83°C after the fan. What benefit are you gaining in this situation? It didn't shut off before the fan, and after the fan it still isn't shutting off. There is no discernible benefit here.


The other argument being made is that of longevity. I think a similar question needs to be asked for this argument as well. If your fan (or other cooling solution) can lower the temperature by 20C or more - then you may gain a measurable benefit in longevity of the equipment. But again - a few degrees isn't going to have a significant impact on the longevity of the hardware.


In general cooler is better but I think you're oversimplifying.


Here is my point: Cooler is not better when the cost of cooling outweighs the benefits of having the equipment run cooler.
 