
Which produces more heat, CRT or plasma TV's? - Page 2

post #31 of 163
There's nothing less "green" than buying a brand new big TV you certainly don't need that was manufactured with all sorts of chemicals in another country, loaded onto a ship that sails across the ocean burning fuel oil, then is trucked across the country to your home. TVs like these are luxury items that 90% of the world's population can't afford; to pick one because it's "greener" than another is feel-good window dressing that doesn't make a damn bit of difference.

cylonsix, as tluxon described above, the only thing that matters when it comes to heating your room is the amount of power consumed. An IR thermometer might identify hot spots, but the only tool that will actually tell you how much heat is being brought into the room by an appliance is a Kill-a-watt meter. So, if your SXRD uses 220W and the V10 325W, the V10 is putting ~50% more heat into the room. How much of a difference does 100W make? Not very much.

FYI, small electric space heaters use 1,500W.

jeff
post #32 of 163
So it sounds like as long as the 58V10's wattage is around 325 +/- 15%, I will not feel any difference.
For the green part, I can do it. I have never owned a car and I have been recycling since I was 9... long before it was cool. I use less power than most homes; my PG&E bill has always been $41-$62 a month, with a large electric water heater, a 15-year-old fridge (yes, I need to replace it), an electric oven/stove, 2 PCs, 2 HD-DVRs, 2 receivers, and 3 HDTVs, with 2 people in the house. I even got an iPod (waiting for that USB solar charger from Energizer) so I do not have to turn my computer on; I've only turned the computer on a handful of times since last November. And I always buy American if I can, and I do not mind spending the extra $$.
post #33 of 163
Quote:
Originally Posted by greenjp View Post


cylonsix, as tluxon described above, the only thing that matters when it comes to heating your room is the amount of power consumed. An IR thermometer might identify hot spots, but the only tool that will actually tell you how much heat is being brought into the room by an appliance is a Kill-a-watt meter. So, if your SXRD uses 220W and the V10 325W, the V10 is putting ~50% more heat into the room. How much of a difference does 100W make? Not very much. FYI, small electric space heaters use 1,500W.

Yeah, it's funny how so many people worry about how much electricity a flat panel TV uses, yet they don't talk about how much power their other household products use. My two table lamps each have 150 watt light bulbs, so when I'm reading with the TV off, I'm actually using more electricity than if I were watching my 42" Plasma TV (220 watts average draw according to my Kill-A-Watt meter). My microwave uses 1,100 watts, my space heater uses 1,200 watts, and my blow dryer uses 1,000 watts. I don't even know how much my air conditioner uses.
post #34 of 163
Quote:
Originally Posted by RandyWalters View Post

Yeah, it's funny how so many people worry about how much electricity a flat panel TV uses, yet they don't talk about how much power their other household products use. My two table lamps each have 150 watt light bulbs, so when I'm reading with the TV off, I'm actually using more electricity than if I were watching my 42" Plasma TV (220 watts average draw according to my Kill-A-Watt meter). My microwave uses 1,100 watts, my space heater uses 1,200 watts, and my blow dryer uses 1,000 watts. I don't even know how much my air conditioner uses.

Well, there you go Randy. You have provided a way that people can actually save energy and enjoy a far superior home theater movie experience, by watching a Plasma set, with the lights turned off.
post #35 of 163
Thread Starter 
Quote:
Originally Posted by greenjp View Post

cylonsix, as tluxon described above, the only thing that matters when it comes to heating your room is the amount of power consumed. An IR thermometer might identify hot spots, but the only tool that will actually tell you how much heat is being brought into the room by an appliance is a Kill-a-watt meter. So, if your SXRD uses 220W and the V10 325W, the V10 is putting ~50% more heat into the room. How much of a difference does 100W make? Not very much.

FYI, small electric space heaters use 1,500W.

jeff

A watt meter will only give you a theoretical potential heat output, which can change with the size of the device, placement, ambient temperature, air flow, the way it dissipates heat, etc. An IR thermometer will give you the actual degrees heat output that's actually being put into a room.
post #36 of 163
Quote:
Originally Posted by Clint S. View Post

A watt meter will only give you a theoretical potential heat output, which can change with the size of the device, placement, ambient temperature, air flow, the way it dissipates heat, etc. An IR thermometer will give you the actual degrees heat output that's actually being put into a room.

A watt meter gives the power input to the device. Power out = power in unless you are charging a battery. The amount of power out in the form of light is negligible so the rest is output as heat.

A higher power device may run cooler due to larger radiating area and/or ventilation fans but it will still be dumping more heat into the room than the lower power device.

Daniel Lang
post #37 of 163
Thread Starter 
Quote:
Originally Posted by dblang View Post

A watt meter gives the power input to the device. Power out = power in unless you are charging a battery. The amount of power out in the form of light is negligible so the rest is output as heat.

A higher power device may run cooler due to larger radiating area and/or ventilation fans but it will still be dumping more heat into the room than the lower power device.

Daniel Lang

I realize all that. But: An IR thermometer will give you the actual degrees heat output that's actually being put into a room. Whereas with a watt meter, you have to guess at the degrees. And due to the reasons I mentioned, that guess can be very inaccurate. With a thermometer, there's no guesswork.
post #38 of 163
Quote:
Originally Posted by Clint S. View Post

I realize all that. But: An IR thermometer will give you the actual degrees heat output that's actually being put into a room. Whereas with a watt meter, you have to guess at the degrees. And due to the reasons I mentioned, that guess can be very inaccurate. With a thermometer, there's no guesswork.

You seem to be confusing energy, heat, and temperature. As others have stated, and the laws of thermodynamics dictate, power in = power out, and all that output eventually ends up as heat. It may go out in different forms, or be concentrated in different ways, but for the purposes of considering the device's effect on the room the only thing that matters is power in.


jeff
post #39 of 163
Thread Starter 
Quote:
Originally Posted by greenjp View Post

You seem to be confusing energy, heat, and temperature. As others have stated, and the laws of thermodynamics dictate, power in = power out, and all that output eventually ends up as heat. It may go out in different forms, or be concentrated in different ways, but for the purposes of considering the device's effect on the room the only thing that matters is power in.


jeff

Again, I know all that, Jeff, and I'm not confusing anything. My point is: why use a watt meter and look at a display that only shows "354 watts" (or whatever the figure may be), and have to GUESS at the temperature in degrees that's being put out, when you can use an IR thermometer and get the actual numerical degrees and heat that's being put out?

"It may go out in different forms, or be concentrated in different ways", is exactly why a thermometer is the only way to see the actual temperatures in degrees.

I already knew the power consumption of my TV. That told me little. I can see "284 watts" or "350 watts", whatever, on a meter's display. Uhhh, oooook...but how much heat? What's the temps? I already knew that when I touched the back, or the screen, I felt "heat", but I had no idea how much.
Only when I used an IR thermometer could I actually see the heat in degrees, which is what (I would think) one wants to know when they want to know how much heat something is generating.

Sure, we all know that a watt meter will tell you which device is theoretically putting out more heat than another by their power consumption; there's no denying that. But that doesn't tell you "which device is hotter". It was "cylonsix" that mentioned degrees, which is why I did. As he mentioned, his DVR is only 180 watts, but it gets up to 145°. But his 50" Sony is only 92°, and it's 220 watts.
post #40 of 163
The temperature of a device just shows how well (or how badly) the heat dissipation on that device works. My DSL router runs hotter than my PS3; that doesn't mean it produces more heat, it's just that it doesn't have a 14cm fan running over the heatsink.

Actually, if the exhaust air from a device is very hot, that's a good sign. If the "body" is very hot to the touch, it's a bad sign.

The "heat" the device "produces" is exactly the power consumption it has. It takes X watts and converts that energy into light/sound/residual heat, which is just the same energy in a different form and eventually ends up back as "heat" :P
post #41 of 163
Quote:
Originally Posted by Clint S. View Post

I realize all that. But: An IR thermometer will give you the actual degrees heat output that's actually being put into a room. Whereas with a watt meter, you have to guess at the degrees. And due to the reasons I mentioned, that guess can be very inaccurate. With a thermometer, there's no guesswork.

Take into account that an IR thermometer can only tell you the temperature of one point, which is not the right way to measure the energy output of a device. Just attach a fan to the side of that device, and magically the IR thermometer will show that the temperature has decreased several degrees. And it is true, at that point the temperature has decreased, because the surrounding air has gotten hotter and has been blown away by the fan, but the device is outputting exactly the same amount of "heat".

Again, if something uses 200W of power, it generates more heat than something that uses 100W. In fact it generates exactly twice the heat, NO MATTER HOW HOT IT RUNS!! About 172 kcal per hour instead of about 86 kcal per hour, to be precise, and again, it doesn't matter if the first one runs at 30ºC and the other one runs at 100ºC.
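
(For anyone who wants to check those numbers, here's a minimal sketch of the arithmetic in Python; the 100W and 200W figures are just the examples above.)

```python
# Minimal sketch: all of a TV's electrical draw ends up in the room as heat,
# so the heat output per hour follows directly from the wattage.
def kcal_per_hour(watts):
    joules_per_hour = watts * 3600        # 1 W = 1 J/s, over one hour
    return joules_per_hour / 4186.8       # 1 kcal = 4186.8 J

for w in (100, 200):                      # example wattages from the post
    print(f"{w} W -> about {kcal_per_hour(w):.0f} kcal of heat per hour")
# 100 W -> about 86 kcal of heat per hour
# 200 W -> about 172 kcal of heat per hour
```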

First we should define "heat". Heat is the kinetic energy the air possesses. The faster the molecules move, the harder the air "strikes" against your skin, and the hotter you feel. If a TV uses 200W, ALL 200W are being transformed into light/sound/whatever. Those waves eventually make the air, the walls, and your own molecules move faster, and that increases the temperature of the room.

You cannot create or destroy energy, at least not in a living room :P
post #42 of 163
Thread Starter 
Quote:
Originally Posted by Daviii View Post

Take into account that an IR thermometer can only tell you the temperature of one point, which is not the right way to measure the energy output of a device.

Right, and that is all we were talking about--HEAT, temperature.
post #43 of 163
Quote:
Originally Posted by Clint S. View Post

Right, and that is all we were talking about--HEAT, temperature.

I don't understand exactly your point. What we are talking about is "which produces more heat, CRT or plasma", and that's something that cannot be measured using a thermometer. The heat produced cannot be measured in terms of temperature, because it depends on the room, the insulation from the outside, and definitely on the point at which you measure that temperature.

The heat produced by any electric device is directly proportional to the wattage it needs to work. Depending on the dissipation system (fan or fanless) the heat is transferred more quickly or more slowly to the ambient air, but eventually the room, treated as a closed system, ends up with the same amount of added heat.

We could discuss whether the dissipation characteristics of the device, along with poor insulation of the room, produce a higher or lower perceived "heat", but to the question of "Which produces more heat?" there's only ONE answer: "The one that consumes more watts".
post #44 of 163
Quote:
Originally Posted by Clint S. View Post

Right, and that is all we were talking about--HEAT, temperature.

Heat and temperature are two different things. Daviii's post #41 explains this quite well.

jeff
post #45 of 163
Thread Starter 
Quote:
Originally Posted by Daviii View Post

I don't understand exactly your point.

http://www.avsforum.com/avs-vb/showt...5#post16611995
post #46 of 163
Thread Starter 
Quote:
Originally Posted by greenjp View Post

Heat and temperature are two different things. Daviii's post #41 explains this quite well.

jeff

http://www.avsforum.com/avs-vb/showt...5#post16611995

ENOUGH!!!!! You CANNOT GET temperature in degrees FROM A WATT METER!!
post #47 of 163
Thread Starter 
And since I was the one who started this thread, and it's old, and some people just insist on creating arguments where there are none, and refuse to read, I'm unsubscribing.
http://www.avsforum.com/avs-vb/showt...7#post16611147
post #48 of 163
I'm beating my head against a wall here, but for the education of anyone who cares to learn:

You state: "you can use an IR thermometer and get the actual numerical degrees and heat" and "An IR thermometer will give you the actual degrees heat output that's actually being put into a room." As we're trying to explain, this is 100% wrong. You can get a single temperature reading, but that tells you absolutely nothing about total heat output. The power input tells you the heat output.

You've said several times that you understand the relationship between power, heat, and temperature, but then you continue to have them all confused in your posts.

jeff
post #49 of 163
Quote:
Originally Posted by Clint S. View Post

http://www.avsforum.com/avs-vb/showt...5#post16611995

ENOUGH!!!!! You CANNOT GET temperature in degrees FROM A WATT METER!!

From your post:

"My point is: why use a watt meter and look at a display that only shows "354 watts" (or whatever the figure may be), and have to GUESS at the temperature in degrees that's being put out, when you can use an IR thermometer and get the actual numerical degrees and heat that's being put out"

Your point is invalid and simply twisted. With a watt meter you have a real representation of how many "degrees are being put out". With a thermometer you must guess and extrapolate from measurements at lots of points in order to get a valid approximation.

For example, the following picture is a PC case photographed in IR mode:

[IR thermal image of the PC case]

How much "degrees are being put out"? 19ºC or 62ºC? It depends on WHERE you place the thermometer, and how does the airflow cool the components. Just look how the motherboard is hotter than the CPU heatsink while the CPU is obviously heating the room orders of magnitude quicker. That's because there's a fan over the heatsink and the air takes the heat from the heatsink and spreads it all over the case, while keeping the CPU cool.

In this case, just looking at the power consumption of the PC, one can tell EXACTLY how much "energy is being put out". The "degrees" are not put out; they are just a representation of how much energy is concentrated at a given point.

Eventually, in a closed system, all that energy will be spread homogeneously, so it doesn't matter which component is hotter or cooler; thermodynamics is just something one must deal with, and definitely not something I've discovered.

And please, just stop this nonsense.
post #50 of 163
Is there a good website that lists the power consumption of TVs? For example, I was trying to find values for the Samsung LCDs. I downloaded the user manual, and for power consumption it said to read the sticker on the TV.

Edit: I see that there are some values on CNET.
post #51 of 163
So if my 32" Sony HD set (KV-32HS420) consumes 240 watts, then the Samsung LN52B750, drawing about 129 watts after calibration according to CNET, will produce a lot less heat for the room it's in (13'x17'). Although if I were getting that model, it would be either the 40" or 46".
post #52 of 163
Quote:
Originally Posted by ay221 View Post

So if my 32" Sony HD set (KV-32HS420) consumes 240 watts, then the Samsung LN52B750, drawing about 129 watts after calibration according to CNET, will produce a lot less heat for the room it's in (13'x17'). Although if I were getting that model, it would be either the 40" or 46".

Yes, it produces a lot less heat. Almost half the heat. It's normal, since CRT is much less efficient than LCD.
post #53 of 163
Quote:
Originally Posted by ay221 View Post

So if my 32" Sony HD set (KV-32HS420) consumes 240 watts, then the Samsung LN52B750, drawing about 129 watts after calibration according to CNET, will produce a lot less heat for the room it's in (13'x17'). Although if I were getting that model, it would be either the 40" or 46".

I wouldn't worry about the heat from the TV, as it's negligible in a room. I have a plasma and 2 LCDs, and none of them produce enough heat to affect the room temperature. Cable boxes, PS3s, and AV receivers produce much more heat in my setups.

This whole heat crap was stirred up mainly by one old poster who has an agenda to knock down plasma any way he can.
post #54 of 163
Quote:
Originally Posted by maxdog03 View Post

I wouldn't worry about the heat from the TV, as it's negligible in a room. I have a plasma and 2 LCDs, and none of them produce enough heat to affect the room temperature. Cable boxes, PS3s, and AV receivers produce much more heat in my setups.

This whole heat crap was stirred up mainly by one old poster who has an agenda to knock down plasma any way he can.

I disagree. I just bought a plasma for my bedroom, and it absolutely causes the room to warm up significantly. I will add the disclaimer, though, that I have it running in vivid mode with the break-in images running, so it's using more juice than it will when running calibrated. However, I still expect it to heat up the room in normal use, as it draws (according to CNET) 107 watts more power than the LCD that sits in my living room. I can report back when I get the break-in done.
post #55 of 163
Quote:
Originally Posted by ToBeFrank View Post

I disagree. I just bought a plasma for my bedroom, and it absolutely causes the room to warm up significantly. I will add the disclaimer, though, that I have it running in vivid mode with the break-in images running, so it's using more juice than it will when running calibrated. However, I still expect it to heat up the room in normal use, as it draws (according to CNET) 107 watts more power than the LCD that sits in my living room. I can report back when I get the break-in done.



Which model plasma and LCD are we talking about? Have you ever run your LCD under the same conditions in that bedroom? At a 107 watt difference, that will cost you about $1.97 per month more in electricity, and about $24.00 a year, and it's not going to make any difference in the heating of a room; otherwise it would be more efficient than any furnace available.
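
(A rough sketch of that cost arithmetic, for the curious; the hours-per-day and electricity rate below are my assumptions, not figures from the post, chosen to land near the $2/month quoted above.)

```python
# Rough sketch of the extra-operating-cost arithmetic.
extra_watts = 107          # difference between the two sets, from the post
hours_per_day = 6          # assumption
rate_per_kwh = 0.10        # assumption, in $/kWh

extra_kwh_per_month = extra_watts / 1000 * hours_per_day * 30
cost_per_month = extra_kwh_per_month * rate_per_kwh
print(f"~{extra_kwh_per_month:.1f} kWh/month, ~${cost_per_month:.2f}/month, "
      f"~${cost_per_month * 12:.2f}/year")
# ~19.3 kWh/month, ~$1.93/month, ~$23.11/year
```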
post #56 of 163
Quote:
Originally Posted by maxdog03 View Post

Which model plasma and LCD are we talking about?

Panasonic TC-P42S1
Samsung LN52B750

Quote:


Have you ever run your LCD under the same conditions in that bedroom?

No, the 52" is in the living room. I can stand in front of the plasma and feel the heat coming off of it and the back of the tv is quite warm. I can't feel any heat coming off my LCD and the back of the tv is cool to the touch. (Keep in mind... the plasma is running vivid for break-in).

Quote:


At 107 watts difference that will cost you about 1.97 per month more in electricity and about $24.00 a year and that's not going to make any difference in the heating of a room as otherwise that would be much more effecient than any furnace available.

As I said, at the moment the plasma is running vivid for break-in, so it's likely the difference right now is significantly more than 107 watts. This may all be a moot point when it's running in its calibrated state, but for now it most certainly heats up my bedroom. All one has to do is walk from the living room into the bedroom to notice the difference. They normally feel the same temperature.
post #57 of 163
Heat is measured in BTU/h, period. It is not measured by thermostats; that is just temperature. The number of watts used will tell you how many BTU/h are output. The way to figure it out is easy: watts used in an hour x 3.41 = BTU. Also, an extra ton of cooling, which isn't that much, is 12,000 BTU/h. So there are the scientific basics of how to see how much HEAT is output by your equipment.
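
(A minimal sketch of that conversion, using wattages quoted elsewhere in this thread as examples; 1 W is roughly 3.41 BTU/h, and a ton of cooling is 12,000 BTU/h.)

```python
# Sketch of the watts -> BTU/h conversion described above.
WATTS_TO_BTU_PER_HOUR = 3.412        # 1 W of continuous draw ~ 3.41 BTU/h
TON_OF_COOLING_BTU_PER_HOUR = 12_000

def btu_per_hour(watts):
    return watts * WATTS_TO_BTU_PER_HOUR

# Example wattages pulled from earlier posts in the thread.
for label, watts in [('42" plasma', 220), ('58" V10 (quoted)', 325),
                     ("space heater", 1500)]:
    btu = btu_per_hour(watts)
    print(f"{label}: {btu:.0f} BTU/h "
          f"({btu / TON_OF_COOLING_BTU_PER_HOUR:.2f} tons of cooling)")
# 42" plasma: 751 BTU/h (0.06 tons of cooling)
# 58" V10 (quoted): 1109 BTU/h (0.09 tons of cooling)
# space heater: 5118 BTU/h (0.43 tons of cooling)
```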
post #58 of 163
Quote:
Originally Posted by bigbare View Post

Heat is measured in BTU/h, period. It is not measured by thermostats; that is just temperature.

I don't care about the heat aspect, I only care if it affects the temperature in my bedroom. I couldn't care less if it increases the cost of air conditioning. It's a comfort issue, not a cost issue. So it most decidedly is measured by the thermostat.

Quote:


The number of watts used will tell you how many BTU/h are output. The way to figure it out is easy: watts used in an hour x 3.41 = BTU. Also, an extra ton of cooling, which isn't that much, is 12,000 BTU/h. So there are the scientific basics of how to see how much HEAT is output by your equipment.

The question is: does this plasma cause the temperature to rise noticeably in my bedroom? The answer is a most definite yes. The thermostat for the air conditioner is at the opposite end of the house, so the rise in temperature will not cause the air to kick on. From my perspective, the plasma puts out enough heat to cause a difference in the comfort level in my bedroom, period.
post #59 of 163
Perhaps you should read up a bit on this. In order for your temperature to rise you must increase the heat output to the room. In other words, you need to increase the BTU output. This is why the BTU output of a home, building, or room is used when calculating a cooling load, not the running temperature. So if you are truly concerned about your room heating, the BTU/h is what you want, not the temperature. Argue the point all you want but this is how it is done. I happen to be an HVAC technician, and when there is a media or server room that needs the temperature controlled, we need to figure out how many BTU/h the equipment puts out, not the temperature reading off the units.
post #60 of 163
Quote:
Originally Posted by bigbare View Post

Perhaps you should read up a bit on this.

I have no need to.

Quote:
In order for your temperature to rise you must increase the heat output to the room.

That's obvious... that would be from the plasma tv when it's turned on.

Quote:
So if you are truly concerned about your room heating, the BTU/h is what you want, not the temperature.

I'm not concerned about it. The tv is staying. The question was asked if having a plasma can noticeably increase the temperature of a room. It was stated that it cannot. I disagreed because it definitely does increase the temperature of my bedroom.

Quote:
Argue the point all you want but this is how it is done.

How am I arguing your point? I'm saying the plasma is outputting enough heat to increase the temperature in my bedroom. If you're saying it couldn't be doing that, then yeah, I'm disagreeing with you.