View Full Version : Cost of Removing Extra Heat in Summer



wle
29-06-2010, 02:59 PM
> + say you have a house
> + it has a normal AC
> + it's 90 degrees outside
> + normal roof, insulation etc
> + say the house is 2400 sq feet, it has a 4 ton central air
> + inside temp is set at 75F
> + normal sunshine
> +
> + here is the question:
> + how much extra does the AC cost if you add a 100 watt heat source in the house?
> +
> + state other conditions or assumptions to reach the answer

yes i know there are a lot of variables

just a ball park answer

maybe just something like

"if the AC is SEER 15, it is 50% efficient, so it will cost 200 watts to get 100 watts of heat out"

thx

wle

tbirdtbird
29-06-2010, 05:17 PM
Simpler to just tell the kids to turn the lights off, or if they insist on leaving them on put in smaller bulbs

Brian_UK
29-06-2010, 11:14 PM
College question is it?

Seems to be a second-hand copy of an email. ;)

cool runings
30-06-2010, 12:41 AM
Well Hello Wle.

Welcome to the site.

Nice introduction.

How about you tell us your calculations and conclusions, and maybe we will comment.

coolrunnings


wle
30-06-2010, 01:47 AM
How about you tell us your calculations and conclusions, and maybe we will comment.


i don't have any, that is what i am asking you guys for

not a college final, just wondering what it costs if you forget to turn the lights off

wle

wle
30-06-2010, 01:48 AM
is it really that hard to figure out?

wle

taz24
30-06-2010, 10:53 AM
is it really that hard to figure out?

wle


No, it is easy to work out if you know how :D and the cost for a 100 watt source would be low, but I think the slow response is more to do with the way you asked the question.

Looks more like a demand than a polite request.


taz


dougheret0
30-06-2010, 03:29 PM
A continuous 100 watt source operating in a conditioned space for one month (720 hours) will add a total of 72 kWh to the space. 72 kWh is equivalent to about 246,000 Btu. If the SEER is 15, the AC can remove 15 Btu for every watt-hour of power expended, so the energy expended will be 246,000/15, or about 16.4 kWh. If the cost of power is 10 cents per kWh, the extra AC cost of running a 100 watt source for a month will be about $1.64 USD.

A tutorial: EER (and SEER) = Btu/h of heat removed / watts expended. One watt = 3.413 Btu/h, so EER = (watts of heat removed x 3.413) / (watts input to the AC unit). The month's 72 kWh works out to about 246,000 Btu, so a SEER of 15 means 246,000 Btu removed per 16.4 kWh of AC input.

Note that the energy used, in kWh, is independent of the time taken to remove the heat. If the entire 72 kWh of heat were removed in one hour, the energy used would still be about 16.4 kWh; if, as in the example, it is removed at a rate of 100 watts over 720 hours, the total is the same 16.4 kWh.

Note also that the size of the house and AC unit are irrelevant. The only relevance of the outdoor temperature is that this would be summer, with the system always in cooling mode, trying to maintain a set temperature.
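
For anyone who wants to check the arithmetic, here is a minimal Python sketch of the same calculation (the SEER of 15 and the $0.10/kWh rate are the assumptions stated above, not universal figures):

# Extra AC cost of a continuous 100 W heat source over one month,
# following the arithmetic above. Assumes SEER 15 and $0.10/kWh.
WATTS = 100.0        # heat source, W
HOURS = 720.0        # one month
SEER = 15.0          # Btu removed per Wh of AC input
BTU_PER_WH = 3.413   # conversion: 1 Wh = 3.413 Btu
RATE = 0.10          # dollars per kWh

heat_kwh = WATTS * HOURS / 1000.0           # 72 kWh added to the space
heat_btu = heat_kwh * 1000.0 * BTU_PER_WH   # about 246,000 Btu
ac_kwh = heat_btu / SEER / 1000.0           # about 16.4 kWh of AC input
print(f"AC energy: {ac_kwh:.1f} kWh, removal cost: ${ac_kwh * RATE:.2f}")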

wle
30-06-2010, 09:41 PM
good answer!!!

THANKS
WLE

kathysierra
01-07-2010, 02:28 PM
Hello, your answer is very nice and is based on sound technical calculation.
It is very helpful for the community; air conditioning should indeed be dealt with extra care.

jjakucyk
04-07-2010, 03:21 AM
I guess this is splitting hairs since the example is an incandescent lamp, but does 100 watts in really equal 100 watts out? Of the 100 watts of input, "only" about 98 watts are released directly as heat, while the remaining two are visible light. Those two watts of light do eventually break down (such as it is) into heating the surfaces they illuminate, but in any typical scenario some light is lost out the window, so the load on the A/C would be very slightly less than 100 watts.

I suppose the same could be said of motors, in that the motion breaks down to heat through friction. Still, if you were able to do this exercise in a completely controlled environment, would you find that ALL the work done by all electrical or electromechanical devices eventually degenerates to an equal amount of heat as the device's input power? Something tells me this isn't the case, even if the actual amount of work that doesn't degenerate to heat is very very small.
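
To put a number on that hair-splitting, a minimal Python sketch (the roughly 2% visible-light figure is from the paragraph above; the fraction of light escaping through a window is a made-up assumption for illustration):

# Effective cooling load from a 100 W incandescent lamp when some of its
# visible light escapes through a window instead of heating the room.
LAMP_W = 100.0
VISIBLE_FRACTION = 0.02   # ~2 W emitted as visible light (figure from the post)
ESCAPE_FRACTION = 0.5     # hypothetical: half of that light exits a window

load_w = LAMP_W * (1.0 - VISIBLE_FRACTION * ESCAPE_FRACTION)
print(f"Effective A/C load: {load_w:.0f} W")  # prints 99 W, barely under 100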