### How BTUs and EERs Work

Most air conditioners have their capacity rated in BTUs, or British Thermal Units. A BTU is, roughly speaking, the amount of heat required to raise the temperature of one pound of water one degree Fahrenheit. More precisely, a BTU is about 1,055 joules, but the first definition is easier to understand in real-life terms. One “ton”, in heating and cooling terms, is a cooling rate of 12,000 BTU per hour. A typical window air conditioner that you find at a local retailer might be rated at 10,000 BTU. What that means is that the air conditioner has the ability to cool 10,000 pounds of water (about 1,200 gallons) one degree in one hour. Or it could cool 5,000 pounds 2 degrees in one hour, or 2,500 pounds 4 degrees in one hour, and so on.
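The pound-degree arithmetic above can be sketched in a few lines of Python, using the conversion constants from the text:

```python
BTU_PER_TON_HOUR = 12_000   # one "ton" of cooling = 12,000 BTU per hour
JOULES_PER_BTU = 1_055      # approximate conversion from the text

capacity_btu_per_hour = 10_000  # a typical window unit

# One BTU raises (or removes from) one pound of water one degree F, so a
# 10,000 BTU/h unit has 10,000 pound-degrees of water cooling per hour.
pound_degrees_per_hour = capacity_btu_per_hour

# The same hourly budget split different ways:
for pounds in (10_000, 5_000, 2_500):
    degrees = pound_degrees_per_hour / pounds
    print(f"{pounds:>6} lb of water cooled {degrees:g} degrees F per hour")
```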

Not many of us live in aquariums, so knowing how much water an air conditioner can cool is not much use. To get a very rough idea of how much air can be cooled, start with the fact that a cubic foot of water weighs about 62.4 pounds, while a cubic foot of air weighs only about 0.075 pounds. Water is roughly 800 times denser than air, so about 13 cubic feet of air weigh a pound. A typical bedroom contains about 1,000 cubic feet of air, or roughly 75 pounds of air. Air is also easier to heat and cool than water: its specific heat is only about 0.24 BTU per pound per degree F. That means cooling the 75 pounds of air in that bedroom by 10 degrees takes under 200 BTU, something a 10,000 BTU-per-hour unit, in a perfectly insulated room, could do in about a minute. No room is perfectly insulated (in fact many rooms have little or no insulation), but what that tells you is that you probably do not need a 10,000 BTU air conditioner for a typical 10′ x 12′ bedroom. For comparison, you can happily cool an insulated 2,000 square foot house with a 5 ton (60,000 BTU per hour) system, which works out to about 30 BTU per square foot. Keep in mind that these are all rough estimates, and you should not rely on any of this information to size your home’s air conditioner: ask an HVAC contractor.
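Here is that back-of-the-envelope room estimate as a sketch. The densities, the specific heat of air, and the assumed 8-foot ceiling are approximations chosen for illustration, not sizing advice:

```python
WATER_LB_PER_FT3 = 62.4   # approximate density of water
AIR_LB_PER_FT3 = 0.075    # approximate density of air at sea level
AIR_SPECIFIC_HEAT = 0.24  # BTU per pound per degree F, approximate

room_ft3 = 10 * 12 * 8                # 10' x 12' bedroom, assumed 8' ceiling
air_lb = room_ft3 * AIR_LB_PER_FT3    # roughly 72 pounds of air

# BTU needed to pull the room's air down 10 degrees F, ignoring
# insulation, walls, furniture, people, and humidity:
btu_needed = air_lb * AIR_SPECIFIC_HEAT * 10

unit_btu_per_hour = 10_000
minutes = btu_needed / unit_btu_per_hour * 60
print(f"{btu_needed:.0f} BTU needed; about {minutes:.1f} minutes of run time")
```

The answer comes out near one minute, which is why a 10,000 BTU unit is overkill for a small, reasonably insulated bedroom.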

The EER (Energy Efficiency Ratio) of an air conditioner is its BTU-per-hour rating divided by its wattage. For example, if a 10,000 BTU air conditioner consumes 1,200 watts, its EER is 10,000 / 1,200 ≈ 8.3. Obviously you would like the EER to be as high as possible, but normally a higher EER is accompanied by a higher price. How do you decide if the higher EER is worth it?
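The ratio itself is trivial to compute; here is a one-function sketch using the numbers from the text:

```python
def eer(btu_per_hour: float, watts: float) -> float:
    """Energy Efficiency Ratio: cooling output (BTU/h) over electrical input (W)."""
    return btu_per_hour / watts

print(round(eer(10_000, 1_200), 1))  # roughly 8.3
print(eer(10_000, 1_000))            # 10.0
```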

Let’s say that you have a choice between two 10,000 BTU units. One has an EER of 8.3 and consumes 1,200 watts and the other has an EER of 10 and consumes 1,000 watts. Let’s also say the price difference is $100. To understand what the payback period is on the more expensive unit you need to know:

- Approximately how many hours per year you will be operating the unit
- How much a kilowatt-hour (kWh) costs in your neighborhood

Let’s say that you plan to use the air conditioner in the summer (4 months a year) and that it will operate about 6 hours a day. Let’s also imagine that a kilowatt-hour costs 10 cents in your neighborhood. The difference in energy consumption between the two units is 200 watts, which means that every 5 hours the less expensive unit will consume one more kWh (and therefore one more dime) than the more expensive unit. Assuming there are 30 days in a month, you find that during the summer you are operating the air conditioner 4 months * 30 days/month * 6 hours/day = 720 hours. 720 hours * 200 watts = 144,000 watt-hours, or 144 kWh; at $0.10 per kWh, that comes to $14.40 per year. Since the more expensive unit costs $100 more, it will take about 7 years for the more expensive unit to break even.
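The whole payback comparison fits in a short script. The prices, rates, and usage hours are the hypothetical values from the example, not real figures:

```python
# Payback comparison for the two hypothetical 10,000 BTU units above.
PRICE_DIFFERENCE = 100.0         # extra dollars for the higher-EER unit
WATT_DIFFERENCE = 1_200 - 1_000  # watts saved by the higher-EER unit
RATE_PER_KWH = 0.10              # assumed electricity price, dollars per kWh

hours_per_year = 4 * 30 * 6      # 4 months * 30 days * 6 hours/day = 720
kwh_saved = hours_per_year * WATT_DIFFERENCE / 1_000  # kWh saved per year
dollars_saved = kwh_saved * RATE_PER_KWH              # dollars saved per year
payback_years = PRICE_DIFFERENCE / dollars_saved

print(f"Saves ${dollars_saved:.2f}/year; payback in about {payback_years:.1f} years")
```

Rerunning the script with your own electricity rate and expected hours of use tells you quickly whether the higher-EER unit pays for itself over its lifetime.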