### More efficient than a 110 volt plug-in heater?

Would a 220 volt baseboard heater save me energy, compared to the two 110 volt plug-in heaters I’m currently using to heat our kitchen addition? My husband claims it will, but I can’t see why. Are 220 volt heaters more efficient? Are baseboard heaters more efficient than plug-in heaters?

### Answer from Green Energy Efficient Homes

You’re absolutely right: you won’t get any improvement in efficiency by switching from 110 volt plug-in heaters to 220 volt baseboard heaters. The advantage of a 220 volt heater is that it can provide greater heat output (in BTU or watts) than a typical 110 volt plug-in heater. That is mainly because 220 volt heaters are installed on dedicated circuits with enough amperage (usually 30 or 50 amps) to convert more electrical power to heat. But for a given amount of heat output, the baseboard heater will consume exactly the same amount of electricity as two plug-in 110 volt heaters.

Watts measure the capacity of an electric heater (the power it draws at any given moment – for example, a 1500 watt heater), while its electricity consumption over time is measured in watt hours (for example, 1.5 kilowatt hours for a 1500 watt heater running at full blast for an hour). 110 volt circuits are usually wired for 20 amps (sometimes 15 in older houses). Since watts = amps x volts, a 15 amp circuit at 110 volts can handle a 1650 watt load, so a 1500 watt heater can safely be plugged into such an outlet. Compare this to a typical 220 volt circuit installed for baseboard heaters. Even if wired at 20 amps (which should never be done in practice), such a circuit could handle 20 amps x 220 volts, or 4400 watts of power; but the circuits for a 220 volt baseboard heater are typically wired at 30 to 50 amps, which means they can handle significantly higher power output. This is why you can get a lot more heat out of a 220 volt baseboard heater.
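The circuit capacity arithmetic above can be sketched in a few lines of code (the breaker sizes are the typical values mentioned, not a wiring recommendation – remember that a 220 volt heater should never actually be put on a 20 amp breaker):

```python
# Watts = amps x volts: the maximum load a circuit can carry.
def max_watts(amps, volts):
    return amps * volts

print(max_watts(15, 110))   # 1650 W -- a 1500 W plug-in heater fits on this circuit
print(max_watts(20, 220))   # 4400 W -- for the math only; never wire a 220 V heater this way
print(max_watts(30, 220))   # 6600 W -- a typical dedicated baseboard-heater circuit
```

The same 1500 watt heater draws 1500 watts and produces 1500 watts of heat on either circuit; the higher-voltage circuit simply has room for a much bigger heater.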

Your husband is mistaken if he thinks that a 220 volt baseboard heater of a given wattage is more efficient than a 110 volt heater of the same wattage just because it draws a lower amperage. If they have the same wattage they produce the same heat output for the same electrical input.

Because any electrical resistance heater is 100% efficient at converting electricity into heat (there is no other work the electricity can be put to in an electric heater, other than producing heat), either choice will cost about the same in terms of electricity consumption. My page on Energy efficient electric heaters explains this efficiency issue in more detail.

Whether you change to a 220 volt heater or not is therefore a question of aesthetics, capacity, or other issues, not energy efficiency. If you don’t want the clutter of two bulky space heaters in your kitchen addition, a baseboard heater will reduce that bulk significantly and you’ll eliminate the risk of knocking a space heater over by accident (although many modern space heaters come with an automatic shutoff if they are toppled). If the two space heaters are doing an adequate job already you won’t need the extra capacity of a 220 volt heater, but if they are struggling to keep the room warm, a 220 volt baseboard heater will help. The other thing to consider is that installing a 220 volt heater requires extra wiring and an extra 220 volt circuit on your breaker panel, which can add significant costs if you hire an electrician to do the job. (If you’re handy yourself and have experience doing wiring, it’s not that difficult a job, but be sure to understand the risks first.)

Of course, the most energy efficient solution is to find a way to avoid using any kind of electrical heater, since electrical heat is the most expensive form of home heating. Anything you can do to improve insulation levels in your addition will help. In our house we had a kitchen addition that was built in 1995 and to keep costs down the owners at that time just installed a 220 volt baseboard heater on an outside wall. That was cheaper than running an extra duct from the furnace main. When we replaced the furnace in 1998 we got the installers to run an extra duct through the crawl space (insulated, of course) and out the addition floor, and we were able to remove the baseboard heaters and stay comfortable. Later on I thoroughly insulated the crawl space walls, and that reduced the amount of heat escaping out the floor, which allowed us to close off the addition vent and still stay warm. Finally, in 2010 we tore down the addition and built a full-width addition (10×20 feet) and because we made sure it was very well insulated, it stays plenty warm there even on -20F days, without any extra electric heaters or even an extra forced air heating register.

### Leave a Reply


This is false information. A 220v is in fact more energy efficient, and as for your statement about a 220v being put on a 20 amp breaker – that would cause a fire quickly if the breaker failed to shut off. All 220v heaters run on either a 30 amp or 50 amp breaker.

In fact, as I point out, electric heaters are all 100% energy efficient because they convert all their electricity to heat. The only way in which a 220v heater would be more efficient than a 110v is that a higher voltage wire at a given gauge transfers more of the electricity to the device, and less of that electricity is lost as heat due to resistance in the wire. If you have a wire running to a heater through an exterior wall, some of the extra heat produced in that wire may be lost to the outside. For the length of wire involved the difference between the 110v and 220v wire, for a heater of the same wattage, is pretty minor – under 1% for a 100 foot length of typical wire.
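The wire-loss comparison can be sketched as follows. The resistance figure is an assumed, illustrative value for a short in-wall run of copper wire, not a spec for any particular gauge; the point is the ratio, not the absolute numbers:

```python
# Resistive loss in the supply wire: P_loss = I^2 * R, where I = P / V.
def wire_loss_watts(heater_watts, volts, wire_ohms):
    current = heater_watts / volts      # amps drawn by the heater
    return current ** 2 * wire_ohms    # watts dissipated in the wire itself

R = 0.16  # ohms -- assumed round-trip resistance of the run (illustrative only)
loss_110 = wire_loss_watts(1500, 110, R)
loss_220 = wire_loss_watts(1500, 220, R)

# Doubling the voltage halves the current, so wire loss drops by a factor of 4:
print(loss_110 / loss_220)   # 4.0
```

Since the current is squared in the loss formula, the 220v run always dissipates one quarter of what the 110v run does for the same heater wattage and wire – and even that small loss mostly ends up as heat inside the house anyway.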

I already stated that most 220v heaters are connected to a 30 or 50 amp breaker. My example of the 20 amp breaker was just to show the math around total heat output in the 110v and 220v scenarios. I’ve added a note to remind people not to connect a 220v heater to a 20 amp breaker. Thanks for pointing that out.

Two 110 volt heaters set at 1500 watts max for heating would be 1500/110 = 13.6 amps each.

Two 220 volt heaters set at 1500 watts max for heating would be 1500/220 = 6.8 amps each.

How does that have the same efficiency?

Amperage is a measure of current. Energy efficiency is not related to current, it’s related to the amount of power used.

If you have a source of power at 3000 watts running for one hour, you consume 3 kilowatt hours of energy. You pay for those 3 kilowatt hours based on the $/kWh rate.

Whether that 3000 watts is transmitted/consumed at 110 volts (resulting in 27.2 amps of current) or 220 volts (resulting in 13.6 amps of current) doesn’t change the amount of power used (except that, for the brief distance from your breaker panel to the heater, the 110 volt wire will have a slightly higher power loss to heat, but most of that heat is inside your walls so will stay in your house). The only thing that affects energy efficiency is how much of the power from the source becomes heat at the output. In this case, both the 110 volt and 220 volt heaters convert 100% of their power to heat, so 3000 watts running for an hour produces 3 kWh worth of heat. Thus the efficiency of the heater is independent of the voltage or amperage of the circuit; it’s the wattage that determines the power consumption. And since 100% of the electricity flowing to the heater turns into heat (either directly, or eventually – e.g. the circulation from a fan does eventually become heat), they both have the same efficiency.
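The billing arithmetic can be sketched like this – notice that voltage never even appears in the calculation, which is the whole point (the $0.12/kWh rate is an assumed illustrative figure, not anyone’s actual tariff):

```python
# Energy billed = power x time; the meter charges for kWh, not amps or volts.
def kwh(watts, hours):
    return watts * hours / 1000.0

def cost(watts, hours, rate_per_kwh):
    return kwh(watts, hours) * rate_per_kwh

# A 3000 W load running for one hour, at an assumed rate of $0.12/kWh:
print(kwh(3000, 1))          # 3.0 kWh -- the same whether wired for 110 V or 220 V
print(cost(3000, 1, 0.12))   # about $0.36 either way
```

A 110 volt circuit would deliver those 3000 watts at 27.2 amps and a 220 volt circuit at 13.6 amps, but the meter registers 3 kWh either way.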

What’s with the academic discussion? For a given amount of heat (wattage), a 220 volt heater is half the cost to run of a 110 volt heater – simply because an electricity meter measures current flowing.

That’s not correct – the cost is identical regardless of the voltage of the heater. An electricity meter may measure current flowing, but it also knows the voltage and the result it computes is kilowatt hours, not current. You are charged based on kilowatt hours consumed. Since heat output is measured in watts and electricity is measured in kilowatt hours, the voltage has no effect on how much you are charged. Just check what’s on the front of the meter, or on your electricity bill – it’s kWh. You’re not charged for amp hours.

You missed the point: 220 volt is cheaper to run. When you buy electricity you pay for each kW from both 110 volt legs. If you don’t use both legs at the same time, you are paying for both but only getting the electricity from one leg. Therefore it is always cheaper to use 220 volt – that way you use all of the electricity you pay for. That is why electricians do load balancing in the house, spreading the 110 volt circuits over both legs to try to maximize power usage. A 220 volt appliance uses both legs, so it is already maximized.

That’s not correct. Your electricity is billed by kilowatt hours. Watts = amps x volts. So if you are producing 1500 watts on 110 volts, you are using 1.5 kWh per hour. If you are producing 1500 watts on 220 volts, you are still using 1.5 kWh per hour.

If you don’t use both 110 volt legs at the same time, you are not paying for both. You are only charged for the one leg you are using. Electricians balance loads so that you don’t draw all your power from one of the two 110 volt legs. If you have 100 amp service, you would want to draw 50 amps off one leg and the other 50 amps off the other 110 volt leg to balance your available power. I’m not alluding to which voltage draw (110v vs 220v) is more efficient – just giving my 2 cents on the previous comment.

But again, you aren’t charged for volts; you’re charged for kilowatt hours, which is a measure of amps x volts x time. So it really doesn’t matter whether you’re using 110V or 220V – the meter measures kWh.

The only thing I know is I have electric heat and had 110 volt baseboard heaters. I changed them all to 220v and my bill dropped by 50%. All the heaters were rated the same wattage when they were 110v. So seeing is believing: 220v versus 110v is no contest – 220v is cheaper than 110v.