There’s nothing wrong with trying to save energy and lower your power bill, but unplugging your smartphone, tablet, and laptop chargers when they aren’t charging isn’t doing much. They’re not the energy vampires some people think they are.
If you’ve ever wondered, “How much is it costing me to leave my charger plugged in?”, Chris Hoffman at How-To Geek has your answer: next to nothing. Using a Kill A Watt meter, Hoffman tested chargers for the iPhone, iPad, MacBook, Chromebook, Windows laptops, and Android phones. Individually, each charger registered a big 0.0 watts of power usage. Then, just to be thorough, Hoffman filled an entire power strip with six different chargers and got a total reading of 0.3 watts.
Even assuming you pay the highest electricity rate in the country, leaving them plugged in for the entire year would cost you less than a dollar. And that assumes you never unplug them at all, for travel or for moving around your home, over the year’s 8,760 hours. So go ahead, keep your chargers plugged in if you want. That said, there are still some energy vampires around your home you should keep an eye out for. Also, everything changes when your charger is actually connected to your device, so it’s still not a good idea to keep your devices perpetually charging. The whole experiment is worth a read, so check it out at the link below.
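To put numbers on that “less than a dollar” claim, here’s a quick sketch. The 0.3 W figure and 8,760 hours come from the article; the electricity rate is an assumption on my part, roughly the highest US residential rate (Hawaii) at the time:

```python
# Yearly cost of leaving six chargers (0.3 W combined) plugged in.
# RATE_PER_KWH is an assumed figure, approximately Hawaii's
# residential rate, the highest in the US.
WATTS = 0.3
HOURS_PER_YEAR = 8760
RATE_PER_KWH = 0.37  # assumed, USD per kilowatt-hour

kwh_per_year = WATTS * HOURS_PER_YEAR / 1000  # watt-hours -> kWh
cost = kwh_per_year * RATE_PER_KWH
print(f"{kwh_per_year:.3f} kWh/year -> ${cost:.2f}/year")
```

Even at that worst-case rate, the total comes out under a dollar for the whole year.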
Tested: Should You Unplug Chargers When You’re Not Using Them? [How-To Geek]
Comments
15 responses to “Don’t Stress About Unplugging Your Wall Chargers When They’re Not In Use”
I used to be really worried about leaving my chargers plugged in, but after testing with a power meter I discovered that most register 0.0 watts. My worst charger is from a no-name Android tablet, which comes in at a whopping 0.3 watts. It’s nice to have one less thing I need to concern myself with.
Multiply that by however many millions of chargers there are in the world… Yeah, time to grow up and think broader sometimes.
Thanks for repeating exactly what the article had already told us. You da real mvp.
Most people don’t realise that it’s the ‘load’ on the power outlet that uses electricity. Chargers should be zero load when they’re not charging anything unless they’re poorly made and have inefficient coils in them, creating load and causing them to heat up.
Of course if your charger has any kind of LED on the circuit that stays on, then you’ll be wasting electricity, albeit quite little.
Switch mode power supplies will always have small amounts of power consumption even when under no load. The supply is still running.
http://www.zl2pd.com/img/smps/smps_block.gif
All those sections still run with no output load, so they do still draw tiny amounts of power.
Transformer-based supplies will also draw a small amount of current, due to inefficiencies in the components.
Overall, for the individual it’s a complete non-issue, but multiplied across however many billion power supplies there are in the world, that small amount draining adds up to a lot.
Though I’ve not actually tested it, I’d suspect that switch mode power supplies would draw considerably less power under no load than transformer supplies. One indication of power draw is temperature. Switch mode supplies are consistently cooler at rest than transformer supplies. The higher the potential current delivery the higher the no-load draw for transformers. Put your hand on a large transformer supply then do the same on an equivalent current capacity switch mode supply and you notice a significant difference. I’ve tried to convert all my transformer supplies to switch mode as they’re generally more efficient all round.
The higher the potential current delivery the higher the no-load draw for transformers.
Very nice generalisation there; I’d love to know where you got that from.
Or I’ll give you an example and you can tell me where the extra draw comes from.
One of my recent builds was a high-current variable supply, 5–30 V @ 20 A.
I used a simple three-terminal regulator with external bypass transistors. The transistors switch on once there is 500 mA of draw through the regulator, a typical current-boost circuit for three-terminal regulators, so when there is less than 500 mA of draw the transistors are completely switched off. I can easily vary the total current output by adding or subtracting transistors, along with a suitable transformer change. The quiescent current through the regulator is the same no matter how many transistors I piggyback onto it, so how does adding more transistors, and therefore increasing current capacity, increase its total no-load draw?
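The commenter’s point can be sketched numerically. The quiescent current and input voltage below are assumed, plausible figures (not from the comment); the 500 mA boost threshold is the commenter’s. At no load the pass transistors are off, so the device count never enters the calculation:

```python
# No-load draw of a 3-terminal regulator with current-boost transistors.
# QUIESCENT_A and INPUT_V are assumed, typical figures for illustration.
QUIESCENT_A = 0.005  # assumed regulator quiescent current, amps
INPUT_V = 35         # assumed unregulated input voltage, volts

def no_load_watts(num_pass_transistors: int) -> float:
    # Below the ~500 mA boost threshold the pass transistors are
    # completely off, so their count has no effect at no load.
    return INPUT_V * QUIESCENT_A

for n in (0, 1, 4):
    print(f"{n} pass transistors: {no_load_watts(n):.3f} W at no load")
```

Whatever the exact quiescent figure, the function’s argument is unused, which is the whole argument: more current capacity, same no-load draw.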
I am more worried about spoiling the charger, or the charger burning and causing fire 🙂
Yeah, the fire risk and the danger to little ones and pets was the main reason I always thought power strips should be switched off.
Wait, did he test this with the devices plugged into the chargers but turned off, or just the chargers on their own?
I often leave my laptops plugged in for extended periods while not in use and I’d be stunned if they’re not drawing any charge at all like this says – at the very least they’ll be trickling in some charge to keep the battery topped up.
I am 100% certain “leaving the charger plugged in” does not include “while my laptop is plugged into it” 🙂
This is me wearing the Dad hat. But I unplug all of mine when not in use for fear the little one may grab the cable and put it in their mouth.
This happened recently to someone’s child and was in the news. So yes, if you have young kids or curious pets, keep your chargers well out of reach when they’re plugged in.
Let’s say there are 10 million of these chargers in the world.
0.3 W × 3.154e+7 s (seconds in a year) × 1e+7 (chargers) = 9.4e+13 J of energy per year per 10 million chargers.
That’s the same order of magnitude as the total fuel energy in one full tank of an A380. I.e. tiny.
In comparison, the U.S. alone in 2009 used 1.4×10^19 J. If those 10 million chargers had been left unplugged for the whole year, the U.S. would have saved 0.0007% of its energy consumption that year.
This is such a small amount that unplugging is effectively pointless; there are much better ways to save power. In fact, seeing as the calculations were only correct to two significant figures, a 0.0007% saving is within the margin of error and could in fact be much, much smaller.
Regards
Someone who thinks smarter rather than broader.
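The arithmetic in that comment checks out; here’s a sketch using the commenter’s own figures (0.3 W per charger, 10 million chargers, 1.4×10^19 J of US energy use in 2009):

```python
# Reproduce the commenter's figures: 10 million chargers drawing
# 0.3 W each, compared against 2009 US energy consumption.
SECONDS_PER_YEAR = 3.154e7
CHARGERS = 1e7                 # 10 million, the commenter's estimate
US_2009_JOULES = 1.4e19        # figure quoted in the comment

joules_per_year = 0.3 * SECONDS_PER_YEAR * CHARGERS  # ~9.46e13 J
fraction = joules_per_year / US_2009_JOULES
print(f"{joules_per_year:.2e} J/year = {fraction * 100:.4f}% of 2009 US use")
```

The fraction lands at about 0.0007%, matching the comment.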
Current estimates say there are around 4.3 billion mobile phones in the world at the moment, so a few more than 10 million chargers in use, and that’s phones only. When you look at every device that uses standby power: I’m in my study and I can see 10 devices that use standby power, and that’s just one room. In my whole house, there are probably 30.
Take Australia: at the last census there were 7.76 million households, times 30 devices, so roughly 233 million devices for Australia alone.
World-wide, your 0.0007% could quickly rise, maybe even toward 1%, and seeing as world-wide we burn the equivalent of 4,000 million tonnes of oil a year, 1% is a hell of a lot.
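These scale-up figures can be checked in a few lines. All counts below are the commenters’ estimates (7.76 million Australian households, 30 devices each, 4.3 billion phones world-wide), and the conversion of one tonne of oil equivalent to roughly 41.9 GJ is my assumption, not from the thread:

```python
# Scale up the commenters' figures. All device counts are the
# commenters' rough estimates, not measured data.
aus_households = 7.76e6
devices_per_household = 30
aus_devices = aus_households * devices_per_household
print(f"Australia: about {aus_devices / 1e6:.0f} million standby devices")

# World-wide phone chargers (4.3 billion, per the comment) at 0.3 W
# each, against ~4,000 million tonnes of oil equivalent burned per
# year. 1 toe ~ 4.19e10 J is an assumed standard conversion.
charger_joules = 0.3 * 3.154e7 * 4.3e9
world_joules = 4000e6 * 4.19e10
print(f"Chargers: {charger_joules / world_joules * 100:.3f}% of world energy use")
```

On these numbers the phone-charger share comes out well under 1%, though the per-household device counts would push it higher; either way the totals are rough.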