Wireless charging has an efficiency issue

Shawn Knight

Editor's take: Modern flagship smartphones are increasingly turning to wireless charging as a handy alternative to plugging in a physical cord to juice up. While the feature can be convenient, two major shortcomings - efficiency and range - will likely need to be addressed before wired chargers become obsolete.

On paper, wireless charging has its benefits. Being able to simply plop your phone down on a mat to recharge eliminates the potential for wear and tear on charging ports and cables. Others favor it due to sheer convenience.

But according to a recent report published by Medium’s OneZero, wireless charging is far less efficient at delivering power than you may realize.

Eric Ravenscraft used a Google Pixel 4 to test multiple wireless chargers and compared their efficiency to a standard block charger with a wired connection. To measure power draw, he used a “high-precision power meter” that sat between the charging block and the power outlet.

Charging from completely dead to 100 percent with a cable used an average of 14.26 watt-hours (Wh). With a flat Yootech wireless charger, Ravenscraft said a full recharge consumed an average of 21.01 Wh, more than 47 percent more energy.

Worse yet, power consumption increased even further when the phone wasn't perfectly aligned on the charger.

Results were a little better with Google’s official Pixel Stand charger, as it eliminates the possibility of vertically misaligning the phone during charging. In testing, the Pixel 4 consumed an average of 19.8 Wh. Still, that’s nearly 39 percent more power versus using a charging cable.

Also worth noting is that both wireless chargers consumed a small amount of power even when no phone was being charged. Over a 24-hour period, this standby power draw amounted to around six Wh. In comparison, the standard charging cable didn't exhibit any measurable standby power draw.
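For those who want to check the math, here's a quick back-of-the-envelope sketch of the figures above (a minimal illustration in Python, not part of OneZero's actual methodology):

```python
# Sanity check of the charging figures reported above.
WIRED_WH = 14.26        # full charge over a cable (from the article)
YOOTECH_WH = 21.01      # full charge on the flat Yootech pad
PIXEL_STAND_WH = 19.8   # full charge on Google's Pixel Stand

def overhead_pct(wireless_wh: float, wired_wh: float = WIRED_WH) -> float:
    """Extra energy used versus wired charging, as a percentage."""
    return (wireless_wh - wired_wh) / wired_wh * 100

print(f"Yootech overhead:     {overhead_pct(YOOTECH_WH):.1f}%")      # ~47.3%
print(f"Pixel Stand overhead: {overhead_pct(PIXEL_STAND_WH):.1f}%")  # ~38.8%

# Standby draw: ~6 Wh per 24 hours is a 0.25 W trickle, or about
# 2.2 kWh per charger per year if it stays plugged in.
STANDBY_WH_PER_DAY = 6
print(f"Average standby draw: {STANDBY_WH_PER_DAY / 24:.2f} W")
print(f"Annual standby:       {STANDBY_WH_PER_DAY * 365 / 1000:.1f} kWh")
```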

The extra energy consumed by charging one phone wirelessly versus over a cable is equivalent to leaving one extra LED light bulb on for a few hours. It might not even register on your power bill. At scale, however, it can turn into an environmental problem.

To get an idea of what it would look like at scale, Ravenscraft consulted with the crew over at iFixit.

Arthur Shi, a technical writer for the repair specialist, said that “at 100% efficiency from wall socket to battery, it would take about 73 coal power plants running for a day to [wirelessly] charge the 3.5 billion smartphone batteries [in the world] once fully.” Now assume everyone placed their phone on the charger incorrectly and efficiency was cut in half: you'd suddenly need double the number of power plants to charge all of those batteries.
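The doubling claim is just the inverse relationship between wall-to-battery efficiency and the energy drawn from the wall. A minimal sketch, using the quoted 73-plant figure as the perfect-efficiency baseline:

```python
# Plants needed scale inversely with charging efficiency.
BASELINE_PLANTS = 73  # iFixit's estimate at 100% efficiency, per the quote above

def plants_needed(efficiency: float) -> float:
    """Coal plants running for a day to charge every phone once."""
    return BASELINE_PLANTS / efficiency

print(plants_needed(1.0))  # 73.0  - perfect wall-to-battery transfer
print(plants_needed(0.5))  # 146.0 - efficiency halved, plant count doubled
```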

Image credit: Nor Gal, Andrey_Popov


 
I have a wireless charger hooked up to a Choetech 100W wall adapter over USB-C.

I also have a USB-C to Lightning cable coming from that adapter.

If I am doing anything with my phone like listening to SiriusXM radio, watching YouTube, or finalizing 4K video, I absolutely have to plug it in because otherwise it charges way too slowly.

Trying to use the phone while wireless charging causes the back to heat up quickly, and the phone regains very little charge - if any. With the heat comes thermal throttling.

The wireless pad is best used when I go to bed, because charging slowly over hours still delivers a 100% charge by morning.

The other issue is that if the positioning isn't 100% perfect, it can sit there and not charge, which is easy to overlook.

Wire just works best.
 
"But according to a recent report published by Medium’s OneZero, wireless charging is far less efficient at delivering power than you may realize."

Breaking news... from 2012.
 
What if I'm on an important call and my phone's about to die? Like, what am I supposed to do? "Hey, I'll call you back in like 30 when I can pick my phone up again..."
 
For my overnight charging, where efficiency isn't a big deal, I leave my Note 10+ on my Samsung wireless charger and it's fine. I like how it props the phone up so I can see the time when I wake up. Again, I'm sleeping, so it doesn't matter that it takes 3 hours to charge instead of 1 hour through a wired cable.
 
You know, you can also eliminate all those issues with wear and tear on the connections by just using a NetDot or one of the other magnetic contacts on the port. I rolled all my rechargeable items over to those about two years ago and it has made my life a LOT easier... yes kiddies, at 64 I'm having trouble hitting the hole... at least when it comes to chargers... LOL
 
Those wireless charging pads that stand your phone up are the best because, like above, they eliminate vertical misalignment. It would also be great if the phone told you how well aligned it is, so you could maybe get ones with adjustable horizontal alignment too (maybe like the phone stands for cars).

Besides, the stand-up ones are so much better anyway; you can watch YouTube next to your bed, which was the primary reason I chose to use one in the first place...
 
How is this a surprise? Does anyone understand how EM waves work? If so, how is this a surprise? And where do you think that extra 47 percent of energy ends up? Yes, radiated all over your room. Congrats on slowly cooking your brain if your charger is on your desk, not far from your head.

Now think about those smartasses who want electric cars to be charged wirelessly. Geniuses, right? I mean, I see only one good use for that. You could tell your mother-in-law: "Dear <insert her name>, you just stay resting in the car while it's recharging, I'll be back in 2 hours." One problem solved.
 
My god, the sheer waste... Please don't post awful, depressing stories like this.
 
Well, the guy measuring the power draw could have saved some time by reading the input specs of the power adapters and comparing them.
 
It is a known issue.
Beyond that, wireless charging usually adds heat to the phone itself, hurting battery health over the long term.
It is handy, but it's not an efficient way to charge a smartphone.
 
47% overhead, versus less than 5% for wired charging. What's the big deal? It amounts to a few pennies of electricity.

I also take issue with the report that the wired charger consumed no measurable power at idle. This is contrary to all the tests I have done previously. Not sure how sensitive the measuring equipment was, but it couldn't have been very accurate.
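For what it's worth, a rough sanity check of the "pennies" claim - the per-charge figures are from the article, while the one-charge-per-day habit and the ~$0.13/kWh rate are my own assumptions:

```python
# Rough yearly cost of the wireless-charging overhead for one phone.
EXTRA_WH_PER_CHARGE = 21.01 - 14.26  # Yootech pad vs. cable (from the article)
CHARGES_PER_YEAR = 365               # assumption: one full charge per day
USD_PER_KWH = 0.13                   # assumption: typical US residential rate

extra_kwh = EXTRA_WH_PER_CHARGE * CHARGES_PER_YEAR / 1000
print(f"Extra energy: {extra_kwh:.2f} kWh/year")             # ~2.46 kWh
print(f"Extra cost:   ${extra_kwh * USD_PER_KWH:.2f}/year")  # ~$0.32
```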
 
A small percentage change in the efficiency of charging a phone is no issue for an individual. But because many people have smartphones, a general change to wireless charging will mean an increase in demand for electricity, leading to tons of additional carbon dioxide entering the atmosphere. We have to be concerned with the environmental costs of such changes, particularly as there are other ways to design the power connectors for phones.

Apparently, Apple is switching to wireless-only charging because it is unhappy that it would otherwise be forced to switch to the standard USB-C connector by laws in some countries. If phones that can only be charged wirelessly can legitimately be banned on environmental grounds, then this bad attitude can be dealt with.

But the USB-C connector, although good in that it's the cheap cable, not the port in the phone, that wears out and needs to be replaced, could have been designed better; as has been noted, there are ways to design a physical power connector that doesn't have wear issues, and the same is true for data.
 
If phones that can only be charged wirelessly can legitimately be banned on environmental grounds...

The total electricity use of all cell phone charging worldwide is trivial. The average US household consumes nearly 11,000 kWh of electricity per year; a smartphone accounts for roughly 5 kWh of that. The wireless overhead is, by the article's figures, thus about 2.5 kWh annually. You'll save 50 times as much electricity by turning up your AC one single degree, or by taking a shower with slightly cooler water.
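Running those numbers (the household and phone figures are my estimates above; the 47 percent overhead comes from the article):

```python
HOUSEHOLD_KWH_PER_YEAR = 11_000  # estimate: average US household
PHONE_KWH_PER_YEAR = 5           # estimate: wired charging of one smartphone
WIRELESS_OVERHEAD = 0.47         # extra energy for wireless, per the article

extra_kwh = PHONE_KWH_PER_YEAR * WIRELESS_OVERHEAD
print(f"Wireless overhead:      {extra_kwh:.2f} kWh/year")  # ~2.35 kWh
share = extra_kwh / HOUSEHOLD_KWH_PER_YEAR * 100
print(f"Share of household use: {share:.3f}%")              # ~0.021%
```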
 
The total electricity use of all cell phone charging worldwide is trivial. The average US household consumes nearly 11,000 kWh of electricity per year; a smartphone accounts for roughly 5 kWh of that. The wireless overhead is, by the article's figures, thus about 2.5 kWh annually. You'll save 50 times as much electricity by turning up your AC one single degree, or by taking a shower with slightly cooler water.

But don't forget that inefficient RADIO-WAVES-based charging means that all those lost radio waves have to end up somewhere else. Energy never disappears, it just changes place or form. Guess where those lost waves end up? Bouncing around your room, before being absorbed by the walls, furniture and your body.
 
Guess where those lost waves end up? Bouncing around your room, before being absorbed by the walls, furniture and your body.
Your average radio station transmits 50,000 - 100,000 watts. The sun transmits to the earth some 174,000,000,000,000,000 watts, much of it dangerous ionizing radiation. Where do you think all those "lost waves" go?

A few extra watts from a wireless charger isn't going to make a perceptible difference by any conceivable stretch of the imagination -- especially since it is all low-frequency, non-ionizing waves.
 
Your average radio station transmits 50,000 - 100,000 watts. The sun transmits to the earth some 174,000,000,000,000,000 watts, much of it dangerous ionizing radiation. Where do you think all those "lost waves" go?

Most of the solar radiation gets absorbed, dispersed or reflected by our atmosphere. Without it we'd be dead. The remaining solar radiation can still be very dangerous for your skin and eyes, so you should limit your exposure time, wear clothing or skin protection, and use sunglasses.

And regarding low-frequency non-ionizing waves, they can cause damage too, by warming certain points in your body, which has been proven. But probably in other ways as well (research is still underway). It's known that some low-power radio waves can damage your eye if they enter within a certain narrow angle, because the eye lens focuses the radiation to a very small spot, which gets heated and damaged.

Other spots in the body can also sometimes be damaged if they are exposed for a longer time to standing-wave nodes - that is, if your body and the source of radiation are stationary for a prolonged time, and some of the nodes happen to be located inside your body.
 
Most of the solar radiation gets absorbed, dispersed or reflected by our atmosphere.
No, only about 30% does. We get a full kilowatt per sq. meter at the surface on a good day ... and you know what? People don't try to avoid it; they strip off most of their clothes and go bathe in that radiation.

Now, most of the EUV and corpuscular radiation gets blocked by the atmosphere ... but you were referring to long-wavelength rays.

It's known that some low-power radio waves can damage your eye if they enter within a certain narrow angle, because the eye lens focuses the radiation to a very small spot, which gets heated and damaged.
Sigh. First of all, you're confusing research done on millimeter-wave frequencies with wireless charging, which works at wavelengths hundreds of thousands of times longer. The eye does not focus kilometer-scale waves. Secondly, the power levels from these wireless chargers are very small and, due to the old inverse-square law, drop off extremely rapidly. Unless you're pressing your eyeball directly against the charging pad, you're not absorbing enough energy to cause measurable heating.
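For scale, a quick sketch - the frequencies are representative assumptions (Qi induction runs around 100-200 kHz; the eye studies involve millimeter waves at tens of GHz):

```python
C = 3e8  # speed of light, m/s

def wavelength_m(freq_hz: float) -> float:
    return C / freq_hz

qi = wavelength_m(140e3)   # Qi inductive charging, ~140 kHz
mmw = wavelength_m(60e9)   # millimeter-wave band, 60 GHz

print(f"Qi wavelength:     {qi / 1000:.1f} km")   # ~2.1 km
print(f"mmWave wavelength: {mmw * 1000:.1f} mm")  # 5.0 mm
print(f"Ratio:             {qi / mmw:,.0f}x")     # hundreds of thousands of times
```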
 