Wireless charging WILL wear out your battery faster.
For longevity, use a slow wired charger. This will put the least thermal strain on the battery.
How exactly?
Also, my phone charges slower with a wireless charger.
Lots of hand-wavy theories and generalizations in the answers below, and some of them sound very convincing. None of them actually cite any sources or back up those theories with data.
Here’s my own anecdotal experience. I’ve put my phone down on a seemingly well-designed wireless charging pad every night for almost 4 years, and this phone’s battery has shown zero sign of deterioration that I can see. This is the first phone I’ve ever owned with wireless charging, and also the first with a battery that hasn’t given up the ghost in 2 years or less. The same pad also charges my smart watch every night, which doesn’t even have any other option for charging.
Next they’ll be telling you to avoid using cruise control on the highway because it will wear out the transmission. Use your phone as it was designed to be used and stop worrying.
Magnetic (inductive) charging loses some energy in the form of heat in both coils.
Technologies like MagSafe lessen the severity of the energy loss by ensuring the coils align, but there is still some energy lost in the form of heat.
This is just a limitation of electromagnetic induction.
It’s a producer of heat placed right next to the battery.
This inefficiency also means it takes more energy to charge your battery, though I would imagine it’s a negligible amount.
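To put some rough numbers on that inefficiency, here’s a quick sketch. The battery size and efficiency figures below are ballpark assumptions for illustration, not measurements:

```python
# Rough comparison of energy wasted as heat per full charge.
# Battery size and efficiencies are ballpark assumptions.
battery_wh = 15.0  # roughly a 4,000 mAh cell at 3.85 V nominal

def waste_heat_wh(efficiency):
    """Energy drawn from the wall minus energy stored, in Wh."""
    return battery_wh / efficiency - battery_wh

print(f"wired    (~90% eff): {waste_heat_wh(0.90):.1f} Wh lost as heat")
print(f"wireless (~70% eff): {waste_heat_wh(0.70):.1f} Wh lost as heat")
```

Keep in mind that some of the wireless loss is dissipated in the pad’s coil rather than the phone’s, so not all of that extra heat ends up sitting next to the battery.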
That’s an interesting theory. I’d like to see some numbers because I really doubt that this heating could be anywhere close to the many other kinds of heat produced through normal phone use. Especially considering that you’re unlikely to be stressing the biggest sources of heat in your phone (the screen and the processor) while it’s sitting in a wireless charging cradle. Also, the charging circuits certainly monitor and adjust for this kind of heat dissipation specifically and are able to control it far better than, for example, the sun hitting the screen or a warm pocket.
The charging circuit will maintain a “safe” temperature, sure, but with any battery monitor app you like, you can see for yourself that the phone sits at a higher temperature while wireless charging than while wired charging. And every fraction of a degree matters. Not when you use the feature once, but if you do it every day, always charging at a higher temperature, it WILL shorten the lifespan of the battery. The same way every charge does, just to a slightly higher degree for every fraction of a degree of extra temperature.
The battery does not suffer as much damage from heat when discharging or when just sitting there (not that that is good for it either). But every extra bit of thermal strain WHILE charging causes more damage than if it were running cooler. That’s simply a fact of lithium-ion batteries.
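A common rule of thumb, and it’s a rough approximation rather than a calibrated model, is that the rate of chemical degradation roughly doubles for every 10 °C of extra temperature. That’s why a few degrees on every single charge adds up over years:

```python
# Rule-of-thumb Arrhenius-style scaling: wear rate roughly doubles
# per 10 degC above a baseline. Illustrative only, not calibrated.
def relative_wear(temp_c, baseline_c=25.0):
    return 2 ** ((temp_c - baseline_c) / 10.0)

for t in (25, 30, 35, 40, 45):
    print(f"{t} degC -> {relative_wear(t):.2f}x the wear rate at 25 degC")
```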
I remember LTT wanting to find out how much damage wireless or fast charging does, but they found that the way we charge our phones mattered more, or something: https://youtu.be/AF2O4l1JprI
I found the segment and may have summarized it incorrectly but I can’t rewatch the video entirely right now.
I can only offer you my experience-based evidence, but the three magnetic chargers I’ve used have all made my phone significantly hotter than charging it at the equivalent speed with a cable.
This has been true across 6 Android devices. Two from Google. Four from Samsung. However, I will also say that because of this trend, I stopped using wireless chargers about a year and a half ago, so it’s quite possible they might have improved since then.
Yes, the tech has gotten a lot better. 6 phones over about 12 years (roughly the length of time since inductive charging debuted in smartphones) averages to about 2 years per phone. If you weren’t getting the flagship phone each year, that lifetime would be shorter. That was comparable to the lifetime of each of my other phones during that same time, none of which had wireless charging. The phone I have now is the first I’ve used inductive charging with, and it has already lasted twice as long as any of the others and shows no signs of deterioration.
So your anecdotal evidence trumps everyone else’s, as well as actual knowledge of the chemistry involved?
I typically use my phones for longer than two years. That includes both my devices and my wife’s.
The problem is heat, not charging speed. A wired charger heats the phone less than a wireless charger, and a slow charger heats it less than a fast one.
It’s not like wireless charging will literally destroy your battery instantly, but it WILL wear it out faster than wired charging at the same speed.
You could offset the heat by charging even slower via wireless (easy with something that has a small battery to begin with, like a watch), but no matter what method is used, the one that runs the battery the coolest WILL last the longest, whether the difference is just one year out of five, more, or less.
I don’t know, but anecdotally I’ve experienced this with every single phone I’ve charged wirelessly.
It just shortens its life somehow. I thought I was crazy. It didn’t make sense unless it does fancy shit with the crystals inside, or it heats it badly.
That is a deeply unsatisfying non-answer.
Seeing as you’re unhappy with the actual answers, I’m thinking you just wanted to be agreed with.
I’m unhappy with the answers because they just parrot the first comment and provide little evidence.
Which only states that wireless charging will wear out your battery faster than wired. Not that it’s critically damaging the instant you opt to use it, or that it will shorten the lifespan of a device to nothing.
There is a difference; that’s straight-up true. One method has the battery sitting at a temperature that is worse for the chemistry involved. That is indisputable. Super-fast wired charging that only slows down to keep an already-hot battery from becoming dangerous, essentially redlining it for the whole process, has the exact same thermal downsides, except that wireless charging, being inherently slower at equivalent temperatures, keeps the battery warm for longer.
What exactly the difference ends up being varies from application to application and from device to device. Obviously, if you lower the charge speed of wireless charging until it doesn’t heat the battery any more than wired does, there won’t be a difference, but then you could just do the same for wired charging and have the battery last even more cycles.
Bottom line, whatever option runs the battery the coolest on a given device, WILL CAUSE THE LEAST WEAR. That’s simply true.
I’ve only ever wirelessly charged my Pixel 5 and it’s still going strong after 3+ years.
I’ve charged my Pixel 6 wirelessly every night for 3 years and had zero battery loss. Simply not true.
Then you are truly talking out your ass.
Within the first year, even with slow wired daily charging, the battery would have lost at least a few percent of its capacity. By year three, losing around 10% is basically unavoidable, and typical loss at that point is closer to 15 or 20%, simply due to age.
You can use something like AccuBattery to measure the current real capacity of your battery. It measures the current flowing in and out of the battery, calculates the milliamp-hours, then averages that over several charge cycles.
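The idea behind that measurement (coulomb counting) is simple enough to sketch. The sample data below is made up, and a real app polls the phone’s fuel-gauge chip far more often than this:

```python
# Coulomb-counting sketch: integrate charge current over time to
# estimate the mAh delivered during one charge session.
samples = [
    # (seconds since start, charge current in mA) -- fabricated data
    (0, 1500), (1800, 1450), (3600, 1200), (5400, 800), (7200, 300),
]

mah = 0.0
for (t0, i0), (t1, i1) in zip(samples, samples[1:]):
    mah += (i0 + i1) / 2 * (t1 - t0) / 3600  # trapezoid rule, s -> h
print(f"~{mah:.0f} mAh delivered this session")
```

Divide that by how much the reported state of charge rose over the same window and you get an estimate of the full capacity; averaging that over many cycles is roughly what AccuBattery does.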
Even when new, the real capacity will vary by several hundred milliamp-hours from cell to cell within the same exact model. Batteries are chemical devices; some inconsistency from one cell to the next is unavoidable. That’s why cell monitoring and balancing circuits are so critical in multi-cell packs.
You may not have noticed a difference, but the capacity loss that your battery has suffered is almost certainly worse than it would have been if you’d charged wired and slowly.
I have a Ph.D. in battery chemistry.
OK, that’s what matters in a device. A 10% falloff on a 2-day battery life is not significant.
You can charge wirelessly slowly. I had the system set up to charge to full over the course of the entire night, which is a rate of around C/10.
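For anyone unfamiliar with C-rates, the arithmetic looks like this (the battery size here is an assumed example):

```python
# C-rate arithmetic: C/10 means a current of one tenth the battery's
# capacity per hour, i.e. ~10 hours to full (ignoring taper/losses).
capacity_mah = 4000             # assumed battery size
current_ma = capacity_mah / 10  # C/10
hours_to_full = capacity_mah / current_ma
print(f"{current_ma:.0f} mA, ~{hours_to_full:.0f} h to full")
```

At a few hundred milliamps, even an inductive coil generates very little heat, which is presumably why that overnight setup runs cool.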
Then you’re not exactly using wireless charging the way the average person does, are you?
A “hotter” charger will degrade the battery more. Have I at any point claimed something beyond that?
The average Qi charger won’t trickle at the slowest speed possible to meet a schedule unless a user specifically sets it up that way. Comparing average use cases and user habits, the cooler charging solution will net you more cycles.
Does your doctorate allow you to somehow claim otherwise?
There’s a wide temperature range where these materials function optimally. You’d need to get them to the point where they’re burning to the touch before any significant degradation occurs. For reference, 50 °C is a good temperature for tea.
https://www.researchgate.net/figure/Lithium-ion-battery-life-vs-temperature-and-charging-rate-36-39-44-45_fig2_260030309
Now that is much better than walking in with “my battery has had ‘zero’ degradation in three years”.
Why did you lead with something that sounds like obvious anecdotal bullshit if you knew this?
That said, my phone sits between 55 and 65 degrees when wireless charging at even just 5 watts. I don’t think it’s ever not been hot to the touch when picking it up off a wireless charger.