Having worked a bit in the industry, I'm skeptical about this study. I've definitely seen studies and experiments that used different initial charging conditions and would have shown better fade performance if this were true.
Not to mention, how much does the increased SEI change the impedance of the cell (thus reducing the subsequent charge speed) and the capacity available.
Agreed, the study summary needs a better explanation to justify the contradiction with dozens of other lab tests. We have several boxes of 21700 cells from various manufacturers (Samsung/Sony/Panasonic) that have been undergoing aging trials for over 2 years now.
All Li-ion and LiPo chemistries have shown the following:
1. Deep-cycle discharges below 60% of full charge cut the usable cycle count from around 8000 to under 2000 uses.
2. High-current discharge or rapid charging accelerates capacity loss, by about 15% a year (rough projection sketched below).
3. Internal resistance goes up as dendrite shorts damage the cell, and the self-discharge rate increases as the cell degrades.
It would be very surprising if the technique works for all cell chemistries. =3
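For a rough sense of what point 2 implies over a device's life, here's a minimal sketch (Python, with made-up numbers, not data from our trials) that just compounds a constant 15%/year fade:

```python
# Rough projection of point 2 above: illustrative numbers only, not
# data from our aging trials. Assumes a constant fractional capacity
# loss per year under sustained rapid charging.
ANNUAL_FADE = 0.15  # ~15% capacity lost per year

def remaining_capacity(initial_mah: float, years: float) -> float:
    """Capacity left after `years` of aging at a constant fade rate."""
    return initial_mah * (1.0 - ANNUAL_FADE) ** years

for year in range(6):
    print(f"year {year}: {remaining_capacity(5000, year):.0f} mAh")
```

Under that assumption a hypothetical 5000 mAh cell is down to roughly half its capacity after about four years.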
The analogy they use in the article is all sorts of dodgy too:
> Removing more lithium ions up front is a bit like scooping water out of a full bucket before carrying it, Cui said. The extra headspace in the bucket decreases the amount of water splashing out along the way. In similar fashion, deactivating more lithium ions during SEI formation frees up headspace in the positive electrode and allows the electrode to cycle in a more efficient way, improving subsequent performance.
They'll never do it because it means decreased profits.
There are articles that appear here and elsewhere semi-frequently about how doing something simple extends battery lifetimes a huge amount, but those never get implemented in practice except perhaps for highly niche applications.
Instead what usually happens is they find a way to make them last the same amount of time, but with higher energy density. The "high-voltage" Li-ion cells (>4.2V end of charge) are an example of that process; they would last much longer than previous types if charged to only 4.2V, but manufacturers would rather advertise them as 4.3 or 4.35 or even 4.4V(!) and the extra capacity that gives.
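To illustrate that tradeoff, a minimal sketch (Python) using the common rule of thumb that every ~0.1V added to the end-of-charge voltage roughly halves cycle life while buying a few percent of capacity; all numbers here are illustrative, not from any datasheet:

```python
# Illustrative only: rule-of-thumb tradeoff between end-of-charge
# voltage, cycle life, and usable capacity. Baseline values are
# hypothetical, not from any real cell's datasheet.
BASE_VOLTAGE = 4.2        # conventional end-of-charge voltage
BASE_CYCLES = 500         # hypothetical cycle life at 4.2V
CAPACITY_PER_STEP = 0.07  # assumed ~7% extra capacity per +0.1V

def tradeoff(end_voltage: float) -> tuple[float, float]:
    steps = (end_voltage - BASE_VOLTAGE) / 0.1
    cycles = BASE_CYCLES / (2.0 ** steps)       # halved per +0.1V
    capacity = 1.0 + CAPACITY_PER_STEP * steps  # relative to 4.2V
    return cycles, capacity

for v in (4.2, 4.3, 4.35, 4.4):
    cycles, capacity = tradeoff(v)
    print(f"{v:.2f}V: ~{cycles:.0f} cycles, ~{capacity:.0%} capacity")
```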
TLDR: During a battery's initial "formation" charge, some of the lithium deactivates, forming a squishy, protective layer around the negative electrode called the solid electrolyte interphase (SEI). Today, manufacturers typically do a slow formation charge, during which about 9% of the lithium is lost to the SEI. It was thought this was needed to form a robust layer. But the researchers found that at the higher initial charge currents used in this study, 30% becomes SEI - so you lose some battery capacity (for a given amount of lithium), but wind up with a beefier protective layer on your electrode and better longevity across subsequent charge cycles.
> so you lose some battery capacity (for a given amount of lithium), but wind up with a beefier protective layer on your electrode and better longevity across subsequent charge cycles
If there's a capacity tradeoff, why not use a slightly modified chemistry (the way LTO does, for example)? Though I guess this article was more about demonstrating the phenomenon than about exploiting it.
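For a concrete feel of that tradeoff, a back-of-the-envelope sketch (Python) using the 9%/30% lithium-loss figures from the summary above; the cell size is arbitrary and the assumption that usable capacity scales linearly with remaining lithium is mine:

```python
# Back-of-the-envelope on the capacity tradeoff. The 9% / 30% lithium
# losses come from the article summary; the nominal capacity is an
# arbitrary example, and usable capacity is assumed to scale with the
# lithium left over after SEI formation.
NOMINAL_MAH = 5000  # hypothetical cell, sized by its total lithium

for name, sei_loss in [("slow formation (~9% to SEI)", 0.09),
                       ("fast formation (~30% to SEI)", 0.30)]:
    usable = NOMINAL_MAH * (1.0 - sei_loss)
    print(f"{name}: ~{usable:.0f} mAh usable")
```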
I was able to revive lithium batteries that had been discharged too much and wouldn't charge by connecting them to a fully charged one for a couple of seconds.
That's all about the electronics inside the battery, rather than the chemistry. You can force feed them with any power supply, ignoring the 'standard' BMS.
Since a good SEI layer on the electrode is important, couldn't they put the layer on the electrode before assembling the battery? Then they could make the layer's shape more even.
What's a battery's lifespan? Is it capacity degradation or random failure?
If a discovery slows down capacity degradation but your EV battery is now 100x more likely to spontaneously fail ($$$), it's not really an improvement. Maybe OK for a consumer device, though.
There are two lifespans: the shelf life, and the number of charge cycles (less of a span, perhaps) where you charge to 100% and discharge to near 0. If you keep your charge/discharge to 80/20, then your battery life is limited primarily by the shelf life. E.g., keep your Nissan Leaf in the 20-80% state-of-charge range and it will probably last 20 years; DC fast charge it to 100% every time and you'll probably only get 2000 cycles (5-7 years) out of it.
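The arithmetic behind that 5-7 year figure, as a minimal sketch (Python; the daily cycling rates are my assumptions):

```python
# Cycle life -> calendar years, given how often you complete a full
# charge/discharge cycle. The cycling rates below are assumptions.
def years_from_cycles(cycle_life: int, cycles_per_day: float = 1.0) -> float:
    return cycle_life / (365.0 * cycles_per_day)

print(f"2000 cycles at 1 cycle/day:    ~{years_from_cycles(2000):.1f} years")
print(f"2000 cycles at 0.8 cycles/day: ~{years_from_cycles(2000, 0.8):.1f} years")
```

One full cycle a day gives about 5.5 years; a slightly lighter duty cycle stretches that toward 7.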
TL;DR: the high current causes a layer on the negative electrode to form a bit differently (and obviously faster); previously it was thought that a slower initial charge led to better formation. This is an incremental process tweak, not anything truly fundamental.
> This is an incremental process tweak, not anything truly fundamental.
Regardless of whether this is a "process tweak" or something "truly fundamental", a 50% increase in battery lifespan would be huge.
The conspiracy theorist in me though thinks that a lot of consumer electronics makers wouldn't like this, because lower battery capacity has to be a big driver of upgrade cycles. I'm guessing a lot of folks are similar to me: these days, somewhere in the 2-3 year mark my cell battery capacity starts degrading noticeably. My phone otherwise works great, and I certainly don't need the features in the latest model phone, and of course I know I can pay for just a battery replacement, but sometimes I think "Well, if I need to replace the battery, I might as well get a new phone - it's got <some feature that is marginally better but that I'm now convincing myself is super cool to justify my not-really-necessary upgrade purchase>".
I think with 50% more battery lifespan I would rarely, if ever, use dwindling battery capacity as an excuse for an upgrade purchase.
There's a specialized version of this called BC/RL for battery research as well.
This particular article sits about halfway up the scale. This was an actual study, with actual batteries, that reportedly had actually improved lifespans. So dismissing it with a hand-wavy "this is all just academic nonsense" doesn't quite fly here. But of course from here to production is indeed quite a journey. I bet a lot of companies with active investment in battery R&D are paying attention and might try to replicate the success.
Also worth noting that if you only pay attention to the stuff at the highest readiness levels, you basically miss out on new things until they are old news. For example, if you have been dismissing solid-state batteries, you might have missed the news that they are being used in products now.
As battery innovations go, this one seems relatively trivial to implement? The bigger problem is probably shipping battery packs that are sitting fully charged for a long time before the customer gets them, depending on the chemistry.
Faster than what? It turns out this is about the very first charge after assembly of the cell, not regular use.
However, I doubt that this finding will be used much, except perhaps in applications like aerospace; it is in manufacturers' economic interests that their products have short lives.
Edit: looks like as usual, comments that expose the truth get buried ;-)
Just 2 paragraphs down, it's very clearly explained:
> giving batteries this first charge at unusually high currents increased their average lifespan by 50% while decreasing the initial charging time from 10 hours to just 20 minutes.
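To put those times in battery terms, a minimal sketch (Python) converting them to approximate C-rates, under the simplifying assumption of a constant-current charge that passes one full cell capacity; the times come from the article, the assumption is mine:

```python
# Implied C-rate of the formation charge, assuming (simplistically) a
# constant-current charge that passes 1x the cell's capacity. The 10
# hour and 20 minute figures come from the article.
def c_rate(hours: float) -> float:
    """C-rate needed to pass one full cell capacity in `hours`."""
    return 1.0 / hours

print(f"10 hour formation:   ~{c_rate(10.0):.2f}C")        # conventional slow
print(f"20 minute formation: ~{c_rate(20.0 / 60.0):.1f}C")  # high-current
```

So the conventional formation is around 0.1C, and the high-current version is on the order of 3C.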
m463|1 year ago
https://electrek.co/2023/08/29/tesla-battery-longevity-not-a...
Reubachi|1 year ago
"risking it for the biscuit"
i80and|1 year ago
2021: low frequency pulsed charging: https://vbn.aau.dk/ws/portalfiles/portal/451327786/C5.pdf
2024: high frequency pulsed charging: https://onlinelibrary.wiley.com/doi/10.1002/aenm.202400190
Not up to really reading them right now, but this is a pretty neat area of research!
Panzer04|1 year ago
They don't exist in a vacuum, and batteries are a commodity market. Cheap ways to improve their product seem like an easy win to me.
Demand for batteries is virtually insatiable. The only constraint is the price, and a better quality product can command a higher price.