item 41483654

Charging lithium-ion batteries at high currents first increases lifespan by 50%

249 points | snazz | 1 year ago | eurekalert.org

109 comments


starky|1 year ago

Having worked a bit in the industry, I'm skeptical about this study. I've definitely seen studies and experiments that used different initial charging conditions and that would have shown better fade performance if this were true.

Not to mention: how much does the increased SEI change the impedance of the cell (thus reducing the subsequent charge speed) and reduce the capacity available?

Joel_Mckay|1 year ago

Agreed, the study summary needs better explanation to justify the contradiction with dozens of other lab tests. We have several boxes of 21700 cells from various manufacturers (Samsung/Sony/Panasonic) undergoing aging trials for over 2 years now.

All LiIon and LiPol chemistries have shown the following:

1. Deep-cycle discharges below 60% full cut usable charge cycle counts from 8000 to under 2000 uses.

2. High-current discharge or rapid charging accelerates capacity loss by about 15% a year.

3. Internal resistance goes up as dendrite defect shorts damage the cell. Additionally, self-discharge rates increase as the cell degrades.

Very surprising if the technique works for all cell chemistries. =3
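The ~15%/year fade figure above compounds quickly. A minimal sketch (Python; the 80% end-of-life threshold is my assumption, a common industry convention the comment doesn't state):

```python
import math

# Compounding capacity fade at ~15%/year, per the figure in the comment
# above. The 80% end-of-life threshold is an assumption, not from the
# comment.
def capacity_after(years, annual_fade=0.15):
    """Remaining capacity fraction after `years` of compounding fade."""
    return (1.0 - annual_fade) ** years

def years_to_eol(annual_fade=0.15, eol_fraction=0.80):
    """Years until remaining capacity falls to the end-of-life threshold."""
    return math.log(eol_fraction) / math.log(1.0 - annual_fade)
```

At 15%/year the cell falls to the 80% threshold in well under two years, consistent with the comment's point that rapid charging sharply shortens useful life.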

oska|1 year ago

The analogy they use in the article is all sorts of dodgy too:

> Removing more lithium ions up front is a bit like scooping water out of a full bucket before carrying it, Cui said. The extra headspace in the bucket decreases the amount of water splashing out along the way. In similar fashion, deactivating more lithium ions during SEI formation frees up headspace in the positive electrode and allows the electrode to cycle in a more efficient way, improving subsequent performance.

mensetmanusman|1 year ago

Such a cool finding if it pans out in production. A hidden process variable hiding in plain sight.

userbinator|1 year ago

They'll never do it because it means decreased profits.

There are articles that appear here and elsewhere semi-frequently about how doing something simple extends battery lifetimes a huge amount, but those never get implemented in practice except perhaps for highly niche applications.

Instead, what usually happens is they'll find a way to make them last the same amount of time, but with higher energy density. The "high voltage" Li-ion cells (>4.2V end of charge) are an example of that process; they would last much longer than previous types if charged only to 4.2V, but manufacturers would rather advertise them as 4.3, 4.35, or even 4.4V(!) and the extra capacity that gives.

rkagerer|1 year ago

TLDR: During a battery's initial "formation" charge, some of the lithium deactivates, forming a squishy, protective layer around the negative electrode called the solid electrolyte interphase (SEI). Today, manufacturers typically do a slow formation charge, during which about 9% of the lithium is lost to the SEI. It was thought this was needed to form a robust layer. But the researchers found that at the higher initial charge currents used in this study, 30% becomes SEI - so you lose some battery capacity (for a given amount of lithium), but wind up with a beefier protective layer on your electrode and better longevity across subsequent charge cycles.
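A back-of-the-envelope way to see that tradeoff (Python; assumes lithium lost to the SEI translates directly into lost capacity, and picks an arbitrary 1000-cycle baseline since the article gives no absolute cycle counts):

```python
# Tradeoff sketch: fast formation sacrifices more lithium to the SEI
# (30% vs ~9%) but reportedly yields ~50% more charge cycles.
# The 1000-cycle baseline is an arbitrary assumption for illustration.
def lifetime_throughput(sei_loss, cycles):
    """Total full-capacity-equivalents delivered over the cell's life."""
    return (1.0 - sei_loss) * cycles

slow_formation = lifetime_throughput(0.09, 1000)   # conventional slow charge
fast_formation = lifetime_throughput(0.30, 1500)   # +50% lifespan per article
```

Despite the larger up-front capacity hit, the fast-formation cell delivers more total energy over its life under these assumptions (1050 vs 910 full-capacity-equivalents).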

user_7832|1 year ago

> so you lose some battery capacity (for a given amount of lithium), but wind up with a beefier protective layer on your electrode and better longevity across subsequent charge cycles

If there's a capacity tradeoff, why not use a slightly modified chemistry (like how LTO is, for example)? Though I guess this article was more about the existence of the phenomenon rather than using it.

bluSCALE4|1 year ago

And how long does it take to achieve this 9% layer?

sharpshadow|1 year ago

I was able to revive lithium batteries that had been discharged too much and wouldn't charge, by connecting them to a fully charged one for a couple of seconds.

xxs|1 year ago

That's all about the electronics inside the battery, rather than the chemistry. You can force feed them with any power supply, ignoring the 'standard' BMS.

Reubachi|1 year ago

This is the equivalent of using a car and its alternator (generator) to jump-start another car whose alternator/generator you know is bad.

"Risking it for the biscuit."

mleonhard|1 year ago

Since a good SEI layer on the electrode is important, couldn't they put the layer on the electrode before assembling the battery? Then they could make the layer's shape more even.

dzhiurgis|1 year ago

What's a battery lifespan? Is it capacity degradation or random failure?

If the discovery slows down capacity degradation, but now your EV battery is 100x more likely to spontaneously fail ($$$), it's not really an improvement. Maybe OK for consumer devices though.

earleybird|1 year ago

There are two lifespans: the shelf life and the number of charge cycles (less of a span, perhaps) where you charge to 100% and discharge to near 0. If you keep your charge/discharge to 80/20, then your battery life is limited primarily by the shelf life. E.g. keep your Nissan Leaf in the 20-80% state of charge range and it will probably last 20 years; DC fast charge it to 100% every time and you'll probably only get 2000 cycles (5-7 years) out of it.
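The cycle-count arithmetic above works out as follows (Python; one full cycle per day is my assumption, chosen because it makes 2000 cycles land inside the quoted 5-7 year range):

```python
# Cycle-budget estimate from the numbers above: ~2000 full cycles if
# DC fast charged to 100% every time. One full cycle per day is an
# assumption for illustration, not from the comment.
def years_from_cycles(cycle_limit, cycles_per_day=1.0):
    """Years until the cycle budget is exhausted at a given usage rate."""
    return cycle_limit / (cycles_per_day * 365.0)
```

2000 cycles at one per day comes to roughly 5.5 years, squarely inside the 5-7 year estimate; shallower daily cycling stretches that out until shelf life dominates instead.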

jostmey|1 year ago

I’m confused… Is this just a prediction or has it been experimentally verified?

Euphorbium|1 year ago

I remember a recent paper that found that charging at double the current, but with a 2 kHz square wave, basically eliminated battery degradation.

keepamovin|1 year ago

Probably burns in the microstructure, making it more stable against filament formation, like the way high-voltage electricity etches wood.

fencepost|1 year ago

TL;DR: the high current causes a layer on the negative electron to form a bit differently (and obviously faster); previously it was thought that a slower initial charge led to better formation. This is a process tweak incremental improvement, not anything truly fundamental.

hn_throwaway_99|1 year ago

> This is a process tweak incremental improvement, not anything truly fundamental.

Whether this is a "process tweak" or something "truly fundamental", a 50% increase in battery lifespan would be huge regardless.

The conspiracy theorist in me thinks, though, that a lot of consumer electronics makers wouldn't like this, because degrading battery capacity has to be a big driver of upgrade cycles. I'm guessing a lot of folks are like me: these days, somewhere around the 2-3 year mark my phone's battery capacity starts degrading noticeably. The phone otherwise works great, I certainly don't need the features in the latest model, and of course I know I can pay for just a battery replacement, but sometimes I think "Well, if I need to replace the battery, I might as well get a new phone - it's got <some feature that is marginally better but that I'm now convincing myself is super cool to justify my not-really-necessary upgrade purchase>".

I think with 50% more battery lifespan I would rarely, if ever, use dwindling battery capacity as an excuse for an upgrade purchase.

rkagerer|1 year ago

I think you meant negative electrode.

westurner|1 year ago

But are there risks and thus costs?

zweifuss|1 year ago

Initial lithium deactivation is 30%, compared to 9% with slow formation charging.

solarkraft|1 year ago

Rule of thumb: It’s a battery innovation/“breakthrough”, so the chance it’ll reach the market any time soon is slim.

jillesvangurp|1 year ago

A more productive way to think about this is in terms of technology readiness levels: https://en.wikipedia.org/wiki/Technology_readiness_level

There's a specialized version of this called BC/RL for battery research as well.

This particular article sits about halfway up the scale. This was an actual study, with actual batteries, that reportedly had improved lifespans. So dismissing it with a hand-wavy "this is all just academic nonsense" doesn't quite fly here. But of course from here to production is indeed quite a journey. I bet a lot of companies with active investment in battery R&D are paying attention and may try to replicate the success.

Also worth noting that if you only pay attention to the stuff that is at the highest levels, you basically miss out on new things until they are old news. For example if you have been dismissing solid state batteries, you might have missed the news that they are being used in products now.

webprofusion|1 year ago

As battery innovations go, this one seems relatively trivial to implement. The bigger problem is probably shipping battery packs that sit fully charged for a long time before the customer gets them, depending on the chemistry.

userbinator|1 year ago

> is 30 times faster

Faster than what?

It turns out this is about the very first charge after assembly of the cell, not regular use.

However, I doubt that this finding will be used much, except perhaps in applications like aerospace; it is in manufacturers' economic interests that their products have short lives.

Edit: looks like, as usual, comments that expose the truth get buried ;-)

thesh4d0w|1 year ago

Just 2 paragraphs down, it's very clearly explained:

> giving batteries this first charge at unusually high currents increased their average lifespan by 50% while decreasing the initial charging time from 10 hours to just 20 minutes.
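The "30 times faster" headline figure follows directly from the numbers in that quote:

```python
# Sanity check: slow formation charge of 10 hours vs 20 minutes at
# high current, per the quote above.
slow_minutes = 10 * 60   # 600 minutes
fast_minutes = 20
speedup = slow_minutes / fast_minutes
```

speedup comes out to exactly 30, matching the article's claim.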

Panzer04|1 year ago

In what way is it in the manufacturer's interest?

They don't exist in a vacuum, and batteries are a commodity market. Cheap ways to improve their product seem like an easy win to me.

Demand for batteries is virtually insatiable. The only constraint is the price, and a better quality product can command a higher price.