Wikipedia:Reference desk/Archives/Science/2020 July 2

= July 2 =

Rechargeable lithium batteries: recharge-discharge-recharge-discharge-recharge-overheat-explode!
This question is mostly directed at our two resident experts on electrical engineering. Though of course, if others can contribute an answer, go for it!

See the story here. Apparently, the tenant, when evicted from the building, left some rechargeable torches (presumably with lithium-ion or lithium-polymer batteries) plugged in. After a few weeks, during which the batteries were left to discharge and recharge continuously, a battery overheated and exploded - and the entire building went up in flames!

My question is this: When designing a product with rechargeable Lithium batteries (be it a torch, a cellphone, a notebook computer, etc), WHAT, exactly, should the electrical engineer do to stop exactly this from happening?

Obviously, ideally, a product with a lithium battery should never be left plugged in for more than a day or so (or whatever). But if YOU were the one designing the product, what are the measures that can be taken to mitigate or eliminate this particular scenario from occurring? Eliyohub (talk) 08:26, 2 July 2020 (UTC)
 * Lithium batteries almost always have a charge controller built in. This will stop excessive current flowing in or out, prevent overcharging, and also stop discharge when the voltage level is too low. I don't know if they measure temperature, but presumably the battery developed an internal fault. Once it shorts internally, the controller cannot do anything to stop that. Graeme Bartlett (talk) 11:35, 2 July 2020 (UTC)
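For illustration, the cutoff behaviour Graeme describes might be sketched roughly like this. The thresholds are generic textbook values for a single Li-ion cell, not taken from any particular controller, and, as he notes, none of this logic helps once the cell shorts internally:

```python
# Sketch of the cutoff logic a Li-ion protection circuit implements in
# hardware. Threshold values are illustrative, not from any datasheet.

V_MAX = 4.25   # overcharge cutoff (volts)
V_MIN = 2.50   # overdischarge cutoff (volts)
I_MAX = 3.0    # overcurrent cutoff (amps, magnitude)

def charge_allowed(cell_volts, charge_amps):
    """Permit charging only while voltage and current stay in bounds."""
    return cell_volts < V_MAX and charge_amps <= I_MAX

def discharge_allowed(cell_volts, discharge_amps):
    """Permit discharge only above the undervoltage cutoff."""
    return cell_volts > V_MIN and discharge_amps <= I_MAX

# A fully charged cell must not accept further charge:
assert not charge_allowed(4.30, 0.5)
# A deeply discharged cell must not be drained further:
assert not discharge_allowed(2.40, 0.5)
# Normal mid-charge operation is permitted in both directions:
assert charge_allowed(3.70, 1.0) and discharge_allowed(3.70, 1.0)
```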
 * Temperature is commonly measured, but not universally. Also there have been some very crappy lithium-ion cylindrical cell (18650 etc.) chargers which are known to do some very silly things. Likewise hobbyist chargers often allow a wide selection of settings, meaning it's easy for someone careless to screw up charging. But anyway, the above source suggests it was batteries inside a device. Most of these tend to at least cut off charging, although whether they still do a proper CC-CV charging cycle or instead do things incorrectly may vary, and in addition their protection circuitry may be limited to preventing overcharging and overdischarging, with no real health monitoring. A bigger concern may be that the cheaper ones from generic brands out of China may use cylindrical cells of dubious repute, e.g. second hand or that weren't good enough for even a crappy laptop battery pack. But also since this was a torch, those can be treated poorly, leading to damage. While it's probably not as bad as the RC/model vehicle situation where the batteries can get really dinged up so that potentially great care needs to be taken when charging them, it's still likely to be riskier than a phone or even a laptop. BTW, I'd note that the source doesn't support Eliyohub's claim about the incident. There's no mention of a few weeks or charging and discharging continuously. Instead it simply says they were left charging overnight. I had a quick look but didn't find any sources suggesting they were charged for weeks. Nil Einne (talk) 16:47, 2 July 2020 (UTC)
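The "proper CC-CV charging cycle" mentioned above can be sketched as a toy simulation: charge at constant current until the terminal voltage reaches the set-point, then hold that voltage while the current tapers, and terminate when the current falls below a cutoff. The cell is modelled crudely as a linear open-circuit-voltage curve behind a series resistance; all numbers are illustrative, not from any real cell:

```python
# Toy CC-CV charge cycle for a crude Li-ion cell model.
CAPACITY_AH = 2.0   # nominal cell capacity (amp-hours)
R_INT = 0.05        # internal series resistance (ohms)
V_TARGET = 4.2      # constant-voltage set-point (volts)
I_CC = 1.0          # constant-current phase current (amps)
I_CUTOFF = 0.05     # terminate charge when current tapers below this

def ocv(soc):
    """Crude open-circuit voltage curve: 3.0 V empty, 4.2 V full."""
    return 3.0 + 1.2 * soc

def charge(soc=0.0, dt_h=0.01):
    """Charge the model cell; return (final state of charge, hours)."""
    t = 0.0
    while True:
        if ocv(soc) + I_CC * R_INT < V_TARGET:
            i = I_CC                            # CC phase: hold current
        else:
            i = (V_TARGET - ocv(soc)) / R_INT   # CV phase: hold voltage
            if i < I_CUTOFF:
                return soc, t                   # taper complete: stop
        soc = min(1.0, soc + i * dt_h / CAPACITY_AH)
        t += dt_h
```

With these numbers the model spends roughly two hours in the CC phase and then tapers in the CV phase, ending near full charge. A charger that skips the CV taper, or never terminates, is doing exactly the "things incorrectly" the post warns about.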


 * Let me just circle back to the original question:
 * "WHAT, exactly, should the electrical engineer do to stop exactly this from happening"?
 * Well, this is a loaded question. Which electrical engineer?
 * And it's a doubly-loaded question: this is a question of safety. I am categorically not going to provide specific safety advice that meets your engineering safety requirements.  I will provide some general, broad, non-specific guidance to help orient you in a direction that I believe is generally correct, and refer you to resources that I know and use; but the safety of your engineering project is a complicated and multi-faceted ask: I simply can't "sign off" on any method that "will work" for you.  The core design for safety in an engineering product has ethical and legal consequences: the reference desk ain't the place for that kind of thing.
 * Okay - fine-print out of the way...
 * In a consumer product, a different engineer(-ing team) designs the battery, the power supply, and the "load". (In power engineering, everything else is a "load" - electrical load - whether that load is a CPU, a display, a radio, a motor, a complex system-on-chip, a fully-integrated device... )
 * So, all of these teams need to work together cohesively to define the parameters that enable safe operation.
 * I am a firm believer that the core of battery safety-engineering starts at the battery design itself: the cell designer is the primary responsible party for ensuring that runaway thermal events don't ever occur. The cell design team is only one sub-part of the entire battery engineering team; they're the ones who engineer the chemical and mechanical packing of the individual cells; they are part of the battery team who design the "pack" (or its equivalent).  Once again, the pack should be mechanically and electrically designed to preclude thermal runaway events.
 * A lot of the really, really serious engineering work is in correctly specifying, and qualifying that the specification is being met in mass production. A well-designed battery should not undergo a thermal runaway event even if it is subjected to an abuse load.  This thesis of mine - which is really a statement of political nature, not of engineering merit - ultimately means that the electrical engineering teams who design the power supply and the load need to do nothing except verify that the specification is met.
 * Now there are diverse opinions on this topic. The existence of a thing called a "battery management system" is the manifestation of the exact counterpoint to my personal opinion: others believe that the circuit designer should actively control and protect the battery.  As far as it pertains to safety, I respectfully disagree: the use of a BMS to ensure safety is the shirking of responsibility, or the proverbial leaking of abstraction, of this task into the power supply and ultimately into the "intelligent load" (e.g. the software or firmware of the device that's consuming energy).  Broadly speaking, the objective for moving this kind of safety logic out of cell and pack design, and into circuit and software design, is motivated by a desire to make the cell and pack more energy-dense: more volt-amps per cubic meter, more volt-amps per kilogram.  It is my opinion that this motivation yields a cell that is less safe; and that some engineers justify this by claiming that they have made a whole system that is equally safe.  This is one element of the "political nature" of the problem: it really depends on how your engineering team delegates responsibility among its sub-teams.  And, it is my opinion that a whole system cannot be made safer than its individual sub-components: that's just not how I understand the concept of "safety."  But... at least I am humble enough to admit that my opinion is a personal interpretation of a general trend in the industry, and not a fact in its own right.  You may inform your own opinions by comparing factual data or case-studies on various failures, if you read far and wide, broad and deep, in the literature in this field.
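For readers unfamiliar with what a BMS actually does, the kind of supervision being debated above might be sketched as per-cell monitoring plus a passive-balancing decision. All thresholds here are illustrative, not from any real BMS:

```python
# Toy sketch of BMS-style pack supervision: per-cell voltage limits,
# a pack temperature limit, and passive balancing of high cells.
V_OVER = 4.25         # per-cell overvoltage fault (volts)
V_UNDER = 2.50        # per-cell undervoltage fault (volts)
T_MAX_C = 60.0        # pack temperature fault (Celsius)
BALANCE_DELTA = 0.03  # bleed cells this far above the weakest cell

def check_pack(cell_volts, temp_c):
    """Return (fault flag, indices of cells selected for balancing)."""
    fault = (temp_c > T_MAX_C
             or any(v > V_OVER or v < V_UNDER for v in cell_volts))
    v_min = min(cell_volts)
    balance = [i for i, v in enumerate(cell_volts)
               if v - v_min > BALANCE_DELTA]
    return fault, balance

fault, bleed = check_pack([3.70, 3.75, 3.71, 3.70], temp_c=25.0)
assert not fault and bleed == [1]   # cell 1 sits 50 mV high: bleed it
fault, _ = check_pack([3.70, 4.30, 3.71, 3.70], temp_c=25.0)
assert fault                        # an overvoltage cell trips a fault
```

Nimur's point stands either way: whether this logic lives in software like the sketch or is precluded by cell design, someone on the team owns it.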
 * Now, whether an "intelligent" power-system is safety-critical or not, putting "intelligence" into the power system is a hot topic among power designers. Here are a few white-papers on it:
 * Renesas: What is Digital Power?
 * TI: What is digital power?
 * ... and sure, you can find the same kind of white-paper at the website of your favorite power-supply vendor. This is a broad topic: but the general trend over the last, uh, ...decades, has been that the power supply should use "intelligence" (complicated digital circuitry and software) to protect the load from the supply, and to protect the supply from the load.  This is of critical design importance when primary energy is wholly- or partially- sourced by a volatile electrochemical cell.
 * The place to go is APEC (Applied Power Electronics Conference), where you can meet the people and teams who do this stuff.
 * Ultimately, a safe product is one whose design is suitable for its actual use: that means taking into account the normal and abnormal envelopes of operational and environmental factors, plus reliability and robustness to a reasonable expectation of wear-and-tear as well as damage. There isn't one specific circuit or one specific software control algorithm, and there isn't even one particular cell mechanical structure or electrochemistry recipe that makes the product safe.  It's all of these things together.  If you're an electrical engineer who cares: hang out at APEC and IEEE and other organizations; read their books and journals:
 * IEEE Power and Energy Society publications
 * APEC's list of useful publications
 * EE Times is great light reading; and there are a lot more heavy-hitting books and journals listed there. Those are the places where you can see current case-studies for designs (like specific circuit topologies and specific part numbers, if you're a power EE; or specific methodologies for firmware and software designers if you're a "load" engineer).
 * For example, just this week we had Buck-Boost Devices Extend Battery Life...; and while you're over there, you can read the two-part Perspectives piece on battery electric vehicles from last week and this week: Part 1 and Part 2, by Egil Juliussen, who is a very experienced engineer but is considerably more optimistic than I am.
 * The OP asked:
 * "what are the measures that can be taken to mitigate or eliminate this particular scenario"?
 * So let me just close by hammering it in: safe design is a whole-system affair. Your product cannot be made safe by bolting on a single circuit topology, a specific magic part number, or a particular control algorithm.  Your product is made safe by robust, broad, deep engineering design, validation, test, and verification.
 * Nimur (talk) 18:04, 2 July 2020 (UTC)