Why can't the state of charge of the battery be accurately measured?

Measuring the energy stored in an electrochemical device such as a battery is complex, and a state-of-charge (SoC) reading on a fuel gauge can only provide a rough estimate. Users often compare a battery's SoC to a vehicle's fuel gauge, but the analogy is misleading: measuring the liquid in a tank is simple because the liquid is a tangible entity, while battery state of charge is not. Nor can the energy stored in a battery be quantified precisely, because prevailing conditions such as load current and operating temperature affect its release. Batteries work best at moderate ambient temperatures, performance suffers in the cold, and capacity fades as the battery ages.

Current fuel gauge technology has significant limitations, which were exposed when users of the new iPad assumed that 100 percent on the fuel gauge should correspond to a fully charged battery. This was not always the case, and users complained that the iPad's lithium-polymer battery was only 90 percent charged.


Modern fuel gauges used in iPads, smartphones, and laptops estimate SoC by counting coulombs and comparing voltages. The complexity lies in managing these variables while the battery is in use. Charging or discharging acts like a rubber band, pulling the voltage up or down, and the calculated SoC reading becomes meaningless. Under open-circuit conditions, as when measuring a bare cell, the voltage can serve as a reference; however, temperature and battery age affect the reading. The open-circuit terminal voltage is only a reliable SoC reference if these environmental conditions are accounted for and the battery has been allowed to rest for several hours before measurement.
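The open-circuit-voltage approach can be sketched as a simple table lookup with temperature compensation. The voltage table and temperature coefficient below are hypothetical values for illustration, not data for any specific cell:

```python
# Illustrative sketch only: the OCV table and temperature coefficient
# are hypothetical, not measurements of any particular battery.

# Open-circuit voltage (V) vs. state of charge (%) for a generic
# lithium-ion cell, valid only after a rest of several hours.
OCV_TABLE = [
    (3.00, 0), (3.45, 10), (3.60, 20), (3.70, 40),
    (3.80, 60), (3.95, 80), (4.10, 95), (4.20, 100),
]

TEMP_COEFF_V_PER_C = -0.0005  # assumed OCV drift per degree C from 25 C

def soc_from_ocv(voltage_v: float, temp_c: float = 25.0) -> float:
    """Estimate SoC (%) from a rested open-circuit voltage reading."""
    # Compensate the reading back to the 25 C reference temperature.
    v = voltage_v - TEMP_COEFF_V_PER_C * (temp_c - 25.0)
    # Clamp readings outside the table range.
    if v <= OCV_TABLE[0][0]:
        return 0.0
    if v >= OCV_TABLE[-1][0]:
        return 100.0
    # Linear interpolation between adjacent table points.
    for (v_lo, soc_lo), (v_hi, soc_hi) in zip(OCV_TABLE, OCV_TABLE[1:]):
        if v_lo <= v <= v_hi:
            frac = (v - v_lo) / (v_hi - v_lo)
            return soc_lo + frac * (soc_hi - soc_lo)
    return 0.0  # unreachable with a monotonic table
```

Note that the lookup is only meaningful under the rest condition described above; under load, the rubber-band effect pulls the terminal voltage away from the table values.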

A 10 percent difference between the iPad's fuel gauge and the real battery SoC is acceptable for a consumer product. Accuracy may drop further with use, and battery aging can add another 20-30 percent to the error, depending on the effectiveness of the self-learning algorithm. By then, users have grown accustomed to the device's quirks, and the oddities are mostly forgotten or accepted. While a difference in run time causes only minor inconvenience to the average user, industrial applications such as electric powertrains in electric vehicles require better systems. Improvements are underway, and these developments may one day benefit consumer products as well.

Coulomb counting is at the heart of today's fuel gauges. The theory dates back some 250 years, to when Charles-Augustin de Coulomb first established Coulomb's law. The method works by measuring the current flowing in and out of the battery. Coulomb counting also introduces errors: the energy coming out is always less than the energy going in. Inefficient charge acceptance, especially near the end of charge, tracking errors, and losses during discharge and self-discharge in storage all contribute to this. Self-learning and regular calibration through a full charge/discharge cycle keep the accuracy acceptable for most applications.
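The mechanism above can be sketched as a running integral of current, with a loss factor on charge and a calibration hook at known full charge. The capacity and efficiency figures are hypothetical placeholders; a real gauge learns them over time:

```python
# Minimal coulomb-counting sketch. Capacity and charge efficiency
# are assumed example values, not parameters of any real gauge IC.

class CoulombCounter:
    def __init__(self, capacity_mah: float, charge_efficiency: float = 0.97):
        self.capacity_mah = capacity_mah            # learned full capacity
        self.charge_efficiency = charge_efficiency  # charge out < charge in
        self.charge_mah = capacity_mah              # assume we start full

    def update(self, current_ma: float, dt_hours: float) -> None:
        """Integrate current over one interval.

        Positive current = charging, negative = discharging.
        """
        delta = current_ma * dt_hours
        if delta > 0:
            # Not all charge pushed in is accepted, especially near full.
            delta *= self.charge_efficiency
        self.charge_mah = max(0.0, min(self.capacity_mah,
                                       self.charge_mah + delta))

    def calibrate_full(self) -> None:
        """Reset the counter at a known full charge to cancel drift."""
        self.charge_mah = self.capacity_mah

    @property
    def soc_percent(self) -> float:
        return 100.0 * self.charge_mah / self.capacity_mah
```

For example, drawing 1000 mA for one hour from an assumed 2000 mAh pack drops the reading to 50 percent; without the periodic `calibrate_full` reset, the small per-interval errors accumulate into exactly the drift the article describes.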
