IEEE and NERC both require battery capacity testing as a means of predicting a battery system's ability to perform during a loss of AC power, for acceptance of new installations, and for determining end-of-life criteria for system replacement. While NERC is vague, stating only that "capacity testing must be done at said interval," IEEE 450, 1106, and 1188 clearly define the frequencies, prerequisites, and procedures for conducting these tests.

From results submitted to us for warranty consideration or review, a set of common errors has come to light. Below are the five most common mistakes we see in capacity testing.


MISTAKE #1 – STOPPING AT THE RATED TIME INSTEAD OF THE END VOLTAGE

We review many sets of results where the test was conducted at the 3-hour rate for a given battery model. The load bank is applied to the system at the correct rated amps for that particular cell; however, the test unit either stops automatically or is force-stopped as soon as the 3-hour point is reached. Yes, the battery ran three hours, but that does not tell us the true capacity of the battery system. The system may actually be at 105% or 121%, but unless the discharge is continued until the system as a whole reaches the "Low DC Voltage" limit for the test, a true capacity cannot be determined.

A 60-cell, 125 Vdc system being discharged to 1.75 volts per cell should remain on discharge until the total system voltage reaches 60 × 1.75 = 105.0 Vdc. That is the point at which system capacity is actually determined.
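The arithmetic above can be sketched as a short calculation. This is a minimal illustration, not a standard-mandated tool; the function names and the 218-minute run time are hypothetical, and the capacity formula shown is the basic time-ratio calculation used for tests run at the rated rate:

```python
# Hypothetical sketch: end-of-discharge voltage and time-ratio capacity.
# Function names and the example run time are illustrative only.

def end_of_discharge_voltage(cells: int, volts_per_cell: float) -> float:
    """Total system voltage at which the discharge should be terminated."""
    return cells * volts_per_cell

def capacity_percent(actual_minutes: float, rated_minutes: float) -> float:
    """Capacity as the ratio of actual run time to rated discharge time."""
    return actual_minutes / rated_minutes * 100.0

# A 60-cell, 125 Vdc system discharged to 1.75 V/cell:
print(end_of_discharge_voltage(60, 1.75))        # 105.0 Vdc

# A battery rated for 3 hours (180 min) that actually ran 218 min
# before the system reached 105.0 Vdc:
print(round(capacity_percent(218, 180), 1))      # 121.1 (%)
```

Stopping the same test at exactly 180 minutes would report 100% regardless of whether the true capacity was 105% or 121%, which is precisely the error described above.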


MISTAKE #2 – ENDING THE TEST WHEN THE FIRST CELL REACHES ITS LIMIT

We often see tests that are terminated, and capacity calculated, based on the time when the first cell reaches the low cell voltage limit. IEEE clearly provides guidance on what to do should a cell fall below its voltage limit: the weak cell can be bypassed and the discharge continued, with the test end voltage adjusted accordingly. A capacity test is used to determine system capacity, not just the capacity of the weakest cell. Cells that limit the overall system capacity to below 80% are the cells that need to be addressed.

Battery capacity testing systems that monitor each individual cell during the discharge continuously record how far each cell has dropped, and at the end allow you to calculate the capacity of each individual cell as well as the overall system capacity.
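The per-cell calculation described above can be sketched as follows. The cell names, crossing times, and 3-hour rating are hypothetical; the point is that each cell's capacity comes from the time it took to reach its own low-voltage limit, while system capacity comes from the time the string as a whole reached its limit:

```python
# Minimal sketch (hypothetical data) of per-cell vs. system capacity
# from an individually monitored discharge.

RATED_MINUTES = 180.0        # 3-hour rated discharge
SYSTEM_END_MINUTES = 195.0   # system reached its low DC voltage limit here

# Minutes at which each monitored cell crossed 1.75 V. Cells that never
# crossed before the end of the test simply share the system result.
cell_crossing = {"cell_07": 140.0, "cell_23": 188.0}

def capacity_percent(minutes: float) -> float:
    return minutes / RATED_MINUTES * 100.0

per_cell = {c: capacity_percent(t) for c, t in cell_crossing.items()}
system_capacity = capacity_percent(SYSTEM_END_MINUTES)

# Cells limiting capacity below 80% are the ones to address.
weak_cells = [c for c, pct in per_cell.items() if pct < 80.0]

print({c: round(p, 1) for c, p in per_cell.items()})  # cell_07 at ~77.8%
print(round(system_capacity, 1))                      # 108.3
print(weak_cells)                                     # ['cell_07']
```

Here the system passes comfortably, but the monitoring data still singles out one cell below 80% for corrective action.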


MISTAKE #3 – APPLYING THE TEMPERATURE CORRECTION AT THE WRONG POINT

Only tests of 1 hour or less in duration should have the temperature adjustment applied to the discharge rate prior to the test. For tests in the 1 – 8 hour range, the temperature adjustment should instead be applied to the capacity calculation at the end of the test.
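The two methods can be sketched as below. The correction factor K_t must be taken from the IEEE table or the manufacturer's literature for the actual electrolyte temperature; the 0.93 used here is purely illustrative, as is the assumed convention that K_t is below 1 for temperatures under 77 °F:

```python
# Sketch of the two temperature-adjustment methods. K_t = 0.93 is a
# hypothetical value; real factors come from the IEEE 450 table or the
# battery manufacturer's published data.

def rate_adjusted_amps(rated_amps: float, k_t: float) -> float:
    """Tests of 1 hour or less: adjust the discharge rate BEFORE the test."""
    return rated_amps * k_t

def time_adjusted_capacity(actual_min: float, rated_min: float,
                           k_t: float) -> float:
    """Tests of 1-8 hours: apply the correction to the result AFTER the test."""
    return actual_min / (rated_min * k_t) * 100.0

# Cool battery room (assumed K_t = 0.93):
print(round(rate_adjusted_amps(200.0, 0.93), 1))            # 186.0 A test rate
print(round(time_adjusted_capacity(170, 180, 0.93), 1))     # 101.6 %
```

Note how the 170-minute run, which looks like a failure against a 180-minute rating, passes once the temperature correction is applied at the end of the test.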


MISTAKE #4 – EXCEEDING THE ALLOWED STOP PERIOD

For a capacity test to be considered valid, IEEE allows only one stop period during the discharge. This period is limited to 6 minutes or 10% of the total test time, whichever is shorter. For older systems it is important to have all the necessary jumper cables, hardware, and personnel ready to accomplish any work that must be done should a stop period be required during your test.
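The allowed stop time works out differently for short and long tests; a quick sketch (function name is illustrative):

```python
# Sketch: maximum allowed single stop period -- 10% of the planned
# test time, capped at 6 minutes, whichever is shorter.

def max_stop_minutes(test_minutes: float) -> float:
    return min(0.10 * test_minutes, 6.0)

print(max_stop_minutes(180))  # 3-hour test: 18 min at 10%, so the 6.0 min cap governs
print(max_stop_minutes(30))   # 30-min test: 3.0 min (10% governs)
```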


MISTAKE #5 – SKIPPING THE PRE-TEST PREPARATION

Most battery manufacturers provide guidance in their installation and operation instructions on the initial requirements prior to a capacity test. These typically follow the Initial Conditions outlined in IEEE: an equalizing charge of the battery system followed by a return to float conditions for a period of 72 hours. This ensures that the battery is fully charged and cell voltages have stabilized before the capacity test is conducted. If any cells do not fall within their normal operational parameters after this float period, the manufacturer should be contacted for recommendations or corrective measures before the capacity test is conducted.

HELP IS AVAILABLE – If you have any questions on the requirements for capacity testing, procedures to be used, or general questions on testing and test equipment, Storage Battery Systems’ staff of highly knowledgeable experts in battery system maintenance and testing would be happy to speak with you. Call SBS at 800-554-2243.