## Introduction

Imagine we are looking to buy a battery, and want to know what its capacity is. Or in other words, how much energy the battery can store. How would we calculate how much energy a particular battery can store, and how would we size this up against the devices we will need it to power?

In this post we will explain the use of Ampere-hours (Ah) as the common measure of capacity, evaluate the use of Kilowatt-hours (kWh) as an alternative and more flexible measure, and determine how to calculate Kilowatt-hours (kWh) using Ampere-hours (Ah) and Voltage (V). We will also see how we can use these measures to determine how long they can keep our devices powered.

## Before we begin

We need to establish an understanding (or refresher) on 3 concepts: Ampere (A), Voltage (V) and Watt (W). These are common electrical units, and are worth a short introduction.

### Ampere (A)

The Ampere, or Amp for short, is usually represented by the symbol “A”, and is the SI unit of electrical current. What this means is that it measures the number of electrons moving past a point per second.

For those curious, 1 ampere is equal to 6.24×10^18 electrons worth of charge moving past a point per second. That’s 6.24 million trillion electrons per second. To put this into the perspective of an electrical device: a device rated at 1 amp will pull 6.24 million trillion electrons from a power source each second.

From a practical perspective, the rated amps of a device points to how much current it will draw from the electrical circuit or power source.

### Voltage (V)

Voltage, or volts, on the other hand is a measure of electrical pressure, or potential difference, between two points in a circuit. The higher the voltage, the higher the electrical pressure, the higher the “force” pushing electrons from one end to another.

From a practical perspective, the rated voltage of a device is basically the voltage at which the electrical circuit should supply the device. This value should match closely – note the definition of voltage as *electrical pressure*. Intuitively, too high a pressure can destroy a device, the same way too much pressure can cause a water hose to burst. For example, lightbulbs burn out when too much voltage is applied to them. Conversely, too little voltage might mean that too little energy is supplied to allow a device to operate effectively.

Voltage can also come in AC and DC form. Details aside, it is important that a device that needs to be AC powered is not powered via a DC circuit, and vice-versa.

### Watt (W)

A Watt is basically a unit of work performed each second – in other words, a unit of power. It is best understood in relation to amps and voltage, and can be derived mathematically using the formula:

```
W = A X V
```

or, Watt (W) equals to Ampere (A) multiplied by Voltage (V).

Borrowing from our intuitions above about amps and voltage, we can look at watts in this way:

- The higher the Amps (A), the more current is flowing, hence more work is being done.
- The higher the Voltage (V), the higher the pressure, hence more work is being done.
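To make this concrete, here is a minimal sketch in Python (the 2 A and 120 V figures are hypothetical, chosen just for illustration):

```python
def watts(amps, volts):
    """Power in watts (W) from current (A) and voltage (V)."""
    return amps * volts

# A hypothetical device drawing 2 A on a 120 V circuit:
print(watts(2, 120))  # 240 W
```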

I hope the above has helped to set the stage for what is to come.

## Let’s begin: Ampere-hours (Ah)

Knowing that an ampere is a measure of number of electrons moving past a point per second, an ampere-hour (Ah) is just that but scaled to the size of an hour.

For those curious, that is equal to 6.24 million trillion * 60 seconds * 60 minutes worth of electrons moving past a point every hour. Or 22.5 billion trillion electrons. That is a *mind-bogglingly* huge number, so we simplify it into one unit: Ah.

The Ampere-hour unit is also the preferred way to denote the **rated capacity** of batteries sold on the market, and is often easily found on battery packaging. It is usually represented in two forms: **Ampere-hours (Ah)** and **milliampere-hours (mAh)**. The latter (mAh) is just a thousandth of the former (Ah), or Ah divided by 1000. For the purposes of this article we will use Ah, for consistency’s sake.

Colloquially, we can also refer to ampere-hours as amp hours, for short.

For practical purposes, Ampere-hours represent the number of hours worth of amperes that a battery can supply, at a predetermined voltage.

### An example to build intuition

Imagine we have a lightbulb rated at 2 A, or 2 Amperes. This rating means that it draws 2 amperes of electrical current while running, or 2 Ampere-hours every hour.

Now, imagine that we have a battery that is rated at 10 Ah, or 10 Ampere-hours. This rating means that the battery is able to provide a total of 10 ampere-hours of electrical current. This battery should be able to supply a 1 amp device with 10 hours of juice, or a 10 amp device with 1 hour of juice.

What about our 2 amp lightbulb? 10 Ah / 2 A = 5 hours of power. If fully-charged, the battery should be able to provide our lightbulb with 5 hours of life before becoming completely depleted.

### Using Ah

We can use the Ah of a battery to determine the hours of use it can supply to any of our devices. As in the example above, what we can do is:

- Take the total Amp hour (Ah) of the battery
- Take the device Amps (A)
- Take Amp-hour (Ah) and divide it by Amperes (A) to get the number of hours of use the battery can provide to the device

In mathematical terms:

```
h = Ah / A
```

or, hours of use (h) equals to Ampere-hours (Ah) divided by Amperes (A)
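As a quick sketch in Python, using the 10 Ah battery and 2 A lightbulb from the example above (and ignoring real-world losses):

```python
def hours_of_use(battery_ah, device_amps):
    """Hours a battery can power a device, under ideal conditions."""
    return battery_ah / device_amps

# 10 Ah battery powering a 2 A lightbulb:
print(hours_of_use(10, 2))  # 5.0 hours
```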

### A caveat

But using Ah as a measure of battery capacity comes with a caveat. Let’s have a quick reminder of what Ah represents:

> The number of hours worth of amperes that a battery can supply, **at a predetermined voltage**.

The problem with this measure is that for it to be useful, all appliances drawing power from the battery have to do so at the same voltage. If you are using multiple devices, and some devices use different voltages from others (for example, a 5V iPhone charger and a 100V toaster oven), Amp-hours (Ah) as a unit of comparison becomes moot.

If you intend to only use one device, or as long as the voltages of all devices connected to the battery are the same, this will not be an issue. But if not, there is a better unit for comparison.

## A more flexible measure: Watt-hours (Wh)

A Watt-hour is a unit of energy that accounts for the voltage component (V) of the equation.

We can look at the equation to derive Watt-hour (Wh) to understand why:

```
Wh = Ah X V
```

or Watt-hour (Wh) equals to Ampere-hour (Ah) multiplied by Voltage (V)

Another way to understand watt-hours is to draw on our understanding of Watts. From our refresher section, we know that Watts are used as a measure of the amount of work done per second. Watt-hours is basically the same thing scaled up to an hour, or the amount of work done per hour.
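As a minimal Python sketch (the 10 Ah and 12 V figures are hypothetical, just for illustration):

```python
def watt_hours(amp_hours, volts):
    """Energy in watt-hours (Wh) from capacity (Ah) and voltage (V)."""
    return amp_hours * volts

# A hypothetical 10 Ah battery operating at 12 V:
print(watt_hours(10, 12))  # 120 Wh
```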

## The more commonly used measure: Kilowatt-hours (kWh)

In practical terms, Watt-hour (Wh) as a measure is less commonly used than its larger counterpart, the Kilowatt-hour (kWh). This is because using watt-hours (Wh) leads to numbers that are inconveniently large at the scale power companies typically work with – powering homes and businesses. For example, the average American home used 886 kWh per month in 2021. Measuring that in Watt-hours (Wh) gives a value of 886,000 Wh, which can be quite the eyeful.

### Calculating Kilowatt-hours (kWh)

To convert Watt-hours to Kilowatt-hours, just divide by 1000 – as should be obvious from the example above.

```
kWh = Wh / 1000
```

or, Kilowatt-hours (kWh) equals to Watt-hours (Wh) divided by 1000.

Deriving it directly from Amp-hours (Ah) and Voltage (V), we do:

```
kWh = Ah X V / 1000
```

or, Kilowatt-hours (kWh) equals to Ampere-hour (Ah) multiplied by Voltage (V) divided by 1000.
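A minimal Python sketch of this conversion (the 100 Ah, 12 V battery is hypothetical):

```python
def kilowatt_hours(amp_hours, volts):
    """Capacity in kWh from amp-hours (Ah) and voltage (V)."""
    return amp_hours * volts / 1000

# A hypothetical 100 Ah battery operating at 12 V:
print(kilowatt_hours(100, 12))  # 1.2 kWh
```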

### Using kWh

We can use the Kilowatt-hour (kWh) capacity of a battery to determine how long it can supply a device with electricity *through a transformer*.

A transformer steps up or steps down the voltage being supplied to a device, in order to match the device’s voltage with the rest of the circuit. This is important because, as noted earlier, a device’s voltage has to match the input circuit’s voltage, or problems might occur due to too much or too little electrical pressure.

Assuming the use of a transformer has been handled, we can determine how long a battery can power a device by comparing the battery’s Kilowatt-hour (kWh) capacity against the device’s Kilowatt (kW) power draw. This we can do using the following steps:

- Determine the kWh capacity of the battery
- Determine the kW requirements of the device
- Divide the battery kWh by the device kW

#### Step 1: Determine the kWh of the battery

Using the equation `kWh = Ah X V / 1000`, we can calculate the total battery capacity.

Here we have to pay attention to something called the battery discharge curve. In short, different operating voltages can result in a higher or lower effective Amp-hour (Ah) rating, something that should be stated on the battery or in its operating manual. Some batteries provide only one rating, while others list several. If there are multiple, we have to decide which voltage we are most likely to draw power from the battery at, and use that voltage in the calculation above.

#### Step 2: Determine the power requirements of the device

Here we can use the simpler equation `kW = A X V / 1000`. Notice how the equations are mostly similar – only the time component (h) is removed. This gives you the power requirements in Kilowatts (kW).

All devices should come with a rated Amps (A) and Voltage (V). The complication is that they do not always draw the same amount of current. For instance, a microwave oven draws different amounts of power depending on whether it is used on high, medium or low, and on whether it is in standby mode. For such situations, the only advice we can give is to either (1) find an average amps value, or (2) directly use a Kilowatt (kW) estimate found through some online research.
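As a small Python sketch (the 10 A, 120 V microwave rating is hypothetical):

```python
def device_kw(amps, volts):
    """Power draw in kilowatts (kW) from rated amps (A) and volts (V)."""
    return amps * volts / 1000

# A hypothetical microwave rated at 10 A, 120 V:
print(device_kw(10, 120))  # 1.2 kW
```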

#### Step 3: Divide the battery kWh by the device kW

As the header says, do the division:

```
h = kWh of battery / kW of device
```

or, hours of use (h) equals to Kilowatt-hour capacity of the battery (kWh) divided by the Kilowatt requirement of the device (kW).
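A quick sketch in Python (the 2.4 kWh and 1.2 kW figures are hypothetical):

```python
def runtime_hours(battery_kwh, device_kw):
    """Hours of use from battery capacity (kWh) and device draw (kW)."""
    return battery_kwh / device_kw

# A hypothetical 2.4 kWh battery powering a 1.2 kW device:
print(runtime_hours(2.4, 1.2))  # 2.0 hours
```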

## Lead-acid vs Lithium-ion Batteries

There is something else to consider, concerning the *type* of battery used. There is a general distinction between two kinds of batteries, made from two different materials: Lead-acid and Lithium-ion.

There are a multitude of differences between these two types of batteries. These include variations in battery capacity decay over time, energy to weight ratios (lithium-ion is more efficient if weight of the battery is an issue), and price.

Battery technology is in truth a very complicated topic of study, with different types of batteries exhibiting different discharge cycles and behaviors, even within these two general categories.

What we are concerned with here, however, is a difference in how the two types of batteries respond to something called the depth of discharge. In simple terms, it refers to how much of a battery’s charge is used up before it is charged back to full. In other words, how far (read: deep, hence depth) the charge drops in percentage terms, before it is topped back up to full again.

The life of both types of batteries drops as the depth of discharge increases, but this drop is far more drastic in the case of lead-acid batteries.

Hence when choosing a battery, it is important to keep in mind a general rule: **whatever the calculated power capacity of a lead-acid battery is, halve it to get the actual usable capacity**. This is because, in general, you can only use a maximum of half the total capacity of a lead-acid battery before needing to charge it back up again. Doing otherwise would dramatically shorten your battery life and is not advised.

You can ignore this rule for lithium-ion batteries, which do not face the same constraints.

## Practical Steps to Determine the Usable Power Capacity of a Battery

Here is a simple set of steps to determine the usable capacity of a battery in Kilowatt-hours (kWh):

- Find the Ah or mAh of the battery
- Find out the power draw Voltage
- Multiply Ah by Voltage, then divide that by 1000, or
- Multiply mAh by Voltage, then divide that by 1000, then divide it again by 1000
- If the battery is a lead-acid battery, divide the resulting value by 2 to get the usable power capacity
- This gives you the actual power capacity in kWh

For those who want the above in equation form:

```
# Lithium-ion battery
kWh = Ah * V / 1000
kWh = mAh * V / 1000 / 1000
# Lead-acid battery (halve for usable capacity)
kWh = Ah * V / 1000 / 2
kWh = mAh * V / 1000 / 1000 / 2
```
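For those who prefer it as runnable code, here is a minimal Python sketch of the same steps (the 100 Ah, 12 V battery is hypothetical):

```python
def usable_kwh(amp_hours, volts, lead_acid=False):
    """Usable battery capacity in kWh; lead-acid is halved per the rule above."""
    kwh = amp_hours * volts / 1000
    return kwh / 2 if lead_acid else kwh

# A hypothetical 100 Ah, 12 V battery:
print(usable_kwh(100, 12))                  # 1.2 kWh (lithium-ion)
print(usable_kwh(100, 12, lead_acid=True))  # 0.6 kWh (lead-acid)
```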

Hope this helps!

## Bonus: Energy in Terms of Joules (J)

Actually, a more intuitive measure for energy would be the Joule (J), a unit that should be familiar to students of physics. In fact, each kWh is equivalent to 3.6 Megajoules (MJ for short), or 3,600,000 Joules.

The reason I say it is more intuitive is because if we use water as an analogy, joules are the conceptual equivalent of Litres (or Gallons, if you are American). To me at least, it seems like using one simple unit to measure capacity, as in the case of water, is better than a composite unit made up of other units. Imagine if we measured water capacity in terms of litres-per-hour hours. 🤯

However, for certain reasons, power companies prefer the kWh measure, and for our purposes, using kilowatt-hours also happens to make our computations more straightforward.