How Demand Charges Are Calculated

Consumption is measured in kilowatt hours (kWh), and demand is measured in kilowatts (kW). To see how this distinction applies to your energy use, compare the two examples below. When demand is higher (i.e., the same kilowatt hours are drawn at a higher intensity over a shorter period), demand charges are higher than when the same total consumption is spread over a longer period at a lower intensity. In the examples, overall consumption is identical for both companies, but the amount each pays differs because of demand charges.
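Put simply, a bill under this kind of rate structure has two parts (fixed fees and time-of-use pricing, which vary by utility, are ignored here):

Total charge = (kWh consumed x electricity rate) + (peak kW x demand rate)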

Let’s assume these rates apply to both companies:

Electricity charge = $0.0437 per kWh

Demand charge = $2.79 per kW

Example 1: Company A runs a 50 megawatt (MW) load continuously for 100 hours.

50 MW x 100 hours = 5,000 megawatt hours (MWh)

5,000 MWh = 5,000,000 kWh

Demand = 50 MW = 50,000 kW

Consumption: 5,000,000 kWh x $0.0437 = $218,500

Demand: 50,000 kW x $2.79 = $139,500

Total: $358,000

Example 2: Company B runs a 5 MW load for 1,000 hours.

5 MW x 1,000 hours = 5,000 MWh

5,000 MWh = 5,000,000 kWh

Demand = 5 MW = 5,000 kW

Consumption: 5,000,000 kWh x $0.0437 = $218,500

Demand: 5,000 kW x $2.79 = $13,950

Total: $232,450
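If it helps to check the arithmetic, here is a minimal Python sketch of the same calculation. The rates are the ones assumed above; the function name, the constant-load assumption, and the treatment of peak demand as equal to the running load are simplifications for illustration.

```python
# A minimal sketch of the calculation above, assuming the same two rates
# and a constant load. Function and variable names are illustrative.

ELECTRICITY_RATE = 0.0437  # $ per kWh consumed
DEMAND_RATE = 2.79         # $ per kW of peak demand

def total_charge(load_mw: float, hours: float) -> float:
    """Total bill for running a constant load of `load_mw` MW for `hours` hours."""
    demand_kw = load_mw * 1_000          # 1 MW = 1,000 kW; constant load, so peak = load
    consumption_kwh = demand_kw * hours  # kW x hours = kWh
    return consumption_kwh * ELECTRICITY_RATE + demand_kw * DEMAND_RATE

print(f"Company A: ${total_charge(50, 100):,.0f}")   # Company A: $358,000
print(f"Company B: ${total_charge(5, 1_000):,.0f}")  # Company B: $232,450
```

Running it reproduces both totals, and varying the load and hours while holding their product (total kWh) fixed shows that the demand charge alone drives the difference between the two bills.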

Depending on your rate structure, peak demand charges can represent up to 30% of your utility bill. Certain industries, such as manufacturing and heavy industrial operations, typically experience much higher peaks in demand, driven largely by the start-up of energy-intensive equipment, which makes reducing this charge even more imperative. Regardless of your industry, however, taking steps to reduce demand charges will save money.