
Better real-time loss analysis tells utilities where DERs make sense

Updated: Jan 12, 2021


Consumers who flip on lights and power up the ever-growing number of devices in their living rooms are receiving electricity from a grid that has to generate roughly two watts for every watt delivered. Put another way, the Department of Energy estimates that more than 60 percent of the source energy used to generate electricity is lost before it is consumed in our homes. Other estimates claim even larger losses.


According to the EIA, distribution-system losses alone account for over $19 billion annually in the U.S., in real physical losses and Unaccounted-for Energy (UFE), costs that are passed on to all customers. Beyond the economic cost of such inefficiency, the negative environmental impact is substantial.

We must do better, and with the tools now available, we can.

Ameren Illinois and PowerRunner recently announced that Ameren’s real-time loss (RTL) analysis was accepted by the Illinois Commerce Commission as the new tariff methodology for quantifying and allocating technical losses and related Distributed Unaccounted-for Energy (DUFE) across voltage (loss) classes. This means the utility uses the results of the analysis to calculate the loss factors for every account based on the voltage level at which that account is served. The tariff went into effect on June 1, 2018.


The RTL analysis was developed on the PowerRunner Energy Platform, an application that joins disparate internal and external time-series data sources through a configurable data virtualization layer to create an enterprise utility business analytics platform.

The RTL methodology operates on 8760 hours of AMI and system load data to create hourly system-loss factors for each voltage class. Localized and real-time analyses of losses are further calculated by leveraging available upstream hourly SCADA data to determine loss factors by specific loss class, hour and circuit. This new method reduces the time it takes to perform a loss study from more than a year to seconds. In addition, it more accurately accounts for real losses and DUFE across the system.

The RTL analysis aggregates hourly service-point load forecast data by voltage (loss) class against the hourly system load to determine the loss factors per voltage class for every hour of the day. This bottom-up approach to creating hourly day-ahead system forecasts is more accurate and gives system operators greater situational awareness on every circuit.
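As a rough illustration of the comparison just described (not PowerRunner's actual implementation, whose details are proprietary), the hourly loss factor can be sketched as the gap between total system load and the metered load aggregated up from service points. All function and variable names here are illustrative assumptions:

```python
def hourly_loss_factors(system_load_kwh, aggregated_metered_kwh):
    """Hourly loss factors over a series of hours (e.g. all 8760 in a year).

    system_load_kwh        -- energy injected into the system each hour
    aggregated_metered_kwh -- sum of AMI-metered service-point loads for
                              the same hours (optionally per voltage class)

    Returns losses as a fraction of delivered (metered) energy per hour;
    the gap covers both real losses and unaccounted-for energy.
    """
    return [
        (system - metered) / metered
        for system, metered in zip(system_load_kwh, aggregated_metered_kwh)
    ]

# Example: injecting 108 kWh to deliver 100 kWh implies an 8 percent
# loss factor for that hour.
factors = hourly_loss_factors([108.0, 220.0], [100.0, 200.0])
```

In the tariff methodology described above, a factor of this kind is computed per voltage class, so each account's allocated losses reflect the voltage level at which it is served.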

Engineers have known about the physics of impedance and its effect on system losses for more than a century. The problem was that the losses could not be seen or accurately measured; the necessary data granularity and computing power simply did not exist until now.
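The physical relationship behind those losses is simple to state: resistive (I²R) dissipation grows with the square of the current, so losses climb steeply as a circuit's loading rises. A minimal numeric sketch, with illustrative values:

```python
def line_loss_kw(current_a, resistance_ohm):
    """Resistive losses on a conductor: P = I^2 * R, converted to kW."""
    return current_a ** 2 * resistance_ohm / 1000.0

# Doubling the current on the same conductor quadruples the loss:
light = line_loss_kw(100.0, 0.05)   # 0.5 kW at 100 A
heavy = line_loss_kw(200.0, 0.05)   # 2.0 kW at 200 A
```

This quadratic relationship is why the hourly, circuit-level visibility the RTL analysis provides matters: losses during peak-load hours are disproportionately large.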

Spotting Higher than Normal Losses and Finding DER Locations

For utilities, this data, once analyzed and acted on, can help identify areas of the grid with higher-than-normal losses. Losses on system equipment, such as transformers, increase as loading on these assets increases, making heavily loaded circuits good candidates for targeted DER initiatives. Metered loads on secondary residential circuits can average a loss factor of 7-8 percent.

More localized loss analysis may show that some circuits have higher losses than others, which increases the value and benefit of distributed energy resources (DER) on those circuits. In other words, if a local secondary circuit has a real loss factor of 10 percent during hours of higher-than-normal loading, then a 5-kW solar array could displace the need for 5.5 kW of centralized generation capacity.
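The arithmetic in that example can be made explicit. Under the simple convention that the loss factor is applied as a multiplier on energy delivered to the circuit (the convention the 5 kW / 5.5 kW figures imply), every kW generated locally avoids (1 + loss factor) kW of centralized generation. The function name is an illustrative assumption:

```python
def displaced_central_kw(local_der_kw, loss_factor):
    """Centralized generation capacity avoided by local DER output,
    assuming the loss factor is a multiplier on delivered energy."""
    return local_der_kw * (1.0 + loss_factor)

# 5 kW of local solar on a circuit with a 10 percent loss factor
# displaces roughly 5.5 kW of centralized capacity.
avoided = displaced_central_kw(5.0, 0.10)
```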

With this analytic capability, we can now analyze losses across the system and more accurately value DER on local circuits.


