With rapidly growing DER penetration and distribution-grid IoT deployment, system loss analysis has become both more important and more complex. Because of limited available data, traditional system loss analysis has relied heavily on extrapolated forecasts based on data from a small number of grid components operating at peak demand. These forecasting exercises produce little more than ballpark estimates, and a typical system loss study of this kind can take several months to years and cost several hundred thousand dollars or more.

As the DER model continues to evolve, grid planners will require granular locational and temporal load and generation data to support faster, more accurate system loss analysis. The PowerRunner™ On Demand System Loss Analysis Application allows utilities to run analyses over tens of thousands of hours of service-point AMI or profiled load data and system load data to produce hourly system loss factors for each voltage class. Localized, near-real-time loss analysis is then performed by leveraging upstream hourly SCADA data to determine loss factors by loss class, hour, and circuit.
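The core of a circuit-level hourly loss calculation can be sketched simply: for each circuit-hour, losses are the difference between the energy supplied into the circuit (from upstream SCADA measurements) and the energy consumed at downstream service points (from AMI interval data). The sketch below is illustrative only and assumes pre-aggregated MWh values keyed by circuit and hour; the function name and data layout are hypothetical, not the vendor's actual API.

```python
def hourly_loss_factors(scada_mwh, ami_mwh):
    """Compute hourly loss factors per circuit.

    scada_mwh: {(circuit, hour): MWh delivered into the circuit (SCADA)}
    ami_mwh:   {(circuit, hour): summed service-point consumption (AMI)}

    Loss factor = (supplied - consumed) / supplied for each circuit-hour.
    Circuit-hours with no recorded supply are skipped.
    """
    factors = {}
    for key, supplied in scada_mwh.items():
        consumed = ami_mwh.get(key, 0.0)
        if supplied > 0:
            factors[key] = (supplied - consumed) / supplied
    return factors

# Illustrative example: one circuit, two hours
scada = {("CKT-1", 0): 10.0, ("CKT-1", 1): 8.0}
ami = {("CKT-1", 0): 9.4, ("CKT-1", 1): 7.6}
print(hourly_loss_factors(scada, ami))
```

In practice the same calculation would be rolled up by voltage class and loss class rather than reported only per circuit, but the supplied-minus-consumed structure is the same at every level of aggregation.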

This new dynamic approach can reduce the cost and duration of system loss studies to a fraction of those of traditional static analytical approaches.

Additional cost savings are realized by eliminating the need for repeated future studies and by providing the ability to create localized hourly loss analyses that support Diversion Detection, DER integration, and Asset Health Analysis.
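As one example of how localized hourly loss factors feed downstream use cases such as Diversion Detection, a simple screen can flag circuit-hours whose loss factor materially exceeds an expected baseline. The function, threshold values, and data below are hypothetical illustrations, not the application's actual detection logic.

```python
def flag_possible_diversion(loss_factors, baseline=0.05, tolerance=0.03):
    """Return circuit-hours whose loss factor exceeds baseline + tolerance.

    loss_factors: {(circuit, hour): loss factor as a fraction}
    Sustained excess losses on a circuit can indicate energy diversion
    (theft) or degrading equipment; thresholds here are illustrative.
    """
    limit = baseline + tolerance
    return sorted(key for key, lf in loss_factors.items() if lf > limit)

# Illustrative example: CKT-1 hour 1 shows anomalously high losses
factors = {("CKT-1", 0): 0.04, ("CKT-1", 1): 0.12, ("CKT-2", 0): 0.05}
print(flag_possible_diversion(factors))  # → [('CKT-1', 1)]
```

A production implementation would compare against circuit-specific historical baselines and require persistence across many hours before raising an alert, but the principle is the same: granular, localized loss factors make anomalies visible that a system-wide annual loss figure would hide.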