Machine Learning Techniques Are Key for Modern Grid Stability
The future US power grid will likely look much different than it does today. The integration of distributed energy resources (DER) such as solar, wind, and energy storage systems is expected to increase rapidly in the coming years. This shift away from the largely centralized grid systems of the past makes short-term power forecasting more complex, which in turn makes grid stability harder to maintain. But recent advances in machine learning can help.
With current forecasting techniques, utilities can largely predict the total power output of a DER system over the course of a day. These daily power outputs can be estimated relatively accurately using weather forecasts. However, short-term predictions (the minute-to-minute fluctuations within a day) are much harder to make. Instantaneous changes in cloud cover or wind speed cause the power output of solar or wind systems to swing. This presents a challenge for utilities: large short-term variations in power production can destabilize the grid if they are not accounted for.
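A small synthetic example can make this gap concrete. The sketch below (all numbers are illustrative assumptions, not measured data) models one day of solar output at 1-minute resolution with random cloud-induced step changes, then compares a weather-based day-ahead forecast against reality at two scales: total daily energy versus minute-by-minute output.

```python
import math
import random

random.seed(7)

MINUTES = 1440  # one day at 1-minute resolution

def clear_sky(t):
    """Idealized clear-sky output in MW; daylight runs from minute 360 to 1080."""
    if 360 <= t <= 1080:
        return 100.0 * math.sin(math.pi * (t - 360) / 720)
    return 0.0

# Actual output: clear-sky curve modulated by random cloud-cover step changes.
actual, cloud = [], 0.65
for t in range(MINUTES):
    if random.random() < 0.02:          # a cloud front passes roughly every 50 min
        cloud = random.uniform(0.3, 1.0)
    actual.append(clear_sky(t) * cloud)

# A day-ahead forecast: clear-sky curve scaled by the expected mean cloudiness.
forecast = [clear_sky(t) * 0.65 for t in range(MINUTES)]

# Daily energy error is small because cloud swings average out over the day...
daily_error = abs(sum(actual) - sum(forecast)) / sum(forecast)

# ...but the minute-level error during daylight stays large.
daylight = range(360, 1081)
minute_mae = sum(abs(actual[t] - forecast[t]) for t in daylight) / len(daylight)

print(f"error in total daily energy:        {daily_error:.1%}")
print(f"mean minute-level error (daylight): {minute_mae:.1f} MW")
```

Running this typically shows a single-digit-percent error in daily energy alongside a minute-level error of several megawatts, which is exactly the mismatch that makes short-term dispatch planning difficult.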
Algorithms Offer Stability through Real-Time Predictions
Although processing and analyzing massive amounts of complex DER data in real time seems like an impossible task, machine learning algorithms are up to the challenge. Using machine learning techniques, Australian researchers have trained an algorithm that improves wind and solar farms' 5-minute-ahead power output predictions by 45%. This increase in accuracy allows utilities to plan energy dispatch properly and increase grid stability, which in turn saves them money. The software, called PowerPredict, is now commercially available to wind farm operators. Similarly, researchers at Incheon National University in South Korea recently developed a hybrid algorithm that greatly increases the accuracy of power output predictions from PV systems at both short timescales and horizons up to 1 day.
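The details of these systems are not public in the article, but the basic idea behind learned short-term forecasting can be sketched with a minimal example: fit a small autoregressive (AR) model to recent output by least squares and compare its one-step-ahead predictions with a naive persistence forecast ("the next reading equals the current one"). Everything below is a simplified stand-in, the synthetic wind series, the AR order, and the error margins are assumptions, and the 45% figure is not reproduced.

```python
import math
import random

random.seed(0)

# Synthetic wind-farm output (MW): a slow oscillation plus measurement noise.
N = 2000
series = [50.0 + 40.0 * math.sin(2 * math.pi * t / 300) + random.gauss(0, 0.2)
          for t in range(N)]

P = 3  # number of lagged observations used as features

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# Train on the first half: features are the P previous readings plus an intercept.
split = N // 2
X = [[series[t - k] for k in range(1, P + 1)] + [1.0] for t in range(P, split)]
y = [series[t] for t in range(P, split)]

# Least squares via the normal equations: (X^T X) w = X^T y.
d = P + 1
XtX = [[sum(row[i] * row[j] for row in X) for j in range(d)] for i in range(d)]
Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(d)]
w = solve(XtX, Xty)

def ar_forecast(t):
    """One-step-ahead prediction from the fitted AR model."""
    return sum(w[k - 1] * series[t - k] for k in range(1, P + 1)) + w[P]

# Evaluate on the held-out second half.
test = range(split, N)
ar_mae = sum(abs(series[t] - ar_forecast(t)) for t in test) / len(test)
persist_mae = sum(abs(series[t] - series[t - 1]) for t in test) / len(test)

print(f"persistence MAE: {persist_mae:.3f} MW")
print(f"AR({P}) MAE:     {ar_mae:.3f} MW")
```

Because persistence is itself one member of the AR model's hypothesis class (weights 1, 0, 0), the fitted model can only match or beat it on the training objective; on this smooth series it beats it on the held-out half as well. Production systems replace this toy AR model with far richer learners and weather inputs, but the train-on-history, predict-minutes-ahead structure is the same.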
At the current level of DER on the US power grid, machine learning algorithms are not yet strictly needed to balance output. The many centralized power sources tied to the grid (such as nuclear and natural gas plants) provide stability through their synchronous generators. However, this does not mean that utilities should wait to begin incorporating machine learning algorithms into their wind and solar farms. These algorithms help utilities dispatch power produced by DER more accurately no matter how much DER capacity is attached to the grid, and they become more accurate as they are given more data. If utilities want their algorithms to be as accurate as possible by the time the grid depends on them, they should implement them in the near future and give them the opportunity to learn over time.