How utilities are using data analytics to improve storm response
With the growing severity of weather events, it’s crucial that utilities continue to evolve their storm-response practices to maintain the high levels of reliability customers demand. Utilities can use data science to mitigate the effects of storms on their systems, but knowing where and how to start can be intimidating.
Last month, E Source joined Alabama Power and PPL Corp. at DISTRIBUTECH International to discuss and share learnings on harnessing data science to address the challenge of storm response and severe weather events in their service territories.
Let’s dive into some key moments from the discussion highlighting the results both Alabama Power and PPL Corp. are seeing from their efforts.
Take a three-tier approach
The panel discussion covered what utilities should prioritize and where they should begin when establishing a data-driven approach to storm response.
The panelists recommended a three-tier approach to understand and mitigate the negative consequences of severe weather.
Look to the past. Conducting a retrospective analysis of past storm events can help utilities understand and quantify the impact of weather and identify conditions that are increasing their risk of damage.
Forecast, forecast, forecast. Predictive modeling can forecast outages, helping utilities deploy resources more accurately and optimize mutual-assistance requests. Combining predictive results with retrospective results can improve forecasting accuracy by 20%.
Plan for restoration. Scenario-plan the impact of your estimated time of restoration (ETR) when adjusting resource allocation, and drill down to a better-informed ETR to communicate internally and with customers.
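The three tiers above can be sketched in a few lines. This is a minimal illustration, not a utility model: the storm records, the wind baseline, and the per-crew repair rate are all hypothetical numbers chosen for the example.

```python
# A minimal sketch of the three-tier approach. All data and rates
# below are hypothetical, for illustration only.
from statistics import mean

# Tier 1: look to the past — historical storms with peak wind (mph) and outages
history = [
    {"wind": 35, "outages": 120},
    {"wind": 50, "outages": 480},
    {"wind": 62, "outages": 1100},
    {"wind": 45, "outages": 300},
]

# Quantify impact: average outages per mph of wind above an assumed 30 mph baseline
rate = mean(h["outages"] / (h["wind"] - 30) for h in history)

# Tier 2: forecast — apply the retrospective rate to an incoming storm
def forecast_outages(predicted_wind, baseline=30):
    return max(0.0, predicted_wind - baseline) * rate

# Tier 3: plan for restoration — crude ETR from crew count and an assumed repair rate
def estimate_etr_hours(outages, crews, repairs_per_crew_hour=3):
    return outages / (crews * repairs_per_crew_hour)

expected = forecast_outages(55)
print(f"{expected:.0f} expected outages")
print(f"{estimate_etr_hours(expected, crews=20):.1f} hours to restore")
```

The point of the sketch is the feedback loop: the retrospective rate (tier 1) feeds the forecast (tier 2), and the forecast feeds the ETR scenario (tier 3), so improving any tier improves the ones downstream.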
Consider data accuracy and trust
Shane Powell, data analytics and innovation manager at Alabama Power, focused on the need to solve one problem first and to think carefully about the organization of your data for this initial problem.
“When we built our first [predictive] model, we had to think very carefully about the data. How do you structure it? Where do you put it?” Powell said. “Because the second model is built off of the first one, it’s really important to document what you’re doing, and as you’re building subsequent models, things will get faster and faster at delivering value.”
This is a critical first step to obtaining value from predictive modeling: developing a strong foundational structure and a data validation model. As the saying goes, garbage in, garbage out! If the information fed into a model is poor quality, the results will likely be poor quality too.
Models also deliver value only when their users actually use and trust them. Ben Spanswick, director of data science and machine learning at PPL Corp., emphasized this challenge in the context of the company’s recent acquisition of Rhode Island Energy. Answering an audience question about the change management involved in implementing new technology, Spanswick said:
Each one of these operating companies is in a different spot as to how they respond to storms currently and how they want to respond to storms. They initially had a weather model and they didn’t trust it; they hadn’t seen it play out so they’re just starting to see results from it. Switching from an expert that you’ve relied on for the last 10 years to a model can be a scary thing. It takes time to build that trust. No matter how good your technology is or how good your team management is, if they don’t intersect then it’s not going to work.
Solve the challenge of uncertainty with data science
Powell and Spanswick are both working with Kyle Decker, director of data science at E Source, to implement new storm-response technology for their respective utilities. In the case of PPL Corp., E Source is supporting the rollout of a new predictive storm solution to forecast outages.
One of the key challenges for this model is the uncertainty of the weather itself. Spanswick highlighted the issue with a recent weather report that evolved from predicting just 1 inch of snow overnight to 8 inches. In this case, the new models caught the change and its impact on PPL Corp.’s system. As Spanswick wryly acknowledged, “We’re never going to be 100% perfect.”
One of PPL Corp.’s goals for the new model was to account for the uncertainty inherent in predictive data science. To address that uncertainty, PPL Corp. is working with E Source to develop a range of possibilities: a central prediction flanked by deliberate underprediction and overprediction across different scenarios.
Spanswick said:
We never want to miss the ‘big one’ and that’s ultimately why we made the decision to overpredict. This is an example of how a business goal can be incorporated into a model. We might not always get it perfect, but when we don’t, we’re still always prepared.
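One common way to produce this kind of prediction range is to take empirical quantiles of an ensemble of scenario forecasts: a low quantile as the underprediction, the median as the central forecast, and a high quantile as the overprediction to staff against. The sketch below assumes hypothetical ensemble outputs; the article doesn’t specify how PPL Corp.’s range is actually computed.

```python
# A minimal sketch of turning scenario forecasts into a prediction range
# via empirical quantiles. The ensemble values are hypothetical.

def quantile(sorted_vals, q):
    """Linearly interpolated empirical quantile of a pre-sorted list."""
    idx = q * (len(sorted_vals) - 1)
    lo = int(idx)
    hi = min(lo + 1, len(sorted_vals) - 1)
    frac = idx - lo
    return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac

# Outage forecasts from an ensemble of weather scenarios for one storm
scenarios = sorted([310, 420, 390, 540, 860, 450, 500, 470, 610, 380])

band = {
    "underpredict (P10)": quantile(scenarios, 0.10),
    "central (P50)": quantile(scenarios, 0.50),
    # Staffing to a high quantile encodes the business goal of
    # never missing the "big one" directly into the model output.
    "overpredict (P90)": quantile(scenarios, 0.90),
}
for label, value in band.items():
    print(f"{label}: {value:.0f} outages")
```

The design choice Spanswick describes lives in which quantile drives resourcing: picking P90 rather than P50 trades occasional over-staffing for being prepared when the worst scenario materializes.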
E Source is working with both Alabama Power and PPL Corp. to deploy storm models specific to their needs. If you’d like to learn more about the models or their performance, contact us to connect with an E Source representative.