Why software, data and AI are shaping the future of an increasingly resilient and sustainable power industry
Exhibitor lists at events like this year’s DISTRIBUTECH conference show just how pivotal technology has become in the power industry’s drive to build resilience. Emerging external threats and the growing need for interoperability are changing the industry and its approach to data. It’s a fascinating time to be a small part of the collaborative innovation that’s starting to sweep through a sector previously known for its resistance to change.
Plugging into the power of data
At the organizational level, utility companies have been talking about IT/OT integration for some time, and many have made good strides forward, underpinning their transformations with hybrid cloud and machine-learning-focused AI. Today, however, more advanced generative AI and large language models are driving new discussions around what would be possible if we had guardrails in place to bring data together differently and overhaul operational processes.
Meanwhile, the huge task of decarbonizing our grid requires the entire industry to think differently about how we generate power, share capacity and manage planning. Devastating events like the recent storms in Texas have given us a glimpse of what could happen if we don’t have robust mitigation plans in place and more tools at our disposal.
Building resilience on every level
Fundamentally, assets – be they physical or tech-based – are the enablers of how we deliver cheap, reliable power. More than that, they’re a key driver of energy security at the country level and critical enablers of the energy transition.
As extreme weather, sabotage and cyberattacks become increasingly common, utilities are using technology to help protect their assets, as well as to predict and respond to changing demographic patterns. This requires long-term planning as well as risk mitigation strategies – both of which technology can help with.
Asset management tools, for example, can interpret demographic data to help clients anticipate where a transformer might have issues down the line. To prepare for weather events, meanwhile, there are solutions to help utilities plan their response, position inventory and crews more quickly, and foresee where the impact on the grid might be.
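To make this concrete, the idea of anticipating transformer issues can be sketched as a simple risk score. This is an illustrative toy only: the weights, field names and the 40-year service-life assumption are hypothetical, not drawn from any IBM product.

```python
# Illustrative sketch: a toy risk score combining current utilization,
# asset age and projected load growth from demographic data.
# All weights and thresholds here are hypothetical assumptions.

def transformer_risk(age_years: float, load_growth_pct: float,
                     rated_capacity_kva: float, peak_load_kva: float) -> float:
    """Return a 0-1 score; higher means earlier attention is warranted."""
    utilization = peak_load_kva / rated_capacity_kva   # current stress
    aging = min(age_years / 40.0, 1.0)                 # assumed ~40-year life
    growth = min(load_growth_pct / 20.0, 1.0)          # demographic pressure
    return round(0.4 * utilization + 0.35 * aging + 0.25 * growth, 3)

fleet = [
    {"id": "TX-101", "age": 32, "growth": 12.0, "rated": 500, "peak": 460},
    {"id": "TX-202", "age": 8,  "growth": 3.0,  "rated": 750, "peak": 300},
]
at_risk = [t["id"] for t in fleet
           if transformer_risk(t["age"], t["growth"], t["rated"], t["peak"]) > 0.7]
print(at_risk)  # ['TX-101']
```

In practice, a utility would calibrate such a model against historical failure data rather than fixed weights; the point is simply that demographic and asset data, once joined, can rank where to look first.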
Investing in long-term storage
When it comes to mitigation planning, storage is a hot topic of conversation. Just as the cloud industry has been able to build data centers in 40-foot containers and sink them in the ocean for cheaper and cleaner cooling, why not put large-capacity, long-term storage on trains or trucks so it’s available during devastating outages? Cities like Houston are already looking at building these kinds of mitigation options, while the technology industry is fast-tracking new solutions for longer-lasting, more compact storage.
Building resilience through grid connectivity is another important theme: in the case of major weather events, greater offshore connectivity would offer greater resiliency. AI can again help here, automating switching, load balancing and overall grid management.
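The automated load balancing mentioned here can be pictured with a deliberately minimal sketch: if a feeder exceeds a loading threshold, transferable load is switched to the least-loaded feeder. The data model and the 0.9 per-unit limit are hypothetical; real grid management involves topology, protection and power-flow constraints far beyond this.

```python
# Minimal, hypothetical illustration of automated load balancing.
# feeders maps feeder id -> per-unit loading; anything above `limit`
# has its excess transferred to the least-loaded feeder.

def rebalance(feeders: dict, limit: float = 0.9) -> list:
    """Return a list of (from_feeder, to_feeder, amount) switch actions."""
    actions = []
    for fid, load in sorted(feeders.items(), key=lambda kv: -kv[1]):
        if load > limit:
            target = min(feeders, key=feeders.get)  # least-loaded feeder
            transfer = load - limit                 # shed just the excess
            feeders[fid] -= transfer
            feeders[target] += transfer
            actions.append((fid, target, round(transfer, 2)))
    return actions

print(rebalance({"F1": 1.05, "F2": 0.55, "F3": 0.70}))  # [('F1', 'F2', 0.15)]
```

The value of automation here is speed: decisions like these have to happen in seconds during a storm, faster than manual dispatch allows.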
Embracing data-enabled digitization
As software becomes an increasingly important driver of the industry, we can expect to see utility providers embracing a progressive ‘platformisation’ of vertical processes to enhance their efficiency and resilience.
For this, they need data they can trust, access, and share across what were previously vertical silos. What might this look like? Well, we know that AMI 1.0 was driven primarily by meter-to-cash, but what if we could integrate discrete AMI data into the outage management system and start to benefit from edge data in systems where it was never previously viewed?
We’re talking about creating data visibility across the entire organization and then asking questions of that new capability, such as: what if, in an outage, I could look at individual households to locate the issue? All that requires is a digital layer that enables data to be shared and interpreted to drive an outcome.
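The household-level outage question can be sketched simply: correlate which smart meters have gone dark with the feeder sections they sit on. The data model and function names below are illustrative assumptions, not an actual outage-management-system API.

```python
# Hedged sketch: using per-meter AMI status to narrow an outage to a
# feeder section. A section is suspect if every meter on it has gone dark.
# Meter/section layout here is invented purely for illustration.

meters = [
    {"id": "M1", "section": "S1", "online": True},
    {"id": "M2", "section": "S2", "online": False},
    {"id": "M3", "section": "S2", "online": False},
    {"id": "M4", "section": "S3", "online": True},
]

def suspect_sections(meters: list) -> list:
    """Flag feeder sections where no meter is still reporting."""
    by_section = {}
    for m in meters:
        by_section.setdefault(m["section"], []).append(m["online"])
    return sorted(s for s, states in by_section.items() if not any(states))

print(suspect_sections(meters))  # ['S2']
```

This is exactly the kind of question that becomes answerable once metering data, previously confined to billing, is visible to the outage management side of the organization.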
The importance of quality collaboration
For utility companies, reliably delivering against rapidly mounting expectations is a bigger job than it ever was. And it can’t be achieved in isolation. As AI discussions drive a different perception of what is achievable and how much can be anticipated and prepared for, customers need to rapidly develop advanced new capabilities. And that’s where collaboration with specialist vendors like IBM comes in.
Our long-term relationships with utility customers often start with our Maximo asset management software, and then move on to include tools that support process automation, data governance and security. We offer an extensive portfolio of technology, a skilled team of consultants and a broad range of services, but in the end it’s how we work with clients and industry partners that will drive real change across the industry. This won’t be done by parachuting in and recommending a wholesale move to the cloud, or by embedding AI just for the sake of it.
In an industry built on a safety-first mantra, trust is a big-ticket item. That’s why IBM’s approach to AI combines open, secure access to the best open-source models with fit-for-purpose enterprise solutions, rather than relying on a single model. The energy industry needs to protect proprietary data and IP, deploy in multiple environments, and be supported with tools to mitigate risks. At IBM, we prioritize AI you can trust: watsonx.governance tracks data, curation methods, and models, enabling AI that can be updated to meet evolving business and regulatory requirements.
By Casey Werth, Global Industry GM, Energy & Public Sector, IBM Technology