Data as key driver for the Energy Transition

Written by Dario Incorvaia and Nikolai Demydov

In the last decade, the proliferation of smart and IoT devices and the intensification of digitization across many industries have enabled the generation and exchange of very large amounts of data. Every day more than 2.5 quintillion bytes are created, and by 2025 there will be 175 zettabytes of data available in the world. The power and electrical grid industry is also affected by this phenomenon, even though critical infrastructure is a more conservative market than most. With so much data available, it becomes important to understand how to leverage access to it and how to share it.

From DERs (Distributed Energy Resources) to DDERs (Distributed Data and Energy Resources)

Is Big Data management a new topic in the power industry? Not really. If you think about it, we have been sharing large amounts of data for a long time, but there are now new drivers that make data a key enabler for the energy transition too. In the past, data sharing happened mostly bilaterally and in isolation: two systems exchanged data in some form or shape to implement a specific, one-off use case. In that context, data streams were of secondary importance, coming after the functions of the systems themselves (an OT-based approach).

What is happening now is different, and it is driven by Distributed Energy Resources (DERs). DERs are a reality today and are expected to grow more than sevenfold by 2030 (compared to 2020), changing the paradigm of the power grid through the decentralization of power generation (e.g. PVs on roofs, offshore wind turbines, …) and resulting in bi-directional power flows (e.g. power generated in a local district being made available to the main grid). This also implies that, along the value chain of the power grid, more actors produce and consume data, enabling new end-to-end use cases that make the grid more efficient and resilient. In that sense, not only energy resources are distributed: the related data are too.

Re-thinking systems around interoperability, modularity, and openness

To extract the maximum value from energy and data decentralization, OT/IT architectures must be re-designed with a broader strategy based on interoperability, modularity, and openness. It is no longer realistic for vendors to supply all elements of a system without allowing collaboration with other companies. Customers, too, don't want to be locked in with a single provider; they expect to involve internal and external departments to tailor solutions to their specific needs.

Interoperability, modularity, and openness apply both between different actors of energy systems (e.g. TSOs, DSOs, aggregators, retailers, IPPs, …) and between different departments of the same company (e.g. planning, operations, maintenance, …).

Let’s look at some examples:

  • Time series from smart meters, managed by Meter Data Management (MDM) systems, can be shared with planning tools to run simulation studies based on real consumption data rather than static models. As a result, the higher accuracy of planning allows better investment decisions. In this case, the "billing" department of a utility collaborates with the "planning" department thanks to two software systems exchanging data (a minimal sketch of this data flow follows the list below).
  • The number of intraday/day-ahead market transactions is constantly growing, and ever smaller generation units are being incorporated into ancillary and flexibility services. This requires seamless data exchange between the relevant market players: flexibility markets work around the concept of sharing hosting capacity across private and public participants of the (micro)grid, and require knowing when each participant can change its behavior.
  • EV drivers’ habits can be used by DSOs to understand how the additional loads can be managed (leveraging the potential of EVs for demand-side flexibility) and what impact they will have on the grid.
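
To make the first example more tangible, here is a minimal Python sketch of how meter time series might flow from an MDM export into a planning study. The file name, column names, and aggregation choices are purely illustrative assumptions, not the interface of any specific MDM product.

```python
import pandas as pd

# 15-minute consumption readings exported from a (hypothetical) MDM system;
# columns assumed: timestamp, meter_id, kwh
readings = pd.read_csv("mdm_export.csv", parse_dates=["timestamp"])

# Aggregate to hourly energy per meter, then take the observed peak:
# a realistic planning input that replaces static standard load profiles
hourly = (
    readings.set_index("timestamp")
            .groupby("meter_id")["kwh"]
            .resample("1h")
            .sum()
)
peak_load_kwh = hourly.groupby("meter_id").max()

# A planning tool would now run its power-flow studies against these
# measured peaks instead of static design values
print(peak_load_kwh.head())
```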

Roadblocks to generating value from data

More data available does not automatically mean more value; on the contrary, it can even have a negative impact, for instance by overloading legacy systems.

Data generate value only if they can be interpreted and efficiently used by the final recipients, whether human beings or machines.

A big issue in the power industry today is that data are available in exponentially growing quantities but often in low quality (e.g. partial information, heterogeneous granularity, …), not to mention that those data are scattered across multiple systems. Also, while there are standards like CIM for data exchange, there is not yet a common ground of homogeneous ontologies that would enable seamless data modeling.

Ownership of data is also a topic. Surely the primary owner of raw data is the owner of the assets themselves, and this entity needs to keep control over them. But system manufacturers and suppliers also claim rights to access data, especially data resulting from their algorithms, which ultimately represent their IP. In this area, legislation and standardization will play a fundamental role: the EU Data Act and the EU Action Plan on Digitalising the Energy System, released in 2022, paved the road for governing this matter.

Regulators will play an important role in pushing digitization and SaaS business models, favoring OPEX over CAPEX. SaaS business models support the introduction of software solutions with a higher speed of innovation and faster deployment, and therefore faster utilization. The energy transition, in fact, requires a new pace that only software can match.

The Operational Technology (OT) of power utilities is often built on aging architectures. Big data handling, however, now requires cloud-native models, deployed on elastic infrastructures, that enable near real-time streaming of data as well as the handling of massive batches for new usages, such as training and operating deep learning algorithms. This presumes an acceleration of OT-to-IT (Information Technology) convergence, building the relevant skills and capabilities, and applying data governance policies when moving data between the demilitarized zone (where deterministic control functions are executed) and hyperscalers' clouds.

Technology and culture going hand in hand

To enable a more effective utilization of data, technology and culture must go hand in hand. One alone will not do the trick.

Technology alone will not suffice to reach the goals. Technology needs to follow a purpose, and companies must define that purpose and embrace it as part of their culture.

Technology: to master the growing complexity of data and overcome the obstacles of quality and availability, we see that Artificial Intelligence (AI), digital twins, data architecture, and new business models in particular will play a crucial role.

  • Artificial Intelligence will help very concretely in 1) making decisions faster and more accurately by using more information and learning from past experience (e.g. early identification of situations that could trigger protection systems and cause outages), and 2) enriching data by filling the gaps (e.g. providing visibility, especially at the low end of the grid, by aggregating low-quality data sources). For example, state estimation for an electrical grid based on traditional algorithms takes about 5 minutes for a large grid - clearly too long for the highly volatile processes in our energy system. State estimation based on deep learning takes only a few microseconds, enabling near real-time self-healing functions (a minimal sketch of the idea follows the Fingrid example below).
  • The Electrical Digital Twin has the power to derive insights out of a jungle of unstructured and uncoordinated data. Although the physical grid is the same for everybody, its informational representation differs case by case and does not always match reality. To create a digital representation of a grid, it is essential to put in place processes that facilitate the gathering of information from multiple sources (e.g. asset management, GIS, meters, SCADA, …) and its maintenance over time. Traditionally this task required great effort from utilities, both in manpower and time, which is why smart Grid Model Management solutions are essential. The business impact of digital twins is the unlocking of data that can be made available to more systems, near real-time, creating new use cases and delivering on the promise of interoperability.

Example: Fingrid, the Finnish transmission system operator, was suffering from the effort of collecting data, and partly losing the value that could come out of it. By implementing an efficient digital twin that consistently organizes their assets, they can now focus on analyses and studies of the grid's behavior. With the digital twin, Fingrid can make investment plans up to 25 years ahead, running many scenarios that also consider changes to policy frameworks.
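
To illustrate the idea behind deep-learning-based state estimation mentioned above, the following toy sketch trains a small neural network to map measurements directly to grid states, so that online estimation collapses to a single forward pass. The two-variable "grid", the measurement equations, and the network size are invented for illustration only; a real estimator works on thousands of buses and physics-based power-flow equations.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def measure(x):
    # Stand-in for physics-based power-flow equations relating grid
    # states (voltage magnitude v, angle theta) to measurements (p, q)
    v, theta = x[..., 0], x[..., 1]
    p = v * np.sin(theta)
    q = v * np.cos(theta) - 1.0
    return np.stack([p, q], axis=-1)

# Offline: build a synthetic training set of (measurement, state) pairs
states = np.column_stack([
    rng.uniform(0.9, 1.1, 5000),   # voltage magnitudes (p.u.)
    rng.uniform(-0.3, 0.3, 5000),  # voltage angles (rad)
])
measurements = measure(states) + rng.normal(0.0, 0.01, (5000, 2))  # sensor noise

# Offline: train the surrogate estimator (measurements in, states out)
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000, random_state=0)
model.fit(measurements, states)

# Online: estimation is now a single forward pass instead of an
# iterative weighted-least-squares solve
z_new = measure(np.array([[1.02, 0.10]]))
print(model.predict(z_new))  # approximately [1.02, 0.10]
```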

  • Data architecture: the decentralization of energy resources comes with the decentralization of data. In recent years the concept of the Data Mesh has been gradually replacing that of the Data Lake, especially in environments that require decentralization and federation in addition to scalability. Data Mesh architectures are built on the philosophy of treating data as a product, distributed across different nodes of the network, and giving each Data Product Owner the authority to organize and store them. A layer on top provides a catalogue of data addresses to facilitate search and access. The Data Mesh leverages modularization and supports collaboration without creating bottlenecks (see the sketch after this list).
  • New business models: the domination of CAPEX-intensive perpetual license models is coming to an end. SaaS offers customers a lot of operational flexibility, enabling them to scale the underlying platform according to current and future business needs and data volumes.
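
As a minimal sketch of the Data Mesh principle described above (data treated as a product, with a thin catalogue of addresses on top), consider the following Python example. All names, fields, and endpoints are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DataProduct:
    name: str        # e.g. "meter-timeseries"
    owner: str       # the domain team accountable for quality
    endpoint: str    # where the product is served (API, topic, bucket)
    schema_ref: str  # pointer to its published schema/ontology (e.g. a CIM profile)

class Catalogue:
    """Federated lookup layer: stores addresses, never the data itself."""
    def __init__(self):
        self._products: dict[str, DataProduct] = {}

    def register(self, product: DataProduct) -> None:
        self._products[product.name] = product

    def discover(self, name: str) -> DataProduct:
        return self._products[name]

catalogue = Catalogue()
catalogue.register(DataProduct(
    name="meter-timeseries",
    owner="metering-domain",
    endpoint="https://example.internal/api/meter-timeseries",
    schema_ref="cim:MeterReading",
))

# A planning tool discovers the product and talks to the owning domain
# directly: no central bottleneck, and ownership stays decentralized
product = catalogue.discover("meter-timeseries")
print(product.owner, product.endpoint)
```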

Culture: technology alone will not suffice to reach the goals. Technology needs to follow a purpose, and companies must define that purpose and embrace it as part of their culture. Creating a culture that fosters data access and sharing will push companies, both utilities and suppliers, to become more open and collaborative. Open means providing tools and APIs that allow 3rd-party systems, including those from competitors, to access data and create new applications. Collaborative means enabling the use of systems by different personas within the same company (e.g. the planning department accessing, with proper UX, the same system used by grid operators), but also enabling partnerships with external companies. These cultural shifts are difficult to achieve without the support of senior management, because they require changing the company philosophy from "I do it all - silos - monolithic" to "I am part of an ecosystem where other actors can play a role".

Also important to mention, as part of the culture, is the digitalization of the workforce. Highly digitally enabled solutions are only effective if people can interact with them in a new way, no longer as they did in past decades with traditional OT tools.

Key takeaways

  • Effective handling of data represents a huge opportunity to transform any industry, including conservative ones like the power and electrical grid
  • DERs have a big impact on data generation and flow along the power grid value chain
  • Interoperability, modularity, and openness are fundamental pillars for designing software systems that can accelerate the energy transition
  • Regulators play an important role in unlocking digitization by favoring OPEX over CAPEX financial models and pushing SaaS-based software solutions
  • Technology, such as artificial intelligence, digital twins, and data mesh, needs to be fueled by cultural change and the digitalization of the workforce


Our philosophy at Siemens Grid Software

In May 2022, the Siemens Grid Software division publicly announced its vision and new mindset, the result of long experience in the energy grid and software domains. This vision is fully built on the importance of "data access and sharing" as the fundamental pillar enabling 5 design principles:

  1. Interoperability: thinking of every application in terms of its role in supporting customers' end-to-end processes and how it fits into the overall IT/OT landscape. Interoperability creates the foundation for data exchange within a company and between market players (e.g. among TSOs, DSOs, and IPPs).
  2. Modularity: the commitment to a modular, cloud-ready architecture (e.g. a microservice approach) that gives customers the highest flexibility to react to even unforeseen developments, now and in the future (in contrast to a monolithic approach with vendor lock-in).
  3. Openness: open interfaces (so-called APIs) and open standards such as CIM allow extending applications without reducing the ability to easily upgrade. And of course, a truly open approach prevents vendor lock-in.
  4. Resilient and cyber secure: our products stay resilient and cyber secure in every aspect, without any compromise. In the current geopolitical situation, security is a major topic in critical energy infrastructure.
  5. Collaborative: applications are designed to be collaborative and user-centric, taking UX and cross-department collaboration to a whole new level (e.g. by working on the same network model). Breaking the silos across departments is possible thanks to the implementation of a digital twin of the grid, which makes seamless data exchange possible.

Learn more on Siemens Grid Software website: https://www.siemens.com/global/en/products/energy/grid-software.html

#energytransition #data #powergrid #digitization #digitaltwin #artificialintelligence #datamesh #gridsoftware #interoperability #openness #modularity #siemens
