Smart Assets: The Role of Digital Twins in the Utilities Industry

Digital twins offer countless opportunities for revenue generation and process optimization within companies.

When the Apollo 13 crew informed Houston they had a problem, ground controllers had to think quickly. How could they identify and fix an issue on a spacecraft 330,000 kilometers away from Earth? Short answer: simulators. While these complex machines were nothing like modern simulators, they can be considered one of the first examples of a physical twin – and the first conceptual example of a digital twin.

A digital twin makes it possible to do more than operate a physical asset. It allows users to understand the real-time condition of a physical asset as well. It’s no wonder, then, that the U.S. government wants to create digital twins for its future NASA and Air Force vehicles. With a digital twin, ground controllers can get pilots and astronauts out of tight situations using models developed with real-time data. As a result, the gap between testing conditions and operating conditions all but disappears.

Today, interest in digital twins extends beyond the aerospace industry. Closer to home, utilities and other asset-intensive enterprises, including electricity, oil & gas, and telecom companies, struggle to gain visibility over their assets. Such businesses may not have to deal with extraterrestrial assets, but they do have to manage significant breadth and complexity. The intricate components and processes of a portion of the power grid can be just as difficult to access and analyze as a spacecraft in outer space.

In other words, digital twins have a lot to offer right here on Earth.

A few factors have made this possible. As technology advances at a rapid pace, the price of key tools continues to fall. Advancements in cloud computing, big data analytics, the Internet of Things, cyber-physical systems, and edge computing have made digital twins not only a feasible investment for asset-intensive companies, but an obvious one.

Digital twins offer countless opportunities for revenue generation and process optimization within companies. They allow companies to remain relevant during the fourth industrial revolution, also known as Industry 4.0. They help reduce both planned and unplanned downtime, which costs companies thousands of dollars per minute. They empower companies to obtain visibility over complex assets like a power plant, or to introduce efficiency to production lines through data integration and unification. At their best, digital twins unlock the capabilities of cyber-physical systems, allowing the physical and digital assets to control each other through actuators and networking, based on data collected by the physical asset and analyzed by its digital counterpart.

Amid all this excitement, an important piece of the puzzle is often forgotten when discussing the creation of digital twins: the visual representation. Step one is creating high-fidelity virtual models that accurately represent a physical asset’s geometry and physical properties. But these 3D representations, whether CAD models or photorealistic 3D renderings, also need to incorporate real-time data from sensors and present themselves to users without latency issues.

In other words, if data travels from the physical world to the model but the model can’t provide a visual representation in real time, the digital twin can’t deliver business value, because too few stakeholders can interpret it. An efficient, cloud-based interactive 3D streaming solution is necessary to preserve the integrity of a digital twin and keep it “alive” and up to date.

In this whitepaper, we will outline the issue of data silos in the utilities industry, chart the evolution and applications of digital twin technology, explain the pain points of developing and using digital twins, and show how PureWeb solves one of the most pressing issues: transforming real-time, real-world data into meaningful visual models.

The Utility Industry’s Billion-Dollar Problem

Pacific Gas & Electric (PG&E), California’s largest utility company, is a case study in how not to manage your assets. Over the past several years, the company has suffered enormous reputational and legal damage. Its equipment has been blamed for devastating wildfires that claimed an average of 12 lives per year between 2010 and 2018. Over the years, PG&E has imposed occasional power shutoffs to reduce fire risks.

The results have been lawsuits, a Chapter 11 bankruptcy filing, and massive restructurings that left the company with $38 billion in debt and the contempt of Californians. Reports quickly emerged that PG&E’s infrastructure and assets were in awful shape. In 2018, PG&E’s Tower 27/222 collapsed, sparking a fire that killed 85 people and wiped out the town of Paradise. Later, reports surfaced that the tower had remained standing 25 years past the end of its useful life.

This is a worst-case scenario of what happens when utilities don’t have visibility over their infrastructure. And while PG&E is an extreme example sprinkled with allegations of corporate greed, its story speaks to a growing problem among asset-intensive companies.

Obtaining visibility is hard. And once utilities have visibility, analyzing, interpreting, prioritizing, and acting on the information is a challenge.

Managing the sprawling empire of asset-rich companies

New pressures have complicated traditional asset lifecycle management. In the past, the focus was on keeping equipment operational through regular maintenance and repair. Today, operating a utility is far more complicated.

Aging infrastructure is one issue; the American Society of Civil Engineers gave U.S. infrastructure a D+. Water and wastewater systems, broadband internet connections, and a patchwork electricity grid are just a few areas in dire need of upgrades. Moreover, there are heightened consumer expectations around technology and customer service. Coupled with social media, these expectations make it difficult for utility companies to limit the reputational damage of even a minor outage.

Magnifying all of these pressures is the reality on the ground. Asset-intensive companies are managing fragmented empires. While assets may legally belong to a utility or energy company, they aren’t necessarily visible to that company. The result is a business environment where leaders move around in the dark, bump up against a system here or some data there, and use this limited information to make critical business decisions.

Unsurprisingly, important details fall through the cracks.

Enterprises have an abundance of information technology (IT) and operational technology (OT). Information technology encompasses the equipment needed for networking, computing, and storing data within the enterprise. Operational technology encompasses the physical assets that generate value for the company by managing and controlling processes, such as specific applications, sensors, or actuators. These systems don’t always work together.

Different departments have their own applications and platforms, designed specifically for their operations’ needs. Insights about those operations sit in employees’ heads. In some cases, critical data sits in spreadsheets. Managers can’t access this information on demand for analysis, preventing data-driven decision making across the enterprise.

The prevailing wisdom is that digitalization in and of itself will solve the data silo problem. But in some cases, it compounds it. When large amounts of data are collected using new technologies, but not integrated into a larger system, all that remains is a new set of silos. They may be sophisticated, richer silos, but they’re still silos.

How does new technology contribute to the data silo problem?

Utilities, energy companies, and transportation companies have wholeheartedly embraced geographic information systems (GIS). These companies have widespread infrastructure, and GIS allows them to quickly scan and capture geographic data. GIS is so important to these businesses that they have been deemed a major growth enabler for the GIS market, which analysts predict will exceed $9 billion in value by 2024.

GIS gives utilities visibility over their entire infrastructure. This is helpful when it comes to gathering infrastructure requirements or understanding the location of specific infrastructure to prioritize maintenance tasks.

Meanwhile, Internet of Things technology gives companies real-time data about the assets in their infrastructure using sensors and powerful networking capabilities. For instance, a sensor can determine whether a utility pole has suffered damage. With the right analytics tools, sensors can even support predictive maintenance capabilities, so teams can identify and address problems before they become outages.
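
As a toy illustration of that predictive-maintenance idea, here is a minimal Python sketch that flags poles for inspection from sensor readings. The field names and thresholds are hypothetical assumptions for illustration, not a real utility schema.

```python
# Minimal sketch of a predictive-maintenance check on utility poles.
# The sensor fields and thresholds are hypothetical, not a real schema.
from dataclasses import dataclass

@dataclass
class PoleReading:
    pole_id: str
    tilt_degrees: float   # from an inclinometer mounted on the pole
    vibration_rms: float  # root-mean-square vibration, arbitrary units

# Illustrative thresholds a maintenance team might tune over time.
MAX_TILT = 5.0
MAX_VIBRATION = 0.8

def needs_inspection(reading: PoleReading) -> bool:
    """Flag a pole whose readings suggest damage or stress."""
    return reading.tilt_degrees > MAX_TILT or reading.vibration_rms > MAX_VIBRATION

readings = [
    PoleReading("P-1041", tilt_degrees=1.2, vibration_rms=0.1),
    PoleReading("P-2977", tilt_degrees=7.4, vibration_rms=0.2),  # leaning pole
]
for r in readings:
    if needs_inspection(r):
        print(f"Flag {r.pole_id} for inspection")
```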

In a siloed environment, these technologies and their associated data exist separately from each other. This eliminates the possibility of analyzing these data sets together.

And this is just when we consider newer technologies. In a siloed environment, GIS and IoT data exist in isolation from traditional data as well, including CAD and building information modeling (BIM) data.

What are the possibilities with integrated utilities data?

Ideally, these data sources would work together. If CAD data and GIS data were interoperable, for instance, issues could be identified early in the asset management lifecycle. Suppose a utility pole were being designed: if its CAD model were placed on a spot with unstable terrain, interoperability between the CAD and GIS data would flag the problem the moment the model was placed there.
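
To make the scenario concrete, here is a hedged sketch of such a design-time check, assuming a toy lookup table in place of a real GIS layer; the terrain classes and file name are made up for the example.

```python
# Illustrative design-time check: when a CAD model is placed at a
# coordinate, look up the GIS terrain class and flag unstable ground.
# The terrain table and class names are made up for this example; a
# real system would query an actual GIS layer.
UNSTABLE_TERRAIN = {"landslide_zone", "wetland", "loose_fill"}

# Toy stand-in for a GIS layer: coordinates mapped to terrain classes.
gis_terrain = {
    (49.28, -123.12): "bedrock",
    (49.30, -123.10): "landslide_zone",
}

def place_pole(cad_model_id: str, coord: tuple) -> None:
    terrain = gis_terrain.get(coord, "unknown")
    if terrain in UNSTABLE_TERRAIN:
        print(f"WARNING: {cad_model_id} placed on {terrain}; choose another site")
    else:
        print(f"{cad_model_id} placed at {coord} on terrain: {terrain}")

place_pole("pole-rev3.dwg", (49.30, -123.10))  # flagged the moment it's placed
```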

Later, when the utility pole is placed in an appropriate spot and built, sensors can help the company understand whether the asset is under stress and needs maintenance. If the IoT and GIS data incorporate interactive 3D imagery, they can provide a user-friendly, real-time system for monitoring and analyzing asset-rich infrastructure from anywhere in the world.

When a high-fidelity 3D visualization receives real-time data from sensors, it becomes a digital twin. A digital twin allows utilities to provide a user-friendly interface for asset management and control, and to limit the number of dangerous on-site inspections without compromising safety. In other words, instead of dealing with plain text data, a staff member could conduct a virtual walkthrough of a location from a laptop or tablet, see from the real-time visualization that a pole has sustained damage, and take steps to address the issue.

This is possible because the photorealistic digital twin reflects the physical asset’s composition, condition, and behavior. If vibration or weather sensors collect information indicating the utility pole is fine, the visual representation reflects this. If those sensors pick up a problem, that data feeds into the visualization and the digital twin reflects it, perhaps as cracks on the asset or flags indicating a specific issue.
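
The mapping from sensor readings to visual state can be as simple as a rules table. Below is an illustrative Python sketch; the thresholds and overlay names are assumptions, and a production twin would drive the rendering engine rather than return strings.

```python
# Sketch of how raw readings might translate into visual state on the
# twin. Thresholds and overlay names are illustrative assumptions.
def annotate_asset(vibration_rms: float, wind_speed_kmh: float) -> list:
    """Map sensor readings to visual annotations for the 3D view."""
    flags = []
    if vibration_rms > 1.5:
        flags.append("render crack overlay on pole mesh")
    elif vibration_rms > 0.8:
        flags.append("attach amber warning flag to pole")
    if wind_speed_kmh > 90:
        flags.append("highlight pole: extreme weather in progress")
    return flags or ["render asset in healthy state"]

print(annotate_asset(vibration_rms=1.7, wind_speed_kmh=40))
# ['render crack overlay on pole mesh']
```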

However, a specific problem stands in the way of achieving this: a lack of standards makes communication between these different types of data next to impossible.

Part 2: Why Utility Technologies Refuse To Speak To Each Other

In an ideal world, data from different technologies and platforms, like operational applications, CAD models, GIS data, IoT data, and more, could work together automatically. Employees could analyze this data in the same environment and find relationships between them. To take it even further, companies would be able to incorporate this data into photorealistic 3D visualizations that make it easy for any team member to understand.

Instead, the status quo of infrastructure management today is that users must access data from multiple, disconnected applications, such as GIS, IoT, and CAD tools. It’s next to impossible for these data sources to “speak” to each other.

Expecting IoT data to integrate itself seamlessly with CAD data would be like adding German-language notes to an English-language report and then expecting an English speaker to understand and act on the additional information.

To make such an arrangement work, either the German analyst would add his or her notes in English, or the person reading the report would need to speak both English and German. Whichever arrangement the company chose would become the standard. This standard for handling and exchanging information would be expected throughout the organization, and throughout partner organizations as well, allowing for the easy collection, interpretation, and use of information.

If utilities want to bring CAD, IoT, BIM, and other data together to create authentic digital twins, they need the right standards for interoperability.

What stands in the way of interoperability?

First, there are several proprietary file formats, closed ecosystems, standards, and protocols that complicate information sharing, even between stakeholders.

Then there’s the challenge of transforming all this data into real-time 3D visualizations. Companies investing in digital twins need data visualizations: people understand photorealistic 3D environments better than plain data because they can put information in context. A 3D visualization serves as a “mediator” of different data sources, so BIM, CAD, and IoT data need to be combined into one platform. This often exceeds the capacity of the computers most companies have on hand. And even if a utility could pull this off on one dedicated “super machine”, doing so would limit the reach of the digital twin, defeating the purpose. A digital twin should be accessible on demand, through the cloud, on any device.

TWI Makes Offshore Wind Farms Easier To Manage and Maintain

Consider the case of offshore wind farm management. Offshore wind farms are increasing in popularity because they generate more energy than onshore wind farms. Nevertheless, they are challenging to manage remotely. If offshore wind farms are to become a reliable source of energy, this problem needs to be overcome.

One way to address the problem is to reduce the occurrence of part failures by monitoring key components for damage. Research shows that a high percentage of wind turbine failures are due to problems with the blades. Researchers found that they could monitor blades using acoustic emissions to spot cracks and track their growth. TWI, an independent technology research organization, is applying that research to its own digital twin solution, which will provide overall health and performance analysis of an offshore wind turbine farm. This reduces the frequency of maintenance calls without compromising the integrity of the turbines.
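
As a rough illustration of the acoustic-emission approach, and not TWI’s actual method, the following sketch counts emission events above an energy threshold in fixed windows and raises an alert when the event rate rises steadily, which can indicate crack growth. The threshold, window size, and trend rule are all assumptions.

```python
# Rough illustration of the acoustic-emission idea: count emission
# events above an energy threshold in fixed windows, and alert when
# the event rate keeps rising (a possible sign of crack growth).
EVENT_THRESHOLD = 0.7  # normalized acoustic energy that counts as an event
WINDOW = 5             # readings per window

def blade_alert(energies: list) -> bool:
    """Return True if per-window event counts keep increasing."""
    counts = [
        sum(1 for e in energies[i:i + WINDOW] if e > EVENT_THRESHOLD)
        for i in range(0, len(energies) - WINDOW + 1, WINDOW)
    ]
    return len(counts) >= 3 and counts[-3] < counts[-2] < counts[-1]

stream = [0.2, 0.8, 0.3, 0.1, 0.2,   # 1 event
          0.9, 0.8, 0.2, 0.3, 0.1,   # 2 events
          0.9, 0.8, 0.75, 0.2, 0.9]  # 4 events -> rising trend
print(blade_alert(stream))  # True
```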

Technology interoperability is a critical component in making this project work. If this set-up has an interactive 3D visual representation, off-site wind farm management could be handed to anyone trained to interpret a 3D environment; additional experience interpreting plain text data would not be necessary. And if this real-time visual representation could be streamed to any device, the distance of offshore wind farms would pose less of a problem.

The challenge is finding a platform that can integrate all of these data streams and distribute a high-fidelity digital twin.

Part 3: Laying The Foundation For a Digital Twin

Interoperability of different technologies is an important ingredient in digital twin creation, and we’ll address how companies can solve that in Part 4. But some enterprises have yet to assemble these separate components and need a little guidance.

Creating a digital twin requires a cross-disciplinary team. Nevertheless, it’s important for business leaders to have a high-level understanding of the work required. Danny Castonguay, a former McKinsey associate partner, outlines the basics of assembling a digital twin.

Create a user-friendly interface

This is the mobile or web app users will access to view the digital twin. Companies using an interactive 3D streaming solution, like PureWeb, will have a secure URL instead. For a digital asset to be an authentic digital twin, it must have a visual representation that reflects the geometric and physical characteristics of the asset as well as the incoming data. If you’re creating a digital twin of a large system, you’ll want an interactive 3D rendering that lets users explore the digital environment easily.

Identify the data being fed into your digital twin

Your digital asset needs real-time data to be considered a digital twin. Your team must identify all the possible integrations between the physical asset and the digital asset. This could mean breaking existing data sources down into more granular integrations, supplementing them with other internal sources, or introducing useful external sources.
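
One practical way to capture this step is as a structured inventory of feeds. The sketch below describes hypothetical sources and cadences for a utility pole twin; every name is a placeholder.

```python
# Hypothetical inventory of data feeds for a utility pole's digital
# twin. Every source name, signal, and cadence here is a placeholder.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DataFeed:
    source: str                       # system the data comes from
    signal: str                       # what the feed measures or describes
    update_interval_s: Optional[int]  # None = static reference data

feeds = [
    DataFeed("GIS", "pole coordinates and terrain class", None),
    DataFeed("CAD/BIM", "pole geometry and materials", None),
    DataFeed("IoT inclinometer", "tilt in degrees", 60),
    DataFeed("IoT accelerometer", "vibration RMS", 10),
    DataFeed("weather API", "wind speed and precipitation", 300),  # external
]

for f in feeds:
    cadence = "static" if f.update_interval_s is None else f"every {f.update_interval_s}s"
    print(f"{f.source}: {f.signal} ({cadence})")
```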

Build the simulator component of your digital twin

This component lets you experiment, using the digital twin as a stunt double for the physical asset. Data professionals do the heavy lifting: building a baseline model, building a data science model, and evaluating their overall performance. As Castonguay explains it, the simulator takes the current state of your world as input and delivers a possible future state of the world as output.
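
In code, the simulator boils down to a function from the current state to a projected future state. Here is a deliberately simple Python sketch using a linear degradation model as a stand-in for the baseline and data science models described above; the asset, numbers, and decay rate are illustrative.

```python
# Toy simulator: current state in, possible future state out. The
# linear degradation model is a deliberate simplification standing in
# for real baseline and data-science models.
def simulate(health: float, load: float, hours: int,
             decay_per_hour: float = 0.001) -> float:
    """Project an asset health score (1.0 = new, 0.0 = failed) forward."""
    for _ in range(hours):
        health -= decay_per_hour * load
    return max(health, 0.0)

# "What if we run this transformer at 120% load for a month?"
future = simulate(health=0.92, load=1.2, hours=24 * 30)
print(f"Projected health after 30 days: {future:.2f}")  # ~0.06
```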

Introduce autonomous algorithms and model disruptions

Autonomous agents take action based on decisions, and to do so they need prescriptive algorithms. Teams decide which algorithms to apply based on the problems that need to be solved. In addition, your team will need to model potential disruptions (e.g. meteorological, financial, social) to derive the most business value from your digital twin.
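
Building on the toy simulator above, a prescriptive rule might look like the following sketch, where a hypothetical storm forecast stands in for a modeled meteorological disruption; the thresholds and actions are assumptions.

```python
# Sketch of a prescriptive rule an agent might apply, with a storm
# forecast standing in for a modeled meteorological disruption. The
# thresholds and recommended actions are illustrative assumptions.
def prescribe(projected_health: float, storm_forecast: bool) -> str:
    """Turn a simulated future state into a recommended action."""
    if projected_health < 0.1:
        return "schedule immediate replacement"
    if projected_health < 0.3 and storm_forecast:
        return "pre-position a repair crew before the storm"
    if projected_health < 0.3:
        return "schedule maintenance within two weeks"
    return "no action needed"

# Feed the simulator's output (e.g. 0.06 from the sketch above) in:
print(prescribe(projected_health=0.06, storm_forecast=True))
# schedule immediate replacement
```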

Make the digital twin cognitively easy and physically accessible to users

Once the digital twin is set up and receiving data in real time, the final problem is making it available to users on demand. A digital twin is extremely data-intensive, not just because of the incoming information and the data analysis, but also because of the high-resolution graphics required to navigate the digital asset.

Why is this a problem worth addressing?

First, the business value of a digital twin comes from its ability to communicate information in real time. In a digital twin with advanced capabilities, this may happen automatically through sensors and actuators. But for many organizations, particularly in the early days of digital twin adoption, the action (e.g. shutting off a valve, scheduling a maintenance appointment) will be initiated by human users who interpret the real-time information provided.

If the information is presented in a confusing or cluttered manner, interpreting it imposes a cognitive burden. Imagine a digital twin of a boiler room. Key elements of the boiler room would be virtually marked with points representing important pieces of information, such as temperature or fluid-flow data. The more pieces of data the asset presents, the harder it is for a user to scan the asset and understand which information requires their attention. With an interactive 3D walkthrough of the asset, on the other hand, users can quickly identify the data points worth investigating.

Second, the new world of work is remote, and the 2020 coronavirus pandemic proved that even the most “on-site” jobs need some “off-site” connectivity to ensure business continuity. Since digital twins are crucial to the management and maintenance of critical physical assets, they need to be securely accessible, and accessible on any user device, such as a tablet or laptop, without compromising graphic and data fidelity. A product that delivers this exists.

Part 4: Interoperability & Interactive 3D Visualizations with PureWeb

Streaming an interactive 3D rendering of a digital twin is no easy feat. To deliver the digital twin to users, enterprises need a solution that offers the following capabilities.

  • Interoperability with multiple data feeds and integrations including CAD, GIS, IoT, etc.
  • Efficient management of cloud GPU resources
  • Enterprise-level security to protect proprietary data
  • Managed service that allows enterprises to focus on their core competencies
  • Ability to distribute interactive 3D renderings to remote, off-site locations

PureWeb delivers these capabilities to customers through a rigorously tried and tested interactive 3D distribution platform.

Compatibility with multiple data feeds and integrations

PureWeb’s digital twin solution is a unique, proprietary technology that connects multiple applications into a seamless, unified user experience. Our product integrates software with different data formats using the Shared Data Model and presents them as a cohesive digital twin in a single interactive 3D view, creating efficient user workflows and allowing team members to collaborate and draw better conclusions in less time.

The 4 Vs of successful data science are volume, variety, velocity, and veracity. Enabling an unlimited number of integrations allows clients to check all four boxes.

Efficient management of cloud GPU resources

Few companies possess the powerful GPUs required to carry out a large 3D rendering project, and it’s rarely advisable to purchase all of that hardware. Instead, companies can use GPUs in the cloud to meet their digital twin visualization needs. Still, just because cloud computing is more cost-effective than purchasing hardware doesn’t mean cloud costs don’t add up. PureWeb manages this cost for companies by serving multiple users from the same GPU.

Game engines like Unreal and Unity have dramatically simplified the creation of photorealistic 3D assets. Using efficiently managed cloud GPU resources ensures those assets don’t get caught in a publishing bottleneck.

Enterprise-level security to protect proprietary data

Thanks to PureWeb’s roots in healthcare technology, our distribution platform was designed with security best practices in mind from the beginning. Rendering happens at the server level, preventing unauthorized access to source data.

Managed service that allows enterprises to focus on their core competencies

PureWeb offers a managed service, so clients can focus on their core business. The team configures your chosen cloud solution, handles the coordination of streaming session connections, schedules user sessions to available servers, manages security and authentication, and more.

Ability to distribute interactive 3D renderings to remote, off-site locations

The PureWeb publishing platform allows distribution to remote, off-site locations around the world, including oil rigs, offshore wind farms, and military bases.

Conclusion

Utilities, energy companies, and other asset-intensive organizations face unique pressures: aging infrastructure, environmental concerns, tightening regulations, and a well-connected, information-rich consumer base. These pressures leave little room for error or guesswork. Consequently, leaders need real-time visibility over their assets and associated processes.

A lack of interoperability has long been a major limitation. Not anymore. Today, asset-rich companies can bring all their departments and processes into the fold. They can free critical business data from employees’ minds and scattered spreadsheets and use it to make real-time, data-driven decisions.

This is the promise of digital twins: a photorealistic 3D replica of a physical asset that constantly evolves and changes in response to real-time IoT data, GIS data, and more. PureWeb is helping companies make this promise a reality.