
Virtual Duplicates

Digital twins aim to model reality so we can see how it changes.
[Illustration: an engine beside a shadowy duplicate of itself]

Back in 1970, during the seventh crewed mission of the Apollo space program (and the third intended to land on the Moon), the three astronauts aboard Apollo 13 were calmly going about their duties when an explosion in an oxygen tank rocked the spacecraft, venting precious air into space and damaging the main engine. Personnel in Mission Control suddenly had to devise a plan to get the crew home. To do that, they had to understand the condition of the damaged ship and the materials available for repairs, and then test what the astronauts might be able to accomplish.

To figure it out, they turned to the flight simulators used to plan and rehearse the mission. They updated the simulators with current information about the physical state of Apollo 13 and tried various scenarios, eventually coming up with the plan that safely returned the astronauts to Earth. This was, some argue, the first use of a digital twin, a model that simulated the state of a physical system with real-time data and made predictions about its performance under varied conditions.

Digital twins are growing in popularity, especially as the Internet of Things provides data from sensors in all sorts of places. The concept is being applied in a range of areas, from buildings to bridges, from wind turbines to aircraft, from weather systems to the human heart.

A digital twin is more than just a simulation of some arbitrary object or system. “It’s not a generic model of an airplane or a car or wind turbine or a generic person,” says Karen Willcox, director of the Oden Institute for Computational Engineering and Sciences at the University of Texas at Austin (UT Austin). “It’s a personalized model of one particular aircraft or one particular person.”

To qualify as a digital twin, Willcox says, the model needs to take into account current information about the state of the system and evolve over time as it is updated with new data about the system. Another distinguishing feature is that the model and the data help people make decisions about the system, which in turn can change the data and require the model to be updated again.
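That loop can be made concrete with a minimal, purely illustrative Python sketch; the class, its fields, the smoothing constant, and the wear model below are all invented for this example rather than drawn from any real twin:

from dataclasses import dataclass

@dataclass
class EngineTwin:
    """Toy digital twin of one particular engine; all values invented."""
    temp_estimate: float  # estimated operating temperature (degrees C)
    wear_estimate: float  # estimated cumulative wear, 0.0 (new) to 1.0

    def assimilate(self, temp_reading: float) -> None:
        """Fold a new sensor reading into the twin's state estimate."""
        alpha = 0.3  # exponential smoothing stands in for real assimilation
        self.temp_estimate = (1 - alpha) * self.temp_estimate + alpha * temp_reading
        # Hypothetical degradation model: running hot accelerates wear.
        self.wear_estimate += 0.002 * max(0.0, self.temp_estimate - 600.0)

    def needs_maintenance(self) -> bool:
        """Decision support: flag this particular engine before it fails."""
        return self.wear_estimate > 0.8

# One twin per physical asset, updated as each new reading arrives.
twin = EngineTwin(temp_estimate=600.0, wear_estimate=0.75)
for reading in (640.0, 655.0, 700.0, 690.0):
    twin.assimilate(reading)
    print(f"temp ~{twin.temp_estimate:.0f}C, wear ~{twin.wear_estimate:.3f}, "
          f"maintenance needed: {twin.needs_maintenance()}")

Even this toy has the features Willcox names: the state belongs to one specific asset, it evolves as data arrives, and it feeds a decision that loops back to the physical system.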

Perhaps the most obvious use of digital twins is to monitor the long-term health of expensive or complex equipment, such as engines, manufacturing equipment, or industrial heating, ventilation, and air conditioning (HVAC) systems. That sort of use is increasingly being touted as part of Industry 4.0, which incorporates digital technology, machine learning, and big data to improve industrial processes.

IBM, for instance, is combining sensors with its Watson artificial intelligence technology to help large companies make decisions about what maintenance to perform and when, to extend the lifetime of equipment and cut costs. In one example, the company created a digital twin of an engine blade in a 777 aircraft to monitor when it begins to degrade and requires upgrade or replacement. Similarly, GE created digital twins of its wind turbines to predict when the equipment will need maintenance and builds that into a schedule, so wind farm operators can address issues before a turbine breaks, avoiding costly downtime. Researchers at Siemens are applying a similar approach to the human body, developing digital twins of individual human hearts they hope could predict the effectiveness of a specific therapy for a particular patient, instead of merely relying on statistics about hearts in general.
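A pattern common to these predictive-maintenance examples is fitting a trend to a degradation signal and extrapolating to a failure threshold. Here is a Python sketch of that idea with invented readings, units, and threshold; real systems use far richer physics and statistics:

def fit_line(xs, ys):
    """Ordinary least-squares fit, returning (slope, intercept)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Weekly blade-vibration readings from one hypothetical turbine.
weeks = [0, 1, 2, 3, 4, 5]
vibration = [1.00, 1.04, 1.11, 1.13, 1.21, 1.26]
FAILURE_THRESHOLD = 1.60  # level at which the blade needs replacement

slope, intercept = fit_line(weeks, vibration)
weeks_to_threshold = (FAILURE_THRESHOLD - intercept) / slope
print(f"degradation rate: {slope:.3f}/week; "
      f"schedule maintenance before week {weeks_to_threshold:.1f}")

Scheduling then amounts to booking a maintenance crew before the predicted crossing, rather than after a breakdown.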

Digital twins of entire systems are also being created. The Netherlands’ Port of Rotterdam created a digital twin of its shipping system, which monitors activity at the port and can be consulted for the best times for ships to moor and depart, reducing waiting times and helping the port run more efficiently.

One issue with digital twins, Willcox says, is that so far they are essentially artisanal products, made specifically for a particular application, rather than for general usage. “We’re seeing some really exciting examples and successes and high value,” she says, “but most of those deployments are very much custom.”

Willcox and Michael Kapteyn, a postdoctoral researcher at UT Austin, have developed a probabilistic graphical model that draws on Bayesian statistics, dynamical systems, and control theory to provide a common foundation from which many digital twins can be built. A key part of making that work, Kapteyn says, was being able to quantify how much uncertainty a model contains, so its predictions do not go too far astray.
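Their foundation represents the twin’s state probabilistically. The following heavily simplified Python sketch shows only the flavor of a Bayesian update in such a model; the damage states, prior, and sensor likelihoods are invented for illustration:

# The twin keeps a probability distribution over discrete damage states
# and updates it with each observation, rather than a single best guess.
belief = {"healthy": 0.70, "minor damage": 0.25, "severe damage": 0.05}

# Assumed sensor model: P(high strain reading | state).
likelihood_high_strain = {"healthy": 0.10, "minor damage": 0.40,
                          "severe damage": 0.85}

def bayes_update(belief, likelihood):
    """Posterior is proportional to likelihood times prior, renormalized."""
    unnormalized = {s: likelihood[s] * p for s, p in belief.items()}
    total = sum(unnormalized.values())
    return {s: v / total for s, v in unnormalized.items()}

# A high strain reading arrives; the belief shifts toward damage, and
# the spread of the posterior quantifies the remaining uncertainty.
belief = bayes_update(belief, likelihood_high_strain)
for state, p in belief.items():
    print(f"P({state}) = {p:.2f}")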

A potential pitfall for digital twin developers is getting carried away by the easy availability of large volumes of data, Kapteyn says. “You’re going to quickly run into efficiency bottlenecks, and it’s going to be difficult to properly assimilate that data and get the useful information out of that data and into your models.”

Another issue, says Willcox, is that “a lot of computer scientists don’t appreciate that data don’t come for free in the engineering world.” While it might be possible to improve a simulation of an aircraft by adding many more sensors, for example, those sensors add weight and heat and consume power, putting a limit on how many can be feasibly installed.


Twinning Towers

The U.S. Department of Energy (DoE) Oak Ridge National Laboratory (ORNL) has created a digital twin of every building in the U.S., all 122.9 million of them. Researchers gathered publicly available data about the buildings and used sources such as satellite imagery to verify their footprints, LiDAR to measure their heights, and street-level photographs to count the number of windows. The researchers extrapolated those measurements into three-dimensional representations of each building, creating what they call Automated Building Energy Modeling, or AutoBEM. With information about the number of windows, the number of stories, the building materials, and the roof types, they were able to predict the energy-use characteristics of each building.
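AutoBEM’s actual models are far richer, but a toy Python sketch conveys the idea of turning remotely sensed attributes into an energy estimate; every field and coefficient below is invented and is not AutoBEM’s method:

from dataclasses import dataclass

@dataclass
class Building:
    """Toy version of a remotely sensed building record."""
    footprint_m2: float  # from satellite imagery
    stories: int         # inferred from LiDAR height
    window_count: int    # from street-level photographs

def annual_energy_kwh(b: Building) -> float:
    """Invented rule of thumb: use scales with floor area, plus a
    per-window penalty for envelope losses. Purely illustrative."""
    floor_area = b.footprint_m2 * b.stories
    return 150.0 * floor_area + 300.0 * b.window_count

office = Building(footprint_m2=800.0, stories=6, window_count=120)
print(f"estimated use: {annual_energy_kwh(office):,.0f} kWh/year")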

Buildings consume approximately 73% of the electricity produced in the U.S., says Joshua New, a computer scientist at ORNL who leads the AutoBEM project. New notes the DoE has set a goal of reducing energy use in commercial buildings by 30% from 2010 levels by 2030, in an effort to fight climate change. “By creating a digital twin of all the buildings in the U.S., we can look at the most cost-effective ways to reduce energy demand, emissions, and costs,” New says.


Armed with information about how a building uses energy, researchers can simulate different interventions—such as increasing insulation, reducing the amount of cooled air that leaks, installing energy-efficient LED lighting, or adding smart thermostats—and then see how those changes to the digital twin affect energy consumption. ORNL researchers ran a project for the Electric Power Board of Chattanooga, TN, modeling the city’s 178,377 buildings and creating an interactive map that lets users see the energy and cost savings of eight different improvements for each individual building.

Running such simulations for every building in the U.S. would be costly, however. “We don’t want to simulate every building. That’s just simulation overkill,” New says. Instead, the team relies on what it calls a “dynamic archetype.” They take a region of interest, such as an electrical utility’s service area, and select a representative building that displays median energy use for structures of that type. They then run their simulations on that archetype and scale the results by the total floor space of all such buildings in the region. “So if you had 1,000 supermarkets that had 1,000 square feet each, we could pluck out just one of those supermarkets and multiply it by the 1,000 supermarkets it represents. And now you could just run one simulation instead of 1,000,” New says.
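In code, the archetype shortcut amounts to one simulation scaled by floor space. This Python sketch uses invented numbers and a stub where ORNL’s pipeline would run a real simulation engine such as the DoE’s EnergyPlus:

def simulate_eui(building):
    """Stub for a real simulation run; returns energy-use intensity
    in kWh per square foot per year."""
    return building["eui_kwh_per_ft2"]

supermarkets = [{"floor_ft2": 1000.0, "eui_kwh_per_ft2": e}
                for e in (48.0, 51.0, 53.0, 55.0, 60.0)]

# Choose the building with median energy-use intensity as the archetype.
by_intensity = sorted(supermarkets, key=lambda b: b["eui_kwh_per_ft2"])
archetype = by_intensity[len(by_intensity) // 2]

# One simulation, scaled by the floor space of all buildings it represents.
total_floor_ft2 = sum(b["floor_ft2"] for b in supermarkets)
regional_kwh = simulate_eui(archetype) * total_floor_ft2
print(f"one run stands in for {len(supermarkets)} buildings: "
      f"{regional_kwh:,.0f} kWh/year")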

The initial modeling of all 122.9 million U.S. buildings was computationally intensive. It took 45 million core-hours on Argonne National Laboratory’s Theta supercomputer, requiring special permission to go beyond the normal project allotment of 30 million core-hours. The team proposes to update the publicly available models once a year, to incorporate any modifications made to the buildings, but has not yet been granted permission to do so. By contrast, simulating a year of energy use in an average building with the DoE’s freely available EnergyPlus software takes roughly a minute, so testing 10 different interventions would take about 10 minutes per building.

Because ORNL relied only on publicly available data, its model of each building is fairly limited. The models can still meet the definition of a digital twin through feedback: if a property owner contributes the data it collects about how much energy is used, where, and at what time, the model can be refined. That allows it to produce estimates of potential energy savings that meet the standards banks use to grant loans to energy service companies to pay for energy-efficiency improvements. The lab also has non-disclosure agreements with various utility companies that allow it to incorporate the companies’ proprietary data about energy usage as a source of feedback for utility-specific versions of the simulations.
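That refinement step can be as simple as calibrating the public model against metered data. A one-parameter Python sketch with invented monthly numbers:

modeled_kwh = [900.0, 850.0, 1100.0, 1300.0]  # model's monthly estimates
metered_kwh = [980.0, 930.0, 1210.0, 1400.0]  # owner's utility-bill data

# Least-squares fit of metered ~ scale * modeled: scale = sum(m*d)/sum(m*m).
scale = (sum(m * d for m, d in zip(modeled_kwh, metered_kwh))
         / sum(m * m for m in modeled_kwh))
calibrated_kwh = [scale * m for m in modeled_kwh]
print(f"calibration factor: {scale:.3f}")
print("calibrated estimates:", [round(c) for c in calibrated_kwh])

Real calibration adjusts physically meaningful parameters such as insulation values, schedules, and equipment efficiencies, but the feedback principle is the same.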


Going Big

Even more ambitious than modeling all the buildings in the U.S. is a European Union project called Destination Earth, which plans to create digital twins of various Earth systems: not just weather and climate, but also the biochemistry and geochemistry of the oceans, ocean circulation patterns, food and water availability, and other factors. The program, which is projected to take seven to 10 years to complete, will take advantage of the EU’s €8-billion (US$9.45 billion) investment in high-performance computing, and will collect data from sources ranging from Internet of Things sensors to seismic monitors to satellites.

The idea is to take advantage of existing climate and weather models, which are evolving to include more factors, says Peter Bauer, deputy director for research at the European Centre for Medium-Range Weather Forecasts (ECMWF), which is part of the project. More data with greater spatial resolution could go beyond, for instance, predicting a dry period in California to actually projecting the risk of wildfires in a given area. “So it’s not just an Earth system where it rains and temperature varies, but it’s also an Earth system that tells us about fires and droughts and heat and the potential political implications of these effects,” he says.


Exactly what can be twinned and what is too complex for this type of simulation is still an open question. Willcox thinks digital twins will be most useful in areas where people have a well-understood, focused model and a clear need for better decision making. “That’s in contrast to the situation where people think a digital twin is going to be this magic simulation that will answer any question,” she says.

New sees value in using digital twins to tackle the big problems facing humanity, such as climate change, in some future system resembling the city-building videogame SimCity. “If we could have SimEarth, there’s a lot of things we could simulate and test out and validate at different scales in the real world to see how to make the planet better for everybody,” he says. “There’s a lot we have to learn from these types of digital twins.”

Further Reading

Kapteyn, M.G., Pretorius, J.V.R., and Willcox, K.
A probabilistic graphical model foundation for enabling predictive digital twins at scale, Nature Computational Science, 2021. https://www.nature.com/articles/s43588-021-00069-0

Bass, B., New, J., and Copeland, W.
Potential Energy, Demand, Emissions, and Cost Savings Distributions for Buildings in a Utility’s Service Area, Energies, 2021. https://www.mdpi.com/1996-1073/14/1/132

Bauer, P., Stevens, B., and Hazeleger, W.
A digital twin of Earth for the green transition, Nature Climate Change, 2021. https://www.nature.com/articles/s41558-021-00986-y

Rasheed, A., San, O., and Kvamsdal, T.
Digital Twin: Values, Challenges and Enablers, arXiv, 2019. https://arxiv.org/abs/1910.01719

Introduction to Digital Twins, Simple but Detailed https://www.youtube.com/watch?v=RaOejcczPas
