The Digital Twin after Apollo 13

 


Market highlights

The use of digital twins in smart manufacturing is becoming commonplace thanks to developments in cloud and telecommunications technologies, the maturity of MESs, and the investments made in IIoT/IoT, which drive the building of increasingly sophisticated and interoperable platforms. The COVID-19 pandemic has also factored into digital twin investments: according to a Gartner survey, by 2023 at least one third of mid- to large-sized companies that have implemented IoT will run at least one digital twin with a COVID-19-related use case (IoT News). Predictive maintenance and asset performance monitoring are among the leading use cases for DT technology today.

Below is my analysis and comparison of predictions for the global digital twin market size over the period 2021-2026 by several market research companies: Grand View Research, Research & Markets, Mordor Intelligence and Markets & Markets. The DT market forecasts envision a steep CAGR of up to 58%.

[Chart: DT market size forecasts, 2021-2026 (USD bn)]

CAGR estimates span a 23-percentage-point range.
In 2026, there is a 14% ($6bn) difference between the smallest and the largest size estimates.
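For orientation on what such a CAGR implies, here is a minimal sketch of the compounding arithmetic. The 2021 base value is hypothetical, not taken from the forecasts; the two growth rates merely bracket the range implied above (up to 58%, spanning 23 points):

```python
# Compound annual growth: end = start * (1 + cagr) ** years
# The base figure below is hypothetical, for illustration only.

def project(start_usd_bn: float, cagr: float, years: int) -> float:
    """Project a market size forward at a constant compound growth rate."""
    return start_usd_bn * (1 + cagr) ** years

base_2021 = 3.0  # assumed 2021 market size in USD bn (not from the forecasts)
for cagr in (0.35, 0.58):  # bracketing the forecasts' CAGR range
    print(f"CAGR {cagr:.0%}: implied 2026 size ~ ${project(base_2021, cagr, 5):.1f}bn")
```

At these steep rates the terminal-year estimate is extremely sensitive to the assumed CAGR, which helps explain why the forecasts diverge.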


The successful failure of Apollo 13

The idea of twinning came about in the 1960s, when NASA mirrored the modules of the Apollo 13 mission (1970) on the ground in order to assess risk, initiate the correct action and prevent a catastrophic situation. During the Apollo 13 flight, one of the two oxygen tanks in the Service Module exploded while an astronaut performed a routine stir, leaving that tank instantly empty and the other one leaking quickly into space. The astronauts, who only minutes earlier had cheerfully connected with TV viewers on Earth, now found themselves fighting for their lives. Many unknowns awaited them: a risky move to the Lunar Module (LM), powering down the Command Module (CM) and preventing it from freezing, managing limited water, heat and food, correctly assembling instructions from the ground control team, fixing dangerous carbon dioxide levels, propelling themselves back to Earth using the LM instead of the CM, and, at the end, powering up the depleted CM to splash down in the ocean…

Before the Apollo 13 flight, NASA had prepared simulators of the conditions on board the spacecraft, covering the Command and Lunar Modules as well as the Service Module, which hosted the two oxygen tanks, fuel and the rocket engine for the return to Earth. Not only the ground control team but also the astronauts had trained before the space flight on physical twins of the modules, driven by networks of digital computers. When the oxygen tank exploded, NASA needed to quickly convert the physical model of the spacecraft into a digital one in order to recreate the unexpected event. Mission Control ran numerous simulations on Earth, discarding the failures, to arrive at the correct life-saving steps and communicate the exact calculations to space. What is important to note here is that the simulators on Earth received real-time data via radio telemetry from the physical spacecraft in space and could be adjusted to its current damaged state, in order to gauge the right strategic steps for survival.

Luckily, simulations performed during the Apollo 10 mission (namely, experimenting with the LM charging the CM) also proved vital, providing information about the viability of actions between the modules of Apollo 13. Among the difficult tasks the Apollo 13 simulators performed was assessing whether the small Lunar Module, designed specifically for Moon landing and take-off, could carry the astronauts back to their home planet while also pushing the attached Command and Service Modules. The ground team practiced on the simulators until they were certain this was feasible. Probably the most difficult task tested on the simulators was how, when and at what angle to burn the LM engine in order to propel the three-module stack from the Moon back to Earth. The ground team had to write completely new procedures and test them on the ground before passing them on to the astronauts.

The three-man astronaut crew turned around a situation of near-certain death by addressing the many likely system failures identified by them and by the ground control team, which ran twin simulations day and night, achieving the close-to-impossible feat of saving their lives. Anything the simulators rejected was never passed on to the team in space. This was probably the first real use of a digital twin, designed by the brightest minds of space travel with access to the most advanced technologies of the time.

Two Apollo 13 astronauts at the foot of a Lunar Module Simulator. Photo credit: NASA

Around the same time that NASA was pushing beyond the limits of a new technology, Ivan Sutherland developed the first CAD software as part of his MIT PhD thesis. However, it was not until 2002-2003 that the term digital twin was coined by John Vickers of NASA, and the model of the digital twin was presented by University of Michigan professor Michael Grieves as the underlying model in product lifecycle management (PLM). It is also around 2002 that companies like Siemens and GE started to eye the digital twin idea and its relevance to IoT devices.


Fast forwarding to the present day

Why is the mention of NASA's Apollo program important? The investments made in its space program opened the doors to innovation in today's microelectronics, software and other industries. NASA's success in handling Apollo 13's complexity also paved the way for the digital twin in the enterprise.

The Apollo 13 flight happened more than 50 years ago, so how have digital twins changed since then? Apollo 13 used telemetry to communicate data back and forth with the spacecraft, encoding readings from onboard sensors digitally and sending the data stream to Mission Control entirely via radio technology. Today, we have the Internet of Things enabling big data management across thousands of devices simultaneously, and wide-range cellular technologies supporting connected devices that communicate individual and aggregated information in forms useful for further analysis.
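As a toy illustration of the device side of this modern data flow, the sketch below samples a simulated sensor and serializes timestamped readings into JSON frames, the kind of payload an IoT device might stream to a platform. The sensor, device ID and field names are hypothetical, not drawn from any particular platform:

```python
import json
import random
import time

def read_temperature_c() -> float:
    """Stand-in for a real sensor driver (hypothetical)."""
    return 20.0 + random.gauss(0, 0.5)

def telemetry_frame(device_id: str) -> str:
    """Encode one timestamped reading as a JSON frame ready for transmission."""
    return json.dumps({
        "device": device_id,          # hypothetical device ID
        "ts": time.time(),            # Unix timestamp in seconds
        "temperature_c": round(read_temperature_c(), 2),
    })

# A device loop would publish frames like these to an IoT platform.
for _ in range(3):
    print(telemetry_frame("sensor-001"))
    time.sleep(0.1)
```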

Developments in 3D visualization and automation have also taken place. As CAD/CAM software matured and advances in Virtual and Augmented Reality made it possible to create, study and control precise 3D objects and systems, engineers today can perform simulations before physical product design even begins, and let individual product or system components interact digitally. Once the 3D visualizations go live, they connect to inputs from IoT platforms, data historians and MESs.

The modern digital twin is a virtual, 3D replica of a business process or physical product, fed by IoT sensors and other data, exchanging and updating itself with real-time information from the physical twin. IoT platforms support the data management and connect to large databases and cloud storage that did not exist in the digital twin's infancy. DT technology is no longer limited to the deep pockets of space travel but is available across the board in manufacturing, telecom, transportation and other industries.

What sets the digital twin apart from other virtual concepts is the constant interaction and updating between the virtual and the physical model. Today's organizations often have more than one digital twin, each serving a different purpose while being able to interact with the others to provide a complete picture.

The modern digital twin has the following capabilities:

  • Combines a blueprint that describes the behavior of a physical object with real-time data exchange and updates between the virtual and physical model.

  • Models not only the visual representation but also the physical attributes of the object or process (e.g. temperature, depth, diameter, skewness, air quality, etc.).

  • Mirrors not only the entire entity or process but also the interactions of its individual components.

  • Can remotely control the object, system or process.

  • In manufacturing environments, the digital twin typically receives and aggregates real-time data from IoT sensors attached to physical objects and systems. There are other methods of collecting data for digital twins without sensors being attached to the physical twin object, such as via image-taking drones or thermal cameras. They can be more cost-effective but are typically more limited in scope.

  • Runs simulations and modifications, varies risk levels and performs stress tests to gauge the reactions of the physical model in a risk-free environment.

  • Adapts flexibly and responds in a timely manner to changing conditions of the physical object, even when dangerous situations arise quickly.

  • Collects data from MES, ERP and CAD systems.

  • Functions as a historical information repository, offering a full range of historical, functional and cost information related to the physical model. Such information may consist of blueprints, design sketches and manuals, system capabilities, history of installations and inspections, supplier information, financial and accounting data, and anything else needed to deliver a complete picture of the physical model.

  • Feeds function-specific data related to the product, system or process into usable templates to teams beyond engineering and production, such as sales & marketing, accounting or top management.

  • Blends physics and machine learning algorithms depending on the task at hand and exogenous factors. This is further elaborated in the next section.
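To make these capabilities concrete, here is a minimal sketch of the core loop: a twin object that mirrors state from incoming sensor readings, keeps a history, and runs risk-free what-if simulations off the live state. The pump example, the field names and the crude heating model are assumptions for illustration, not a reference implementation:

```python
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    """Toy digital twin of a pump: mirrors live state, keeps history,
    and supports risk-free what-if simulations."""
    temperature_c: float = 20.0
    rpm: float = 0.0
    history: list = field(default_factory=list)  # stand-in for a data historian

    def ingest(self, reading: dict) -> None:
        """Update the twin's state from a real-time reading off the physical twin."""
        self.temperature_c = reading.get("temperature_c", self.temperature_c)
        self.rpm = reading.get("rpm", self.rpm)
        self.history.append(reading)

    def simulate_overload(self, extra_rpm: float, minutes: int) -> float:
        """Stress test in the virtual model: project temperature if the load
        rose by extra_rpm. The linear heating model is a toy assumption."""
        heating_per_min = 0.002 * (self.rpm + extra_rpm)
        return self.temperature_c + heating_per_min * minutes

twin = PumpTwin()
twin.ingest({"temperature_c": 63.5, "rpm": 1450})  # e.g. fed from IoT sensors
print(f"Projected temperature after 30 min at +500 rpm: "
      f"{twin.simulate_overload(500, 30):.1f} C")
```

The point of the sketch is the separation of concerns: ingestion keeps the twin synchronized with reality, while simulation runs only against the virtual copy.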

The graph below, from a Deloitte report, illustrates the components and processes that a digital twin references:

Source: Deloitte University Press


The power of the ecosystem

Digital twins are integrated into PLM together with IoT, machine learning and other technologies. They empower businesses to make rapid, well-informed decisions based on the most recent and accurate data. Digital twins thrive as a supporting block of IoT platforms, whose frameworks can host various DTs: IoT helps digital twins become cost-effective and, in turn, the demand for digital twins drives investment in IoT platforms.

With the right ecosystem, the benefits of having a DT multiply. As technologies advance, such as 5G, improved cloud scalability and storage at lower cost, sensor technologies, and IoT platforms boasting AR, VR and AI stacks, the overall business impact of the digital twin grows. Edge devices enable computing close to the source, while cloud technologies provide scalability and collaboration. New machine learning methods allow millions of possible twin simulations to run in the cloud to find the best ways to model products and processes and bring more efficient ones to fruition.

The ecosystem of technologies breathes new power into the digital twin. One can now not only make use of physics models built on historical and well-known dependencies, but also benefit from the complex uncertainty, risk and forward-looking scenarios of machine learning and neural networks. AI models fit cases where large volumes of accurate, relevant unstructured data are available alongside uncertainty about trends. A DT paired with AI gains self-learning capabilities that can prevent catastrophic situations that would not have been possible to simulate purely on the basis of historical data and known physical parameters.
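A minimal sketch of that blend, under stated assumptions: a physics baseline (Newton's law of cooling) corrected by a data-driven residual model. The synthetic "sensor" data and the reduction of the machine-learning part to a least-squares fit are deliberate simplifications for illustration:

```python
import numpy as np

# Physics baseline: Newton's law of cooling,
#   T(t) = T_amb + (T0 - T_amb) * exp(-k * t)
T_AMB, T0, K = 20.0, 90.0, 0.05

def physics_model(t: np.ndarray) -> np.ndarray:
    return T_AMB + (T0 - T_AMB) * np.exp(-K * t)

# Synthetic sensor data: the real system deviates from the idealized physics
# (e.g. an unmodeled heat source), plus measurement noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 60, 50)
observed = physics_model(t) + 0.08 * t + rng.normal(0, 0.3, t.size)

# Data-driven part: learn the residual the physics model cannot explain.
residual_coeffs = np.polyfit(t, observed - physics_model(t), deg=1)

def hybrid_model(t: np.ndarray) -> np.ndarray:
    """Physics baseline plus learned residual correction."""
    return physics_model(t) + np.polyval(residual_coeffs, t)

rmse = lambda pred: float(np.sqrt(np.mean((observed - pred) ** 2)))
print(f"physics-only RMSE: {rmse(physics_model(t)):.2f}")
print(f"hybrid RMSE:       {rmse(hybrid_model(t)):.2f}")
```

The hybrid error drops well below the physics-only error precisely because the learned term captures behavior the physical parameters never encoded, which is the promise of pairing DTs with AI.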


Industrial footprint

In the past decades, manufacturing and avionics have been among the main users of DT technology. Industries like automotive, aerospace, oil & gas, mining, construction and energy have reported savings, increased efficiency and averted catastrophic risks via successful deployment of digital twins. The reason for the massive use of DTs in manufacturing is the large investments there are to protect and the clear benefits of sensor technology throughout the smart factory. In other areas such as agriculture, forestry or human biology, where environments change quickly and unpredictably and modelling is very challenging, the digital twin will take more time to justify and implement.

UC Berkeley professors Pushkar Apte and Costas Spanos define sustainability, smart innovation, and health and safety as key areas benefiting from the use of digital twins. In their successful pilot projects, the teams combined DTs with AI, drone technology or ground robots to achieve impactful results in commercial buildings, agriculture, wind turbines and dams, and used AI/AR/VR technologies in parallel with digital twins in telehealth.

Here are some concrete examples of applying DTs by industry sector:

Automotive. Perhaps one of the most prominent applications of DT technology comes from Tesla. The company maintains a digital twin for every physical car, streaming updates from the network of installed sensors to a simulator, which uses AI to determine whether the car needs maintenance, in some cases fixing problems via automatic software updates.
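As an illustration of the principle only (not Tesla's actual pipeline), a maintenance trigger can be as simple as flagging sensor readings that drift several standard deviations away from a rolling baseline; the vibration signal below is synthetic:

```python
import numpy as np

def flag_anomalies(readings: np.ndarray, window: int = 20, z_thresh: float = 3.0):
    """Return indices where a reading deviates more than z_thresh standard
    deviations from the rolling mean of the preceding window."""
    flags = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        sigma = baseline.std() or 1e-9  # guard against zero variance
        if abs(readings[i] - baseline.mean()) / sigma > z_thresh:
            flags.append(i)
    return flags

rng = np.random.default_rng(1)
vibration = rng.normal(1.0, 0.05, 200)  # synthetic healthy vibration level
vibration[150:] += 0.4                  # simulated onset of a bearing fault
print("first flagged sample:", flag_anomalies(vibration)[0])  # ~150
```

Production systems replace this toy detector with learned models, but the workflow is the same: the twin watches the stream and raises maintenance events before a failure occurs.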

Oil & gas. Digital twins have also found wide application in oil and gas, especially in asset integrity management (AIM) and predictive maintenance. Pipeline metadata can be accessed and consumed by various workflows such as stress analysis, flow assurance, cost estimation and more. DT applications can speed up time to first oil. DT data can be exported to fit specific external templates (e.g. a bill of materials), and exogenous factors, including current economic conditions, can be factored into the model. AI is deployed to analyze carbon emissions and power consumption and to recommend a change of direction, setting enterprises on the right foot with the climate agenda.

Shell has partnered with AVEVA on digital twins for its asset lifecycle management, and with Akselos in Nigeria to identify critical areas to prioritize for inspection and maintenance (Computer Weekly).

Energy. UC Berkeley professors Pushkar Apte and Costas Spanos found that DTs referenced to AI models can help reduce the energy consumption of commercial buildings by up to half, making a big difference for sustainability: buildings account for 40% of the world's energy consumption and a similar share of greenhouse gas emissions.

The market for wind turbines is very large: according to Bloomberg, 265,000 of them were installed between 2000 and 2019. In 2015, GE Digital introduced the world's first digital wind farm.

Some digital twin software is industry-specific, as industrial knowledge plays a key role in the success of DT applications. In 2019, the US Department of Energy made a grant to UC Berkeley, Akselos and the American Bureau of Shipping for the development of DIGIFLOAT, the first digital twin software for floating offshore wind farms.

Transport and logistics. Warehouses and logistics are in the early stages of deploying digital twins. DTs at the warehouse can be used to reduce energy consumption, simulate the movement of equipment, inventory, products and personnel, improve the use of space, and factor in operations data such as location, purchasing, requirements and more.
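One common building block for such warehouse and logistics twins is discrete-event simulation. Below is a toy sketch using the simpy library (one of several possible tools, assumed here): trucks queue for a limited number of loading docks, and rerunning with a different dock capacity answers a what-if question about the use of space. All timings are invented for illustration:

```python
import simpy

LOAD_MIN = 30     # minutes to load one truck (assumed)
ARRIVAL_GAP = 10  # a truck arrives every 10 minutes (assumed)

def truck(env, docks, waits):
    arrival = env.now
    with docks.request() as req:      # queue for a free loading dock
        yield req
        waits.append(env.now - arrival)
        yield env.timeout(LOAD_MIN)   # occupy the dock while loading

def arrivals(env, docks, waits, n_trucks=8):
    for _ in range(n_trucks):
        env.process(truck(env, docks, waits))
        yield env.timeout(ARRIVAL_GAP)

waits = []
env = simpy.Environment()
docks = simpy.Resource(env, capacity=2)  # what-if: rerun with capacity=3
env.process(arrivals(env, docks, waits))
env.run()
print(f"average truck wait with 2 docks: {sum(waits) / len(waits):.1f} min")
```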

In March 2021, the head of digital engineering of the UK's HS2 high-speed rail project described how a federation of digital twins across the European rail network will be linked, built by contractor suppliers who will then connect the digital twin assets, systems and sub-systems. The goal is to remove waste, ensure better asset management and integrate the entire railway network, worth €117 billion, through visualization and analytics (International Construction, 2021).

Construction and real estate. In 2019, the destructive fire at the Notre-Dame cathedral made headlines around the world. Donations poured in from all sides, including a €300m injection from the French luxury tycoons Arnault and Pinault. A digital twin was built to support the redesign of the old cathedral and to create a more resilient construction.

Imperial College London installed the world's first 3D-printed bridge in Amsterdam in July 2021, printed by four industrial robots over a period of six months. The bridge serves as a data hub, with sensors monitoring all movement, stress and conditions and loading the data into a digital twin that will record and manage changes over the bridge's lifetime, providing valuable knowledge to the construction engineers of the future.

According to William Ruh, the former CDO of GE who joined the multinational construction company Lendlease, the cost of DT technology, estimated at $150-200K now, will fall to $50K in the next few years. He also estimated that the use of DTs could generate savings of up to 10% on build costs (Estates Gazette, 2020).


What are the implications of this for businesses?

DTs are actively applied in use cases like integrity management, industrial asset monitoring and control, risk modelling, performance optimization, inventory management, product design and preventive/predictive maintenance. The DT can lower the cost of unplanned downtime, enable companies to identify malfunctions and issues before they occur, reduce operational and customer service costs, monitor and increase asset performance, simulate various scenarios using real-time values, and help adjust the maintenance cycle to the actual state and lifetime of equipment.

Tips for a successful digital twin journey:

1. Business goals.

  • Define the digital twin. Decide which components and units will make up a single digital twin, depending on your use case and concrete business challenges. Your digital twin does not need to cover an entire system, as financial and technology limitations may apply.

  • Focus on purpose. Some experts believe that we are still years away from mirroring each instance of a physical product. Companies in many cases decide to fully mirror only problematic areas and approximate others. Know the core reasons for implementing the digital twin to assess its real upside potential.

  • Data is king. Have a data management strategy to extract the maximum benefit from deploying a DT. Explore the weaknesses of your data and have a clear roadmap for getting it to the level where it will serve your business goals, with the efficiency and usefulness you need to stay competitive.

  • Foundation. It is preferable to have MES data maturity and advanced PLM systems in place to reap optimal benefits from deploying digital twins.

  • Decide on level of complexity. Do you want to use the digital twin in conjunction with other technologies like AI, AR, specific data analytics software?

  • Infrastructure readiness. Do you already have some IoT fabric in-house (e.g. existing sensors and actuators, analytics engines) to support the digital twin?

  • Use cases. What are the use cases that are relevant to the challenges you are trying to solve? What are your business priorities: cost reduction, performance optimization, risk analysis, asset control, etc.?

2. Product.

  • Pilot. Can you have a pilot project to assess functionality, benefits and system fit before deciding whether to purchase? What market alternatives are there that compete with the offered product or solution?

  • Common standards. What systems and tools is the digital twin interoperable with? What types of files will the digital twin be able to process? The lack of common standards is one of the stumbling blocks for widespread deployment of digital twins. Standardization will allow digital twins to work with any standard-based tool, as it enables interoperability between vendor components. Yet standards can also make it easier for attackers to gain access to digital twins and data. At present, most digital twins sit in proprietary systems and it takes time and effort to integrate them.

  • Industry fit. When evaluating digital twin software or a platform, consider its industry specialization and the expertise behind it. Research the experience of users relevant to your market with this particular product or solution via forums, case studies and other market data.

  • Flexibility. Does the digital twin platform support open standards? How quickly can it adjust to rapidly changing conditions and dangerous situations? Is it more relevant for specific use cases?

  • Complementary technologies. Can the DT blend physics and AI data? Are complementary technologies like AI/AR/Data analytics integrated and offered as a built-in component as part of a platform?

  • Data hosting. How flexible is the provided solution or platform and who will manage it? Does the supplier offer more than one option for hosting including on-premise? Is there a hybrid approach offered where you can process some of your data locally while managing selected big data in the cloud?

  • Services and cybersecurity. Assess each of the services that come with the digital twin platform against your preferences and internal capabilities - maintenance, training, modular specialization, openness to third-party vendors. Engage your cyber specialists in the discussion with the solution provider to ensure that your data will be protected.

  • Usefulness. You may have millions of data points that are streamed, structured and analyzed in a perfectly working model, but it may not be the data that is useful for your particular situation. Make sure that the digital twin uses the data that is necessary to solve the exact business problems. A lot of AI and data analytics projects fail not because of insufficient skills or poor data, but because of focusing on areas that are not useful for solving concrete business issues.

  • Pricing and licensing. What are the preferred licensing models in your industry? What arrangement would you be comfortable with? Are there different options to choose from - up-front lump sum and low monthly payments, pay-as-you-go, user or module-based licensing, fixed annual fee, etc.?

  • Integration. The beauty of the digital twin is in processing data streams from multiple sources. Who will be responsible for the integration? How flexible is the offered solution in terms of integration vendors and partners? How involved will your team be in the process?

3. Team.

  • Internalize skills to create new value. Develop software and data skills internally to stay competitive and to create new value. Engage data scientists, engineers, and the rest of the enterprise in learning and improving data analysis and machine learning skills. For example, if you are in the business of smart devices, you can avoid heavy servicing costs and unhappy customers by training your team to process data from the devices and predict their lifetimes via the use of digital twins (see the sketch after this list).

  • Straighten out liability. Now that you will be dealing with systems and integrations of data, who will be responsible for each point of failure? Outline duties and responsibilities ahead of time.

  • Engage and collaborate. Digital twins are not limited to production teams only. Use the skills of data scientists, machine learning experts, engineers, designers, sales and customer service teams to generate a thorough performance map via linking and analyzing cross-functional data to reap the highest benefits from the digital twin deployment.
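As referenced above, here is a minimal sketch of device-lifetime prediction under stated assumptions: fit a degradation trend to a device's telemetry and extrapolate when it crosses a failure threshold. The battery-health signal, its rate of decay and the 70% threshold are all invented for illustration:

```python
import numpy as np

FAIL_THRESHOLD = 70.0  # assumed: device considered end-of-life below 70% health

# Synthetic telemetry: a slowly degrading health indicator with noise.
rng = np.random.default_rng(2)
days = np.arange(200)
health = 100 - 0.09 * days + rng.normal(0, 0.5, days.size)

# Fit a linear degradation trend and solve for the threshold crossing.
slope, intercept = np.polyfit(days, health, deg=1)
predicted_eol_day = (FAIL_THRESHOLD - intercept) / slope
print(f"predicted end of life around day {predicted_eol_day:.0f}")
```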


The digital twin, when used to its full potential, amounts to a digital transformation of the company, linking the agents and players all along the value chain. Companies need to understand their supply chains and business opportunities better, and thus the digital twin becomes an enabler of real-time, informed decision-making. To achieve a good ROI, it is imperative that all partners develop and share a common set of digital values and act together, connecting all the dots to achieve quality, accuracy and competitive advantage for the enterprise and the end customer.

 
Technology | Nadejda Gountcheva