Connecting the Thread: Digital Twins
A factory floor digital twin can deliver insights to design engineers, but it takes some legwork to create a closed-loop workflow between manufacturing and engineering.
February 1, 2019
The digitalization of manufacturing, now commonly referred to as Industry 4.0, is giving birth to a new industrial asset that promises to enrich future products—the digital twin of the factory floor, which can provide engineers with valuable insights and context not widely available via traditional design practices.
Much like a digital twin of a physical product, the factory floor digital twin creates a virtual representation of the plant floor—both physical assets and their behaviors—using data collected through sensors, cameras, systems models, simulations and additional data sources, including enterprise systems.
Factory floor digital twins are steadily gaining ground among manufacturers: Gartner predicts that by 2020, at least half of manufacturers with annual revenues of $5 billion or more will have launched at least one digital twin initiative for products or assets.
Manufacturers are leveraging digital twins in multiple ways. One prominent use case is to tap a digital twin as a more cost-effective and agile way to plan and commission industrial assets on the plant floor. This allows manufacturers to optimize sequences and processes, and pinpoint potential problems well in advance of laying out and fitting up physical machinery—and with far less need for post-installation fine-tuning.
Manufacturers also see value in the pairing of digital twins and advanced analytics for monitoring plant floor performance, allowing for optimization and continuous process improvement and laying the groundwork for predictive and preventive maintenance that will reduce the risk of costly downtime.
These two examples, and other use cases, will fuel adoption of digital twins; Gartner expects the number of organizations using the technology to triple by 2022.
Beyond benefits for production planning and performance, proponents of the factory floor digital twin see a real upside outside of manufacturing, specifically for design engineers. The factory floor digital twin, they contend, can serve as a bridge between manufacturing and engineering, feeding better early intelligence back to designers. That intelligence can lead to higher quality products that are easier to manufacture, and to industrial machinery that is less prone to breakage and downtime.
The Legacy Hangover
Despite the potentially huge upside, the reality is that the factory floor digital twin is still in its early days when it comes to playing any significant role in connecting the historically siloed engineering and manufacturing domains.
Although industry pundits, 3D software vendors and internet of things (IoT) providers play up the importance of flowing this data back to the engineering organization, there are few, if any, out-of-the-box offerings that result in a full-scale, closed-loop workflow. Instead, manufacturers face a significant effort to integrate a factory floor digital twin into their own highly customized, engineering-focused solutions.
“There are many organizations that want to do this stuff, but they can’t today because too many are still stuck in the Stone Ages [when it comes to plant floor technology],” says Dave Duncan, vice president of product management at PTC. “We need to start with simpler business cases that are addressable. We don’t want to burden engineers with making human decisions off of imperfect data, but we want to provide them with more than they have today.”
The Stone Age scenario Duncan describes is the current state of many plant floor environments, which remain dominated by legacy machinery. That machinery collects and stores data in proprietary formats, making it difficult to integrate and share, and runs on proprietary industrial networks with no robust connection to the standard internet protocol (IP) networks that form the backbone of the IT enterprise.
Even the growing numbers of manufacturers that have modernized operations with sensored equipment, edge computing and IP networks face challenges stretching the utility of the factory digital twin back to engineering. The time-series data collected by industrial internet of things (IIoT)-enabled plant floor machinery isn't in a form that is useful to the average design engineer. It requires an intermediary layer that can parse and interpret the voluminous data streams for relevant insights and feed them back to engineering platforms with the proper context.
In addition to the real-time data generated by sensored machinery, there are other unstructured sources of information that need to be part of the mix, including work instruction PDFs and other data from enterprise systems like manufacturing execution systems (MES) or enterprise resource planning (ERP).
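To make the intermediary layer concrete, here is a minimal sketch, assuming a hypothetical machine name, metric and values (none of which come from any vendor's API): it collapses raw time-series readings into a per-machine summary and flags statistical outliers that merit an engineer's attention.

```python
from statistics import mean, pstdev

# Hypothetical raw readings: (machine_id, metric, value). A real IIoT
# feed would also carry timestamps and far higher volumes.
readings = [("press-01", "vibration_mm_s", v)
            for v in (2.1, 2.3, 2.2, 2.4, 2.2, 2.3, 6.8, 2.1)]

def summarize(readings, threshold_sigma=2.0):
    """Collapse raw time-series points into per-machine summaries,
    flagging values more than threshold_sigma deviations from the mean."""
    by_key = {}
    for machine, metric, value in readings:
        by_key.setdefault((machine, metric), []).append(value)
    summaries = {}
    for key, values in by_key.items():
        mu, sigma = mean(values), pstdev(values)
        outliers = [v for v in values
                    if sigma and abs(v - mu) > threshold_sigma * sigma]
        summaries[key] = {"mean": round(mu, 2), "outliers": outliers}
    return summaries

print(summarize(readings))
# The 6.8 mm/s vibration spike is flagged as an outlier; the engineer
# sees one summary line instead of the raw data stream.
```

The point of the sketch is the shape of the workflow, not the statistics: raw points go in, a small contextual summary comes out, and only the summary travels back toward engineering.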
“We need to grab hold of the data and put it into a context that makes sense,” explains Bruno Demange, marketing and industry director for DELMIA Global Industrial Operations at Dassault Systèmes. “Having access to 1 million data points for any given machine is not going to help. The main objective is to allow for the analysis of large volumes of data coming from different sets of machines and putting it into context.”
The Building Blocks
Dassault Systèmes has started the process of integrating the plant floor digital twin with engineering. Its acquisition of Apriso several years back gives it an entry to MES and operations data. Further, a partnership with OSIsoft delivers a system for collecting and processing that high-fidelity, time-series data from those and other sources directly within the 3DEXPERIENCE platform, where it can be accessed by various stakeholders, including engineers, in the context of their preferred applications.
Critical to making this scenario effective is EXALEAD, Dassault Systèmes' search and classification engine, which ensures the right data is retrieved and served up in a context where it can add value for a specific stakeholder, along with machine learning capabilities that act as the catalyst for data-driven and predictive analytics, Demange says. "With these data gathering solutions, we can achieve the first important step in doing machine learning and predictive analytics—that is, putting all required data on the 3DEXPERIENCE platform," he explains.
Although the data is technically available to any stakeholder (including design engineers) on the 3DEXPERIENCE platform, it’s not seamlessly integrated into traditional engineering tools like CAD or simulation to fuel a broad range of use cases.
Currently, engineers can view non-conformance and quality data in a 3D model in a way that visually indicates, for example, that a particular gearbox part is prone to failure based on data analyzed from the factory digital twin as well as from the field. Engineers can then use that intelligence to redesign the part in subsequent releases.
“We plan to go further to leverage the power of EXALEAD and Apriso in context of the 3D model,” Demange says. “We’re starting with quality-related data, but we plan to extend what’s available in the context of 3D models in the coming months.”
PTC is also leveraging its ThingWorx IoT platform, Kepware industrial connectivity solutions, Windchill product lifecycle management (PLM) software and a partnership with automation giant Rockwell Automation to create a digital thread between the factory floor and engineering. One of the first fruits of the Rockwell partnership is the FactoryTalk Innovation Suite, powered by PTC, which integrates Rockwell's analytics and manufacturing operations management (MOM) platforms with ThingWorx, Kepware and PTC's Vuforia augmented reality solution. The solution, which provides a view of manufacturing operations tailored to specific roles, isn't necessarily aimed at engineering, though this group could benefit from its intelligence, particularly as it relates to failure modes and effects analysis (FMEA).
“The next step is to create a connection right into Windchill,” says Duncan, adding that PTC is working on a closed-loop connection for FMEA and non-conformance data into Windchill by the end of this year. Moving forward, the plan is to layer artificial intelligence data over these closed-loop workflows to deliver insights that weren’t possible before.
For its part, Siemens PLM Software is working with a similar set of building blocks to make factory floor digital twins relevant for the engineering community, including its full portfolio of 3D modeling, systems modeling and simulation tools in addition to its MindSphere cloud-based open IIoT operating system that connects plant floor machinery and enterprise systems while delivering access to advanced analytics services.
Siemens also sees potential for engineers to use factory floor digital twin intelligence as part of the design workflow, according to Grama Chethan, a consultant in Siemens' office of architecture and technology, part of the CTO group. Real-world intelligence gleaned during production can fortify design decisions that would otherwise rest on conjecture, and it can help validate and build more accurate simulation models.
Consider a crane in operation on the plant floor. Today, testing on that crane is typically done in isolation, the results are interpreted manually, and important intelligence reaches the design team only at the discretion of the players involved.
With an IIoT platform like MindSphere, the proper mapping of sensors and, most importantly, the right contextualization, engineers can turn what has been a mostly manual and subjective process into an automated workflow. This can surface key insights (for example, that operation under a specific set of humidity conditions led to a particular part failure on the crane), enabling engineers to make better design choices that address the issue in subsequent product iterations.
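At its core, that kind of contextualized insight is a grouping of failure outcomes by operating condition. The sketch below illustrates the idea with entirely hypothetical records (it is not MindSphere code): maintenance outcomes are joined to the humidity band logged during each run, and failure rates are compared across bands.

```python
# Hypothetical records joined from sensor and maintenance data:
# (humidity band during operation, whether the part later failed).
records = [
    ("low", False), ("low", False), ("low", False), ("low", True),
    ("high", True), ("high", True), ("high", False), ("high", True),
]

def failure_rate_by_condition(records):
    """Group failure flags by operating condition; return a rate per band."""
    counts = {}
    for band, failed in records:
        total, failures = counts.get(band, (0, 0))
        counts[band] = (total + 1, failures + int(failed))
    return {band: failures / total
            for band, (total, failures) in counts.items()}

print(failure_rate_by_condition(records))
# -> {'low': 0.25, 'high': 0.75}: failures cluster in high-humidity runs.
```

In practice the hard part is the join itself: getting sensor streams and maintenance records onto a shared asset and time context so that a comparison this simple becomes possible.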
Although the framework and technology foundation is available to get this done, Chethan admits the process is not yet turnkey and is difficult to scale. “It’s really complex today and most of the projects are done as one-off services like putting plumbing together,” he explains.
IIoT platforms like those from PTC, SAP, GE and others are critical building blocks to enabling the digital twin and for closing the loop between manufacturing and engineering, notes Manzoor Tiwana, ANSYS product manager for Twin Builder.
The other key enabler is a physics-based digital twin, or what ANSYS describes as an integrated, multi-domain system simulation that mirrors the life and experience of an asset. ANSYS Twin Builder facilitates the design of these models and reduces the need to interpret data collected by factory floor sensors, he explains.
“By utilizing this data, the design engineer can greatly improve the product design as they better understand operations and the environment in the field,” Tiwana explains. “The data can also expose failure modes not thought of before and give insight into how manufacturing variances of a component used can affect the end product.”
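A physics-based twin in this sense can be as simple as a differential equation stepped alongside the asset. The sketch below uses hypothetical parameter values and is not ANSYS Twin Builder code: it models a motor's temperature with a first-order lumped thermal model, so the twin's prediction can be compared against the sensor reading rather than interpreting raw data directly.

```python
# Hypothetical lumped thermal model: C*dT/dt = P - (T - T_amb)/R.
AMBIENT_C = 25.0   # ambient temperature, deg C
R_TH = 0.5         # thermal resistance, K/W
C_TH = 200.0       # thermal capacitance, J/K
DT = 1.0           # integration time step, s

def step_twin(temp_c, power_w):
    """Advance the model one Euler step."""
    d_temp = (power_w - (temp_c - AMBIENT_C) / R_TH) / C_TH
    return temp_c + d_temp * DT

temp = AMBIENT_C
for _ in range(600):           # ten minutes at a constant 100 W load
    temp = step_twin(temp, 100.0)

# Steady state for this model is T_amb + P*R_th = 75 deg C; after ten
# minutes (six thermal time constants) the prediction is close to it.
print(round(temp, 1))          # -> 74.9
```

A sustained gap between this prediction and the measured temperature is the kind of signal that exposes an unanticipated failure mode or a manufacturing variance in the installed component.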
Nevertheless, Tiwana admits there are still challenges in getting the data, and more importantly, the insights, back into the hands of engineers where they can make a difference in future product design. “From a technology point of view, it’s there,” he says. “It’s what your overall digital twin vision is for the enterprise, how you implement it with people and processes—that’s the missing piece at this point.”
About the Author
Beth Stackpole is a contributing editor to Digital Engineering. Send e-mail about this article to [email protected].