Simulation Lifecycle Management’s New Mission

The discipline takes on new challenges from IoT, apps and the cloud.

Visualization of results without the need for specialized post-processing tools enables non-experts to quickly access critical insights that simulations can provide. Image courtesy of Siemens PLM Software.


At its LiveWorx conference this May, PTC demonstrated the digital twin of a Santa Cruz mountain bike. The virtual model incorporates data streamed from sensors attached to the physical bike. Image courtesy of PTC.

Matteo Nicolich, product manager for Enterprise Solutions at ESTECO, identifies one of the hidden costs of democratization. “By democratizing simulation, you let more people run simulation, so more data will be generated,” he says.

The push to democratize simulation and the spread of design optimization have fundamentally changed simulation lifecycle management (SLM). The discipline may have initially been developed as a version- and history-tracking tool, with some job queue management utilities on the side. But the sheer size of the source files, the number of iterations involved and the simultaneous evaluation of hundreds of design variants quickly transformed SLM’s mission. It’s now tasked with IP (intellectual property) guardianship, process automation, HPC (high-performance computing) management and remote visualization, for a start.

When the Internet of Things (IoT) arrived on the scene, it also brought its own Big Data headaches. Industry watcher Gartner predicted “4.9 billion connected things will be in use in 2015, up 30% from 2014, and will reach 25 billion by 2020.” The volume of data from these connected devices and products—heart rates reported by smartphone health apps, climate data uploaded by installed wind turbines and engine performance data collected from moving vehicles, to name but a few—represents new challenges for SLM.

SLM’s new challenges are also its greatest opportunities. The real-time data available from connected devices, well-defined processes captured as apps, and the (almost) infinite computing power available on demand are about to catapult simulation to new heights.

The Impact of IoT

“Simulation data is massive; device data is much more transactional. SLM software has to address that difference. They’ll need other database strategies that are more transactional,” says Todd McDevitt, marketing director, ANSYS.

The database challenges notwithstanding, McDevitt foresees new types of simulation made possible by device data. “Today, simulation users use lab data and sometimes assumptions for input, for values like loads, electrical charges and aerodynamics. With industrial IoT, we have an opportunity to use real-time data, measured and collected by thousands of products in the field. At best, analytics and measured data from a device can only predict when something is going to happen, given the existing configuration of the device and its operating environment. They can’t tell what will happen if you change these parameters in a substantial way, or how to optimize the performance of your product. This is where simulation fits in. It elevates Big Data from being predictive to prescriptive,” he says.

The source data for simulation tends to come from controlled physical tests, covering only a limited use of the product under idealized conditions. For example, an automotive simulation might be based primarily on a professional driver’s maneuvers on a closed course. However, with real-time data, the range and variety are almost infinite. “Now, with real-time data, we can simulate what happens when a weekend warrior or a soccer mom is behind the wheel,” says McDevitt.

But there’s also the possibility that real-time IoT data may spawn “digital twins that mimic the real physical entities the customers are using,” says S. Ravi Shankar, director of Global Simulation Product Marketing at Siemens PLM Software. “As the real product goes through its lifecycle, the IoT data coming from the field and reflective of wear and tear can be used to keep the digital twin synched to the real object,” he says.

With such a setup, manufacturers could, for example, periodically conduct fatigue analysis on the digital twin of a bike, not only to understand its current state, but to predict when it might break or fail. The concept has been prototyped and presented by PTC at its LiveWorx conference, an event devoted to the company’s IoT offerings.

In a live demonstration at the event, Mike Campbell, executive vice president of CAD products at PTC, showed a digital twin replicating the movements and mechanical behaviors of a mountain bike from Santa Cruz Bicycles, using wheel speed, pedal cadence, pressure on suspension and steering angles reported by sensors mounted on the bike in operation in the field. The setup used real-time sensor data from PTC’s ThingWorx to animate the 3D CAD model of the bike constructed in PTC Creo Parametric software. The digital twin was presented in an iPad augmented reality viewer app.
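
To make the pattern concrete, here is a minimal Python sketch of the synchronization loop such a demo implies: poll the streamed sensor readings and apply them to the twin’s state, which a viewer would then render. The endpoint URL, property names and BikeTwin class are hypothetical illustrations, not PTC’s actual ThingWorx API; the real demonstration drove a Creo model inside an AR viewer.

```python
import time
import requests  # third-party HTTP library; assumed installed

# Hypothetical endpoint and property names -- illustrative, not PTC's
# actual ThingWorx API.
TWIN_URL = "https://iot.example.com/things/santa-cruz-bike/properties"
PROPS = ["wheel_speed", "pedal_cadence", "suspension_pressure", "steering_angle"]

class BikeTwin:
    """Holds the twin's current state; a real viewer would map this state
    onto the 3D CAD model instead of printing it."""
    def __init__(self):
        self.state = {p: 0.0 for p in PROPS}

    def update(self, readings):
        # Overwrite the state with the latest field data so the twin stays
        # synchronized with the physical bike.
        self.state.update(readings)

    def render(self):
        print(", ".join(f"{k}={v:.1f}" for k, v in self.state.items()))

twin = BikeTwin()
while True:
    resp = requests.get(TWIN_URL, timeout=5)  # fetch the latest sensor values
    readings = resp.json()
    twin.update({p: readings[p] for p in PROPS})
    twin.render()
    time.sleep(1.0)  # poll once per second; a production setup would subscribe
```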

Feeding into System-Level Simulation

The sensor data that must be incorporated into simulation is usually the domain of system-level simulation, conducted in software programs like Modelica. In such system models, which represent the product at a high level of abstraction, the electromechanical assemblies simulated in CFD (computational fluid dynamics) and FEA (finite element analysis) programs are considered subsystems. SLM (which governs CFD and FEA data) and system modeling have in the past remained apart from each other. But the pursuit of IoT may bring the two closer.

“We don’t believe these two types of simulation need to be kept in their own silos,” says Shankar.

Similarly, at ANSYS, “We see them [system modeling and FEA simulation] as coupled simulation. We have a technology that lets you create a reduced order model out of your detailed 3D model, and use that for your system level representation,” McDevitt says.
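
The coupling McDevitt describes can be sketched in a few lines: sample the expensive detailed model at a handful of points, fit a cheap surrogate, and let the system-level loop call the surrogate at every time step. The quadratic fit and the sample data below are invented stand-ins for illustration; ANSYS’s actual reduced-order-model extraction is far more sophisticated.

```python
import numpy as np

# Invented stand-in for detailed 3D FEA results: peak stress sampled at a
# handful of load cases (in practice these points come from the solver).
loads = np.array([100.0, 200.0, 300.0, 400.0, 500.0])    # applied load, N
peak_stress = np.array([12.1, 24.5, 37.2, 50.3, 63.8])   # computed stress, MPa

# "Reduced order model": a cheap surrogate fitted to the detailed results.
rom = np.poly1d(np.polyfit(loads, peak_stress, deg=2))

# The system-level simulation can now evaluate this subsystem thousands of
# times without re-running FEA at every time step.
for t in np.linspace(0.0, 1.0, 5):
    load_t = 300.0 + 150.0 * np.sin(2.0 * np.pi * t)  # toy load history
    print(f"t={t:.2f} s  load={load_t:7.1f} N  stress={rom(load_t):6.2f} MPa")
```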


ANSYS’ system modeling program Simplorer is described as “an intuitive, multi-domain, multi-technology simulation program that enables engineers to simulate complex power electronic and electrically controlled systems.” ANSYS customers may use ANSYS Workbench as the integration platform to bring together system modeling and finite element analysis.

Juggling Private and Public Cloud

Among small- and mid-sized businesses (SMBs), SLM may also be in the midst of a transition from in-house data centers to remote clusters and on-demand cloud resources. In the long run, the public cloud levels the playing field with its pay-as-you-go pricing and no-upfront-investment proposition. But in the transition period, SLM may need to straddle both on-premise clusters and the on-demand cloud. The hybrid approach lets companies continue to use the hardware they have already invested in while tapping additional horsepower from outside to absorb peak demands and overflows.
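
A hybrid setup ultimately comes down to a dispatch policy: fill the slots you have already paid for first, then burst the remainder to pay-as-you-go capacity. The sketch below shows that policy in Python; the capacities, job names and slot counts are hypothetical, not any vendor’s actual scheduler.

```python
from dataclasses import dataclass

ON_PREM_SLOTS = 8  # hypothetical capacity of the in-house cluster

@dataclass
class Job:
    name: str
    slots: int  # compute slots the solver run needs

def dispatch(jobs):
    """Fill the in-house cluster first; burst the overflow to the cloud."""
    free = ON_PREM_SLOTS
    for job in sorted(jobs, key=lambda j: j.slots):  # small jobs first
        if job.slots <= free:
            free -= job.slots
            print(f"{job.name}: on-premise cluster ({free} slots left)")
        else:
            print(f"{job.name}: burst to pay-as-you-go cloud")

dispatch([Job("doe_run_01", 2), Job("doe_run_02", 2),
          Job("crash_full_vehicle", 16)])
```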

“We make our solvers available on Rescale, an on-demand cloud simulation platform. Customers have the option of using their existing licenses for the Siemens solvers or using licenses provided via Rescale on a pay-per-use basis. The advantage for the customer is the scalability of the IT platform,” Shankar says.

When looking at SLM, companies should also consider its broader implications. “You can’t talk about SLM without talking about process management and HPC management. HPC clusters have to be managed and monitored to keep track of the jobs’ progress. Our customers simulate very large sub-systems and complete products. Workflows can involve several groups distributed around the world. Organizations need tools to coordinate the process as well as manage the data,” says McDevitt.

In May, ANSYS released ANSYS Enterprise Cloud, which allows businesses to integrate public cloud resources into their simulation workflow. Users may access and manage the in-house and remote computing resources through ANSYS EKM, configurable for both individual users and collaborative teams.

Remote Visualization Is Essential

Because of the size of the models involved in large-scale simulation runs, most experts recommend an IT setup and workflow that avoids or minimizes data movement. “Downloading and uploading simulation data involves a lot of wasted time and resources. So you want the data to reside in one place where people can access it and view it. Remote visualization is critical,” McDevitt says.

This method can also minimize data transfer. “When you take on design optimization, you need a lot of computing power and also generate lots of data. With powerful remote visualization tools, there’s no need to transfer the data back to your local systems,” says ESTECO’s Nicolich.

Capturing the Process in an App

In the past, developing repeatable simulation processes and protocols was just a prudent way to conduct business. But there’s an added benefit. “Mapping out your process and codifying your simulation protocols takes some time and energy away from regular work, so you might think it’s secondary, but to develop a SimApp and gain its benefits requires a streamlined process,” says Juan Betts, managing director of Front End Analytics.

A simulation app—or SimApp, in Front End Analytics lingo—is usually built on top of general-purpose simulation solvers. With limited input fields and guided steps, these apps have become one of the best ways to make software-driven simulation more accessible to non-experts. Front End Analytics specializes in developing and deploying SimApps. “There have been cases where we had to diplomatically tell our customer that their processes are too ad-hoc to create a SimApp. We have then worked with these customers to mature their processes to the point where we can create a SimApp for them,” says Betts.
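
The essential structure of such an app can be sketched in a few lines: a thin layer of validated inputs and a plain-language verdict wrapped around a solver call. In the sketch below, a closed-form cantilever deflection formula stands in for the general-purpose solver, and the input limits and pass/fail threshold are invented; it illustrates the pattern, not Front End Analytics’ or EASA’s actual platforms.

```python
def beam_deflection_solver(length_m, load_n, e_pa, i_m4):
    """Stand-in for a general-purpose solver run: tip deflection of a
    cantilever beam under an end load, delta = F*L^3 / (3*E*I)."""
    return load_n * length_m**3 / (3.0 * e_pa * i_m4)

# The "app" layer: a few validated inputs and a canned verdict, so a
# non-expert never touches meshing, solver settings or post-processing.
LIMITS = {"length_m": (0.1, 3.0), "load_n": (1.0, 5000.0)}  # validated ranges

def run_simapp(length_m: float, load_n: float) -> str:
    for name, value in (("length_m", length_m), ("load_n", load_n)):
        lo, hi = LIMITS[name]
        if not lo <= value <= hi:
            return f"Input '{name}' is outside the validated range [{lo}, {hi}]."
    # Material and cross-section are fixed by the captured process
    # (steel, E = 200 GPa; hypothetical second moment of area).
    delta = beam_deflection_solver(length_m, load_n, e_pa=200e9, i_m4=8.3e-6)
    verdict = "OK" if delta < 0.005 else "exceeds the 5 mm limit"
    return f"Tip deflection {delta * 1000:.2f} mm -- {verdict}"

print(run_simapp(length_m=1.5, load_n=2000.0))
```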

The company can build apps directly on enterprise SLM systems such as SimManager (from MSC Software) or SIMULIA SLM (from Dassault Systemes). It also uses EASA’s app-building platform to serve its clients. “If you have processes that don’t change a lot, your simulation steps are established, and data management is important to you, then we can probably build our apps directly on your SLM system,” says Betts. “But the drawback with that is, the ‘app-ification’ capabilities in SLM systems are still very limited. The variety and range of apps you can build in something like EASA is much higher.”

The best-case scenario, Betts pointed out, is where the app-building exercise leads to well-defined simulation processes in a company. “Because when you’re architecting an app, you’re also architecting a process,” he says.

Keeping the Data Clean and Lean

As a way to curb the exponential growth of SLM data, many experts recommend carefully selecting the type of data to archive. “Even if your model is just a few hundred megabytes, in a complicated process that involves a few hundred simulation runs, you will generate terabytes of data,” says Nicolich.

Even for large enterprises with considerable financial muscle, accumulating and maintaining terabytes of data for every project is not a pragmatic solution. The cost and IT burden of storage aside, the sheer volume of data would make analysis and reuse impossible.

“For design optimization or design of experiments, you may not need to keep the resulting models. You can just keep the input parameters, output responses and key performance indicators. If you need to, you can run the simulation job again. Otherwise you end up keeping lots of useless data that you will never look at again,” says Nicolich. In ESTECO’s simulation optimization platform, “You can set an expiration date on a data set or a model. So even if you forget to delete it, when the expiration date comes, the system automatically removes the data,” he says.
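
Both recommendations (keep only the parameters, responses and KPIs; let records expire automatically) can be captured in a small archive class. The class and method names below are illustrative only, not ESTECO’s actual platform API.

```python
from datetime import date, timedelta

class RunArchive:
    """Minimal sketch: store parameters, responses and KPIs rather than
    result files, and expire records automatically."""
    def __init__(self):
        self.records = []

    def store(self, run_id, inputs, outputs, kpis, ttl_days=90):
        # The multi-gigabyte result model is deliberately NOT stored; the
        # run can be re-executed from `inputs` if it is ever needed again.
        self.records.append({
            "run_id": run_id, "inputs": inputs, "outputs": outputs,
            "kpis": kpis, "expires": date.today() + timedelta(days=ttl_days),
        })

    def purge_expired(self, today=None):
        today = today or date.today()
        self.records = [r for r in self.records if r["expires"] >= today]

archive = RunArchive()
archive.store("doe_0042",
              inputs={"thickness_mm": 2.5, "load_n": 1200.0},
              outputs={"max_stress_mpa": 181.0},
              kpis={"mass_kg": 14.2},
              ttl_days=30)
archive.purge_expired()  # run periodically; expired records are dropped
```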

Balancing archival needs against the preference for data cleanliness is more difficult for those who manufacture products with longer lifecycles. “With aircraft, you may need to support the analysis of a model designed 50 years ago. With a wearable consumer device, the data is obsolete two years later,” says Siemens’ Shankar.

Cloud storage offers a viable alternative to those who prefer to archive SLM data for decades. But for some, security concerns remain a barrier. “The reluctance to go to the cloud for a long time had to do with data security. That attitude may be changing now, especially among small- and mid-sized companies. But ironically, the big guys are the ones who generate large volumes of simulation data, and they are usually the most reluctant to go to the cloud,” says Christine Wolfe, lead product manager for multiphysics offerings at ANSYS.

SLM is relatively young, certainly younger than its cousins PLM (product lifecycle management) and ERP (enterprise resource planning). The rapid expansion of SLM offerings in the last five years or so reflects manufacturers’ increased reliance on digital prototyping and simulation. Deployed with tools for HPC management, remote visualization and IP security, SLM could help consolidate ad-hoc simulation jobs into a well-defined, company-wide infrastructure for simulation-driven design. It should be treated as part of your simulation strategy, not as a necessary evil.



About the Author

Kenneth Wong

Kenneth Wong is Digital Engineering’s resident blogger and senior editor. Email him at [email protected] or share your thoughts on this article at digitaleng.news/facebook.
