Coping with the Analysis Data Deluge

The growth of simulation spurs a new quest for data management.

By Kenneth Wong

 
Dassault Systèmes
Isight 5.6 from Dassault Systèmes gives users a way to automate the execution of hundreds or thousands of simulations.

Last November, Scott Imlay, Ph.D., chief technology officer of Tecplot Inc., made a short trip across the bridge from his company’s headquarters in Bellevue, WA, to Seattle, where the Supercomputing 2011 conference was taking place. While swapping war stories with fellow simulation experts, he overheard one of his customers speaking of performing “tens of thousands of simulation runs.” The number might seem extraordinary to untrained ears, but it didn’t faze Imlay at all.

“Some [customers] are even talking about running hundreds of thousands of cases,” he notes.

In the not-so-distant past, when manufacturers had no choice but to build and destroy concept mockups made from foam core, clay and plastic to test their designs, they kept their simulation exercises to a small number. After all, it’s not economically feasible to construct and break 10,000 sample units of a product, be it an iPod docking station or a new sports utility vehicle. But the switch from physical to digital prototypes changed the practice. With a modest investment in hardware and software, design firms can now virtually explode, stretch, crash and scorch a digital mockup made of pixels—over and over, with very little overhead. In fact, they don’t even need real laboratories to run these tests. They can run them in virtual space, right from their desktop workstations. So, as if to make up for all the years they’ve been holding back, engineers have begun performing tens, hundreds and thousands of simulation runs, sometimes on many different variations of their designs simultaneously.

Studying the scattered remnants of five to 10 drop tests and making intelligent deductions from them is easy. But sorting, storing, processing and reviewing the digital output from hundreds or thousands of finite element analysis (FEA) tests? That’s beyond the scope of human capacity.

“The topic of simulation data management comes up every time I’m at an event,” reports Bob Williams, a member of Autodesk’s SIM Squad. “It’s one of the hottest topics out there right now.”

Should you treat simulation as part of your product lifecycle management (PLM) strategy, and should you store simulation information in your product data management (PDM) system? Can you manage simulation as you would a business process? What is the best approach? At present, few experts can point to a series of workflows you can follow as best-practice templates. But the worst thing you can do, as Williams observes, “is to do nothing.”

Content and Context
In technology, there are two ways to spot a pressing issue:

1. The industry begins using acronyms to refer to the problem. (If you’re going to be talking about it a lot, you’d better abbreviate it to save your breath.) So far, the data deluge in simulation has already produced two competing acronyms: SDM (simulation data management) and SLM (simulation lifecycle management).

2. An analyst firm writes a report to highlight the issue. CIMdata, a firm that closely monitors the PLM industry, decided it was time to bring SLM-related headaches to the forefront with a report—sort of like the FBI adding a new mug shot to its Most-Wanted list. The outcome was “Simulation Lifecycle Management,” a report released in July 2011. It was underwritten in part by Dassault Systèmes, which develops and markets the multiphysics simulation software suite SIMULIA Abaqus.

Tecplot
Tecplot Chorus gives users a way to visually compare, contrast and inspect a series of simulation sessions from a single window.

Keith Meintjes, Ph.D., CIMdata’s practice manager for simulation and analysis, points out what might be the crux of the problem: “Individual engineers running their own simulation projects know what they’re working on today. But ask them about the details of a project they did six months ago, and they’re not quite sure. Ask a different engineer to look at another’s project, and they won’t really know what it is. The context and rationale are missing.”

Most simulation software packages were originally developed to answer urgent design questions—one at a time: What would happen to the display screen of a phone when the battery heats up over three hours of continuous usage? (That calls for a thermal-electrical analysis job, in anticipation of a teenager chattering on a smartphone.) How will saline fluid behave inside the redesigned chamber of a catheter? (That’s a multiphysics computational fluid dynamics, or CFD, job.) Most analysis software packages are quite capable of addressing these questions as they come up, but storing, managing and comparing the outcomes from hundreds and thousands of analysis jobs is not their specialty. If they include tools for that, those tools are mostly an afterthought, not part of the software’s primary mission.

To reuse designs, engineers must know the context of a previous design. When the context is missing, and figuring it out seems like a daunting task, an engineer is invariably tempted to make a new design rather than to reuse an old one. Meintjes recalls a costly incident involving the reuse of an automobile horn in a new vehicle—“except the orientation of the installed horn was not recorded with the original design. With a different installation in the new vehicle, there was a 100% failure due to water intrusion and corrosion.”

For simulation, a lack of context may force engineers to re-create a mesh model, rewrite the specifications (materials, load, pressure, fixture types, etc.), and run a test that someone else might have already run six months ago. In a business where digital simulation is routinely performed, duplicate simulation runs can tie up precious computing resources and server time—not to mention the delays in decision-making. For example, no one knows about the CFD test an engineer has run to verify that the valves are wide enough. So the same test is rerun, prompting the design team to put everything on hold for the next 24 to 48 hours.
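One way to catch an unnecessary rerun before it ties up a cluster is to fingerprint each run’s inputs and check new jobs against past fingerprints. The sketch below is a generic illustration of that idea, not a feature of any product mentioned here; the run_or_reuse helper and the solver callable are hypothetical placeholders.

```python
import hashlib
import json

def run_fingerprint(mesh_path, settings):
    """Hash the mesh file plus the solver settings that define a run."""
    digest = hashlib.sha256()
    with open(mesh_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    # Sort keys so identical settings always produce the same fingerprint.
    digest.update(json.dumps(settings, sort_keys=True).encode())
    return digest.hexdigest()

completed_runs = {}  # fingerprint -> path to the archived result set

def run_or_reuse(mesh_path, settings, solver):
    """Skip the solver call if an identical case has already been run."""
    key = run_fingerprint(mesh_path, settings)
    if key not in completed_runs:
        completed_runs[key] = solver(mesh_path, settings)  # submit the job
    return completed_runs[key]
```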

“You need to manage the process, and also the results,” says Meintjes. “You need to know the provenance of the data [for example, the specific CAD model used to create the mesh model], to make sure you’re using the correct geometry, material properties, and so on. You also need to understand the context of the data: What design question or performance evaluation does the simulation address?”

Selective Storage
As part of the Autodesk SIM Squad, Williams and his colleagues take the time to respond to simulation software users’ questions. He says the methods some frustrated project managers have tried to tame the monstrous growth of simulation data “run the whole gamut, from a network location [a shared folder on a server] to store result files, to burning DVDs to archive and setting up terabyte [removable] drives.”

Instead, Williams suggests a delicate balancing act: store enough data to be able to recreate the simulation conditions if you need to, but “don’t store so much that it overwhelms you.” If you ran a transient analysis, for example, do you need to retain the digital data generated for every time step of the event simulated? Can you settle for storing the data representing a critical stage in the event (say, the stage at which the product fails)?
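A minimal sketch of that balancing act follows, assuming the solver hands back one (time, peak stress, data) tuple per time step; the 95% threshold and the tuple layout are illustrative assumptions, not rules from any particular package.

```python
def steps_to_keep(steps, yield_strength, every_nth=10):
    """Decide which frames of a transient run are worth archiving.

    steps: list of (time, peak_stress, payload) tuples from the solver.
    Keeps every Nth frame for context, plus any frame where peak stress
    approaches the material's yield strength (the stage at which the
    product fails), which is the part you most need to recreate later.
    """
    kept = []
    for i, (time, peak_stress, payload) in enumerate(steps):
        near_failure = peak_stress >= 0.95 * yield_strength
        if near_failure or i % every_nth == 0:
            kept.append((time, peak_stress, payload))
    return kept
```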

Williams points out that one drawback with just storing the metadata—material properties, load and base geometry—is the possibility that, if you rerun the same simulation exercise using the same metadata but on a different hardware platform or a newer version of the analysis software, there’s no guarantee that you’ll get the same result. (Presumably subsequent analysis runs produce better results, generated by better hardware and improved solver code.)
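One hedge against that drift is to archive a snapshot of the computing environment alongside the results, so a later rerun can at least be traced to a different solver version or platform. The sketch below is a generic illustration; the solver name and file names are made up for the example.

```python
import json
import platform
from datetime import datetime, timezone

def environment_snapshot(solver_name, solver_version, input_files):
    """Record the software and hardware that produced a result set, so a
    later rerun on newer hardware or a newer solver can be compared fairly."""
    return {
        "solver": solver_name,
        "solver_version": solver_version,  # results can shift between versions
        "platform": platform.platform(),   # OS and machine description
        "processor": platform.processor(),
        "inputs": input_files,             # mesh, material and load files used
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }

# Store the snapshot next to the result files it describes.
with open("run_042_environment.json", "w") as f:
    json.dump(environment_snapshot("GenericSolver", "2012.1", ["bracket.inp"]), f, indent=2)
```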

A more formal approach is to use a PDM, like Autodesk Vault, to index and archive the data generated from simulation sessions.

“You can check in or check out files related to the simulation job,” Williams explains. “You can also add notes and comments to the files. Managing simulation files and procedures inside PDM also helps you communicate better with designers.”

Another approach is to use a dedicated simulation lifecycle management application, such as SIMULIA SLM from Dassault Systèmes. The product extends Dassault’s ENOVIA PLM with functionality that is specific to managing simulation.

“This makes simulation a managed, repeatable process, which is important to many of our customers, ensuring consistent quality in the same way a well-managed assembly line ensures the quality of a manufactured product,” said Steve Levine, SIMULIA’s senior director of strategic planning. “The globalization of product engineering has made it increasingly challenging to ensure that best-in-class simulation methods can be shared and used across sites and national boundaries, while protecting important intellectual property. In practice, simulation contains critical information regarding the steps and conditions of the analysis—or in other words, the simulation scenario. We’ve learned that defining and managing this context is essential to leveraging the value of the data being stored, an aspect that is often overlooked.”

 

Siemens PLM Software
Siemens PLM Software’s Teamcenter data management allows you to manage simulation data, such as mesh models, material properties and result files.

Who, What, When, Where, Why
To better manage simulation data, engineers need to record not just a set of results—the temperature at which a design fails, along with an animation file showing the deformation process—but the five Ws investigative journalists always demand: Who performed the simulation? What type of simulation was performed? When, or at which stage of the design cycle, did the simulation occur? From where did the geometry, material properties and load conditions originate? And, perhaps most importantly, why was the simulation done?
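Captured consistently, those five Ws amount to a small, searchable record attached to every run. A minimal sketch of such a record follows; the field names and example values are assumptions for illustration, not a schema from any SDM or SLM product.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SimulationRecord:
    """The five Ws of a simulation run, captured as searchable metadata."""
    who: str              # analyst who performed the run
    what: str             # type of simulation performed
    when: date            # date, or design-cycle milestone
    where_from: dict      # provenance: CAD revision, material card, load case
    why: str              # the design question the run was meant to answer
    results: list = field(default_factory=list)  # paths to archived output

record = SimulationRecord(
    who="j.doe",
    what="transient thermal-electrical",
    when=date(2012, 1, 15),
    where_from={"cad_revision": "phone_housing_rev_C", "material": "ABS card 7"},
    why="Does the display overheat after three hours of continuous use?",
)
```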

“Traceability is important,” notes S. Ravi Shankar, director of simulation product marketing at Siemens PLM Software. “There are more people inside a company [besides engineers and simulation experts] who need access to the knowledge that CAE [computer-aided engineering, or simulation] programs are generating. How do you make it more accessible without forcing them to become simulation experts themselves?”

The logical answer, Shankar proposes, is to manage simulation data inside a PLM system.

“What we’ve done with Teamcenter [Siemens’ data management software] is to enhance the data model so it recognizes CAE data,” he explains. “The data model enables Teamcenter to not only capture all CAE data, such as finite element meshes or loads, but also to establish the proper relationships between the various types of CAE and design data. We set it up so that you can launch the simulation application—be it a solver or a pre-processing program—from the Teamcenter environment itself. And the results and reports are stored back in Teamcenter with the right links to existing data. This enables others to figure out, at a future date, what work was done, on what version of the design, what the results were and, if the design was changed because of it, who was notified.”

Because PLM systems track nearly every facet of the design cycle and revisions, if the design has been altered since a simulation was performed, it will be much easier for engineers to identify CAE results as out-of-date, Shankar points out.
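The underlying check is simple once each result carries a link to the design revision it came from. The sketch below is a generic illustration of that bookkeeping, not Teamcenter’s actual data model; the record fields are hypothetical.

```python
def stale_results(cae_results, current_revision):
    """Flag results whose source design revision no longer matches the
    current one; a simplified stand-in for what a PLM data model does when
    it links each result to the design revision it was built from."""
    return [r["id"] for r in cae_results
            if r["source_revision"] != current_revision]

results = [
    {"id": "run_017", "source_revision": "B"},
    {"id": "run_042", "source_revision": "C"},
]
print(stale_results(results, current_revision="C"))  # ['run_017'] is out of date
```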

Teamcenter can also automate the process of building the correct CAE model for a given type of simulation, explains Shankar. If the user is performing, say, a noise and vibration analysis, he or she can specify “the different components he or she needs to take into consideration, the type of mesh model to use, and the process is automated to build the CAE assembly. So when the user opens the model in NX CAE [Siemens’ simulation software] or a preferred pre/post environment, all the parts are in the right place. Some of our automotive customers have used this type of automation to significantly speed up the model build process.”

Strength in Numbers
Another sign of digital simulation’s growth: Users are shifting from processing one scenario at a time to considering many scenarios simultaneously. It’s reflected in Isight 5.6, the latest release of Isight from Dassault Systèmes, which is said to provide “designers, engineers and researchers with an open system for integrating design and simulation models—created with various CAD, CAE and other software applications—to automate the execution of hundreds or thousands of simulations. Isight allows users to save time and improve their products by optimizing them against performance or cost metrics through statistical methods such as Design of Experiments (DOE) and Design for Six Sigma.”

One of the new features is a method for users to compute and sample around the most probable point of failure in a design, according to a Dassault Systèmes announcement. The software also includes updates to the Abaqus component (part of the SIMULIA software suite) to run multiple Abaqus cases. It does so with the option to parse all detected input files and create output parameters for multiple analyses. An improved data-matching component then lets you define and match multiple result data sets within multiple ranges.
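The mechanics behind such a sweep are straightforward: enumerate the design points, submit one solver case per point and collect a metric for each. The sketch below shows the idea with a full-factorial DOE and a stand-in solver; the variable names and the run_simulation function are illustrative assumptions, not Isight’s actual scripting interface.

```python
from itertools import product

# Hypothetical design variables and levels for a full-factorial DOE.
wall_thickness = [1.0, 1.5, 2.0]   # mm
rib_count = [2, 4, 6]
material = ["ABS", "PC-ABS"]

def run_simulation(case):
    """Stand-in for submitting one solver job; returns a mass-like metric."""
    density = {"ABS": 1.05, "PC-ABS": 1.15}[case["material"]]
    return density * case["thickness"] * (1 + 0.1 * case["ribs"])

cases = [dict(zip(("thickness", "ribs", "material"), combo))
         for combo in product(wall_thickness, rib_count, material)]

results = [(case, run_simulation(case)) for case in cases]  # 18 runs here
best = min(results, key=lambda pair: pair[1])               # lightest design
print(best)
```

The same loop structure scales from 18 cases on a workstation to thousands of cases farmed out to a batch queue or cloud, which is where the data-management burden described above begins.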

“SIMULIA SLM suite, including Isight 5.6, is conceived to address the needs of heavy, multi-discipline simulation users such as the automotive OEMs, to ensure that simulation can become an integral part of their business processes,” states Levine. “In our view, tremendous efficiencies in new product development will come when design options are first evaluated by realistic, multiphysics simulations that are shared, managed and even automated, uniting the best simulation technologies available with proven in-house design practices. The advent of public and private clouds will produce breakthroughs in access to high-performance computing, and become a key driver for this transformation.”

Spotting Anomalies
Technology enables you to automate hundreds of thousands of simulation jobs, but when it comes to scanning and studying the outcomes, Tecplot’s Imlay says he believes “there are few things better than the human eye for detecting anomalies in patterns.”

Tecplot, which develops and markets the Tecplot 360 software, specializes in simulation data visualization. Its latest product, Tecplot Chorus, allows you to display a series of results simultaneously in the same window, making it possible for you to identify anomalies that you might otherwise miss.

“In terms of simulation management, process management and decision support, we’re fulfilling a unique niche,” Imlay says.

If each analysis session is the equivalent of a doctor’s visit to address an individual ailment troubling your design, you might think of Tecplot Chorus’ function as the annual checkup, where the primary physician studies your design’s medical history over the past year to look for any warning signs.

Imlay admits he is not sure “every analysis project rises to this level that needs to be managed in a PLM or PDM system.” A better approach, he proposes, is to keep the original data where it is, but extract the metadata, independent variables and graphics representing the outcomes from analysis sessions for comparison and diagnostics. In the future, Tecplot plans to add more automation features to identify anomalies in mounds of FEA and CFD results. But for now, Tecplot’s aim is to keep users “in the loop, but help to optimize their time.”
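That “leave the raw data in place, index the essentials” approach can be approximated with very little machinery. The sketch below assumes each run folder holds a small parameter file and a preview image; the file names and layout are made-up conventions, not Tecplot Chorus functionality.

```python
import csv
import json
import os

def index_runs(run_dirs, summary_csv):
    """Build a lightweight index over many analysis runs without moving the
    raw data: each row records where a run lives, its key parameters and a
    preview image. The file names (params.json, contour.png) are illustrative."""
    rows = []
    for run_dir in run_dirs:
        with open(os.path.join(run_dir, "params.json")) as f:
            params = json.load(f)          # independent variables for this run
        rows.append({
            "run_dir": run_dir,            # raw result files stay where they are
            "thumbnail": os.path.join(run_dir, "contour.png"),
            **params,
        })
    with open(summary_csv, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
```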

Analysis of Analysis Results
CIMdata’s Meintjes has more than an abstract understanding of the subject. In a “previous life,” as he puts it, he directed simulation engineers for vehicle and powertrain programs, and managed simulation strategy at GM.

“We had more than 150 simulation applications to look at different aspects of [vehicle] physics. We were doing structures for stiffness and crash, thermal analysis and fluid flow, kinematics and dynamics of mechanisms, electromagnetics, human factors, and all the rest,” he recalls, noting that although 150 was an improvement from the more than 600 disparate applications the company used to juggle, the complexity continued to grow.

“Managed simulation data can be a competitive advantage; unmanaged data can become a huge liability,” he states. “A few years ago, except for a few heroic individuals’ efforts, there was very little cross-discipline analysis. Simulation engineers were pursuing competing objectives—for example, body stiffness for vehicle ride, handling and noise, vs. energy absorption for crash. Now, we have learned how to balance these requirements simultaneously, using multi-discipline optimization.”

Looking Ahead
As simulation becomes an essential part of design validation and design optimization, the discipline also becomes a point of contention—and a source of headache. Ideally, an SLM system will allow you to not only chronicle, archive and retrieve the simulations you’ve run, but provide you with a way to make sense of the cumulative outcomes, to extract wisdom that you could not get from individual simulation runs.
“History matters,” Meintjes concludes. “That’s why you need simulation data management.”

Kenneth Wong is Desktop Engineering’s resident blogger and senior editor. As part of his research for this article, he ran a wind-tunnel simulation on his own head, exported as a mesh model in an STL file. Check it out here: deskeng.com/virtual_desktop/?p=4825 and contact him via [email protected] or on Twitter @KennethwongSF.

MORE INFO
Autodesk.com
CIMdata.com
3DS.com
Siemens.com/PLM
Tecplot.com
