Access to Big Data is Changing Design

Design engineering teams turn to data and new workflows to thrive in the era of digital disruption.

The relationship between engineering design and data is changing.

In traditional product development methods, design leads to a specific dataset that describes a product. Put simply, that dataset is defined in CAD software, refined and tested in computer-aided engineering (CAE) software, and then handed off to manufacturing to become a physical product. Various informational inputs add important details throughout the process.

But this time-tested method is being forced to evolve. We are reaching a turning point where data, pre-existing or newly gathered, drives design. Digitally collected data can now inform design at every stage. Instead of only creating new ideas or using existing information to validate a design, teams can gather data from a variety of sources and use it to shape and inform the design itself. The change is coming because we now have digital access to data from many new sources, including social media, online parts libraries, materials and properties databases, and even devices connected via the Internet of Things (IoT).

Industry analysts see this rise in data-driven design as one aspect of the phenomenon known as Industry 4.0. Manufacturing companies are transforming operations, including design and engineering operations, by adding smart technologies to the mix. Manufacturing strategy consultancy Oliver Wyman forecasts that the number of connected objects across all industries will top 75 billion by 2020.

As defined by an industry consortium, Industry 4.0 refers to a fourth industrial revolution. The term is often used to explain the rise of cyber-physical systems (CPS). Beyond mechatronics, CPS are next-generation engineered systems that tightly integrate computing, communication and control technologies with engineering, manufacturing and on-site operational data.

The manufacturing industry research firm TrendForce predicts the market for “smart manufacturing solutions” such as CPS will hit $320 billion by 2020. The spending will come from upgrades in software and hardware used internally as well as new infrastructure to connect manufacturers with their products in the field. It also includes such advances as robotics, artificial intelligence, machine learning and data analytics. These last three are becoming particularly important to product development.

Some of the biggest names in computing and industrial automation are heavily investing in machine learning, one key to data-driven design. Goals include reducing labor costs, eliminating product defects, limiting product downtime, smoothing production transitions and raising overall production.

Increased competitive pressure and the need for shorter product lifecycles are among the business trends affected by these technology advancements. A focus on innovation is necessary for manufacturing organizations to conceptualize, realize and utilize new features in their products consistently and at a faster pace.

“Throughout history, as I’ve observed it, people do not change their methodology for design unless they have to,” said Mentor’s CEO Wally Rhines at Mentor’s Integrated Electrical Solutions Forum for the automotive industry last fall. Mentor was acquired by Siemens last year to advance the company’s Industry 4.0 vision. “So if they have something that works, they keep doing it. Our industry continues to come in with things that are much more efficient, faster, better and so on, but there is a strong reluctance to adopt until the system you’re using stops working and then you have to adopt.”

IoT Research Initiatives

However, research efforts are underway to translate the new possibilities of Industry 4.0 into action. One initiative is Smart Innovation Engineering (SIE), which uses explicit knowledge of formal decision events. Previous decision events, or sets of experiences, are stored and made available for reuse. Product innovation is a “highly complex process that requires vast knowledge and other external factors,” writes the research team at the University of Newcastle, Australia. SIE is a method for collecting team-based knowledge of past experiences related to product innovation.

As designed, SIE offers flexible access to previous innovative efforts, and can be further extended and used for lean innovation and sustainable innovation.
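For readers who want a concrete picture of the idea, the following Python sketch shows what experience reuse along these lines could look like. The class names, fields and similarity scoring are illustrative assumptions, not part of the published SIE method.

```python
from dataclasses import dataclass

@dataclass
class DecisionEvent:
    """One formal decision event captured for later reuse (hypothetical schema)."""
    problem: str    # short description of the design question
    context: dict   # conditions the decision was made under, e.g. {"material": "Al 6061"}
    decision: str   # what the team chose
    outcome: str    # observed result, recorded after the fact

class ExperienceBase:
    """Stores team decision events and retrieves the most similar ones for reuse."""
    def __init__(self):
        self.events = []

    def record(self, event: DecisionEvent) -> None:
        self.events.append(event)

    def similar(self, context: dict, top_n: int = 3) -> list:
        # Naive similarity: count shared context key/value pairs.
        def score(e: DecisionEvent) -> int:
            return sum(1 for k, v in context.items() if e.context.get(k) == v)
        return sorted(self.events, key=score, reverse=True)[:top_n]

# Usage: record a past decision, then look it up when a similar question arises.
kb = ExperienceBase()
kb.record(DecisionEvent("bracket material choice",
                        {"material": "Al 6061", "load_kN": 12},
                        "switched to 7075-T6", "passed fatigue test"))
print(kb.similar({"material": "Al 6061", "load_kN": 12}))
```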

A research team in Sweden is proposing Model-Driven Engineering (MDE) as another way to support the increased need for innovation and the demand for shorter product cycles in an IoT framework. A connected system of devices and controllers may be permanently committed to one data-driven purpose, or it may serve what the researchers call emergent configurations (ECs), in which the various sensors, networks and machines cooperate temporarily to achieve a goal. Because an EC’s status can change unpredictably (mobility issues, dead batteries, breakdowns), the engineering model can serve as the base reference platform for analyzing data for further design work and making real-time performance changes.
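As a rough illustration of the emergent configuration idea, the sketch below assembles a temporary set of devices to cover a goal and rebuilds the set when a member drops out. The device names, capability sets and recruitment logic are hypothetical simplifications, not the Swedish team’s actual framework.

```python
# Hypothetical sketch of an emergent configuration (EC): devices are recruited
# temporarily to cover a goal, and the set is rebuilt when a member drops out.
class Device:
    def __init__(self, name, capabilities):
        self.name = name
        self.capabilities = set(capabilities)
        self.online = True

class EmergentConfiguration:
    def __init__(self, goal):
        self.goal = set(goal)   # capabilities required to achieve the goal
        self.members = []

    def recruit(self, pool):
        """Pick online devices until all required capabilities are covered."""
        needed = set(self.goal)
        self.members = []
        for device in pool:
            if device.online and needed & device.capabilities:
                self.members.append(device)
                needed -= device.capabilities
        if needed:
            raise RuntimeError(f"goal not achievable, missing: {needed}")

    def on_dropout(self, device, pool):
        """A member went offline (dead battery, breakdown): rebuild from the pool."""
        device.online = False
        self.recruit(pool)

# Usage: a temperature-and-humidity goal served by whichever sensors are available.
pool = [Device("sensor-a", {"temperature"}),
        Device("sensor-b", {"humidity"}),
        Device("sensor-c", {"humidity"})]
ec = EmergentConfiguration({"temperature", "humidity"})
ec.recruit(pool)                 # uses sensor-a and sensor-b
ec.on_dropout(pool[1], pool)     # sensor-b fails; sensor-c takes over
print([d.name for d in ec.members])
```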

The MDE research team says companies that have already started to use model-based engineering will be able to move more quickly toward integrating the engineering model with the IoT. The long-term goal is self-adaptive systems, where the model works with both machine learning and artificial intelligence programs to inform collaborative development, improve the reusability of existing products and systems, and enable runtime self-adaptation.

Industry vendors are also investing in the IoT. For example, Dell Technologies is making a $1 billion investment in its IoT strategy to help customers make the transition, says Scott Hamilton, industry strategist for Engineering and Manufacturing.

“Over the next three years, Dell Technologies will develop IoT products, solutions, labs, a partner program and an ecosystem including data analysis, converged infrastructure, security hardware accelerators and deep learning products,” he says. See “Dell Technologies IoT Lab Strategy Evolves with Customer Needs & Maturing Market.”

The Connection Between Data-Driven Design and GPUs

Data-driven design: NVIDIA says its Quadro GP100 combines unprecedented double precision performance with 16GB of high-bandwidth memory so users can conduct simulations during the design process and gather realistic multiphysics simulations faster than ever before. Image courtesy of NVIDIA.

Big data is best analyzed using parallel computer processing — the same approach to computing used for advanced graphics. Graphics processing unit manufacturers are reporting increased use of their GPUs for data-intensive tasks such as big data analytics. One example: For the past nine years NVIDIA has sponsored a series of international GPU Technology Conferences. The first couple of years, most presentations were on graphics processing and related physics, but by 2017 such graphics-based uses were competing for attention with artificial intelligence, machine learning, robotics, and autonomous transportation.
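As a small illustration of that parallelism, the sketch below runs a basic analytics pass on a synthetic sensor log using CuPy, a NumPy-compatible GPU array library, falling back to the CPU when no GPU is present. The dataset and thresholds are invented for the example.

```python
# Illustration of a simple analytics pass on the GPU with CuPy
# (a NumPy-compatible GPU array library); falls back to NumPy if no GPU is present.
try:
    import cupy as xp          # GPU arrays, executed in parallel on the device
except ImportError:
    import numpy as xp         # CPU fallback so the sketch still runs

# Synthetic sensor log: one million temperature readings.
readings = xp.random.normal(loc=70.0, scale=5.0, size=1_000_000)

# Element-wise and reduction operations are dispatched across many parallel threads.
mean = xp.mean(readings)
outliers = xp.count_nonzero(xp.abs(readings - mean) > 3 * xp.std(readings))
print(float(mean), int(outliers))
```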

“The beauty of the virtual world is you can see the future without creating it,” said Bernard Charlès, CEO of Dassault Systèmes at last year’s Design in the Age of Experience conference in Milan, Italy. “We are investing heavily in it.”

Almost all engineering workstations are already equipped with GPUs that enable such virtual visualizations and are suitable for model-based engineering and some simulation work. A new round of workstation and GPU refreshes is taking place as designers move into virtual and augmented reality (VR and AR) and expand upfront simulation with easier-to-implement tools, such as ANSYS Discovery Live. These same upgrades will also equip workstations to handle the workload increase inherent in data-driven design.

Andrew Rink, NVIDIA’s head of strategy for manufacturing industries, sees an uptick in interest for the company’s most advanced offerings — specifically for engineering data processing. He sees benefits from using GPUs for data-driven design in a variety of feedback scenarios.

“Some products are unintentionally over-engineered,” Rink says. “Data gathered from operations can inform design modifications to components or help select better materials more appropriate to the task.” Both anecdotal and systems data can lead to lightweighting through material changes and design changes, such as mesh interiors that can be 3D printed.
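A minimal sketch of that kind of check appears below: it compares field-recorded peak loads against each component’s design load and flags parts whose margin suggests over-engineering. The part names, loads and threshold are hypothetical.

```python
# Hypothetical sketch: compare field-recorded peak loads against each component's
# design load to flag over-engineered parts that are candidates for lightweighting.
design_load_kN = {"bracket": 25.0, "hinge": 8.0, "frame_rail": 60.0}

# Peak loads observed in operation (would normally come from IoT telemetry).
observed_peak_kN = {"bracket": 9.5, "hinge": 7.2, "frame_rail": 31.0}

for part, design in design_load_kN.items():
    margin = design / observed_peak_kN[part]
    if margin > 2.0:   # arbitrary threshold for "unintentionally over-engineered"
        print(f"{part}: design load is {margin:.1f}x the observed peak -> "
              f"consider a lighter material or a mesh interior")
```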

Another data-driven design opportunity lies in understanding real-time localization issues. If a refrigeration unit is surrounded by significant swings in ambient room temperature, the data-driven design feedback loop can be used to re-engineer the unit to adapt in a more site-specific fashion.

“In some cases, depending on industry sector, customers know more about machines than the original manufacturers,” says Rink. They service it and they use it, he notes, so why not use site data to optimize both services and design?

Rink recommends starting small with data-driven design. Select one or two key attributes and analyze just that data. “Start small, find the benefit, and build from there,” he says. This approach is known as performance-based analysis (PBA). Real-world data from product prototypes is gathered specifically for product design optimization. Instead of making assumptions or using generalities to define specific capabilities, PBA uses data from a connected prototype to get targeted information. That data can then be used to redefine design boundaries and constraints, to make more efficient products.
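In the spirit of starting small, the sketch below tracks just two attributes from a single connected prototype and compares the observed envelope with the assumed design limits. The attribute names, sample values and limits are illustrative only.

```python
# Sketch of the "start small" idea: analyze two attributes from one connected
# prototype and compare the observed envelope with the assumed design limits.
import statistics

samples = {
    "motor_temp_C": [61.2, 63.8, 58.9, 66.1, 64.4],
    "vibration_g":  [0.8, 1.1, 0.7, 1.3, 0.9],
}
design_limits = {"motor_temp_C": (0.0, 90.0), "vibration_g": (0.0, 3.0)}

for attr, values in samples.items():
    lo, hi = design_limits[attr]
    print(f"{attr}: observed {min(values):.1f}-{max(values):.1f} "
          f"(mean {statistics.mean(values):.1f}), designed for {lo}-{hi}")
```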

The next step is to gather data from installed products in the field. Instead of running analytics on the results from one prototype, you now can harness the data from thousands of products to better understand use and behavior. Perhaps a product runs better in specific climates; having real-world data can help redefine product iterations to account for such performance variations.
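Scaled up to the field, the same analysis might group telemetry from thousands of installed units by climate zone to see where performance diverges, as in the sketch below; the fleet data shown is invented for illustration.

```python
# Sketch of the fleet-scale step: aggregate telemetry from installed units
# by climate zone to see where performance diverges (data below is invented).
import pandas as pd

fleet = pd.DataFrame({
    "unit_id":     [101, 102, 103, 104, 105, 106],
    "climate":     ["arid", "arid", "tropical", "tropical", "temperate", "temperate"],
    "energy_kwh":  [4.1, 4.3, 6.2, 6.5, 3.9, 4.0],
    "failures_yr": [0, 0, 2, 1, 0, 0],
})

# Mean energy use and failure rate per climate zone hint at where a redesign pays off.
print(fleet.groupby("climate")[["energy_kwh", "failures_yr"]].mean())
```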

CAD/PLM and IoT software vendor PTC says its research shows the availability of real-world data at scale makes possible:

  •   quality and reliability design improvements;
  •   analysis on the impact of potential design changes;
  •   simulation to improve system design or software control systems;
  •   insight into engineering working envelopes; and
  •   better product-based business decisions.

“We’re helping to guide the PLM industry through generational transformation reimagined in the context of AR (augmented reality) and IoT,” PTC CEO Jim Heppelmann told the crowd at the company’s LiveWorx 2017 conference. “It’s the industrial revolution colliding head on with the digital revolution.”
