
Product Development in NVIDIA Omniverse™

NVIDIA Omniverse has expanded its capabilities for helping manufacturers collaborate and improve product development processes.


The connection between Siemens Xcelerator (left) and NVIDIA Omniverse (right) will enable customers to develop full-design-fidelity, closed-loop digital twins. Image courtesy of NVIDIA and Siemens.

During the NVIDIA GTC 2021 conference last fall, NVIDIA officially rolled out NVIDIA Omniverse Enterprise, a scalable, end-to-end platform enabling enterprises to develop custom 3D pipelines and simulate large-scale, physically accurate virtual worlds.

Omniverse Enterprise is based on Universal Scene Description (USD), the open-source 3D scene description developed by Pixar. At GTC, NVIDIA also announced a number of Omniverse Connectors for a growing list of popular design, visualization and other software tools, as well as the Omniverse Connect family of extensions and application plug-ins that allow users to connect their existing applications to Omniverse.
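To give a sense of what USD actually looks like on disk: real pipelines author scenes through Pixar's pxr Python bindings or an Omniverse Connector, but USD's human-readable .usda text encoding is simple enough to sketch by hand. This stdlib-only example (no USD or Omniverse libraries assumed) writes a minimal layer with a header and a small prim hierarchy:

```python
import os
import tempfile

# Minimal sketch of USD's .usda text encoding. A valid layer needs only a
# version header and a hierarchy of typed prims; the scene below is a
# single Xform containing a cube.
USDA = """#usda 1.0
(
    defaultPrim = "Root"
)

def Xform "Root"
{
    def Cube "Box"
    {
        double size = 2
    }
}
"""

path = os.path.join(tempfile.mkdtemp(), "box.usda")
with open(path, "w") as f:
    f.write(USDA)

# Any USD-aware tool (usdview, an Omniverse app) could now open box.usda.
print(open(path).read().splitlines()[0])  # #usda 1.0
```

Because .usda is plain text, files like this also diff and merge cleanly in version control, which is part of why USD works well as a collaboration interchange format.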

For companies in the manufacturing sector that may not be fully familiar with how Omniverse can be leveraged for product development and design, we spoke to Mike Geyer, Omniverse Product Manager at NVIDIA, about how these capabilities are evolving.

For manufacturers, what types of workflows can NVIDIA Omniverse potentially enable when it comes to product development processes?

Mike Geyer: To take a step back, Omniverse has been in development for several years, and it grew out of tools that NVIDIA had initially built for our own use. We realized we did not have a good way to connect many of these foundational capabilities, but if we did it would be very valuable for our customers. Since the launch, our customers have been experimenting with a number of product development applications.

From an NVIDIA perspective, we see three functional areas when it comes to use cases in manufacturing: full-fidelity visualization, custom application development, and synthetic data generation. By combining these functional areas in different configurations, manufacturers can go beyond visualization to add physics simulation and develop full-scale digital twins of their products and environments. That is how we are thinking about Omniverse and how to address the needs of our customers, and that framework is also informing our roadmap for the next evolution of the platform. All of these capabilities serve the goal of building digital twins to model and simulate large, complex systems.

What is full-fidelity visualization?

To unpack full-fidelity visualization, what we see is the need to visualize a full spectrum of products in 3D, from discrete products such as electronics to entire factories. Full-fidelity visualization includes rendering and other advanced tools, which also aids collaboration during design reviews. Omniverse delivers the ability to connect existing tools and lets multiple users in different locations work together in different applications at the same time.

To make this level of visualization possible, we are connecting and extending existing tools with something we call ‘Connectors.’ The goal isn’t to reinvent software already on the market; it’s to seamlessly connect and extend those capabilities. For example, we are continuing to develop Connectors that provide a live-sync capability to existing software tools like PTC Creo and McNeel Rhino. We have also made a software toolkit, source code and documentation available to partners and customers so they can develop their own Connectors, and we are continuing to build them based on what our customers need.

We also have a tool in Omniverse Launcher called Connect Sample, which provides sample code for writing your own connectors. We have partners who have written these Connectors themselves in just a few days.
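The live-sync pattern a Connector implements can be sketched independently of the actual SDK. The real Connect libraries are C++ and talk to an Omniverse Nucleus server; the sketch below uses plain Python with stubbed classes, and every name in it (SharedStage, Connector, apply_delta and so on) is a hypothetical stand-in, not the real API:

```python
# Sketch of the live-sync pattern: each Connector watches for local edits
# in its host design tool and pushes deltas to a shared scene, which fans
# them out to every other subscriber. All names here are hypothetical
# stand-ins for illustration, not the Omniverse Connect SDK.

class SharedStage:
    """Stand-in for a scene hosted on a collaboration server."""
    def __init__(self):
        self.prims = {}          # prim path -> attribute dict
        self.subscribers = []    # callbacks notified on every delta

    def apply_delta(self, path, attrs):
        self.prims.setdefault(path, {}).update(attrs)
        for callback in self.subscribers:
            callback(path, attrs)

class Connector:
    """Stand-in for a plug-in that mirrors edits between tool and stage."""
    def __init__(self, stage):
        self.stage = stage
        self.local = {}          # this tool's view of the scene
        stage.subscribers.append(self.on_remote_change)

    def on_local_edit(self, path, attrs):
        self.local.setdefault(path, {}).update(attrs)
        self.stage.apply_delta(path, attrs)   # push to collaborators

    def on_remote_change(self, path, attrs):
        self.local.setdefault(path, {}).update(attrs)  # pull edits in

# Two users in different tools share one stage; an edit in one appears
# immediately in the other.
stage = SharedStage()
creo, rhino = Connector(stage), Connector(stage)
creo.on_local_edit("/Root/Bracket", {"material": "aluminum"})
print(rhino.local["/Root/Bracket"])  # {'material': 'aluminum'}
```

The key design point this illustrates is that Connectors exchange small deltas against a shared scene rather than re-exporting whole files, which is what makes simultaneous multi-tool editing practical.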

This ecosystem approach connects and extends existing tools, and it is the on-ramp to full-fidelity visualization. We can bring those different things together to facilitate a richer design review process. At the factory scale, this is really about aggregating massive data sets to conduct a factory fly-through, so you can quickly evaluate space planning, material flow and so on. There are some really interesting Connectors now: we have a Visual Components Connector for creating factory-level simulation studies, and ipolog has done interesting work around logistics and material flow optimization.

Image courtesy of NVIDIA.

You also mentioned application development – how are customers leveraging those capabilities?

Manufacturers have traditionally been forced to use software the way its vendor designed it. With Omniverse, every company has foundational tools for building its own applications. A lot of our larger customers are already approaching Omniverse from that perspective.

One customer has a very specific conceptual design review process, which is how they review new product variants and product concepts with leadership. They have built a simplified interface so that leadership can drive the design changes. They can interact with 3D models, change colors and materials, and look at what-if scenarios. It is entirely custom to them, and it looks like an internally developed tool.

On the factory side, one customer has developed their own user interface for collision detection and analysis, and others have built custom user interfaces of their own. For app development, we offer Omniverse Kit (Python-based) and Omniverse Connect (C++).

We have written in the past about the use of Omniverse and digital twins. How does synthetic data dovetail into this?

I would preface this by looking at what is going on with artificial intelligence (AI). AI is becoming more commonplace in the manufacturing space; companies are using AI tools for generating creative concept art, for example, and in applications like generative design.

NVIDIA first introduced the idea of synthetic data generation around self-driving cars. To get certified, these vehicles would have to drive billions of miles, which is not practical, so you create virtual environments that generate synthetic data for AI training. I would say that anywhere customers can imagine AI being applied is a place where synthetic data can be leveraged.

This is where Omniverse has a really compelling story, because you are not just visualizing things, but you can develop virtual worlds and digital twins that can be used to train an AI to help make decisions for itself, or to help augment human decision making. Where we are seeing that initially is in a lot of factory and warehouse applications, where you have vision-guided systems. You can use synthetic data to retrain vision guidance systems for robots when there are physical changes to the environment, for example. Rather than taking months to do this, you can retrain the AI that guides those systems synthetically in a few days.
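The core trick behind synthetic data generation is domain randomization: because the generator builds each scene, it already knows the ground-truth label for free. In Omniverse this is done with rendered imagery; the sketch below stands in with plain Python, and the scene parameters (part types, lighting conditions, coordinate ranges) are illustrative assumptions, not a real schema:

```python
import random

# Sketch of synthetic data generation via domain randomization for a
# vision-guided picking task. Each sample randomizes the part, its pose
# and the lighting; the label comes for free because the generator
# placed the part itself. All parameter names are illustrative.
LIGHTING = ["overhead", "side", "dim"]
PART_TYPES = ["bracket", "housing", "gear"]

def synthesize_sample(rng):
    """Return one randomized (scene, label) pair for training."""
    scene = {
        "part": rng.choice(PART_TYPES),
        "position_cm": (rng.uniform(0, 120), rng.uniform(0, 80)),
        "rotation_deg": rng.uniform(0, 360),
        "lighting": rng.choice(LIGHTING),
    }
    # Ground truth requires no human annotation: the generator knows it.
    label = {"class": scene["part"], "center": scene["position_cm"]}
    return scene, label

rng = random.Random(42)  # seeded so a training run is reproducible
dataset = [synthesize_sample(rng) for _ in range(1000)]
print(len(dataset))  # 1000
```

This is why retraining a vision system after a physical change can take days instead of months: regenerating a labeled dataset under the new conditions is just another run of the generator, not a new round of manual data collection and annotation.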

Finally, for manufacturers who are just starting to investigate Omniverse, what sort of hardware requirements do you recommend?

Well, on the GPU side we generally recommend at least an NVIDIA RTX™ A5500 or NVIDIA RTX™ A6000 in your workstation. For users who want to test out Omniverse and RTX, we offer a cloud-based trial through NVIDIA LaunchPad, which lets them explore Omniverse without needing an RTX GPU of their own. However, to get the full benefit of Omniverse Enterprise for product design, a professional workstation with an advanced NVIDIA GPU is the best option.


You can learn more about Omniverse and product development at the NVIDIA GTC Conference this week. This recent blog post outlines the Omniverse-related content at the conference.

In addition, Dell Technologies previously published a blog outlining how Omniverse Enterprise could be used in conjunction with its own Dell Precision workstations, Dell EMC VxRail or Dell EMC PowerEdge servers for 3D design and production. According to author John Kelly, Omniverse Enterprise can enable conceptual design reviews across locations; AI training and simulation for production robotics; supplier location; and creating massive, interactive factory layout datasets.

Dell also published a related white paper on leveraging the power of Omniverse on Dell Precision workstations.

