Can AR Enhance Design?

Though augmented reality brings benefits for collaboration and design review, hardware limitations make engineers wonder about its viability.

Propelled by a new generation of hardware and software, augmented reality (AR) appears poised to assume a significant role in product design. But how much of this perception is based on hype, and how much actual value will the technology add to the design process? Will AR technology dominate the design and development field, or will it be coupled with complementary technologies?

Before adopting AR and selecting a platform, companies must understand what this immersive technology offers, examine its strengths and recognize its shortcomings. Then they can decide whether these visualization platforms will deliver enough value to their development efforts to justify the time and resources required to fold them into their designers’ toolkits.

Taking Design to the Third Dimension

Developers of the immersive technology have been working to leverage its chief strength: visualization, which has long been a core element of the design process.

Until recently, designers relied on CAD modeling, rendering and simulation to envision and shape product concepts early in the development process. The constraints imposed by computer screens, however, have hampered designers: the 2D medium creates a disconnect between design concepts and the realities of scale and spatial context.

To break free of these constraints, AR system developers offer interactive visualization, promising to help designers shift from passive viewing to an immersive experience. This type of visualization has the potential to let engineers choose multiple paths of exploration, examining relevant points of interest from different angles and at various scales. In addition, AR can visualize clusters of interrelated data. With simultaneous access to different data sources, design teams can make better-informed decisions.

AR systems can also help engineers place their designs within the actual context of use and interact with their ideas, rather than try to glean insights by studying static images on screens. A real-time environment lets designers assess how a design responds to various conditions. It further enables engineers to understand connections between operations and product performance.

The challenge now confronting design teams is determining the extent to which these promises hold true.

Blending the Digital and Real Worlds

AR aims to provide an interactive experience in which the system overlays digital data on the real-world environment in such a way that the data is perceived as an immersive part of that environment.

The technology’s developers claim that their systems give designers the opportunity to manipulate 3D models with their hands and allow them to place design concepts in the real world, where they can walk around models and get a feel for form, proportion, mechanical processes and the product’s relationship with the environment.

“This is exactly where displaying results from a simulation on top of the real product can be extremely useful,” says Nicolas Dalmasso, chief technologist at ANSYS.

“It can help engineers and designers better understand how the product they are developing behaves within its real environment. For example, while developing HVAC [heating, ventilation and air-conditioning] systems, displaying CFD [computational fluid dynamics] results within the AR device on top of the real car really helps in the understanding of how shapes affect fluid propagation while adding the user within the environment.”

Incorporating semantic information or metadata from CAD systems directly into the AR environment can help explain or present product specifications during design reviews. For instance, gap and flush information, material properties and part information aid decision-making when reviewing the designed product.
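
To make this concrete, here is a minimal Python sketch of the idea: metadata keyed to node names in an exported AR scene so a reviewer can surface specifications in place. The data structure, field names and node names are hypothetical illustrations, not any particular CAD or AR vendor’s API.

```python
# Hypothetical sketch (Python 3.10+): carrying CAD metadata alongside the
# geometry exported for an AR design review. All names are illustrative.
from dataclasses import dataclass, field


@dataclass
class PartMetadata:
    part_number: str
    material: str
    gap_mm: float | None = None    # gap-and-flush measurements, if recorded
    flush_mm: float | None = None
    notes: list[str] = field(default_factory=list)


# Keyed by node name in the exported AR scene graph, so a reviewer who
# selects a part in the headset can pull up its specifications in place.
review_annotations: dict[str, PartMetadata] = {
    "door_panel_left": PartMetadata(
        part_number="DP-1042", material="ABS", gap_mm=3.2, flush_mm=0.4
    ),
    "hinge_upper": PartMetadata(part_number="HG-0077", material="steel"),
}


def annotation_for(node_name: str) -> PartMetadata | None:
    """Look up the metadata attached to a selected scene node, if any."""
    return review_annotations.get(node_name)
```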

“During the process of visualizing or creating an idea, there are many pieces of information required throughout the process: product specifications, interface specifications and resource requirements for every addition of a new or improved feature,” says Therese Fessenden, user experience specialist at the Nielsen Norman Group.

“Designers regularly must stop what they are doing mid-task to gather information that is not immediately on hand,” she says. “With AR, designers can continue working, dedicating mental energy toward the task at hand, without leaving or distracting themselves to gather that additional information.”

AR’s visualization strengths also include projection mapping, which has proven suitable for very large products or contexts. This feature, however, comes with a caveat.

“Space and relatively static locations become a limiting factor,” says Eric Kam, manufacturing business channel marketing and alliances director at the ESI Group.

With these features in mind, proponents of AR assert that the technology’s strengths can accelerate time to market and add new dimensions to collaboration. That said, the technology faces challenges that affect productivity at the designer’s workbench.

Hardware Shortcomings, Performance Limitations

Some obstacles compromising AR’s ability to deliver enhanced visualization relate to hardware limitations. For example, consider the delivery devices used by AR systems. The devices fall into four general categories, each facilitating immersion to varying degrees. These include heads-up displays, holographic displays, smart glasses and handheld systems. The current technology suffers from shortcomings that prevent AR devices from achieving their full potential.

“Notable barriers for real AR adoption are lack of resolution, dynamic range and field of view, which are far from what a human eye can do,” says Dalmasso.

In addition, AR system providers must address problems with the systems’ interaction capabilities.

“Interactions with the virtual product are far from being ‘natural,’” says Dalmasso. “Lack of haptic feedbacks and difficulty tracking gestures are some of the limitations slowing down adoption. Analyzing gaze (eye tracking) and some other human actions, such as voice, could help improve understanding of the user’s intent.”

AR device developers are also grappling with ergonomic issues. Users find current AR systems relatively intrusive: wearable devices can be heavy and cumbersome, and tethered systems can distract users while they attempt to perform their tasks.

Developers could address these issues by shrinking the devices, but a smaller package generally means less computing power. As a result, system developers face tough trade-offs in balancing power demands against form-factor limitations.

These power and compute limitations present an even bigger challenge than ergonomics, and they point to a larger problem facing designers using AR.

“Current AR devices have limited compute capabilities and limited memory, which generally causes the devices to struggle to perform computationally intensive tasks,” says Dalmasso. “For example, AR devices generally come with limited GPU [graphics processing unit] capabilities, preventing them from using state-of-the-art rendering techniques like real-time ray tracing or complex lighting rasterization. As a result, a photorealistic or physically correct rendering of a product cannot be easily achieved within an AR environment.”

Prepping Data for AR

The constraints imposed by limited compute resources also hinder AR systems’ ability to process CAD data into a usable format. Many AR devices simply have limited ability to render full CAD models.

“When discussing graphics with extended reality [which includes AR] professionals, you will find yourself often discussing numbers of polygons as a way of describing modeling complexity, since all the 3D models will at some point be broken down from complex solids into collections of smaller, more discrete polygonal or tessellated models,” says Kam. “The number of polygons that can be rendered in a 3D view is limited by the available computing power on the CPU and GPU of the extended reality system.”

Most—if not all—stand-alone, mobile device and handheld AR systems cannot load large polygon models. CAD data of something like a fully modeled automotive engineering dataset likely consists of tens or hundreds of millions of polygons, which exceeds the capacity of AR devices.
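
To make the mismatch concrete, a pre-flight budget check can total a model’s triangles before anyone straps on a headset. The Python sketch below assumes the open-source trimesh library and a tessellated export such as glTF; the per-device budget figures are illustrative placeholders, not vendor specifications.

```python
# A minimal pre-flight polygon-budget check. Assumes the trimesh library
# and a tessellated export (glTF/OBJ); budgets are illustrative, not specs.
import trimesh

DEVICE_BUDGETS = {  # rough triangle budgets per device class (assumed)
    "handheld": 200_000,
    "standalone_headset": 500_000,
    "tethered_headset": 2_000_000,
}


def fits_device(path: str, device: str) -> bool:
    scene = trimesh.load(path, force="scene")  # normalize input to a scene
    total = sum(len(geom.faces) for geom in scene.geometry.values())
    budget = DEVICE_BUDGETS[device]
    print(f"{path}: {total:,} triangles against a {budget:,} budget ({device})")
    return total <= budget


if __name__ == "__main__":
    # A fully modeled assembly will usually fail this check outright.
    if not fits_device("engine_assembly.glb", "standalone_headset"):
        print("Exceeds budget: simplify, decimate or cull before loading.")
```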

Therefore, designers using AR systems must perform an additional step, optimizing and simplifying the models’ geometry so that it fits the “low-poly” requirements of the device. Unlike full CAD model poly counts, these requirements are often measured in the tens or hundreds of thousands of polygons. This process reduces complexity and makes it possible to load and use the data in lightweight viewers.

An additional step, called decimation, reduces the model size by a factor of 10 or 100. Unfortunately, this process introduces inaccuracies in the representation of the model.
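
That reduction can be scripted. The sketch below uses Open3D’s quadric-error decimation on an already-tessellated mesh to target roughly a tenth of the original triangle count; the file names are illustrative, and the geometry discarded here is precisely the source of the inaccuracy just described.

```python
# A sketch of factor-of-10 decimation using Open3D's quadric-error
# simplification. Assumes the CAD model has already been tessellated
# to a mesh file; file names are illustrative.
import open3d as o3d

mesh = o3d.io.read_triangle_mesh("engine_assembly.ply")
original = len(mesh.triangles)

# Collapse edges until roughly one tenth of the triangles remain.
low_poly = mesh.simplify_quadric_decimation(
    target_number_of_triangles=max(original // 10, 1)
)
print(f"{original:,} -> {len(low_poly.triangles):,} triangles")

# The low-poly copy is what ships to the AR viewer; the geometry lost
# here is the source of the representational inaccuracy noted above.
o3d.io.write_triangle_mesh("engine_assembly_low.ply", low_poly)
```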

In other cases, engineers might manually “cull” the data by eliminating CAD objects that they decide are not needed for the immersive review. This introduces the risk that the engineer culls objects that matter to the intended review, in effect papering over the very problems the review was meant to identify.
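
A simple version of such culling might look like the following sketch, which drops any part whose bounding-box volume falls below a threshold, on the assumption that small fasteners rarely matter in a visual review. The threshold and file names are illustrative, and, as noted, the heuristic can silently discard parts the review actually needed.

```python
# A hedged sketch of manual culling with trimesh: drop parts below a
# bounding-box volume threshold. Threshold and file names are illustrative;
# this heuristic can remove exactly the parts a review was meant to examine.
import trimesh

MIN_VOLUME = 1_000.0  # illustrative cutoff, in the model's cubic units

scene = trimesh.load("engine_assembly.glb", force="scene")
kept = {
    name: geom
    for name, geom in scene.geometry.items()
    if geom.bounding_box.volume >= MIN_VOLUME
}
print(f"Kept {len(kept)} of {len(scene.geometry)} parts")

trimesh.Scene(kept).export("engine_assembly_culled.glb")
```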

“For most engineering teams, this is a non-value-added step,” says ESI Group’s Kam. “It also introduces the chance that during the optimization of data, decision-making relevant data is simplified in a way that masks a potential issue.”

Automated Processing of CAD Data

Moving models from CAD systems to AR systems is problematic for several reasons. Designers run into roadblocks thrown up by shortages of compute resources, and existing simplification, optimization and decimation processes expose the design to complications that slow, and can even compromise, the development process.

The questions that surface include: Would these problems go away if the two systems were tightly integrated? Would this eliminate or mitigate the challenges arising from the need to transfer data between the two systems? Companies like PTC believe the answer to both questions is “yes.”

“It’s a multi-step process with errors and losses of time at each end,” says Luke Westbrook, product management specialist at PTC. “AR should never be a separate, labor-intensive tool. If you’re switching between different tools, that’s suboptimal.

“The way we see it, users shouldn’t have to worry about optimizing anything to efficiently work in an AR environment,” he says. “But for this to happen, the CAD, simulation and AR tools need to be tightly integrated. The optimization should be done on your behalf and put into a format for the AR viewer.”

What Does the Future Hold?

Attempts to see what the future of AR holds inevitably lead to face-offs between AR and its chief competitor, virtual reality (VR). There is the assumption that one of these technologies will rise to the top, while the other fades into oblivion. That probably won’t happen.

“I think that AR and VR don’t have to be placed in such mutually exclusive buckets,” says Kam. “Instead, there are likely many workflows where the two display and visualization technologies are complementary.”

In the near future, advanced AR devices will likely incorporate elements commonly associated with VR. This is why headsets like Microsoft’s HoloLens are often described, somewhat confusingly, as “mixed reality” devices.

Thoughtful consideration of the two technologies’ potential ultimately reveals that a binary AR-versus-VR distinction paints an inaccurate picture of the evolutionary path of immersive technologies. The future will likely belong to devices that combine elements of both.

About the Author

Tom Kevan

Tom Kevan is a freelance writer/editor specializing in engineering and communications technology.
