February 1, 2020
Have you ever heard the story of a piano teacher with a strange pricing policy? Her rate for complete beginners is quite reasonable, but she doubles the rate for students who have taken lessons elsewhere. The justification? It takes a lot more work to untrain the habits someone has already picked up.
In bringing engineering applications to augmented reality (AR) and virtual reality (VR), pioneering developers face the same uphill battle this piano teacher recognized. After decades of parametric and NURBS modeling in desktop software, users have learned specific steps to create and edit geometric shapes. They learned to draw 2D sketches and add depth to create solids, to select edges and surfaces to apply chamfers and holes, and to pull on control points in splines to morph surfaces.
These operations were invented to overcome the limitations of the mouse, keyboard and 2D monitor, which offer no way to interact directly with the CAD model. You cannot, for example, reach through the 2D monitor to rotate a CAD component, or pull out a pin with your real fingers.
With mixed reality (MR), the previous input limitations no longer exist. Yet, the modeling paradigm persists, because it’s part of the user’s collective understanding of 3D objects.
How should 3D modeling be re-engineered for MR? Get rid of the mouse-based geometry selection and editing methods altogether? If so, replace them with what? The developers’ challenge is quite similar to the apocryphal piano teacher’s. Not only must they undo the deep-rooted modeling methods in the software user’s mind, they must also invent a whole new way of modeling for AR and VR.
In October 2019, Autodesk released Alias Create VR, a VR-ready version of its industrial design software Alias. The software is widely used by automotive designers to create complex and elegant surfaces.
“One of our key goals is to enable our customers to leverage immersive technology within our products in a way that offers a unique benefit for them. This thinking was the driver to add VR creation workflows to Autodesk Alias. Our automotive customers in particular were open to exploring how to bring concept designers into the 3D space early in the process, and VR made this possible,” notes Thomas Heermann, senior director, Autodesk design products.
In the desktop version of Alias, you use the mouse pointer to push and pull on control points in splines to shape and edit surfaces. By contrast, in Alias Create VR, you use VR controllers in a two-handed approach to push and pull on control points. You may also pull on surfaces as if they were cloth to cover certain regions. With hand tracking in certain VR systems, there’s also a possibility that soon you might just use your fingers instead of controllers.
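Whether driven by a mouse pointer or a VR controller, this push-and-pull editing rests on the same mathematics: moving a control point reshapes the curve or surface it defines. A minimal sketch with a cubic Bezier curve and de Casteljau's algorithm illustrates the general technique (this is an illustration only, not Alias's implementation):

```python
# Minimal sketch of control-point editing: a cubic Bezier curve evaluated
# with de Casteljau's algorithm. "Pulling" a control point, as a VR
# controller gesture would, reshapes the whole curve.

def de_casteljau(points, t):
    """Evaluate a Bezier curve at parameter t by repeated linear interpolation."""
    pts = [p[:] for p in points]
    while len(pts) > 1:
        pts = [[(1 - t) * a + t * b for a, b in zip(p, q)]
               for p, q in zip(pts, pts[1:])]
    return pts[0]

# Four control points of a cubic Bezier in the xy-plane.
ctrl = [[0.0, 0.0], [1.0, 2.0], [2.0, 2.0], [3.0, 0.0]]
before = de_casteljau(ctrl, 0.5)

# "Pull" the second control point upward, as a controller would.
ctrl[1][1] += 1.0
after = de_casteljau(ctrl, 0.5)

print(before, after)  # the curve's midpoint rises from y=1.5 to y=1.875
```

The same principle extends to NURBS surfaces, where a grid of control points shapes the surface instead of a single curve.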
“We are working very closely with our hardware partners, as well as observing technology trends to better understand how our users are working in VR. The biggest hurdles reside in user comfort: how it feels, whether users can do what’s needed in VR without getting fatigued. For us, the hardware is still in the early stages, but we expect there will be a lot more technology leaps in this space to come,” says Heermann.
Stylus or Fingers
Though flyingshapes is described as “CAD for VR,” Dr. Johannes Mattmann, co-founder of flyingshapes, admits he doesn’t have a strong CAD background, which actually may have helped in this case.
“When we started developing flyingshapes, we focused primarily on how modeling should work inside VR; we didn’t think about how other desktop CAD tools work,” says Mattmann.
In the markets that flyingshapes wants to target, surfacing software such as Autodesk Alias and Rhino are the dominant brands. The company’s founders and some developers came from the automotive sector, where such programs are the standard; this may explain flyingshapes’ similarity to these packages.
“The VR nature of the software opens it up for users who shy away from the effort of mastering one of today’s CAD tools,” says Mattmann.
Recently, computer peripheral device maker Logitech unveiled a pen-like device for VR, dubbed VR Ink. When deployed inside VR, the Logitech device’s top button triggers virtual ink for drawing operations. Its side buttons allow you to select and position surfaces and solids. flyingshapes is one of the application partners for the VR Ink.
With HoloLens 2’s finger tracking technology, developers like flyingshapes also have the option to allow users to draw with fingertips, the way a child might dip their finger into a paint bottle and draw on a piece of paper.
“There are pros and cons for using a stylus in VR and also for using fingers. We definitely want to support both,” notes Mattmann. “Currently, devices with finger tracking are more expensive because it adds complexity. But soon, it could be the opposite.”
Mattmann and his colleagues believe that, as adoption picks up momentum, even consumer-class AR/VR devices could start implementing finger tracking. His rationale: finger tracking offers the opportunity to eliminate controllers altogether, and a device with fewer hardware pieces ultimately costs less to manufacture, can be sold at a lower price point and is more compact.
Waiting for Better Hand Tracking
For moving and positioning objects, the human hand offers a much more natural interface. Imagine being able to grab machine components and snap them together to form an assembly. For sketching 2D profiles or splines to mark trimming regions, however, the stylus offers greater precision. Whether to support one, the other or both is a decision AR/VR modeling app developers must confront.
“Hand recognition has to get better. Right now, it’s not possible to recognize really subtle movements, so it’s not possible to implement, for example, virtual sliders to adjust dimensions in VR. With HoloLens 2 making it possible, you will see in the next year user interfaces adopting it and rapidly changing,” observes Greg Jones, senior manager, global business development, NVIDIA.
GPU maker NVIDIA developed a VR-based collaborative design space called the NVIDIA Holodeck. In the Holodeck’s virtual environment, users appear as avatars to interact with 3D models in true scale.
“With the current 3D hand-tracking technologies, precision is not enough for designing details (CAD scenario), but in many cases provides enough precision to sketch and sculpt in 3D where lower precision is acceptable,” notes Shahrouz Yousefi, CTO and founder of ManoMotion. ManoMotion provides a hand-recognition software development kit to AR/VR developers working on mobile applications on iOS and Android.
“However, for natural interaction and manipulation of the design, hand tracking can bring a big value. In my opinion, controllers are reliable inputs for high precision design and hand tracking for interaction with the design. One option is to use both for these two different purposes until the hand tracking can provide the required precision level,” adds Yousefi.
Start off With Collaboration
SolidWorks, part of Dassault Systèmes, offers an easy way to export VR-viewable models with a few clicks from its popular mechanical modeler. Its viewing and markup program eDrawings also offers VR and AR functions.
“Initially people find value in collaborating around VR models and decision making, so we thought eDrawings is the right vehicle because it’s much more lightweight,” notes David Randle, senior manager, strategy and business development, SolidWorks.
“A one-to-one true scale perspective possible in VR is not something you can get with the desktop app. That’s a new way to inspect and interrogate 3D assets. You get a better understanding of how some components may need to be accessed [in maintenance and repair],” he adds.
For a design-creation application for VR, Randle points to a SolidWorks software partner, Gravity Sketch, as one good direction. Gravity Sketch supports Oculus, Vive and Windows MR devices. The controller user experience for editing and manipulating surfaces and solids shows some resemblance to flyingshapes and Autodesk Alias Create VR.
Designing the Virtual Workspace
AR/VR developer Varjo has been developing a new design user interface (UI) for AR/VR. Codenamed “Virtual Workspace,” the UI was unveiled in December 2019. It includes a library of virtual furniture and interior items for you to drag and drop into the VR environment. It also allows you to import Windows programs into VR, so you can, for example, access your file folders inside the VR workspace.
The new UI is meant for the company’s latest generation developer kit, the XR-1 Developer Edition. The headset’s built-in front-facing cameras track the user’s hands and re-create the same movements and gestures inside VR. In early demos, the author was able to employ a virtual hand (not a controller) to interact with 3D CAD assembly models in VR; however, some pixelation around the virtual hand suggests the software is still a work in progress.
“Varjo Workspace shows how professionals will use and interact with computers in the future. Unlike other immersive computing UIs, Varjo Workspace is not built on ‘hand-waving’ Hollywood UIs with no practical basis in reality, but instead integrates the way we work today using professional 2D applications—all enhanced by the capabilities of Varjo’s XR-1 Developer Edition,” says Urho Konttori, co-founder and chief product officer at Varjo.
Don’t Lose Sight of Reality
When you enter a virtual world, you also temporarily turn a blind eye to the real world, in a manner of speaking. By contrast, in AR, you see your real world enhanced with a layer of digital information. For example, when you look at an engine block before you, you may also see its installation date, service record and available replacement parts digitally overlaid on top of it.
“I’m much more a fan of AR, because when you design, you still need to design for the real world,” says Jon Hirschtick, founder and CEO of Onshape. “We view AR as a rich design platform, just like mobile devices are. Think of AR like a second monitor, another way to co-design.”
In October 2019, the software-as-a-service (SaaS), cloud-hosted CAD provider Onshape was acquired by PTC, which already owns an AR technology portfolio under its Vuforia brand. As part of PTC, Onshape could integrate Vuforia components to deliver Onshape for AR, for instance.
“We’ve already announced that we are developing an Onshape AR client. We haven’t released it yet, but it’s in prototype now,” says Hirschtick. “Modern AR devices support hand recognition, so you can imagine simply reaching out, grabbing objects and moving them around kinematically in assembly mode. You could also imagine shaping surfaces via control points using your hand, or adjusting dimensions using hand gestures. For example, thumbs up for increasing the dimension and thumbs down for decreasing.”
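The gesture-driven dimension editing Hirschtick imagines boils down to mapping recognized gestures to dimension deltas. The sketch below is purely hypothetical: the gesture labels, step size and function name are assumptions for illustration, not Onshape's design.

```python
# Hypothetical sketch of gesture-driven dimension editing: thumbs up
# increases a dimension, thumbs down decreases it. Gesture labels, the
# step size and the function name are illustrative assumptions.

STEP_MM = 1.0  # assumed increment applied per recognized gesture

def adjust_dimension(value_mm, gesture):
    """Return the new dimension after applying one recognized gesture."""
    if gesture == "thumbs_up":
        return value_mm + STEP_MM
    if gesture == "thumbs_down":
        return max(0.0, value_mm - STEP_MM)  # keep dimensions non-negative
    return value_mm  # unrecognized gestures leave the value unchanged

# A stream of events from a hand-recognition layer would drive the edits:
dim = 25.0
for g in ["thumbs_up", "thumbs_up", "thumbs_down"]:
    dim = adjust_dimension(dim, g)
print(dim)  # 26.0
```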
Cloud is Critical
For now, Hirschtick believes partial CAD tools for AR make more sense than a full CAD application for AR. “We also believe with AR, embracing cloud and SaaS technology is part of the vision. We can’t imagine on-premise file-based systems becoming popular among AR users,” he says.
As people have begun purchasing multiple devices and integrating them into their workspaces, the notion of a CAD session is rapidly changing. “We should be aware of multidevice usage. People log in and use multiple devices to interact with the same 3D model,” notes Hirschtick.
In a single modeling session, you might use a tablet to sketch out a 2D profile, edit the resulting solid model on your desktop, then put on an AR display to review the design. This will likely come into conflict with traditional seat-based or node-locked CAD licensing that prevents users from launching multiple instances of the software without penalty.
How Real Should it be?
Generally, desktop CAD users do not work in true scale inside CAD software. A 350 cubic-in. engine appears as a fist-size SolidWorks assembly onscreen; a Boeing 747 with a 211-ft. wingspan appears as a 1-ft.-long 3D model on the monitor. The scaling makes it possible to work on large assemblies within the confines of a 2D monitor using mouse-driven rotation. But with MR, the limitation of the 2D screen disappears.
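The desktop scaling described above reduces to a single uniform factor that maps the model's largest extent into the space available for display; in true-scale MR, that factor is simply 1. A minimal sketch using the 747 figures from the article (the function name is an assumption):

```python
def scale_to_fit(model_extent_ft, display_extent_ft):
    """Uniform scale factor that fits a model's largest extent into a display space."""
    return display_extent_ft / model_extent_ft

# The Boeing 747's 211-ft wingspan shown as a 1-ft-long model on a monitor:
s = scale_to_fit(211.0, 1.0)
print(s)  # roughly 1:211; every model coordinate is multiplied by this factor

# In true-scale MR there is no screen to fit, so the factor is 1.0, which
# is why a VR design review lets you judge ergonomics at real size.
assert scale_to_fit(211.0, 211.0) == 1.0
```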
“We don’t have to worry about screen space budgeting, since in AR, screen is unlimited,” notes Hirschtick. “The advantage in VR is, you can move anywhere within the virtual space—no space limitation,” notes Mattmann. That means, in design review, you can look at the proposed concepts in true scale to better understand comfort and ergonomics.
Would you really want to walk 211 virtual feet from one end of your Boeing 747’s digital twin to the other in VR? Would you really want to stretch your arms wide enough to rotate a 350 cubic-in. engine in AR? For sure, most engineers would choose an easier way to execute such operations. How much realism is too much; how much is just right? How should scaling work in MR? These are open questions.
Comfortable and Cordless
One hurdle inhibiting movement in VR is the cord. Because the head-mounted display (HMD) borrows computing power from a workstation-class system, many devices come tethered. That means even if the virtual world is unlimited, the user’s ability to roam is not. The risk of tripping and entanglement is also a challenge.
“I think cordless solutions are the future, so the app has to be a cloud app. Most likely you won’t install the entire CAD system onto the hard disc in the head-mounted display; it’ll have to run in a client app—something Onshape is already architected to do,” notes Hirschtick.
“Cutting the cord is the key. Everybody wants to cut the cord,” says Randle. HoloLens 2 removes the cord by connecting the device to the computing system via Wi-Fi. Other devices have begun adopting a portable power pack, no bigger than a hockey puck or a mobile phone, that the user can carry around.
“Cords and power stations are not limiting the quality of the modeling but certainly affect the comfort of the designer, especially the experience in VR, where immersion is a big factor,” says Yousefi. “This is obviously more important together with positional tracking technology where the designer needs to move within the physical space and might be even more limiting in collaborative VR with multiple users. The impact might be lower in AR since users see the physical space.”
The weight of the HMD is another inhibitor. Heavier devices prevent users from working in them for long periods; therefore, the most likely scenario for now is that users will put on the HMD only when viewing the design in AR/VR.
Dedicated design and modeling applications for AR/VR are still in the experimental phase. However, the UI similarities in the early programs suggest some standard modeling protocols are already emerging.