September 11, 2018
Two weeks ago, DE hosted an online roundtable titled “Simulating Reality with a Touch of AR-VR” (the full episode is now available on-demand). The panel featured:
- Eric Abbruzzese, Principal Analyst, ABI Research;
- J.C. Kuang, Analyst, Greenlight Insights;
- Jason Cooper, Chief Digital Officer, Horizon Productions.
Adoption of AR-VR (augmented reality and virtual reality) in engineering is driven largely by potential use cases in enterprise training, design review, and ergonomic checks. But a critical part of the virtual experience—the touch element—remains the missing piece.
Price for Haptic AR-VR Remains Prohibitive
Incorporating haptics into AR-VR training may involve developing body suits and gloves, or installing sophisticated rigs with pneumatic actuators. Such mechanisms allow, for example, a participant in a flight training program to reach out and grab a controller that closely resembles a pilot's yoke, and to feel vibrations that match simulated takeoff and landing operations.
“We have had requests to develop a full-on haptic feedback system,” reveals Jason. “It's to be a flying simulation application where you can feel the wind, the fans ... I've tried glove-style haptic systems that use pneumatics, and the sensation you get is unbelievable. You can feel a spider crawl on your hands if that's what's simulated. But the cost is significant,” he notes.
By contrast, gesture input is widely available now at a small fraction of the cost of haptic mechanisms. It can be implemented with devices like the Leap Motion Controller, priced below $100.
“At present, many of the peripherals cost over US$100. Some controllers with haptic feedback technology are priced higher, ranging from US$300 to over US$1,000. The worldwide controller market is expected to reach 244 million units in 2022,” according to ABI.
Full Haptics in the Future
While the accuracy of present gesture-tracking systems may be less than ideal, their intuitiveness makes up for the deficiency, Eric points out.
“Presently there's not a lot of activity in haptics ... The input methods we're familiar with [touch screens or controllers] got carried over to [current AR-VR apps]. Sometimes a videogame controller got adopted for vibration. True haptics will hit the market in the next few years, when some glove-style controllers appear so you get that granular feedback,” says Eric.
“Once we unlock sub-millimeter accuracy, that will open up a whole range of new applications that weren't viable before, like healthcare training,” says J.C. “Haptics and gesture input are two complementary systems working together to produce first-rate, natural human-machine interaction, much more robust and immediate than controllers or voice commands. We think gesture control is poised to experience significant growth. We've started to see established brands trading on the intuitiveness of gesture input, with devices like the Leap Motion and Microsoft HoloLens.”
What's Possible Now
A listener wants to know whether they need to embed microchips into machine equipment to enable AR—that is, to walk up to a machine and pull up its relevant history and engineering documents. Not so, the panelists point out. Present technologies from companies like PTC (particularly its Vuforia division) offer AR applications that run on object recognition.
“It's like a super-advanced QR code,” says Eric.
SolidWorks eDrawings, a free viewer for SolidWorks CAD users, also includes AR-enabling features that work with a printed QR code. Affordable AR-driven query systems for factory or plant operations can therefore be implemented without the cost of embedded microchips.
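Once an object-recognition or QR layer identifies a machine, the rest of such a query system is essentially a lookup keyed on the decoded marker payload. A minimal sketch of that idea follows; the machine IDs, document names, and record fields are all hypothetical, not taken from any vendor's API.

```python
# Minimal sketch of a marker-driven AR query. An object-recognition or QR
# layer (e.g., Vuforia or an eDrawings marker) would supply the decoded
# payload; the app then only needs a lookup. All IDs and document names
# below are hypothetical placeholders.

MACHINE_DOCS = {
    "PUMP-07": {
        "history": ["2018-03-12: seal replaced", "2018-07-02: vibration check"],
        "drawings": ["pump07_assembly.edrw"],
    },
    "PRESS-02": {
        "history": ["2018-05-20: annual inspection"],
        "drawings": ["press02_frame.edrw"],
    },
}

def lookup(marker_payload: str) -> dict:
    """Return the records for a recognized machine, or an empty stub."""
    return MACHINE_DOCS.get(marker_payload, {"history": [], "drawings": []})

print(lookup("PUMP-07")["drawings"])  # → ['pump07_assembly.edrw']
print(lookup("UNKNOWN")["history"])   # → []
```

In a production system the dictionary would of course be a database or PLM query, but the marker-to-record mapping is the whole trick: no chip in the machine, only a recognizable surface.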
Another listener wonders about the data pipeline from CAD to AR-VR. One option is to go through popular game engines (Unity and Unreal Engine, to name the top two). Both provide plug-ins that convert CAD data into engine-ready models, which can then be used for AR-VR. For more, read “From Solid Geometry to Responsive AR-VR,” August 2018.
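At its core, the conversion those plug-ins perform is tessellation: CAD surfaces (or quad meshes) are split into triangles with normals that a real-time renderer can consume. The stdlib-only Python sketch below illustrates just the quad-to-triangle step on a toy mesh; real converters also handle B-rep surfaces, materials, and level of detail.

```python
# Game engines render triangles, so CAD geometry exported as quads (or as
# B-rep surfaces, after tessellation) must be split into triangles with
# per-face normals. This is a simplified, stdlib-only sketch of that step;
# the quad mesh at the bottom is purely illustrative.
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

def sub(a: Vec3, b: Vec3) -> Vec3:
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a: Vec3, b: Vec3) -> Vec3:
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v: Vec3) -> Vec3:
    n = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
    return (v[0] / n, v[1] / n, v[2] / n) if n else (0.0, 0.0, 0.0)

def quads_to_triangles(verts: List[Vec3], quads):
    """Split each quad (i0,i1,i2,i3) into two triangles with face normals."""
    tris, normals = [], []
    for i0, i1, i2, i3 in quads:
        for tri in ((i0, i1, i2), (i0, i2, i3)):  # simple fan triangulation
            tris.append(tri)
            a, b, c = (verts[i] for i in tri)
            normals.append(normalize(cross(sub(b, a), sub(c, a))))
    return tris, normals

# One unit quad in the XY plane becomes two triangles facing +Z:
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
tris, normals = quads_to_triangles(verts, [(0, 1, 2, 3)])
print(tris)     # [(0, 1, 2), (0, 2, 3)]
print(normals)  # both (0.0, 0.0, 1.0)
```

Fan triangulation is the simplest choice and assumes roughly planar, convex quads; production converters use more robust tessellators, which is part of what the Unity and Unreal plug-ins provide.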
To listen to the recorded panel discussion, visit the link here.