RealWear’s AR headset as deployed by Honeywell (image courtesy of RealWear).

Virtual Reality Strategy Conference 2018: New Ways to Interact with Pixels

Companies gather to swap tips, network, and promote virtual reality technology

Analyst firm Greenlight Insights hosts Virtual Reality Strategy Conference in San Francisco

Two weeks ago, in downtown San Francisco's Park Central Hotel, VR (virtual reality) hardware and software developers, content creators, telecom firms, and early adopters assembled for the VRS (Virtual Reality Strategy) Conference, hosted annually by the analyst firm Greenlight Insights.

Teppei Tsutsui, CEO and managing director of GREE VR Capital and GFR Fund, was on a panel to discuss how investors view the AR-VR (augmented reality, virtual reality) landscape. Many startup founders in the audience were eager to hear tips on how to make their pitches more appealing to investors. Tsutsui says, “I'd like a conversation, not a presentation.”

VRS saw abundant conversation, on the panels, in networking events, and in the lobby and hallways where attendees mingled. Many of the talks revolved around new interactions with digital objects, made possible by AR-VR technologies.

Talk to your AR headset

Tom Dollente, director of product management at RealWear, was on the VRS panel that explored the use of AR (augmented reality) for frontline workers. (The blog post's author served as the panel's moderator.) RealWear's products, the HMT-1 and HMT-1Z1, use a head-mounted camera and a small display mounted before the right eye. Designed for hands-free use, the headset relies on voice control for user input, commands, and selection. Users may attach it to a hard hat for deployment on construction sites and in other hazardous zones.

“We learned a lot while working to perfect our voice-recognition algorithm,” says Dollente. “We fine-tuned our system so it can detect the direction of the voice. So if someone nearby happens to utter a command phrase (such as Home), the headset won't execute it. We also learned to use commands like Terminate, which has distinct consonants, is highly unlikely to be confused with other words, and is not a phrase someone might say frequently.”

Such considerations are important, as the user will be wearing the RealWear system while going about their daily routine. The system must therefore distinguish legitimate voice commands from background noise, nearby conversations, and the user's own conversations with coworkers.
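The filtering Dollente describes can be pictured as a gate with three checks: a closed vocabulary of distinctive command words, the estimated direction the voice came from, and the recognizer's confidence. The sketch below is purely illustrative (the vocabulary, angle threshold, and confidence cutoff are assumptions, not RealWear's implementation):

```python
# Illustrative sketch, NOT RealWear's actual algorithm: accept a
# recognized phrase only if it is a known command word, the sound
# arrived from roughly in front of the wearer's mouth, and the
# speech recognizer reported high confidence.
COMMANDS = {"home", "terminate", "select", "scroll down"}  # hypothetical vocabulary

def accept_command(phrase: str, source_angle_deg: float, confidence: float,
                   max_angle_deg: float = 20.0, min_confidence: float = 0.85) -> bool:
    """Gate a recognized phrase on vocabulary, direction, and confidence."""
    if phrase.lower() not in COMMANDS:
        return False                      # not a command word at all
    if abs(source_angle_deg) > max_angle_deg:
        return False                      # likely a bystander, not the wearer
    return confidence >= min_confidence   # reject low-confidence matches

# A nearby coworker saying "home" from 70 degrees off-axis is ignored:
assert not accept_command("home", source_angle_deg=70.0, confidence=0.95)
# The wearer saying "terminate" clearly and directly is accepted:
assert accept_command("terminate", source_angle_deg=3.0, confidence=0.92)
```

The choice of Terminate as a command illustrates the vocabulary check: a word with distinct consonants shrinks the chance of a false match before the direction and confidence gates even apply.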

(See RealWear's demo video below.)

Content creators needed

Among the companies present was Kaon Interactive, which focuses on using VR and AR for B2B (business to business) marketing.

“Companies often have a difficult time explaining their value proposition to clients, so we provide VR and AR content to explain their competitive advantages,” said Gavin Finn, CEO of Kaon. “Our business model is, we build the applications for our clients.”

Kaon has created applications for, among others, Cisco Systems and Dell, incorporating interactive 3D product models viewable from a browser. The company also builds mobile apps, which can display digital product models in the mobile device's camera view.

Kaon's AR mobile app can recognize natural surfaces (such as tabletops and countertops) and position the virtual models on them in correct proportion.
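Placing a model "in correct proportion" comes down to a unit conversion: once the AR framework detects a surface, the model's arbitrary authoring units must be scaled so it renders at its true physical size. The helper below is an assumed illustration, not Kaon's API; it uses the common mobile-AR convention (shared by ARKit and ARCore) that one render unit equals one meter:

```python
# Illustrative sketch (assumed names, not Kaon's code): convert a model
# authored at an arbitrary native height into the scale factor that
# renders it at its real-world height, assuming 1 render unit = 1 meter.
def placement_scale(model_native_height: float, real_height_m: float) -> float:
    """Scale factor that makes the model render at its true physical size."""
    if model_native_height <= 0:
        raise ValueError("model height must be positive")
    return real_height_m / model_native_height

# A server model authored 100 units tall, representing a 2-meter rack,
# must be scaled down to 2% of its native size:
assert placement_scale(100.0, 2.0) == 0.02
```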

“We started out thinking of delivering an end user tool, an app creation platform, but we found out that sales and marketing buyers tend to want to buy the solution; they don't want to learn the platform to build their app,” says Finn.

Kaon's current business model is to create the content, deliver it in the form of a deployable app, and maintain the backend cloud infrastructure that houses and updates the content. (Shown below is Kaon's virtual product center app for Dell, imge courtesy of Kaon.)

Kaon specializes in building interactive apps for B2B marketing. Shown here is Kaon's app showcasing Dell server products (image courtesy of Kaon).

Hand tracking for AR-VR

uSens believes the key to a more natural interaction in AR-VR is hand-tracking. Accordingly, it uses computer vision to recognize and interpret hand gestures and movements that humans naturally use in communication (for example, a thumbs up) and object manipulation (for example, shaping clay with bare hands or striking nails with a hammer).

“We have two solutions for that. One uses your mobile phone's color camera and doesn't need any additional hardware,” says Dr. Yui Fei, cofounder and CTO. “Another is our own hardware, called Fingo [a kit that includes stereo cameras].”

The human hand is a soft-tissue object, which makes it difficult for a computer to recognize. Therefore, uSens uses a technique similar to motion-capture technology, where the system tracks easily identifiable joints, in this case finger joints.

“Our algorithm acts like an X-ray; it can see through the skin and select the bone joints,” says Fei.

Pure computer vision, as it turns out, is not sufficient for hand tracking, especially in cases where the angle of the hand obscures some of the fingers. But by applying machine learning, uSens refined its algorithm to infer the positions of the bone joints even when some of the fingers are invisible to the camera, Fei explains.

(Watch uSens demo video below.)

Feel the texture in AR-VR

In the exhibit area of VRS, using a mix of actuator-equipped gloves and camera tracking, MIRAISENS Inc. demonstrated how its technology could introduce haptic feedback to the AR-VR experience.

VRS was MIRAISENS' first public demo of its patented technology, developed in Japan. It's designed to simulate sensations such as softness, hardness, bumpiness, and pressure when users interact with digital objects.

In its published technical paper, the company explains, “If the skin nerve is properly stimulated, the stimulation signal is sent to the brain and then generates haptics illusion (invented by Dr. Nakamura CTO of Miraisens). As a method of performing this stimulation, any kind of physical quantity such as vibration can be used.”

Beyond head-mounted displays

In its “2018 Global Immersive Displays Market Report,” Greenlight Insights turns its attention to immersive displays, with characteristics distinctly different from the more familiar head-mounted displays (HMDs). The report covers the use of VR/AR-enabled CAVEs and domes, among other things.

“When corporate teams need to collaborate in virtual worlds, they prefer CAVEs and domes to head-mounted displays,” according to Ben Delaney, Chief Analyst at Greenlight Insights. “These large display systems can provide 3D imagery, and make it much easier to share insights conversationally, and by pointing and gesturing” (Greenlight Insights blog post, September 2018).

AR-VR technologies have matured in photorealistic visuals, but in incorporating haptic feedback, natural interaction, and voice response, many devices are still in the exploratory phase. For entertainment and sales presentations, impressive visuals may carry the day. But for AR-VR to be deployed in enterprise training and ergonomics, the technology needs to more closely mimic how humans naturally interact with objects: not through game controllers and touch screens but with familiar hand movements and natural speech.

For More Info

Greenlight Insights:


Kaon Interactive:


