Emotion as a Product Design Parameter

New technologies emerge to let engineers monitor, interpret and incorporate human emotions into product design.

Automakers may deploy Siemens PLM Software’s Tecnomatix Jack in immersive AR-VR environments to understand consumers’ subjective feelings associated with certain brands. Image courtesy of Siemens PLM Software.


Consumers don’t expect product design to express emotional intelligence. They don’t expect their car to calm them down in a traffic jam; nor do they expect their smartwatch to realize the last text message they read on it was a devastating piece of news. But the emergence of emotion-detection technologies, riding on the wave of sensor-equipped wearables and machine learning, points to a new kind of man-and-machine interaction—one that involves emotions.

For product design, it’s both a blessing and a burden. Being able to design products that are not only emotive but also emotion-aware opens new doors. But it also adds new parameters to include in system design that few have previously considered.

The Feel of a Brand Begins Before Product Design

What exactly is the Jaguar feel, BMW feel or Audi feel? The mix of luxury, comfort and prestige associated with a certain brand is difficult to define, and even more difficult to measure. Some of it comes from consistent marketing campaigns that make you associate the logo and the vehicle with a certain lifestyle or personality; the rest comes from years of engineering to ensure the brand’s recognizable form and emotive details are preserved.

Emotion Research Lab uses webcam-detected microexpressions to identify human emotions and moods that can be used in product design. Image courtesy of Emotion Research Lab.

“The automotive brand feel is quite subjective,” observes Ulrich Raschke, director of human simulation products, Siemens PLM Software. “Over a decade ago, car companies became interested in designing the brand’s feel. To capture the characteristics of that feel, some of them used Siemens PLM Software’s Tecnomatix Jack to immerse someone in the digital vehicle model to collect the test subject’s subjective responses.”

Part of the collection of human simulation solutions overseen by Raschke, Tecnomatix Jack can be used to simulate and analyze the assembly workflow in plants and factories. Jack and his female counterpart Jill can be configured to match the target population’s size attributes.

Their range of motion is restricted to match real human capacity. Therefore, if Jack and Jill are required to adopt unnatural postures to perform a certain task, they’ll show signs of distress (registered as torques in the affected regions in their virtual bodies). Simulating assembly routines with Jack and Jill could reveal certain work-related ergonomic issues and injury risks. With the emergence of affordable augmented reality and virtual reality (AR-VR) hardware, software like Tecnomatix Jack enters the realm of immersive simulation.
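
The logic behind such comfort checks is easy to sketch. The short Python example below is a minimal illustration, assuming made-up joint names and range-of-motion limits; it is not Tecnomatix Jack’s actual model or API. It flags joints that a posture pushes near or past a comfortable range:

```python
# Hypothetical sketch of the idea behind digital-human comfort checks:
# compare a posture's joint angles against range-of-motion limits and
# flag joints pushed near or past their comfortable range.
# (Illustrative only; not Tecnomatix Jack's actual model or API.)

# Comfortable range of motion per joint, in degrees (assumed values).
JOINT_LIMITS = {
    "shoulder_flexion": (0.0, 150.0),
    "elbow_flexion": (0.0, 140.0),
    "wrist_extension": (-60.0, 60.0),
    "trunk_flexion": (0.0, 45.0),
}

def posture_stress(posture: dict[str, float]) -> dict[str, str]:
    """Classify each joint angle as 'ok', 'near limit' or 'over limit'."""
    report = {}
    for joint, angle in posture.items():
        lo, hi = JOINT_LIMITS[joint]
        margin = 0.1 * (hi - lo)  # last 10% of the range counts as "near"
        if angle < lo or angle > hi:
            report[joint] = "over limit"   # injury risk: redesign the task
        elif angle < lo + margin or angle > hi - margin:
            report[joint] = "near limit"   # discomfort likely over a shift
        else:
            report[joint] = "ok"
    return report

# A worker reaching overhead to fasten a part:
print(posture_stress({
    "shoulder_flexion": 155.0,   # beyond comfortable flexion
    "elbow_flexion": 35.0,
    "wrist_extension": 50.0,     # close to the limit
    "trunk_flexion": 10.0,
}))
```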

“With the test subjects sitting in the virtual vehicle, the car manufacturer tracked their movements,” explains Raschke, recounting the use of Jack in automotive brand design projects. “Could they reach the mirror, the gear shift and the radio controls? Was it comfortable to reach for them? How did that feel?”

Automakers may deploy Siemens PLM Software’s Tecnomatix Jack in immersive AR-VR environments to understand consumers’ subjective feelings associated with certain brands. Image courtesy of Siemens PLM Software.

Siemens PLM Software’s rivals—Autodesk, Dassault Systèmes and PTC—are also aggressively exploring AR-VR as part of the product design workflow. Immersive simulation with AR-VR gear offers a level of understanding not achievable with desktop-based simulation alone.

“You can measure not only the pressures and forces exerted on the test subjects, but also ask them if they can imagine doing this job this way for eight hours a day,” Raschke points out. “The reporting tool might say numerically it’s OK to do this task, that there’s no risk of injury, but the test subjects’ feeling might be that they don’t want to work that way.”

Measuring Emotions

Industry analysts expect the wearable medical device market to grow rapidly through 2025. The market is spawning many new types of connected devices catering to health-conscious consumers.

Among them is Feel, a wristband described as an “emotion sensor and mental health advisor.” The company states: “The wristband monitors a variety of physiological signals throughout the day to recognize changes in your emotions.” The device can detect emotional patterns including joy, stress, distress, contentment and sadness, as inferred from the wearer’s heart-rate variability, skin temperature and other physiological changes.

The Feel wristband’s built-in sensors can detect human emotions and mood. Image courtesy of Feel.

Feel is designed mainly using the SolidWorks 3D CAD program, according to Olga Labutina, Feel’s product and marketing manager. “We have five sensors in the wristband,” she explains. “The most important one is the custom sensor. Making that custom sensor, placing it to make sure the user can easily keep the device on 24-7, to make sure the sensor touches the skin properly to get good readings—these were the technical challenges.”

The centerpiece of Feel’s wristband is a galvanic skin response (GSR) sensor that captures changes in skin conductance as an indicator of fluctuating emotions. Feel applies machine learning to these signals to develop its proprietary emotion-recognition algorithm.
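
Feel’s algorithm is proprietary, but the general approach, training a supervised classifier to map windows of physiological readings to emotion labels, can be sketched briefly. The features, synthetic data and scikit-learn model below are illustrative assumptions, not Feel’s actual pipeline:

```python
# A generic sketch of physiological-signal emotion classification,
# not Feel's proprietary algorithm. Each sample is a short window of
# wristband readings summarized as features; labels would come from
# self-reports, as in Feel's controlled tests.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Assumed features per window: [mean GSR (microsiemens),
# heart-rate variability (RMSSD, ms), skin temperature (deg C)]
def synthetic_window(label: str) -> list[float]:
    base = {"calm":   [2.0, 60.0, 33.5],
            "stress": [8.0, 25.0, 32.0],
            "joy":    [5.0, 55.0, 33.0]}[label]
    return list(rng.normal(base, [0.5, 5.0, 0.3]))  # add sensor noise

labels = ["calm", "stress", "joy"] * 100
X = np.array([synthetic_window(lbl) for lbl in labels])

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)

# Classify a new window of readings (high GSR, low HRV -> stress-like).
print(clf.predict([[7.5, 28.0, 32.1]]))
```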

“In our controlled in-house tests, we use VR hardware, mainly Homido and Samsung Gear VR,” says Labutina. Feel engineers let testers view multimedia content developed in-house with psychologists, then verified that the emotional states detected matched what the test subjects were experiencing at a given point in time. “In the uncontrolled environment, the user goes through [the] Feel experience of emotion recognition and self-reports via the Feel Mobile App.”

Because it’s a wearable, Feel’s product design engineers also had to ensure the device is discreet and sleek and doesn’t interfere with everyday tasks. The product is now going through validation tests, says Labutina.

Monitoring Moods

Maria Pocovi, CEO of Emotion Research Lab, has technology that she believes is changing the way people do market and brand research. Forget focus groups and telephone surveys; they take too much time and money to organize. Pocovi prefers to work with webcams and algorithms. Though the technology was initially developed with market researchers in mind, Pocovi now thinks it can also be incorporated into IoT devices and autonomous vehicles.

“We developed an emotion-recognition algorithm based on microexpressions in people’s faces,” Pocovi explains. “That’s why we can work with just a webcam.”

In onsite deployment, a retailer might mount a webcam next to two versions of a product to understand consumers’ receptiveness to one over the other. In online deployment, a marketer might let survey participants view prepared content (for example, a TV ad campaign), then measure the basic emotions and moods generated (Neutral, Exuberant, Dependent, Relaxed, Docile, Hostile, Anxious, Disdainful, Bored) based on their reactions as detected via the webcam.
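
The details of Emotion Research Lab’s API are not public, so the sketch below uses a hypothetical detect_emotions stub in its place. It shows how per-frame emotion scores from a webcam might be aggregated into a session-level mood report, with each frame processed on the fly and then discarded:

```python
# Hypothetical aggregation of per-frame emotion scores into a mood
# report. detect_emotions() is a stand-in stub; Emotion Research Lab's
# real API and output format are not public.
from collections import defaultdict

def detect_emotions(frame) -> dict[str, float]:
    """Stub: would score microexpressions in one webcam frame."""
    return {"joy": 0.6, "neutral": 0.3, "anger": 0.1}  # canned output

def session_mood(frames) -> dict[str, float]:
    """Average per-frame scores across a viewing session.
    Frames are scored in real time and discarded, never stored."""
    totals, n = defaultdict(float), 0
    for frame in frames:
        for emotion, score in detect_emotions(frame).items():
            totals[emotion] += score
        n += 1
    return {emotion: total / n for emotion, total in totals.items()}

# Each 'frame' here is a placeholder for a webcam image.
print(session_mood(frames=[object()] * 90))  # e.g., 3 seconds at 30 fps
```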

“Our technology computes and extracts the emotions in real time, so the camera footage doesn’t need to be recorded and archived,” Pocovi points out. This resolves some privacy concerns and eliminates the need for costly storage servers—on premises or in the cloud—to keep hours of webcam footage.

“Our vision is to enable machines to understand human emotions, so our API now allows designers to create products that can understand human emotions and interact in real time,” Pocovi says. “For example, detecting the emotional pattern of the driver is one more step toward increasing security and avoiding accidents, because people’s mood affects their capabilities. Product design engineers can integrate our technology in their IoT device or vehicle to make it more human”—that is, a smartphone or a car that knows how you feel and can issue helpful prompts.
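
As a purely hypothetical integration, a vehicle system might poll such an emotion API and intervene only when a negative state persists across several readings, so a single misread frame doesn’t trigger a prompt. The stub, thresholds and prompt below are all assumptions for illustration:

```python
# Hypothetical in-vehicle loop: prompt the driver only when a hostile
# or anxious state persists across consecutive readings. The stub,
# thresholds and prompt text are illustrative, not any vendor's API.
import random

def read_driver_emotion() -> dict[str, float]:
    """Stub standing in for a camera-based emotion API call."""
    return {"hostile": random.random(), "anxious": random.random()}

ALERT_THRESHOLD = 0.7   # score above which a reading counts as negative
PERSISTENCE = 3         # consecutive negative readings before prompting

def monitor(samples: int) -> None:
    streak = 0
    for _ in range(samples):
        scores = read_driver_emotion()
        if max(scores["hostile"], scores["anxious"]) > ALERT_THRESHOLD:
            streak += 1
        else:
            streak = 0  # mood recovered; start over
        if streak >= PERSISTENCE:
            print("Prompt: you seem stressed; suggesting a calmer route.")
            streak = 0  # reset after intervening

monitor(samples=20)
```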

The advancements in AR-VR gear make it possible for designers to study and understand how the consumer might feel about a product design—a subjective parameter that may have nothing to do with product quality or performance. The emotion-detection technologies point to autonomous vehicles that could recognize signs of road rage or distractions in the driver, or smartwatches that could tell if their owners are in distress. Incorporating these into the product development cycle and system design might become a standard practice in the not-so-distant future.

More Info

Autodesk

Dassault Systèmes

Emotion Research Lab

Feel

Homido

PTC

Samsung Gear VR

Siemens PLM Software


About the Author

Kenneth Wong

Kenneth Wong is Digital Engineering’s resident blogger and senior editor. Email him at [email protected] or share your thoughts on this article at digitaleng.news/facebook.
