Simulation Paves the Way for a New Autonomous Testing Era

To predict vehicle performance with certainty, automotive companies will have to extend existing simulation technologies and embrace new ones.

By Dominic Gallello

Right now, we simulate the car and leave the driving to the human. Having a machine do both is a significant paradigm shift. Simulation has to be extended so that it not only predicts the behavior of the vehicle but also reacts to changes in the environment. To predict vehicle performance with certainty, automotive companies will have to extend existing simulation technologies and embrace new ones. The key to achieving this will be relying heavily on simulation software tools with real-time capabilities and machine learning.

Off-Line to Real Time

Although off-line solvers will continue to handle sophisticated and complex models, the need for real-time simulation is ever increasing, especially in the autonomous vehicle world, for two main reasons.

  1. The requirement to connect virtual models to physical hardware (such as sensors, controllers and driving simulators), so-called hardware-in-the-loop (HiL). These physical assets communicate at a defined speed, and the associated simulation model must be able to keep up with it; this is what defines a real-time model (a pacing sketch follows this list).
  2. Traditionally, vehicle development (including vehicle dynamics) has been targeted at validating the machine. The human driver, whether following test instructions or making numerous on-the-fly decisions, was not considered a system that required validation. The autonomous vehicle changes all of that: now the "driver," the most complex system on the vehicle, has to be validated as well. This leads to orders of magnitude more simulation runs, so the ability to solve quickly becomes a valuable asset.
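To make the pacing requirement in the first point concrete, here is a minimal Python sketch of a fixed-step HiL loop. It assumes a hypothetical 1 kHz hardware communication cycle, and read_sensor, write_actuator and step_vehicle_model are illustrative stubs rather than any real HiL interface; the point is only that the model step plus I/O must finish inside each cycle, or the loop overruns and the model is no longer real-time capable.

```python
import time

STEP_S = 0.001  # hypothetical 1 kHz hardware communication cycle (1 ms)

def read_sensor():
    """Stub for reading a physical sensor over the HiL bus (assumed interface)."""
    return 0.0

def write_actuator(command):
    """Stub for sending a command back to the physical hardware (assumed interface)."""
    pass

def step_vehicle_model(state, sensor_value, dt):
    """Placeholder fixed-step update of the vehicle model."""
    # A real model would advance the vehicle dynamics here; this just integrates the input.
    return state + sensor_value * dt

def run_hil_loop(duration_s=1.0):
    state = 0.0
    next_deadline = time.perf_counter()
    for _ in range(int(duration_s / STEP_S)):
        sensor_value = read_sensor()
        state = step_vehicle_model(state, sensor_value, STEP_S)
        write_actuator(state)
        # Pace the loop: the model must finish before the next hardware cycle.
        next_deadline += STEP_S
        slack = next_deadline - time.perf_counter()
        if slack > 0:
            time.sleep(slack)
        else:
            print("Overrun: model is not real-time capable at this step size")
    return state

if __name__ == "__main__":
    run_hil_loop()
```

In practice this scheduling is handled by dedicated real-time operating systems and HiL hardware rather than by time.sleep, but the deadline logic is the same.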

With an accurate representation of the vehicle response, we can introduce the model to a computer simulation of a real-world driving environment, complete with other cars, trees, people, buildings and so on. In order to "read" this environment, the autonomous vehicle model is equipped with a variety of sensors that continually monitor its surroundings. At this stage, some physical sensors may be included (HiL) where the necessary physics has not yet been captured in a virtual model. Based on this sensor feedback, the vehicle calculates what to do next and then sends appropriate signals to the actuators that drive the car. This behavior is then coded onto the vehicle's chassis controller. Until now, the vehicle dynamics model has effectively been a "black box" representation of vehicle behavior. That is, sensor and controls developers have not been interested in why a vehicle behaves as it does, only in how it behaves. The next generation of vehicle controllers will have an on-board, real-time vehicle model that adapts and learns to account for driver preferences and changing conditions.
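The sense-decide-actuate cycle described above can be illustrated with a short, self-contained Python sketch. The Obstacle and VehicleState classes, the idealized range sensor and the simple braking rule are all assumptions made for illustration; they stand in for the far richer sensor models, planners and actuator commands used in a real simulation environment.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    position_m: float  # distance along the lane from the origin

@dataclass
class VehicleState:
    position_m: float = 0.0
    speed_mps: float = 15.0

def sense(vehicle, obstacles, range_m=50.0):
    """Return the gap to the nearest obstacle ahead, or None (idealized range sensor)."""
    gaps = [o.position_m - vehicle.position_m for o in obstacles
            if 0.0 < o.position_m - vehicle.position_m <= range_m]
    return min(gaps) if gaps else None

def decide(gap_m):
    """Toy planning rule: brake if the gap is short, otherwise hold speed."""
    if gap_m is not None and gap_m < 35.0:
        return -4.0  # commanded deceleration in m/s^2
    return 0.0

def actuate(vehicle, accel_mps2, dt):
    """Apply the commanded acceleration to a very simple vehicle model."""
    vehicle.speed_mps = max(0.0, vehicle.speed_mps + accel_mps2 * dt)
    vehicle.position_m += vehicle.speed_mps * dt

if __name__ == "__main__":
    dt = 0.05
    car = VehicleState()
    scene = [Obstacle(position_m=60.0)]
    for _ in range(200):  # 10 s of simulated driving
        gap = sense(car, scene)
        accel = decide(gap)
        actuate(car, accel, dt)
    print(f"Final position {car.position_m:.1f} m, speed {car.speed_mps:.1f} m/s")
```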

Machine Learning

A significant challenge facing autonomous driving is accurate perception, which includes tasks such as obstacle avoidance and terrain assessment. A self-driving vehicle has to assess the drivable surface while, at the same time, avoiding a whole range of obstacles.

There are three ways a vehicle can address this:

  1. Self-Supervised Machine Learning—The vehicle generates its own training labels from driving events such as rough road conditions, turning terrain assessment into a supervised learning problem (a sketch follows this list).
  2. Unsupervised Machine Learning—Terrain data is organized into self-organizing maps without labeled examples. This is particularly helpful for distance measurement and normalization.
  3. Deep Learning—This technique is moving into the autonomous driving space. It uses complex mapping functions and machine learning algorithms to leverage huge amounts of training data from various sensors, particularly camera systems.
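As a rough illustration of the first approach, the sketch below auto-generates "rough" versus "smooth" labels from a vertical-acceleration signal recorded while driving, then fits a trivial nearest-centroid classifier to a camera-derived feature for the same road segments. The synthetic data, the 0.7 threshold and the single scalar feature are all assumptions for illustration; the point is only the two-step pattern of self-labeling followed by supervised learning.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical logged data: one entry per road segment.
# accel_z_var is the variance of vertical acceleration measured while driving;
# camera_feature stands in for a camera-derived texture descriptor of the same segment.
accel_z_var = np.concatenate([rng.normal(0.2, 0.05, 50),   # smooth segments
                              rng.normal(1.5, 0.30, 50)])  # rough segments
camera_feature = np.concatenate([rng.normal(0.3, 0.1, 50),
                                 rng.normal(0.8, 0.1, 50)])

# Self-supervised step: the vehicle labels its own data from the ride experience,
# with no human annotation (threshold chosen arbitrarily for this sketch).
labels = (accel_z_var > 0.7).astype(int)   # 1 = rough, 0 = smooth

# Supervised step: learn to predict roughness from the camera feature alone,
# so rough terrain can be recognized ahead of the vehicle. Nearest-centroid is
# used here only to keep the example short and dependency-light.
centroid_smooth = camera_feature[labels == 0].mean()
centroid_rough = camera_feature[labels == 1].mean()

def predict(feature_value):
    """Classify an upcoming segment from its camera feature."""
    return int(abs(feature_value - centroid_rough) < abs(feature_value - centroid_smooth))

print(predict(0.75))  # 1 (rough) with these synthetic numbers
```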

Various sensors on the vehicle collect data, and multi-sensor fusion techniques consolidate and reduce that data. AI systems that learn continuously from experience in discerning and recognizing their surroundings allow the vehicle to react to environmental changes. Systems on a chip run compute-intensive machine learning algorithms for scene interpretation and traffic sign recognition, or issue lane-departure warnings and corrective control inputs.
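As a minimal example of the consolidation step, the sketch below fuses two noisy range estimates, say from radar and a camera, with inverse-variance weighting, the building block behind Kalman-style sensor fusion. The numeric values and variances are assumptions for illustration, not real sensor specifications.

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of independent measurements.

    'estimates' is a list of (value, variance) pairs; the result is the
    minimum-variance unbiased combination under a Gaussian noise assumption.
    """
    weights = [1.0 / var for _, var in estimates]
    fused_value = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused_value, fused_variance

if __name__ == "__main__":
    radar = (24.8, 0.25)    # distance in meters, variance in m^2 (assumed)
    camera = (26.1, 1.00)
    value, variance = fuse([radar, camera])
    print(f"Fused distance: {value:.2f} m (variance {variance:.2f} m^2)")
```

Note that the fused estimate leans toward the lower-variance radar reading, which is exactly the data-reduction behavior the fusion stage is meant to provide.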

The global automotive industry is committed to the development of the driverless car, but there is a long way to go to standardize communication formats, agree on liability and provide the infrastructure to handle the massive amount of data transfer that will be required. Computer simulation may be the only way to test all of the potential combinations of conditions, and the structured process described here is aimed at supporting the goal of safe, reliable autonomy. DE

Dominic Gallello is president and CEO of MSC Software (MSCSoftware.com), a fully owned subsidiary of Hexagon AB. Contact him via [email protected].
