Embedded Toolbox: High-Fidelity Physics for Highly Accurate Vehicle AI

June 28, 2022


Perception algorithms for autonomous driving are foundational not only to today's advanced driver assistance systems (ADAS) but also to tomorrow's Level 5 autonomous vehicles. These algorithms are built and trained on data, and the amount of data needed for them to reach commercial readiness can seem infinite.

Consider the number of scenarios a driverless car's perception system will encounter in the real world. Perception algorithms must be trained to respond safely to each of them, but capturing the requisite training data from real-world environments is impractical. We must synthesize it.

Of course, solving one challenge only reveals others, and in this case synthesizing data is only effective if the data mirrors precisely what physical sensors would capture in real-world road conditions. Therefore, the virtual worlds from which sensors capture data must reflect our physical reality, and the virtual sensors doing the data capture must be represented in near-perfect fidelity as well.

In this episode of Embedded Toolbox, Michael Peperhowe, Lead Product Manager for Simulation, Models and Scenarios at dSPACE, demonstrates advances in simulation and modeling tools that allow developers to reproduce virtually unlimited road conditions precisely as drivers would encounter them in the real world.

Using dSPACE's new AURELION sensor-realistic simulation and visualization software and the cloud-based SIMPHERA simulation and validation platform, Michael walks us through creating scenarios, integrating sensors, and applying their data in real time to develop more accurate, robust perception algorithms capable of identifying road conditions, the behavior of other vehicles, and more. He also explains how engineers can scale these test cases to validate perception algorithms more comprehensively in less time.
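To make the idea of scaling test cases concrete, here is a minimal, purely illustrative sketch of a scenario parameter sweep in Python. The `Scenario` fields, `run_perception_test` stub, and `build_test_matrix` helper are hypothetical placeholders for this example only; they are not AURELION or SIMPHERA APIs, which would handle the actual rendering, sensor simulation, and test execution.

```python
# Hypothetical sketch: scaling perception test cases by sweeping scenario parameters.
# Placeholder code for illustration; not based on dSPACE AURELION/SIMPHERA APIs.
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class Scenario:
    weather: str          # e.g. "clear", "rain", "fog"
    time_of_day: str      # e.g. "noon", "dusk", "night"
    traffic_density: int  # vehicles per km of road

def run_perception_test(scenario: Scenario) -> bool:
    """Placeholder: render the scenario, feed synthetic sensor data to the
    perception algorithm, and compare detections against ground truth."""
    # In a real pipeline this step would call into the simulation platform.
    return True

def build_test_matrix() -> list[Scenario]:
    """Enumerate every combination of the varied scenario parameters."""
    weathers = ["clear", "rain", "fog"]
    times = ["noon", "dusk", "night"]
    densities = [10, 40, 80]
    return [Scenario(w, t, d) for w, t, d in product(weathers, times, densities)]

if __name__ == "__main__":
    scenarios = build_test_matrix()
    results = {s: run_perception_test(s) for s in scenarios}
    passed = sum(results.values())
    print(f"{passed}/{len(scenarios)} scenario variants passed")
```

Even this toy sweep of three parameters yields 27 test cases; varying more parameters multiplies the count quickly, which is why cloud-based, parallelized execution matters for validation at scale.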

Tune in.

Resources: