The Autonomous Road is Closely Ahead, and NVIDIA DRIVE Has the Wheel

By Tiera Oliver

Associate Editor

Embedded Computing Design

November 17, 2021

Image Courtesy of NVIDIA

Autonomous driving is no longer an idea of the future. Although we have yet to reach full autonomy, these systems continue to advance, as do the capabilities within and all around the vehicle.

At NVIDIA’s fall GPU Technology Conference (GTC), founder and CEO Jensen Huang offered a glimpse into what the company has been working on for the AI-driven automobile: more than 65 updates to NVIDIA SDKs that advance everything from in-vehicle personal assistants to autonomous vehicle mapping and localization to designing with next-generation ADAS SoCs.

Not only do these releases target different parts of the vehicle, they also scale across the various levels of autonomy to enable what is hopefully a seamless transition into fully autonomous drive.

Hands (Almost) Off The Wheel

Personalization matters as vehicles become more than transportation systems; in many ways they are now smartphones on wheels. As such, today’s users expect the same level of support from their vehicles, tailored to their individual needs and preferences.

To lend an extra hand and limit the distractions that connected multitasking can present while driving, NVIDIA DRIVE Concierge connects vehicle occupants to a range of always-on intelligent services. These include real-time conversational AI and support for commands that prompt Level 3 and above vehicles to search for parking spots on their own, self-park, and respond to a remote summons.

NVIDIA DRIVE Concierge is based on the company’s DRIVE IX cockpit software platform, which runs on the NVIDIA DRIVE Orin SoC, a software-defined AI platform NVIDIA is promoting as the car’s central computer. It also integrates tightly with NVIDIA DRIVE Chauffeur.

DRIVE Chauffeur leverages NVIDIA DRIVE AV software for its perception, mapping, and planning layers, along with DNNs that collect road data and take over driving responsibilities from the driver at certain times. For instance, Chauffeur technology is at the heart of the automated valet functionality mentioned previously.

The Chauffeur platform runs on Hyperion 8, a production-ready compute architecture that’s already supported by sensors from automotive tier 1 suppliers like Continental, Hella, Luminar, Sony, and Valeo for monitoring road conditions. It’s designed to work with a sensor suite composed of 12 cameras, 9 radars, 12 ultrasonic sensors, and one front-facing lidar, giving vehicles much more than a single pair of eyes on the road.
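To put the sensor counts above in one place, here is a minimal illustrative sketch of the Hyperion 8 reference suite as a data structure. The names and layout are this article's own assumptions for illustration, not NVIDIA's actual API.

```python
# Illustrative only: an inventory of the sensor suite described above.
# (Structure and names are assumptions, not part of any NVIDIA SDK.)
from dataclasses import dataclass

@dataclass(frozen=True)
class SensorGroup:
    kind: str
    count: int

# The article's stated suite: 12 cameras, 9 radars, 12 ultrasonics, 1 lidar.
HYPERION_8_SUITE = [
    SensorGroup("camera", 12),
    SensorGroup("radar", 9),
    SensorGroup("ultrasonic", 12),
    SensorGroup("lidar", 1),  # front-facing
]

def total_sensors(suite: list[SensorGroup]) -> int:
    """Sum the number of physical sensors across all groups."""
    return sum(group.count for group in suite)

print(total_sensors(HYPERION_8_SUITE))  # 34 sensors in total
```

Even this toy tally makes the point: perception software must fuse data from dozens of heterogeneous sensors at once.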

Together, NVIDIA DRIVE Concierge and DRIVE Chauffeur offer 360-degree, 4D visualization of the vehicle’s interior and exterior environment so drivers can become passengers whenever possible.

The Road to What’s Ahead, Behind, and All Around

How you use your vehicle and the assistance you need inside it matter, but knowing what’s occurring around the vehicle and how it will interact with each environment is just as pertinent. This is where AI quite literally takes the wheel, enabling higher levels of autonomy so that your vehicle can seamlessly interact with the real world around it.

Self-driving systems rely heavily on simulation and perception, measuring and detecting physical phenomena that can’t always be identified and analyzed by the human eye. Velocity, depth, occlusion, weather, and edge cases are some of the elements that can be labeled with ground truth in NVIDIA’s Omniverse Replicator.

The Omniverse Replicator engine generates synthetic data, complete with ground-truth labels, for training AI networks. Applied to the company’s DRIVE Sim simulation tool, which is used to train deep neural networks (DNNs) for autonomous vehicle perception systems, this synthetic data supplements real-world driving data to improve perception results. And because developers control how the data is generated, they can tailor it to the specific needs of their models.
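The key idea behind synthetic training data is that ground truth comes for free: the generator knows exactly where every object it placed is, so labels are exact. The following toy sketch (an assumption-laden illustration, not NVIDIA's pipeline) shows a generator labeling samples by a rule it knows, and a trivial model being fit to reproduce those labels.

```python
# Toy sketch of synthetic-data training. The generator itself assigns
# ground-truth labels (here, a simple occlusion-by-depth rule), so the
# "model" can be scored exactly. Names and rules are illustrative only.
import random

def generate_synthetic(n: int, seed: int = 0) -> list[tuple[float, bool]]:
    """Produce (depth, occluded) pairs; the label is exact ground truth
    because the generator decides where each simulated object sits."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        depth = rng.uniform(0.0, 100.0)  # simulated depth in meters
        occluded = depth > 50.0          # ground-truth labeling rule
        data.append((depth, occluded))
    return data

def fit_threshold(data: list[tuple[float, bool]]) -> tuple[float, float]:
    """Pick the integer depth threshold that best matches the labels."""
    best_t, best_acc = 0.0, 0.0
    for t in range(0, 101):
        acc = sum((d > t) == label for d, label in data) / len(data)
        if acc > best_acc:
            best_t, best_acc = float(t), acc
    return best_t, best_acc

train = generate_synthetic(1000)
threshold, accuracy = fit_threshold(train)
print(f"learned threshold: {threshold}, accuracy: {accuracy}")
```

Because the labels are generated rather than hand-annotated, the fitted model can be evaluated against a perfectly known answer, which is exactly what makes simulated edge cases (rare weather, heavy occlusion) so valuable for perception training.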

At this point, it seems it’s safe to ask, “what more could you possibly need in your vehicle?”

NVIDIA DRIVE technologies are intended to advance the use of AI software in automotive systems. The gap between our current reality and the possibilities of the future is no longer in the rearview mirror; these technologies are growing and evolving right along with us.

Worrying about other drivers on the road and the distractions that can occur while on it are among the things that make the journey frustrating. The ease that fully autonomous driving will soon bring to our daily lives will make it enjoyable to get where you need to go: no more figuring out how you’ll get there, whether you have time to make arrangements along the way, or whether you even feel like driving.

All you’ll have to do is just get in your car and go.

Tiera Oliver, Associate Editor for Embedded Computing Design, is responsible for web content edits, product news, and constructing stories. She also assists with newsletter updates as well as contributing and editing content for ECD podcasts and the ECD YouTube channel. Before working at ECD, Tiera graduated from Northern Arizona University where she received her B.S. in journalism and political science and worked as a news reporter for the university’s student led newspaper, The Lumberjack.
