Navigating Mixed-Criticality Requirements of Integrated ADAS and Infotainment
February 14, 2019
As autonomous vehicle functions allow drivers and passengers to devote more of their attention to something other than the road, infotainment systems will become the focal point.
Infotainment design is more critical than ever as autonomous vehicle functions are integrated into displays to keep drivers and passengers constantly informed of their surroundings.
Dashboards have come a long way from the standard AM/FM radio and temperature controls. Drivers are now presented with massive infotainment screens that display not only entertainment and comfort settings, but also safety information generated by advanced driver assistance systems (ADAS).
Infotainment displays have been the home of backup camera video streams for years now, and as vehicles become more autonomous, we can expect to find more safety-related information in that space. Meanwhile, as autonomous vehicle functions allow drivers and passengers to devote more of their attention to something other than the road, infotainment systems will become the focal point of human cognition – both for safety- and non-safety-critical interactions.
“Whereas infotainment systems have traditionally been built for quick interactions with minimal distractions, developers are now free to focus on other aspects of the system such as a richer and more detailed feature set and more entertainment options, and so on,” said Daniel Sisco, Renesas Electronics Corporation’s Director of Automotive Advanced Systems Innovation Department. “This will influence which third-party applications are installed, HMI design and the ‘density’ of infotainment applications.”
“As vehicles transition from level 1 and 2 autonomy to level 3 and above, the driver is enabled and encouraged to safely spend their time focused elsewhere. Infotainment functions that were previously minimized, de-prioritized, or restricted, become more prominent,” he continued. “For example, think of in-vehicle entertainment for the passengers and driver or highly interactive maps for use while driving, etc. The integration of safety-rated functions into the infotainment space means vehicle system and software architectures must accommodate the mixed safety level.”
Hypervisors Keep Operations in Their Lanes
A close precedent for the multiple, mixed-criticality workloads that will increasingly be found in the infotainment space is the automotive digital cockpit market, where clusters, connectivity, and multiple displays are leveraging more powerful hardware and virtualization technology. Through virtualization, multiple cockpit functions can be powered by the same processor yet isolated from one another, so that multiple OSs run simultaneously while remaining protected against failures, errors, or crashes in the other environments.
As more autonomous drive functions are integrated with infotainment systems, a similar approach will be required. But whereas systems like the instrument cluster are easier to separate from cameras and IVI software, autonomous drive functions will require a deeper level of integration. To keep up with the trend of hardware consolidation in automotive system designs, implementing a hypervisor may be the answer.
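As a rough illustration of the static partitioning a hypervisor enforces, each guest OS can be given exclusive CPU cores and a private physical memory window, so a crash in one partition cannot corrupt another. The sketch below is hypothetical – the partition names and resource maps are invented for illustration and are not taken from any real hypervisor or from the R-Car SoCs discussed here:

```python
# Hypothetical static partition table for a consolidated cockpit SoC:
# each guest OS gets exclusive CPU cores and a private physical memory
# window, so no partition can touch another's resources.
PARTITIONS = {
    "cluster_rtos": {"cores": {0, 1}, "mem": (0x40000000, 0x4FFFFFFF)},
    "ivi_linux":    {"cores": {2, 3}, "mem": (0x50000000, 0x7FFFFFFF)},
    "adas_overlay": {"cores": {4, 5}, "mem": (0x80000000, 0x8FFFFFFF)},
}

def overlaps(a, b):
    """True if two (start, end) physical memory ranges intersect."""
    return a[0] <= b[1] and b[0] <= a[1]

def validate(partitions):
    """Reject any configuration in which guests share cores or memory."""
    names = list(partitions)
    for i, x in enumerate(names):
        for y in names[i + 1:]:
            if partitions[x]["cores"] & partitions[y]["cores"]:
                raise ValueError(f"{x} and {y} share CPU cores")
            if overlaps(partitions[x]["mem"], partitions[y]["mem"]):
                raise ValueError(f"{x} and {y} share physical memory")
    return True

validate(PARTITIONS)  # a valid static partitioning passes the check
```

In a real hypervisor this table would live in a build-time configuration rather than application code, but the invariant is the same: resource assignments must be provably disjoint before any guest boots.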
“The advantages of hypervisors are that they allow each application to have its own OS but share the SoC and storage. Even within the storage, there would be physical partitions,” said Russell Ruben, Director of Embedded and Integrated Solutions at Western Digital. “With the storage being partitioned per application, it prevents data from one application being shared with another, therefore providing a secure solution, as if they were separate storage devices."
“One example is storage used in a hypervisor environment to support IVI, digital instrument cluster, HUD, AR/VR, and so on,” Ruben continued. “Also, as vehicles become more autonomous and adopt Gb Ethernet to transfer data around the car, the need for performance will increase drastically (See Sidebar). The autonomous vehicle needs quick data access to computer vision algorithms, software applications, and other data to make instantaneous, real-time decisions."
To support the performance needed by advanced, mixed-criticality infotainment systems, Renesas’ R-Car M3 and H3 SoCs provide multiple application cores with a real-time lockstep controller. According to Sisco, the devices are already being designed into digital cockpits to support similar requirements, with more highly integrated IVI system designs on the horizon.
“The Renesas R-Car M3 and H3 SoCs provide a popular and effective combination for infotainment solutions,” Sisco said. “A combination of the company’s hardware peripherals and third-party components including safety-rated virtualization systems enable the company to support both safety-rated and non-safety-rated functions on a single R-Car SoC.”
Infotainment GUI Design
The graphical user interface (GUI) is perhaps the most important piece of technology in the infotainment stack, especially where safety functions are concerned in autonomous and semi-autonomous vehicle use cases. When it comes to developing infotainment user interfaces (UI), many engineers are well versed in designing interfaces that are highly usable and captivating. But as more functionality is integrated onto a single hardware platform, they must also develop solutions that are safe, reliable, and efficient enough to run on a single embedded processor.
“Automotive is still really in its infancy with regards to establishing cost-efficient and safe development methodologies,” said Christopher P. Giordano, Vice President – UX/UI Technology at DiSTI. “Your car is now basically a computer on wheels. If your website crashes in your office, no one dies. If your car’s computer crashes on the road, it could be catastrophic. As such, this job function has been morphing.”
“We see safety critical requirements mostly in the Cluster and HUD space. However, we have been seeing some requirements sneak into the IVI as well, especially as it pertains to a single system to run multiple displays,” Giordano continued. “What we like to target as a best practice is to find a way to have the UX/UI design groups working in tandem with the implementation engineers, the best of both worlds. This way, you can get an efficient development workflow that OEMs and suppliers can iterate on very quickly.
“Another capability we recommend to developers is to leverage the knowledge that already exists,” he said. “While the avionics development process is far more rigorous with mandatory third-party independent reviews, automotive OEMs can take a page out of that development process and leverage it against the ISO 26262 standard, which is still relatively open to interpretation.”
DiSTI’s GL Studio GUI development tool supported 3D capabilities even before the advent of GPUs, and, as such, was designed to be safe, reliable, and efficient enough for a range of embedded hardware targets. Today, however, the company has introduced a new state machine within the GL Studio environment, which allows users of all skill levels to leverage easy-to-use block-based programming to create engaging UIs.
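The state machine approach can be sketched with an ordinary table-driven design, where every screen transition is declared up front so safety events cannot be silently dropped. The screen names and events below are invented for illustration and are not GL Studio's actual API:

```python
# Minimal table-driven state machine for an infotainment HMI.
# States and events are hypothetical, not taken from GL Studio.
TRANSITIONS = {
    ("home", "open_media"):        "media",
    ("home", "open_nav"):          "navigation",
    ("media", "back"):             "home",
    ("navigation", "back"):        "home",
    # A safety event preempts any screen, declared for every state.
    ("home", "backup_cam"):        "rear_camera",
    ("media", "backup_cam"):       "rear_camera",
    ("navigation", "backup_cam"):  "rear_camera",
}

class HmiStateMachine:
    def __init__(self, initial="home"):
        self.state = initial

    def handle(self, event):
        """Apply an event; unknown events leave the state unchanged."""
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

hmi = HmiStateMachine()
hmi.handle("open_media")  # moves to the media screen
hmi.handle("backup_cam")  # safety event preempts it: rear_camera
```

Because the full transition table is data rather than scattered callbacks, it can be reviewed and exhaustively tested, which is part of what makes state-machine tooling attractive for safety-adjacent HMI work.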
Looking Down the Road
As driving systems become more automated, IVI systems will surpass steering wheels as the main user interface within vehicles. What we now think of as a standard dashboard may be unrecognizable in the near future, making car shopping a very different experience.
Autonomous vehicles will change the way society looks at driving: passengers will be able to sleep, catch up on the news, or text in a moving vehicle without having to worry about ending up in a collision. Soon we’ll be able to sit back, relax, and let hypervisors do all the work.