Intel's Mobileye deal marks an inflection point in automotive electronics

April 17, 2017

In summer 2016, when BMW, Intel, and Mobileye joined hands to bring fully autonomous cars to roads by 2021, the collaboration raised a few questions. For instance, what is Intel doing in this alliance without an automotive chip? And why would Mobileye share its renowned machine-vision algorithm with a chipmaker?

The answer came in March 2017, when Intel announced it had acquired Mobileye for a whopping $15.3 billion – its second-largest acquisition to date. The trade media, however, remained busy with the usual tidbits that accompany technology acquisitions and largely missed a critical dimension of this multi-billion-dollar deal.

So far, the automotive electronics industry has mostly sprinkled cars with low-cost MCUs that carry out a diverse array of functions in powertrain, infotainment, and body electronics. However, the advent of advanced driver assistance systems (ADAS), and subsequently the quest for self-driving cars, is shifting the focus toward larger and more powerful systems-on-chip (SoCs) catering to functions such as vision processing and sensor fusion.

Intel’s earlier acquisition of Yogitech, a provider of functional safety verification tools and solutions, testifies that the world’s largest chipmaker is aiming at more compute-intensive automotive SoCs for the ADAS and autonomous car markets. And building large chips is Intel’s forte anyway. All it needed was a springboard, and Mobileye looks like a good match for both ADAS today and self-driving cars in the future.

A majority of car OEMs are already Mobileye customers. Mobileye’s current ADAS offerings include lane departure warning, forward collision warning, headway monitoring, pedestrian detection, intelligent headlight control, and traffic sign recognition. Moreover, its next-generation vision SoC, EyeQ 5, expected to be available for sampling in 2018, is claimed to provide the central computing platform for sensor fusion in fully autonomous vehicles by 2020. The capabilities the EyeQ 5 chipset is set to offer include 360-degree coverage via cameras and ultrasonic sensors.

[The block diagram of Mobileye's EyeQ 5 chipset for autonomous cars.]

Mobileye is a success story in the nascent ADAS market. Back in the mid-2000s, the Jerusalem-based firm co-developed a vision processor with STMicroelectronics and used it to process visual information for driver assistance with cheap mobile-phone cameras. BMW, General Motors, Nissan, and Volvo were early adopters of Mobileye’s vision solution, and eventually Mobileye became the ADAS market leader with an 80 percent share of the front-view camera processor market.

Mobileye has come a long way since then. Apart from its vision dataset and algorithms, it now boasts a localization mapping technology called Road Experience Management (REM) and sensor fusion software, which constitute the key building blocks of an autonomous driving architecture.

Intel missed the mobile boat, and now automotive electronics is the next big thing in the silicon business. It will be interesting to watch how Intel executes this crucial bet on automotive electronics.

Majeed Ahmad is former Editor-in-Chief of EE Times Asia. He is a journalist with an engineering background and two decades of experience in writing and editing technical content. He is also the author of six books on electronics: Smartphone, Nokia’s Smartphone Problem, The Next Web of 50 Billion Devices, Mobile Commerce 2.0, Age of Mobile Data, and Essential 4G Guide.

Majeed Ahmad, Automotive Contributor