The pace of design in the automotive sector has picked up considerably in just about every segment, from mission-critical engine control to the infotainment subsystem and everything in between. At the Automotive Technologies Virtual Conference, you'll hear from experts who cover five key areas of automotive design.
The live and semi-live presentations will go into deep technical detail, and some will offer the chance to ask questions in real time.
(subject to change)
Time | Presentation Title | Description
---|---|---
1:00 – 2:00 ET | More Miles, Less Wires: Revolutionizing Automotive Battery Management | EV drivers may not think much about the battery management system (BMS) in their cars, but we engineers know that it is a critical component. In this session, we will explore how an innovative wireless BMS solution can increase the reliability and range of EVs by replacing heavy cabling with a proprietary wireless protocol. (Daniel Torres, Product Marketing Engineer, Texas Instruments)
The automotive industry uses standard processes to ensure the safety of vehicle software. Their application to driver-assistance systems is well understood; their use in user-interface components is a relatively recent development. In driverless cars, the user interface takes on an increased safety role: it is the primary method of "controlling" the vehicle. The growing power of installed SoCs adds further complexity, as it is increasingly common for a single SoC to host both driver and infotainment functions in so-called "mixed criticality" systems.
Jim Carroll, CTO, Digica
From aircraft and spacecraft to space suit helmets and automobiles, heads-up displays (HUDs) are not a new concept. As computing power has grown exponentially over the last decade, the ability to integrate HUD data into a 3D real-world environment has become readily available to automotive OEMs and suppliers, with a low cost barrier to entry. While the augmented reality (AR) HUD is currently seeing significant expansion, it is not without technical challenges. This discussion will cover the background, technical definition, and visualization of AR HUDs, and their use with ADAS features to create an enhanced driving experience while minimizing driver distraction.
Chris Giordano, VP of UX/UI Technology, DiSTI
The focus of artificial intelligence in vehicular applications has always been outward facing: how does the car perceive the world that surrounds it and its position within it? An area with less focus, but of equal importance, is the vehicle looking inward, at how it monitors and interfaces with the driver. We look at the importance of flexible neural network accelerators that can power both ADAS/autonomous driving and in-cockpit HMI functions. Within these areas, we look at the use of convolutional neural networks and how Imagination has implemented them in real-world voice-enabled HMI solutions.
Bryce Johnstone, Director of Automotive Segment Marketing, Imagination Technologies
With the move to larger and more numerous screens, richer display environments, and the replacement of mechanical dials with digital displays, the graphics processing unit (GPU) is more fundamental than ever to keeping the relevant information displayed without delay or failure. We look at the needs of automotive-focused GPUs and how flexibility and a highly scalable architecture are key to keeping up with the growth of displays in vehicles. We also look at the key features required to render safety-critical information on a digital dashboard, including a deep dive into Imagination's Tile Region Protection and the future of Imagination's automotive GPU offering.
Bryce Johnstone, Director of Automotive Segment Marketing, Imagination Technologies
With new challenges like automated driving and electric powertrains, In-Vehicle Network (IVN) requirements are evolving quickly. The IVN has to support use cases such as the vehicle data backbone, smart antennas, ADAS cameras/sensors, displays, and data loggers, which demand higher data bandwidth while maintaining the reliability level required by the automotive industry. It is therefore necessary to define a new IVN standard for multi-gigabit optical communications in the automotive environment. A new optical automotive IVN communication standard is currently under development within IEEE under the name 802.3cz and is supported by several industry-leading companies. The new standard will cover rates up to 50 Gb/s and support several in-line connectors. The target BER is better than 10⁻¹², with ambient operating temperatures from -40ºC up to 105ºC (AEC-Q100 grade 2) in harsh automotive environments. High reliability (15 years of operation, less than 10 FIT) and outstanding EMC compliance will also be fulfilled.
In parallel with the standardization effort, KDPOF is developing an optical transceiver capable of supporting the new standard's requirements. The transceiver will be a single-component solution integrating the complete electronics (SerDes interface, PCS, PMA and PMD sublayers, MDIO, DSP, analog front-end, VCSEL driver, TIA, etc.), the photonics (VCSEL light source and photodiode), and the optics. The new transceiver will offer low power dissipation (around 1 W per port at 10 Gb/s) at an automotive-industry-standard cost. The new KDPOF technology will enable symmetric links as well as strongly unbalanced upstream/downstream rates suitable for applications such as cameras and displays.
As the auto industry approaches the 100 Gb/s·m speed-length threshold, the move from copper to optical physical transmission media is becoming mandatory; optical is the sound engineering path to higher data rates. IEEE 802.3cz is currently in the task force phase and targets data rates of 2.5, 5, 10, 25, and 50 Gb/s. KDPOF's first transceivers will be available as engineering samples at the end of 2021. Start of production is scheduled for the beginning of 2025, in time to reach the targeted car models.
Carlos Pardo, CEO and Co-founder, KDPOF
Despite advances, today’s touchscreens remain a vision-based window to the digital world. In automotive applications, where displays are growing in size, shape, and number, the risk of driver distraction increases. Programmable surface haptics add a new tactile experience to smooth, featureless screens, making it possible to find and operate controls by moving your fingers across the screen while keeping your eyes on the road. As a solid-state replacement for (or complement to) traditional vibrotactile haptics, they let HMI designers define exactly what controls and textures feel like. Learn how hardware, software, and HMI capabilities come together to bring these surfaces to life.
Kevin Klein, Vice President, Tanvas
Removing data securely from flash media is more challenging than from older magnetic media. The software and firmware must work in unison to provide the secure solutions that are increasingly in demand. In this session, we detail the secure interface from the application to the media and point out the possible pitfalls along the way.
Thom Denholm, Technical Product Manager, Tuxera
Offerings for assisted and autonomous driving (AD) will increase substantially in the coming decade. They will be dominated by the mass market adoption of advanced assistance features, i.e. AD L2.9, while “eyes off/hands off” solutions will be restricted to specific areas and applications such as car sharing or robot taxis. In this presentation, we tackle the main AD use cases and requirements and outline how u-blox high-precision lane-level positioning designs will address them. We then touch upon the necessity of safety solutions for ADL3+ and the main challenges associated with implementing them.
Stefania Sesia, Global Head of Application Marketing, Automotive, u-blox
Date: May 13, 2021
Time: 11:00 am ET
Dependency on software in the automotive environment is growing, and industry groups are leaning on automotive process and coding standards to reduce the risk of software failure. Our software team tackled both head-on while designing a new file system. This session details some of our findings and recommendations for projects using MISRA C and Automotive SPICE.
Thom Denholm, Technical Product Manager, Tuxera
Date: May 13, 2021
Time: 11:00 am ET
Roberto Bagnara, CTO and Chief Scientist, BUGSENG
Date: May 13, 2021
Time: 11:00 am ET
Lidar sensing is gaining traction as a viable modality for ADAS. In this presentation, we show a deep-learning-based processing flow for real-time detection of pedestrians using lidar. The flow is suitable for low-power, passively cooled, and cost-effective embedded SoCs. We also demonstrate ideas on how pedestrian detection rates can be maintained for lower-resolution solid-state lidars, offering a highly cost-efficient solution suited to mass deployment.
Kiwon Sohn, Ph.D., Renesas
Select the presentation(s) you would like to attend from the list on the left and fill out the form to register.
In the next few minutes, you should receive a confirmation email with your login information for each event you registered for. If you do not receive a confirmation email within 15 minutes, please email support.