Arrow Electronic Components Online

Development and solutions of ADAS and autonomous driving

Autonomous Driving | 17 Jul 2024
[Image: Close-up view of a Tesla car interior showcasing a futuristic steering wheel and dual digital displays.]

With the rapid advancement of semiconductor and artificial intelligence (AI) technologies, Advanced Driver Assistance Systems (ADAS) have become standard equipment in many vehicles, and autonomous driving is gradually emerging as a focal point for the future of automotive development. These sophisticated technologies use cameras, radar, sensors, and software to assist drivers by automatically detecting hazards and even controlling the vehicle when necessary, thereby reducing the occurrence of accidents. This article will introduce the development of ADAS and autonomous driving, along with related products and solutions.

ADAS reduce accidents caused by human errors

ADAS represent a suite of digital technologies that perform various computer vision functions to assist drivers with fundamental tasks such as parking and navigation, heralding the future of smart, safe driving. ADAS can include a range of subsystems, from adaptive cruise control to automated parking systems. These systems aim to prevent accidents caused by human errors, which account for over 90% of collisions today.
 
The most advanced ADAS safety technologies can automate and enhance many driving-related functions to improve safety and ensure proper driving operations. These subsystems can be broadly categorized into two types: passive ADAS systems that enhance driver awareness, such as lane departure warning systems and blind-spot detection, and active ADAS systems that take action, such as automatic emergency braking (AEB), adaptive cruise control (ACC), lane keeping assist (LKA), and lane centering (LC). 
 
ADAS assist drivers by perceiving traffic conditions, analyzing and understanding driving behavior, and applying predictive technologies that combine cloud computing, edge computing, and sensor data collection and analysis. These systems can notify drivers of potential vehicle issues in advance, prompting them to perform maintenance and ensure safety.

[Image: Illustration of a car equipped with advanced safety systems, including radar, camera, and ultrasound sensors.]

Sensors and software technologies used in ADAS

ADAS employ various sensors to enhance vehicle safety and provide a wide range of autonomous driving functions, with four common types of sensors. Firstly, camera sensors are widely used due to their lower cost, making camera-based solutions the most prevalent sensor technology in ADAS. Secondly, millimeter-wave radar sensors, which emit radio waves and measure their reflections to calculate the distance to objects, are typically used as part of collision avoidance systems. Thirdly, Light Detection and Ranging (LiDAR) sensors use lasers to detect distances and can also detect people and geographical anomalies. Fourthly, ultrasonic sensors are primarily used for parking assistance and automated parking systems.
 
In addition to sensors, software plays a crucial role, including the Human-Machine Interface (HMI), which improves the connection between the driver and the vehicle's automation system. Furthermore, AI technology plays a vital part in recognizing vehicles and pedestrians on the road and in intervening to control the vehicle in emergency situations.
 
ADAS and autonomous driving are two distinct technologies. ADAS is a collection of technologies aimed at enhancing overall driving safety, while autonomous driving refers to the vehicle's ability to drive itself without human intervention. However, it is important to emphasize that ADAS does not replace the driver. The primary use of ADAS technology is to improve road safety, and even though vehicles are equipped with more convenience and entertainment features, the driver must remain focused on driving the vehicle.

[Image: Close-up view of an Infineon microchip showcasing its design and structure.]

Key ADAS smart sensors and application processors

ADAS involves a diverse range of systems and components. Below are some important products for your reference.
 
Firstly, Infineon's XENSIV™ BGT60ATR24C is an automotive 60 GHz radar sensor capable of ultra-wide-bandwidth frequency-modulated continuous wave (FMCW) radar operation in a small package. The sensor supports 4 GHz of bandwidth and 2 TX / 4 RX channels, and can be configured and read out through a digital interface. An integrated state machine enables independent operation and data acquisition, with optimized power modes for the lowest power consumption, and the device is AEC-Q100/101 qualified.
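The quoted 4 GHz sweep bandwidth directly determines the radar's range resolution through the standard FMCW relationship ΔR = c / (2·B). The quick check below applies that textbook formula; it is a back-of-the-envelope illustration, not a figure taken from Infineon's documentation:

```python
C = 3e8  # speed of light (m/s)

def fmcw_range_resolution(bandwidth_hz):
    """Standard FMCW range resolution: delta_R = c / (2 * B)."""
    return C / (2 * bandwidth_hz)

# With a 4 GHz sweep bandwidth, two targets ~3.75 cm apart are separable
print(round(fmcw_range_resolution(4e9) * 100, 2))  # ~3.75 cm
```

This is why wide-bandwidth 60 GHz sensors can resolve fine structure, such as small movements of the chest wall for vital-sign monitoring.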
 
This new smart sensor for gesture recognition includes several blocks, such as the radio frequency (RF) front end, analog baseband (ABB), analog-to-digital converter (ADC), phase-locked loop (PLL), memory (e.g., FIFO), and serial peripheral interface (SPI). The BGT60ATR24C provides a high level of integration in a single chipset.
 
The core functionality of the BGT60ATR24C involves transmitting FMCW signals through the transmitter channel (TX) and receiving echo signals from target objects on four receiving channels (RX). Each receiver path includes baseband filtering, a voltage gain amplifier (VGA), and an ADC. The digitized output is stored in the FIFO. Data is transferred to an external host, microcontroller unit (MCU), or application processor (AP) to run radar signal processing.
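The TX/RX chain described above can be sketched in a few lines of Python. Mixing the transmitted chirp with its echo yields a "beat" tone whose frequency is proportional to target range, and the host recovers range from the digitized samples with a frequency transform. This is a minimal single-target sketch of the general FMCW principle, not Infineon's actual signal chain; all parameters are illustrative assumptions, and a plain DFT stands in for the FFT a real host would use:

```python
import math

# Hypothetical chirp parameters, loosely modeled on a 60 GHz FMCW sensor
C = 3e8              # speed of light (m/s)
BANDWIDTH = 4e9      # sweep bandwidth (Hz)
CHIRP_TIME = 64e-6   # chirp duration (s)
N_SAMPLES = 256      # ADC samples per chirp
FS = N_SAMPLES / CHIRP_TIME  # effective ADC sample rate

def beat_signal(target_range):
    """Simulate the digitized IF ("beat") signal for one target: mixing the
    TX chirp with its echo yields a tone at f_beat = 2 * slope * R / c."""
    slope = BANDWIDTH / CHIRP_TIME
    f_beat = 2 * slope * target_range / C
    return [math.cos(2 * math.pi * f_beat * i / FS) for i in range(N_SAMPLES)]

def estimate_range(samples):
    """Range processing: locate the dominant beat frequency with a plain
    DFT magnitude search, then map it back to distance."""
    n = len(samples)
    slope = BANDWIDTH / CHIRP_TIME
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    f_beat = best_k * FS / n
    return f_beat * C / (2 * slope)

print(round(estimate_range(beat_signal(1.5)), 2))  # ~1.5 m
```

In the real device this processing runs per receive channel, and comparing phase across the four RX antennas additionally yields the target's angle of arrival.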
 
Infineon's new smart sensor has already been applied to in-cabin radar sensing in a MulticoreWare solution built on the Cadence Vision P6 DSP. In-cabin radar sensing can monitor whether passengers are seated and track their health status. It excels at monitoring vital signs such as heart rate and can detect the presence of children when the car is locked. Additionally, the system provides intrusion alerts, enhancing the overall security of the vehicle.
 
Moreover, NXP has introduced the i.MX 95 application processor, which brings efficient and secure AI processing capabilities to automobiles. NXP's new i.MX 95 is designed to handle mixed-criticality functions in intelligent edge applications (including automotive) through flexible heterogeneous computing domains that meet ASIL-B and SIL2 safety standards. This processor provides secure and efficient AI processing capabilities for electronic cockpits (eCockpit) and connectivity domains.
 
The NXP i.MX 95 series application processors combine multicore high-performance computing, immersive 3D graphics, and integrated NXP eIQ® Neutron neural processing unit (NPU), enabling machine learning and advanced edge applications. Its application areas include automotive, industrial, and IoT.
 
For collision avoidance applications, OEMs and Tier 1 suppliers can now access affordable and reliable imaging radar sensor technology more easily. NXP has introduced a dedicated chipset, comprising the 16 nm FinFET S32R41 automotive imaging radar processor and the TEF82xx RFCMOS transceiver, used in a dual-cascading configuration.
 
NXP's radar chipset enables 4D imaging radar sensors with high cost-effectiveness and performance, featuring 48 channels, one-degree azimuth resolution, two-degree elevation resolution, a maximum detection range of 370 meters for vehicles, and a maximum detection range of 130 meters for a tire without a rim.

Autonomous driving technology becomes a key development for new-generation vehicles

Autonomous driving technology has undeniably become a crucial component in the development of next-generation vehicles. Companies like Tesla have already introduced advanced autonomous driving assistance systems. Among these, ADAS based on LiDAR sensors is one of the most innovative and efficient technologies for autonomous vehicles. When combined with vision-based and radar systems, LiDAR systems provide highly accurate object detection and recognition in ADAS. The integration of radar, LiDAR, and vision-based systems effectively creates a safer autonomous driving experience. 
 
LiDAR sensors emit invisible laser beams to scan and detect objects near or far from the sensor, creating a 3D map of objects and surroundings on a display. In automotive applications, most LiDAR sensors are mounted on the top of the vehicle. These sensors continuously rotate and generate thousands of laser pulses per second. The high-speed laser beams are emitted continuously around the vehicle's full 360-degree perimeter and are reflected off objects on the road. Using sophisticated machine learning algorithms, the returned data is converted into real-time 3D graphics, typically displayed as 3D images or maps of the surrounding objects.
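The per-pulse geometry behind that 3D map is simple: distance follows from each pulse's round-trip time, and the rotating head's azimuth and elevation angles place the return in Cartesian space. The sketch below illustrates only this principle; it is not any vendor's processing pipeline, and the names and sample values are hypothetical:

```python
import math

C = 299_792_458  # speed of light (m/s)

def tof_to_distance(round_trip_s):
    """A laser pulse travels out and back, so the target distance is
    half the round-trip time multiplied by the speed of light."""
    return C * round_trip_s / 2

def to_cartesian(distance_m, azimuth_deg, elevation_deg):
    """Convert one rotating-head LiDAR return (polar form) to an x/y/z
    point for the 3D map."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.cos(az)
    y = distance_m * math.cos(el) * math.sin(az)
    z = distance_m * math.sin(el)
    return (x, y, z)

# A return arriving 100 ns after emission corresponds to ~15 m
d = tof_to_distance(100e-9)
print(round(d, 2))  # ~14.99 m
```

Repeating this conversion for thousands of pulses per second produces the point cloud that downstream perception algorithms segment into vehicles, pedestrians, and road geometry.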
 
eInfochips, a subsidiary of Arrow Electronics, serves as a provider of automotive engineering services and solutions, assisting automotive companies in designing and developing ADAS systems using vision-based, radar, and LiDAR sensors. At the 2024 Embedded World (#ew24) exhibition, eInfochips showcased an autonomous mobile robot (AMR) capable of navigating between points while avoiding dynamic obstacles. It features advanced technologies, including Time of Flight (ToF) sensors, imaging sensors, an inertial measurement unit (IMU), and is powered by an NVIDIA main processor. Motor control and power management, including batteries, are handled by ADI components, demonstrating complex design capabilities.
 
Camera modules are also indispensable components in autonomous driving applications. D3 Engineering's DesignCore® camera series is well-suited for embedded vision applications that require the highest safety and precision. These cameras enable rapid prototyping and customized design for customers' production systems. D3's high-performance camera portfolio includes the new DesignCore® Discovery, Velocity, and Chroma series, each designed to maximize image quality. These cameras enhance the optical signal through higher resolution and wider apertures and are optimized for out-of-the-box AI applications.
 
The AR0234CS by onsemi is also highly suitable for autonomous driving applications. It is a 1/2.6-inch 2.3 MP CMOS digital image sensor with a global shutter and an active pixel array of 1920 (H) x 1200 (V). This sensor uses an innovative global shutter pixel design to capture moving scenes accurately and quickly at 120 frames per second at full resolution. It produces clear, low-noise images in both low-light and bright scenes. The AR0234CS delivers industry-leading global shutter efficiency, producing extremely clear and sharp digital images, making it ideal for both continuous video and single-frame capture in autonomous driving applications.
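Streaming the full 1920 x 1200 array at 120 fps implies a substantial raw data rate that the downstream interface and processor must sustain. The quick estimate below assumes 10-bit pixel output (an assumption for illustration; the sensor supports multiple output formats) and ignores blanking and protocol overhead:

```python
def sensor_data_rate_gbps(width, height, fps, bits_per_pixel):
    """Raw pixel data rate in Gbit/s, ignoring blanking intervals
    and interface protocol overhead."""
    return width * height * fps * bits_per_pixel / 1e9

# Full-resolution 120 fps mode; 10-bit output is an assumption here
rate = sensor_data_rate_gbps(1920, 1200, 120, 10)
print(round(rate, 2))  # ~2.76 Gbit/s of raw pixel data
```

Estimates like this are why high-frame-rate global shutter sensors are typically paired with high-bandwidth serial links and processors with hardware image pipelines.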
 
Although autonomous driving technology still needs more time to fully prove its capabilities, the continuous improvement in edge computing power indicates that its maturation is not far off. Meanwhile, AMRs will continue to help the market prepare more robust AI algorithms before they are applied to passenger cars on the road.
 
As shipments of mobile robots surge to meet growing demand from industries seeking operational efficiency, NVIDIA is launching a new platform to support the next generation of AMR applications. NVIDIA's Isaac AMR brings advanced autonomy to mobile robots, offering advanced mapping, autonomy, and simulation capabilities. It is a platform for simulating, validating, deploying, optimizing, and managing AMRs. It includes edge-to-cloud software services, computing, and a set of reference sensors and robotic hardware to accelerate the development and deployment of AMRs, thus reducing costs and shortening time to market.

Conclusion

The ongoing advancement of ADAS and autonomous driving technology is steering us into a new era of transportation. As these technologies mature, ADAS provide a safer and more convenient driving experience while laying the groundwork for fully autonomous driving. Although full autonomy faces numerous challenges, including technical complexity as well as legal and ethical issues, it is undeniable that innovation and progress in this field are advancing rapidly. Due to space limitations, the solutions discussed in this article represent only a small portion of the relevant applications. For more information on ADAS system design methods and component topics, please contact Arrow Electronics directly.

Article Tags

Automotive & Transportation
Autonomous Machines
Automotive
Article
APAC
