Advanced Driver Assistance Systems (commonly known as ADAS) are a suite of technologies designed to enhance vehicle safety and improve the overall driving experience. These systems utilize automated technology, including various sensors and cameras, to detect nearby obstacles or potential driver errors and respond accordingly. ADAS has gradually become standard equipment in new vehicles, with market demand growing rapidly. This article will introduce the applications and development of ADAS, as well as the high-quality image sensor solutions launched by onsemi.
ADAS significantly enhances vehicle safety, comfort, and convenience
ADAS offers a wide range of applications that significantly enhance vehicle safety, comfort, and convenience. These include Adaptive Cruise Control, Obstacle Detection, Collision Avoidance, Lane Departure Warning, Traffic Sign Reading, Emergency Braking, Pedestrian Detection, Autonomous Driving, Traffic/Stop Light Perception, security and comfort functions, Vehicle Occupancy sensing, and Object and Presence Detection. ADAS platforms are equipped with the latest interface standards and can run multiple vision-based algorithms concurrently, supporting real-time multimedia, vision coprocessing, and sensor fusion subsystems. Their primary aim is to prevent accidents and injuries by reducing the number of collisions and mitigating the severity of those that cannot be avoided.
ADAS operation relies on a complex interplay between forward/surround sensing and in-cabin sensing technologies. The sensing system acts as the vehicle's eyes and ears, using image sensors, LiDAR, radar, ultrasonic sensors, and cameras to gather vital data about the surrounding environment. This raw data is then processed and presented to the driver through viewing functions: information can appear on a head-up display (HUD) projected onto the windshield, minimizing distraction, or on a traditional in-dash display that requires only a momentary glance.
Cameras are one of the most common types of ADAS sensors used in contemporary vehicles. They play a crucial role in detecting objects on the road, including cars, cyclists, and pedestrians. Without cameras, the car would essentially be blind to the world around it.
As technology continues to evolve, ADAS is expected to play an increasingly significant role in shaping the future of transportation. By leveraging a combination of image sensors, LiDARs, ultrasonics, and radars, ADAS can provide valuable insights to the driver, helping them make better-informed decisions on the road. This technology promises to progressively automate driving and enhance vehicle safety, paving the way for safer and more efficient transportation.

ADAS offers a suite of features designed to enhance driver safety and improve the overall driving experience
ADAS represents a significant leap in automotive technology, offering a suite of features designed to enhance driver safety and improve the overall driving experience. Key components of ADAS include Adaptive Cruise Control (ACC), Pedestrian Detection, Parking Detection, Surround View Cameras, Lane Detection, and Blind Spot Detection.
Adaptive Cruise Control (ACC) is an advanced feature that builds upon traditional cruise control. Utilizing sensors such as radar, LiDAR, or cameras, ACC continuously monitors the road ahead to detect the presence and speed of other vehicles. It automatically adjusts the car's speed or engages the braking system to maintain a preset safe distance from the vehicle in front.
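The core of the ACC behavior described above can be sketched as a simple feedback loop. The controller gains, vehicle model, and target gap below are illustrative assumptions for a minimal proportional controller, not the implementation of any real ACC system.

```python
# Hypothetical sketch of an ACC control loop: adjust ego speed so the
# gap to the lead vehicle converges to a preset safe distance.
# Gains and limits are illustrative, not from any production system.

def acc_step(gap_m, ego_speed, lead_speed, target_gap_m=40.0,
             kp_gap=0.2, kp_speed=0.5, dt=0.1, max_accel=2.0):
    """Return the ego vehicle's new speed (m/s) after one control step."""
    # Error terms: too small a gap or closing too fast -> slow down.
    gap_error = gap_m - target_gap_m
    speed_error = lead_speed - ego_speed
    accel = kp_gap * gap_error + kp_speed * speed_error
    accel = max(-max_accel, min(max_accel, accel))  # comfort limits
    return max(0.0, ego_speed + accel * dt)

# Simulate: ego starts fast and too close behind a 25 m/s lead vehicle.
gap, ego, lead = 25.0, 32.0, 25.0
for _ in range(600):  # 60 s at 10 Hz
    ego = acc_step(gap, ego, lead)
    gap += (lead - ego) * 0.1

print(round(gap, 1), round(ego, 1))  # gap settles near 40 m at lead speed
```

The same structure generalizes: a production ACC replaces the proportional terms with a tuned longitudinal controller and feeds it fused radar/camera distance estimates.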
Lane Detection / Lane Keep Assist is an essential feature for extended highway driving. A camera positioned on the windshield continuously captures footage of lane markings on the road. Advanced image processing software analyzes this video feed, identifying lane boundaries and specific patterns such as dashed lines. If the vehicle unintentionally departs from its lane without activating the turn signal, the system triggers an alert, which can be an alarm or a vibration of the steering wheel and may also assist in steering.
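The image-processing idea behind lane detection can be illustrated with a toy example: threshold bright lane markings in a grayscale frame, then locate them by counting bright pixels per image column. This is a deliberately simplified sketch on a synthetic image; real lane-keep systems use far more robust pipelines (perspective transforms, temporal filtering, curve fitting).

```python
import numpy as np

def find_lane_columns(gray, thresh=200):
    """Return column indices of the strongest lane-marking peak in each half."""
    mask = gray > thresh                       # bright pixels only
    column_hist = mask.sum(axis=0)             # votes per image column
    mid = gray.shape[1] // 2
    left = int(np.argmax(column_hist[:mid]))   # best peak on the left
    right = int(np.argmax(column_hist[mid:])) + mid  # and on the right
    return left, right

# Synthetic 100x200 road image: dark asphalt with two white stripes.
img = np.full((100, 200), 50, dtype=np.uint8)
img[:, 48:52] = 255    # left marking near column 50
img[:, 148:152] = 255  # right marking near column 150

left, right = find_lane_columns(img)
lane_center = (left + right) / 2
print(left, right, lane_center)  # stripes found; lane center near column 100
```

Comparing the detected lane center against the camera's optical center is the basis for the departure alert: a growing offset without an active turn signal triggers the warning.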
Pedestrian Obstacle Detection is a critical safety component. It employs cameras with advanced computer vision software to scan the road for pedestrians and obstacles. By analyzing shapes, heat signatures from thermal cameras, and movement patterns, the system can identify pedestrians, obstacles, and potential collision scenarios.
Surround View Cameras are strategically positioned around the vehicle to provide a comprehensive 360-degree view. This bird's-eye view is displayed on the in-car screen, helping drivers visualize potential obstacles that might be hidden from their normal viewpoint. This feature greatly aids in parking and low-speed maneuvers, enhancing situational awareness and reducing the stress of navigating confined spaces.
ADAS drives the rapid development of fully autonomous vehicles
ADAS is also driving the development of fully autonomous vehicles. The extent of automation available to the driver has a defined scale, known as the SAE Levels of Driving Automation, which divides driving into six levels ranging from Level 0 (No Driving Automation) to Level 5 (Full Driving Automation). At Level 0, the driver performs all driving tasks: steering, acceleration, braking, and so on. At Level 1, the system provides sustained assistance with either longitudinal (speed) or lateral (steering) control, while the driver handles everything else. At Level 2, the system can control both speed and steering, but the driver must monitor it at all times. At Level 3, the driver no longer has to monitor the system continuously but must always be in a position to resume control. Level 4 does not require a driver during defined use cases. At Level 5, the vehicle performs all driving tasks under all conditions, requiring no human interaction.
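The six levels above can be condensed into a small lookup table. The wording below paraphrases this article's summary, not the official SAE J3016 text.

```python
# Compact reference table for the SAE levels summarized above
# (paraphrased; see SAE J3016 for the authoritative definitions).

SAE_LEVELS = {
    0: "No Driving Automation: driver performs all driving tasks",
    1: "Driver Assistance: system assists with steering OR speed",
    2: "Partial Automation: system steers and controls speed; driver monitors",
    3: "Conditional Automation: driver may disengage but must be ready to resume",
    4: "High Automation: no driver needed within defined use cases",
    5: "Full Automation: vehicle drives itself under all conditions",
}

def requires_constant_monitoring(level):
    """Levels 0-2 keep the human responsible for monitoring the road."""
    return level <= 2

print(requires_constant_monitoring(2), requires_constant_monitoring(3))
# True False
```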
As technology continues to evolve, automobiles will continue to ascend the SAE Levels ladder, adding more automated driving features over time. One thing is certain: each higher level will require ever more advanced image sensors to achieve its mission.
If a vehicle equipped with ADAS or an Autonomous Driving System (ADS) is to travel at high speed, it must be able to detect objects at long distances. For example, detecting a child crossing the road while traveling at 35 mph requires a detection range of about 50 meters, while detecting that same child at 62 mph requires a range of 160 meters. The relationship between speed and detection range is not linear: it involves the car's mass, inertia, and the time needed to perform a safe maneuver. As the car speeds up, the required detection range grows by an even greater proportion.
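A back-of-the-envelope stopping-distance model makes the nonlinearity concrete: the total distance is reaction distance (linear in speed) plus braking distance (quadratic in speed). The reaction time and deceleration below are illustrative assumptions, not figures from the article or any standard, so the absolute numbers differ from the article's 50 m / 160 m examples while showing the same trend.

```python
# Why required detection range grows faster than speed:
# stopping distance = reaction distance + braking distance,
# and braking distance scales with the SQUARE of speed.
# reaction_s and decel_mps2 are assumed illustrative values.

def required_range_m(speed_mph, reaction_s=1.5, decel_mps2=5.0):
    v = speed_mph * 0.44704                 # mph -> m/s
    reaction = v * reaction_s               # distance covered before braking
    braking = v ** 2 / (2 * decel_mps2)     # v^2 / (2a)
    return reaction + braking

d35 = required_range_m(35)
d62 = required_range_m(62)
print(round(d35), round(d62))  # -> 48 118
# Speed rose by 1.8x, but the required range grew by roughly 2.5x.
```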
Furthermore, driving at maximum speeds requires reliable recognition at a distance, both day and night. Night conditions for autonomous driving can be extremely complex, presenting some well-defined challenges. First, the sensor must accommodate large lighting variations, ranging from an urban environment with some ambient light to a rural environment without any ambient light. Detecting objects on the side of the car without any illumination is most challenging.
Then there is the headlights challenge. Headlights can be operated in low beam and high beam, and their brightness can also vary from country to country. Detecting smaller objects is especially challenging because the on-scene illumination from the headlights decreases inversely with the square of the distance. At night, beyond a certain distance, human beings are unable to see a lost tire or a small rock on the road, even with headlight illumination, whereas an ADAS system should be able to "see". Considering all the different variables, detecting objects at night is challenging but essential for the safety of ADAS and autonomous driving.
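The inverse-square falloff mentioned above can be made concrete: doubling the distance to an object cuts the headlight illumination reaching it to a quarter. The values below are relative (normalized to an arbitrary 10 m reference) and purely illustrative.

```python
# Inverse-square law for on-scene headlight illumination:
# illumination is proportional to 1/d^2, so small distant objects
# receive very little light. Reference distance is an assumption.

def relative_illumination(distance_m, ref_m=10.0):
    """Illumination relative to the level at ref_m, per 1/d^2."""
    return (ref_m / distance_m) ** 2

for d in (10, 20, 40, 80):
    print(d, relative_illumination(d))
# 10 -> 1.0, 20 -> 0.25, 40 -> 0.0625, 80 -> 0.015625
```

At 80 m, the object receives under 2% of the light it would at 10 m, which is why long-range night detection pushes sensor sensitivity requirements so hard.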
CMOS digital image sensors suitable for ADAS and autonomous vehicle applications
onsemi has developed a family of small pixel, high resolution, low power consumption CMOS digital image sensors called Hyperlux. A member of that family, the AR0823AT (2.1 μm, 8.3 MP, 1/1.8-inch), is ideally suited for ADAS and autonomous vehicle applications. AR0823AT delivers leading performance including 150 dB High Dynamic Range (HDR) operation, LED Flicker Mitigation (LFM), low light performance, high pixel sensitivity, sharpness, and resolution.
To demonstrate its applicability for automotive applications, onsemi has measured and tested the AR0823AT in the lab, as well as outdoors in various day and night scenarios. For example, tests were run to establish detection capabilities for Vulnerable Road Users (VRUs) and small objects. The study focused on sunset and rural night conditions with high and low beam headlights. Three different camera Fields of View (FOVs) were used: 30-degree, 60-degree, and 120-degree.
The AR0823AT sensor captures images in HDR with LFM. The 2.1 μm Super Exposure pixel enables up to 150 dB of dynamic range without the need for auto exposure adjustment. This significantly reduces latency in scene-dependent critical automotive systems, enabling faster and safer data gathering and decision-making, without pauses, missed or incomplete image information.
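To put the 150 dB figure in perspective: dynamic range in decibels is defined as 20·log10 of the ratio between the brightest and darkest distinguishable signals. The conversion below is the standard definition, not an onsemi-specific formula.

```python
import math

# Standard dB definition of dynamic range and its inverse.

def dynamic_range_db(max_signal, min_signal):
    return 20 * math.log10(max_signal / min_signal)

def signal_ratio(db):
    """The brightest-to-darkest ratio a given dB figure implies."""
    return 10 ** (db / 20)

print(f"{signal_ratio(150):.2e}")           # 150 dB -> a ratio of ~3.16e7
print(round(dynamic_range_db(10**7.5, 1)))  # and back: 150
```

A ratio above thirty million to one is what lets a single exposure hold detail in both a dark tunnel interior and the sunlit exit.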
In addition to the above, the AR0823AT delivers best-in-class performance in low light conditions due to the high sensitivity of its Backside Illuminated (BSI) sensor, sub-electron dark noise floor, and excellent thermal stability across the entire automotive temperature range from -40°C to +125°C junction temperature.
Thanks to its high color fidelity, the AR0823AT sensor also enables simultaneous, non-conflicting camera use for sensing and viewing applications, and it supports high-precision stereo camera configurations that offer better detection at longer distances.
In addition, onsemi also launched the AR0820AT image sensor, which is a 1/2-inch CMOS digital image sensor with a 3848 H x 2168 V active−pixel array. This advanced automotive sensor captures images in either linear or high dynamic range, with rolling−shutter readout.
AR0820AT is optimized for both low light and challenging high dynamic range scene performance, with a 2.1 µm DR−Pix BSI pixel and on−sensor 140 dB HDR capture capability. The sensor includes advanced functions such as in−pixel binning, windowing, and both video and single frame modes to provide a flexible Region of Interest (ROI) or specific resolution in order to enhance performance in extreme low light conditions. The sophisticated sensor fault detection features and embedded data on AR0820AT are designed to enable camera ASIL B compliance. The device is programmable through a simple two−wire serial interface and supports a MIPI output interface.
The AR0820AT features a high performance 2.1 µm automotive grade Backside Illuminated (BSI) pixel with DR−Pix™ technology. It supports advanced on−sensor HDR reconstruct with flexible exposure ratio control, enabling fast full resolution video capture of 3840 x 2160 at up to 40 fps in 3−exposure HDR and 30 fps in 4−exposure HDR. It also supports line interleaved T1/T2/T3/T4 output and sensor fault detection for ASIL−B compliance. It supports 2 x 2 in−pixel binning mode and color binning mode.
The AR0820AT also features a 1.8 Gbps/lane data interface, a 4−lane MIPI CSI−2 interface, selectable automatic or user-controlled black level control, frame-to-frame switching among up to 4 contexts to enable multi-function systems, and support for multi-camera synchronization. It offers multiple CFA options, including RGB, RCCC, and RCCB, and is a Pb-free device. The AR0820AT is well suited to front-view cameras for ADAS and autonomous driving; end products include front-view ADAS cameras, surround sensing cameras, in-cabin monitoring, robotaxis, robotic delivery vehicles, and autonomous trucks.
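A quick sanity check shows that the quoted video modes fit the quoted link: raw pixel data for 3840 x 2160 at 40 fps must fit within 4 MIPI lanes at 1.8 Gbps each. The 12-bit RAW pixel depth below is an assumption for illustration, and protocol overhead and blanking are ignored.

```python
# Bandwidth sanity check for the quoted AR0820AT figures.
# BITS_PER_PIXEL is an assumed RAW12 depth, not a datasheet value here.

LANES = 4
GBPS_PER_LANE = 1.8
BITS_PER_PIXEL = 12

def raw_rate_gbps(width, height, fps, bpp=BITS_PER_PIXEL):
    """Raw pixel data rate in Gbps, ignoring protocol overhead."""
    return width * height * fps * bpp / 1e9

link = LANES * GBPS_PER_LANE              # 7.2 Gbps of link capacity
video = raw_rate_gbps(3840, 2160, 40)     # ~3.98 Gbps of pixel data
print(round(video, 2), "Gbps of", link, "Gbps available")
```

Even with real-world overhead, the 4K/40 fps HDR mode sits comfortably inside the 4-lane link budget.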
Conclusion
Advanced Driver Assistance Systems (ADAS) and solutions for enhancing vehicle safety are not only a manifestation of technological progress but also a crucial step towards promoting intelligent transportation and autonomous driving into the future. Through the synergistic operation of sensor fusion, artificial intelligence judgment, and real-time communication, ADAS can effectively reduce driving risks, decrease accident rates, and provide a higher level of safety assurance for drivers and passengers. As vehicle connectivity and autonomous driving technologies mature, the high-quality image sensors specifically introduced by onsemi for ADAS applications will assist in achieving comprehensive, preventive, and intelligent safety management, laying a solid foundation for creating a safer and more efficient traffic environment.
