Sensor fusion enhances the navigation and safety of Autonomous Mobile Robots

The philosophy behind Industry 5.0 is humans working alongside artificial intelligence (AI)-powered robots, with the vision that robots support humans rather than replace them. Autonomous Mobile Robots (AMRs) can improve productivity, enhance safety, and save manufacturers significant costs, and for these reasons their adoption will expand into almost every industry. Before that happens, however, AMRs must overcome several challenges, and one of the keys is the integration of various sensors through the emerging field of sensor fusion. This article introduces the development of sensor fusion technology and the related solutions from onsemi.

AMR application prospects are promising but face challenges

AMRs reduce costs, improve safety, and increase efficiency, which is why more and more industries are adopting them. According to market research, the global AMR market was valued at $8.65 billion in 2022 and is expected to grow at a compound annual growth rate (CAGR) of 18.3% from 2022 to 2028. Yet despite these promising prospects, adopting AMRs still presents many challenges.
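(Extrapolating at that rate, as a simple compounding check rather than a survey figure: $8.65 billion × 1.183^6 ≈ $23.7 billion by 2028.)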

The primary challenge in adopting AMRs is the diversity of applications and environments in which they operate. Common uses include warehouses, agricultural technology, commercial landscaping, healthcare, smart retail, security and surveillance, delivery, inventory, and picking and sorting. In all these environments, AMRs are expected to operate safely around people.

However, the complexity of these application scenarios makes an AMR's job extremely challenging. Situations that humans take for granted can be difficult for AMRs to handle. For example, imagine a delivery robot that sees a ball in the middle of its path. The robot can likely identify the ball and avoid hitting it, but is it smart enough to foresee that a child might run out to fetch the ball? There are many complex situations like this. For instance, can an AMR recognize and use a corner mirror installed on a roadside pole to observe the situation around a bend and anticipate oncoming traffic? Can an AMR understand that it shouldn't drive on freshly poured concrete? These things are commonplace for humans, but they present real challenges for AMRs.

Situations that humans grasp easily tend to be harder for robots, though the reverse can also hold: with the right sensors, an AMR may detect objects in bright sunlight more reliably than a human can. Still, identifying freshly poured concrete or spilled liquids is difficult, and edges, drop-offs, ramps, and stairs pose further challenges. There are also special cases, such as someone deliberately interfering with the AMR, which motivates designing escape maneuvers. Solving many of these challenges will require AI built on state-of-the-art large language models (LLMs) together with various types of high-performance sensors.

[Figure: AMR]

High-performance sensors for AMRs each have their own advantages and disadvantages

AMRs can use different types of sensors to perceive their environment; the sensor data feeds simultaneous localization and mapping (SLAM) while also providing distance and depth measurements. Important sensor metrics include object detection, object identification, color recognition, resolution, power consumption, size, cost, range, dynamic range, speed, and the ability to operate under various lighting and weather conditions.

Sensor types that can be used for AMRs include CMOS imaging, direct time-of-flight (dToF) and indirect time-of-flight (iToF) depth sensing, ultrasonic, radar, inductive positioning, Bluetooth® Low Energy (Bluetooth LE) technology, inertial, and others. Each sensor type has its own advantages and disadvantages.

For example, radar offers excellent range and velocity measurement in low light and adverse weather, but it cannot detect color, has higher initial costs, and is relatively large (an important consideration for AMRs). LiDAR built on high-volume CMOS silicon foundry processes has a relatively low initial cost and performs well both at night and in direct sunlight, but it is less effective at object classification. iToF depth sensors, for their part, offer excellent resolution and low-power processing.

Clearly, no single sensor type can provide all the information AMRs need to address the challenges described above. Depending on the application and environment, an AMR will need several, sometimes many, types of sensors. These sensors do not operate in isolation but work together in a process known as sensor fusion.

[Figure: Sensor fusion process]

How autonomous mobile robots achieve sensor fusion

Sensor fusion is the process of combining two or more data sources (from sensors and/or an algorithm or a model) to gain a better understanding of the system and its surroundings. Sensor fusion in AMRs is crucial as it provides better reliability, redundancy, and ultimately safety, while ensuring that the evaluation results are more consistent, accurate, and reliable.

Sensor fusion combines two functions: collecting data and interpreting it. The interpretation step requires an algorithm or a model. Sometimes the results are intended for human consumption, as in driver-assistance systems in cars; sometimes they feed further machine processing, as in facial recognition for security systems.

Sensor fusion has several advantages, starting with noise reduction: homogeneous fusion (same sensor type) averages out uncorrelated noise, while heterogeneous fusion (different sensor types) can also suppress correlated noise. By its nature, sensor fusion improves reliability through redundancy: with at least two sensors, losing one data stream degrades detection quality but does not cause total failure, since data from the other sensors remains available. Sensor fusion can also estimate states that no single sensor measures directly; for example, when an object is partly occluded from a camera's view, or when a reflective surface confuses one camera, fusion with other sensors maintains a useful level of detection performance.
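To make the noise-reduction and redundancy ideas concrete, here is a minimal Python sketch (an illustration, not onsemi code) that fuses two independent range measurements by inverse-variance weighting. The fused estimate has lower variance than either sensor alone, and if one reading is lost, the other still provides a usable, if noisier, answer. The sensor names and noise figures are illustrative assumptions.

```python
# Minimal illustration of sensor fusion by inverse-variance weighting.
# Two independent, noisy range measurements of the same obstacle are
# combined; the fused variance is smaller than either input variance.
# All names and noise figures below are illustrative, not onsemi APIs.

from typing import Optional, Tuple

def fuse_ranges(
    r_lidar: Optional[float], var_lidar: float,
    r_ultra: Optional[float], var_ultra: float,
) -> Tuple[float, float]:
    """Return (fused_range_m, fused_variance) from up to two sensors."""
    if r_lidar is None and r_ultra is None:
        raise ValueError("no sensor data available")
    if r_lidar is None:          # redundancy: fall back to the other sensor
        return r_ultra, var_ultra
    if r_ultra is None:
        return r_lidar, var_lidar
    w_l, w_u = 1.0 / var_lidar, 1.0 / var_ultra   # inverse-variance weights
    fused = (w_l * r_lidar + w_u * r_ultra) / (w_l + w_u)
    fused_var = 1.0 / (w_l + w_u)                 # always below either input
    return fused, fused_var

# Example: LiDAR reads 2.04 m (sigma = 2 cm), ultrasonic reads 1.95 m (sigma = 5 cm).
r, v = fuse_ranges(2.04, 0.02**2, 1.95, 0.05**2)
print(f"fused range: {r:.3f} m, sigma: {v**0.5*100:.1f} cm")  # ~2.028 m, ~1.9 cm
```

Note how the fused sigma (~1.9 cm) beats the better sensor's 2 cm; this is the statistical payoff of homogeneous or heterogeneous fusion that the paragraph above describes.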

Driven by these advantages and accelerating market adoption, several trends are emerging in sensor fusion: AI-powered algorithms, enhanced object detection and classification, collaborative perception across multiple sensor modalities, and environmental perception under adverse conditions. Sensor fusion can also deliver a 360-degree surround view and enable real-time sensor calibration, among other capabilities.

[Figure: Industrial automation]

onsemi provides a complete solution for sensor fusion

Sensor fusion in AMRs is poised to have a significant impact on industrial and transportation applications. As industry advances toward Industry 5.0, onsemi is committed to providing the sensors and subsystems needed to implement it effectively. onsemi's subsystem solutions are diverse, ranging from rugged high-resolution imaging systems to high-power motor control and compact, efficient battery-charging solutions, all built on decades of experience serving the automotive industry. Together, these solutions make development easier and help industrial robots remain adaptable and reliable enough to operate in the harshest environments.

Autonomous mobile robots have similar functions to autonomous vehicles and are complex designs composed of multiple subsystems, allowing robots to move, observe, and operate safely with minimal human interaction. onsemi minimizes this complexity through reliable intelligent power and sensing solutions, providing the necessary building blocks for your design.

The core of sensor fusion is the sensors. If the data from the sensors is not good, even the best algorithms will not produce high-quality results. Fortunately, onsemi offers world-class sensors and toolkits to support sensor fusion in AMR.

onsemi is a leader in smart sensing technology, offering a broad portfolio of rolling shutter and global shutter image sensors with industry-leading performance in features such as dynamic range and wake-on-motion. These sensors meet the requirements of a wide range of end applications, from wearables and consumer electronics to demanding industrial and automotive systems.

In addition to image sensors, onsemi provides silicon photomultipliers (SiPMs) for LiDAR range detection. The portfolio also includes ultrasonic sensors, inductive sensors, and microcontrollers supporting Bluetooth® Low Energy (Bluetooth LE) technology with Angle of Arrival (AoA) and Angle of Departure (AoD) for position finding.
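As a rough illustration of the geometry behind AoA position finding (a generic sketch of the principle, not the Bluetooth LE direction-finding stack or any onsemi API): a receiver with two antennas a known distance apart measures the phase difference of the incoming carrier and converts it into an arrival angle. The frequency, antenna spacing, and function names below are assumptions for the example.

```python
# Generic Angle-of-Arrival (AoA) estimate from the phase difference
# measured between two antennas; illustrates the geometry only.
import math

SPEED_OF_LIGHT = 299_792_458.0          # m/s

def aoa_from_phase(delta_phi_rad: float, freq_hz: float, spacing_m: float) -> float:
    """Arrival angle (degrees from broadside) for a two-antenna array."""
    wavelength = SPEED_OF_LIGHT / freq_hz
    # Path-length difference between antennas: d * sin(theta) = delta_phi * lambda / (2*pi)
    s = delta_phi_rad * wavelength / (2.0 * math.pi * spacing_m)
    s = max(-1.0, min(1.0, s))           # clamp against measurement noise
    return math.degrees(math.asin(s))

# Example: 2.4 GHz carrier (lambda ~ 12.5 cm), antennas lambda/2 apart,
# measured phase difference of 1.0 rad -> roughly 18.5 degrees.
print(aoa_from_phase(1.0, 2.4e9, 0.0625))
```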

A specific example is the NCV75215, an application-specific standard product (ASSP) for ultrasonic park-distance measurement. Operating with a piezoelectric ultrasonic transducer, it provides time-of-flight measurement of obstacle distance during vehicle/AMR parking. It features high sensitivity and low-noise operation, allowing detection from 0.25 m to 4.5 m on a standard 75 mm pole, with the actual minimum distance determined by the length of transducer reverberation. Under ideal conditions, with perfectly tuned and matched external circuitry, a minimum distance of 0.2 m can be achieved; the actual detection range depends on the piezoelectric transducer and the external analog components.

This device drives the ultrasonic transducer at a programmable frequency via a transformer. The received echo is amplified, converted to a digital signal, filtered, detected, and compared against a time-dependent threshold stored in internal RAM. The distance to the obstacle is determined by the time from the transmission burst to echo recognition. A built-in bidirectional I/O line handles communication with the master (ECU): the master sends commands to the NCV75215 over the I/O line, and data is reported back on the same line.
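The distance computation itself is simple: range equals the speed of sound times half the round-trip time. The following Python sketch mimics the threshold-compare step on sampled echo data to show the principle; it is not the NCV75215's firmware, and the sample rate, threshold profile, and echo data are invented for illustration.

```python
# Time-of-flight distance from an ultrasonic echo, illustrating the
# principle the NCV75215 implements: the digitized echo envelope is
# compared against a time-dependent threshold, and
# range = speed_of_sound * t_echo / 2.
# Sample rate, threshold profile, and data are invented for illustration.

SPEED_OF_SOUND = 343.0      # m/s in air at ~20 degC
SAMPLE_RATE = 100_000       # samples per second (assumed)

def first_echo_distance(envelope, threshold):
    """Return distance (m) to the first sample exceeding its threshold."""
    for i, (amp, thr) in enumerate(zip(envelope, threshold)):
        if amp > thr:
            t_echo = i / SAMPLE_RATE              # round-trip time in s
            return SPEED_OF_SOUND * t_echo / 2.0  # one-way distance
    return None                                   # no obstacle detected

# Threshold starts high to mask transducer reverberation (the blind zone
# behind the ~0.25 m minimum range), then drops to catch distant echoes.
n = 3000                                          # 30 ms listening window
threshold = [5.0 if i < 150 else 1.0 for i in range(n)]
envelope = [0.2] * n
envelope[600] = 2.5                               # echo at 6 ms round trip
print(first_echo_distance(envelope, threshold))   # ~1.03 m
```

In this toy setup the 1.5 ms blind zone corresponds to about 0.26 m of minimum range, consistent with the reverberation-limited minimum distance described above.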

Conclusion

Autonomous mobile robots (AMRs) have numerous use cases, and their adoption is accelerating, with a set of best practices emerging to support it. First, it is essential to control the environment to reduce the potential collisions AMRs may encounter; one example is designating paths for AMRs/automated guided vehicles (AGVs) in manufacturing or warehouse facilities. Second, it is crucial to use digital twins during development to simulate exact use cases, including extreme scenarios. Finally, integrating sensor fusion with intelligent sensors, algorithms, and models is vital. onsemi can provide a complete solution for sensor fusion, including high-resolution imaging systems, high-power motor control, and compact, efficient battery-charging solutions, all of which meet the varied needs of AMR applications. If you have any related needs, please contact Arrow or onsemi for further product and application information.
