Challenges for Sensor Fusion in Consumer Electronics Design


The best examples today of synergistic electronic systems, ones whose components produce a combined effect greater than the sum of their separate effects, are those built using sensor fusion. With sensor fusion, the outputs of two or more sensors are combined using algorithms, typically embodied in software, to provide data about external conditions that is better than any individual sensor could provide.

For example, the outputs from a 3-axis accelerometer, a 3-axis magnetometer, a 3-axis gyroscope, and a pressure sensor can be combined using sensor fusion to provide what is referred to as a 10-axis sensor. The performance of the combination, with its redundant measurements in the x, y, and z axes, is significantly better than that of any single 3-axis device, because the error in any one device’s measurement along a given axis can be corrected using data from the other devices.
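
To make the idea concrete, one of the simplest fusion techniques is a complementary filter, which blends a gyroscope’s fast-but-drifting angle estimate with an accelerometer’s noisy-but-drift-free one. The sketch below is a minimal, hypothetical Python example; the axis convention and the 0.98 blend factor are assumptions for illustration, not any particular vendor’s algorithm.

```python
import math

def fuse_pitch(prev_pitch_rad, gyro_rate_y_rad_s, accel_x_g, accel_z_g, dt_s, alpha=0.98):
    """Complementary filter for pitch: the gyro term tracks fast motion but
    drifts over time; the accelerometer term is noisy but references gravity,
    so it slowly pulls the estimate back. alpha sets how much to trust the gyro."""
    pitch_from_gyro = prev_pitch_rad + gyro_rate_y_rad_s * dt_s   # integrate angular rate
    pitch_from_accel = math.atan2(-accel_x_g, accel_z_g)          # gravity-referenced pitch
    return alpha * pitch_from_gyro + (1.0 - alpha) * pitch_from_accel
```

A full 10-axis system applies the same idea across all axes, typically with a Kalman-style filter, and uses the pressure sensor to stabilize the altitude estimate.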

The concept of sensor fusion is by no means new; military organizations have employed sensor fusion in weaponry since the last millennium. What is new is the fact that, with the widespread use of multiple sensors in such consumer electronics as cell phones, wearables and IoT devices, sensor fusion can allow them to operate smarter, providing users with better, more reliable information. While the application of sensor fusion in this latter realm is in its infancy, it is already getting widespread support from sensor suppliers such as Bosch, Freescale, PNI Sensors and STMicroelectronics.

George Hsu, PNI Sensors’ Founder, Board Chairman, CEO and CTO, is a pioneer in the field of sensor fusion and navigation sensors, having invented several magnetic sensor breakthroughs over the last 20 years, including the Magneto-Inductive technology that is the core of today’s electronic compassing. His experience provides unique insight into the state of sensor fusion for consumer electronics today. In a recent private discussion, we spoke about one of the main things people would like to use sensor fusion for: precise indoor navigation, in places like shopping malls, that could, say, direct consumers to a flash sale.

There has been a lot of interest in this application and, to help realize it, there are plenty of small, inexpensive sensors available: accelerometers, gyroscopes and magnetometers, for instance, as well as pressure sensors to determine altitude barometrically. Several different types of sensors are even available combined in a single package. Yet as Hsu points out, “Dead reckoning has not been significantly achieved. There are no user-facing outcomes yet, although there are many in the algorithm race.”
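
The position update at the heart of pedestrian dead reckoning is simple arithmetic; the difficulty Hsu describes lies in keeping the step detection and heading estimate accurate. Here is a minimal sketch, assuming a fixed stride length and a heading already produced by the fusion layer (both assumptions, since real systems adapt stride per user and per gait):

```python
import math

def dead_reckon_step(x_m, y_m, heading_rad, stride_m=0.7):
    """Advance a 2-D position estimate (metres) by one detected step.
    heading_rad comes from the fused magnetometer/gyro estimate;
    stride_m is a placeholder a real system would calibrate per user."""
    return (x_m + stride_m * math.sin(heading_rad),
            y_m + stride_m * math.cos(heading_rad))
```

Every error in the step count or heading accumulates with each update, which is why dead reckoning rises or falls with algorithm quality rather than with sensor count.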

The challenge, as Hsu sees it, is not a lack of hardware to make this happen, but getting the algorithm right. And that means not only combining sensor outputs for maximum location accuracy, but also doing so very efficiently. Efficiency matters because the time spent computing results consumes power, and wasted power is anathema for portable consumer electronics.

Even some currently popular wearables haven’t gotten it right yet. Experiments by PNI Sensors with two popular wellness/exercise monitors, the Fitbit and Jawbone UP step counters, reveal significant discrepancies in their results for the number of steps taken, the distance covered and the total calories burned (Table 1).


Table 1: Comparison of the performance of two accelerometer-only-based step counters. (Source: PNI Sensors)

Part of the reason for the discrepancy was that both devices used only a single accelerometer to determine steps, based on a threshold trigger. This type of system, while inexpensive, is prone to false readings. But it is essentially the algorithms that determine how wearables capture data, and smart algorithms can achieve higher accuracy. PNI developed an accelerometer-only step-counting algorithm that optimizes both power and performance by applying both biomechanical and heuristics-based filtering to threshold-crossing features, extracted over a four-deep step buffer, to accurately identify false or missing steps. The accelerometer-only algorithm proved to be more than 98 percent accurate while consuming less than 60 µA.
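
PNI’s actual features and filters are not public, but the general shape of a threshold-crossing step counter with a small candidate buffer can be sketched as follows. The threshold, cadence bounds and buffer handling here are illustrative assumptions, not the PNI algorithm.

```python
from collections import deque

STEP_THRESHOLD_G = 1.15      # illustrative trigger level on acceleration magnitude
MIN_STEP_INTERVAL_S = 0.25   # biomechanical bound: steps can't come faster than this
MAX_STEP_INTERVAL_S = 2.0    # ...or much slower, while actually walking

class StepCounter:
    """Threshold-crossing step counter with a small buffer of candidate steps.
    Steps are counted only once the buffer shows a consistent walking cadence,
    which rejects isolated bumps, taps and other false triggers."""

    def __init__(self, buffer_depth=4):
        self.candidates = deque(maxlen=buffer_depth)  # timestamps of recent crossings
        self.prev_mag_g = 0.0
        self.count = 0

    def update(self, accel_mag_g, t_s):
        # Detect a rising edge through the threshold, not just "above threshold".
        rising = self.prev_mag_g < STEP_THRESHOLD_G <= accel_mag_g
        self.prev_mag_g = accel_mag_g
        if not rising:
            return self.count
        # Reject crossings that arrive implausibly soon after the previous one.
        if self.candidates and t_s - self.candidates[-1] < MIN_STEP_INTERVAL_S:
            return self.count
        self.candidates.append(t_s)
        # Count a step only when the whole buffer looks like walking cadence.
        if len(self.candidates) == self.candidates.maxlen:
            times = list(self.candidates)
            intervals = [b - a for a, b in zip(times, times[1:])]
            if all(MIN_STEP_INTERVAL_S <= i <= MAX_STEP_INTERVAL_S for i in intervals):
                self.count += 1
        return self.count
```

A production algorithm would add the biomechanical and heuristic filtering described above and decide whether to retroactively credit buffered candidates once a cadence is confirmed; the sketch simply withholds counting until the buffer looks like walking.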

The experiment underscores how important it is for designers to get the algorithm right to achieve accurate performance. But Hsu notes that “Power plays a key role as you add more sensors.” The designer has to determine which data must be available all the time, which sensors can be turned on and off, and how much processing should be done at the sensor versus by the system processor. These trade-offs are the crux of efficient, accurate performance.
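
One common way to make those trade-offs concrete is to gate the power-hungry sensors behind a cheap, always-on one. The sketch below assumes hypothetical driver objects with read()/enable()/disable() methods and a fuse() callback, since the actual HAL depends on the platform; it illustrates the duty-cycling idea, not any vendor’s API.

```python
import time

MOTION_THRESHOLD_G = 0.05   # illustrative: deviation from 1 g that counts as motion

def power_aware_loop(accel, gyro, fuse, active_period_s=0.02, idle_period_s=0.5):
    """Keep only the low-power accelerometer always on; enable the gyro
    (and full-rate fusion) only while motion is detected."""
    gyro_on = False
    while True:
        a = accel.read()                                   # always-on, low-power sensor
        moving = abs(a.magnitude_g - 1.0) > MOTION_THRESHOLD_G
        if moving and not gyro_on:
            gyro.enable()                                  # pay the gyro's power cost only when needed
            gyro_on = True
        elif not moving and gyro_on:
            gyro.disable()
            gyro_on = False
        if gyro_on:
            fuse(a, gyro.read())                           # full-rate fusion while moving
            time.sleep(active_period_s)
        else:
            time.sleep(idle_period_s)                      # long sleeps while stationary
```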

Hence designers will have to become intensely familiar with the application and have a deep understanding of the variables in order to create designs that perform the way consumers want them to. To borrow an example from another field, it’s essential that sensor fusion devices not perform like compact fluorescent bulbs, which so disappointed consumers that they were extremely wary when a truly better lighting solution, the LED bulb, came along.

Another key aspect of design will be the need for code modularity. It’s likely that over the course of a project a designer may decide to switch components, and that designs will evolve over time into different versions of a product. If a designer is not going to start back at square one each time, the code embodying the algorithm will need to be reusable, and capable of playing well with other algorithmic code when sensors are fused into new systems. This not only reduces development cost, but also speeds products to market.
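
In practice that usually means writing the fusion algorithm against a narrow sensor interface rather than against any particular part’s registers. A minimal sketch of what such an interface might look like (the names are illustrative, not any standard API):

```python
from typing import Protocol, Tuple

class Vector3Sensor(Protocol):
    """Narrow driver interface the algorithm code depends on. Swapping one
    vendor's accelerometer or magnetometer for another then means writing a
    new thin driver, not touching the fusion code."""
    def read(self) -> Tuple[float, float, float]: ...
    def set_rate_hz(self, rate: float) -> None: ...

def gravity_direction(accel: Vector3Sensor) -> Tuple[float, float, float]:
    """Algorithm code that sees only the interface, never a register map."""
    x, y, z = accel.read()
    norm = (x * x + y * y + z * z) ** 0.5 or 1.0
    return (x / norm, y / norm, z / norm)
```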

Another leader in sensor fusion, Freescale Semiconductor, has made its entire sensor fusion library open source, so that engineers don’t have to “reinvent the wheel” when designing multisensor projects. Freescale worked with the MEMS Industry Group (MIG) to form the Accelerated Innovation Community (AIC) to facilitate sharing and adoption of algorithms for sensor fusion and analytics, and seeded the effort by contributing its sensor fusion library, documentation and source files for Windows- and Android-based visualization tools. It has been joined in the effort by Analog Devices and PNI, among others.

Right now, Hsu observes, the desire for sensor systems able to detect and interpret their operating environment, so-called context detection, is high, and new requirements for sensor fusion are arriving daily.

“It’s a frontier mentality,” he says, one that’s constantly shifting and broadening the definition of what sensor systems are and can do. To survive on the frontier, you not only have to draw fast, but your aim has to be dead on.
