When a colleague of mine recently asked me “What is sensor
fusion?” I had to stop and think. As Justice Potter Stewart once
said, “I know it when I see it.” But as an engineer dealing with
this topic every day, I should be able to do better. Eventually I came up with
the following:
Sensor fusion encompasses a variety of techniques which can:

- Trade off strengths and weaknesses of the various sensors to compute
  something more than can be calculated using the individual components;
- Improve the quality and noise level of computed results by taking advantage
  of:
  - Known data redundancies between sensors
  - Knowledge of system transfer functions, dynamics and/or kinematics
Good lord! Sounds like something out of one of my textbooks. It’s more
fun to look at it by example.
Accelerometers return a measured quantity which includes inertial acceleration
as well as gravity. When not moving, they make great tilt meters. But they
can’t detect rotation about the gravity vector. Magnetometers have a
similar problem detecting rotation about the earth’s magnetic field.
But combine the two, and you have a case where each complements the other to
achieve something that neither can do alone.
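Here's a minimal sketch of that tilt-compensated compass idea in Python. The function name and the axis conventions (x forward, y right, z down, with the accelerometer reading +1 g on z when level and at rest) are my assumptions for illustration, and the magnetometer is assumed to be calibrated; sign conventions vary from device to device:

```python
import math

def tilt_compensated_heading(accel, mag):
    """Heading in degrees from accelerometer + magnetometer readings.

    Assumes a body frame with x forward, y right, z down, an accelerometer
    reading +1 g on z when level and at rest, and a calibrated magnetometer.
    """
    ax, ay, az = accel
    mx, my, mz = mag

    # Tilt from the gravity vector: the accelerometer's strength at rest.
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))

    # De-rotate the magnetic reading into the horizontal plane, then take
    # the heading: the magnetometer fills the accelerometer's blind spot
    # (rotation about the gravity vector).
    sr, cr = math.sin(roll), math.cos(roll)
    sp, cp = math.sin(pitch), math.cos(pitch)
    bx = mx * cp + my * sp * sr + mz * sp * cr
    by = my * cr - mz * sr
    return math.degrees(math.atan2(-by, bx)) % 360.0
```

Note that neither sensor alone can produce that heading: the accelerometer supplies the tilt, and the magnetometer supplies the rotation about vertical.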
MEMS gyros measure angular rotation, and can typically respond to changes in
rotation quickly. However, they often exhibit considerable offset and drift
over time. Magnetometers provide a way to place bounds on those offset
and drift terms. And conversely, the gyro data is useful as a second check
against magnetic interference.
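A simple way to see that pairing is a complementary filter: trust the gyro over short intervals, and let the magnetometer slowly pull the heading back to bound the offset and drift. Here's a minimal sketch, where the function name, the 0.98 weight and the interference threshold are illustrative assumptions rather than values from any particular product:

```python
def fuse_heading(prev_heading, gyro_rate_z, mag_heading, dt, alpha=0.98):
    """One complementary-filter step for heading, in degrees.

    gyro_rate_z is the rotation rate about the vertical axis in deg/s;
    mag_heading comes from a tilt-compensated magnetometer;
    alpha weights the gyro path (higher = more trust in the gyro).
    """
    # Propagate with the gyro: fast response, but offset and drift accumulate.
    gyro_heading = (prev_heading + gyro_rate_z * dt) % 360.0

    # Innovation: magnetometer minus prediction, wrapped to [-180, 180).
    error = (mag_heading - gyro_heading + 180.0) % 360.0 - 180.0

    # A huge disagreement is more likely magnetic interference than real
    # rotation, so skip the correction; the gyro acts as a second check.
    if abs(error) > 45.0:  # threshold is an illustrative assumption
        return gyro_heading

    # Otherwise nudge toward the magnetometer to bound gyro drift.
    return (gyro_heading + (1.0 - alpha) * error) % 360.0
```

Each sensor covers for the other: the magnetometer keeps the gyro honest over minutes, and the gyro keeps the magnetometer honest near a refrigerator door.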
You can see techniques like these in use in the great variety of iPhone and
Android sensor applications which can be downloaded to your phone today. And
sometimes, you can see cases where the developer SHOULD have used techniques
like these!
One of the sensor fusion applications I love to show people is the “3D
Compass” application that I’ve downloaded to my Xoom from the
Android Market. This augmented reality application fuses magnetometer,
accelerometer and GPS information to show you not only where you are, but what
your current perspective is. The application screen provides a live camera
view, overlaid with a virtual compass and map oriented the same way you are
facing and showing your current location. Sweet!
I hope we see more creative applications like this brought to market in coming
months.
In the next couple of years, we will see applications built on algorithms that
model the behavior of the system under study, including the statistical noise
behavior of the sensors themselves. By comparing measured
quantities with predicted ones, it is often possible to tease signals out of
what would otherwise look like nothing more than noise.
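That predict-and-compare loop is the core of Kalman filtering, covered at length in the Simon and Brown & Hwang texts listed below. Here's a deliberately minimal scalar sketch in Python; the constant-signal model and the process and measurement noise variances (q and r) are made-up illustrative assumptions:

```python
import random

def scalar_kalman(measurements, q=1e-4, r=0.25):
    """Minimal scalar Kalman filter for a nearly constant quantity.

    q: process noise variance (how much the true value may wander)
    r: measurement noise variance (how noisy the sensor is)
    """
    x, p = 0.0, 1.0  # initial estimate and its variance
    estimates = []
    for z in measurements:
        p += q            # predict: uncertainty grows between measurements
        k = p / (p + r)   # Kalman gain: relative trust in the measurement
        x += k * (z - x)  # update via the innovation (measured - predicted)
        p *= 1.0 - k
        estimates.append(x)
    return estimates

# A constant signal buried in noise five times its size emerges cleanly.
truth = 0.1
noisy = [truth + random.gauss(0.0, 0.5) for _ in range(500)]
print(f"final estimate: {scalar_kalman(noisy)[-1]:+.3f} (truth {truth:+.3f})")
```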
Low-cost MEMS and solid-state sensors are enabling consumer products and
applications that were cost-prohibitive a few years ago. We are fortunate
that most of the sensor fusion problems we are dealing with at the micro level
were addressed at the macro level by NASA and the aeronautics business 30 or
more years ago. Since joining the sensor design team,
I’ve had to brush up on my math and control theory and invest in a
number of good textbooks. If you share my passion for the topic, you could do
worse than to obtain some of those listed in the references below. And if you
have a better definition for sensor fusion, please share by responding to this
posting.
References

- Optimal State Estimation: Kalman, H-Infinity, and Nonlinear Approaches,
  Dan Simon, John Wiley & Sons, 2006
- Introduction to Random Signals and Applied Kalman Filtering with Matlab
  Exercises and Solutions, 3rd edition, Robert Grover Brown and Patrick Y.C.
  Hwang, John Wiley & Sons, 1997
- Inertial Navigation Systems Analysis, Kenneth R. Britting, Artech House,
  2010 (a classic text, first published in 1971)
- Strapdown Inertial Navigation Technology, 2nd edition, D.H. Titterton and
  J.L. Weston, The Institution of Electrical Engineers, 2004
- Quaternions and Rotation Sequences, Jack B. Kuipers, Princeton University
  Press, 1999