Sensor Fusion on Android Devices: A Revolution in Motion Processing

By: GoogleTechTalks

Uploaded on 08/09/2010

Google Tech Talk
August 2, 2010


Presented by David Sachs.

Gyroscopes, accelerometers, and compasses are increasingly prevalent in mainstream consumer electronics. Applications of these sensors include user interface, augmented reality, gaming, image stabilization, and navigation. This talk will demonstrate how all three sensor types work separately and in conjunction on a modified Android handset running a modified sensor API, then explain how algorithms are used to enable a multitude of applications.

Application developers who wish to make sense of rotational motion must master Euler angles, rotation matrices, and quaternions. Under the hood, sensor fusion algorithms must be used in order to create responsive, accurate, and low-noise descriptions of motion. Reducing sensing errors involves compensating for temperature changes, magnetic disturbances, and sharp accelerations. Some of these algorithms must run at a very high rate and with very precise timing, which makes them difficult to implement within low-power real-time operating systems. Within Android specifically, this involves modifying the sensor manager, introducing new APIs, and partitioning motion processing tasks.

David Sachs began developing motion processing systems as a graduate student at the MIT Media Lab. His research there led him to InvenSense, where he continues this work with MEMS inertial sensors used in products such as the Nintendo Wii Motion Plus. David's designs incorporate gyroscopes, accelerometers, and compasses in various combinations and contexts including handset user interfaces, image stabilizers, navigation systems, game controllers, novel Braille displays, and musical instruments.

Comments (6):

By anonymous    2017-09-20

Well, +1 to you for even knowing what a Kalman filter is. If you'd like, I'll edit this post and give you the code I wrote a couple years ago to do what you're trying to do.

But first, I'll tell you why you don't need it.

Modern implementations of the Android sensor stack use Sensor Fusion, as Stan mentioned above. This just means that all of the available data -- accel, mag, gyro -- is collected together in one algorithm, and then all the outputs are read back out in the form of Android sensors.
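To see what that looks like on a particular device, you can enumerate everything the sensor stack exposes, physical and virtual alike. A quick sketch of my own (dumpSensors is just an illustrative name):

    // Dump every sensor the device exposes, physical and virtual.
    // Assumes android.hardware.Sensor/SensorManager and android.util.Log
    // are imported, and that this runs inside an Activity.
    void dumpSensors() {
        SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
        for (Sensor s : sm.getSensorList(Sensor.TYPE_ALL)) {
            Log.d("SensorDump", s.getName() + " (type " + s.getType() + ")");
        }
    }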

Edit: I just stumbled on this superb Google Tech Talk on the subject: Sensor Fusion on Android Devices: A Revolution in Motion Processing. Well worth the 45 minutes to watch it if you're interested in the topic.

In essence, Sensor Fusion is a black box. I've looked into the source code of the Android implementation, and it's a big Kalman filter written in C++. Some pretty good code in there, and far more sophisticated than any filter I ever wrote, and probably more sophisticated than what you're writing. Remember, these guys are doing this for a living.

I also know that at least one chipset manufacturer has their own sensor fusion implementation. The manufacturer of the device then chooses between the Android and the vendor implementation based on their own criteria.

Finally, as Stan mentioned above, InvenSense has their own sensor fusion implementation at the chip level.

Anyway, what it all boils down to is that the built-in sensor fusion in your device is likely to be superior to anything you or I could cobble together. So what you really want to do is to access that.

In Android, there are both physical and virtual sensors. The virtual sensors are the ones that are synthesized from the available physical sensors. The best-known example is TYPE_ORIENTATION, which takes accelerometer and magnetometer input and produces roll/pitch/heading output. (By the way, you should not use this sensor; it has been deprecated since API level 8, and its Euler-angle output suffers from gimbal lock, among other limitations.)

But the important thing is that newer versions of Android contain these two new virtual sensors:

TYPE_GRAVITY is the accelerometer input with the effect of motion filtered out.

TYPE_LINEAR_ACCELERATION is the accelerometer input with the gravity component filtered out.

These two virtual sensors are synthesized through a combination of accelerometer input and gyro input.
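To make the relationship concrete: per axis, the raw accelerometer reading is approximately the sum of the two. A hedged sketch of my own (the names are illustrative, and the real TYPE_LINEAR_ACCELERATION also folds in gyro data, so plain subtraction is only an approximation):

    // Rough per-axis relationship, both arrays in m/s^2:
    //   raw accelerometer ~= gravity + linear acceleration
    float[] linearEstimate(float[] accel, float[] gravity) {
        float[] linear = new float[3];
        for (int i = 0; i < 3; i++) {
            linear[i] = accel[i] - gravity[i];
        }
        return linear;
    }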

Another notable sensor is TYPE_ROTATION_VECTOR which is a Quaternion synthesized from accelerometer, magnetometer, and gyro. It represents the full 3-d orientation of the device with the effects of linear acceleration filtered out.

However, Quaternions are a little bit abstract for most people, and since you're likely working with 3-d transformations anyway, your best approach is to combine TYPE_GRAVITY and TYPE_MAGNETIC_FIELD via SensorManager.getRotationMatrix().

One more point: if you're working with a device running an older version of Android, you need to detect that you're not receiving TYPE_GRAVITY events and use TYPE_ACCELEROMETER instead. Theoretically, this would be a place to use your own Kalman filter, but if your device doesn't have sensor fusion built in, it probably doesn't have gyros either.

Anyway, here's some sample code to show how I do it.

  // Assumes API 9 (Android 2.3) or above -- that's where TYPE_GRAVITY
  // was introduced. On older devices the code below falls back to the
  // raw accelerometer automatically.

  import android.app.Activity;
  import android.hardware.Sensor;
  import android.hardware.SensorEvent;
  import android.hardware.SensorEventListener;
  import android.hardware.SensorManager;
  import android.os.Bundle;
  import android.util.Log;

  class Foo extends Activity implements SensorEventListener {

    static final String TAG = "Foo";
    static final float DEG = (float) (180.0 / Math.PI);  // radians to degrees

    SensorManager sensorManager;
    float[] gData = new float[3];           // Gravity or accelerometer
    float[] mData = new float[3];           // Magnetometer
    float[] orientation = new float[3];
    float[] Rmat = new float[9];
    float[] R2 = new float[9];
    float[] Imat = new float[9];
    boolean haveGrav = false;
    boolean haveAccel = false;
    boolean haveMag = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Get the sensor manager from system services
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Register our listeners
        Sensor gsensor = sensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY);
        Sensor asensor = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        Sensor msensor = sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
        sensorManager.registerListener(this, gsensor, SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(this, asensor, SensorManager.SENSOR_DELAY_GAME);
        sensorManager.registerListener(this, msensor, SensorManager.SENSOR_DELAY_GAME);
    }

    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
          case Sensor.TYPE_GRAVITY:
            gData[0] = event.values[0];
            gData[1] = event.values[1];
            gData[2] = event.values[2];
            haveGrav = true;
            break;
          case Sensor.TYPE_ACCELEROMETER:
            if (haveGrav) break;    // don't need it, we have better
            gData[0] = event.values[0];
            gData[1] = event.values[1];
            gData[2] = event.values[2];
            haveAccel = true;
            break;
          case Sensor.TYPE_MAGNETIC_FIELD:
            mData[0] = event.values[0];
            mData[1] = event.values[1];
            mData[2] = event.values[2];
            haveMag = true;
            break;
        }

        if ((haveGrav || haveAccel) && haveMag) {
            SensorManager.getRotationMatrix(Rmat, Imat, gData, mData);
            // Remap for a landscape (90-degree) screen rotation.
            SensorManager.remapCoordinateSystem(Rmat,
                    SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, R2);
            // Orientation isn't as useful as a rotation matrix, but
            // we'll show it here anyway.
            SensorManager.getOrientation(R2, orientation);
            float incl = SensorManager.getInclination(Imat);
            Log.d(TAG, "yaw/heading: " + (int) (orientation[0] * DEG));
            Log.d(TAG, "pitch: " + (int) (orientation[1] * DEG));
            Log.d(TAG, "roll: " + (int) (orientation[2] * DEG));
            Log.d(TAG, "inclination: " + (int) (incl * DEG));
        }
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Required by SensorEventListener; nothing to do here.
    }
  }

Hmmm; if you happen to have a Quaternion library handy, it's probably simpler just to receive TYPE_ROTATION_VECTOR and convert that to an array.
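For reference, a minimal sketch of that alternative, assuming API 9 or above (where TYPE_ROTATION_VECTOR and these SensorManager helpers were introduced); onRotationVector is an illustrative name for code run on each rotation-vector event:

    // Convert a rotation-vector event to a quaternion and a rotation
    // matrix using the stock SensorManager helpers -- no extra library
    // needed.
    void onRotationVector(SensorEvent event) {
        float[] quat = new float[4];      // quaternion as [w, x, y, z]
        float[] rot = new float[9];       // 3x3 rotation matrix, row-major
        SensorManager.getQuaternionFromVector(quat, event.values);
        SensorManager.getRotationMatrixFromVector(rot, event.values);
        // 'rot' can be passed straight to SensorManager.getOrientation()
        // or used directly as a 3-d transform.
    }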

Original Thread

By anonymous    2017-09-20

A compass will give you what you want, but it is very noisy, for two reasons. First, much of that noise is real signal: we live in a magnetically noisy environment, and a compass picks up everything magnetic around it. Second, unlike gyroscope output, the compass signal is not integrated, so it gets no benefit from the smoothing that integration provides. So try combining your compass with gyroscope data. This video will help you a great deal in using these sensors.

In more detail, you can bring accelerometers into the mix as well. In summary: gyroscopes provide orientation, accelerometers provide a correction due to gravity, and compasses provide a correction due to magnetic north. A sketch of the gyro/accelerometer part follows below.
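To make that summary concrete, here is a minimal one-axis complementary-filter sketch (my own illustration, not from the video): integrate the gyro for fast response, and let the accelerometer's gravity estimate slowly pull the angle back to correct drift. The 0.98 weight and all names are illustrative.

    // One-axis complementary filter.
    //   angle:      current estimate, radians
    //   gyroRate:   angular rate from TYPE_GYROSCOPE, rad/s
    //   accelAngle: tilt angle derived from gravity, radians
    //   dt:         time since the previous sample, seconds
    float fuse(float angle, float gyroRate, float accelAngle, float dt) {
        final float ALPHA = 0.98f;                // trust the gyro short-term
        return ALPHA * (angle + gyroRate * dt)    // integrate the gyro
             + (1 - ALPHA) * accelAngle;          // let gravity correct drift
    }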

Original Thread

By anonymous    2017-09-20

This is a common problem with yaw, pitch and roll. You cannot get rid of it as long as you are using yaw, pitch and roll (Euler angles). This video explains why.

I use rotation matrices instead of Euler angles in my motion sensing application. For an introduction to rotation matrices I recommend:

Direction Cosine Matrix IMU: Theory

Rotation matrices work like a charm.

Quaternions are also very popular and said to be the most stable.
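As a rough sketch of why rotation matrices sidestep the problem (my own illustration, not from the linked answer): orientation is accumulated by matrix multiplication, and no angles are extracted along the way, so there is no singularity at pitch +/-90 degrees.

    // Illustrative DCM update: fold a small gyro rotation into the current
    // orientation. R is the 3x3 orientation (row-major, length 9), w the
    // gyro rates in rad/s, dt the sample interval in seconds.
    void integrate(float[] R, float[] w, float dt) {
        float x = w[0] * dt, y = w[1] * dt, z = w[2] * dt;
        // Small-angle rotation: dR ~= I + skew(w * dt)
        float[] dR = {
             1, -z,  y,
             z,  1, -x,
            -y,  x,  1
        };
        float[] out = new float[9];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    out[3*i + j] += R[3*i + k] * dR[3*k + j];
        System.arraycopy(out, 0, R, 0, 9);
        // In practice R must be re-orthonormalized periodically; the DCM
        // paper linked above covers this.
    }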

[This answer was copied from here.]

Original Thread
