Android Sensor Fusion Tutorial

While working on my master's thesis, I gained some experience with sensors in Android devices and I thought I'd share it with other Android developers stumbling over my blog. In my work I was developing a head-tracking component for a prototype system. Since it had to adapt audio output to the orientation of the user's head, it needed to respond quickly and be accurate at the same time.

I used my Samsung Galaxy S2 and decided to use its gyroscope in conjunction with the accelerometer and the magnetic field sensor in order to measure the user's head rotations both quickly and accurately. To achieve this I implemented a complementary filter to get rid of the gyro drift and the signal noise of the accelerometer and magnetometer. The following tutorial describes in detail how it's done.

There are already several tutorials on how to get sensor data from the Android API, so I'll skip the details on Android sensor basics and focus on the sensor fusion algorithm. The Android API Reference is also a very helpful entry point regarding the acquisition of sensor data. This tutorial is based on Android API level 10 (platform version 2.3.3), by the way.

This article is divided into two parts. The first part covers the theoretical background of a complementary filter for sensor signals as described by Shane Colton here. The second part describes the implementation in the Java programming language. Everybody who thinks the theory is boring and wants to start programming right away can skip directly to the second part. The first part is interesting for people who develop on platforms other than Android, iOS for example, and want to get better results out of the sensors of their devices.

Update (March 22, 2012):
I’ve created a small Android project which contains the whole runnable code from this tutorial. You can download it here:
SensorFusion1.zip

Update (April 4, 2012):
Added a small bugfix in the example's GUI code.

Update (July 9, 2012):
Added a bugfix regarding angle transitions between 179° <–> -179°. Special thanks to J.W. Alexandar Qiu, who pointed it out and published the solution!

Update (September 25, 2012):
Published the code under the MIT License (license note added in the code), which allows you to do pretty much everything you want with it. No need to ask me first ;)

Sensor Fusion via Complementary Filter

Before we start programming, I want to explain briefly how our sensor fusion approach works. The common way to get the attitude of an Android device is to use the SensorManager.getOrientation() method to get the three orientation angles. These angles are based on the accelerometer and magnetometer output. In simple terms, the accelerometer provides the gravity vector (the vector pointing towards the centre of the earth) and the magnetometer works as a compass. The information from both sensors suffices to calculate the device's orientation. However, both sensor outputs are inaccurate, especially the output from the magnetic field sensor, which includes a lot of noise.
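For readers who want to see what this step looks like in code: the Android SensorManager already provides helper methods for it. The following fragment is only a sketch (the field and method names are my own; the complete listener code is contained in the downloadable project), assuming accel and magnet already hold the latest accelerometer and magnetometer readings:

// sketch: accelerometer/magnetometer based orientation
// requires android.hardware.SensorManager
private float[] accel = new float[3];              // latest accelerometer values
private float[] magnet = new float[3];             // latest magnetometer values
private float[] rotationMatrix = new float[9];
private float[] accMagOrientation = new float[3];  // azimuth, pitch, roll in radians

public void calculateAccMagOrientation() {
    // getRotationMatrix() returns false if the sensor data is unusable
    // (e.g. the device is in free fall)
    if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
        SensorManager.getOrientation(rotationMatrix, accMagOrientation);
    }
}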

The gyroscope in the device is far more accurate and has a very short response time. Its downside is the dreaded gyro drift. The gyro provides the angular rotation speeds for all three axes. To get the actual orientation, those speed values need to be integrated over time. This is done by multiplying the angular speeds by the time interval between the last and the current sensor output. This yields a rotation increment. The sum of all rotation increments yields the absolute orientation of the device. During this process small errors are introduced in each iteration. These small errors add up over time, resulting in a constant slow rotation of the calculated orientation: the gyro drift.
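As a rough illustration of this integration step, a per-axis summation could look like the sketch below (names are illustrative; event.timestamp is given in nanoseconds, hence the conversion factor):

// sketch: integrating the gyroscope's angular speeds into orientation angles
// requires android.hardware.SensorEvent
private static final float NS2S = 1.0f / 1000000000.0f;  // nanoseconds to seconds
private long lastTimestamp = 0;
private float[] gyroOrientation = new float[3];          // summed-up rotation increments

public void integrateGyro(SensorEvent event) {
    if (lastTimestamp != 0) {
        final float dT = (event.timestamp - lastTimestamp) * NS2S;
        for (int i = 0; i < 3; i++) {
            // rotation increment = angular speed * time interval
            gyroOrientation[i] += event.values[i] * dT;
        }
    }
    lastTimestamp = event.timestamp;
}

This simple per-axis summation is enough to show where the drift comes from; a more careful implementation would accumulate the increments as rotation matrices or quaternions so that combined rotations around several axes are handled correctly.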

To avoid both gyro drift and noisy orientation, the gyroscope output is applied only for orientation changes in short time intervals, while the magnetometer/accelerometer data is used as support information over long periods of time. This is equivalent to low-pass filtering of the accelerometer and magnetic field sensor signals and high-pass filtering of the gyroscope signals. The overall sensor fusion and filtering looks like this:

So what exactly does high-pass and low-pass filtering of the sensor data mean? The sensors provide their data at (more or less) regular time intervals, so their values can be shown as signals in a graph with time as the x-axis, similar to an audio signal. Low-pass filtering the noisy accelerometer/magnetometer signal (accMagOrientation in the above figure) yields orientation angles averaged over time within a constant time window.

Later in the implementation, this is accomplished by slowly introducing new values from the accelerometer/magnetometer to the absolute orientation:

// low-pass filtering: every time a new sensor value is available
// it is weighted with a factor and added to the absolute orientation
accMagOrientation = (1 - factor) * accMagOrientation + factor * newAccMagValue;

The high-pass filtering of the integrated gyroscope data is done by replacing the filtered high-frequency component from accMagOrientation with the corresponding gyroscope orientation values:

fusedOrientation =
    (1 - factor) * newGyroValue    // high-frequency component
    + factor * newAccMagValue;     // low-frequency component

In fact, this is already our finished complementary filter.
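Written out as a small Java method, the filter might look like the following sketch (the factor of 0.02, i.e. a gyro weight of 0.98, is just an assumed example value, and the variables come from the sketches above):

// sketch: complementary filter combining gyro and accelerometer/magnetometer orientation
private static final float FACTOR = 0.02f;           // assumed weighting, tune as needed
private float[] fusedOrientation = new float[3];

public void calculateFusedOrientation() {
    for (int i = 0; i < 3; i++) {
        fusedOrientation[i] =
                (1.0f - FACTOR) * gyroOrientation[i]  // high-frequency component
                + FACTOR * accMagOrientation[i];      // low-frequency component
    }
}

Note that this naive per-angle blending still needs special handling for transitions around ±180°, which is exactly the fix mentioned in the update from July 9, 2012.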

Assuming that the device is turned 90° in one direction and after a short time turned back to its initial position, the intermediate signals in the filtering process would look something like this:

Notice the gyro drift in the integrated gyroscope signal. It results from the small irregularities in the original angular speed. Those little deviations add up during the integration and cause an additional undesirable slow rotation of the gyroscope-based orientation.

172 thoughts on “Android Sensor Fusion Tutorial”

  1. Sampreeti, 2014-12-27 8.36 am

    Hello!
    I found your tutorial very interesting. I am doing research which requires me to record limb movements. I am planning to do it using an accelerometer and a gyroscope. Is there a way I can save the recorded data along with the date and time of day? (I need the time at which every movement takes place.)

    Thanks,
    Sampreeti

    1. admin, 2015-01-03 12.03 am

      Hi Sampreeti,
      you can record any data produced by the sensors. You simply have to provide the required memory, and after each measurement you save the sensor data alongside the current timestamp. But keep in mind that the sensor fusion described in this tutorial only returns the orientation of the device, not its linear movement. However, if you had several sensors (or sensor sets) and the distance between them, you could calculate the movement of a specific limb.
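      A minimal sketch of such a timestamped recording (the CSV format and the logWriter field are purely illustrative and assumed to be set up elsewhere, e.g. in onCreate()):

      // sketch: appending each sensor reading together with a timestamp to a CSV file
      // requires java.io.FileWriter, java.io.IOException, android.util.Log
      private FileWriter logWriter;

      @Override
      public void onSensorChanged(SensorEvent event) {
          try {
              logWriter.write(System.currentTimeMillis() + ";" + event.values[0] + ";"
                      + event.values[1] + ";" + event.values[2] + "\n");
          } catch (IOException e) {
              Log.e("SensorLogger", "could not write sensor sample", e);
          }
      }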

      The practicality of MEMS sensors (such as common accelerometers or gyroscopes in mobile devices) depends heavily on the accuracy of the data you want to record. Maybe you don’t even require any sensor fusion and an accelerometer would suffice. However, if you require high data accuracy like in a motion capturing system, I doubt that MEMS sensors will fit your needs. In such a case I would recommend a computer vision approach.
      Good luck with your research,
      Paul

  2. Saurabh Gupta, 2015-01-22 2.17 pm

    Hi
    Using this approach, can I find the magnetic heading of the device? Will it be more accurate?
    My understanding is that the azimuth, which is orientation[0], will be the magnetic heading?

    1. admin, 2015-02-18 12.26 pm

      The magnetometer is used for azimuth stabilisation. This sensor provides the magnetic heading. However, I'm not sure whether azimuth (or orientation[0]) = 0° equals north. Maybe you will have to do some post-processing.

  3. allan, 2015-02-06 10.23 am

    I liked your tutorial

    1. admin, 2015-02-18 12.30 pm

      Thanks :)

  4. allan, 2015-02-06 10.26 am

    I would like to know how exactly I can get all the sensor data at once. I am trying to develop a pothole detection system.

    1. admin, 2015-02-18 12.36 pm

      Please clarify your requirement. What do you mean by getting all the sensor data at once? You need to collect the data from the sensors at some point.

  5. Marcel Blanck, 2015-02-15 2.32 pm

    An absolutely excellent tutorial! Thank you for it!

    1. admin, 2015-02-18 12.31 pm

      I'm glad it's being well received. Thanks for the praise :)
      All the best!

  6. Jake Smith, 2015-02-23 7.32 pm

    Thanks again for your tutorial. I loved your explanations, which made a lot of sense. The part I'm struggling with is getting OpenGL to do what I want. I don't have a lot of experience with it. I'm attempting to follow what I see here: https://bitbucket.org/apacha/sensor-fusion-demo/src/182a3034e5f7b305bf00356c9968a59b9a364d64/src/org/hitlabnz/sensor_fusion_demo/CubeRenderer.java?at=master, and I somewhat understand the necessity of using quaternions rather than Euler rotation/orientation. Would you be able to help me understand how to get a quaternion out of the resulting vector of this work and apply that to my view in “OpenGL land”?

    I have light reflecting off of a sphere rather than a cube for a more subtle proof of perspective change. My renderer is here: https://gist.github.com/jakehockey10/ef1f8a73989be9d04f46

  7. Tuan Dinh, 2015-03-08 7.45 am

    Hello

    I found your tutorial very interesting and useful. Is it possible to track linear motion? One more thing I don't understand: the fused data from the complementary filter, is it the accelerometer data without noise or the gyroscope data without drift?

    1. admin, 2015-03-10 3.37 pm

      Hi

      The question about linear motion tracking is asked very often here. The answer is that I don't see a reliable way to track linear motion with the sensor data we use in this tutorial. Most people who try to track linear motion start off with the accelerometer, which is not feasible with current state-of-the-art MEMS sensors. Perhaps there will be more accurate sensors one day. But as for now, I don't think it's possible.

      The fused data is composed of both: low-pass filtered accelerometer/magnetometer data (not entirely without noise, only the high frequency parts removed) and high-pass filtered gyro data (removing the drift while the system moves slowly).

  8. Sebastian, 2015-03-19 3.52 pm

    Hi!
    Thanks for this very insightful tutorial. Really well and clearly written. Regarding the question from TUAN DINH about linear motion tracking, I was wondering whether it is possible to track the movement, or rather the distance travelled, over very short time intervals (60 ms).

    Suppose I track a defined point with the camera of my device. After the point has been detected initially, I would like to capture the movement of the device in order to make it easier to detect this point afterwards. The procedure would look roughly like this:

    1. Initial detection of the point.
    2. The movement of the device is determined (via sensors).
    3. The movement data is integrated into the calculation for tracking the point. (The area of interest should be narrowed down this way.)
    4. Detection of the point.
    5. Repeat from 2.

    One iteration takes about 45-60 ms. Are the integration errors still relevant within such small time intervals?

    Regards
    Sebastian

    1. admin, 2015-03-20 1.13 am

      Hi Sebastian,
      I'm very glad you liked the tutorial :)

      Regarding your question, I first have a few questions in return:
      1: Did I understand correctly that you want to stabilise the positioning of the device with the camera tracking? The question then is which is more accurate, your camera tracking or the doubly integrated acceleration. I wouldn't bet my hand on it, but I would rather side with the camera tracking. And in that case you wouldn't need the accelerometer any more, if it is less accurate anyway.
      2: In each iteration, do you really only need the distance increment that you get from the accelerometer integration, or do you also need the device position that you calculated in the previous iterations?

      In principle, not many errors should accumulate within a time window of 60 ms. It should still be treated with caution, though, because in such a short time the device cannot cover much distance either. The integration result will therefore also be a very small value. And as far as accuracy is concerned, what matters is the ratio between error and useful data, not the absolute error component (and the error component is doubly integrated as well!).

      Those would be my first thoughts on it. The bottom line is that I can only say: it's hard to answer. You would have to try it out :-/
