Android Sensor Fusion Tutorial

While working on my master's thesis, I gained some experience with sensors in Android devices and thought I'd share it with other Android developers stumbling across my blog. In my work I was developing a head-tracking component for a prototype system. Since it had to adapt audio output to the orientation of the user's head, it needed to respond quickly and be accurate at the same time.

I used my Samsung Galaxy S2 and decided to use its gyroscope in conjunction with the accelerometer and the magnetic field sensor in order to measure the user's head rotations both quickly and accurately. To achieve this I implemented a complementary filter to get rid of the gyro drift and the signal noise of the accelerometer and magnetometer. The following tutorial describes in detail how it's done.

There are already several tutorials on how to get sensor data from the Android API, so I'll skip the details on Android sensor basics and focus on the sensor fusion algorithm. The Android API Reference is also a very helpful entry point regarding the acquisition of sensor data. This tutorial is based on Android API version 10 (platform 2.3.3), by the way.

This article is divided into two parts. The first part covers the theoretical background of a complementary filter for sensor signals as described by Shane Colton here. The second part describes the implementation in the Java programming language. Everybody who thinks the theory is boring and wants to start programming right away can skip directly to the second part. The first part is interesting for people who develop on platforms other than Android, iOS for example, and want to get better results out of the sensors of their devices.

Update (March 22, 2012):
I’ve created a small Android project which contains the whole runnable code from this tutorial. You can download it here:
SensorFusion1.zip

Update (April 4, 2012):
Added a small bugfix in the example's GUI code.

Update (July 9, 2012):
Added a bugfix regarding angle transitions between 179° <–> -179°. Special thanks to J.W. Alexandar Qiu, who pointed it out and published the solution!

Update (September 25, 2012):
Published the code under the MIT-License (license note added in code), which allows you to do with it pretty much everything you want. No need to ask me first ;)

Sensor Fusion via Complementary Filter

Before we start programming, I want to explain briefly how our sensor fusion approach works. The common way to get the attitude of an Android device is to use the SensorManager.getOrientation() method to get the three orientation angles. These angles are based on the accelerometer and magnetometer output. In simple terms, the accelerometer provides the gravity vector (the vector pointing towards the centre of the earth) and the magnetometer works as a compass. The information from both sensors suffices to calculate the device's orientation. However, both sensor outputs are inaccurate, especially the output from the magnetic field sensor, which includes a lot of noise.
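
In code, getting these accelerometer/magnetometer based angles boils down to two SensorManager calls (a short sketch; the full version appears in the project's calculateAccMagOrientation() method, where accel and magnet hold the latest sensor readings):

// calculate orientation angles from accelerometer and magnetometer output
float[] rotationMatrix = new float[9];
float[] accMagOrientation = new float[3];

if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
    // accMagOrientation = [azimuth, pitch, roll] in radians
    SensorManager.getOrientation(rotationMatrix, accMagOrientation);
}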

The gyroscope in the device is far more accurate and has a very short response time. Its downside is the dreaded gyro drift. The gyro provides the angular rotation speeds for all three axes. To get the actual orientation, those speed values need to be integrated over time. This is done by multiplying the angular speeds with the time interval between the last and the current sensor output, which yields a rotation increment. The sum of all rotation increments yields the absolute orientation of the device. During this process small errors are introduced in each iteration. These small errors add up over time, resulting in a constant slow rotation of the calculated orientation: the gyro drift.
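
To illustrate the integration step, here is a simplified sketch (method and field names are illustrative; the actual implementation in the project integrates via a rotation matrix instead of summing the angles directly, but the principle is the same):

// simplified sketch of one gyroscope integration step;
// event.timestamp is in nanoseconds, hence the conversion factor
private static final float NS2S = 1.0f / 1000000000.0f;
private long timestamp = 0;
private float[] gyroOrientation = new float[3];

public void onGyroEvent(SensorEvent event) {
    if (timestamp != 0) {
        float dT = (event.timestamp - timestamp) * NS2S; // time interval in seconds
        gyroOrientation[0] += event.values[0] * dT;      // rotation increment per axis
        gyroOrientation[1] += event.values[1] * dT;
        gyroOrientation[2] += event.values[2] * dT;
    }
    timestamp = event.timestamp;
}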

To avoid both gyro drift and noisy orientation, the gyroscope output is applied only for orientation changes in short time intervals, while the magnetometer/accelerometer data is used as support information over long periods of time. This is equivalent to low-pass filtering the accelerometer and magnetic field sensor signals and high-pass filtering the gyroscope signals. The overall sensor fusion and filtering looks like this:

So what exactly does high-pass and low-pass filtering of the sensor data mean? The sensors provide their data at (more or less) regular time intervals, so their values can be shown as signals in a graph with time as the x-axis, similar to an audio signal. The low-pass filtering of the noisy accelerometer/magnetometer signal (accMagOrientation in the figure above) yields orientation angles averaged over time within a constant time window.

Later in the implementation, this is accomplished by slowly introducing new values from the accelerometer/magnetometer to the absolute orientation:

// low-pass filtering: every time a new sensor value is available
// it is weighted with a factor and added to the absolute orientation
accMagOrientation = (1 - factor) * accMagOrientation + factor * newAccMagValue;

The high-pass filtering of the integrated gyroscope data is done by replacing the filtered high-frequency component from accMagOrientation with the corresponding gyroscope orientation values:

fusedOrientation =
    (1 - factor) * newGyroValue     // high-frequency component
    + factor * newAccMagValue;      // low-frequency component

In fact, this is already our finished complementary filter.

Assuming that the device is turned 90° in one direction and after a short time turned back to its initial position, the intermediate signals in the filtering process would look something like this:

Notice the gyro drift in the integrated gyroscope signal. It results from the small irregularities in the original angular speed. Those little deviations add up during the integration and cause an additional, undesirable slow rotation of the gyroscope-based orientation.

155 thoughts on “Android Sensor Fusion Tutorial”

  1. Marzieh, 2013-02-10 3.35 am

    Hello Paul,
    I'm working on a project now and am looking for an application to log the direction/heading indoors.
    I found 2 compass data recorders based on GPS, which do not work indoors.
    Do you know if there is any application available now, or should I write it myself?
    Thanks for your time in advance.

    Cheers
    Marzieh

    1. Paul, 2013-02-10 3.26 pm

      Hi Marzieh,

      I don't know if there are any applications or libraries/code snippets out there which fit your needs. If I were you, I would write my own simple logging mechanism for the data you receive from your sensors, since you will be able to adapt the data model more easily when required. But that of course depends on the time budget you have for your project.

      Cheers,
      Paul

  2. michal, 2013-03-01 3.39 pm

    Paul – your article is REALLY THE TUTORIAL. Great and life saving. Thank you!

    I would like to ask about linear acceleration. Some people here were discussing this topic. My question: is getting the correct orientation (like the one after fusing here) the only way to calculate linear acceleration? I mean: do I need the current orientation? Sorry for the spoon-feeding question, but I want to implement my own LA and am trying to figure out the options I have for doing this.

    Any advice on further way would be great!

    1. Paul, 2013-03-01 7.03 pm

      Hi Michal,
      thank you for your commendation, I’m very flattered.
      To be honest, I never implemented linear acceleration for an Android device, so what I'm telling you is pure theory.
      Getting linear acceleration is quite tricky because, as you said, you need the device's orientation in order to remove the gravity vector, which is already contained in the acceleration vector (the same problem as when trying to measure gravity, just the other way around). It also depends on how much you expect the device to change orientation during the linear acceleration measurement. If your device is moving AND rotating continuously (and slowly), I believe that even the sensor fusion approach will fail at some point. The simplest way to go about it (without sensor fusion) is to get the device orientation from the accelerometer and magnetometer at the very beginning of the measurement and use it as a reference for the linear acceleration measurement. Downside: you may not rotate your device, because this will change your reference orientation, thus making your acceleration inconsistent. So I believe this is not an option for you.

      With sensor fusion, however, you have a gyro component in your orientation measurement which is independent of acceleration. But your fused orientation will always be biased by the skewed acceleration vector when moving the device. So no matter how you turn it (metaphorically ;) ), there will always be some dirt in your measurement. It depends on how much "dirt" you are willing to accept, i.e. how accurate you want to be. You could try to increase the gyro component in the filter algorithm so the accelerometer component doesn't distort your orientation too much. However, don't forget the gyro-drift problem, which is why we fused the sensor data in the first place.

      As you see, it is not a trivial problem, and I'm afraid, since I did not delve into it very deeply, I cannot provide you with an ideal solution. I'm also not sure whether sensor fusion is the only solution for this problem, nor do I know whether it is a viable approach at all (since I did not test it). All I can do is share my thoughts on it.

      I hope you find a solution that fits your needs. And if you do, don’t hesitate to share it with others :)

  3. josean, 2013-03-05 11.56 pm

    The best way to integrate data coming from various sensors is through a Kalman filter.
    I would suggest you have a look at this compass, in particular if your smartphone includes a gyroscopic sensor.
    Anyway, congratulations for such an excellent article.

  4. Joe Marshall, 2013-03-19 4.40 pm

    Nice tutorial. It is probably worth pointing out briefly somewhere near the top of this article that on more recent versions of Android (Ice Cream Sandwich or later), the Android framework does all this magic for you, using a quite nice Kalman-filter-based method. Rather than use the raw accelerometer/gyro etc. data, you can use the ‘virtual sensor’ types:

    TYPE_ROTATION_VECTOR – orientation of device (from a fusion of accelerometer, magnetometer and gyroscope, as your code does, but using a Kalman filter)
    TYPE_GRAVITY – direction of gravity relative to device (from the same fusion)
    TYPE_LINEAR_ACCELERATION (pure linear acceleration of device with gravity removed – done using the above orientation calculations)

    Oh, and if you want orientation from TYPE_ROTATION_VECTOR, you do it like this:
    SensorManager.getRotationMatrixFromVector(m_RotationMatrix, m_RotationVectorVals);
    SensorManager.getOrientation(m_RotationMatrix, m_Orientation);

    You can download a demo app called ‘sensor fusion(demo)’ from the play store, which shows it off, it is surprisingly stable (like <0.1 degree angle jitter when I leave the phone still on a table) and extremely fast to respond.

    The code behind it is all here:
    https://github.com/android/platform_frameworks_base/tree/jb-mr0-release/services/sensorservice
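
    For anyone who wants to try it, a rough sketch of hooking the rotation-vector sensor up might look like this (variable names are illustrative, not from the tutorial project):

    SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    Sensor rotationVector = sm.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
    sm.registerListener(this, rotationVector, SensorManager.SENSOR_DELAY_GAME);

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            // convert the fused rotation vector into a rotation matrix and then into angles
            SensorManager.getRotationMatrixFromVector(m_RotationMatrix, event.values);
            SensorManager.getOrientation(m_RotationMatrix, m_Orientation);
        }
    }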

    1. Paul, 2013-03-19 8.53 pm

      @Joe:
      Thanks for pointing that out!
      I wondered how long it would take for Google to actually add this feature to their framework. Apparently iOS already had something like that at the time I published this tutorial. So far I haven't had the time to really understand Kalman filters. But I've heard this is the best way to achieve stable and accurate orientation data.

      @Kristian:
      Cool stuff! I’m impressed! I wish you good luck with your campaign.

  5. Kristian Lauszus, 2013-03-19 7.36 pm

    Thank you very much for this excellent code.
    I use it to control my Balancing robot I just published at Kickstarter: http://www.kickstarter.com/projects/tkjelectronics/balanduino-balancing-robot-kit.
    And here is the source code for the Android application: https://github.com/TKJElectronics/BalanduinoAndroidApp.
    Thank you once again!

  6. Kristian Lauszus, 2013-03-19 11.30 pm

    @Paul
    Thanks :)

  7. Moustafa Alzantot, 2013-04-09 12.12 am

    Hi,

    Thanks for the great article.

    I have a doubt about the LPF equation.
    I think a (1 - factor) term needs to be multiplied with the old accMagOrientation value, so that the equation becomes:

    accMagOrientation = (1 - factor) * accMagOrientation + factor * newAccMagValue;

    is that right ?

    1. Paul, 2013-04-14 8.30 pm

      Hello Moustafa,
      yes, you are right. The multipliers should add up to a total factor of 1.0. I made a mistake there. I'll correct it in the article. Thanks for pointing it out!

  8. phildar, 2013-05-03 11.19 am

    Hi!! Congratulations on your work! I made a Processing sketch of it; if some of you are interested, I can share.
    I would like to know how to compensate the roll from the sensor fusion data. I have an installation which requires just the pitch and azimuth. Thanks
    p.

    1. Paul, 2013-05-08 12.42 am

      What exactly do you mean by compensating the roll? If you don’t need the roll information, just leave it out or set it to a default value (e.g. 0 degrees).

  9. phildar, 2013-05-08 11.06 am

    Thanks for your reply.
    I have my tablet in landscape orientation. I move around a statue which is at the center of the installation. (The azimuth gives me the spherical position from 0 to 360°).
    When I change to portrait orientation, the azimuth changes! And the position goes wrong.
    How can I avoid this?

    1. Paul, 2013-05-09 5.48 pm

      It's still not clear what exactly you're asking for. Is the tablet lying flat in front of you or are you holding it upright? What exactly is the installation you are talking about? Is it an object you are capturing with the tablet's camera or is it a virtual 3D object in a scene of your application that you show on the display? Are you moving your statue or are you rotating it?

      Please take into account that all rotation axes are fixed to the device's frame. The axis conventions are defined here. If you are holding the device upright in front of you in portrait mode and turn it about the vertical axis (which is Y in that case), you are changing the roll angle. If you are holding it upright in landscape mode, the vertical axis is X and you are changing the pitch angle.

  10. phildar, 2013-05-15 7.32 pm

    Thanks, my post was not clear.
    I have an app which is forced into landscape mode. In portrait mode your code gives good results.
    But when I am in landscape, the sensors go crazy. How could I set islandscape() to true in your code?
    thanks in advance.

    1. Paul, 2013-05-19 6.19 pm

      You need to transform the coordinate system according to the given display orientation. The app is coded for portrait mode, so the translational coordinates are mapped as follows – assuming you rotate your device 90° counter-clockwise:
      (portrait) x –> (landscape) y
      (portrait) y –> (landscape) -x
      (portrait) z –> (landscape) z

      Now these are just the axes. My code provides angular information with respect to these axes. We need to take that into account too:
      (portrait) pitch (x-rotation) –> (landscape) roll
      (portrait) roll (y-rotation) –> (landscape) pitch

      However, since I'm using global coordinates in my code, I'm not sure if the above mappings apply correctly.
      Maybe you're better off creating a static rotation matrix that rotates the orientation result by -90° around the local z-axis.
      I don't know if that works either, since I haven't tested it.
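
      An untested sketch of that static z-rotation, applied via the matrixMultiplication helper from the tutorial code (the sign may need flipping depending on which way your display is rotated):

      float[] zRotationMinus90 = {
           0f, 1f, 0f,   // row-major Rz(-90°): cos, -sin, 0
          -1f, 0f, 0f,   // sin, cos, 0
           0f, 0f, 1f
      };
      rotationMatrix = matrixMultiplication(rotationMatrix, zRotationMinus90);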

  11. phildar, 2013-05-24 5.50 pm

    Thanks a lot for your response !
    Actually, remapping the coordinates doesn't seem to work:

    public void calculateAccMagOrientation() {
        if (SensorManager.getRotationMatrix(temp, null, accel, magnet)) {
            SensorManager.remapCoordinateSystem(temp, SensorManager.AXIS_X, SensorManager.AXIS_Y, rotationMatrix);
            SensorManager.getOrientation(rotationMatrix, accMagOrientation);
        }
    }

    I'll try to make a rotation matrix of -90° around the z-axis. Do I have to do it in calculateAccMagOrientation?
    Thanks.
    p.

  12. phildar, 2013-05-26 11.43 pm

    It's weird: the remapCoordinateSystem works for my Nexus 7, but doesn't with the tablet!

    1. Paul, 2013-05-27 7.10 pm

      That’s weird indeed. What tablet do you use?

  13. phildar, 2013-05-28 1.02 am

    My tablet is a Nexus 10.
    Apparently the default mode for tablets is LANDSCAPE, while for phones or small tablets (with one camera) it's PORTRAIT. Could that be the cause of the problem?

  14. tuvbunn2, 2013-06-10 11.29 am

    Hello,

    First of all, thanks for the great tutorial ;)
    It works great when my phone lies flat on the table. I tried to modify the code so that it recalibrates when I stand the phone upright on the table, but unfortunately I'm failing.
    How exactly would I have to change the code so that it works when my phone is standing upright on the table?

    Thanks and best regards,
    tuvbunn2

    1. Paul, 2013-06-16 5.40 pm

      Hello tuvbunn2,
      what exactly have you tried so far? In principle you would have to transform the rotation matrix into a different coordinate system. Since I haven't worked with the code in quite a while, I would first have to sit down and think it through myself (which I unfortunately can't do at the moment :().

  15. 0L, 2013-06-11 1.17 pm

    Hi there,

    I found your code very helpful, but maybe I am missing something. I understand that Timer in Android creates a thread, and that the sensor callbacks fill the data arrays that the timer's fuse function accesses.

    Don't you have thread safety issues in your code? It seems to me that the timer runs the fuse function on a new thread, and that the onSensorChanged callbacks access and modify the same data.

    Am I wrong?

    Thanks,
    0L

    1. Paul, 2013-06-16 5.36 pm

      Hi 0L,
      you are right, the code is not thread-safe. But remember, the code is quite aged and was originally intended for demonstration purposes. To achieve thread safety you would have to lock gyroMatrix in gyroFunction and in the timer thread. However, even if the program runs into concurrency issues (i.e. a sensor callback gets interrupted by the timer thread), the deviation should be minimal in most cases. That's because fusedOrientation and gyroMatrix deviate over a period of time. Truth is, I sucked at concurrent programming back when I was writing my master's thesis… and probably I still do ;)

  16. tuvbunn2, 2013-06-19 4.31 pm

    Hi,

    I built a rotation matrix around the X axis and used it to rotate rotationMatrix in calculateAccMagOrientation().

    Unfortunately my roll no longer works then, or rather the horizon then gets corrected in a strange way.

    These are my modifications:

    public void calculateAccMagOrientation()
    {
        if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet))
        {
            // rotation x axis 90 degrees
            float[] xRotationMatrix = new float[9];
            xRotationMatrix[0] = 1f;
            xRotationMatrix[1] = 0f;
            xRotationMatrix[2] = 0f;
            xRotationMatrix[2] = 0f;
            xRotationMatrix[0] = 0f;
            xRotationMatrix[5] = 1f;
            xRotationMatrix[6] = 0f;
            xRotationMatrix[7] = -1f;
            xRotationMatrix[8] = 0f;

            rotationMatrix = matrixMultiplication(rotationMatrix, xRotationMatrix);

            SensorManager.getOrientation(rotationMatrix, accMagOrientation);
        }
    }

    Thanks and regards,
    tuvbunn2

    1. Paul, 2013-06-20 10.34 pm

      Hi tuvbunn2,
      why do you assign 1f to xRotationMatrix[0] and then overwrite it with 0f again a bit further down? I think you have a copy-paste error in the initialization of your rotation matrix: indices 3 and 4 are not initialized at all, while 0 and 2 are set twice.

      Besides, it is not enough to rotate only the AccMag part. You also have to transform the gyro part (see gyroMatrix), otherwise you will end up with inconsistent data later in the sensor fusion.

      Regards,
      Paul

  17. tuvbunn2, 2013-06-25 11.49 am

    Oh, thank you very much,
    something apparently got mixed up while copying.
    With the rotation of rotationMatrix by 90° as well as the rotation of gyroMatrix by 90°, everything now fits.

    Thank you very much.

    Regards,
    tuvbunn2

  18. […] I’m getting pitch and roll data from my device’s gyroscope using this tutorial: http://www.thousand-thoughts.com/2012/03/android-sensor-fusion-tutorial/ […]

  19. Phillip, 2013-07-26 2.50 pm

    Hi Paul,

    This is such a useful article!

    But I have the same problem as tuvbunn2: I rotate the rotationMatrix by 90° before passing it to accMagOrientation, which makes pitch work fine, but the roll value deviates (no deviation for a 45-degree pitch, though).
    Where exactly in the algorithm should I rotate the gyro matrix?
    I would be very grateful for help with this.

    1. Paul, 2013-07-26 10.51 pm

      Hi Phillip,
      you have to rotate both the gyroMatrix and the AccMagMatrix (which is called rotationMatrix within the code) by 90° around the x-axis. You have to create a rotation matrix for that, just like tuvbunn2 did:

      1 0 0
      0 0 1
      0 -1 0

      You can apply the rotation using the matrixMultiplication method. The above rotation matrix has to be the second parameter (remember that matrix multiplications are not commutative). Just apply the rotation right after the gyroMatrix/AccMagMatrix has been calculated, respectively. For the AccMagMatrix that would be somewhere at the end of the method calculateAccMagOrientation, and for the gyroMatrix somewhere at the end of the method gyroFunction.
      Hope that helps.

  20. Phillip, 2013-07-27 12.56 pm

    Unfortunately it doesn't work; after multiplying the gyroMatrix, the values go crazy…
    I am using your project and testing on few mobile devices:
    float[] my90DegRotationMatrix = {
        1, 0, 0,
        0, 0, 1,
        0, -1, 0
    };

    then in calculateAccMagOrientation:

    if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
        rotationMatrix = matrixMultiplication(rotationMatrix, my90DegRotationMatrix);
        SensorManager.getOrientation(rotationMatrix, accMagOrientation);
    }

    and in gyroFunction at the very end:

    // apply the new rotation interval on the gyroscope based rotation matrix
    gyroMatrix = matrixMultiplication(gyroMatrix, deltaMatrix);

    // rotate gyro matrix NEW
    gyroMatrix = matrixMultiplication(gyroMatrix, my90DegRotationMatrix);

    // get the gyroscope based orientation from the rotation matrix
    SensorManager.getOrientation(gyroMatrix, gyroOrientation);

    I tried rotating the gyroMatrix in a few other places, like at the end of calculateFusedOrientationTask, etc. It seems like no matter where I do it, it breaks everything.

    Can you please help me; am I doing the gyro matrix multiplication wrong?

    1. Paul, 2013-07-30 7.51 pm

      Hi Phillip,
      I'm sorry, but currently I don't have the project set up on my local machine, so I cannot reproduce the behavior of your code. I looked through the code and couldn't find anything yet. I will report back if I find something.

  21. Jeff, 2013-08-01 4.40 pm

    Hi Paul,
    This article is amazing. Well done.

      I am experiencing an issue I was hoping you could help with. In my own application and in your sample application, the pitch value goes from 0 (device lying flat) to -90 (device completely vertical), then back down toward 0 (device flat but upside down). This means the pitch value is -45 both when the phone is tilted back (\) and when it's tilted forward (/). I am using a Samsung Galaxy Nexus. Are these the values you would expect to see? How can I tell the difference between the two positions?

    Thanks!

    1. Paul, 2013-08-02 5.31 pm

      Hi Jeff,
      pitch, roll and azimuth always keep their relations to each other. That said, you can read the azimuth value in order to interpret which orientation, (/) or (\), is meant by a pitch value of -45°. If the reading direction (–>) is front, an azimuth of 0° and a pitch of -45° would be (\) (with the earpiece pointing down), while with an azimuth of 180° it would be (/). Of course, if you flip the roll angle by 180° the whole situation is different. So I'm afraid you have to take into account all three angles.

  22. oh, 2013-08-12 9.39 am

    It would be grand to apply this to Android dead reckoning (continuing to navigate, as a supplement, when GPS or cell-tower data is weak).

  23. fariba, 2013-08-17 12.17 pm

    Hi
    The program is very good and very useful, thank you.
    I've used your program in my project. Could you explain more about the TIME_CONSTANT and the FILTER_COEFFICIENT, and how did you arrive at them?

    1. Paul, 2013-08-18 5.52 pm

      Hi fariba,
      thank you for your nice feedback. I think I already explained those two constants pretty much in detail further down in the comments. Try to look on page 2 of the comments. I believe there is a more lengthy explanation on this which I posted some time ago.

  24. Pok, 2013-08-19 1.29 am

    Thank you for the tutorial, Paul. Very useful. Is it possible to install it on Android 2.2 devices? I'm getting a parsing error, so I'm just wondering whether it might be because of an incompatibility(?)

  25. Pok, 2013-08-19 1.45 am

    I think the reason is getRotationMatrixFromVector, which was added in API 9.

    1. Paul, 2013-08-19 5.29 pm

      That could be. I didn’t test the example with older API levels.

  26. Rajesh, 2013-09-02 11.31 pm

    Hi
    Nice article! I have a question.
    How can I convert the linear acceleration given by the accelerometer sensor from the device frame of reference to the earth frame of reference? I think it will need sensor (acc, gyro, mag) fusion to determine that.

    From (Xa, Ya, Za) to (Xe, Ye, Ze) ?

    please help

  27. fariba, 2013-09-09 6.08 pm

    Hey Paul
    If I want to hold the phone in one of the following three positions, should I change something in the code?
    1 – when the x-axis is parallel and opposite to the gravity vector.
    2 – when the y-axis is parallel and opposite to the gravity vector.
    3 – when the z-axis is parallel and opposite to the gravity vector.
    Please explain what I should do for each case. Do I just have to change the rotation matrix? How?
    please help me

  28. ASHWINI, 2013-10-06 6.46 pm

    Hello sir,

    Thanks a lot for the tutorial.
    I am actually working on indoor tracking in GPS-denied areas using sensor fusion on Android devices.
    Does the source code you provided help with gathering sensor values,
    or with fusing them so that they can be used for different purposes?

    1. Paul, 2013-11-02 6.58 pm

      @ASHWINI: It’s basically sensor fusion, nothing more.

  29. Android Example, 2013-10-07 12.35 pm

    Very nice dude…

    I have also found one good link here….

    Accelerometer Basic Example – Detect Phone Shake Motion

  30. Aaron, 2013-11-07 2.41 pm

    Hi Paul,
    Good job.
    I'm interested in Android PDR. Can you send me the source code? I can't find the newest version of your code…

    1. Paul, 2013-11-11 5.42 pm

      Hi Aaron,

      there is a link in the post after the fourth paragraph. The text starts with “Update (March 22, 2012)”. There you can download the code for this tutorial. Otherwise try this link.

  31. Bartek, 2013-11-19 12.53 pm

    Hello Paul,
    Thanks for such a great tutorial!

    I don't understand why you are using the magnetometer if it's so inaccurate and generates noise. Why not use the accelerometer instead?

    Even the accelerometer output looks the same as the magnetometer output.

  32. Hao, 2013-11-27 1.35 am

    Dear Paul,

    Can you also calculate the horizontal angle to north?
    Can it also benefit from the filter you define?
    Thanks.

    Hao

  33. Paul, 2013-11-28 12.29 am

    @Bartek:
    The magnetometer is the only non-gyro reference for the azimuth angle we have available. You cannot determine the azimuth angle with the accelerometer, only pitch and roll.
    @Hao:
    Could you be more specific? Do you mean the angle between the device's ‘forward’ vector and the north direction? In 3D or projected onto a horizontal plane?
    The main problem would always be determining north accurately enough. This could be problematic with the noisy magnetometer. You could point the device towards north using some other method (another accurate compass), save the azimuth angle (and thus ‘calibrate’ it, so to say) and calculate the difference between those two vectors. However, I have the feeling this would not be a satisfying solution for you.

  34. Bahman, 2013-12-14 9.48 am

    Hi, everyone,
    I'm working on my thesis and I want to buy a smartphone, such as a Samsung, etc.
    Which phones have the most suitable sensor accuracy?
    thanks

  35. DriveHard, 2014-02-09 11.05 pm

    First Paul, thanks for this code, much appreciated. I think I need a Masters to truly understand why all this works.

    I saw a bunch of questions about where to remap the coordinates.

    I have re-mapped them at line 378, inside the class calculateFusedOrientationTask.
    Add: float[] mRemapedRotationMatrix = new float[9];
    After “gyroMatrix = getRotationMatrixFromOrientation(fusedOrientation);” add:
    {
        SensorManager.remapCoordinateSystem(gyroMatrix,
                SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, mRemapedRotationMatrix);
        SensorManager.getOrientation(mRemapedRotationMatrix, gyroMatrix);
    }
    Before “System.arraycopy(fusedOrientation, 0, gyroOrientation, 0, 3);”

    This remap is for landscape. When the device was suspended in the air, the Roll X reported by the app (the actual rotation, like a wheel) was changing along with the Pitch Y reported by the app. This remap allows Pitch to change without changing Roll. Roll is still impacted by Z as reported by the app.

    Hope this helps

    DriveHard

    1. Paul, 2014-02-23 3.14 pm

      Hi DriveHard,
      this is great! Thanks a lot for your modification. I think this could solve quite a lot of use cases.

  36. Carl, 2014-02-11 2.06 pm

    Hi Paul,

    I have been studying this tutorial of yours for months; it works well, but I want to improve it.

    I have few question about it.

    I am confused by the complementary filter in your code.

    Why is it not written as: complementary filter = A * low-pass filter + (1 - A) * high-pass filter, where A is the filter coefficient?

    Your complementary filter = A * gyroOrientation + (1 - A) * accMagOrientation.

    And I didn't see the filter code for each orientation in the code.

    Thank you for your attention

    1. Paul, 2014-02-23 3.22 pm

      Hi Carl,
      could you be more specific in your question? What exactly are the variables “Low-pass filter” and “High-pass filter” in your expressions?
      My expression “A*gyroOrientation + (1-A)*accMagOrientation” is the direct implementation of the low- and high-pass filters using the concrete orientation values. A*gyroOrientation yields the high-pass component of the filter, while (1-A)*accMagOrientation yields the low-pass component. The filtering is not an explicit expression in the code; it results implicitly from the periodic execution of the above expression.
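
      To make the implicit filtering a bit more concrete, here is a stripped-down sketch of what the periodically executed fusion task does (simplified; the actual class in the project additionally handles the 180°/-180° transition):

      // runs every TIME_CONSTANT milliseconds; FILTER_COEFFICIENT plays the role of A, e.g. 0.98
      class calculateFusedOrientationTask extends TimerTask {
          public void run() {
              float oneMinusCoeff = 1.0f - FILTER_COEFFICIENT;
              for (int i = 0; i < 3; i++) {
                  fusedOrientation[i] = FILTER_COEFFICIENT * gyroOrientation[i]
                                      + oneMinusCoeff * accMagOrientation[i];
              }
              // feed the fused result back into the gyro path so the drift gets corrected a bit each pass
              gyroMatrix = getRotationMatrixFromOrientation(fusedOrientation);
              System.arraycopy(fusedOrientation, 0, gyroOrientation, 0, 3);
          }
      }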

  37. Namrata, 2014-02-15 8.41 am

    Where can I find a sensor fusion tutorial for Windows phones, esp. Lumia? I want to code in C#/.NET.
    Thanks a lot for this wonderful project.

    Best wishes
    ~Namrata

    1. Paul, 2014-02-23 3.23 pm

      Hi Namarata,
      I'm afraid I cannot help you with this. You will have to do the research on your own.
      Best regards,
      Paul

  38. Carl, 2014-02-24 5.06 pm

    Hi Paul,

    Your expression “A*gyroOrientation+(1-A)*accMagOrientation” is the direct implementation of the Low- and High-pass filters using the concrete orientation values.

    But I didn’t see the relevant code in your example “SensorFusion1.zip” demo.

    This is your code for accMagOrientation and gyroOrientation.

    // calculates orientation angles from accelerometer and magnetometer output
    public void calculateAccMagOrientation() {
        if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
            SensorManager.getOrientation(rotationMatrix, accMagOrientation);
        }
    }

    // get the gyroscope based orientation from the rotation matrix
    SensorManager.getOrientation(gyroMatrix, gyroOrientation);

    Then you use them directly in
    class calculateFusedOrientationTask extends TimerTask { … }
    to get the fused orientation by using “A*gyroOrientation+(1-A)*accMagOrientation” .

    I didn’t see the code for filtering the accMagOrientation by using Low-pass filter or filtering the gyroOrientation by using High-pass filter.

  39. Simon, 2014-05-28 2.15 pm

    I would like to know whether the matrix that you have at the end is row-based or column-based. And what is the reference frame? The default Android reference frame (like here)?

    I ask because I use the jPCT library and I need to transform the data that you calculate into a matrix for my 3D camera (I want the 3D camera to have the same orientation as the real camera). I can't figure out how to transform your matrix into a suitable matrix for the jPCT camera.

    1. Paul, 2014-07-08 2.14 pm

      Hi Simon,
      the matrices in my code that are stored in two-dimensional arrays are in row-major order:
      Matrix[row][column]

  40. Mok, 2014-07-14 7.00 am

    Thank you, it is a really nice posting. It helps my project a lot. Thank you.

  41. Graham Dawson, 2014-07-25 7.27 am

    I found a subtle bug with huge consequences, i.e. it prevented the gyroscope data from being used at all.

    The problematic statement is this one:

    private float timestamp;

    It should be changed to:

    private long timestamp;

    Without this fix, the calculated timestep (dT) is almost always zero, as a float variable is unable to hold the precision required by the timestamp values, and hence gives a zero timestep value.

    Before making this change I found that the demo project gave a very slow response to device orientation changes. After making this fix, the device's heading changes are almost instantaneous, and it finally behaves the way I would expect fused data to behave.

    I find it hard to believe that this issue could have gone unnoticed all this time, given how seriously it affects the output. I looked back through all previous comments, but no one else seems to have spotted the flaw. Am I missing something? Am I the only one with the buggy code? Is there a newer version somewhere in which this was fixed?

    1. Paul, 2014-07-30 7.57 pm

      Graham,
      thank you for pointing this out!
      I will definitely investigate this. I don't remember making a fix of this type in the sensor fusion code. Then again, I haven't touched the code in years and I didn't test it on newer Android APIs. Maybe there were some changes that led to this behavior.

      Also, I moved this article and the source code to CodeProject: here

  42. Maryam, 2014-08-04 11.34 am

    Hi Paul,

    Thanks for your awesome implementation and for making it available online. As part of the community I really appreciate that =)
    Paul, I'm trying to utilize your code in my program. For some reason I'm not getting data as accurate and stable as yours. I suspect it may have something to do with the radioSelection parameter, where the accelerometer is the first sensor ever called.
    Is that a correct assumption?
    The other issue is that, for some unknown reason, I seem to get zero values when multiplying the fusedOrientation values by pi (instead of getting the angular position).
    Would you please provide some guidance on these issues at your convenience?

    Thanks very much

    1. Paul, 2014-08-09 12.41 pm

      Hi Maryam,
      Maybe the issues you’re experiencing have something to do with the bug Graham pointed out in the comments earlier:

      private float timestamp;
      It should be changed to:
      private long timestamp;

      As far as I remember, the radioSelection parameter is a GUI thing only, used to identify which sensor approach is currently in use.
      Try out java.lang.Math.toRadians(double angdeg) or java.lang.Math.toDegrees(double angrad) respectively to convert between radians and degrees. Maybe that will solve your problems.
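
      For example:

      // convert the fused azimuth from radians to degrees for display
      double azimuthDeg = Math.toDegrees(fusedOrientation[0]);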

  43. Christian Blesing, 2014-08-26 12.10 pm

    Hi Paul,

    great work and an excellent tutorial! You said that in your master's thesis you had to determine the head position in order to adapt the audio output. I also need to determine the head position. The smartphone is mounted in an OpenDive headset, similar to the Oculus Rift. However, I only need the “up” and “down” position of the head (roll) as well as the rotation to the left and right (azimuth). The azimuth is always calculated relative to magnetic “north”. I have trouble using this value for the current rotation of the head because the value is not absolute. If I have the phone in the OpenDive and turn 90° on a chair before starting the app, and then start the app, I get a value of, for example, 135°. Can you give me a tip?

    best regards

    and thanks again for the great tutorial!

  44. Christian Blesing, 2014-08-26 12.24 pm

    Hi Paul,

    I think I expressed myself unclearly. What I need is an angle for the rotation of the head to the left and right that starts at 0°, no matter where I am currently facing relative to magnetic north.

    Regards

    Christian

  45. Christian Blesing, 2014-08-26 5.52 pm

    Hi,

    I have the solution! The change in azimuth, e.g. from 90° to 110°, corresponds to the yaw value I was looking for. But I have another question: is it possible to calculate the yaw angle without the azimuth?

    Regards

    Christian

    1. Paul, 2014-09-02 10.46 am

      Hello Christian,
      what you are looking for is the rate of rotation in the yaw/azimuth direction. Yaw and azimuth are synonymous, so you won't get around the azimuth angle, since it is your only reference in this degree of freedom. You can determine the angle difference with each measurement and weight it with the time that has passed since the last measurement to get your value. You just have to watch out for the overflow in the value range: you need special handling for when your absolute angle exceeds 360° and starts counting again at 0°. In my code I believe it is a jump from 180° to -180°.
      Good luck,
      Paul
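
      A rough sketch of such a wrap-around-safe angle difference (hypothetical helper, angles in degrees):

      // signed difference between two azimuth readings, wrapped into (-180°, 180°]
      // so the jump between 180° and -180° doesn't produce a huge spike
      float angleDifference(float currentAzimuth, float previousAzimuth) {
          float diff = currentAzimuth - previousAzimuth;
          while (diff <= -180f) diff += 360f;
          while (diff > 180f) diff -= 360f;
          return diff;
      }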

  46. Stephane Schittly, 2014-09-06 2.39 pm

    Thanks a lot for this :-)

  47. Mike, 2014-09-11 12.26 pm

    Hi

    I think it may be worth changing the Euler angle values to work solely in quaternions. Certainly this simplifies delta rotation updates and also the interpolation between the various orientations.

    Regardless, thanks for the article.

    Mike

    1. Paul, 2014-09-25 8.31 pm

      Yes, I wanted to switch to quaternions while I wrote the code. But I had accuracy issues because floating-point precision errors during all the quaternion multiplications caused errors that grew into oblivion. So I started to normalize the quaternions on each pass. This in turn caused performance issues and I ran out of time (master's thesis deadline). So I got stuck with all the joys that gimbal lock and rotation matrices bring. Trust me, I'm actually a big fan of quaternions ;)

  48. icell, 2014-09-21 5.22 am

    Hi Paul,
    Your work has been a big help.
    I found a problem with onSensorChanged.

    The default TIME_CONSTANT is 30, which works well.
    If I set TIME_CONSTANT = 20, the frequency is 50 Hz.

    Then I printed the fusedOrientation and also the gyro and accel data.
    I found the gyro always gave the same data two by two, and so did the accel data, always.
    Which means the onSensorChanged function is not fast enough for fusedOrientation.

    I used a Nexus S to run the demo, which has at most 49 Hz for the accelerometer and 200 Hz for the gyro.
    So I think the fusedOrientation could go 49 Hz at most.

    1. Paul, 2014-09-25 8.20 pm

      I don't see a reason why oversampling with fusedOrientation should cause problems in the sensor fusion algorithm. You would get the same value in some consecutive samples, but that's all. The TIME_CONSTANT is rather used to influence the time period of the complementary filter.

  49. erum, 2014-10-17 2.12 pm

    Can you help me figure out how to use this to find the distance covered by the user, using the accelerometer?

    1. Paul, 2014-10-21 2.43 pm

      I would discourage you from using this approach to determine linear movement. The accelerometer is too noisy and produces too many errors to accurately calculate the total distance covered.

  50. Aleks, 2014-11-07 7.39 pm

    Dear Paul,

    it's a really great tutorial. May I ask whether you have posted the source code somewhere (like on GitHub)?

    Many thanks,

    1. Paul, 2014-11-16 3.45 pm

      Hi,
      please read through the introduction of the article. There is a download link right before the beginning of the first tutorial section. You can download the code as a zip-file. I have no version control for this piece of code, since I wrote it for demonstration purposes only.
      You can also find the source code on CodeProject: http://www.codeproject.com/Articles/729759/Android-Sensor-Fusion-Tutorial
