
How to combine tracking pose and IMU pose on Android

July 28, 2017 - 1:13am #1


Could you please help me figure out the issue described below?

When I lose a known marker, I would like to use the IMU to keep tracking on an Android device; in this scenario I don't want to use extended tracking. I implemented the basic functionality, but something seems to be wrong, because the rendered objects don't behave as expected.

I did the following:

In every onDrawFrame() cycle I get two TrackableResults from the state: one from the IMU (DeviceTrackable) and one from the ImageTarget. I store both poses as "lastIMURotationOnTrackedMarker" and "lastTrackedRotation".

When tracking of the image target is lost, I want to take the rotation difference between "lastIMURotationOnTrackedMarker" and the IMU rotation of the current onDrawFrame() cycle, and apply it to "lastTrackedRotation" to calculate a new pose. The math I did is as follows:

I have two quaternions q1 and q2. The quaternion q (the "difference" rotation) is the quaternion that brings me from q1 to q2 and, as far as I know, is calculated as q = q1^(-1) * q2.
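To check that the quaternion algebra itself is sound, here is a small self-contained sketch using plain float arrays instead of Vuforia's Vec4F (component order x, y, z, w, matching the helpers below). It verifies numerically that q = q1^-1 * q2 really satisfies q1 * q = q2 for two sample rotations:

```java
public class QuatDiffDemo {

    // Hamilton product q * r, components ordered (x, y, z, w)
    static float[] mul(float[] q, float[] r) {
        return new float[] {
            q[3]*r[0] + q[0]*r[3] + q[1]*r[2] - q[2]*r[1],  // x
            q[3]*r[1] + q[1]*r[3] + q[2]*r[0] - q[0]*r[2],  // y
            q[3]*r[2] + q[2]*r[3] + q[0]*r[1] - q[1]*r[0],  // z
            q[3]*r[3] - q[0]*r[0] - q[1]*r[1] - q[2]*r[2]   // w
        };
    }

    // For a unit quaternion the inverse is simply the conjugate
    static float[] inv(float[] q) {
        return new float[] { -q[0], -q[1], -q[2], q[3] };
    }

    public static void main(String[] args) {
        float s = (float) Math.sqrt(0.5);
        float[] q1 = { 0, 0, s, s };      // 90 degrees about Z
        float[] q2 = { s, 0, 0, s };      // 90 degrees about X
        float[] q  = mul(inv(q1), q2);    // q = q1^-1 * q2
        float[] back = mul(q1, q);        // should reproduce q2
        for (int i = 0; i < 4; i++) {
            System.out.printf("%.3f vs %.3f%n", back[i], q2[i]);
        }
    }
}
```

Note that with this convention the delta must be multiplied on the right of q1 (q1 * q = q2); multiplying on the left gives a different rotation, so the order used when composing the delta with the marker pose matters.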

The "difference" rotation q between the IMU pose at the last tracked frame and the current IMU pose should rotate the last known image-target pose to a new pose. But for some reason that doesn't work. Is there an error in the reasoning or in the implementation?

private Rotation getRotation() {

        Vec4F lastTrackedRotationQ = lastTrackedRotation.getQuaternion(); // last tracked rotation provided by the marker

        Vec4F currentIMURotationQ = currentIMURotation.getQuaternion(); // current IMU rotation

        Vec4F lastIMURotationOnTrackedMarkerQ = lastIMURotationOnTrackedMarker.getQuaternion(); // IMU rotation stored when the marker was last tracked

        Vec4F lastIMURotationOnTrackedMarkerQInv = inverseQuaternion(lastIMURotationOnTrackedMarkerQ); // q1^-1

        Vec4F diffQ = multiplyQuaternion(lastIMURotationOnTrackedMarkerQInv, currentIMURotationQ); // q = q1^-1 * q2

        Vec4F newRotQ = multiplyQuaternion(lastTrackedRotationQ, diffQ); // apply the delta to the last tracked rotation

        Rotation rotation = new Rotation();

        rotation.setQuaternion(newRotQ); // was missing: without this the computed quaternion is discarded (assuming Rotation exposes a matching setter)

        return rotation;
}


public static Vec4F inverseQuaternion(Vec4F q) {

        float[] qq = q.getData();

        float[] n = new float[4];

        // q^-1 = conjugate(q) / |q|^2; for a unit quaternion this reduces to the conjugate
        double d = qq[0] * qq[0] + qq[1] * qq[1] + qq[2] * qq[2] + qq[3] * qq[3];

        n[0] = (float) (-qq[0] / d); // x

        n[1] = (float) (-qq[1] / d); // y

        n[2] = (float) (-qq[2] / d); // z

        n[3] = (float) (qq[3] / d);  // w

        Vec4F nn = new Vec4F();

        nn.setData(n); // was missing: nn was returned without the computed data

        return nn;
}


public static Vec4F multiplyQuaternion(Vec4F q1, Vec4F q2) {

        float[] n = new float[4];

        float[] q = q1.getData();

        float[] r = q2.getData();

        // Hamilton product q1 * q2, components ordered (x, y, z, w)
        n[0] = q[3] * r[0] + q[0] * r[3] + q[1] * r[2] - q[2] * r[1];   // x

        n[1] = q[3] * r[1] + q[1] * r[3] + q[2] * r[0] - q[0] * r[2];   // y

        n[2] = q[3] * r[2] + q[2] * r[3] + q[0] * r[1] - q[1] * r[0];   // z

        n[3] = q[3] * r[3] - q[0] * r[0] - q[1] * r[1] - q[2] * r[2];   // w

        Vec4F nn = new Vec4F();

        nn.setData(n); // was missing: nn was returned without the computed data

        return nn;
}


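As a sanity check of the two helpers above, the same formulas can be run on plain float arrays (no Vuforia types assumed) to confirm that q * q^-1 yields the identity quaternion (0, 0, 0, 1), even for a non-unit input:

```java
public class QuatInverseCheck {

    // Same inverse formula as inverseQuaternion above, on plain arrays
    // (components ordered x, y, z, w)
    static float[] inverse(float[] q) {
        double d = q[0]*q[0] + q[1]*q[1] + q[2]*q[2] + q[3]*q[3];
        return new float[] {
            (float) (-q[0] / d), (float) (-q[1] / d),
            (float) (-q[2] / d), (float) ( q[3] / d)
        };
    }

    // Same Hamilton product as multiplyQuaternion above
    static float[] multiply(float[] q, float[] r) {
        return new float[] {
            q[3]*r[0] + q[0]*r[3] + q[1]*r[2] - q[2]*r[1],  // x
            q[3]*r[1] + q[1]*r[3] + q[2]*r[0] - q[0]*r[2],  // y
            q[3]*r[2] + q[2]*r[3] + q[0]*r[1] - q[1]*r[0],  // z
            q[3]*r[3] - q[0]*r[0] - q[1]*r[1] - q[2]*r[2]   // w
        };
    }

    public static void main(String[] args) {
        float[] q = { 1f, 2f, 3f, 4f };        // deliberately non-unit
        float[] id = multiply(q, inverse(q));  // expect approximately (0, 0, 0, 1)
        System.out.printf("%.3f %.3f %.3f %.3f%n", id[0], id[1], id[2], id[3]);
    }
}
```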
Thank you very much for your help in advance!
