
### incorrect Inverse function in sampleMath class

```cpp
for (int tIdx = 0; tIdx < state.getNumTrackableResults(); tIdx++) {
    // Get the trackable object from getTrackableResult()
    const QCAR::TrackableResult* result = state.getTrackableResult(tIdx);
    const QCAR::Trackable& trackable = result->getTrackable();
    // LOG("Trackable name: %d", state.getNumTrackableResults());
    QCAR::Matrix44F modelView = QCAR::Tool::convertPose2GLMatrix(result->getPose());
    SampleUtils::printMatrix(&modelView.data[0]);
    QCAR::Matrix44F inverse = SampleMath::Matrix44FInverse(modelView);
    SampleUtils::printMatrix(&inverse.data[0]);
    ....
}
```

Initially, I tried to compute the relative coordinates between one target and another, which means I need to invert the matrix. However, the results did not make sense.

Then I used printMatrix to inspect both matrices (the matrix and its inverse, based on the code above), and found that the rotation part of the inverted matrix had the same values as the original matrix. I think the inverse of a rotation matrix should be equal to its transpose.
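The claim in the last paragraph is easy to verify numerically: for a pure rotation matrix R, R·Rᵀ = I, so the transpose really is the inverse. A minimal standalone sketch with plain row-major float arrays (no QCAR types; the helper names here are mine, not the SDK's):

```cpp
#include <cassert>
#include <cmath>

// Multiply two 3x3 row-major matrices: out = a * b
void mul3(const float a[9], const float b[9], float out[9]) {
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 3; ++c) {
            out[3*r + c] = 0.0f;
            for (int k = 0; k < 3; ++k)
                out[3*r + c] += a[3*r + k] * b[3*k + c];
        }
}

// Transpose a 3x3 row-major matrix
void transpose3(const float a[9], float out[9]) {
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 3; ++c)
            out[3*r + c] = a[3*c + r];
}

// Returns true if a * a^T is (numerically) the identity,
// i.e. the transpose of a is also its inverse.
bool transposeIsInverse(const float a[9]) {
    float at[9], prod[9];
    transpose3(a, at);
    mul3(a, at, prod);
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 3; ++c) {
            float expected = (r == c) ? 1.0f : 0.0f;
            if (std::fabs(prod[3*r + c] - expected) > 1e-5f) return false;
        }
    return true;
}
```

A proper rotation passes the check, while a shear or scaled matrix does not, which is why a general model-view matrix cannot simply be transposed to invert it.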

Thanks

### incorrect Inverse function in sampleMath class

Hi de_duu,

Indeed, our sample code for the inverse matrix calculation actually returns the "inverse transpose" matrix instead of the pure "inverse".

However, to obtain the mathematically correct inverse matrix, you can simply transpose the result again, i.e. you can use this code:

```cpp
QCAR::Matrix44F inverseMV = SampleMath::Matrix44FTranspose(SampleMath::Matrix44FInverse(matrix));
```
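One way to double-check this fix without trusting the sample helpers is to multiply the original matrix by the computed inverse and confirm you get the identity. A sketch using plain column-major 4x4 float arrays (the layout of QCAR::Matrix44F.data); `multiply44` and `isIdentity44` are hypothetical helper names of my own, not SDK functions:

```cpp
#include <cassert>
#include <cmath>

// Multiply two 4x4 column-major matrices (the layout QCAR::Matrix44F uses):
// out = a * b, with element (r, c) stored at index 4*c + r.
void multiply44(const float a[16], const float b[16], float out[16]) {
    for (int c = 0; c < 4; ++c)
        for (int r = 0; r < 4; ++r) {
            out[4*c + r] = 0.0f;
            for (int k = 0; k < 4; ++k)
                out[4*c + r] += a[4*k + r] * b[4*c + k];
        }
}

// Returns true if m is the 4x4 identity within tolerance -- handy for
// checking that a computed inverse really is an inverse.
bool isIdentity44(const float m[16], float tol = 1e-4f) {
    for (int c = 0; c < 4; ++c)
        for (int r = 0; r < 4; ++r) {
            float expected = (r == c) ? 1.0f : 0.0f;
            if (std::fabs(m[4*c + r] - expected) > tol) return false;
        }
    return true;
}
```

In the render loop you could pass the product of the model-view matrix and the doubly-transposed inverse to `isIdentity44`; if it returns false, something upstream is wrong.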

### incorrect Inverse function in sampleMath class

Hi AlesandroB,

Indeed, I got the correct inverse by transposing the matrix again. Now I want to calculate the coordinates of one target relative to another. I understand how to calculate this theoretically, but my implementation gives wrong results.

Here is my sample code..

As global variables I declared:

```cpp
int k = -1;

typedef struct {
    int id;
    QCAR::Matrix44F pose;
} Object_T;

Object_T object[2] = {{0}, {0}};
```

In renderFrame I wrote:

```cpp
object[0].id = 0;
object[1].id = 0;

// Did we find any trackables this frame?
for (int tIdx = 0; tIdx < state.getNumTrackableResults(); tIdx++) {
    // Get the trackable object from getTrackableResult()
    const QCAR::TrackableResult* result = state.getTrackableResult(tIdx);
    const QCAR::Trackable& trackable = result->getTrackable();
    object[tIdx].pose = QCAR::Tool::convertPose2GLMatrix(result->getPose());
    object[tIdx].id = trackable.getId();
    .....
}

for (int i = 0; i < state.getNumTrackableResults(); i++) {
    QCAR::Matrix44F mainModelView;
    if (object[0].id > 0 && object[1].id > 0) {
        if (object[i].id == 3) {
            // Create mainModelView for the target with id = 3
            mainModelView = SampleMath::Matrix44FTranspose(
                SampleMath::Matrix44FInverse(object[i].pose));
            k = 1;
        }
        // Another target detected
        else if (k > 0 && object[i].id != 3) {
            QCAR::Matrix44F wmat2;
            // Position of the other target with respect to the target with id = 3.
            // Compare the multiplied results from QCAR::Tool and SampleUtils.
            QCAR::Matrix44F wmat = QCAR::Tool::multiply(mainModelView, object[i].pose);
            SampleUtils::multiplyMatrix(&mainModelView.data[0],
                                        &object[i].pose.data[0],
                                        &wmat2.data[0]);
            LOG("X: %f Y: %f Z: %f", wmat.data[12], wmat.data[13], wmat.data[14]);
            LOG("x: %f y: %f z: %f", wmat2.data[12], wmat2.data[13], wmat2.data[14]);
        }
        else {
            ;
        }
    }
}
```

I set the target size at a 1:1 ratio against the printed target, but the results I get do not make sense.

I also found out that the argument order for matrix multiplication differs between QCAR::Tool::multiply and SampleUtils::multiplyMatrix.

Did I make a mistake in the calculation, or did I misinterpret the QCAR sample code?
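Since a tracking pose is a rigid transform (rotation plus translation, no scale), its inverse can also be formed directly as [Rᵀ | −Rᵀt], which sidesteps the inverse/transpose confusion entirely. A sketch assuming column-major storage with the translation at indices 12-14, as in QCAR::Matrix44F.data; `invertRigid44` is a hypothetical helper of my own, not an SDK function:

```cpp
#include <cassert>
#include <cmath>

// Invert a rigid-body (rotation + translation) 4x4 column-major pose
// without a general matrix inverse: inv = [R^T | -R^T * t].
// This assumes the pose contains no scale or shear.
void invertRigid44(const float m[16], float out[16]) {
    // Transpose the 3x3 rotation block; element (r, c) lives at index 4*c + r.
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 3; ++c)
            out[4*c + r] = m[4*r + c];
    // New translation = -R^T * t, where t is at indices 12..14.
    for (int r = 0; r < 3; ++r) {
        out[12 + r] = 0.0f;
        for (int k = 0; k < 3; ++k)
            out[12 + r] -= m[4*r + k] * m[12 + k];
    }
    // Bottom row (0, 0, 0, 1)
    out[3] = out[7] = out[11] = 0.0f;
    out[15] = 1.0f;
}
```

The relative pose of a second target with respect to the id == 3 target would then be the product of `invertRigid44` applied to the first pose and the second pose; whichever multiply routine you use, make sure its argument order matches that convention.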

### incorrect Inverse function in sampleMath class

Hi,

If you are trying to compute the position of one target with respect to another, I would recommend reading this tutorial:

https://developer.vuforia.com/resources/dev-guide/unified-target-coordinates

### incorrect Inverse function in sampleMath class

Hi,

I just realised that obtaining the rotation matrix is actually a trivial task, and I have successfully obtained the relative transformation matrix between two trackables.

However, when I move and rotate my device while tracking two stationary targets, the relative rotation and translation change slightly. For instance, the X and Z translation values vary within a range of about 0-5 units in my first observation.

In my opinion, as long as the two targets do not move, the relative transformation should not change. Is this due to imprecise tracking results for each frame?
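A few units of frame-to-frame variation is typical for vision-based tracking: each pose is re-estimated per frame from noisy image measurements, so the relative transform jitters even when the targets are stationary. One common mitigation (a generic signal-processing technique, not something the QCAR SDK provides) is to low-pass filter the relative translation, e.g. with an exponential moving average:

```cpp
#include <cassert>
#include <cmath>

// Simple exponential moving average filter for a 3-component translation.
// alpha in (0, 1]: smaller alpha = smoother but laggier output.
struct TranslationFilter {
    float smoothed[3];
    bool initialized = false;

    void update(const float raw[3], float alpha) {
        if (!initialized) {
            // First sample: take the raw value as-is.
            for (int i = 0; i < 3; ++i) smoothed[i] = raw[i];
            initialized = true;
        } else {
            // Blend each new sample toward the running estimate.
            for (int i = 0; i < 3; ++i)
                smoothed[i] += alpha * (raw[i] - smoothed[i]);
        }
    }
};
```

Smaller alpha values give a steadier but laggier estimate; a similar filter can be applied to the rotation, though that is better done on a quaternion representation than on the raw matrix entries.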

Thanks

Can you post a snippet of the code that you're using?