8 replies

Hi, I just found out that the Inverse function in the SampleMath class is incorrect.

Does anyone have a corrected version?

It seems that I did not get the identity matrix after multiplying the matrix by its inverse.

Thanks

Can you post a snippet of the code that you're using?

```
for (int tIdx = 0; tIdx < state.getNumTrackableResults(); tIdx++)
{
    // Get the trackable object from getTrackableResult()
    const QCAR::TrackableResult* result = state.getTrackableResult(tIdx);
    const QCAR::Trackable& trackable = result->getTrackable();
    // LOG("Trackable name: %d", state.getNumTrackableResults());
    QCAR::Matrix44F modelView = QCAR::Tool::convertPose2GLMatrix(result->getPose());

    SampleUtils::printMatrix(&modelView.data[0]);
    QCAR::Matrix44F inverse = SampleMath::Matrix44FInverse(modelView);
    SampleUtils::printMatrix(&inverse.data[0]);
    // ...
}
```

Initially, I tried to compute the relative coordinates from one target to another, which means I need to invert the matrix. After doing that, I saw that the results did not make sense.

Then I used printMatrix to inspect both matrices (the matrix and its inverse, based on the code above), and found that the rotation part of the inverted matrix had the same values as the initial matrix. I think the inverse of a rotation matrix must be equal to its transpose.

Thanks

Hi de_duu,

indeed, our sample code for the inverse matrix calculation actually returns the "inverse transpose" matrix, instead of the pure "inverse".

However, to obtain the canonical, mathematically correct inverse matrix, you can simply transpose the result again, i.e. you can use this code:

```
QCAR::Matrix44F inverseMV = SampleMath::Matrix44FTranspose(SampleMath::Matrix44FInverse(matrix));
```

Hi AlesandroB,

Indeed, I got the correct inverse by transposing the matrix again. Now I want to calculate the coordinates of one target relative to another. I understand how to calculate it theoretically, but I got wrong results in my implementation.

Here is my sample code.

In the global scope I declared

```
int k = -1;

typedef struct {
    int id;
    QCAR::Matrix44F pose;
} Object_T;

Object_T object[2] = {{0}, {0}};
```

In renderFrame I wrote

```
object[0].id = 0;
object[1].id = 0;

// Did we find any trackables this frame?
for (int tIdx = 0; tIdx < state.getNumTrackableResults(); tIdx++)
{
    // Get the trackable object from getTrackableResult()
    const QCAR::TrackableResult* result = state.getTrackableResult(tIdx);
    const QCAR::Trackable& trackable = result->getTrackable();

    object[tIdx].pose = QCAR::Tool::convertPose2GLMatrix(result->getPose());
    object[tIdx].id = trackable.getId();

    // ...
}

for (int i = 0; i < state.getNumTrackableResults(); i++) {
    QCAR::Matrix44F mainModelView;

    if (object[0].id > 0 && object[1].id > 0) {
        if (object[i].id == 3) {
            // Create mainModelView for the target with id = 3
            mainModelView = SampleMath::Matrix44FTranspose(
                SampleMath::Matrix44FInverse(object[i].pose));
            k = 1;
        }
        // Another target detected
        else if (k > 0 && object[i].id != 3) {
            QCAR::Matrix44F wmat2;

            // Position of the other target with respect to the target with id = 3.
            // Compare the multiplied results from QCAR::Tool and SampleUtils.
            QCAR::Matrix44F wmat = QCAR::Tool::multiply(mainModelView, object[i].pose);

            SampleUtils::multiplyMatrix(&mainModelView.data[0], &object[i].pose.data[0],
                                        &wmat2.data[0]);

            LOG("X: %f  Y: %f  Z: %f", wmat.data[12], wmat.data[13], wmat.data[14]);
            LOG("x: %f  y: %f  z: %f", wmat2.data[12], wmat2.data[13], wmat2.data[14]);
        }
        else {
            ;
        }
    }
}
```

I set the target size to a 1:1 ratio against the printed target, but the results I got do not make sense.
I also found out that the order of matrix multiplication between QCAR::Tool and SampleUtils is not the same.

Did I make a mistake in the calculation, or did I misinterpret the QCAR sample code?

Hi,

if you are trying to compute the position of one target with respect to another, I would recommend reading this tutorial:

https://developer.vuforia.com/resources/dev-guide/unified-target-coordinates

Hi AlesandroB,

Thanks for your recommendation. However, do you have any clue how to obtain the rotation matrix as well?

Thanks

Hi,

I just realised that obtaining the rotation matrix is a trivial task. I have successfully obtained the relative transformation matrix between two trackables.
However, when I move and rotate my device while tracking two stationary targets, the relative rotations and translations change a little. For instance, the X and Z translation values vary in a range of 0-5 units in my first observations.

In my opinion, as long as the two targets do not move, the relative transformation should not change. Is this due to imprecise tracking results in each frame?

Thanks

Hi, the relative transformation of one target with respect to another indeed should not change (if correctly computed); however, some small variations are to be expected due to tracking precision, as you say.