Hello everyone
I want to make a simple app where I detect an image target and then an object appears in its correct size!
The printed image is 10x15 cm. I made the model in Blender, so I know its dimensions are correct.
When I uploaded the image, I set the width of the paper to 0.1. But the object that appears is not the correct size! I checked the Image Target tab -> Advanced: the width I entered is there, along with a height of 0.3. When I try to change the height to 0.15, the width changes as well.
Any idea on how I can solve this would be very much appreciated!
Hi,
As far as I remember, when you create the image in the database you have to specify its width, for example: (pic 2021-05-06_15-25-28.jpg)
Later, when you import the database into Unity, as you already mentioned, you can also specify the width. My understanding is that the ratio between width and height is already calculated in the image target database and is fixed, so when you set one value in Unity, Unity automatically adapts the second value. (picture 2021-05-06_15-31-57.jpg)
Therefore, I believe that is what leads to the fixed ratio in your case.
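To illustrate the fixed-ratio behavior with a small sketch (in Python, just for the arithmetic; the function name is mine, not a Vuforia or Unity API): the height is always derived from the width via the aspect ratio baked into the database, so you can only ever set one of the two values.

```python
def target_size(set_width, height_over_width):
    """The height is derived from the width via the database's
    fixed aspect ratio; you cannot set both independently."""
    return set_width, set_width * height_over_width

# A 10 x 15 cm print has ratio 15/10 = 1.5,
# so width 0.10 m should give height 0.15 m.
w, h = target_size(0.10, 15 / 10)
```

Note that you report a height of 0.3 for a width of 0.1, i.e. a ratio of 3 rather than 1.5, which would suggest the dimensions stored for the target in the database do not match the 10x15 cm print. That might be worth re-checking.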
Another point is that the scale/dimensions of the model are not displayed correctly, i.e. Unity does not seem to use the correct Blender dimensions. When I import a simple model from Blender, I usually see a difference between the scale shown in Unity's Transform component and the model's actual dimensions (shown in the next picture).
So the transform scale differs from the Blender dimension scale (the real model outline); when I then check the model in Blender, I can see this difference.
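As a rough way to compensate for that mismatch (a sketch in Python, just for the math; the function name is mine): if you know the real-world size of the model and the size Unity actually shows for it, the ratio gives you the factor to multiply the transform scale by. A factor of about 0.01 or 100 often points to a units mismatch in the Blender export.

```python
def scale_correction(real_size_m, shown_size_m):
    """Factor to multiply the Unity transform scale by so the
    model's shown size matches its real-world size."""
    return real_size_m / shown_size_m

# e.g. the model should be 0.10 m wide but appears 10 units wide
factor = scale_correction(0.10, 10.0)  # roughly 0.01
```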