"We offer new support options and therefor the forums are now in read-only mode! Please check out our Support Center for more information." - Vuforia Engine Team

Camera and Display Coordinates

Hello,

I am working on a project and have come across a problem related to the display screen and the camera. We have a camera mounted on our foreheads that captures input images of the environment; these are processed, and objects from Unity are added to them to produce the AR output. However, because of how the displays and the output are set up, the user sees the surroundings from their eyes' perspective (the Unity output is shown on see-through lenses), not from where the camera is mounted.

To explain further, suppose I am adding a box to the room I am currently in. With the system off, the user still sees the room, because the lenses act like glasses; we are not virtually rendering the room, only adding objects to it. When the system is turned on, the only thing the display adds is the box. The room itself is not displayed and looks the same whether the system is on or off; the box is the only AR object being added, not the environment. The input camera is only needed to sense the scene that our AR objects are placed in relation to. The room is still viewed through the glass, not through the input camera.

This creates a problem: even though the camera mounted on your forehead is very close to your eyes, it is still far enough away that the two perspectives differ. The displayed objects end up about half an inch higher and farther to the side than we would like, which makes sense in theory, since that is approximately the distance between your eye and your forehead.

A clear solution seems to be to change what the display treats as its origin at (0, 0, 0), based on the camera input, and shift it so that the new origin everything is centered on would be, say, (-1, -1, 0). Basically, we want to make the system think the camera is sitting directly over your eye instead of on your forehead, where it has to sit so that it does not block your actual eye's view of the environment (which is not being virtually added).

We considered simply changing the coordinates of the object in Unity, but that only works at one specific distance. We want it to keep working if you get closer or farther away, or look from different angles, which is why changing how the display outputs its view seems like the obvious solution to us. We are just not sure how to go about doing that.

Any suggestions on how to accomplish this would be greatly appreciated, as we are not sure how to make these coordinate changes.

Thanks,

Stephanie Johnstone

Maybe try creating an additional camera and parenting it to the ARCamera with an offset equal to the eye-to-camera distance. Then render from the new camera instead of the ARCamera. Basically, the ARCamera GameObject is only used to drive the rotation and position of the new camera (with the offset applied).
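
A minimal Unity C# sketch of that idea (the class name, field names, and offset value are placeholders, not part of the Vuforia API): a second Camera follows the tracked ARCamera with a fixed offset equal to the measured eye-to-camera distance, so the scene is rendered from the eye's viewpoint rather than the forehead camera's.

using UnityEngine;

// Sketch only: attach to a second Camera GameObject.
// "arCamera" should reference the Vuforia ARCamera transform, whose pose is
// driven by tracking. The offset below is a placeholder (~0.5 inch down);
// measure the real displacement from the forehead camera to the user's eye.
public class EyeOffsetCamera : MonoBehaviour
{
    // Tracked ARCamera, used only as the source of position and rotation.
    public Transform arCamera;

    // Eye position relative to the ARCamera, in its local space (meters).
    public Vector3 eyeOffset = new Vector3(0f, -0.013f, 0f);

    void LateUpdate()
    {
        // Copy the tracked orientation, then shift the viewpoint by the offset
        // expressed in the ARCamera's local frame, so virtual objects are
        // projected from the eye rather than from the forehead-mounted camera.
        transform.rotation = arCamera.rotation;
        transform.position = arCamera.position + arCamera.rotation * eyeOffset;
    }
}

If the second camera is instead parented directly under the ARCamera in the hierarchy, setting its local position to the offset achieves the same thing without a script. Either way, make sure only the offset camera renders to the see-through display (for example by disabling or masking the ARCamera's own Camera component), while Vuforia continues to use the ARCamera's pose for tracking.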