We are developing an app for a client that will run on the BT200.
The app shows some content on a marker, but additional content is rendered at the same time by a second camera in the Unity scene. That second camera is driven by the orientation sensors, so its content keeps being shown as you look around, even when the marker is out of tracking.
To achieve this we are trying to align the second camera with the AR camera.
The first thing we do is copy the AR camera's projection matrix to the secondary camera (for both the left and the right eye).
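Here is roughly what that copy looks like (a minimal sketch; the camera references and script setup are placeholders for our rig, and in the stereo case we do this once per eye):

```csharp
using UnityEngine;

// Keeps the sensor-driven camera's frustum identical to the AR camera's.
public class CopyProjectionMatrix : MonoBehaviour
{
    public Camera arCamera;      // Vuforia-driven AR camera (one eye)
    public Camera sensorCamera;  // our orientation-sensor-driven camera

    void LateUpdate()
    {
        // Copy the (possibly off-center) projection so both cameras
        // render with the same frustum.
        sensorCamera.projectionMatrix = arCamera.projectionMatrix;
    }
}
```

With this in place both cameras render content at the same proportions and screen position, as long as their poses also agree.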
Everything seems to be rendered fine: the secondary camera is controlled by the orientation sensors and renders the same thing as the AR camera, at the correct proportions and position. However, as soon as you roll your head, the AR camera's rendering stays correct, but the second camera's rendering gets offset up or down, depending on which way you roll.
Our theory is that the projection matrix is offset horizontally from center to match the physical construction of the glasses, where the web camera sits to the right of the displays. When you roll your head, we can see that the AR camera is not only rotated but also has its position adjusted up and down to compensate for this offset, keeping the scene content correctly registered on top of the real world.
There is probably also a calibration profile mixed in. We would like to understand exactly how this is done so we can align our camera with it as well.
We have not found many details about this. Can someone confirm our assumption and shed some more light on it? It would save us a ton of time.