First of all, apologies in case I'm posting this in the wrong section of the forum.
We are developing a retail augmented reality application that relies heavily on extended tracking to let users position 3D objects in a room at life-size scale. (The experience starts from a predefined image target.)
The application targets iPads running iOS 7 only, and uses the Unity extension.
I would like to know if there are any best practices for using the extended tracking feature in the most reliable and robust way possible. In particular:
- I noticed drift in camera tracking and scale when moving the device away from the image target; is there a way to reduce this?
- If the app includes high-poly objects, will they significantly affect tracking performance?
- In your opinion, is the "persistent tracking" feature a good option in a scenario where the user moves considerably away from the image target, and even pans the device around the room through ~360 degrees?
- Do the camera focus and quality settings affect extended tracking?
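
For context on the last point, this is roughly how we set the focus mode at the moment. This is only a sketch: the `Vuforia` namespace and the exact place to call `SetFocusMode` may differ depending on the SDK version (older Qualcomm-era releases expose these classes without the namespace, and the call may need to wait until Vuforia has initialized):

```csharp
using UnityEngine;
using Vuforia; // namespace may differ in older SDK versions

public class FocusModeSetter : MonoBehaviour
{
    void Start()
    {
        // Continuous autofocus keeps the camera frames sharp,
        // which we assume helps both detection and extended tracking.
        CameraDevice.Instance.SetFocusMode(
            CameraDevice.FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
    }
}
```

Is continuous autofocus the right choice here, or can refocusing events themselves disturb extended tracking?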
Thank you very much,