Hi,
I recently received a Project Tango prototype and I would like to use its understanding of the device's position in my projects.
My idea was to initialize the 3D scene with Vuforia using a marker or something similar, then place the 3D objects at the correct positions, and finally activate the Tango prefabs to track the user's movement.
However, the two SDKs don't work well together. The app does not crash, but tracking by the Tango SDK does not work properly if Tango is activated after Vuforia.
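For reference, here is roughly the hand-off I have in mind, as a Unity C# sketch. The script name and fields (VuforiaToTangoHandoff, marker, tangoApplication) are my own, and I'm assuming the usual Vuforia ITrackableEventHandler callback and the Tango Unity SDK's TangoApplication.Startup(); it also assumes the Tango permissions were already requested and granted earlier (via ITangoLifecycle), so treat it as an approximation rather than working code:

```csharp
using UnityEngine;
using Vuforia;  // Vuforia Unity extension
using Tango;    // Tango Unity SDK

// Sketch of the hand-off: wait for the Vuforia marker, anchor the scene,
// then shut Vuforia down before starting Tango motion tracking.
public class VuforiaToTangoHandoff : MonoBehaviour, ITrackableEventHandler
{
    public TrackableBehaviour marker;          // the Vuforia image target
    public TangoApplication tangoApplication;  // the Tango Manager in the scene

    void Start()
    {
        marker.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        if (newStatus == TrackableBehaviour.Status.TRACKED ||
            newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED)
        {
            // 1. The marker pose is known here, so the 3D objects get placed
            //    relative to marker.transform before Vuforia is torn down.

            // 2. Stop Vuforia completely so it releases the camera.
            ObjectTracker tracker = TrackerManager.Instance.GetTracker<ObjectTracker>();
            if (tracker != null) tracker.Stop();
            CameraDevice.Instance.Stop();
            VuforiaBehaviour.Instance.enabled = false;

            // 3. Only now bring up Tango motion tracking (assumes permissions
            //    were granted earlier; null = no area description file).
            tangoApplication.Startup(null);
        }
    }
}
```

The intent behind the ordering is that Vuforia fully releases the camera before Tango connects, since presumably both SDKs want exclusive access to the camera stack.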
Has anyone successfully combined the two SDKs in one project?
Is it even possible?
Just bumping this thread because of the announcement of the ZenFone AR. We now have two Tango consumer devices, so it would be great to see Vuforia support for them. As the HoloLens support shows, there are many great use cases for having the environment mapped alongside traditional AR target tracking at the same time.