I've been working on a project for a while, and what I ultimately have is a Minority Report-style setup with AR screens in front of me. I'm using a Unity plugin called Lean Touch, which lets me tap the mobile screen to select 3D objects. I can then move them around and scale them, which is great.
There's an undesired side effect, though. In a different situation it would be fantastic, but in this case it's hurting my head. When I trigger tracking with a user-defined target, the screen appears in front of me with the real-world camera feed behind it. I can then tap an item and drag on my mobile device to interact with it. The problem is that if someone walks past in the camera's background, the tracking 'pushes' my item off the screen as they go by.
In a different setup I could take advantage of this: I can wave my hand behind the camera and interact with the item. Unfortunately, all that does is push it from side to side, not scale it or anything else, and it's making the experience problematic. So I'd like help figuring out how to stop tracking my objects' positions when the camera background changes, leaving my objects on screen until I touch the screen to interact.
I'm not sure exactly what needs turning on or off: whether it's a Vuforia setting, a Lean Touch/Unity setting, or something to add such as a Rigidbody. I need directional advice on which way to go, as I'm going in circles at the moment.
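For what it's worth, the closest thing I can imagine is detaching the content from the target once it's first found, so that later pose updates from tracking no longer move it. Here's an untested sketch of that idea against Vuforia's `ITrackableEventHandler` interface (the `DetachOnFound` name and the `content` field are just mine, not anything from the docs), in case it helps show what I'm after:

```csharp
using UnityEngine;
using Vuforia;

// Untested sketch: once the user-defined target is first detected,
// detach the content from the target so later tracking updates
// (e.g. someone walking past the camera) no longer move it.
public class DetachOnFound : MonoBehaviour, ITrackableEventHandler
{
    public Transform content; // the AR screen/model parented under the target

    private TrackableBehaviour trackable;
    private bool detached;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        if (detached || content == null)
            return;

        if (newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED)
        {
            // Freeze the content at its current world pose and stop
            // inheriting further pose updates from the target.
            content.SetParent(null, true);
            detached = true;
        }
    }
}
```

I don't know if reparenting like this is the right layer to do it at, or if there's a proper Vuforia option (extended tracking, world centre mode, etc.) that achieves the same thing.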
If I wave my hand behind the device or someone walks past it, the AR elements move.
Ideas would be appreciated.