Hi, I'm having a problem with Unity's Render Textures.
I have a working scene with an AR Camera (camera1); everything works fine here, both in the Unity Player and on Android devices.
But then I add a second camera (camera2) that outputs to a Render Texture. It works just fine in the Unity Player, but when I try it on an Android device, the background (the smartphone's video feed) is all black. The augmentation still works fine... but here comes the madness:
I add a cube to which I applied the Render Texture. Instead of showing camera2's output, the cube shows a crop of the Android camera's video feed. Now I add a third camera (camera3) that also outputs to a new Render Texture, and apply that new Render Texture to a new cube. That texture and cube work fine, showing camera3's output. So why is the first Render Texture intercepting the background video feed instead?
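For reference, camera2 and the cube are wired up roughly like this (a minimal sketch using standard Unity APIs; the texture size, field names, and class name are placeholders, not the actual project values):

```csharp
using UnityEngine;

// Minimal sketch of the camera2 + Render Texture setup described above.
public class RenderTextureSetup : MonoBehaviour
{
    public Camera camera2;         // the second camera
    public Renderer cubeRenderer;  // the cube that should display camera2's view

    void Start()
    {
        // Create a Render Texture and have camera2 render into it
        // instead of the screen.
        var rt = new RenderTexture(512, 512, 16);
        camera2.targetTexture = rt;

        // Display that texture on the cube's material.
        cubeRenderer.material.mainTexture = rt;
    }
}
```

This is the straightforward `Camera.targetTexture` approach; on the device, it is this first Render Texture that ends up showing the cropped video feed.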
Can someone give detailed insight into this issue?
As a workaround, I suppose I could use Vuforia's Background Texture Access sample to re-establish the background video feed, but what should I do with the first Render Texture that always fails? Create more than one and simply not use the first one at all? But why does that happen?
To be clear:
1) The first bug is that the video feed in the background gets lost.
2) The second bug is that the first Unity camera that outputs to a Render Texture captures a crop of the video feed instead of that Unity camera's output.
Any hints on why this happens, and how to work around it?