Hello,
For my graduation project, I want to experiment with a scenography tool in augmented reality: in short, the goal is to stream a DJ set with all of its scenography rendered in augmented reality.
From what I can see, Area Targets are the most complete approach for my project; however, I have doubts about the performance of such a live capture on an iPhone or iPad. From what I read on the Vuforia portal:
"Some Vuforia Engine features are relying on device tracking poses in addition to camera images. For example, Area Targets require a positional device tracker, and device tracking is strongly recommended for stable Model Target tracking."
Given this, is it possible to use the external camera support to keep a desktop application while still relying on the iPad's ARKit tracking?
Have a nice day!
Hi, thanks for your fast answer!
Okay, I understand. I was just hoping we could still use ARCore/ARKit tracking from the built-in smartphone camera while running a desktop app.
I've got some doubts, for two reasons:
- I may need an additional screen/interface to control VFX Graph effects, lighting effects, and other scenography-related things.
- I want to try some effects I built with VFX Graph and Point Cloud objects, and I'm a bit worried about a laggy live output.
Thanks in advance,
Have a nice day !