"We offer new support options and therefor the forums are now in read-only mode! Please check out our Support Center for more information." - Vuforia Engine Team

AR Web JavaScript API - Open Sockets - Concept

Hi,

I've built apps with Vuforia + Unity + iOS and it's worked out well, but now I have a concept for supporting AR in the mobile web browser via a JavaScript API...

I've built an open-socket server that receives camera image data captured with JavaScript's getUserMedia (along with device orientation and compass data) and responds with JSON. The web application then uses that response to render the experience in real time. In a private demo I used C++ on the server to detect faces in the camera frames and return their size and coordinates, which are overlaid on the HTML camera <video> preview. It works fine, with a slight delay as expected.
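For anyone picturing the browser side of this setup, here is a minimal sketch of how I stream frames and orientation data and draw the returned box. It assumes the open socket is a browser WebSocket at a hypothetical endpoint (wss://example.com/ar), and assumes the server replies with a JSON object shaped like { x, y, width, height }; adjust both to your own server.

```javascript
// Browser side: stream camera frames + orientation to the socket server,
// then position an overlay <div> using the JSON coordinates it returns.
// Endpoint URL and message shape are assumptions for illustration.
const video = document.querySelector('video');
const overlay = document.getElementById('overlay'); // absolutely positioned <div>
const canvas = document.createElement('canvas');
const ws = new WebSocket('wss://example.com/ar');

// Start the camera preview from getUserMedia.
navigator.mediaDevices.getUserMedia({ video: { facingMode: 'environment' } })
  .then((stream) => {
    video.srcObject = stream;
    return video.play();
  })
  .then(() => {
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
    sendFrame();
  });

// Grab the current frame, compress it to JPEG, and push it down the socket.
function sendFrame() {
  const ctx = canvas.getContext('2d');
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  canvas.toBlob((blob) => {
    if (ws.readyState === WebSocket.OPEN) ws.send(blob);
    requestAnimationFrame(sendFrame); // throttle this for slower networks
  }, 'image/jpeg', 0.6);
}

// Forward device orientation (compass heading comes from alpha on most devices).
window.addEventListener('deviceorientation', (e) => {
  if (ws.readyState === WebSocket.OPEN) {
    ws.send(JSON.stringify({ type: 'orientation', alpha: e.alpha, beta: e.beta, gamma: e.gamma }));
  }
});

// The server replies with detection coordinates; place the overlay box
// on top of the <video> preview.
ws.onmessage = (event) => {
  const box = JSON.parse(event.data); // assumed shape: { x, y, width, height }
  overlay.style.left = `${box.x}px`;
  overlay.style.top = `${box.y}px`;
  overlay.style.width = `${box.width}px`;
  overlay.style.height = `${box.height}px`;
};
```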

The concept is to create an AR processor that returns ground plane and/or anchor data, or marker coordinates, from the web service instead of face coordinates, and to display the AR experience over the camera <video> preview via three.js. The reason I'm here is to find out whether this would be possible using Vuforia installed on a Linux CentOS web server as the AR processor, and if so, to get any advice on setting it up or making it possible. Would I use a Unity OpenCV export? How would I feed the camera data into Vuforia, etc.? Any and all advice is appreciated. Note that I'm aware of the lag a slow network will cause, and that's not really what I'm troubleshooting. That lag may end up being the reason not to pursue the project, but I want a functioning demo that tells me that.
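To make the rendering half concrete, here is a sketch of the three.js side, independent of whatever produces the tracking data on the server. It assumes a hypothetical JSON message of the form { type: "anchor", position: [x, y, z], quaternion: [x, y, z, w] } arriving on the same WebSocket, and layers a transparent WebGL canvas over the <video> preview.

```javascript
// Rendering side: apply a server-reported pose to a three.js object drawn
// over the camera preview. Message format and endpoint are assumptions.
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer({ alpha: true }); // transparent over <video>
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.domElement.style.position = 'absolute';
renderer.domElement.style.top = '0';
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.01, 100);
scene.add(new THREE.HemisphereLight(0xffffff, 0x444444, 1));

// Placeholder content anchored to whatever pose the server reports.
const anchor = new THREE.Group();
anchor.add(new THREE.Mesh(
  new THREE.BoxGeometry(0.1, 0.1, 0.1),
  new THREE.MeshStandardMaterial({ color: 0x44aa88 })
));
anchor.visible = false;
scene.add(anchor);

// Same socket used to stream camera frames in the previous sketch.
const ws = new WebSocket('wss://example.com/ar');
ws.onmessage = (event) => {
  const msg = JSON.parse(event.data);
  if (msg.type === 'anchor') {
    anchor.position.fromArray(msg.position);
    anchor.quaternion.fromArray(msg.quaternion);
    anchor.visible = true;
  }
};

renderer.setAnimationLoop(() => renderer.render(scene, camera));
```

The open question stays the same either way: whether the server-side processor can be Vuforia on CentOS at all, or whether it has to be something else that produces equivalent pose data.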

Thanks!

-Joe

leszczynskim

Fri, 12/13/2019 - 11:58

Hi,

I would like to develop almost the same web AR app as you have described. Did you manage to create the AR processor running on a CentOS web server? How about the lag?


Cheers,

Mikolaj