
Unity UI for AR Glasses? (Stereoscopic)

April 26, 2017 - 4:31am #1

I have only been playing around with a pair of R7s for a day or two now, so forgive me if this has been answered or is an obvious question.



How do you use the Unity UI system with AR glasses that require stereoscopic cameras? I require tracking in my apps, so I have to use stereoscopic cameras for the tracked 3D object, but I also need a UI. The Unity UI system does not appear to play nice with the ARCamera: it's either not visible at all, or it's spread across the two views so each eye is given half of the UI, which is not a good result. Is there any way I can overlay the same UI on both cameras so the user can see and focus on the UI as well as the 3D object? Or is there never supposed to be a UI while tracking an object (which seems like it can't be true)?

Let me know what I'm missing about creating UIs for glasses. Also, I have tried SetDisplayExtended(false) and true, and the UI does then look correct, but of course there are then two 3D objects rendered, so it seems like you only get one or the other.

Thanks

Unity UI for AR Glasses? (Stereoscopic)

July 31, 2017 - 5:49am #3

Hi, I'm not a programmer, but I'm trying to make a small demo for the Moverio BT-200. I used the Unity UI in stereo by setting it to a world-space canvas, scaling it down by a factor of roughly 0.002, and fixing it to the ARCamera. It looks and works fine in the editor; a rough sketch of that setup is below.

The problem was that when I built it for Android, I realized a world-space canvas needs a world-space cursor so it can be seen in stereo. So I used the Unity plugin for a world cursor. That works fine, but it clashes with the default cursor on the Moverio, and I'm unable to hide or manipulate the system/hardware cursor on Android.

Is there a workaround for that?
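For what it's worth, here is a minimal sketch of the world-space canvas setup described above. This is my own illustration, not the poster's actual project: the class and field names are hypothetical, and the 0.002 scale and the distance in front of the camera are just example values.

```csharp
using UnityEngine;

// Hypothetical helper: pin a world-space canvas in front of the AR camera so
// the same UI is drawn into both eye views.
public class WorldSpaceHudSetup : MonoBehaviour
{
    public Camera arCamera;  // the stereo ARCamera in the scene
    public Canvas uiCanvas;  // the UI canvas to pin in front of the user

    void Start()
    {
        // Render the canvas in world space instead of screen space.
        uiCanvas.renderMode = RenderMode.WorldSpace;

        // Parent it to the camera and push it out in front of the user.
        uiCanvas.transform.SetParent(arCamera.transform, false);
        uiCanvas.transform.localPosition = new Vector3(0f, 0f, 1.5f);
        uiCanvas.transform.localRotation = Quaternion.identity;

        // Canvas units are pixels, so scale it down to a sensible real-world
        // size (the ~0.002 factor mentioned above).
        uiCanvas.transform.localScale = Vector3.one * 0.002f;
    }
}
```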


Unity UI for AR Glasses? (Stereoscopic)

April 29, 2017 - 2:42am #2

I hope that if anyone has any insights or ideas on this problem, they post them!

 

I actually did find a wacky approach that seems to be working. I hate how janky it is, but it might be the only thing that works:



--Make a new camera that renders to a render texture. Make the actual UI visible only to this camera and nothing else (I think I also set the original UI canvas to Screen Space - Camera, pointing at the camera on the ARCamera), and set the new camera's clear flags to Depth Only so the render texture is transparent anywhere there isn't UI.



--Now create a second UI Canvas. It should contain only two UI objects: two Raw Image objects. Each of them should be set to stretch and cover exactly half of the canvas, one on the left and one on the right.



--You guessed it: use the render texture from step 1 as the texture for both Raw Image objects, so the original UI is rendered on each half of the screen. This second canvas can be set to Screen Space - Overlay, and the two halves each show the original UI, which looks correct in the glasses (the original UI appears to both eyes).



It worked for me in my first basic test, but I've tried a bunch of janky stuff like this and sometimes it doesn't work in every situation. Also, I knew the exact resolution of the AR glasses' screens (1280x720) beforehand, so of course I set that up as the render texture's resolution. A rough code sketch of the setup is below.
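To make the steps concrete, here is a minimal sketch of the approach. This is my own reconstruction under the assumptions above, not the actual project: the class and field names are hypothetical, the second canvas with the two Raw Images is assumed to already exist in the scene, and 1280x720 is just the panel resolution mentioned above.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical helper: render the UI into a transparent render texture (step 1)
// and feed that texture to two RawImages that each cover half of an overlay
// canvas (steps 2 and 3), so both eyes see the same UI.
public class StereoUiDuplicator : MonoBehaviour
{
    public Camera uiCamera;        // extra camera that sees only the UI layer
    public RawImage leftEyeImage;  // RawImage anchored to the left half of the overlay canvas
    public RawImage rightEyeImage; // RawImage anchored to the right half

    RenderTexture uiTexture;

    void Start()
    {
        // Step 1: render the UI into a render texture that stays transparent
        // wherever no UI is drawn (clear flags = Depth Only).
        uiTexture = new RenderTexture(1280, 720, 24);
        uiCamera.targetTexture = uiTexture;
        uiCamera.clearFlags = CameraClearFlags.Depth;
        uiCamera.cullingMask = 1 << LayerMask.NameToLayer("UI");

        // Step 3: show the same texture on both halves of the overlay canvas.
        leftEyeImage.texture = uiTexture;
        rightEyeImage.texture = uiTexture;

        // Anchor each RawImage to its half of the screen (this could also be
        // set up once in the Inspector instead of in code).
        SetHalfScreenAnchors(leftEyeImage.rectTransform, 0f, 0.5f);
        SetHalfScreenAnchors(rightEyeImage.rectTransform, 0.5f, 1f);
    }

    static void SetHalfScreenAnchors(RectTransform rect, float xMin, float xMax)
    {
        rect.anchorMin = new Vector2(xMin, 0f);
        rect.anchorMax = new Vector2(xMax, 1f);
        rect.offsetMin = Vector2.zero;
        rect.offsetMax = Vector2.zero;
    }
}
```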
