Hi,
I'm trying to integrate the Vuforia SDK with our existing Android app. Our app is a large native app with minimal Java code. My approach has been to try to get the camera feed rendering over the top of our GL clear calls by integrating the relevant changes from either the ImageTargets sample or the guide found here...
https://developer.vuforia.com/resources/dev-guide/android-native-activities
At this point I think I've made all the required changes, but while the sample app works, my app, though it appears to run correctly, renders nothing more than it usually would.
In brief, the things I've done are:
- Modified our build process so that libQCAR.so is packaged with the APK.
- Loaded libQCAR.so inside a static block of Java code.
- Called QCAR.setInitParameters and QCAR.init, using the GL_20 flag; both calls succeed.
- I've added hooks for onPause and onResume and forwarded to QCAR.
- I've added a Java method named onGLInitialized, called from native code after the window and GL context have been created, and from there have called QCAR.onSurfaceCreated and QCAR.onSurfaceChanged.
- I've added a block of native code to initialize and start the camera, called in the middle of our app's initialisation. This appears to complete without error.
- I've added a call to drawVideoBackground, called once per frame inside begin/end calls on the QCAR renderer instance.
- I've also added manifest permissions for the camera.
I was expecting at this point I'd see the camera feed rendering behind my own 3D graphics, but instead I just see the clear colour.
I'm not sure what I could be missing. I can't see much else relevant in the sample code. I can provide more details on any of the changes I've made if that helps but otherwise if anyone could point out what I'm missing I'd be very appreciative!
Thanks,
Dave.
Android NativeActivity Integration
Hi Dave,
Have you started the ImageTracker as well, right after starting the camera (as shown in our samples)?
Also, could you share the relevant piece of OpenGL code in which you call drawVideoBackground and your custom rendering code ?
Hi,
Thanks for getting back to me so quickly.
The code we use to draw the video background is simply...
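For reference, the basic per-frame pattern from the ImageTargets sample looks roughly like this. This is a sketch only, assuming the native QCAR headers are available; the actual code in question may differ:

```cpp
#include <QCAR/Renderer.h>

// Called once per frame from the app's render loop.
void renderFrame()
{
    // begin() returns the current tracking state (unused here).
    QCAR::State state = QCAR::Renderer::getInstance().begin();

    // Draw the camera feed as the background.
    QCAR::Renderer::getInstance().drawVideoBackground();

    // ... the application's own 3D rendering would go here ...

    QCAR::Renderer::getInstance().end();
}
```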
OK, thanks for the clarifications. So if you are merely trying to show the video, the tracker is indeed not needed; we can rule out a tracker issue.
Another important element for rendering the video feed properly is calling the function configureVideoBackground().
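For anyone following along, here is a sketch of what configureVideoBackground() does in the ImageTargets sample. The screen dimensions are assumed to be passed in, and the portrait-orientation branch of the sample is omitted:

```cpp
#include <QCAR/CameraDevice.h>
#include <QCAR/Renderer.h>
#include <QCAR/VideoBackgroundConfig.h>

// Tell the renderer how to map the camera video mode onto the screen.
void configureVideoBackground(int screenWidth, int screenHeight)
{
    QCAR::CameraDevice& cameraDevice = QCAR::CameraDevice::getInstance();
    QCAR::VideoMode videoMode =
        cameraDevice.getVideoMode(QCAR::CameraDevice::MODE_DEFAULT);

    QCAR::VideoBackgroundConfig config;
    config.mEnabled = true;
    config.mSynchronous = true;
    config.mPosition.data[0] = 0;
    config.mPosition.data[1] = 0;

    // Scale the video to fill the screen width, preserving aspect ratio.
    config.mSize.data[0] = screenWidth;
    config.mSize.data[1] =
        videoMode.mHeight * (screenWidth / (float)videoMode.mWidth);

    QCAR::Renderer::getInstance().setVideoBackgroundConfig(config);
}
```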
We actually initialize the camera and configure the video texture in one block of code.
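A sketch of that kind of combined camera start-up block, assuming a configureVideoBackground helper as discussed (error handling trimmed):

```cpp
#include <QCAR/CameraDevice.h>

// Initialise the camera, pick a video mode, configure the video
// background, then start capturing.
bool startCamera(int screenWidth, int screenHeight)
{
    if (!QCAR::CameraDevice::getInstance().init())
        return false;

    if (!QCAR::CameraDevice::getInstance().selectVideoMode(
            QCAR::CameraDevice::MODE_DEFAULT))
        return false;

    // Maps the selected video mode onto the screen.
    configureVideoBackground(screenWidth, screenHeight);

    return QCAR::CameraDevice::getInstance().start();
}
```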
OK, the code you posted looks correct;
however, to be sure that the video background is configured properly, you also need to configure the video background in the onSurfaceChanged method (right before calling QCAR.onSurfaceChanged(width, height);...):
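For a NativeActivity-style app, the equivalent hook on the native side might look like the sketch below. This assumes the native QCAR::onSurfaceChanged entry point (the C++ counterpart of the Java call) and the configureVideoBackground helper discussed earlier:

```cpp
#include <QCAR/QCAR.h>

// Called whenever the surface dimensions change.
void handleSurfaceChanged(int width, int height)
{
    // Reconfigure the video background for the new surface size...
    configureVideoBackground(width, height);

    // ...then notify QCAR of the new dimensions.
    QCAR::onSurfaceChanged(width, height);
}
```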
This is perhaps where things differ a little between our code and the samples.
We don't call into GL code anywhere from our Java code. We have only one class, derived from NativeActivity.
OK, I think what you are doing here is correct.
I tried removing the configureVideoBackground() call from onSurfaceChanged() in the Java samples, and the only side effect is that the texture appears stretched; it is still visible and well centred on the screen.
Hi,
I've done that now.
It would seem I do get frames of data back. The image appears to be 720x480 and, interestingly, seems to be flagged as GREYSCALE.
Does that sound right?
Thanks,
Dave.
Hi,
I should add that I have not inspected the contents of the buffer that is returned, only the dimensions and format.
Thanks,
Dave.
OK, thanks a lot for running the test.
720x480 sounds about right, but please check that videoMode.mWidth and videoMode.mHeight (in the video background configuration code you showed previously) actually match those values.
getVideoMode() does return 720x480, so that seems OK.
OK, let me try the getImage as well and get back to you; you might need to explicitly assign a frame format ...
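One way to check the returned frames is from a QCAR update callback; a sketch, assuming the standard native headers:

```cpp
#include <android/log.h>
#include <QCAR/QCAR.h>
#include <QCAR/UpdateCallback.h>
#include <QCAR/State.h>
#include <QCAR/Frame.h>
#include <QCAR/Image.h>

// Logs the size and pixel format of every image in each camera frame.
class FrameInspector : public QCAR::UpdateCallback
{
    virtual void QCAR_onUpdate(QCAR::State& state)
    {
        QCAR::Frame frame = state.getFrame();
        for (int i = 0; i < frame.getNumImages(); ++i)
        {
            const QCAR::Image* image = frame.getImage(i);
            __android_log_print(ANDROID_LOG_INFO, "QCAR",
                "image %d: %dx%d, format %d",
                i, image->getWidth(), image->getHeight(),
                (int)image->getFormat());
        }
    }
};

// Registered once, e.g. after QCAR initialisation has completed:
//   static FrameInspector inspector;
//   QCAR::registerCallback(&inspector);
```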
Just so you know, I have tried calling QCAR::setFrameFormat(QCAR::RGB565, true) ahead of configuring the video background texture, but this doesn't seem to have made a difference.
OK
1. Try setting a different frame format, for instance RGB888 instead of RGB565, as some devices support one but not the other; your device may support RGB888 but not RGB565.
It looks like I was calling setFrameFormat too early, and it was actually failing (returning false). Replacing that single call with calls both before and after starting the camera has made a difference.
I now get 2 formats returned in the callback.
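The timing issue above suggests checking the return value of setFrameFormat and issuing the request once the camera is running; a sketch:

```cpp
#include <QCAR/CameraDevice.h>
#include <QCAR/QCAR.h>

// Start the camera, then request an explicit frame format.
bool startCameraWithFrameFormat()
{
    if (!QCAR::CameraDevice::getInstance().start())
        return false;

    // setFrameFormat can return false if called too early, so request
    // the format after the camera has started and check the result.
    if (!QCAR::setFrameFormat(QCAR::RGB565, true))
    {
        // The device may not support this format; RGB888 could be
        // tried as a fallback here.
        return false;
    }
    return true;
}
```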
All right. So, we have made some nice progress.
Now the QCAR camera is getting you the right pixels, which solves half of the problem.
Yes, it feels like we are nearly there. In answer to your questions...
OK, so, yes, please check that you are not calling glClear in more than one place, as this could be an issue (you only need to call glClear() right before drawing the video background).
I think we will struggle to make this fit the sequence you describe. From our perspective the QCAR call is the custom rendering code, and we are looking for the correct place to insert it into our rendering pipeline, rather than the other way around.
I see. Let me check with our team what OpenGL state issues arise from putting code after the renderer.end() call, and get back to you.
Meanwhile, just for the sake of testing, would it be possible for you to try the sequence I describe, but without calling your own rendering code?
I ran a few more tests and double-checked with my team, and it appears you can also call your custom rendering code AFTER the renderer.end(), i.e., you can try this:
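A sketch of that sequence (renderMyScene is a hypothetical stand-in for the application's own rendering code):

```cpp
#include <GLES2/gl2.h>
#include <QCAR/Renderer.h>

void renderFrame()
{
    // Clear once, right before drawing the video background.
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    QCAR::Renderer::getInstance().begin();
    QCAR::Renderer::getInstance().drawVideoBackground();
    QCAR::Renderer::getInstance().end();

    // Custom application rendering after end(). Note that QCAR may
    // leave some GL state changed (e.g. depth test, culling, bound
    // textures), so reset any state your renderer depends on first.
    renderMyScene();  // hypothetical application call
}
```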