"We offer new support options and therefor the forums are now in read-only mode! Please check out our Support Center for more information." - Vuforia Engine Team

Screenshot from camera takes too much time

Hi everyone,

I'm using the Vuforia SDK for Android and it works perfectly. I can take a screenshot of the GLSurfaceView (for image analysis with OpenCV algorithms), and it works fine on devices with small screens. I've plugged the following into my GLSurfaceView.Renderer:

    DisplayMetrics dm = new DisplayMetrics();
    mActivity.getWindowManager().getDefaultDisplay().getMetrics(dm);
    int w = dm.widthPixels;
    int h = dm.heightPixels;
    int b[] = new int[w * h];
    int bt[] = new int[w * h];
    IntBuffer ib = IntBuffer.wrap(b);
    ib.position(0);
    GLES20.glReadPixels(0, 0, w, h, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, ib);
    // The OpenGL pixel layout (RGBA, bottom-up) is incompatible with the Android
    // bitmap layout (ARGB, top-down), so swap red/blue and flip vertically.
    for (int i = 0, k = 0; i < h; i++, k++) {
        for (int j = 0; j < w; j++) {
            int pix = b[i * w + j];
            int pb = (pix >> 16) & 0xff;
            int pr = (pix << 16) & 0x00ff0000;
            int pix1 = (pix & 0xff00ff00) | pr | pb;
            bt[(h - k - 1) * w + j] = pix1;
        }
    }

    Bitmap screenshot = Bitmap.createBitmap(bt, w, h, Bitmap.Config.ARGB_8888);
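As an aside, it seems the per-pixel red/blue swap might be avoidable: ARGB_8888 bitmaps store their pixels as RGBA bytes in memory, so the glReadPixels output can be copied in directly with copyPixelsFromBuffer and only the vertical flip remains. A rough sketch of that shortcut (same w and h as above, untested in my project):

    // sketch: read straight into a direct buffer and copy it into the bitmap
    ByteBuffer pixels = ByteBuffer.allocateDirect(w * h * 4).order(ByteOrder.nativeOrder());
    GLES20.glReadPixels(0, 0, w, h, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixels);
    pixels.rewind();

    Bitmap raw = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
    raw.copyPixelsFromBuffer(pixels); // RGBA bytes match the bitmap's memory layout

    // OpenGL delivers the image bottom-up, so flip it vertically
    Matrix flip = new Matrix();
    flip.preScale(1f, -1f);
    Bitmap flipped = Bitmap.createBitmap(raw, 0, 0, w, h, flip, false);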

 

However, with high-resolution screens and cameras this causes a 20ms freeze in the camera rendering. I've tried to make this call asynchronously, but then I obtain a black screenshot. I also tried to copy the camera frame into an offscreen buffer and read the pixels from there, but the freeze remains:

    final int frameIdIndex = 0, renderIdIndex = 1, textureIdIndex = 2;
    final int[] bufferId = new int[3];

    // generate the framebuffer, renderbuffer and texture names
    GLES20.glGenFramebuffers(1, bufferId, frameIdIndex);
    GLES20.glGenRenderbuffers(1, bufferId, renderIdIndex);
    GLES20.glGenTextures(1, bufferId, textureIdIndex);

    // allocate the color texture (same size as the render target)
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, bufferId[textureIdIndex]);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
            GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);

    // bind the renderbuffer and create a 16-bit depth buffer
    // (width and height of the renderbuffer = width and height of the texture)
    GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, bufferId[renderIdIndex]);
    GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, width, height);

    // bind the framebuffer
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, bufferId[frameIdIndex]);

    // specify the texture as color attachment
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
            GLES20.GL_TEXTURE_2D, bufferId[textureIdIndex], 0);

    // specify the renderbuffer as depth attachment
    GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT,
            GLES20.GL_RENDERBUFFER, bufferId[renderIdIndex]);

    // check that the framebuffer is complete
    int status = GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER);
    if (status != GLES20.GL_FRAMEBUFFER_COMPLETE) {
        throw new RuntimeException("status:" + status + ", hex:" + Integer.toHexString(status));
    }

    // (the camera background is rendered into this framebuffer here, before the read-back)

    final int screenshotSize = width * height;
    final ByteBuffer bb = ByteBuffer.allocateDirect(screenshotSize * 4);
    bb.order(ByteOrder.nativeOrder());
    GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, bb);
    int pixelsBuffer[] = new int[screenshotSize];
    bb.asIntBuffer().get(pixelsBuffer);
    final Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.RGB_565);
    // negative stride plus the end-of-buffer offset flips the image vertically
    bitmap.setPixels(pixelsBuffer, screenshotSize - width, -width, 0, 0, width, height);
    pixelsBuffer = null;

    short sBuffer[] = new short[screenshotSize];
    ShortBuffer sb = ShortBuffer.wrap(sBuffer);
    bitmap.copyPixelsToBuffer(sb);

    // make the bitmap created from the OpenGL pixels compatible with the
    // Android RGB_565 layout (swap the red and blue channels)
    for (int i = 0; i < screenshotSize; ++i) {
        short v = sBuffer[i];
        sBuffer[i] = (short) (((v & 0x1f) << 11) | (v & 0x7e0) | ((v & 0xf800) >> 11));
    }
    sb.rewind();
    bitmap.copyPixelsFromBuffer(sb);

    // cleanup
    GLES20.glDeleteRenderbuffers(1, bufferId, renderIdIndex);
    GLES20.glDeleteFramebuffers(1, bufferId, frameIdIndex);
    GLES20.glDeleteTextures(1, bufferId, textureIdIndex);
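Another direction that might avoid the stall entirely is to do the read-back through a pixel buffer object, so that glReadPixels returns immediately and the pixels are mapped a frame or two later. This is only a rough sketch of the idea, assuming an OpenGL ES 3.0 context (pboId is just an illustrative name, and the result still needs the vertical flip from above):

    // setup, done once on the GL thread (assumes an OpenGL ES 3.0 context)
    int[] pboId = new int[1];
    GLES30.glGenBuffers(1, pboId, 0);
    GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pboId[0]);
    GLES30.glBufferData(GLES30.GL_PIXEL_PACK_BUFFER, width * height * 4, null, GLES30.GL_STREAM_READ);
    GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);

    // frame N: start the transfer; with a pack buffer bound, glReadPixels
    // takes a byte offset instead of a client buffer and returns immediately
    GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pboId[0]);
    GLES30.glReadPixels(0, 0, width, height, GLES30.GL_RGBA, GLES30.GL_UNSIGNED_BYTE, 0);
    GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);

    // frame N+1 (or later): map the buffer and copy the pixels out
    GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pboId[0]);
    ByteBuffer mapped = (ByteBuffer) GLES30.glMapBufferRange(
            GLES30.GL_PIXEL_PACK_BUFFER, 0, width * height * 4, GLES30.GL_MAP_READ_BIT);
    Bitmap shot = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    shot.copyPixelsFromBuffer(mapped); // still upside down, flip as before
    GLES30.glUnmapBuffer(GLES30.GL_PIXEL_PACK_BUFFER);
    GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, 0);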

I've also tried registering a preview callback on the Android camera, but then the GLSurfaceView no longer displays the camera feed:

    mCamera.setOneShotPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] bytes, Camera camera) {
            Bitmap screenshot = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
            screenshotTaken(screenshot);
        }
    });
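For reference, even if the callback did not break the Vuforia rendering, the preview bytes arrive as NV21 rather than as a compressed image, so decodeByteArray alone cannot decode them; they would have to go through YuvImage first, roughly like this (untested in this setup):

    // sketch: convert the NV21 preview frame to a Bitmap via an in-memory JPEG
    Camera.Size size = camera.getParameters().getPreviewSize();
    YuvImage yuv = new YuvImage(bytes, ImageFormat.NV21, size.width, size.height, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, size.width, size.height), 90, out);
    byte[] jpeg = out.toByteArray();
    Bitmap screenshot = BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);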

I've also tried to modify the surface holder with mGLView.getHolder().setFixedSize(width, height), but then it "zooms in" on my image instead of scaling it. I then tried to use mGLView.setScaleX() and setScaleY() to compensate, but without success.

I also tried Renderer.getInstance().setVideoBackgroundConfig(config) to change the resolution, but then I got a tiny video area that I was unable to scale.

 

Does anyone know how to take a screenshot asynchronously without causing a performance loss?

Best regards