Android - How can I capture the AR view

April 3, 2013 - 2:36pm #1

This article describes how to capture the augmented reality view, including both the video background and the augmentation (e.g. 3D models) rendered on top of it, on the Android platform. The Image Targets sample is used as the reference sample.

First, in ImageTargetsRenderer.java, you need to add a couple of methods to:

  • grab pixels from the OpenGL view
  • create a bitmap based on the pixels captured in the previous step
  • save the bitmap onto the device storage

This is the relevant code:


// Required imports: java.io.File, java.io.FileOutputStream, java.nio.IntBuffer,
// android.graphics.Bitmap, android.graphics.Bitmap.CompressFormat,
// android.opengl.GLES20, android.os.Environment, android.util.Log

private void saveScreenShot(int x, int y, int w, int h, String filename) {
        Bitmap bmp = grabPixels(x, y, w, h);
        try {
            String path = Environment.getExternalStorageDirectory() + "/" + filename;
            DebugLog.LOGD(path);

            File file = new File(path);
            file.createNewFile();

            FileOutputStream fos = new FileOutputStream(file);
            bmp.compress(CompressFormat.PNG, 100, fos);

            fos.flush();
            fos.close();

        } catch (Exception e) {
            // Note: e.getStackTrace().toString() would only log the array
            // reference; Log.getStackTraceString() gives a readable trace.
            DebugLog.LOGD(Log.getStackTraceString(e));
        }
    }
 
    private Bitmap grabPixels(int x, int y, int w, int h) {
        // Read all rows from the bottom of the surface up to y + h,
        // so the requested region is fully contained in the buffer:
        int b[] = new int[w * (y + h)];
        int bt[] = new int[w * h];
        IntBuffer ib = IntBuffer.wrap(b);
        ib.position(0);

        GLES20.glReadPixels(x, 0, w, y + h, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, ib);

        // OpenGL returns RGBA pixels with the origin at the bottom-left,
        // while Android's Bitmap expects ARGB with the origin at the top-left:
        // swap the red and blue channels and flip the image vertically.
        for (int i = 0, k = 0; i < h; i++, k++) {
            for (int j = 0; j < w; j++) {
                int pix = b[(y + i) * w + j];       // row y + i of the read-back buffer
                int pb = (pix >> 16) & 0xff;        // blue channel
                int pr = (pix << 16) & 0x00ff0000;  // red channel, shifted into place
                int pix1 = (pix & 0xff00ff00) | pr | pb;
                bt[(h - k - 1) * w + j] = pix1;     // flip the row order
            }
        }

        Bitmap sb = Bitmap.createBitmap(bt, w, h, Bitmap.Config.ARGB_8888);
        return sb;
    }
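As a sanity check of the channel swap: glReadPixels with GL_RGBA / GL_UNSIGNED_BYTE writes the bytes R, G, B, A in memory order, so on a little-endian device each int in the buffer holds A, B, G, R from the high byte down, whereas Android's ARGB_8888 holds A, R, G, B. The following minimal, self-contained sketch (the class and method names are ours, not part of the sample) applies the same per-pixel expression as the loop above and shows that an opaque red GL pixel maps to Android's ARGB red:

```java
// Minimal check of the RGBA -> ARGB channel swap used in grabPixels().
public class ChannelSwapCheck {
    // Same per-pixel expression as in the conversion loop.
    static int swapRedBlue(int pix) {
        int pb = (pix >> 16) & 0xff;          // blue byte of the GL pixel
        int pr = (pix << 16) & 0x00ff0000;    // red byte moved to the ARGB position
        return (pix & 0xff00ff00) | pr | pb;  // alpha and green stay unchanged
    }

    public static void main(String[] args) {
        int glRed = 0xFF0000FF;    // opaque red as read back (bytes A, B, G, R)
        int argbRed = 0xFFFF0000;  // opaque red in Android's ARGB_8888 layout
        System.out.println(swapRedBlue(glRed) == argbRed);  // prints "true"
    }
}
```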

Then, we define two member variables to store the current view width and height:

private int mViewWidth = 0;
private int mViewHeight = 0;
    

The mViewWidth and mViewHeight variables are updated in the onSurfaceChanged() method, as shown in the code below:

public void onSurfaceChanged(GL10 gl, int width, int height)
    {
        DebugLog.LOGD("GLRenderer::onSurfaceChanged");

        // Call native function to update rendering when render surface
        // parameters have changed:
        updateRendering(width, height);

        mViewWidth = width;
        mViewHeight = height;
        
        // Call QCAR function to handle render surface size changes:
        QCAR.onSurfaceChanged(width, height);
    }

Finally, we call the saveScreenShot method from the onDrawFrame() method, right after calling the native renderFrame() function:

public void onDrawFrame(GL10 gl)
    {
        if (!mIsActive)
            return;

        // Update render view (projection matrix and viewport) if needed:
        mActivity.updateRenderView();

        // Call our native function to render content
        renderFrame();

        // Make sure rendering has completed before reading the pixels back:
        GLES20.glFinish();

        if (some_condition) {  // e.g. a flag set when the user requests a capture
            saveScreenShot(0, 0, mViewWidth, mViewHeight, "test.png");
        }
    }
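One way to drive `some_condition` (a sketch with assumed names; the sample does not prescribe this) is a one-shot volatile flag that is set on the UI thread, e.g. from a button or touch handler, and consumed once on the GL thread inside onDrawFrame():

```java
// Sketch of a one-shot capture flag (names are assumptions, not from the sample).
public class CaptureFlagSketch {
    private volatile boolean takeScreenshot = false;
    private int screenshotsTaken = 0;

    // UI thread: request a capture of the next rendered frame.
    public void requestScreenshot() {
        takeScreenshot = true;
    }

    // GL thread: called once per frame, after renderFrame() and glFinish().
    public void onFrameRendered() {
        if (takeScreenshot) {
            takeScreenshot = false; // capture a single frame only
            screenshotsTaken++;     // the real code would call saveScreenShot(...) here
        }
    }

    public int getScreenshotsTaken() { return screenshotsTaken; }
}
```

The volatile keyword ensures the flag written on the UI thread is visible to the GL thread; clearing it immediately guarantees only one frame is saved per request.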

Note: to enable writing to external storage, you need to edit the AndroidManifest.xml and add the following permission:

<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>

 
