Unity - How can I capture a screen shot?

August 10, 2012 - 3:36pm #1
CaptureScreenshot

Use Unity's Application.CaptureScreenshot method to save a screenshot to the file system as a PNG. The screenshot will contain both the camera image and the augmentation, as long as the method is called after the scene has had a chance to render everything. Here's an example of calling it from the OnGUI method:

void OnGUI()
{
  if (GUI.Button(new Rect(20, 20, 150, 100), "CaptureScreenshot")) {
    // Captures the current frame, including the camera image and augmentation
    Application.CaptureScreenshot("Screenshot.png");
  }
}

For this to work on Android, Write Access needs to be set to "External (SDCard)" in the Player Settings. The file will end up in the sdcard/Android/data/<bundle id>/files folder. Make sure the SD card is not mounted on a computer while the app is running, otherwise the file will not be written. On iOS, the file will end up in the application's Documents folder. To access this folder from iTunes you need to add the UIFileSharingEnabled key to your Info.plist file (in the Xcode project Unity creates).
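For reference, the Info.plist entry mentioned above uses standard property-list syntax and goes inside the top-level dict of the Info.plist in the generated Xcode project:

```xml
<key>UIFileSharingEnabled</key>
<true/>
```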

ReadPixels

If you don't need to write the screenshot out as a PNG, it may be preferable to read the screen pixels into a Texture2D object using the Texture2D.ReadPixels method. Here's an example of calling it from the OnGUI method:

private Texture2D m_Texture;

void Start()
{
  // Allocate a texture matching the screen size (RGB, with mipmaps)
  m_Texture = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, true);
}

void OnGUI()
{
  if (GUI.Button(new Rect(20, 20, 150, 100), "ReadPixels")) {
    // Copy the frame buffer into the texture, then upload it to the GPU
    m_Texture.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0, true);
    m_Texture.Apply();
  }
}

Like Application.CaptureScreenshot, this method should capture both the camera image and augmentation, along with anything else rendered to the screen (e.g. UI elements).
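If you later decide you do want the pixels on disk after all, the Texture2D can still be encoded. A minimal sketch (the file name and the use of Application.persistentDataPath are illustrative choices, not part of the original post):

```csharp
// Encode the captured texture to PNG bytes and write them out.
// Texture2D.EncodeToPNG returns the PNG-encoded contents as a byte array.
byte[] pngBytes = m_Texture.EncodeToPNG();
System.IO.File.WriteAllBytes(Application.persistentDataPath + "/Screenshot.png", pngBytes);
```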

Using RenderTextures

Another option for drawing the scene into a texture is a Unity RenderTexture. You can set any camera to render to a texture instead of rendering to the screen (see the Target Texture field in the Inspector). Note that if you also want that content rendered to the screen, you will need a second camera to draw the texture created by the first camera.

Unfortunately, this approach will not render the camera image into the RenderTexture: if you set the ARCamera target to a RenderTexture, the augmentation will show up on top of a black background. This is because the camera image is rendered natively using direct OpenGL calls (for performance reasons). To solve this problem, you can start with the BackgroundTextureAccess sample and render the camera image using a Texture2D object. Follow the Readme.txt instructions to render the camera image using the ARCamera: "Deactivate the BackgroundCamera (and its child object) and activate the VideoBackground child of the ARCamera." Now everything is rendered inside Unity using the standard pipeline, and a RenderTexture should work with the ARCamera as expected.
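As a rough sketch of the render-to-texture setup described above (the class and field names are made up for illustration; you can also assign the Target Texture in the Inspector instead of in code):

```csharp
using UnityEngine;

// Illustrative helper: routes one camera into a RenderTexture and displays
// the result on a quad that a second camera can see.
public class RenderToTextureExample : MonoBehaviour
{
  public Camera sourceCamera;   // e.g. the camera whose output you want captured
  public Renderer displayQuad;  // a quad rendered by the second camera

  private RenderTexture m_RenderTexture;

  void Start()
  {
    // Create a screen-sized RenderTexture with a 16-bit depth buffer
    m_RenderTexture = new RenderTexture(Screen.width, Screen.height, 16);

    // Equivalent to setting the Target Texture field in the Inspector
    sourceCamera.targetTexture = m_RenderTexture;

    // Show the captured output on a quad so the second camera can draw it
    displayQuad.material.mainTexture = m_RenderTexture;
  }
}
```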
