Adjust the camera to deliver well-focused camera frames to the Vuforia Engine. Use the Image class returned by the CameraDevice to access camera frames from your application.
Set Modes for Camera, Focus, and Exposure
Adjust the camera-related modes to optimize its performance against changing environmental conditions.
As long as Vuforia Engine has access to the camera, other device apps cannot access it. Release the camera by stopping Vuforia Engine to allow other applications to access it:
VuforiaBehaviour.Instance.enabled = false;
In Unity, this is done automatically when the app is being paused.
Camera mode
The CameraMode can be adjusted to default, speed, or quality from the Camera Device Mode setting in the Vuforia Configuration, in the Inspector window of the ARCamera GameObject.
Alternatively, you can configure the camera mode with a script inheriting from MonoBehaviour. The script registers a callback with VuforiaApplication that sets the camera mode once Vuforia Engine has started.
void Start()
{
VuforiaApplication.Instance.OnVuforiaStarted += OnVuforiaStarted;
}
private void OnVuforiaStarted()
{
VuforiaBehaviour.Instance.CameraDevice.SetCameraMode(Vuforia.CameraMode.MODE_DEFAULT);
}
| CameraMode | Description |
| --- | --- |
| MODE_DEFAULT | Best compromise between speed and quality. |
| MODE_OPTIMIZE_SPEED | Minimizes the Vuforia Engine impact on the system. |
| MODE_OPTIMIZE_QUALITY | Optimizes for better image and tracking quality at a higher resource impact on the system. |
MODE_DEFAULT is used if nothing else is set. Change the mode to MODE_OPTIMIZE_SPEED if Vuforia Engine impacts the performance of your device. We recommend using the default mode if you expect the target to be moved continuously.
Focus mode
The FocusMode is set through the CameraDevice class. We recommend using the continuous autofocus mode for most scenarios. Note that not all devices support all focus modes.
bool focusModeSet = VuforiaBehaviour.Instance.CameraDevice.SetFocusMode(FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
if (!focusModeSet)
{
Debug.Log("Failed to set focus mode");
}
| FocusMode | Behavior |
| --- | --- |
| FOCUS_MODE_FIXED | Sets the camera to a fixed focus defined by the camera driver. |
| FOCUS_MODE_TRIGGERAUTO | Triggers a single autofocus operation. After the operation completes, the focus mode automatically changes to FOCUS_MODE_FIXED. |
| FOCUS_MODE_CONTINUOUSAUTO | Turns on driver-level continuous autofocus. This mode is recommended, as it ensures the camera stays focused on the target. |
| FOCUS_MODE_INFINITY | Sets the focus to infinity, as provided by the camera driver implementation, so the camera always focuses on background elements in the scene. (Only supported on UWP and on Android without ARCore.) |
| FOCUS_MODE_MACRO | Sets the camera to macro mode, as provided by the camera driver implementation. Provides a sharp camera image for close-ups of approximately 15 cm; rarely used in AR setups. (Not supported on iOS and Magic Leap.) |
If nothing else is set, the platform’s default focus mode is used.
Focus region
NOTE: The camera controls for setting/getting focus and exposure regions are not supported on devices running ARCore. Android devices with ARCore disabled can use the controls.
Allow your users to focus on just a region of the camera frame. Set the focus region with a CameraRegionOfInterest representing a region-of-interest screen position in pixels and an extent that is a percentage of the camera frame's width and height. This allows the user to tap the screen to focus on a particular screen area.
Setting the CameraRegionOfInterest must be followed by setting the focus mode to FOCUS_MODE_TRIGGERAUTO to trigger the refocus. After the refocus, the mode automatically changes to FOCUS_MODE_FIXED.
NOTE: Focusing only on a region of the camera view can interfere with and degrade the detection and tracking of Vuforia targets outside that region.
First, ensure that the device supports setting the focus region by checking FocusRegionSupported while the Engine is running. Then set the focus region with the regionOfInterest data structure:
void Update()
{
if (Input.touchCount > 0)
{
var touch = Input.GetTouch(0);
if (touch.phase == TouchPhase.Ended)
{
SetFocusRegion(touch.position, 0.25f);
}
}
}
void SetFocusRegion(Vector2 focusPosition, float extent)
{
var regionOfInterest = new CameraRegionOfInterest(focusPosition, extent);
if (VuforiaBehaviour.Instance.CameraDevice.FocusRegionSupported == true)
{
var success = VuforiaBehaviour.Instance.CameraDevice.SetFocusRegion(regionOfInterest);
if (success)
{
VuforiaBehaviour.Instance.CameraDevice.SetFocusMode(FocusMode.FOCUS_MODE_TRIGGERAUTO);
}
else
{
Debug.Log("Failed to set Focus Mode for region " + regionOfInterest.ToString());
}
}
else
{
Debug.Log("Focus region supported: " + VuforiaBehaviour.Instance.CameraDevice.FocusRegionSupported);
VuforiaBehaviour.Instance.CameraDevice.SetFocusRegion(CameraRegionOfInterest.Default());
}
}
Use screen coordinates in pixels as an alternative to the touch position to provide the focus region position in the CameraRegionOfInterest constructor.
Reset the focus region with CameraRegionOfInterest.Default() or set the CameraRegionOfInterest to cover the whole frame:
VuforiaBehaviour.Instance.CameraDevice.SetFocusRegion(new CameraRegionOfInterest(new(Screen.width / 2f, Screen.height / 2f), 1.0f));
If Vuforia Engine is paused, the focus region will be reset and reverted to its default values.
NOTE: On UWP, setting an extent (for focus and exposure) of less than 0.05 (5%) throws an error, as the platform deems the value too small.
Use the get method to return the currently active region for auto-focus.
var focusRegion = VuforiaBehaviour.Instance.CameraDevice.GetFocusRegion();
NOTE: On iOS, only the screen position is returned. The extent (for focus and exposure) is returned as 0, as iOS changes this value internally. The extent should still be set in the CameraRegionOfInterest.
Exposure mode
Adjust the exposure of the camera frames by setting the exposure mode to correct for the lighting conditions. Note that not all devices support all available exposure modes.
First, verify that the exposure mode you want to change to is supported on the device:
var exposureMode = ExposureMode.EXPOSURE_MODE_FIXED;
bool supported = VuforiaBehaviour.Instance.CameraDevice.IsExposureModeSupported(exposureMode);
Set the exposure mode while the Vuforia Engine is running with:
VuforiaBehaviour.Instance.CameraDevice.SetExposureMode(ExposureMode.EXPOSURE_MODE_TRIGGERAUTO);
The platform’s default exposure mode is used if nothing else is set.
| ExposureMode | Behavior |
| --- | --- |
| EXPOSURE_MODE_TRIGGERAUTO | Triggers a single auto-exposure operation. After the operation completes, the exposure mode automatically changes to EXPOSURE_MODE_FIXED. |
| EXPOSURE_MODE_CONTINUOUSAUTO | Lets the device control auto-exposure continuously. This mode is recommended, as it ensures the camera continuously applies exposure corrections to the camera view. |
| EXPOSURE_MODE_FIXED | Sets the exposure to a fixed value defined by the camera driver. |
In most scenarios, EXPOSURE_MODE_CONTINUOUSAUTO is the recommended mode. Apply the other modes when the environment has too much or too little light to show the target. On some devices, setting or changing the exposure mode while Vuforia Engine is running may take a moment before the exposure changes.
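Putting the support check and the mode change together, a small helper can verify support before switching modes. This is a minimal sketch based on the calls shown above; the fallback logging is illustrative:

```csharp
using UnityEngine;
using Vuforia;

public class ExposureModeSwitcher : MonoBehaviour
{
    void Start()
    {
        // Exposure modes can only be set while Vuforia Engine is running
        VuforiaApplication.Instance.OnVuforiaStarted += OnVuforiaStarted;
    }

    void OnVuforiaStarted()
    {
        TrySetExposureMode(ExposureMode.EXPOSURE_MODE_CONTINUOUSAUTO);
    }

    // Sets the requested exposure mode only if the device reports support for it
    void TrySetExposureMode(ExposureMode mode)
    {
        var cameraDevice = VuforiaBehaviour.Instance.CameraDevice;
        if (cameraDevice.IsExposureModeSupported(mode))
            cameraDevice.SetExposureMode(mode);
        else
            Debug.Log("Exposure mode " + mode + " is not supported on this device.");
    }
}
```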
Exposure region
NOTE: The camera controls for setting/getting focus and exposure regions are not supported on devices running ARCore. Android devices with ARCore disabled can use the controls.
Allow your users to set the exposure on just a region of the camera frame with the CameraRegionOfInterest data structure. Set the exposure region with a region-of-interest screen position in pixels and an extent that is a percentage of the camera frame's width and height. Setting the CameraRegionOfInterest must be followed by setting the exposure mode to EXPOSURE_MODE_TRIGGERAUTO to trigger the re-exposure. After the re-exposure, the mode automatically changes to EXPOSURE_MODE_FIXED.
NOTE: Adjusting the exposure on only a region of the camera view can interfere with and degrade the detection and tracking of Vuforia targets outside that region.
First, ensure that the device supports setting the exposure region by checking ExposureRegionSupported while Vuforia Engine runs. Then set the exposure region with the regionOfInterest variable:
void Update()
{
if (Input.touchCount > 0)
{
var touch = Input.GetTouch(0);
if (touch.phase == TouchPhase.Ended)
{
SetExposureRegion(touch.position, 0.25f);
}
}
}
void SetExposureRegion(Vector2 exposurePosition, float extent)
{
var regionOfInterest = new CameraRegionOfInterest(exposurePosition, extent);
if (VuforiaBehaviour.Instance.CameraDevice.ExposureRegionSupported == true)
{
var success = VuforiaBehaviour.Instance.CameraDevice.SetExposureRegion(regionOfInterest);
if (success)
{
VuforiaBehaviour.Instance.CameraDevice.SetExposureMode(ExposureMode.EXPOSURE_MODE_TRIGGERAUTO);
}
else
{
Debug.Log("Failed to set Exposure Mode for region " + regionOfInterest.ToString());
}
}
else
{
Debug.Log("Exposure region supported: " + VuforiaBehaviour.Instance.CameraDevice.ExposureRegionSupported);
VuforiaBehaviour.Instance.CameraDevice.SetExposureRegion(CameraRegionOfInterest.Default());
}
}
Use screen coordinates in pixels as an alternative to the touch position to provide the exposure region position in the CameraRegionOfInterest constructor.
Reset the exposure region with CameraRegionOfInterest.Default() or set the CameraRegionOfInterest to cover the whole frame:
VuforiaBehaviour.Instance.CameraDevice.SetExposureRegion(new CameraRegionOfInterest(new(Screen.width / 2f, Screen.height / 2f), 1.0f));
NOTE: On iOS, only the screen position is returned. The extent is returned as 0, as iOS changes this value internally. The extent should still be set in the CameraRegionOfInterest.
If Vuforia Engine is paused, the exposure region will be reset and reverted to its default values.
Use the get method to return the currently active region for auto-exposure.
var exposureRegion = VuforiaBehaviour.Instance.CameraDevice.GetExposureRegion();
Flash Torch
Poor lighting conditions can significantly affect target detection and tracking. For best results, ensure there is enough light in your environment so that scene details and target features are clearly visible in the camera view.
- Tracking works best in indoor environments, where the lighting conditions are usually more stable and easier to control.
- If your application use case and scenarios require operating in dark environments, consider enabling the device Flash torch (if your device has one).
NOTE: ARCore version 1.45 or later must be included in the project and installed on ARCore-enabled Android devices to use the flash. See Using ARCore with Vuforia Engine in Unity for changing the version.
Call IsFlashSupported() while Vuforia Engine is running to check whether the device supports setting the flash mode:
VuforiaBehaviour.Instance.CameraDevice.IsFlashSupported();
Pass true to SetFlash to activate the torch, or false to deactivate it:
VuforiaBehaviour.Instance.CameraDevice.SetFlash(true);
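The support check and the toggle can be combined into a helper that a UI button invokes, for example. A minimal sketch using the two calls above (the field name and logging are illustrative):

```csharp
using UnityEngine;
using Vuforia;

public class FlashToggle : MonoBehaviour
{
    bool mFlashOn;

    // Toggle the flash torch, e.g. from a UI button, while Vuforia Engine is running
    public void ToggleFlash()
    {
        var cameraDevice = VuforiaBehaviour.Instance.CameraDevice;
        if (!cameraDevice.IsFlashSupported())
        {
            Debug.Log("Flash is not supported on this device.");
            return;
        }
        mFlashOn = !mFlashOn;
        cameraDevice.SetFlash(mFlashOn);
    }
}
```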
Clipping Plane
If your augmentations disappear at a certain distance from the Image Target, it may be that your far clipping plane (in OpenGL or the Unity camera settings) needs to be adjusted. This applies especially if you work with large Image Targets viewed from a distance.
In Unity, the near and far clipping planes can be set directly in the ARCamera GameObject's Inspector window or with Unity’s scripting API.
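As a minimal sketch, the clipping planes can be adjusted at runtime through Unity's standard Camera API. The values below are illustrative; tune them to the physical size of your targets and the distances in your scene:

```csharp
using UnityEngine;

// Attach to the ARCamera GameObject to adjust its clipping planes at startup
public class ClippingPlaneSetup : MonoBehaviour
{
    void Start()
    {
        var cam = GetComponent<Camera>();
        // Illustrative values in scene units: extend the far plane so
        // large Image Targets remain augmented when viewed from a distance
        cam.nearClipPlane = 0.05f;
        cam.farClipPlane = 500f;
    }
}
```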
Apply a Shader to the Camera Images
The Image class provides the camera pixels as a byte array. This approach is useful for some image processing tasks, but sometimes, it is preferable to obtain the image as a Unity Texture2D, as shown in the example below. You can also apply the image to a Material’s texture in a shader.
Access the Camera Image
Use the Vuforia Image class to access the camera image and set the desired image format. Use the images as an OpenGL texture or apply the texture to a Unity material. Register the desired image format using the Vuforia.PixelFormat declaration:
VuforiaBehaviour.Instance.CameraDevice.SetFrameFormat(PixelFormat.RGB888, true);
NOTE: The Vuforia Namespace Reference page lists the available pixel formats.
Call this method after Vuforia Engine is initialized and started; to this end, it is recommended to register an OnVuforiaStarted callback in the Start() method of your MonoBehaviour script:
void Start()
{
VuforiaApplication.Instance.OnVuforiaStarted += OnVuforiaStarted;
}
private void OnVuforiaStarted()
{
// Vuforia Engine has started, now register the camera image format
PixelFormat pixelFormat = PixelFormat.RGB888;
bool success = VuforiaBehaviour.Instance.CameraDevice.SetFrameFormat(pixelFormat, true);
if (success)
{
Debug.Log("Successfully registered pixel format " + pixelFormat.ToString());
}
else
{
Debug.LogError(
"Failed to register pixel format " + pixelFormat.ToString() +
"\n the format may be unsupported by your device;" +
"\n consider using a different pixel format.");
}
}
We can extend this script to retrieve camera images after setting the format, inside a state-update callback; this ensures you obtain the latest camera image matching the current frame. Also, make sure the camera image is not null, since it can take a few frames for the image to become available after registering an image format. Unregister the camera image format whenever the Engine is stopped, and register it again when the Engine restarts.
Apply camera images to a RawImage GameObject in your Unity scene.
- Create an Empty GameObject and attach the below script CameraImageAccess.
- Create a RawImage GameObject from UI -> RawImage.
- Drag the RawImage GameObject to the public field in the CameraImageAccess script.
using UnityEngine;
using UnityEngine.UI;
using Vuforia;
using Image = Vuforia.Image;
public class CameraImageAccess : MonoBehaviour
{
const PixelFormat PIXEL_FORMAT = PixelFormat.RGB888;
const TextureFormat TEXTURE_FORMAT = TextureFormat.RGB24;
public RawImage RawImage;
Texture2D mTexture;
bool mFormatRegistered;
void Start()
{
// Register Vuforia Engine life-cycle callbacks:
VuforiaApplication.Instance.OnVuforiaStarted += OnVuforiaStarted;
VuforiaApplication.Instance.OnVuforiaStopped += OnVuforiaStopped;
if (VuforiaBehaviour.Instance != null)
VuforiaBehaviour.Instance.World.OnStateUpdated += OnVuforiaUpdated;
}
void OnDestroy()
{
// Unregister Vuforia Engine life-cycle callbacks:
if (VuforiaBehaviour.Instance != null)
VuforiaBehaviour.Instance.World.OnStateUpdated -= OnVuforiaUpdated;
VuforiaApplication.Instance.OnVuforiaStarted -= OnVuforiaStarted;
VuforiaApplication.Instance.OnVuforiaStopped -= OnVuforiaStopped;
if (VuforiaApplication.Instance.IsRunning)
{
// If Vuforia Engine is still running, unregister the camera pixel format to avoid unnecessary overhead
// Formats can only be registered and unregistered while Vuforia Engine is running
UnregisterFormat();
}
if (mTexture != null)
Destroy(mTexture);
}
/// <summary>
/// Called each time the Vuforia Engine is started
/// </summary>
void OnVuforiaStarted()
{
mTexture = new Texture2D(0, 0, TEXTURE_FORMAT, false);
// A format cannot be registered if Vuforia Engine is not running
RegisterFormat();
}
/// <summary>
/// Called each time the Vuforia Engine is stopped
/// </summary>
void OnVuforiaStopped()
{
// A format cannot be unregistered after OnVuforiaStopped
UnregisterFormat();
if (mTexture != null)
Destroy(mTexture);
}
/// <summary>
/// Called each time the Vuforia Engine state is updated
/// </summary>
void OnVuforiaUpdated()
{
var image = VuforiaBehaviour.Instance.CameraDevice.GetCameraImage(PIXEL_FORMAT);
// There can be a delay of several frames until the camera image becomes available
if (Image.IsNullOrEmpty(image))
return;
Debug.Log("\nImage Format: " + image.PixelFormat +
"\nImage Size: " + image.Width + " x " + image.Height +
"\nBuffer Size: " + image.BufferWidth + " x " + image.BufferHeight +
"\nImage Stride: " + image.Stride + "\n");
// Override the current texture by copying into it the camera image flipped on the Y axis
// The texture is resized to match the camera image size
image.CopyToTexture(mTexture, true);
RawImage.texture = mTexture;
RawImage.material.mainTexture = mTexture;
}
/// <summary>
/// Register the camera pixel format
/// </summary>
void RegisterFormat()
{
// Vuforia Engine has started, now register camera image format
var success = VuforiaBehaviour.Instance.CameraDevice.SetFrameFormat(PIXEL_FORMAT, true);
if (success)
{
Debug.Log("Successfully registered pixel format " + PIXEL_FORMAT);
mFormatRegistered = true;
}
else
{
Debug.LogError("Failed to register pixel format " + PIXEL_FORMAT +
"\n the format may be unsupported by your device;" +
"\n consider using a different pixel format.");
mFormatRegistered = false;
}
}
/// <summary>
/// Unregister the camera pixel format
/// </summary>
void UnregisterFormat()
{
Debug.Log("Unregistering camera pixel format " + PIXEL_FORMAT);
VuforiaBehaviour.Instance.CameraDevice.SetFrameFormat(PIXEL_FORMAT, false);
mFormatRegistered = false;
}
}
Use an OpenGL texture
Other image processing tasks might require obtaining the image as an OpenGL texture; the OcclusionManagement sample demonstrates this approach. Instead of letting Vuforia Engine render the camera image natively each frame, register a Texture2D object to be filled with the camera pixels at each frame using the VuforiaRenderer.VideoBackgroundTexture API. See the OcclusionManagement sample scripts for an example of this technique.
Accessing the Raw Pixels
For custom computer-vision (CV) processing, you can access the raw pixels from the camera with the Image class. The following example method retrieves the raw pixel data:
private byte[] GetPixels(Image image)
{
var pixelBufferPtr = image.PixelBufferPtr;
int imageSize = image.Stride * image.Height;
byte[] pixels = new byte[imageSize];
System.Runtime.InteropServices.Marshal.Copy(pixelBufferPtr, pixels, 0, imageSize);
return pixels;
}
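Building on GetPixels(), the raw bytes can feed custom CV routines. The sketch below converts a frame to a luminance buffer; it assumes the frame was registered as RGB888 (three bytes per pixel) and that rows may be padded to the image stride:

```csharp
// Computes a per-pixel luminance buffer from raw RGB888 data.
// Assumption: the frame format was registered as PixelFormat.RGB888;
// rows may be padded, so index rows by stride, not by width * 3.
static byte[] ToGrayscale(byte[] pixels, int width, int height, int stride)
{
    var gray = new byte[width * height];
    for (int y = 0; y < height; y++)
    {
        for (int x = 0; x < width; x++)
        {
            int i = y * stride + x * 3;
            // Standard Rec. 601 luma weights for R, G, B
            gray[y * width + x] = (byte)(0.299f * pixels[i] +
                                         0.587f * pixels[i + 1] +
                                         0.114f * pixels[i + 2]);
        }
    }
    return gray;
}
```

Indexing by stride rather than by width matters on devices where the camera buffer is wider than the visible image, as the Image class's BufferWidth/Stride properties indicate.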