Description

Answers to frequently asked questions

New Developer Portal changes with Engine version 10

With the release of our next-gen SDK (10.0), we will continue to support version 9.8 for a limited window while users transition to SDK version 10.0. During this transition period, we will provide the following:

  1. Download support for both 9.8 and 10.0 on the Engine Developer Portal.
    • Select SDK version 10.0 or 9.8 from the dropdown menu at the top, click Apply, and the corresponding Samples, Tools, and Downloads will be shown for the selected version.
    • Note that you must click "Apply" for the change to take effect.
  2. Documentation support for both 9.8 and 10.0 on the Developer Portal Library site.
    • Use the SDK 10.0 table of contents for articles pertaining to SDK version 10.0
    • Use the SDK 9.8 table of contents (under 10.0) for articles pertaining to SDK version 9.8 
  3. Option to mark forum posts for either 9.8 or 10.0 on the Developer Forums
    • Authors of new threads/posts can select which version of the SDK the post pertains to, shown with a corresponding badge/label. The older version (9.8) will appear in grey, while the current version (10.0) will appear in teal.

In addition to the above changes, we are also no longer distributing Unity packages via Git; there is now a direct download link to a .unitypackage available on the "Downloads" page of the Engine Developer Portal!

We hope you are as excited to start building AR experiences with our next generation of Vuforia Engine as we are! Thank you for your continued support!

- The Vuforia Engine team

Load dataset from Android split binary (obb)

A previous thread describes a few different approaches to handling an Android app that exceeds 100 MB in size: https://developer.vuforia.com/forum/faq/unity-how-can-i-handle-large-android-apps

This thread provides a small example of how to load a dataset from the obb file when using the split binary option.

 

Create a new Unity project and import the Vuforia SDK and Vuforia Sample Project. Download here: https://developer.vuforia.com/downloads/sdk

Vuforia Configuration

  • Enter your license key
  • Make sure your dataset is loaded and activated (in this example I will be using the StonesAndChips dataset present in the sample project)

Player Settings -> Android Settings

  • Other Settings: Set "Write Permissions" to "External (SDCard)"
  • Publishing Settings: Check the "Split Application Binary" box

Once the above is all set up, create a new scene, create a new script containing the code at the bottom of this post (ObbExtractor.cs), and attach it to an object in that scene. This scene will need to run before Vuforia is initialized in order to function properly.

For this example, add the Vuforia sample project scenes to the build settings' "Scenes in Build" and then add this extra scene with the ObbExtractor at the top of the list of scenes. This will start the app in this scene and allow the datasets to be extracted before moving on to the Vuforia initialization in the following scenes. The sample script provided in this case will automatically load the scene "Vuforia-0-Splash" when the extraction has been completed.

Once Vuforia has been loaded, the image target for the StonesAndChips dataset will be ready to go.

Disclaimer: This code is meant as an instructional example for the Vuforia 6.2 sample app.

using System.IO;
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;


public class ObbExtractor : MonoBehaviour {

    void Start () {
        StartCoroutine(ExtractObbDatasets());
    }

    private IEnumerator ExtractObbDatasets () {
        string[] filesInOBB = {"StonesAndChips.dat", "StonesAndChips.xml"};
        foreach (var filename in filesInOBB) {
            string uri = Application.streamingAssetsPath + "/QCAR/" + filename;

            string outputFilePath = Application.persistentDataPath + "/QCAR/" + filename;
            if(!Directory.Exists(Path.GetDirectoryName(outputFilePath)))
                Directory.CreateDirectory(Path.GetDirectoryName(outputFilePath));

            var www = new WWW(uri);
            yield return www;

            Save(www, outputFilePath);
            yield return new WaitForEndOfFrame();
        }

        // When done extracting the datasets, Start Vuforia AR scene
        SceneManager.LoadScene( "Vuforia-0-Splash" );
    }

    private void Save(WWW www, string outputPath) {
        File.WriteAllBytes( outputPath, www.bytes );

        // Verify that the File has been actually stored
        if( File.Exists( outputPath ) )
        {
            Debug.Log( "File successfully saved at: " + outputPath );
        }
        else
        {
            Debug.Log( "Failure!! - File does not exist at: " + outputPath );   
        }
    }
}

This code will pull the StonesAndChips .dat and .xml files from the obb and output them in the application's persistentDataPath, allowing them to be loaded once Vuforia is initialized. This code will need to run before Vuforia is initialized.

Unity - How can I disable the video background

In Vuforia 4.0 and above, the camera video background is rendered on a textured plane, called "BackgroundPlane".

The BackgroundPlane is located under the Camera object attached under the ARCamera prefab of the Vuforia Unity Extension; you can visualize this in the scene hierarchy view in Unity, by expanding the ARCamera object hierarchy:

  • ARCamera
    • Camera
      • BackgroundPlane

The BackgroundPlane object contains a component called "Background Plane Behaviour"; you can switch off the camera video background rendering simply by disabling that component. For example, you may create a script like the following and attach it to the BackgroundPlane game object:

using UnityEngine;
using System.Collections;
using Vuforia;

public class BackgroundOff : MonoBehaviour {

	private bool mBackgroundWasSwitchedOff = false;

	// Update is called once per frame
	void Update () {
		if (!mBackgroundWasSwitchedOff) {
			BackgroundPlaneBehaviour bgPlane = GetComponent<BackgroundPlaneBehaviour> ();
			if (bgPlane.enabled) {
				// switch it off
				bgPlane.enabled = false;
			}
			mBackgroundWasSwitchedOff = true;
		}
	}
}

 

Alternatively, you can disable the Background Plane Behaviour component through the Inspector panel of the BackgroundPlane object.


General Programming Resources

If you are looking for general programming resources that can help get you started with Vuforia, such as OpenGL, Unity, C++, or Java programming resources (and more), here are a few web pages that you may want to take a look at:

Unity 3D:

  • https://unity3d.com/learn
  • https://unity3d.com/community
  • http://forum.unity3d.com/forum.php

Android:

  • http://developer.android.com/index.html
  • http://developer.android.com/training/index.html
  • http://developer.android.com/training/articles/perf-jni.html

iOS:

  • https://developer.apple.com/devcenter/ios/index.action

OpenGL:

  • http://www.opengl.org/
  • http://nehe.gamedev.net/
  • http://www.shadertoy.com

C#:

  • http://docs.go-mono.com/
  • http://msdn.microsoft.com/en-us/library/kx37x362.aspx

Java:

  • http://docs.oracle.com/javase/tutorial/
  • http://docs.oracle.com/javase/7/docs/technotes/guides/jni/spec/jniTOC.html

C++:

  • http://www.cplusplus.com/doc/tutorial/
  • http://www.cplusplus.com/forum/

Objective-C:

  • https://developer.apple.com/library/mac/documentation/cocoa/conceptual/ProgrammingWithObjectiveC/

Game Development:

  • http://www.gamedev.net/page/index.html

Unity - How can I handle large Android apps

If you are building an application with the Vuforia Unity Extension and your application size exceeds 50 MB, you may need to take some special measures in order to be able to publish it on the Google Play Store, due to the Google Play 50 MB app requirement, as explained on the official Google Android web site here:

http://developer.android.com/google/play/expansion-files.html

General tips and suggestions

Hosting assets on a server and downloading at runtime 

If the size of your App is due to the use of large files, such as videos, large 3D models, textures, or other large Unity assets, you may want to consider placing those assets in an external location, such as a server, and downloading and using the content over the network at runtime. This way you can dramatically reduce your App size, as the heavyweight / large assets are not packaged with the App.

Asset Bundles

Note that Unity also provides a convenient feature and API to achieve this, called Asset Bundles. Please refer to the official Unity Asset Bundle documentation for more details about this feature:

https://docs.unity3d.com/Documentation/Manual/AssetBundlesIntro.html

If you choose to go with this approach, you may also be interested in reading this article about how to dynamically create 3D augmentations for Vuforia Image Targets using Asset Bundles:

https://developer.vuforia.com/forum/faq/unity-how-can-i-augment-my-image-target-model 
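To make the Asset Bundle approach concrete, here is a minimal, hypothetical sketch. The URL, bundle version, and asset name are placeholders (not real resources), and it uses the Unity 4.x-era WWW API referenced elsewhere on this page; treat it as an illustration of the pattern, not a definitive implementation.

```csharp
using System.Collections;
using UnityEngine;

// Illustrative sketch only: downloads an asset bundle at runtime and
// instantiates a prefab from it. URL, version, and asset name are placeholders.
public class BundleLoader : MonoBehaviour
{
    // Hypothetical server location of the bundle
    private const string BundleUrl = "http://example.com/myAssets.unity3d";
    private const int BundleVersion = 1;

    IEnumerator Start()
    {
        // LoadFromCacheOrDownload caches the bundle on the device,
        // so it is only transferred once per version number
        WWW www = WWW.LoadFromCacheOrDownload(BundleUrl, BundleVersion);
        yield return www;

        if (!string.IsNullOrEmpty(www.error))
        {
            Debug.LogError("Bundle download failed: " + www.error);
            yield break;
        }

        AssetBundle bundle = www.assetBundle;

        // "MyAugmentation" is a placeholder asset name inside the bundle
        GameObject prefab = bundle.Load("MyAugmentation", typeof(GameObject)) as GameObject;
        if (prefab != null)
        {
            // Instantiate it (e.g. parented under an Image Target, so it
            // shows up when the target is detected)
            Instantiate(prefab);
        }

        // Release the compressed bundle data, keeping loaded assets alive
        bundle.Unload(false);
    }
}
```

With this pattern, only a small loader script ships in the APK, while the heavyweight assets live on the server.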

 

Using OBB expansion files

Another possibility is to use OBB expansion files, which effectively allow your App to go beyond the 50 MB limit. Enabling the Split Application Binary option in the Player Settings splits the build into an APK plus an expansion file (.obb), as explained in the Unity online documentation:

https://docs.unity3d.com/Documentation/Manual/android-OBBsupport.html

OBB and Vuforia

While you are allowed to use the OBB split option with Vuforia, you should be aware that certain Vuforia functionalities may not be fully available and may need to be handled in a special way.

In particular, the Dataset loading feature may be affected, as the dataset files (.DAT and .XML) that you have placed in the 'Assets/StreamingAssets/QCAR' folder of your Unity project will likely get stored in the OBB expansion file. As a consequence, the Dataset load methods of the Vuforia API may fail to locate and load those files correctly.

In order to properly handle this scenario, you will need to follow these steps:

  • make sure that the desired Datasets to load are "enabled" in the Data Set Load Behaviour of the ARCamera
    • for each Dataset, enable (tick) the 'Load Data Set Your_Dataset_Name' and 'Activate Dataset Your_Dataset_Name' check-boxes via the ARCamera inspector
  • create a C# script to programmatically read your Dataset XML and DAT files from these locations:
    • Application.streamingAssetsPath + "/QCAR/" + dataset_xml_filename / dataset_dat_filename
    • the OBB file location on the device's external storage, typically "/Android/obb/" + package_name + "/" + obb_filename (check the Android developer guide for more information on the actual OBB path and filename: http://developer.android.com/google/play/expansion-files.html)
  • read the .DAT and .XML files, for instance using the WWW class with a URI pointing to one of the locations above (see also: https://docs.unity3d.com/Documentation/ScriptReference/WWW.html)
  • get the file contents (as a byte array) from the WWW object, using the WWW.bytes property
  • save the file contents into a new pair of .DAT and .XML files on your device storage (e.g. your SD card, or otherwise the device's internal storage); for instance, store them in a subdirectory of Application.persistentDataPath (e.g. Application.persistentDataPath + "/MyDatasets/MyDataset.xml" and Application.persistentDataPath + "/MyDatasets/MyDataset.dat")
  • use the Vuforia API to load the datasets from the new location, using the STORAGE_ABSOLUTE option, as shown in this code example:
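The code example originally attached here is not included on this page. As a hedged sketch of that final step (assuming the files were previously extracted to Application.persistentDataPath + "/MyDatasets/", and using the QCAR-era ImageTracker/DataSet API used elsewhere on this page; the exact name and location of the storage-type enum varied between SDK versions), loading with STORAGE_ABSOLUTE could look like this:

```csharp
using UnityEngine;
using Vuforia;

// Sketch: load a dataset from an absolute filesystem path on the device,
// e.g. after extracting it from the OBB into persistentDataPath.
public class AbsoluteDataSetLoader : MonoBehaviour
{
    void Start()
    {
        // Hypothetical location where the .xml/.dat pair was previously saved
        string xmlPath = Application.persistentDataPath + "/MyDatasets/MyDataset.xml";

        ImageTracker tracker = TrackerManager.Instance.GetTracker<ImageTracker>();
        DataSet dataSet = tracker.CreateDataSet();

        // STORAGE_ABSOLUTE tells Vuforia the path is a full filesystem path
        // (the enum name/location may differ in your SDK version)
        if (dataSet.Load(xmlPath, QCARUnity.StorageType.STORAGE_ABSOLUTE))
        {
            tracker.ActivateDataSet(dataSet);
        }
        else
        {
            Debug.LogError("Failed to load dataset from: " + xmlPath);
        }
    }
}
```

This script would run after Vuforia has been initialized, replacing the usual StreamingAssets-based loading performed by the Data Set Load Behaviour.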


Unity - How to select Camera and mirroring

In Unity you can select the Camera direction (BACK vs. FRONT camera) in the Unity Editor by selecting the ARCamera object in your scene and looking at the Inspector panel; there you will find a component called QCARBehaviour, which offers a Camera Direction setting that can be set to one of the following:

  • CAMERA_BACK
  • CAMERA_FRONT
  • CAMERA_DEFAULT

The CAMERA_BACK option will select the rear camera on your device, while the CAMERA_FRONT option will select the front camera on your device, if the device has one. The CAMERA_DEFAULT option will typically select the rear camera, unless the device only has a front camera.

The ARCamera inspector also shows a property called "Mirror Video Background" which can be set to one of these:

  • DEFAULT
  • ON
  • OFF

When the DEFAULT option is selected (recommended for most common use cases), the Mirroring will be automatically disabled (OFF) when the BACK camera is used, and will be automatically enabled (ON) when the FRONT camera is used.

 

How to change the Camera direction and mirroring at runtime?

You can change the Camera direction at runtime by using the CameraDevice and QCARRenderer API, as shown in this example:

using UnityEngine;
using Vuforia;

public class DirSwap : MonoBehaviour 
{
    private QCARAbstractBehaviour mQCAR;

    void Start() {
        mQCAR = (QCARAbstractBehaviour)FindObjectOfType(typeof(QCARAbstractBehaviour));
    }

    void OnGUI()
    {
        if (GUI.Button(new Rect(50,50,200,50), "Swap Camera"))
        {
            CameraDevice.CameraDirection currentDir = CameraDevice.Instance.GetCameraDirection();
            if (currentDir == CameraDevice.CameraDirection.CAMERA_BACK || currentDir == CameraDevice.CameraDirection.CAMERA_DEFAULT)
                RestartCamera(CameraDevice.CameraDirection.CAMERA_FRONT, true);
            else
                RestartCamera(CameraDevice.CameraDirection.CAMERA_BACK, false);
        }

        if (GUI.Button(new Rect(50,100,200,50), "Mirror OFF"))
        {
            RestartCamera(CameraDevice.Instance.GetCameraDirection(), false);
        }

        if (GUI.Button(new Rect(50, 150, 200, 50), "Mirror ON"))
        {
            var config = QCARRenderer.Instance.GetVideoBackgroundConfig();
            config.reflection = QCARRenderer.VideoBackgroundReflection.ON;
            QCARRenderer.Instance.SetVideoBackgroundConfig(config);

            RestartCamera(CameraDevice.Instance.GetCameraDirection(), true);
        }
    }

    private void RestartCamera(CameraDevice.CameraDirection newDir, bool mirror)
    {
        CameraDevice.Instance.Stop();
        CameraDevice.Instance.Deinit();

        CameraDevice.Instance.Init(newDir);

        // Set mirroring 
        var config = QCARRenderer.Instance.GetVideoBackgroundConfig();
        config.reflection = mirror ? QCARRenderer.VideoBackgroundReflection.ON : QCARRenderer.VideoBackgroundReflection.OFF;
        QCARRenderer.Instance.SetVideoBackgroundConfig(config);

        CameraDevice.Instance.Start();
    }
}

 

Unity - How can I change the target size programmatically

If you want to change the size of your Image Targets at runtime (programmatically via script code), and you want Vuforia to take the new size into account and adjust the tracked "target distance" accordingly, you need to:

  • use the ImageTarget.SetSize( Vector2 new_size ) API
  • Deactivate the Dataset before changing the size
  • Reactivate the Dataset after changing the size

(see also the API reference: https://developer.vuforia.com/resources/api/unity/interface_image_target)

To verify this, you could use this simple code (e.g. by attaching it to any Empty GameObject in your Unity scene):

using UnityEngine;
using Vuforia;

public class TargetInfo : MonoBehaviour {

    void OnGUI()
    {
        StateManager sm = TrackerManager.Instance.GetStateManager();

        if (GUI.Button(new Rect(50, 50, 200, 40), "Size Up"))
        {
            ImageTracker tracker = TrackerManager.Instance.GetTracker<ImageTracker>();
            foreach (DataSet ds in tracker.GetActiveDataSets())
            {
                // Deactivate the Dataset before changing the target size
                tracker.DeactivateDataSet(ds);
                foreach (Trackable trackable in ds.GetTrackables())
                {
                    if (trackable is ImageTarget)
                    {
                        ImageTarget it = trackable as ImageTarget;
                        Vector2 old_size = it.GetSize();
                        Vector2 new_size = new Vector2(1.5f * old_size.x, 1.5f * old_size.y);
                        it.SetSize(new_size);
                    }
                }
                // Reactivate the Dataset
                tracker.ActivateDataSet(ds);
            }
        }

        foreach (TrackableBehaviour tb in sm.GetActiveTrackableBehaviours())
        {
            if (tb is ImageTargetBehaviour)
            {
                ImageTargetBehaviour itb = tb as ImageTargetBehaviour;
                float dist2cam = (itb.transform.position - Camera.main.transform.position).magnitude;
                ImageTarget it = itb.Trackable as ImageTarget;
                Vector2 size = it.GetSize();
                GUI.Box(new Rect(50, 100, 300, 40), it.Name + " - " + size.ToString() +
                        "\nDistance to camera: " + dist2cam);
            }
        }
    }
}

 

Android - How can I build a basic Vuforia app

This article provides some guidelines on how to build a basic Vuforia-enabled Android application, starting from an empty Activity. This only covers basic setup steps, such as Vuforia initialization, tracker initialization and starting / stopping the camera. For more details about specific features, you can consult the developer guide and look at the sample code in the Vuforia samples.

This article refers to common concepts and API of the Android Activity life-cycle, such as creating, pausing, resuming and destroying an Activity.

For more information about the Android Activity life-cycle, and general Android programming topics, please refer to the official Android developer guide:

https://developer.android.com/guide/index.html

http://developer.android.com/training/basics/activity-lifecycle/index.html

Vuforia initialization

The onCreate() method is called when your Activity is created; this is the place where you typically put most of your application initialization code; as such, this is also where you will usually put the code to initialize Vuforia, using the Vuforia.init() method, e.g.:

Vuforia.setInitParameters( myActivity, Vuforia.GL_20 );
do
{
    mProgressValue = Vuforia.init();
}
while (mProgressValue >= 0 && mProgressValue < 100);

Since the initialization of Vuforia may take some time, it is recommended to wrap the code above inside an AsyncTask and execute it asynchronously. The Vuforia samples show a correct implementation of this.

Initializing the Trackers

Upon completion of the Vuforia initialization, the next step consists of initializing the Trackers. Note that you may want to use an ImageTracker, a MarkerTracker, or a TextTracker, depending on the specific features you plan to use in your app. You can also initialize multiple Trackers at the same time (for example the ImageTracker and the MarkerTracker, if you want to use both Image Targets and Frame Markers in your app). The following code snippet shows how to initialize the ImageTracker:

TrackerManager tManager = TrackerManager.getInstance();
Tracker tracker = tManager.initTracker(ImageTracker.getClassType());

Tracker Data Loading

Once the trackers are successfully initialized (but not yet started), the next step is to load some data; for example, you may want to load one or more Datasets if your application relies on Image Targets. Once again, since the data loading process can take some time (depending on the amount of data to be loaded), you should consider using an AsyncTask to perform this task; the Vuforia samples provide a good implementation example.

Setting up the OpenGL view and starting the camera

Upon completing the data loading, the next phase will typically consist of initializing and starting the camera. Before starting the camera, however, you need to create an OpenGL view and add it to your Activity, as shown in this code example:

int depthSize = 16;
int stencilSize = 0;
boolean translucent = Vuforia.requiresAlpha();

mGlView = new SampleGLView(this);
mGlView.init(translucent, depthSize, stencilSize);

mRenderer = new ImageTargetRenderer();
mGlView.setRenderer(mRenderer);

addContentView(mGlView, new LayoutParams(LayoutParams.MATCH_PARENT,
           LayoutParams.MATCH_PARENT));

You can refer to the Vuforia samples for the detailed implementation of the OpenGL view and the related renderer.

Once the OpenGL view is set up, the camera can be started and the video-background can be configured:

CameraDevice.getInstance().init(camera);

configureVideoBackground();

CameraDevice.getInstance().selectVideoMode(
                CameraDevice.MODE.MODE_DEFAULT);

CameraDevice.getInstance().start();

Vuforia.setFrameFormat(PIXEL_FORMAT.RGB565, true);

Pausing/Resuming the Activity

The onPause() method is called when the Activity is paused; this is where you should add some code to stop the camera, pause Vuforia and hide the OpenGL view, as shown in this example:

@Override
protected void onPause()
{
    super.onPause();

    stopCamera();

    if (mGlView != null)
    {
        mGlView.setVisibility(View.INVISIBLE);
        mGlView.onPause();
    }

    Vuforia.onPause();
}

In the code above, note in particular the call to Vuforia.onPause().

 

Similarly, the onResume() method is called when the Activity is resumed; this is where you should put some code to resume Vuforia, restart the camera and show the OpenGL view, for instance:

@Override
protected void onResume()
{
    super.onResume();

    Vuforia.onResume();

    if (mGlView != null)
    {
       mGlView.setVisibility(View.VISIBLE);
       mGlView.onResume();
    }  

    startCamera();       
}

In the code above, note in particular the call to Vuforia.onResume().

 

Activity destruction

When your activity is terminated, the onDestroy() method is called; this is where you should put your code to deinitialize Vuforia: stopping the camera and trackers, deinitializing the trackers, and unloading any data, e.g.:

@Override
protected void onDestroy()
{
    super.onDestroy();
       
    stopTracker();
    stopCamera();
       
    deinitTracker();

     Vuforia.deinit();
}

In the code above, note in particular the call to Vuforia.deinit().

Please refer to the Vuforia samples for all the implementation details.


iOS - How do I get a touch event on a 3D model

The Dominoes sample shows how to transform a touch on a 2D screen to a 3D plane which relates to an Image Target.

In order to reuse this technique, the key method to investigate is HandleTouches(), which handles the touches and uses the projectScreenPointToPlane() method to create an intersection line for each touch. The code also shows how to check this line against all the dominoes to see whether it hits one of them.

In order to get touch events from the screen, your view/class must handle events / implement methods for the following:

touchesBegan, touchesMoved, touchesCancelled, touchesEnded

 

As you can see from EAGLView.mm in the dominoes sample:

// Pass touch events through to the Dominoes module
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch* touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    dominoesTouchEvent(ACTION_DOWN, 0, location.x, location.y);
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch* touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    dominoesTouchEvent(ACTION_CANCEL, 0, location.x, location.y);
}
 
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch* touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    dominoesTouchEvent(ACTION_UP, 0, location.x, location.y);
}
 
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch* touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    dominoesTouchEvent(ACTION_MOVE, 0, location.x, location.y);
}

These touch events are passed through to dominoesTouchEvent(), which records them for later processing by HandleTouches().


Unity - How do I create a simple VideoPlayback app

Note: This article was written at the time of Vuforia SDK 2.x. Check Unity API, Release Notes, and Change/Migration Guides in our Library for information about API changes and refactored code/classes.

The objective here is to show how to replicate the essence of the Vuforia-VideoPlayback sample scene using the Vuforia prefabs and the drag and drop approach of Unity:

  • Create a new Unity project
  • Import the Vuforia video playback unity package
  • Create a new scene
  • Drag the ARCamera prefab into the Unity scene
  • Under the DataSetLoadBehaviour in the Inspector tick “Load Data Set StonesAndChips”, and the 'Activate' checkbox below this
  • From '/Qualcomm AugmentedReality/Prefabs' drag the ImageTarget prefab into the scene
    • For the Image Target select “StonesAndChips” as the dataset and the Image Target should change to the Stones texture
  • From Vuforia Video Playback/Prefabs drag the Video prefab to be the child of the Image Target
    • In the Inspector under “Video Playback Behaviour (Script)” set the path to 'VuforiaSizzleReel_1.m4v'
  • Drag the TrackableEventHandler from Scripts to the Image Target (this plays the video)
  • Remove the DefaultTrackableEventHandler script from the Image Target as it is not needed.
  • Autoplay already works; however, tapping the video does not yet work. To fix this, create a VideoPlaybackController script, fill it with the code below, and attach it to the ARCamera.

See also:

Unity - How do I play video from URL? https://developer.vuforia.com/forum/faq/unity-how-do-i-play-video-url

VideoPlaybackController.cs script – to enable tap to play the video 
/*==============================================================================
Copyright (c) 2012-2014 QUALCOMM Austria Research Center GmbH.
All Rights Reserved.

This  Vuforia(TM) sample application in source code form ("Sample Code") for the
Vuforia Software Development Kit and/or Vuforia Extension for Unity
(collectively, the "Vuforia SDK") may in all cases only be used in conjunction
with use of the Vuforia SDK, and is subject in all respects to all of the terms
and conditions of the Vuforia SDK License Agreement, which may be found at
https://developer.vuforia.com/legal/license.

By retaining or using the Sample Code in any manner, you confirm your agreement
to all the terms and conditions of the Vuforia SDK License Agreement.  If you do
not agree to all the terms and conditions of the Vuforia SDK License Agreement,
then you may not retain or use any of the Sample Code in any manner.
==============================================================================*/

using UnityEngine;
using System.Collections;

/// <summary>
/// This class contains the logic to handle taps on VideoPlaybackBehaviour game objects
/// and starts playing the according video. It also pauses other videos when a new one is
/// started.
/// </summary>
public class VideoPlaybackController : MonoBehaviour
{
    #region PRIVATE_MEMBER_VARIABLES

    private Vector2 mTouchStartPos;
    private bool mTouchMoved = false;
    private float mTimeElapsed = 0.0f;

    private bool mTapped = false;
    private float mTimeElapsedSinceTap = 0.0f;

    private bool mWentToFullScreen = false;

    #endregion // PRIVATE_MEMBER_VARIABLES



    #region UNITY_MONOBEHAVIOUR_METHODS

    void Update()
    {
        // Determine the number of taps
        // Note: Input.tapCount doesn't work on Android

        if (Input.touchCount > 0)
        {
            Touch touch = Input.touches[0];
            if (touch.phase == TouchPhase.Began)
            {
                mTouchStartPos = touch.position;
                mTouchMoved = false;
                mTimeElapsed = 0.0f;
            }
            else
            {
                mTimeElapsed += Time.deltaTime;
            }

            if (touch.phase == TouchPhase.Moved)
            {
                if (Vector2.Distance(mTouchStartPos, touch.position) > 40)
                {
                    // Touch moved too far
                    mTouchMoved = true;
                }
            }
            else if (touch.phase == TouchPhase.Ended)
            {
                if (!mTouchMoved && mTimeElapsed < 1.0)
                {
                    if (mTapped)
                    {
                        // Second tap
                        HandleDoubleTap();
                        mTapped = false;
                    }
                    else
                    {
                        // Wait to see if this is a double tap
                        mTapped = true;
                        mTimeElapsedSinceTap = 0.0f;
                    }
                }
            }
        }

        if (mTapped)
        {
            if (mTimeElapsedSinceTap >= 0.5f)
            {
                // Not a double tap
                HandleTap();
                mTapped = false;
            }
            else
            {
                mTimeElapsedSinceTap += Time.deltaTime;
            }
        }

        // special handling in play mode:
        if (VuforiaRuntimeUtilities.IsPlayMode())
        {
            if (Input.GetMouseButtonUp(0))
            {
                if (PickVideo(Input.mousePosition) != null)
                    Debug.LogWarning("Playing videos is currently not supported in Play Mode.");
            }
        }
    }

    #endregion // UNITY_MONOBEHAVIOUR_METHODS



    #region PRIVATE_METHODS

    /// <summary>
    /// Handle single tap event
    /// </summary>
    private void HandleTap()
    {
        // Find out which video was tapped, if any
        VideoPlaybackBehaviour video = PickVideo(mTouchStartPos);

        if (video != null)
        {
            if (video.VideoPlayer.IsPlayableOnTexture())
            {
                // This video is playable on a texture, toggle playing/paused

                VideoPlayerHelper.MediaState state = video.VideoPlayer.GetStatus();
                if (state == VideoPlayerHelper.MediaState.PAUSED ||
                    state == VideoPlayerHelper.MediaState.READY ||
                    state == VideoPlayerHelper.MediaState.STOPPED)
                {
                    // Pause other videos before playing this one
                    PauseOtherVideos(video);

                    // Play this video on texture where it left off
                    video.VideoPlayer.Play(false, video.VideoPlayer.GetCurrentPosition());
                }
                else if (state == VideoPlayerHelper.MediaState.REACHED_END)
                {
                    // Pause other videos before playing this one
                    PauseOtherVideos(video);

                    // Play this video from the beginning
                    video.VideoPlayer.Play(false, 0);
                }
                else if (state == VideoPlayerHelper.MediaState.PLAYING)
                {
                    // Video is already playing, pause it
                    video.VideoPlayer.Pause();
                }
            }
            else
            {
                // Display the busy icon
                video.ShowBusyIcon();
                
                // This video cannot be played on a texture, play it full screen
                video.VideoPlayer.Play(true, 0);
                mWentToFullScreen = true;
            }
        }
    }


    /// <summary>
    /// Handle double tap event
    /// </summary>
    private void HandleDoubleTap()
    {
        // Find out which video was tapped, if any
        VideoPlaybackBehaviour video = PickVideo(mTouchStartPos);

        if (video != null)
        {
            if (video.VideoPlayer.IsPlayableFullscreen())
            {
                // Pause the video if it is currently playing
                video.VideoPlayer.Pause();

                // Seek the video to the beginning
                video.VideoPlayer.SeekTo(0.0f);

                // Display the busy icon
                video.ShowBusyIcon();

                // Play the video full screen
                video.VideoPlayer.Play(true, 0);
                mWentToFullScreen = true;
            }
        }
    }


    /// <summary>
    /// Find the video object under the screen point
    /// </summary>
    private VideoPlaybackBehaviour PickVideo(Vector3 screenPoint)
    {
        VideoPlaybackBehaviour[] videos = (VideoPlaybackBehaviour[])
                FindObjectsOfType(typeof(VideoPlaybackBehaviour));

        Ray ray = Camera.main.ScreenPointToRay(screenPoint);
        RaycastHit hit = new RaycastHit();

        foreach (VideoPlaybackBehaviour video in videos)
        {
            if (video.collider.Raycast(ray, out hit, 10000))
            {
                return video;
            }
        }

        return null;
    }


    /// <summary>
    /// Pause all videos except this one
    /// </summary>
    private void PauseOtherVideos(VideoPlaybackBehaviour currentVideo)
    {
        VideoPlaybackBehaviour[] videos = (VideoPlaybackBehaviour[])
                FindObjectsOfType(typeof(VideoPlaybackBehaviour));

        foreach (VideoPlaybackBehaviour video in videos)
        {
            if (video != currentVideo)
            {
                if (video.CurrentState == VideoPlayerHelper.MediaState.PLAYING)
                {
                    video.VideoPlayer.Pause();
                }
            }
        }
    }

    #endregion // PRIVATE_METHODS



    #region PUBLIC_METHODS

    /// <summary>
    /// One-time check for the Instructional Screen
    /// </summary>
    public bool CheckWentToFullScreen()
    {
        bool result = mWentToFullScreen;
        mWentToFullScreen = false;
        return result;
    }

    #endregion // PUBLIC_METHODS
}