iOS - Sample code tips

May 8, 2014 - 8:59am #1

Basic Q&A on iOS samples

How is the sample code structured?

You can find an overview of the sample apps' code structure and organization (for the Android and iOS native samples) in the Vuforia developer guide:

https://developer.vuforia.com/resources/dev-guide/sample-apps

 

Where can I find the code that renders the augmentations?

The rendering of the 3D augmentation content in the samples (e.g. the 3D teapot models in the Image Targets and Cloud Recognition samples, the video textures in the Video Playback sample, the 3D letter models in the Frame Markers sample, etc.) is performed using OpenGL ES. The relevant OpenGL code is located in a xxxEAGLView class, which is declared in a xxxEAGLView.h header file and implemented in a xxxEAGLView.mm file (e.g. ImageTargetsEAGLView.mm for Image Targets, under the vuforia-sdk/samples/VuforiaSamples/Classes/apps/ folder).

In particular, the renderFrameQCAR method is executed on the OpenGL rendering thread at each frame and includes the OpenGL code to perform these main operations:

  • Clearing the OpenGL viewport
  • Drawing the video background, with QCAR::Renderer::getInstance().drawVideoBackground();
  • Binding the necessary shaders for rendering the 3d geometry
  • Building the model-view and projection matrices from the Trackable pose and feeding them to the shaders
  • Binding the textures that are applicable to the 3d geometry
  • Drawing the meshes (vertex arrays) representing the 3d geometry
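Condensed, the flow of renderFrameQCAR can be sketched as follows (this is a simplified illustration, not the complete sample code; variable and method names such as setFramebuffer and presentFramebuffer follow the Image Targets sample, and the shader/texture binding and mesh drawing details are elided):

- (void)renderFrameQCAR
{
    [self setFramebuffer];

    // Clear the color and depth buffers (the OpenGL viewport)
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Begin rendering and draw the camera image as the video background
    QCAR::State state = QCAR::Renderer::getInstance().begin();
    QCAR::Renderer::getInstance().drawVideoBackground();

    for (int i = 0; i < state.getNumTrackableResults(); ++i) {
        const QCAR::TrackableResult* result = state.getTrackableResult(i);

        // Build the model-view matrix from the trackable pose
        QCAR::Matrix44F modelViewMatrix =
            QCAR::Tool::convertPose2GLMatrix(result->getPose());

        // ... multiply by the projection matrix, bind the shader program
        // and textures, feed the vertex arrays and draw the 3D mesh ...
    }

    QCAR::Renderer::getInstance().end();
    [self presentFramebuffer];
}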

while the initWithFrame initializer contains code to:

  • Creating and setting up the augmentation textures that will later be applied to the various 3D models during rendering
  • Creating and initializing the GLSL shaders (vertex and fragment shaders) that will later be used for rendering the 3D meshes

In particular, the shader initialization is performed within the initShaders method, which in turn invokes the createProgramWithVertexShaderFileName method of the SampleApplicationShaderUtils class.

The SampleApplicationShaderUtils class is defined in the SampleApplicationShaderUtils.h / .m files located in the ‘VuforiaSamples/Classes/SampleApplication/’ folder.
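As an illustration, the shader setup in initShaders essentially compiles and links the shader pair shipped with the sample and caches the attribute/uniform handles used later during rendering. The sketch below is based on the Image Targets sample; the shader file names and handle names are those used there, but treat this as an outline rather than the exact code:

- (void)initShaders
{
    // Compile and link the vertex + fragment shader pair from the app bundle
    shaderProgramID = [SampleApplicationShaderUtils
                          createProgramWithVertexShaderFileName:@"Simple.vertsh"
                                         fragmentShaderFileName:@"Simple.fragsh"];

    if (shaderProgramID > 0) {
        // Cache the handles that renderFrameQCAR will use each frame
        vertexHandle = glGetAttribLocation(shaderProgramID, "vertexPosition");
        normalHandle = glGetAttribLocation(shaderProgramID, "vertexNormal");
        textureCoordHandle = glGetAttribLocation(shaderProgramID, "vertexTexCoord");
        mvpMatrixHandle = glGetUniformLocation(shaderProgramID, "modelViewProjectionMatrix");
        texSampler2DHandle = glGetUniformLocation(shaderProgramID, "texSampler2D");
    }
    else {
        NSLog(@"Could not initialise augmentation shader");
    }
}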

For tips and hints on how to customize the OpenGL code to render your own 3D model, see also:

https://developer.vuforia.com/resources/dev-guide/replacing-teapot

 

How is the OpenGL view attached to a sample UIViewController?

In the VuforiaSamples project, each sample module has a main UIViewController, such as ImageTargetsViewController for Image Targets, FrameMarkersViewController for Frame Markers, and so on. The source files (.h and .mm) of those UIViewControllers are located under the ‘vuforia-sdk-ios-x-y-z/samples/VuforiaSamples-x-y-z/VuforiaSamples/Classes/apps/’ folder.

The loadView method contains code to initialize the eaglView instance variable (which represents the OpenGL view where the camera background and the augmentation will be rendered) and to “set” it as the view of the view-controller, as shown in this example:

eaglView = [[ImageTargetsEAGLView alloc] initWithFrame:viewFrame appSession:vapp];
[self setView:eaglView];

 

Where can I find the code to start/stop the trackers?

The ImageTargetsViewController class implements the SampleApplicationControl protocol, which defines the two methods doStartTrackers and doStopTrackers for starting and stopping the Vuforia trackers; for instance, this is the code used in the Image Targets sample:

- (bool) doStartTrackers {
    QCAR::TrackerManager& trackerManager = QCAR::TrackerManager::getInstance();
    QCAR::Tracker* tracker = trackerManager.getTracker(QCAR::ImageTracker::getClassType());

    if (tracker == NULL) {
        return NO;
    }
    tracker->start();
    return YES;
}

and:

- (bool) doStopTrackers {
    // Stop the tracker
    QCAR::TrackerManager& trackerManager = QCAR::TrackerManager::getInstance();
    QCAR::Tracker* tracker = trackerManager.getTracker(QCAR::ImageTracker::getClassType());

    if (NULL != tracker) {
        tracker->stop();
        NSLog(@"INFO: successfully stopped tracker");
        return YES;
    }
    else {
        NSLog(@"ERROR: failed to get the tracker from the tracker manager");
        return NO;
    }
}

Note that the same (or very similar) code can be found in the other sample view-controllers as well, such as MultiTargetsViewController, FrameMarkersViewController, etc., although the specific tracker class used will be QCAR::MarkerTracker for the Frame Markers sample (instead of QCAR::ImageTracker) and QCAR::TextTracker for the Text Recognition sample.
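For example, in the Frame Markers view-controller the doStartTrackers pattern shown above would simply request the marker tracker instead (a sketch adapted from the Image Targets code above, not copied verbatim from the sample):

- (bool) doStartTrackers {
    QCAR::TrackerManager& trackerManager = QCAR::TrackerManager::getInstance();
    // Frame Markers uses the MarkerTracker instead of the ImageTracker
    QCAR::Tracker* tracker = trackerManager.getTracker(QCAR::MarkerTracker::getClassType());

    if (tracker == NULL) {
        return NO;
    }
    tracker->start();
    return YES;
}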

 

Where can I find the code to start/stop the camera?

The code to start the AR camera can be found in the startAR method of the SampleApplicationSession class (see SampleApplicationSession.mm under ‘vuforia-sdk-ios/samples/VuforiaSamples/Classes/SampleApplication’). The startAR method invokes code to initialize the QCAR camera and to start it:

- (bool) startAR:(QCAR::CameraDevice::CAMERA)camera error:(NSError **)error {
    // Initialize and start the QCAR camera
    if (! [self startCamera: camera viewWidth:self.mARViewBoundsSize.width andHeight:self.mARViewBoundsSize.height error:error]) {
        return NO;
    }
    self.cameraIsActive = YES;
    self.cameraIsStarted = YES;
    return YES;
}

Similarly, the same class contains a stopCamera method to stop and deinitialize the QCAR camera:

// stop the camera
- (bool) stopCamera:(NSError **)error {
    if (self.cameraIsActive) {
        // Stop and deinit the camera
        QCAR::CameraDevice::getInstance().stop();
        QCAR::CameraDevice::getInstance().deinit();
        self.cameraIsActive = NO;
    } else {
        [self NSErrorWithCode:E_CAMERA_NOT_STARTED error:error];
        return NO;
    }
    self.cameraIsStarted = NO;

    // Stop the trackers
    if(! [self.delegate doStopTrackers]) {
        [self NSErrorWithCode:E_STOPPING_TRACKERS error:error];
        return NO;
    }

    return YES;
}

 

Where can I find the code to load datasets?

The ImageTargetsViewController class implements the SampleApplicationControl protocol, which defines the two methods doLoadTrackersData and doUnloadTrackersData; these methods contain code to load / unload Datasets into the Trackers; for instance, this is the code used in the Image Targets sample:

- (bool) doLoadTrackersData {
    dataSetStonesAndChips = [self loadImageTrackerDataSet:@"StonesAndChips.xml"];
    dataSetTarmac = [self loadImageTrackerDataSet:@"Tarmac.xml"];

    if ((dataSetStonesAndChips == NULL) || (dataSetTarmac == NULL)) {
        NSLog(@"Failed to load datasets");
        return NO;
    }

    if (! [self activateDataSet:dataSetStonesAndChips]) {
        NSLog(@"Failed to activate dataset");
        return NO;
    }

    return YES;
}

and this is the code to unload the data:

- (bool) doUnloadTrackersData {
    [self deactivateDataSet: dataSetCurrent];
    dataSetCurrent = nil;

    // Get the image tracker:
    QCAR::TrackerManager& trackerManager = QCAR::TrackerManager::getInstance();
    QCAR::ImageTracker* imageTracker = static_cast<QCAR::ImageTracker*>(trackerManager.getTracker(QCAR::ImageTracker::getClassType()));

    // Destroy the data sets:
    if (!imageTracker->destroyDataSet(dataSetTarmac))  {
        NSLog(@"Failed to destroy data set Tarmac.");
    }

    if (!imageTracker->destroyDataSet(dataSetStonesAndChips)) {
        NSLog(@"Failed to destroy data set Stones and Chips.");
    }

    NSLog(@"datasets destroyed");
    return YES;
}
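The loadImageTrackerDataSet: helper invoked in doLoadTrackersData essentially asks the ImageTracker to create a DataSet object and load it from the XML/DAT pair bundled with the app. A simplified sketch (the real sample helper performs additional error checking and logging):

- (QCAR::DataSet*) loadImageTrackerDataSet:(NSString*)dataFile
{
    QCAR::TrackerManager& trackerManager = QCAR::TrackerManager::getInstance();
    QCAR::ImageTracker* imageTracker = static_cast<QCAR::ImageTracker*>(
        trackerManager.getTracker(QCAR::ImageTracker::getClassType()));

    if (imageTracker == NULL) {
        return NULL;
    }

    // Create an empty dataset and fill it from the app's resource bundle
    QCAR::DataSet* dataSet = imageTracker->createDataSet();
    if (dataSet != NULL &&
        !dataSet->load([dataFile cStringUsingEncoding:NSASCIIStringEncoding],
                       QCAR::DataSet::STORAGE_APPRESOURCE)) {
        // Loading failed: clean up the partially created dataset
        imageTracker->destroyDataSet(dataSet);
        dataSet = NULL;
    }
    return dataSet;
}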

 
