"We offer new support options and therefor the forums are now in read-only mode! Please check out our Support Center for more information." - Vuforia Engine Team

Please explain iOS Cloud Recognition sample application

Hello, 

This is my first post to the Vuforia developer forum. For the past couple of months I have been using the Vuforia iOS SDK and have created some iOS apps based on the Image Targets sample, using my own 3D models and textures. Now I need to learn how to maintain the target images in the cloud rather than in the device database. So I registered for the cloud service, read the developer articles, and downloaded the Cloud Recognition sample app. So far I have a fair understanding of how to create cloud databases, add image targets to them, download the access keys, and incorporate them into the kAccessKey and kSecretKey constants declared in the CRQCARutils.mm file (shown below with the keys redacted).
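For reference, this is roughly how I have set the credentials. I am assuming these two C-string constants are all the client-side configuration the sample needs, and the placeholder values stand in for my real keys:

```objc
// In CRQCARutils.mm -- the client access/secret key pair downloaded from
// the Target Manager for my cloud database (actual values redacted):
static const char* const kAccessKey = "<my-client-access-key>";
static const char* const kSecretKey = "<my-client-secret-key>";
```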

But after that, I have no idea how to modify the sample app to use my own image targets and display my own content when the user points the camera at an image target maintained in the cloud. I find it difficult to follow the execution flow of the sample app. I have read the https://developer.vuforia.com/forum/unity-3-extension-technical-discussion/please-explain-step-step-how-use-vuforia-cloud thread, but it is written for Android. Can someone please help me by explaining the sample app code? It would be great if you could explain which classes and methods need to be modified to display my own content on top of the sample's book cover. My rough guess at the flow is sketched below. Forgive me if what I am asking is too much, and by the way, you guys are doing an amazing job here.
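To show where I am stuck, here is my rough understanding of the per-frame recognition flow, pieced together from the native QCAR::TargetFinder API reference rather than from the sample itself, so please correct anything I have wrong. The comment about where to attach my own content is pure guesswork on my part:

```objc
#import <Foundation/Foundation.h>
#import <QCAR/TrackerManager.h>
#import <QCAR/ImageTracker.h>
#import <QCAR/TargetFinder.h>
#import <QCAR/TargetSearchResult.h>

// My guess at the per-frame flow, based on the QCAR::TargetFinder docs,
// not on the sample's actual code:
QCAR::ImageTracker* tracker = static_cast<QCAR::ImageTracker*>(
    QCAR::TrackerManager::getInstance().getTracker(QCAR::Tracker::IMAGE_TRACKER));
QCAR::TargetFinder* finder = tracker->getTargetFinder();

// Poll for new cloud matches on each frame
if (finder->updateSearchResults() == QCAR::TargetFinder::UPDATE_RESULTS_AVAILABLE
    && finder->getResultCount() > 0)
{
    const QCAR::TargetSearchResult* result = finder->getResult(0);

    // The metadata string uploaded alongside the target looks like the
    // hook for deciding which content to render
    NSLog(@"Matched target, metadata: %s", result->getMetaData());

    if (result->getTrackingRating() > 0)
    {
        // Start tracking the matched target so it can be augmented --
        // is this where I would swap in my own 3D model based on the metadata?
        finder->enableTracking(*result);
    }
}
```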

Thank you,

Amila.