
Integrating into an existing app

June 29, 2012 - 3:38pm #1

How would one go about integrating the QCAR library into an existing application on the iOS?

Integrating into an existing app

August 2, 2012 - 8:41pm #18

Many thanks for sharing this.

N

Integrating into an existing app

August 1, 2012 - 10:47am #17

If I force the device orientation into portrait and then back into landscape when AR is activated, the model fixes itself, as does the camera orientation.

Integrating into an existing app

July 31, 2012 - 8:50pm #16

I think this might be because portrait is considered the default orientation, since this is the way the camera works on iOS.

So the way the samples work is to use this as the default and rotate accordingly. You may also see that the order of the various messages depends on how the views and threads are initialised, which can make it seem quite complex.

Going back to your issue, the only thing I can suggest is to look at how the samples (including video) do this as this should give you a template as to how to do this.

N

Integrating into an existing app

July 30, 2012 - 10:30am #15

So I saw this method in the Image Targets example app and replicated it in my own application. However, this orientation issue still exists. Interestingly enough, I compared the debug print statements in the console and noticed something. With my iPad physically oriented landscape, my AR app prints three statements of "ARParent: Rotating to Landscape Right". However, when I run the Image Targets app with the iPad again at Landscape Right, I first see "ARParent: Rotating to Portrait" and then two print statements of "ARParent: Rotating to Landscape Right". I do not understand why the app would claim to be in portrait rotation.

Integrating into an existing app

July 27, 2012 - 2:51pm #14

You could try looking at

- (void) handleARViewRotation:(UIInterfaceOrientation)interfaceOrientation

in ARViewController.mm as this handles the autorotation, so your application will need to replicate this functionality somehow. 
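As a rough sketch, assuming your app keeps a reference to an ARViewController instance (mirroring what ARParentViewController does in the sample), you would forward rotation events to it from your own view controller:

- (void)willAnimateRotationToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
                                         duration:(NSTimeInterval)duration
{
    // Let the AR view controller re-orient the camera feed and projection
    [arViewController handleARViewRotation:interfaceOrientation];
}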

N

 

Integrating into an existing app

July 27, 2012 - 12:02pm #13

Hey, so I got the video feed up.

But now, in landscape mode, the video only takes up half the screen and its orientation is rotated to the side.

Any ideas?

Integrating into an existing app

July 17, 2012 - 3:38am #12

One option might be to look at OverlayViewController.mm  

...as this kind of controls things like changing datasets at the moment, and it provides a UI with buttons.

 

HTH

Integrating into an existing app

July 16, 2012 - 9:49am #11

Kind of. Really what I am trying to figure out is which view controller my app needs to hand over control to.

Integrating into an existing app

July 16, 2012 - 2:43am #10

Hi sbhuiyan

In the sample code, the EAGLView contains the OpenGL ES context into which the camera feed and 3D objects are drawn.

The renderFrameQCAR method is called each frame, and it shows that the camera feed is drawn first, with the 3D objects drawn after it.
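As a rough outline of that ordering (a trimmed sketch of the sample's EAGLView.mm; setFramebuffer and presentFramebuffer are the sample's own helpers, and the API names are as used in the samples):

- (void)renderFrameQCAR
{
    [self setFramebuffer];
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Begin the QCAR render pass and draw the camera image first
    QCAR::State state = QCAR::Renderer::getInstance().begin();
    QCAR::Renderer::getInstance().drawVideoBackground();

    // ...then iterate the active trackables in 'state' and draw the 3D augmentations on top...

    QCAR::Renderer::getInstance().end();
    [self presentFramebuffer];
}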

Is this what you wanted to know?

N

Integrating into an existing app

July 13, 2012 - 9:24am #9

Hello N!

 

Thank you for your help. I thought that was something I had covered, but upon double-checking it I found that I was wrong.

 

So does the augmented 3D object appear on the same layer as the camera feed or does it appear on a layer on top of the camera feed?

Integrating into an existing app

July 12, 2012 - 2:50am #8

Hi sbhuiyan

You need to make sure that your equivalent to the EAGLView class adheres to the following protocol:

UIView <UIGLViewProtocol>

and that it has a renderFrameQCAR() method.
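As a minimal sketch (assuming a .mm file with the QCAR headers imported; MyARView is just a placeholder name), the view QCAR is looking for is something like:

@interface MyARView : UIView <UIGLViewProtocol>
@end

@implementation MyARView

// QCAR searches for a UIView backed by a CAEAGLLayer...
+ (Class)layerClass
{
    return [CAEAGLLayer class];
}

// ...that also responds to renderFrameQCAR, which is called once per camera frame
- (void)renderFrameQCAR
{
    // draw the video background and your augmentations here (see EAGLView.mm in the samples)
}

@end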

Once you look through the code it should be easier to understand.

HTH

 

N

 

Integrating into an existing app

July 11, 2012 - 10:01am #7

Thanks! This thread was quite helpful. 

So when I integrate, I seem to be getting this readout in my terminal:

2012-07-11 10:48:09.407 Atlas[14557:707] APPSTATUS_INITED

2012-07-11 10:48:09.409 Atlas[14557:707] APPSTATUS_CAMERA_RUNNING

2012-07-11 10:48:09.434 Atlas[14557:707] DEBUG/AR(14557) UIView has CAEAGLLayer class

2012-07-11 10:48:09.435 Atlas[14557:707] DEBUG/AR(14557) UIView does not respond to selector renderFrameQCAR

2012-07-11 10:48:09.437 Atlas[14557:707] DEBUG/AR(14557) UIView has CAEAGLLayer class

2012-07-11 10:48:09.439 Atlas[14557:707] DEBUG/AR(14557) UIView does not respond to selector renderFrameQCAR

2012-07-11 10:48:09.441 Atlas[14557:707] DEBUG/AR(14557) Could not find a UIView with CAEAGLLayer layer class that responds to selector renderFrameQCAR

2012-07-11 10:48:09.444 Atlas[14557:707] DEBUG/AR(14557) UIView has CAEAGLLayer class

2012-07-11 10:48:09.445 Atlas[14557:707] DEBUG/AR(14557) UIView does not respond to selector renderFrameQCAR

2012-07-11 10:48:09.447 Atlas[14557:707] DEBUG/AR(14557) UIView has CAEAGLLayer class

2012-07-11 10:48:09.449 Atlas[14557:707] DEBUG/AR(14557) UIView does not respond to selector renderFrameQCAR

2012-07-11 10:48:09.451 Atlas[14557:707] DEBUG/AR(14557) Could not find a UIView with CAEAGLLayer layer class that responds to selector renderFrameQCAR

 

How would I go about fixing this?

Integrating into an existing app

July 10, 2012 - 9:57am #6

Integrating into an existing app

July 10, 2012 - 9:19am #5

What do you think is the best way to get the camera feed?

Integrating into an existing app

July 3, 2012 - 8:34am #4

Hi sbhuiyan

If you look in EAGLView.mm of the Image Targets app, the code that tracks and renders is in renderFrameQCAR().

The pose, or model view matrix, gives the position of the trackable.  You can google "model view matrix" for more background info:

QCAR::Matrix44F modelViewMatrix = QCAR::Tool::convertPose2GLMatrix(trackable->getPose());

 

Once you have this, you have the 3D position of the trackable, and you can examine the teapot code to see how this enables you to render at the right location.
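In the Image Targets sample this happens in EAGLView.mm. Roughly, as a trimmed sketch (ShaderUtils, qUtils, kObjectScale and mvpMatrixHandle are names from the sample code, so adapt them to your own app):

QCAR::Matrix44F modelViewMatrix = QCAR::Tool::convertPose2GLMatrix(trackable->getPose());
QCAR::Matrix44F modelViewProjection;

// Position and scale the object relative to the target, then combine with the projection matrix
ShaderUtils::translatePoseMatrix(0.0f, 0.0f, kObjectScale, &modelViewMatrix.data[0]);
ShaderUtils::scalePoseMatrix(kObjectScale, kObjectScale, kObjectScale, &modelViewMatrix.data[0]);
ShaderUtils::multiplyMatrix(&qUtils.projectionMatrix.data[0], &modelViewMatrix.data[0], &modelViewProjection.data[0]);

// The combined matrix is passed to the vertex shader so the teapot is drawn at the trackable's position
glUniformMatrix4fv(mvpMatrixHandle, 1, GL_FALSE, (const GLfloat*)&modelViewProjection.data[0]);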


HTH


N

Integrating into an existing app

July 3, 2012 - 8:20am #3

Hello N,

Thank you for your quick response. Would you mind explaining what the 3D pose/matrix is and where I could find it in the example code? 

Thanks!

Integrating into an existing app

July 2, 2012 - 2:28am #2

Hi sbhuiyan

This is such a wide-ranging and application-specific question that it could have many different answers.

Essentially the samples show the key things that Vuforia does:

  • Get the camera feed
  • Identify and track trackables, e.g. Image Targets, Frame Markers and Multi-Targets
  • Provide a 3D pose/matrix for each trackable
The samples also show how to render augmentations, e.g. teapots, but there is no reason that Vuforia must handle this.  Provided you can get your existing iOS application to talk to Vuforia, it can do its own rendering.  Various threads on the forums, along with the sample code, will give you tips on how to achieve the above.
 
Regarding the specifics, you basically need to grab the QCAR include files you need, put them in your application together with the library, and link against it.
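As a rough sketch of the Xcode side of that (assuming the standard QCAR SDK layout; adjust the paths to wherever the SDK is installed):
  • Add the SDK's include directory to the target's Header Search Paths
  • Add the directory containing libQCAR.a to the Library Search Paths and link against it
  • Compile any file that calls into QCAR as Objective-C++ (give it a .mm extension), as the samples do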
 
The easiest way is to use Unity, because then you do not need to worry about the lower-level details.
 
HTH
 
N