
Simple image tracking application development guide for beginner(Android)

August 25, 2011 - 7:19am #1

Hey guys,

I am new to Qualcomm and augmented reality stuff. After a few hours of digging into it, all I have got is a trackable from the site with two files, which I have placed in my assets folder. I don't know what to do next or how to use the SDK. Can somebody point me to a step-by-step application development tutorial? I have checked the dev guide, but it is more about how to make a trackable than how to make an augmented reality application using Qualcomm's SDK.

All I want to do is start an intent when a specific image, whose .dat and .xml files are present in the assets folder, is viewed by the camera.

Thanks in advance

Re: Simple image tracking application development guide for beginner(Android)

September 19, 2011 - 12:40pm #4

Take a look at this thread, it sounds like a similar issue: http://ar.qualcomm.at/node/2000751

- Kim

Re: Simple image tracking application development guide for beginner(Android)

September 17, 2011 - 1:51am #3

Hey!

I am following almost the same strategy, except I am launching my own WebView intent. Even when I tried using your code, the problem I am facing is that if I press the back button on the Android phone, it keeps replaying the video instead of returning to the camera view.

Are you facing the same problem? If yes, how did you solve it?

Regards,

wahib

Re: Simple image tracking application development guide for beginner(Android)

August 26, 2011 - 2:41am #2

OK, I got it working. The following is the solution, which I got from this link:

http://ar.qualcomm.at/node/2000032

The ImageTargets.cpp I had already contained the renderFrame method, so I had to modify it a little:

JNIEXPORT void JNICALL
Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargetsRenderer_renderFrame(JNIEnv* env, jobject obj)
{
    //LOG("Java_com_qualcomm_QCARSamples_ImageTargets_GLRenderer_renderFrame");

    // Clear color and depth buffer
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Render video background:
    QCAR::State state = QCAR::Renderer::getInstance().begin();

#ifdef USE_OPENGL_ES_1_1
    // Set GL11 flags:
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_NORMAL_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);

    glEnable(GL_TEXTURE_2D);
    glDisable(GL_LIGHTING);
#endif

    glEnable(GL_DEPTH_TEST);
    glEnable(GL_CULL_FACE);

    // Did we find any trackables this frame?
    for (int tIdx = 0; tIdx < state.getNumActiveTrackables(); tIdx++)
    {
        // Get the trackable:
        const QCAR::Trackable* trackable = state.getActiveTrackable(tIdx);
        QCAR::Matrix44F modelViewMatrix =
            QCAR::Tool::convertPose2GLMatrix(trackable->getPose());

        // Choose the texture based on the target name:
        int textureIndex = (!strcmp(trackable->getName(), "stones")) ? 0 : 1;
        const Texture* const thisTexture = textures[textureIndex];

        // Call back into Java: displayMessage(String) on the renderer object.
        // Note: the method returns void, so CallVoidMethod is the correct call.
        jstring js = env->NewStringUTF(trackable->getName());
        jclass javaClass = env->GetObjectClass(obj);
        jmethodID method = env->GetMethodID(javaClass, "displayMessage", "(Ljava/lang/String;)V");
        env->CallVoidMethod(obj, method, js);
    }

    glDisable(GL_DEPTH_TEST);

#ifdef USE_OPENGL_ES_1_1
    glDisable(GL_TEXTURE_2D);
    glDisableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_NORMAL_ARRAY);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
#else
    glDisableVertexAttribArray(vertexHandle);
    glDisableVertexAttribArray(normalHandle);
    glDisableVertexAttribArray(textureCoordHandle);
#endif

    QCAR::Renderer::getInstance().end();
}
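For reference, the native call above expects a matching displayMessage(String) method on the Java side of the renderer class. The post does not show it, so the following is only a minimal sketch of how it could look (my own assumption, including the static mainActivityHandler field name taken from the onResume snippet below): it forwards the detected target name to the activity's handler, which then starts the intent.

```java
// In ImageTargetsRenderer.java -- hypothetical helper, invoked from native renderFrame()
public static Handler mainActivityHandler;

public void displayMessage(String text)
{
    // Forward the detected target name to the activity thread;
    // handleMessage() there starts the intent.
    Message message = new Message();
    message.obj = text;
    mainActivityHandler.sendMessage(message);
}
```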

After making those changes in the ImageTargetsRenderer class, I added the following code in onResume():

ImageTargetsRenderer.mainActivityHandler = new Handler() {
    @Override
    public void handleMessage(Message msg) {
        Intent intent = new Intent(Intent.ACTION_VIEW);
        intent.setData(Uri.parse("http://www.youtube.com/watch?v=DyDA2Abnssg"));
        startActivity(intent);
        ImageTargets.this.finish();
    }
};
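One caveat (my own observation, not from the thread): renderFrame() runs once per camera frame, so the handler above can receive a burst of messages and try to launch the intent several times while the target stays visible. A simple guard is a one-shot flag; here is a plain-Java sketch of that idea, kept independent of Android classes:

```java
import java.util.concurrent.atomic.AtomicBoolean;

// One-shot latch: only the first detection triggers the action.
class DetectionLatch {
    private final AtomicBoolean fired = new AtomicBoolean(false);

    // Returns true exactly once; subsequent calls return false
    // until reset() is called. Thread-safe via compareAndSet.
    boolean shouldFire() {
        return fired.compareAndSet(false, true);
    }

    // Call when the camera view is resumed to allow another launch.
    void reset() {
        fired.set(false);
    }
}
```

In handleMessage() one would check shouldFire() before calling startActivity(), and call reset() in onResume() so a later detection can fire again.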
