Interaction between markers(FrameMarkers)

July 23, 2012 - 11:41am #1

Greetings all,

I'm currently doing my Final Year Project, in which I need to show some interaction between markers.

For example, MarkerA shows an apple and MarkerB shows an orange, and when MarkerA and MarkerB are close to each other, a banana should be shown.

Can anyone enlighten me on how to do this?

Thanks a lot.

Interaction between markers(FrameMarkers)

December 11, 2013 - 10:41am #23

How would I go about doing this in a Unity project? There seems to be no discussion of interacting markers for Unity, and I have no idea how to achieve this. I simply want an animation triggered on one object when another one is brought close to it. I wrote an animation script for just such an event, but I think the targets don't "recognize" each other. I'm fairly new to coding, so please be explicit in your reply.

 

Thanks

 

Dom

Interaction between markers(FrameMarkers)

October 2, 2013 - 12:08am #22

Hi, I tried the code you gave, but it shows a black screen when I install it on my phone and press the start button.

Besides that, I do not see any code referring to the marker that will render the object.

Sorry for my bad English; I hope you understand my question.

Thanks

Interaction between markers(FrameMarkers)

June 11, 2013 - 11:20pm #21

You can easily verify that the marker IDs are unique to each marker with this test:

point your camera at each of the markers (only one at a time) and check the logs to see that the reported marker ID is actually unique for each printed marker.

The rest is application logic...
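As a sketch of that application logic, here is the proximity rule from the original question (self-contained and not tied to the QCAR API; the `Vec3` struct and the threshold value are illustrative assumptions, not SDK types):

```cpp
#include <cassert>
#include <cmath>

// Minimal stand-in for a 3D marker center (illustrative; not the QCAR type).
struct Vec3 { float x, y, z; };

// Euclidean distance between two marker centers.
float distanceBetween(const Vec3& a, const Vec3& b)
{
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// "Interaction" rule from the original question: when two markers come
// closer than some threshold (in the same units as the marker poses),
// the app switches to a third piece of content.
bool markersInteract(const Vec3& a, const Vec3& b, float threshold)
{
    return distanceBetween(a, b) < threshold;
}
```

Per frame you would feed this the marker centers computed from the poses (as in the ImageTargets.cpp posted later in this thread) and pick the content to render based on the result.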

 

Interaction between markers(FrameMarkers)

June 11, 2013 - 11:14pm #20

Still, it doesn't seem to work as expected. Here's the bit of code I use for the logic:

const int numberOfMarkers = 16;                   // the number of available markers

int markersNum = state.getNumTrackableResults();  // number of tracked markers
QCAR::Vec3F targetCenters[numberOfMarkers + 1];   // sort of buffer
QCAR::Vec3F updatedTargetCenters[markersNum + 1]; // the actual vector used for drawing

Queue<int> queue(numberOfMarkers + 1); // supposed to store markers' ids in the order of their appearance

for (int tIdx = 0; tIdx < markersNum; tIdx++)
{
    // Get the trackable:
    const QCAR::TrackableResult* result = state.getTrackableResult(tIdx);
    const QCAR::Trackable& trackable = result->getTrackable();

    QCAR::Matrix44F modelViewMatrix = QCAR::Tool::convertPose2GLMatrix(result->getPose());

    // Check the type of the trackable:
    assert(result->getType() == QCAR::TrackableResult::MARKER_RESULT);
    const QCAR::MarkerResult* markerResult = static_cast<const QCAR::MarkerResult*>(result);
    const QCAR::Marker& marker = markerResult->getTrackable();

    // ...

    // store the marker's center at its id's position in the buffer
    targetCenters[marker.getMarkerId()].data[0] = position.data[0];
    // and so on for each data[#]

    LOG("tIdx: [%d] mId: [%d]", tIdx, marker.getMarkerId());

    // enqueue the current marker's id
    queue.Enqueue(marker.getMarkerId());
}

// ...

// update the target centers
for (int i = 0; i < markersNum + 1; i++)
{
    int mId = queue.Dequeue();
    updatedTargetCenters[i].data[0] = targetCenters[mId].data[0];
    // and so on for each data[#]
}

// ...

glVertexAttribPointer(vertexHandle, 3, GL_FLOAT, GL_FALSE, 0, (const GLvoid*) &updatedTargetCenters[0].data[0]);

 

Apparently, tIdx is the same as getMarkerId(). (The Queue class is a circular FIFO queue. The loops go to (markersNum+1) and (numberOfMarkers+1) because of one last vertex, representing infinity, which is calculated in a final step.)

Need help.

 

Mike

Interaction between markers(FrameMarkers)

June 11, 2013 - 11:01am #19

glad to help!

Interaction between markers(FrameMarkers)

June 11, 2013 - 9:42am #18

Alessandro,

Thanks a lot! It was so obvious. I forgot to use the updated array.

 

Mike

Interaction between markers(FrameMarkers)

June 11, 2013 - 7:33am #17

Hi, if you are using FrameMarkers, each Frame Marker has a unique ID, called the "marker ID";

this can be retrieved using the getMarkerId() method of the QCAR::Marker class. This should allow you to uniquely identify each marker and implement your logic.
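For instance, once each marker's ID is known, the mapping from IDs to content is ordinary application code. A self-contained sketch (the IDs 0 and 1 and the fruit labels are made-up examples, matching the original question):

```cpp
#include <cassert>
#include <map>
#include <string>

// Illustrative mapping from FrameMarker IDs (as returned by getMarkerId())
// to the content each marker shows. The IDs and labels are assumptions.
std::string contentForMarker(int markerId)
{
    static const std::map<int, std::string> content = {
        {0, "apple"},
        {1, "orange"},
    };
    std::map<int, std::string>::const_iterator it = content.find(markerId);
    return it != content.end() ? it->second : "unknown";
}
```

In the render loop you would call this with the ID of each tracked marker and render the corresponding model.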

 

Interaction between markers(FrameMarkers)

June 11, 2013 - 4:15am #16

Hi Alessandro,

What should that logic (for tracking the markers' appearance) be based on? In the sample apps the state object is used to retrieve the trackables. I can't use this state object as it uses an index to identify each trackable. Should I use a frame object and its timestamp, or is there another way to solve this? I apologize for the n00b-kind questions; I'm completely new to all of this stuff, but it is very interesting and I want to get into it. Thanks!

Interaction between markers(FrameMarkers)

May 8, 2013 - 4:30am #15

Hi, as a general note, some basic understanding of 3D maths (and in particular of how transformations are handled with matrices in 3D computer graphics) is generally needed to tackle these kinds of issues (I cannot teach the foundations of 3D maths via the forum, as you can understand).

So, my first recommendation would be for you to study this subject a bit (there are plenty of resources online).

Then, just very briefly, the meaning of the 3x3 rotation matrix is quite simple: every column in the matrix represents the components (x, y, z) of one of the axes of a 3D reference frame as represented in another reference frame; so in practice it tells you how to rotate from one reference frame to the other.

More practically, if you have a vector V whose coordinates in one reference frame are (x, y, z), you can find the coordinates of the same vector represented in the second reference frame by multiplying the matrix by the vector V.

And this is just for the rotation.

The translation (position) part is more obvious.
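A minimal, self-contained sketch of that matrix-times-vector step, plus the yaw question from the post above (row-major layout and a z-y-x Euler convention are assumptions here; check the layout of the actual pose matrix before reusing this):

```cpp
#include <cassert>
#include <cmath>

// Apply a 3x3 rotation matrix (row-major here) to a vector: out = R * v.
void rotate(const float R[3][3], const float v[3], float out[3])
{
    for (int i = 0; i < 3; ++i)
        out[i] = R[i][0] * v[0] + R[i][1] * v[1] + R[i][2] * v[2];
}

// Yaw (rotation about the z axis) recovered from a rotation matrix,
// assuming a z-y-x Euler angle convention.
float yawOf(const float R[3][3])
{
    return std::atan2(R[1][0], R[0][0]);
}
```

For example, a 90-degree rotation about z maps the x axis (1, 0, 0) onto the y axis (0, 1, 0), and yawOf returns pi/2 for that matrix.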

but if I want a personally designed line, maybe even an animated one, how can that be done?

This can be done in many ways; the way I would do it is to draw a line between the two markers and use a special shader to achieve the animation effect. Alternatively, you can draw a line (using GL_LINES) between two points and animate the coordinates of those two points every frame, based on some timer.
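The second option can be sketched with plain linear interpolation driven by a timer (self-contained; the 0..1 parameter t would come from your frame clock, e.g. fmod of elapsed seconds over an assumed animation period):

```cpp
#include <cassert>

// Linear interpolation of one coordinate; t in [0, 1] is derived from a
// timer each frame.
float lerp(float a, float b, float t)
{
    return a + (b - a) * t;
}

// Animated endpoint: slides from `from` to `to` as t goes 0 -> 1,
// producing the vertex you would feed to GL_LINES each frame.
void animatedEndpoint(const float from[3], const float to[3], float t, float out[3])
{
    for (int i = 0; i < 3; ++i)
        out[i] = lerp(from[i], to[i], t);
}
```

Re-uploading the two endpoint vertices each frame via glVertexAttribPointer, as in the sample below, then produces a moving line.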

Is it possible for the line to connect markers independently of their id, I mean in the order of their appearance?

Yes, you will need to implement some logic to keep track of which marker was detected first (i.e. to keep track of the order of appearance), as Vuforia does not provide an API to do that out of the box.
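A minimal version of that bookkeeping (self-contained sketch; the `seen` vector is assumed to persist across frames, and the marker IDs in the example are illustrative):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Records marker IDs in the order they were first seen. Call once per
// detected marker per frame; IDs already recorded are ignored, so `seen`
// ends up holding the markers in order of first appearance.
void recordAppearance(std::vector<int>& seen, int markerId)
{
    if (std::find(seen.begin(), seen.end(), markerId) == seen.end())
        seen.push_back(markerId);
}
```

Drawing the line loop over `seen` instead of over consecutive IDs then connects the markers in appearance order. (You may also want to purge IDs that have not been tracked for a while, depending on the desired behaviour.)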

 

 

Interaction between markers(FrameMarkers)

May 8, 2013 - 2:57am #14

Hi Alessandro,

Yes, I use it to draw a line loop among the markers, but if I want to know, say, the yaw, how can I get it? As I understand it, the getPose() method provides a 3x4 matrix with 6DOF, but from https://developer.vuforia.com/resources/dev-guide/pose-matrix-explained I don't clearly understand that 3x3 rotation sub-matrix. I suppose I have to learn some matrix transformation stuff.

Speaking of the laser beam, supposedly it can be done using a GL_LINE_STRIP primitive, providing the first vertex, the last vertex at infinity, and the intermediate vertices according to the markers' positions, but if I want a personally designed line, maybe even an animated one, how can that be done?

And another one (from the bunch of questions): in that sample app with the line loop, it connects markers with consecutive ids. Is it possible for the line to connect markers independently of their id, I mean in the order of their appearance?

Thanks

Interaction between markers(FrameMarkers)

May 7, 2013 - 2:50am #13

Hi, have you checked this article:

https://developer.vuforia.com/resources/dev-guide/unified-target-coordinates

This explains how to handle multiple target reference frames in a common "unified" world reference frame.

Interaction between markers(FrameMarkers)

April 13, 2013 - 3:42am #12

Hi,

I'm developing an application which, basically, deflects a light beam off the occurring mirrors (using the Vuforia SDK for Android). Everything is fine; as DavidBeard suggested in an email correspondence, I replaced the indexing with targetCenters[tIdx].data[#] when iterating through state.getNumActiveTrackables(). Now I need to know the beam direction from mirror to mirror. Any ideas? And a second question: where do I get the marker's position from, is it from the getPose() method?

Interaction between markers(FrameMarkers)

July 26, 2012 - 12:40pm #11

You can remove the #include of fastcv.h - you won't need it.

Interaction between markers(FrameMarkers)

July 26, 2012 - 10:42am #10

I got an error while trying to ndk-build the file you sent me:

jni/ImageTargets.cpp:46:20: error: fastcv.h: No such file or directory

 

Interaction between markers(FrameMarkers)

July 25, 2012 - 6:43pm #9

Yep, I had built it.

Interaction between markers(FrameMarkers)

July 25, 2012 - 10:56am #8

Did you run ndk-build on that? If you PM your email address, I'll send you a tested example of ImageTargets.cpp to get you going.

Interaction between markers(FrameMarkers)

July 24, 2012 - 8:23pm #7

Hi david,

 

Any solution for this?

Interaction between markers(FrameMarkers)

July 23, 2012 - 12:39pm #6

This is the edited ImageTargets.cpp file

/*==============================================================================
            Copyright (c) 2012 QUALCOMM Austria Research Center GmbH.
            All Rights Reserved.
            Qualcomm Confidential and Proprietary

@file
    ImageTargets.cpp

@brief
    Sample for ImageTargets

==============================================================================*/

 

 

#include <jni.h>

#include <android/log.h>

#include <stdio.h>

#include <string.h>

#include <assert.h>

#include "SampleMath.h"

 

#ifdef USE_OPENGL_ES_1_1

#include <GLES/gl.h>

#include <GLES/glext.h>

#else

#include <GLES2/gl2.h>

#include <GLES2/gl2ext.h>

#endif

 

#include <QCAR/QCAR.h>

#include <QCAR/CameraDevice.h>

#include <QCAR/Renderer.h>

#include <QCAR/VideoBackgroundConfig.h>

#include <QCAR/Trackable.h>

#include <QCAR/Tool.h>

#include <QCAR/Tracker.h>

#include <QCAR/TrackerManager.h>

#include <QCAR/ImageTracker.h>

#include <QCAR/CameraCalibration.h>

#include <QCAR/UpdateCallback.h>

#include <QCAR/DataSet.h>

 

#include "SampleUtils.h"

#include "Texture.h"

#include "CubeShaders.h"

#include "Teapot.h"

 

#ifdef __cplusplus

extern "C"

{

#endif

 

// Textures:

int textureCount                = 0;

Texture** textures              = 0;

 

// OpenGL ES 2.0 specific:

#ifdef USE_OPENGL_ES_2_0

unsigned int shaderProgramID    = 0;

GLint vertexHandle              = 0;

GLint normalHandle              = 0;

GLint textureCoordHandle        = 0;

GLint mvpMatrixHandle           = 0;

#endif

 

// Screen dimensions:

unsigned int screenWidth        = 0;

unsigned int screenHeight       = 0;

 

// Indicates whether screen is in portrait (true) or landscape (false) mode

bool isActivityInPortraitMode   = false;

 

// The projection matrix used for rendering virtual objects:

QCAR::Matrix44F projectionMatrix;

 

// Constants:

static const float kObjectScale = 3.f;

 

QCAR::DataSet* dataSetStonesAndChips    = 0;

QCAR::DataSet* dataSetTarmac            = 0;

 

bool switchDataSetAsap          = false;

 

// Object to receive update callbacks from QCAR SDK

class ImageTargets_UpdateCallback : public QCAR::UpdateCallback

{   

    virtual void QCAR_onUpdate(QCAR::State& /*state*/)

    {

        if (switchDataSetAsap)

        {

            switchDataSetAsap = false;

 

            // Get the image tracker:

            QCAR::TrackerManager& trackerManager = QCAR::TrackerManager::getInstance();

            QCAR::ImageTracker* imageTracker = static_cast<QCAR::ImageTracker*>(

                trackerManager.getTracker(QCAR::Tracker::IMAGE_TRACKER));

            if (imageTracker == 0 || dataSetStonesAndChips == 0 || dataSetTarmac == 0 ||

                imageTracker->getActiveDataSet() == 0)

            {

                LOG("Failed to switch data set.");

                return;

            }

            

            if (imageTracker->getActiveDataSet() == dataSetStonesAndChips)

            {

                imageTracker->deactivateDataSet(dataSetStonesAndChips);

                imageTracker->activateDataSet(dataSetTarmac);

            }

            else

            {

                imageTracker->deactivateDataSet(dataSetTarmac);

                imageTracker->activateDataSet(dataSetStonesAndChips);

            }

        }

    }

};

 

ImageTargets_UpdateCallback updateCallback;

 

JNIEXPORT int JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_getOpenGlEsVersionNative(JNIEnv *, jobject)

{

#ifdef USE_OPENGL_ES_1_1        

    return 1;

#else

    return 2;

#endif

}

 

 

JNIEXPORT void JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_setActivityPortraitMode(JNIEnv *, jobject, jboolean isPortrait)

{

    isActivityInPortraitMode = isPortrait;

}

 

 

 

JNIEXPORT void JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_switchDatasetAsap(JNIEnv *, jobject)

{

    switchDataSetAsap = true;

}

 

 

JNIEXPORT int JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_initTracker(JNIEnv *, jobject)

{

    LOG("Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_initTracker");

    

    // Initialize the image tracker:

    QCAR::TrackerManager& trackerManager = QCAR::TrackerManager::getInstance();

    QCAR::Tracker* tracker = trackerManager.initTracker(QCAR::Tracker::IMAGE_TRACKER);

    if (tracker == NULL)

    {

        LOG("Failed to initialize ImageTracker.");

        return 0;

    }

 

    LOG("Successfully initialized ImageTracker.");

    return 1;

}

 

 

JNIEXPORT void JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_deinitTracker(JNIEnv *, jobject)

{

    LOG("Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_deinitTracker");

 

    // Deinit the image tracker:

    QCAR::TrackerManager& trackerManager = QCAR::TrackerManager::getInstance();

    trackerManager.deinitTracker(QCAR::Tracker::IMAGE_TRACKER);

}

 

 

JNIEXPORT int JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_loadTrackerData(JNIEnv *, jobject)

{

    LOG("Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_loadTrackerData");

    

    // Get the image tracker:

    QCAR::TrackerManager& trackerManager = QCAR::TrackerManager::getInstance();

    QCAR::ImageTracker* imageTracker = static_cast<QCAR::ImageTracker*>(

                    trackerManager.getTracker(QCAR::Tracker::IMAGE_TRACKER));

    if (imageTracker == NULL)

    {

        LOG("Failed to load tracking data set because the ImageTracker has not"

            " been initialized.");

        return 0;

    }

 

    // Create the data sets:

    dataSetStonesAndChips = imageTracker->createDataSet();

    if (dataSetStonesAndChips == 0)

    {

        LOG("Failed to create a new tracking data.");

        return 0;

    }

 

    dataSetTarmac = imageTracker->createDataSet();

    if (dataSetTarmac == 0)

    {

        LOG("Failed to create a new tracking data.");

        return 0;

    }

 

    // Load the data sets:

    if (!dataSetStonesAndChips->load("StonesAndChips.xml", QCAR::DataSet::STORAGE_APPRESOURCE))

    {

        LOG("Failed to load data set.");

        return 0;

    }

 

    if (!dataSetTarmac->load("Tarmac.xml", QCAR::DataSet::STORAGE_APPRESOURCE))

    {

        LOG("Failed to load data set.");

        return 0;

    }

 

    // Activate the data set:

    if (!imageTracker->activateDataSet(dataSetStonesAndChips))

    {

        LOG("Failed to activate data set.");

        return 0;

    }

 

    LOG("Successfully loaded and activated data set.");

    return 1;

}

 

 

JNIEXPORT int JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_destroyTrackerData(JNIEnv *, jobject)

{

    LOG("Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_destroyTrackerData");

 

    // Get the image tracker:

    QCAR::TrackerManager& trackerManager = QCAR::TrackerManager::getInstance();

    QCAR::ImageTracker* imageTracker = static_cast<QCAR::ImageTracker*>(

        trackerManager.getTracker(QCAR::Tracker::IMAGE_TRACKER));

    if (imageTracker == NULL)

    {

        LOG("Failed to destroy the tracking data set because the ImageTracker has not"

            " been initialized.");

        return 0;

    }

    

    if (dataSetStonesAndChips != 0)

    {

        if (imageTracker->getActiveDataSet() == dataSetStonesAndChips &&

            !imageTracker->deactivateDataSet(dataSetStonesAndChips))

        {

            LOG("Failed to destroy the tracking data set StonesAndChips because the data set "

                "could not be deactivated.");

            return 0;

        }

 

        if (!imageTracker->destroyDataSet(dataSetStonesAndChips))

        {

            LOG("Failed to destroy the tracking data set StonesAndChips.");

            return 0;

        }

 

        LOG("Successfully destroyed the data set StonesAndChips.");

        dataSetStonesAndChips = 0;

    }

 

    if (dataSetTarmac != 0)

    {

        if (imageTracker->getActiveDataSet() == dataSetTarmac &&

            !imageTracker->deactivateDataSet(dataSetTarmac))

        {

            LOG("Failed to destroy the tracking data set Tarmac because the data set "

                "could not be deactivated.");

            return 0;

        }

 

        if (!imageTracker->destroyDataSet(dataSetTarmac))

        {

            LOG("Failed to destroy the tracking data set Tarmac.");

            return 0;

        }

 

        LOG("Successfully destroyed the data set Tarmac.");

        dataSetTarmac = 0;

    }

 

    return 1;

}

 

 

JNIEXPORT void JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_onQCARInitializedNative(JNIEnv *, jobject)

{

    // Register the update callback where we handle the data set swap:

    QCAR::registerCallback(&updateCallback);

 

    // Enable tracking of up to 2 targets simultaneously and
    // split the work over multiple frames:
    QCAR::setHint(QCAR::HINT_MAX_SIMULTANEOUS_IMAGE_TARGETS, 2);
    QCAR::setHint(QCAR::HINT_IMAGE_TARGET_MULTI_FRAME_ENABLED, 1);

}

 

 

JNIEXPORT void JNICALL
Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargetsRenderer_renderFrame(JNIEnv *, jobject)
{
    // Clear the color and depth buffers:
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Render the video background:
    QCAR::State state = QCAR::Renderer::getInstance().begin();

#ifdef USE_OPENGL_ES_1_1
    // Set GL11 flags:
    glEnableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_NORMAL_ARRAY);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glDisable(GL_LIGHTING);
#endif

    glDisable(GL_TEXTURE_2D);
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_CULL_FACE);

    QCAR::Matrix44F mainModelViewMatrix;
    QCAR::Vec3F targetCenters[2]; // make this big enough to hold all your targets

    // Did we find any trackables this frame?
    for (int tIdx = 0; tIdx < state.getNumActiveTrackables(); tIdx++)
    {
        // Get the trackable:
        const QCAR::Trackable* trackable = state.getActiveTrackable(tIdx);
        QCAR::Matrix44F modelViewMatrix =
            QCAR::Tool::convertPose2GLMatrix(trackable->getPose());

        if (tIdx == 0)
        {
            // Make the first visible target our world center (0, 0, 0).
            // Store its modelViewMatrix and continue looking for other targets.
            mainModelViewMatrix = modelViewMatrix;
            targetCenters[0].data[0] = 0.0f;
            targetCenters[0].data[1] = 0.0f;
            targetCenters[0].data[2] = 0.0f;
        }
        else
        {
            // This is another visible target.
            // Find its center point in relation to the first target.
            // To do this we use the matrix inverse function (SampleMath.h from the Dominoes project).
            QCAR::Matrix44F mainModelViewInverse = SampleMath::Matrix44FInverse(mainModelViewMatrix);
            QCAR::Matrix44F modelViewTranspose = SampleMath::Matrix44FTranspose(modelViewMatrix); // let's work with row-major matrices
            QCAR::Matrix44F offsetMatrix = QCAR::Tool::multiply(mainModelViewInverse, modelViewTranspose);

            // Transform a point on the second target by this offset matrix;
            // (0, 0, 0) is the local center of the target.
            QCAR::Vec4F position(0.0f, 0.0f, 0.0f, 1.0f);
            position = SampleMath::Vec4FTransform(position, offsetMatrix);

            // Add this position to our array:
            targetCenters[1].data[0] = position.data[0];
            targetCenters[1].data[1] = position.data[1];
            targetCenters[1].data[2] = position.data[2];
        }
    }

    if (state.getNumActiveTrackables() > 1)
    {
#ifdef USE_OPENGL_ES_1_1
        // Load the projection matrix:
        glMatrixMode(GL_PROJECTION);
        glLoadMatrixf(projectionMatrix.data);

        // Load the model view matrix:
        glMatrixMode(GL_MODELVIEW);
        glLoadMatrixf(mainModelViewMatrix.data);

        // Set the color to red:
        glColor4f(1.0f, 0.0f, 0.0f, 1.0f);

        // Draw the line:
        glVertexPointer(3, GL_FLOAT, 0, (const GLvoid*) &targetCenters[0].data[0]);
        glDrawArrays(GL_LINES, 0, 2);
#else
        QCAR::Matrix44F modelViewProjection;

        SampleUtils::multiplyMatrix(&projectionMatrix.data[0],
                                    &mainModelViewMatrix.data[0],
                                    &modelViewProjection.data[0]);

        glUseProgram(shaderProgramID);

        glVertexAttribPointer(vertexHandle, 3, GL_FLOAT, GL_FALSE, 0,
                              (const GLvoid*) &targetCenters[0].data[0]);

        glEnableVertexAttribArray(vertexHandle);

        glUniformMatrix4fv(mvpMatrixHandle, 1, GL_FALSE,
                           (GLfloat*) &modelViewProjection.data[0]);
        glDrawArrays(GL_LINES, 0, 2);
#endif
    }

    glDisable(GL_DEPTH_TEST);

#ifdef USE_OPENGL_ES_1_1
    glDisable(GL_TEXTURE_2D);
    glDisableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_NORMAL_ARRAY);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
#else
    glEnable(GL_TEXTURE_2D);
    glDisableVertexAttribArray(vertexHandle);
    glDisableVertexAttribArray(normalHandle);
    glDisableVertexAttribArray(textureCoordHandle);
#endif

    QCAR::Renderer::getInstance().end();
}

 

 

void

configureVideoBackground()

{

    // Get the default video mode:

    QCAR::CameraDevice& cameraDevice = QCAR::CameraDevice::getInstance();

    QCAR::VideoMode videoMode = cameraDevice.

                                getVideoMode(QCAR::CameraDevice::MODE_DEFAULT);

 

 

    // Configure the video background

    QCAR::VideoBackgroundConfig config;

    config.mEnabled = true;

    config.mSynchronous = true;

    config.mPosition.data[0] = 0.0f;

    config.mPosition.data[1] = 0.0f;

    

    if (isActivityInPortraitMode)

    {

        //LOG("configureVideoBackground PORTRAIT");

        config.mSize.data[0] = videoMode.mHeight

                                * (screenHeight / (float)videoMode.mWidth);

        config.mSize.data[1] = screenHeight;

 

        if(config.mSize.data[0] < screenWidth)

        {

            LOG("Correcting rendering background size to handle mismatch between screen and video aspect ratios.");

            config.mSize.data[0] = screenWidth;

            config.mSize.data[1] = screenWidth * 

                              (videoMode.mWidth / (float)videoMode.mHeight);

        }

    }

    else

    {

        //LOG("configureVideoBackground LANDSCAPE");

        config.mSize.data[0] = screenWidth;

        config.mSize.data[1] = videoMode.mHeight

                            * (screenWidth / (float)videoMode.mWidth);

 

        if(config.mSize.data[1] < screenHeight)

        {

            LOG("Correcting rendering background size to handle mismatch between screen and video aspect ratios.");

            config.mSize.data[0] = screenHeight

                                * (videoMode.mWidth / (float)videoMode.mHeight);

            config.mSize.data[1] = screenHeight;

        }

    }

 

    LOG("Configure Video Background : Video (%d,%d), Screen (%d,%d), mSize (%d,%d)", videoMode.mWidth, videoMode.mHeight, screenWidth, screenHeight, config.mSize.data[0], config.mSize.data[1]);

 

    // Set the config:

    QCAR::Renderer::getInstance().setVideoBackgroundConfig(config);

}

 

 

JNIEXPORT void JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_initApplicationNative(

                            JNIEnv* env, jobject obj, jint width, jint height)

                            

{

    LOG("Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_initApplicationNative");

    

    // Store screen dimensions

    screenWidth = width;

    screenHeight = height;

        

    // Handle to the activity class:

    jclass activityClass = env->GetObjectClass(obj);

 

    jmethodID getTextureCountMethodID = env->GetMethodID(activityClass,

                                                    "getTextureCount", "()I");

    if (getTextureCountMethodID == 0)

    {

        LOG("Function getTextureCount() not found.");

        return;

    }

 

    textureCount = env->CallIntMethod(obj, getTextureCountMethodID);    

    if (!textureCount)

    {

        LOG("getTextureCount() returned zero.");

        return;

    }

 

    textures = new Texture*[textureCount];

 

    jmethodID getTextureMethodID = env->GetMethodID(activityClass,

        "getTexture", "(I)Lcom/qualcomm/QCARSamples/ImageTargets/Texture;");

 

    if (getTextureMethodID == 0)

    {

        LOG("Function getTexture() not found.");

        return;

    }

 

    // Register the textures

    for (int i = 0; i < textureCount; ++i)

    {

 

        jobject textureObject = env->CallObjectMethod(obj, getTextureMethodID, i); 

        if (textureObject == NULL)

        {

            LOG("GetTexture() returned zero pointer");

            return;

        }

 

        textures[i] = Texture::create(env, textureObject);

    }

    LOG("Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_initApplicationNative finished");

}

 

 

JNIEXPORT void JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_deinitApplicationNative(

                                                        JNIEnv* env, jobject obj)

{

    LOG("Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_deinitApplicationNative");

 

    // Release texture resources

    if (textures != 0)

    {    

        for (int i = 0; i < textureCount; ++i)

        {

            delete textures[i];

            textures[i] = NULL;

        }

    

        delete[]textures;

        textures = NULL;

        

        textureCount = 0;

    }

}

 

 

JNIEXPORT void JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_startCamera(JNIEnv *,

                                                                         jobject)

{

    LOG("Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_startCamera");

 

    // Initialize the camera:

    if (!QCAR::CameraDevice::getInstance().init())

        return;

 

    // Configure the video background

    configureVideoBackground();

 

    // Select the default mode:

    if (!QCAR::CameraDevice::getInstance().selectVideoMode(

                                QCAR::CameraDevice::MODE_DEFAULT))

        return;

 

    // Start the camera:

    if (!QCAR::CameraDevice::getInstance().start())

        return;

 

    // Uncomment to enable flash

    //if(QCAR::CameraDevice::getInstance().setFlashTorchMode(true))

    // LOG("IMAGE TARGETS : enabled torch");

 

    // Uncomment to enable infinity focus mode, or any other supported focus mode

    // See CameraDevice.h for supported focus modes

    //if(QCAR::CameraDevice::getInstance().setFocusMode(QCAR::CameraDevice::FOCUS_MODE_INFINITY))

    // LOG("IMAGE TARGETS : enabled infinity focus");

 

    // Start the tracker:

    QCAR::TrackerManager& trackerManager = QCAR::TrackerManager::getInstance();

    QCAR::Tracker* imageTracker = trackerManager.getTracker(QCAR::Tracker::IMAGE_TRACKER);

    if(imageTracker != 0)

        imageTracker->start();

}

 

 

JNIEXPORT void JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_stopCamera(JNIEnv *, jobject)

{

    LOG("Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_stopCamera");

 

    // Stop the tracker:

    QCAR::TrackerManager& trackerManager = QCAR::TrackerManager::getInstance();

    QCAR::Tracker* imageTracker = trackerManager.getTracker(QCAR::Tracker::IMAGE_TRACKER);

    if(imageTracker != 0)

        imageTracker->stop();

    

    QCAR::CameraDevice::getInstance().stop();

    QCAR::CameraDevice::getInstance().deinit();

}

 

 

JNIEXPORT void JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_setProjectionMatrix(JNIEnv *, jobject)

{

    LOG("Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_setProjectionMatrix");

 

    // Cache the projection matrix:

    const QCAR::CameraCalibration& cameraCalibration =

                                QCAR::CameraDevice::getInstance().getCameraCalibration();

    projectionMatrix = QCAR::Tool::getProjectionGL(cameraCalibration, 2.0f,

                                            2000.0f);

}

 

 

JNIEXPORT jboolean JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_activateFlash(JNIEnv*, jobject, jboolean flash)

{

    return QCAR::CameraDevice::getInstance().setFlashTorchMode((flash==JNI_TRUE)) ? JNI_TRUE : JNI_FALSE;

}

 

 

JNIEXPORT jboolean JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_autofocus(JNIEnv*, jobject)

{

    return QCAR::CameraDevice::getInstance().setFocusMode(QCAR::CameraDevice::FOCUS_MODE_TRIGGERAUTO) ? JNI_TRUE : JNI_FALSE;

}

 

 

JNIEXPORT jboolean JNICALL

Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_setFocusMode(JNIEnv*, jobject, jint mode)

{

    int qcarFocusMode;

 

    switch ((int)mode)

    {

        case 0:

            qcarFocusMode = QCAR::CameraDevice::FOCUS_MODE_NORMAL;

            break;

        

        case 1:

            qcarFocusMode = QCAR::CameraDevice::FOCUS_MODE_CONTINUOUSAUTO;
            break;

        case 2:
            qcarFocusMode = QCAR::CameraDevice::FOCUS_MODE_INFINITY;
            break;

        case 3:
            qcarFocusMode = QCAR::CameraDevice::FOCUS_MODE_MACRO;
            break;

        default:
            return JNI_FALSE;
    }

    return QCAR::CameraDevice::getInstance().setFocusMode(qcarFocusMode) ? JNI_TRUE : JNI_FALSE;
}


JNIEXPORT void JNICALL
Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargetsRenderer_initRendering(
                                                    JNIEnv* env, jobject obj)
{
    LOG("Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargetsRenderer_initRendering");

    // Define clear color
    glClearColor(0.0f, 0.0f, 0.0f, QCAR::requiresAlpha() ? 0.0f : 1.0f);

    // Now generate the OpenGL texture objects and add settings
    for (int i = 0; i < textureCount; ++i)
    {
        glGenTextures(1, &(textures[i]->mTextureID));
        glBindTexture(GL_TEXTURE_2D, textures[i]->mTextureID);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, textures[i]->mWidth,
            textures[i]->mHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE,
            (GLvoid*) textures[i]->mData);
    }

#ifndef USE_OPENGL_ES_1_1
    shaderProgramID     = SampleUtils::createProgramFromBuffer(cubeMeshVertexShader,
                                                               cubeFragmentShader);
    vertexHandle        = glGetAttribLocation(shaderProgramID, "vertexPosition");
    normalHandle        = glGetAttribLocation(shaderProgramID, "vertexNormal");
    textureCoordHandle  = glGetAttribLocation(shaderProgramID, "vertexTexCoord");
    mvpMatrixHandle     = glGetUniformLocation(shaderProgramID,
                                               "modelViewProjectionMatrix");
#endif
}


JNIEXPORT void JNICALL
Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargetsRenderer_updateRendering(
                        JNIEnv* env, jobject obj, jint width, jint height)
{
    LOG("Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargetsRenderer_updateRendering");

    // Update screen dimensions
    screenWidth = width;
    screenHeight = height;

    // Reconfigure the video background
    configureVideoBackground();
}


#ifdef __cplusplus
}
#endif

 

Interaction between markers(FrameMarkers)

July 23, 2012 - 12:35pm #5

Yup, I have added them to my ImageTargets sample project and run ndk-build.

 

07-24 03:34:28.662: D/dalvikvm(688): GC_CONCURRENT freed 774K, 12% free 8991K/10183K, paused 2ms+2ms

07-24 03:34:28.724: D/dalvikvm(688): GC_CONCURRENT freed 670K, 12% free 8991K/10183K, paused 1ms+5ms

07-24 03:34:28.826: D/dalvikvm(688): GC_CONCURRENT freed 770K, 12% free 8990K/10183K, paused 1ms+5ms

07-24 03:34:28.881: I/ActivityManager(215): No longer want com.google.android.partnersetup (pid 327): hidden #16

07-24 03:34:29.162: D/dalvikvm(688): GC_CONCURRENT freed 786K, 12% free 8986K/10183K, paused 1ms+2ms

07-24 03:34:29.584: D/dalvikvm(688): GC_CONCURRENT freed 784K, 12% free 8984K/10183K, paused 1ms+2ms

07-24 03:34:29.623: D/dalvikvm(688): GC_CONCURRENT freed 679K, 12% free 9028K/10183K, paused 2ms+3ms

07-24 03:34:29.951: D/dalvikvm(688): GC_CONCURRENT freed 772K, 12% free 9056K/10183K, paused 1ms+2ms

07-24 03:34:29.959: I/IMAGE_TARGETS(688): Successfully loaded and activated data set.

07-24 03:34:29.959: D/QCAR(688): LoadTrackerTask::onPostExecute: execution successful

07-24 03:34:29.990: D/dalvikvm(688): GC_EXPLICIT freed 138K, 12% free 8971K/10183K, paused 2ms+3ms

07-24 03:34:29.990: I/IMAGE_TARGETS(688): Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_startCamera

07-24 03:34:30.013: D/SecCameraCoreManager(103): SecCameraCoreManager created: pid=103, cameraId=0

07-24 03:34:30.013: I/SecCameraCoreManager(103): Opening camera 0

07-24 03:34:30.013: I/CameraHAL(103): camera_device open

07-24 03:34:30.013: I/CameraHAL(103): Sensor index 0

07-24 03:34:30.310: I/CameraHAL(103): PreviewFormat yuv420sp

07-24 03:34:30.310: I/CameraHAL(103): VSTAB false

07-24 03:34:30.310: I/CameraHAL(103): IPP mode set ldc-nsf

07-24 03:34:30.310: I/CameraHAL(103): PreviewResolution by App 640 x 480

07-24 03:34:30.310: I/CameraHAL(103): Recording Hint is set to NULL

07-24 03:34:30.310: I/CameraHAL(103): Focus mode set infinity

07-24 03:34:30.310: I/CameraHAL(103): Picture Size by App 640 x 480

07-24 03:34:30.310: I/CameraHAL(103): FRAMERATE 30

07-24 03:34:30.310: I/CameraHAL(103): FPS Range = 6000,30000

07-24 03:34:30.310: I/CameraHAL(103): DEFAULT FPS Range = 6000,30000

07-24 03:34:30.310: I/CameraHAL(103): SET FRAMERATE 30

07-24 03:34:30.310: I/CameraHAL(103): FPS Range [6, 30]

07-24 03:34:30.310: I/CameraHAL(103): GBCE Value = disable

07-24 03:34:30.310: I/CameraHAL(103): Exposure set = auto

07-24 03:34:30.310: I/CameraHAL(103): White balance set auto

07-24 03:34:30.310: I/CameraHAL(103): Contrast set 2

07-24 03:34:30.310: I/CameraHAL(103): Sharpness set 100

07-24 03:34:30.310: I/CameraHAL(103): Saturation set 100

07-24 03:34:30.310: I/CameraHAL(103): Brightness set 50

07-24 03:34:30.310: I/CameraHAL(103): Antibanding set 50hz

07-24 03:34:30.310: I/CameraHAL(103): ISO set auto

07-24 03:34:30.310: I/CameraHAL(103): Exposure compensation set 0

07-24 03:34:30.310: I/CameraHAL(103): Scene mode set auto

07-24 03:34:30.310: I/CameraHAL(103): Effect set none

07-24 03:34:30.310: I/CameraHAL(103): Jpeg quality set 95

07-24 03:34:30.310: I/CameraHAL(103): Thumbnail width set 160

07-24 03:34:30.310: I/CameraHAL(103): Thumbnail width set 120

07-24 03:34:30.310: I/CameraHAL(103): Thumbnail quality set 60

07-24 03:34:30.310: I/CameraHAL(103): EXIF Model set GT-P5100

07-24 03:34:30.310: I/CameraHAL(103): EXIF Make set Samsung

07-24 03:34:30.310: I/CameraHAL(103): Zoom set 0

07-24 03:34:30.310: I/CameraHAL(103): Auto Exposure Lock set false

07-24 03:34:30.310: I/CameraHAL(103): Auto WhiteBalance Lock set false

07-24 03:34:30.310: I/ShotCommon(103): ShotCommon created: pid=103

07-24 03:34:30.318: I/ShotCommon(103): Preview width(640), height(480)

07-24 03:34:30.318: I/ShotCommon(103): Preview color format [yuv420sp]

07-24 03:34:30.318: I/ShotCommon(103): Picture width(640), height(480)

07-24 03:34:30.318: I/ShotCommon(103): Picture color format [jpeg]

07-24 03:34:30.318: I/ShotSingle(103): ShotSingle created: pid=103

07-24 03:34:30.318: E/CameraHAL(103): (329b70)   hardware/ti/omap4xxx/camera/CameraHal.cpp:2790 sendCommand - Preview is not running

07-24 03:34:30.318: V/AwesomePlayer(103): setDefault

07-24 03:34:30.318: V/AwesomePlayer(103): constructor

07-24 03:34:30.318: V/AwesomePlayer(103): setDefault

07-24 03:34:30.318: I/AwesomePlayer(103): reset

07-24 03:34:30.318: I/AwesomePlayer(103): cancel player events

07-24 03:34:30.318: V/AwesomePlayer(103): mAudioTrackVector clear

07-24 03:34:30.318: V/AwesomePlayer(103): setListener

07-24 03:34:30.318: V/AwesomePlayer(103): AwesomePlayer running on behalf of uid 10119

07-24 03:34:30.318: V/AwesomePlayer(103): setAudioSink

07-24 03:34:30.318: I/AwesomePlayer(103): cancel player events

07-24 03:34:30.318: V/AwesomePlayer(103): mAudioTrackVector clear

07-24 03:34:30.318: V/AwesomePlayer(103): prepareAsync

07-24 03:34:30.318: V/AwesomePlayer(103): onPrepareAsyncEvent

07-24 03:34:30.326: V/AwesomePlayer(103): track of type 'audio/vorbis' does not publish bitrate

07-24 03:34:30.326: V/AwesomePlayer(103): mBitrate = -1 bits/sec

07-24 03:34:30.326: V/AwesomePlayer(103): current audio track (0)

07-24 03:34:30.334: V/AwesomePlayer(103): current audio track (0) is added to vector[0]

07-24 03:34:30.334: V/AwesomePlayer(103): setDataSource_l: Audio(1), Video(0)

07-24 03:34:30.334: V/AwesomePlayer(103): notifyListner_l() msg (5-MEDIA_SET_VIDEO_SIZE), ext1 (0), ext2 (0)

07-24 03:34:30.334: V/AwesomePlayer(103): notifyListner_l() msg (1-MEDIA_PREPARED), ext1 (0), ext2 (0)

07-24 03:34:30.334: V/AwesomePlayer(103): setDefault

07-24 03:34:30.334: V/AwesomePlayer(103): constructor

07-24 03:34:30.334: V/AwesomePlayer(103): setDefault

07-24 03:34:30.334: I/AwesomePlayer(103): reset

07-24 03:34:30.334: I/AwesomePlayer(103): cancel player events

07-24 03:34:30.334: V/AwesomePlayer(103): mAudioTrackVector clear

07-24 03:34:30.334: V/AwesomePlayer(103): setListener

07-24 03:34:30.334: V/AwesomePlayer(103): AwesomePlayer running on behalf of uid 10119

07-24 03:34:30.334: V/AwesomePlayer(103): setAudioSink

07-24 03:34:30.334: I/AwesomePlayer(103): cancel player events

07-24 03:34:30.334: V/AwesomePlayer(103): mAudioTrackVector clear

07-24 03:34:30.334: V/AwesomePlayer(103): prepareAsync

07-24 03:34:30.334: V/AwesomePlayer(103): onPrepareAsyncEvent

07-24 03:34:30.341: V/AwesomePlayer(103): track of type 'audio/vorbis' does not publish bitrate

07-24 03:34:30.341: V/AwesomePlayer(103): mBitrate = -1 bits/sec

07-24 03:34:30.341: V/AwesomePlayer(103): current audio track (0)

07-24 03:34:30.341: V/AwesomePlayer(103): current audio track (0) is added to vector[0]

07-24 03:34:30.341: V/AwesomePlayer(103): setDataSource_l: Audio(1), Video(0)

07-24 03:34:30.341: V/AwesomePlayer(103): notifyListner_l() msg (5-MEDIA_SET_VIDEO_SIZE), ext1 (0), ext2 (0)

07-24 03:34:30.341: V/AwesomePlayer(103): notifyListner_l() msg (1-MEDIA_PREPARED), ext1 (0), ext2 (0)

07-24 03:34:30.341: V/AwesomePlayer(103): setDefault

07-24 03:34:30.341: V/AwesomePlayer(103): constructor

07-24 03:34:30.341: V/AwesomePlayer(103): setDefault

07-24 03:34:30.341: I/AwesomePlayer(103): reset

07-24 03:34:30.341: I/AwesomePlayer(103): cancel player events

07-24 03:34:30.349: V/AwesomePlayer(103): mAudioTrackVector clear

07-24 03:34:30.349: V/AwesomePlayer(103): setListener

07-24 03:34:30.349: V/AwesomePlayer(103): AwesomePlayer running on behalf of uid 10119

07-24 03:34:30.349: V/AwesomePlayer(103): setAudioSink

07-24 03:34:30.349: I/AwesomePlayer(103): cancel player events

07-24 03:34:30.349: V/AwesomePlayer(103): mAudioTrackVector clear

07-24 03:34:30.349: V/AwesomePlayer(103): prepareAsync

07-24 03:34:30.349: V/AwesomePlayer(103): onPrepareAsyncEvent

07-24 03:34:30.349: V/AwesomePlayer(103): track of type 'audio/vorbis' does not publish bitrate

07-24 03:34:30.349: V/AwesomePlayer(103): mBitrate = -1 bits/sec

07-24 03:34:30.349: V/AwesomePlayer(103): current audio track (0)

07-24 03:34:30.349: V/AwesomePlayer(103): current audio track (0) is added to vector[0]

07-24 03:34:30.349: V/AwesomePlayer(103): setDataSource_l: Audio(1), Video(0)

07-24 03:34:30.349: V/AwesomePlayer(103): notifyListner_l() msg (5-MEDIA_SET_VIDEO_SIZE), ext1 (0), ext2 (0)

07-24 03:34:30.349: V/AwesomePlayer(103): notifyListner_l() msg (1-MEDIA_PREPARED), ext1 (0), ext2 (0)

07-24 03:34:30.373: I/CameraHAL(103): PreviewFormat yuv420sp

07-24 03:34:30.373: I/CameraHAL(103): VSTAB false

07-24 03:34:30.373: I/CameraHAL(103): IPP mode set ldc-nsf

07-24 03:34:30.373: I/CameraHAL(103): PreviewResolution by App 640 x 480

07-24 03:34:30.373: I/CameraHAL(103): Recording Hint is set to NULL

07-24 03:34:30.373: I/CameraHAL(103): Focus mode set infinity

07-24 03:34:30.373: I/CameraHAL(103): Picture Size by App 640 x 480

07-24 03:34:30.373: I/CameraHAL(103): FRAMERATE 30

07-24 03:34:30.373: I/CameraHAL(103): FPS Range = 6000,30000

07-24 03:34:30.373: I/CameraHAL(103): DEFAULT FPS Range = 6000,30000

07-24 03:34:30.373: I/CameraHAL(103): SET FRAMERATE 30

07-24 03:34:30.373: I/CameraHAL(103): FPS Range [6, 30]

07-24 03:34:30.373: I/CameraHAL(103): GBCE Value = disable

07-24 03:34:30.373: I/CameraHAL(103): Exposure set = auto

07-24 03:34:30.373: I/CameraHAL(103): White balance set auto

07-24 03:34:30.373: I/CameraHAL(103): Contrast set 2

07-24 03:34:30.373: I/CameraHAL(103): Sharpness set 100

07-24 03:34:30.373: I/CameraHAL(103): Saturation set 100

07-24 03:34:30.373: I/CameraHAL(103): Brightness set 50

07-24 03:34:30.373: I/CameraHAL(103): Antibanding set 50hz

07-24 03:34:30.373: I/CameraHAL(103): ISO set auto

07-24 03:34:30.373: I/CameraHAL(103): Exposure compensation set 0

07-24 03:34:30.373: I/CameraHAL(103): Scene mode set auto

07-24 03:34:30.373: I/CameraHAL(103): Effect set none

07-24 03:34:30.373: I/CameraHAL(103): Jpeg quality set 95

07-24 03:34:30.373: I/CameraHAL(103): Thumbnail width set 160

07-24 03:34:30.373: I/CameraHAL(103): Thumbnail width set 120

07-24 03:34:30.373: I/CameraHAL(103): Thumbnail quality set 60

07-24 03:34:30.373: I/CameraHAL(103): EXIF Model set GT-P5100

07-24 03:34:30.373: I/CameraHAL(103): EXIF Make set Samsung

07-24 03:34:30.373: I/CameraHAL(103): Zoom set 0

07-24 03:34:30.373: I/CameraHAL(103): Auto Exposure Lock set false

07-24 03:34:30.373: I/CameraHAL(103): Auto WhiteBalance Lock set false

07-24 03:34:30.396: I/IMAGE_TARGETS(688): Configure Video Background : Video (640,480), Screen (1280,752), mSize (1280,960)

07-24 03:34:30.412: I/CameraHAL(103): PreviewFormat yuv420sp

07-24 03:34:30.412: I/CameraHAL(103): VSTAB false

07-24 03:34:30.412: I/CameraHAL(103): IPP mode set ldc-nsf

07-24 03:34:30.412: I/CameraHAL(103): PreviewResolution by App 640 x 480

07-24 03:34:30.412: I/CameraHAL(103): Recording Hint is set to NULL

07-24 03:34:30.412: I/CameraHAL(103): Focus mode set infinity

07-24 03:34:30.412: I/CameraHAL(103): Picture Size by App 640 x 480

07-24 03:34:30.412: I/CameraHAL(103): FRAMERATE 30

07-24 03:34:30.412: I/CameraHAL(103): FPS Range = 6000,30000

07-24 03:34:30.412: I/CameraHAL(103): DEFAULT FPS Range = 6000,30000

07-24 03:34:30.412: I/CameraHAL(103): SET FRAMERATE 30

07-24 03:34:30.412: I/CameraHAL(103): FPS Range [6, 30]

07-24 03:34:30.412: I/CameraHAL(103): GBCE Value = disable

07-24 03:34:30.412: I/CameraHAL(103): Exposure set = auto

07-24 03:34:30.412: I/CameraHAL(103): White balance set auto

07-24 03:34:30.412: I/CameraHAL(103): Contrast set 2

07-24 03:34:30.412: I/CameraHAL(103): Sharpness set 100

07-24 03:34:30.412: I/CameraHAL(103): Saturation set 100

07-24 03:34:30.412: I/CameraHAL(103): Brightness set 50

07-24 03:34:30.412: I/CameraHAL(103): Antibanding set 50hz

07-24 03:34:30.412: I/CameraHAL(103): ISO set auto

07-24 03:34:30.412: I/CameraHAL(103): Exposure compensation set 0

07-24 03:34:30.412: I/CameraHAL(103): Scene mode set auto

07-24 03:34:30.412: I/CameraHAL(103): Effect set none

07-24 03:34:30.412: I/CameraHAL(103): Jpeg quality set 95

07-24 03:34:30.412: I/CameraHAL(103): Thumbnail width set 160

07-24 03:34:30.412: I/CameraHAL(103): Thumbnail width set 120

07-24 03:34:30.412: I/CameraHAL(103): Thumbnail quality set 60

07-24 03:34:30.412: I/CameraHAL(103): EXIF Model set GT-P5100

07-24 03:34:30.412: I/CameraHAL(103): EXIF Make set Samsung

07-24 03:34:30.412: I/CameraHAL(103): Zoom set 0

07-24 03:34:30.412: I/CameraHAL(103): Auto Exposure Lock set false

07-24 03:34:30.412: I/CameraHAL(103): Auto WhiteBalance Lock set false

07-24 03:34:30.474: D/dalvikvm(688): GC_CONCURRENT freed 279K, 8% free 9446K/10183K, paused 2ms+3ms

07-24 03:34:30.490: D/dalvikvm(688): GC_FOR_ALLOC freed <1K, 8% free 9446K/10183K, paused 13ms

07-24 03:34:30.490: I/dalvikvm-heap(688): Grow heap (frag case) to 9.701MB for 464912-byte allocation

07-24 03:34:30.506: D/dalvikvm(688): GC_FOR_ALLOC freed <1K, 8% free 9900K/10695K, paused 17ms

07-24 03:34:30.560: D/dalvikvm(688): GC_EXPLICIT freed <1K, 8% free 9900K/10695K, paused 5ms+2ms

07-24 03:34:30.560: D/SecCameraCoreManager(103): virtual android::status_t android::SecCameraCoreManager::startPreview():start IT Policy checking thread

07-24 03:34:31.326: I/AR(688): Triggering autofocus not supported

07-24 03:34:31.326: I/AR(688): Setting the torch mode not supported

07-24 03:34:31.326: I/IMAGE_TARGETS(688): Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargets_setProjectionMatrix

07-24 03:34:31.341: D/KeyguardViewMediator(215): setHidden false

07-24 03:34:31.365: W/SurfaceTexture(100): [com.qualcomm.QCARSamples.ImageTargets/com.qualcomm.QCARSamples.ImageTargets.ImageTargets] cancelBuffer: SurfaceTexture has been abandoned!

07-24 03:34:31.451: D/KeyguardViewMediator(215): setHidden false

07-24 03:34:31.459: D/KeyguardViewMediator(215): setHidden false

07-24 03:34:31.459: I/QCAR(688): Creating OpenGL ES 2.0 context

07-24 03:34:31.552: D/QCAR(688): GLRenderer::onSurfaceCreated

07-24 03:34:31.552: I/IMAGE_TARGETS(688): Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargetsRenderer_initRendering

07-24 03:34:31.552: D/KeyguardViewMediator(215): setHidden false

07-24 03:34:31.560: D/KeyguardViewMediator(215): setHidden false

07-24 03:34:31.560: D/QCAR(688): GLRenderer::onSurfaceChanged

07-24 03:34:31.560: I/IMAGE_TARGETS(688): Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargetsRenderer_updateRendering

07-24 03:34:31.560: I/IMAGE_TARGETS(688): Configure Video Background : Video (640,480), Screen (1280,752), mSize (1280,960)

07-24 03:34:31.685: I/CameraHAL(103): Camera 0.000000 FPS

07-24 03:34:31.685: D/(103): PPM: Standby to first shot: Sensor Change completed -  :1125.702 ms :  1343100871696 ms

07-24 03:34:34.716: I/CameraHAL(103): Camera 4.949500 FPS

07-24 03:34:37.748: I/CameraHAL(103): Camera 6.599300 FPS

 

Interaction between markers(FrameMarkers)

July 23, 2012 - 12:27pm #4

Did you add that to ImageTargets? What are you seeing in your log?

Interaction between markers(FrameMarkers)

July 23, 2012 - 12:22pm #3

Hi David,

Thanks again for the help.

I have followed the steps, and the .cpp file also compiled successfully.

But when I run the app, the screen turns black after the splash screen.

Am I missing anything?

Interaction between markers(FrameMarkers)

July 23, 2012 - 11:54am #2

Here are instructions for drawing a line between two targets. They show how to obtain an offset matrix, which can be used to accomplish what you're trying to do.

The pose matrix defines a local coordinate system for each trackable, with the center of the trackable as the origin (0, 0, 0). This makes it easy to draw objects on top of each trackable independently, but not so easy to draw content that spans multiple trackables (taking their relative orientation into account).

The solution is to pick one trackable to act as the world center, and bring all other trackables into its coordinate system. We do that by multiplying the inverse pose of our world center target (A) by the pose of our other targets (B). This creates an offset matrix that can be used to bring points on B into A's coordinate system. Now, we can bind a single modelview matrix tied to our world center and render all the points from the world center's point of view. That should let you draw lines between targets.

Below is a replacement renderFrame method for the ImageTargets sample. It will render a line connecting the center of the chips target to the center of the stones target. You will need to do a few setup steps first:

 

  1. Copy SampleMath.cpp and SampleMath.h from the Dominoes/jni folder to the ImageTargets/jni folder.
  2. Add SampleMath.cpp to the LOCAL_SRC_FILES flag in the ImageTargets Android.mk file.
  3. Uncomment the following lines in ImageTargets.cpp: 


QCAR::setHint(QCAR::HINT_MAX_SIMULTANEOUS_IMAGE_TARGETS, 2);

QCAR::setHint(QCAR::HINT_IMAGE_TARGET_MULTI_FRAME_ENABLED, 1);

Replace the renderFrame method with the following:

 

#include "SampleMath.h"

JNIEXPORT void JNICALL
Java_com_qualcomm_QCARSamples_ImageTargets_ImageTargetsRenderer_renderFrame(JNIEnv *, jobject)
{
    // Clear color and depth buffer
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    // Render video background:
    QCAR::State state = QCAR::Renderer::getInstance().begin();

#ifdef USE_OPENGL_ES_1_1
    // Set GL11 flags:
    glEnableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_NORMAL_ARRAY);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glDisable(GL_LIGHTING);
#endif

    glDisable(GL_TEXTURE_2D);
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_CULL_FACE);

    QCAR::Matrix44F mainModelViewMatrix;
    QCAR::Vec3F targetCenters[2]; // make this big enough to hold all your targets

    // Did we find any trackables this frame?
    for (int tIdx = 0; tIdx < state.getNumActiveTrackables(); tIdx++)
    {
        // Get the trackable:
        const QCAR::Trackable* trackable = state.getActiveTrackable(tIdx);
        QCAR::Matrix44F modelViewMatrix =
            QCAR::Tool::convertPose2GLMatrix(trackable->getPose());

        if (tIdx == 0)
        {
            // Make the first visible target our world center (0, 0, 0).
            // Store its modelViewMatrix and continue looking for other targets.
            mainModelViewMatrix = modelViewMatrix;
            targetCenters[0].data[0] = 0.0f;
            targetCenters[0].data[1] = 0.0f;
            targetCenters[0].data[2] = 0.0f;
        }
        else
        {
            // This is another visible target.
            // Find its center point in relation to the first target.
            // To do this we use the matrix inverse function (SampleMath.h from the Dominoes project).
            QCAR::Matrix44F mainModelViewInverse = SampleMath::Matrix44FInverse(mainModelViewMatrix);
            QCAR::Matrix44F modelViewTranspose = SampleMath::Matrix44FTranspose(modelViewMatrix); // let's work with row-major matrices
            QCAR::Matrix44F offsetMatrix = QCAR::Tool::multiply(mainModelViewInverse, modelViewTranspose);

            // Transform a point on the second target by this offset matrix.
            // (0, 0, 0) is the local center of the target.
            QCAR::Vec4F position(0.0f, 0.0f, 0.0f, 1.0f);
            position = SampleMath::Vec4FTransform(position, offsetMatrix);

            // Add this position to our array
            targetCenters[1].data[0] = position.data[0];
            targetCenters[1].data[1] = position.data[1];
            targetCenters[1].data[2] = position.data[2];
        }
    }

    if (state.getNumActiveTrackables() > 1)
    {
#ifdef USE_OPENGL_ES_1_1
        // Load projection matrix:
        glMatrixMode(GL_PROJECTION);
        glLoadMatrixf(projectionMatrix.data);

        // Load model view matrix:
        glMatrixMode(GL_MODELVIEW);
        glLoadMatrixf(mainModelViewMatrix.data);

        // Set the color to red:
        glColor4f(1.0f, 0.0f, 0.0f, 1.0f);

        // Draw object:
        glVertexPointer(3, GL_FLOAT, 0, (const GLvoid*) &targetCenters[0].data[0]);
        glDrawArrays(GL_LINES, 0, 2);
#else
        QCAR::Matrix44F modelViewProjection;

        SampleUtils::multiplyMatrix(&projectionMatrix.data[0],
                                    &mainModelViewMatrix.data[0],
                                    &modelViewProjection.data[0]);

        glUseProgram(shaderProgramID);

        glVertexAttribPointer(vertexHandle, 3, GL_FLOAT, GL_FALSE, 0,
                              (const GLvoid*) &targetCenters[0].data[0]);

        glEnableVertexAttribArray(vertexHandle);

        glUniformMatrix4fv(mvpMatrixHandle, 1, GL_FALSE,
                           (GLfloat*) &modelViewProjection.data[0]);
        glDrawArrays(GL_LINES, 0, 2);
#endif
    }

    glDisable(GL_DEPTH_TEST);

#ifdef USE_OPENGL_ES_1_1
    glDisable(GL_TEXTURE_2D);
    glDisableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_NORMAL_ARRAY);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
#else
    glEnable(GL_TEXTURE_2D);
    glDisableVertexAttribArray(vertexHandle);
    glDisableVertexAttribArray(normalHandle);
    glDisableVertexAttribArray(textureCoordHandle);
#endif

    QCAR::Renderer::getInstance().end();
}
