
Relative camera position

April 10, 2012 - 4:41am #1

Is it possible to get the camera position (x, y, z) in the target's coordinate system?

Relative camera position

August 14, 2012 - 12:11am #6

Thanks for your answer, David. And yes, that is exactly what I'm trying to achieve (sorry if I didn't make myself clear, I'm French). I'm using the latest SDK on Android (1.5.9).

Relative camera position

August 13, 2012 - 8:12pm #5

To confirm, the effect that you're trying to achieve is that the light source seems to follow the position of the ARCamera?

Also, which of the SDKs are you using?

Relative camera position

August 12, 2012 - 10:23am #4

Hello.

I know this might be outside the scope of the Vuforia forum, but maybe I'm missing something very simple and somebody can answer. Once we have the inverted matrix (obtained with the Matrix44FInverse static method?), is it possible to pass it to a shader uniform? My goal is simple per-fragment lighting (done, thanks to the Internet) with a vec3 light-position uniform mapped to the virtual camera position (or alternatively a mat4 uniform holding the inverted matrix, doing the lighting calculations with it in the shader). For now, I'm using this:

#define STRINGIFY(x) #x

static const char volumeVertShader[] = STRINGIFY(
        uniform mat4 u_MVPMatrix;
        uniform mat4 u_MVMatrix;

        attribute vec4 a_Vertex;
        attribute vec3 a_Normal;

        varying vec3 v_Position;
        varying vec4 v_Color;
        varying vec3 v_Normal;

        // The entry point for our vertex shader.
        void main()
        {
            v_Position = vec3(u_MVMatrix * a_Vertex);
            v_Color = vec4(1.0, 1.0, 1.0, 1.0);
            v_Normal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));

            gl_Position = u_MVPMatrix * a_Vertex;
        }
);

static const char volumeFragShader[] = STRINGIFY(
        // OpenGL ES 2.0 fragment shaders need an explicit default float precision.
        precision mediump float;

        varying vec3 v_Position;
        varying vec4 v_Color;
        varying vec3 v_Normal;

        // The entry point for our fragment shader.
        void main()
        {
            // Static light position in eye space, for testing only.
            vec3 lightPos = vec3(0.0, 0.0, 220.0);
            // Distance to the light, used for attenuation.
            float distance = length(lightPos - v_Position);
            // Lighting direction vector.
            vec3 lightVector = normalize(lightPos - v_Position);
            // Re-normalize the interpolated normal, then take the dot product (diffuse term).
            float diffuse = max(dot(normalize(v_Normal), lightVector), 0.1);
            // Apply attenuation.
            diffuse = diffuse * (1.5 / (1.0 + (0.25 * distance * distance)));
            // Final output color.
            gl_FragColor = v_Color * diffuse;
        }
);

Both the modelView and modelViewProjection matrices (QCAR::Matrix44F modelViewMatrix and QCAR::Matrix44F modelViewProjection) are passed to the shader from the FrameMarkers sample. As you can see, for now the fragment shader uses a constant light position. The result more or less works when the camera faces the front of the frame marker at a certain distance, but I'm clearly missing some matrix calculations: the light position follows the model centre, so when I move the camera forward the light gets pushed backward... Anyway, as I said, I don't actually want a static light position, that was for testing only. I'm a complete newbie to OpenGL and I'm seriously struggling with the shading language and with matrices.
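
For reference, this is roughly how the matrices reach the shaders in my render loop (adapted from the FrameMarkers renderFrame code; the handle names mvMatrixHandle / mvpMatrixHandle are mine, obtained with glGetUniformLocation on u_MVMatrix / u_MVPMatrix):

// Build the modelview matrix from the marker pose, then the MVP matrix
// using the SampleUtils helper from the samples.
QCAR::Matrix44F modelViewMatrix =
    QCAR::Tool::convertPose2GLMatrix(trackable->getPose());

QCAR::Matrix44F modelViewProjection;
SampleUtils::multiplyMatrix(&projectionMatrix.data[0],
                            &modelViewMatrix.data[0],
                            &modelViewProjection.data[0]);

// Upload both matrices to the shader program.
glUseProgram(shaderProgramID);
glUniformMatrix4fv(mvMatrixHandle, 1, GL_FALSE,
                   (GLfloat*) &modelViewMatrix.data[0]);
glUniformMatrix4fv(mvpMatrixHandle, 1, GL_FALSE,
                   (GLfloat*) &modelViewProjection.data[0]);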

Thanks for your answer.

Moderators, feel free to delete this post, but please point me to a resource on the web; I can't even formulate a Google query that would answer my issue.

Re: Relative camera position

April 11, 2012 - 7:14am #3

Thank you. It's what I was looking for.

Re: Relative camera position

April 10, 2012 - 3:29pm #2

If you take the inverse of the pose matrix you should get the position/orientation of the camera from the target's point of view.
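
Something along these lines should do it (untested sketch, using the SampleMath helpers that ship with the native samples; if I remember correctly the Dominoes sample computes the camera position the same way):

// Modelview matrix from the target pose (target space -> camera space).
QCAR::Matrix44F modelViewMatrix =
    QCAR::Tool::convertPose2GLMatrix(trackable->getPose());

// Invert it to go the other way: the camera expressed in the target's
// coordinate system. Depending on the storage convention of your math
// helpers you may or may not need the extra transpose.
QCAR::Matrix44F inverseModelView =
    SampleMath::Matrix44FTranspose(SampleMath::Matrix44FInverse(modelViewMatrix));

// The translation part is the camera position in target coordinates.
float camX = inverseModelView.data[12];
float camY = inverseModelView.data[13];
float camZ = inverseModelView.data[14];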

- Kim
