I hope someone here can help me with this problem, because nobody else seems to have an idea.
I'm working on a new Galaxy S3 and have run into a problem that didn't occur on other phones.
The problem is strange. When I draw my OpenGL models on top of my camera image like this:
mGLView = new GLSurfaceView(this);
GameRenderer renderer = new GameRenderer(kamera, this);
InputManager.getInstance().currentActivity = renderer;
mGLView.setEGLConfigChooser(8, 8, 8, 8, 16, 0); // RGBA8888 so the GL surface has an alpha channel
mGLView.setRenderer(renderer);
mGLView.getHolder().setFormat(PixelFormat.TRANSLUCENT); // translucent surface over the camera view
setContentView(new CameraView(this), new LayoutParams(LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT));
addContentView(mGLView, new LayoutParams(LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT));
So the GLSurfaceView should be transparent, and it actually is: it works fine for all opaque objects as well as for the background, where I see the camera image.
My problem: when I make pixels semi-transparent in the fragment shader, I get a weird result. The brightest pixels seem to overexpose when blending, as if the blended value went brighter than white and was then wrapped instead of clamped, probably losing the top bit, so those areas turn dark again.
So only the brightest pixels are affected.
I use a simple call like:
gl_FragColor = vec4(clamp(textureColor.rgb * lightWeighting.xyz, 0.0, 1.0), 0.5);
So the fragment color itself is already clamped to [0.0, 1.0]; I'm fairly sure the extra brightness comes from blending with the camera preview underneath.
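To illustrate what I think is happening (this is just my guess, with made-up example values, not my actual render code): if the blending is effectively additive, a bright fragment over a bright camera pixel sums past 1.0, and if the 8-bit store then wraps instead of clamping, the result suddenly goes dark. Standard "over" blending can never exceed 1.0 for in-range inputs:

```java
public class BlendOverflowDemo {
    // Standard "over" blending: src*a + dst*(1-a). Stays in [0,1] for in-range inputs.
    static float blendOver(float src, float dst, float a) {
        return src * a + dst * (1.0f - a);
    }

    // Additive blending (like GL_SRC_ALPHA, GL_ONE): can exceed 1.0 for bright pixels.
    static float blendAdditive(float src, float dst, float a) {
        return src * a + dst;
    }

    // 8-bit store that wraps instead of clamping -- the "lost top bit" I suspect.
    static int toByteWrapped(float c) {
        return ((int) (c * 255.0f)) & 0xFF;
    }

    // 8-bit store that clamps to white, which is what should happen.
    static int toByteClamped(float c) {
        return (int) (Math.min(1.0f, Math.max(0.0f, c)) * 255.0f);
    }

    public static void main(String[] args) {
        float src = 0.9f, dst = 0.9f, a = 0.5f; // bright fragment over bright camera pixel

        float over = blendOver(src, dst, a);     // 0.9  -> stays bright, no artifact
        float add  = blendAdditive(src, dst, a); // 1.35 -> out of range

        System.out.println("over:     " + over + " -> byte " + toByteClamped(over));
        System.out.println("additive: " + add + " -> clamped " + toByteClamped(add)
                + ", wrapped " + toByteWrapped(add));
        // The wrapped value (88) is much darker than the clamped one (255),
        // which matches the bright areas suddenly looking dark.
    }
}
```

So if the blend function were switched from an additive one to `glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)`, the sum could never exceed 1.0 in the first place. But maybe I'm wrong about the cause.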
Here is an image that describes my problem:
I hope someone here has seen this problem before. Is there any way to clamp the blended result of both values, since in the shader I can only clamp the fragment I render myself? :-(
By the way: it works perfectly fine on my old Galaxy S1 and on the S2 as well.
Thank you for your help,