I already searched the forums and found some things that sounded relevant, but I'm not sure whether they apply to this particular problem.
So to get a clear answer, I started this topic.
For example, Ksiva says in another topic:
No, this is a hardware issue, not something we can control.
Ah, in Unity it's a little less straightforward. You would have to edit the TrackerBehaviour (1.0.6) or QCARBehaviour (1.5) script, located in the Qualcomm Augmented Reality/Scripts folder. See the ConfigureVideoBackground method in that script.
Here's the problem: we're experimenting with the new iPad, and we noticed the poor quality of the camera feed when running a Unity3D app.
The camera quality is very crisp if you use the built-in camera app, but when you start a Unity app the quality is very poor.
In the attachment you can see the difference between a screenshot and a photo I've taken with the built-in camera app.
The difference is obvious when you look at, for example, the phone, the buttons on the phone, the cables, etc.
I already tried the MODE_OPTIMIZE_QUALITY setting on the ARCamera. It didn't make a difference; in fact, the screenshot was taken with this setting applied.
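In case it matters, this is roughly how I applied the setting from a script (a minimal sketch based on my understanding of the CameraDevice API in the Vuforia Unity extension; method names may differ between SDK versions, so treat it as an assumption, not a confirmed recipe):

```csharp
using UnityEngine;

// Hypothetical sketch: restart the camera with the quality-optimized mode,
// assuming the Vuforia (QCAR) Unity extension's CameraDevice singleton.
public class ForceQualityMode : MonoBehaviour
{
    void Start()
    {
        // Stop and deinitialize the camera in case QCARBehaviour
        // already started it with a different mode.
        CameraDevice.Instance.Stop();
        CameraDevice.Instance.Deinit();

        // Reinitialize, then select the mode before starting the camera.
        CameraDevice.Instance.Init();
        CameraDevice.Instance.SelectVideoMode(
            CameraDevice.CameraDeviceMode.MODE_OPTIMIZE_QUALITY);
        CameraDevice.Instance.Start();
    }
}
```

Attaching this to the ARCamera gave the same result as setting the mode in the inspector.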
The hardware and software we use:
- Mac OS 10.7.3
- Vuforia SDK 1.5.9
- Unity 3.5.0f5
- Xcode 4.3.2