I doubt this is specific to Unity, but I'm using the Unity extension. I'm building an app that needs the physical distance between the camera and a given target. I've found that, firstly, the distance returned differs from the actual distance (after scaling for the target's physical size), and secondly, the distance returned differs across devices. I'm being careful to place the camera lens in the same physical location when measuring with different devices, and directly in "front" of the target - the most trivial of setups. This person also seemed to have the same issue (with no obvious resolution) using Android devices.
It's the discrepancy across devices that most disturbs me, as it looks like I'll have to introduce a calibration step into the app - is this really the case? Surely this isn't the first time this has been raised?
My square target is 15 cm wide and the camera is ~1 m away:
- iPhone 6 measures ~1.20 m
- iPad Air 2 measures ~1.16 m
- iPad 2 measures ~0.995 m
The significant error (on the Retina devices) seems to be "roughly constant" over distance as a proportion, i.e. when measuring from 50 cm away, the computed distance is ~60 cm on the iPhone 6 - about the same relative error. So I figured I could introduce a scaling factor, which would be an ugly fix, but then I measure with other devices and get a different error. Interestingly, when I measure the distance between two markers that are roughly the same distance from the camera, the computed distance between them is physically fairly accurate (although I haven't tested this exhaustively).
The distance is computed as follows:

```csharp
Vector3 vToCamera = mImageTarget.transform.position - mCamera.transform.position;
float distance = vToCamera.magnitude;
```
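For reference, the kind of per-device calibration step I'm imagining (and would rather avoid) would look roughly like the sketch below. The method names and the hard-coded reference distance are just placeholders of mine, not anything from the Vuforia API; `mImageTarget` and `mCamera` are the same fields as above.

```csharp
// Sketch of a per-device calibration: stand the device at a known physical
// distance from the target, record the reported distance once, and scale
// all subsequent readings by the resulting factor.

private float mScaleFactor = 1.0f;

// Call once with the device at a known distance from the target (e.g. 1.0 m).
void CalibrateScale(float knownDistanceMeters)
{
    Vector3 vToCamera = mImageTarget.transform.position - mCamera.transform.position;
    mScaleFactor = knownDistanceMeters / vToCamera.magnitude;
}

// Corrected distance for later measurements on this device.
float MeasuredDistance()
{
    Vector3 vToCamera = mImageTarget.transform.position - mCamera.transform.position;
    return vToCamera.magnitude * mScaleFactor;
}
```

Since the error looks multiplicative rather than additive (20% at both 1 m and 50 cm on the iPhone 6), a single factor per device would seem to work - but needing one at all is what I'm hoping to avoid.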
Hopefully I'm just missing something simple? I didn't think I'd need to be concerned about large inconsistencies across devices.
Any help would be greatly appreciated.