Motion sensor fusion

December 27, 2011 - 9:41pm #1

Hi, this is probably highly non-trivial to implement, but there is some work going on to improve target tracking by taking the device's motion sensors into account. See the example videos at http://www.cs.cmu.edu/~myung/IMU_KLT and http://youtube.com/watch?v=afA3dQNEa68 (they aren't doing AR, just feature tracking supported by an IMU). Since the iPhone has these sensors... any thoughts?
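
The basic idea in those videos, as I understand it, is to use the gyro to predict where each feature will land in the next frame so the KLT search windows can stay small. A minimal sketch of that prediction step (my own illustration, not their code; the pinhole model and the axis/sign conventions are assumptions):

```cpp
// Sketch: predict a feature's next-frame position from gyro rates alone,
// using the standard rotational optical-flow model for a pinhole camera.
// Translation is ignored, which is the usual first-order IMU-only prediction.
struct Feature { float u, v; };  // pixel coords relative to the principal point

Feature predict(const Feature& p,
                float wx, float wy, float wz,  // gyro rates [rad/s]
                float f,                       // focal length in pixels
                float dt)                      // time to next frame [s]
{
    // Image velocity induced by pure camera rotation (signs depend on the
    // chosen camera frame, so treat these as illustrative):
    float du = (p.u * p.v / f) * wx - (f + p.u * p.u / f) * wy + p.v * wz;
    float dv = (f + p.v * p.v / f) * wx - (p.u * p.v / f) * wy - p.u * wz;
    return { p.u + du * dt, p.v + dv * dt };
}
```

The predicted position then seeds the KLT search, so the tracker only has to look in a small window around it.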

Thanks for the good work and the helpful support!

Re: Motion sensor fusion

December 29, 2011 - 2:18pm #3

Could you post this request to http://ar.qualcomm.at/node/2001256? Thanks!

- Kim

Re: Motion sensor fusion

December 29, 2011 - 12:34am #2

Yes!!
This would be a huge enhancement to overall tracking stability. We just need to be sure the developer stays in charge of supplying the gyroscope and accelerometer data directly; otherwise the library would muck with the developer's own access to those sensors.
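
For example, an injection-style API (purely hypothetical, nothing like this exists in the SDK today) would keep the app as the single owner of the sensors:

```cpp
// Hypothetical injection-style API (not the QCAR/Vuforia SDK): the app owns
// the motion sensors and pushes timestamped samples into the tracker, so the
// library never has to open the sensors itself.
struct ImuSample {
    double timestamp;   // seconds, on the app's sensor clock
    float  gyro[3];     // rad/s
    float  accel[3];    // m/s^2
};

class Tracker {
public:
    // Called by the app from its own sensor callback
    // (e.g. a CMMotionManager handler on iOS).
    void feedImuSample(const ImuSample& s) { latest_ = s; }
private:
    ImuSample latest_{};
    // ... existing vision-tracking state ...
};
```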

There are two cases:
1) The target object is being moved, so we need to supply remote tracking data, possibly with the target object itself being an iPod touch, iPhone, or iPad.
2) The device itself is being moved, so the developer needs to supply motion data from the local accelerometer and gyros.

Combining the two could enable some incredible dual-device AR interactivity.
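
As a rough sketch of case 1 (the packet layout and names are my own assumptions, not anything from the SDK), the remote device could stream timestamped IMU samples that the tracking device buffers for time-alignment with its camera frames:

```cpp
// Rough sketch for case 1 (everything here is assumed, not an SDK feature):
// the remote target device streams timestamped IMU packets over the network,
// and the tracking device keeps a short history for time-alignment.
#include <cstdint>
#include <deque>

#pragma pack(push, 1)
struct RemoteImuPacket {
    std::uint64_t timestampUs;  // remote sensor clock, microseconds
    float         gyro[3];      // rad/s
    float         accel[3];     // m/s^2
};
#pragma pack(pop)

// On the tracking device: a camera frame at time t can look up the remote
// motion around t (after correcting for the clock offset between devices).
std::deque<RemoteImuPacket> remoteHistory;

void onPacketReceived(const RemoteImuPacket& p)
{
    remoteHistory.push_back(p);
    while (remoteHistory.size() > 512)   // bounded buffer; drop the oldest
        remoteHistory.pop_front();
}
```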

Latency should be taken into account, so the motion data can be forward-projected in time and integrated with the camera vision stream. Lazy evaluation of the sample data could be a good setup for handling drop-outs and variability in the motion data. Forward-projecting the motion data helps with feature-tracking localization (more efficient, smaller bounding areas). If you wanted to get really fancy, you could auto-correlate the latency difference between the motion data and the vision feature tracking.
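
For that auto-correlation idea, here's a minimal sketch (the inputs are assumptions: both signals already resampled to the camera frame rate, with `vision` holding per-frame feature-motion magnitudes and `imu` holding gyro-rate magnitudes): cross-correlate the two and take the lag with the best score.

```cpp
// Minimal sketch of latency estimation by cross-correlation. Returns the lag
// (in frames) at which the IMU signal best predicts the vision signal;
// multiply by the frame period to get the latency in seconds.
#include <vector>
#include <cstddef>

int estimateLagFrames(const std::vector<float>& vision,
                      const std::vector<float>& imu,
                      int maxLag)
{
    int   bestLag   = 0;
    float bestScore = -1e30f;
    for (int lag = 0; lag <= maxLag; ++lag) {
        float score = 0.0f;
        int   n     = 0;
        for (std::size_t i = lag; i < vision.size() && i < imu.size(); ++i) {
            score += vision[i] * imu[i - lag];  // shift IMU forward by 'lag'
            ++n;
        }
        if (n > 0) score /= n;                  // normalize for overlap length
        if (score > bestScore) { bestScore = score; bestLag = lag; }
    }
    return bestLag;
}
```

Once estimated, the offset can be applied by shifting the IMU stream before fusing it with the vision stream, and re-estimated occasionally since the latency may drift.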

Cheers!
- Aaron
Conquer Mobile
