"We offer new support options and therefor the forums are now in read-only mode! Please check out our Support Center for more information." - Vuforia Engine Team

Predictive tracking and AR

The articles "Changes in Vuforia 5.5" and "Using the Rotational Device Tracker" seem to imply that AR applications do not support prediction.

Is this statement correct? Does this mean that trackable results always provide poses corresponding to the camera frame used for tracking?

Is this also true for optical see-through devices like ODG glasses?
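For context, here is roughly how we read those poses today; this is only a minimal sketch against the native C++ API as it appears in the 5.x samples, with initialization and the rest of the render loop omitted:

#include <Vuforia/Renderer.h>
#include <Vuforia/State.h>
#include <Vuforia/TrackableResult.h>
#include <Vuforia/Tool.h>

// Called once per render frame, after Vuforia has been initialized and the
// camera has been started elsewhere in the app.
void renderFrame()
{
    // begin() hands back the State holding the camera frame and the
    // trackable results computed from that frame.
    const Vuforia::State state = Vuforia::Renderer::getInstance().begin();

    for (int i = 0; i < state.getNumTrackableResults(); ++i)
    {
        const Vuforia::TrackableResult* result = state.getTrackableResult(i);

        // Our question: is this pose always the pose at the time of the
        // camera frame held in 'state', or can it be predicted forward to
        // the expected display time?
        const Vuforia::Matrix34F& pose = result->getPose();
        Vuforia::Matrix44F modelView = Vuforia::Tool::convertPose2GLMatrix(pose);

        // ... draw the augmentation with modelView ...
        (void)modelView;
    }

    Vuforia::Renderer::getInstance().end();
}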

Secondly, how do you compute the prediction interval, and is it possible to force a different interval on the fly if we have a deeper or variable pipeline?

For example, in the Oculus SDK prediction is a two-step process: you first ask for the expected display time of the current frame, then you ask for the predicted pose at that expected time. You can also feed your own time to the predictor, roughly as in the sketch below.
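A rough illustration of that two-step flow, assuming the LibOVR 1.x C API (an already created ovrSession; error handling and frame submission omitted):

#include <OVR_CAPI.h>

// Oculus-style two-step prediction: query the expected display time for a
// frame, then query the tracking state predicted for that time.
ovrPosef getPredictedHeadPose(ovrSession session, long long frameIndex)
{
    // Step 1: when is the frame we are about to render expected to reach
    // the display?
    double displayTime = ovr_GetPredictedDisplayTime(session, frameIndex);

    // Step 2: ask for the tracking state predicted for that time.
    // Passing a different absolute time here is how you would "feed your
    // own time to the predictor" for a deeper or variable pipeline.
    ovrTrackingState ts = ovr_GetTrackingState(session, displayTime, ovrTrue);

    return ts.HeadPose.ThePose;
}

Something equivalent in Vuforia, i.e. being able to supply our own target time (or at least override the prediction interval), is what we are asking about.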

Thanks