Particles - Local vs World Space

February 24, 2016 - 7:34pm #1

I'm creating an AR experience where a cylinder target emits particles. 

I'm using the Cylinder Target sample project inside Unity. My ARCamera's "World Center Mode" is set to "Camera". I've parented a particle system to the cylinder target game object. The particle system is set to world simulation space, so the emitted particles are independent of its parent's transformations. With a static camera, this lets me move the cylinder target without affecting the translation/rotation/scale of the active particles.
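For reference, the same setup can be configured from a script. This is a minimal sketch; in recent Unity versions the simulation space is set through `ParticleSystem.MainModule`, and the class/field names here are just illustrative:

```csharp
using UnityEngine;

// Attach to the cylinder target: parents a particle system to it and
// switches the simulation to world space, so emitted particles keep
// their own world transforms instead of following the target.
public class WorldSpaceParticles : MonoBehaviour
{
    public ParticleSystem particles; // assigned in the inspector

    void Start()
    {
        // Keep the emitter on the target, but simulate in world space.
        particles.transform.SetParent(transform, false);

        var main = particles.main;
        main.simulationSpace = ParticleSystemSimulationSpace.World;
    }
}
```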

The issue arises when I move my physical camera (an iPhone) in 3D space. The particles seem to be attached to the camera: because they are simulated in world space, they move along with the iPhone as I move it around. It appears as if the emitted particles are glued to the screen.

The goal is to accomplish two things:

- Have the particles move independently of the cylinder target upon being emitted. 

- Have the emitted particles maintain some relation to the scene so they don't appear to float when the physical camera is moved.

Any help would be much appreciated! 

 

Particles - Local vs World Space

February 25, 2016 - 6:46am #2

Hi,

the problem is the following:

- if you set the ARCamera World Center Mode to FIRST_TARGET or SPECIFIC_TARGET (setting it to your Cylinder Target), the Unity world will be anchored to the target itself, so the target remains static in the Unity world. This means that even if you rotate your cylinder, the gravity vector will not change; similarly, if you change the cylinder's position (e.g. translate it), the world will move with it, so the particles (which are parented to the Cylinder Target) will instantly follow the cylinder's motion, and the whole particle system will appear glued to the cylinder's reference frame;

- on the other hand, if you set the ARCamera World Center Mode to CAMERA and define your own ARCamera position and orientation in the Unity scene, you solve the problem above, as the particles are no longer fixed in the world together with the cylinder. Now, if you hold your handheld device at a fixed location, looking at the Cylinder Target, and move/rotate just the Cylinder Target, the particle system should animate correctly in the Unity world, without being affected by the target's position (except that the particles are emitted from the target's center, if you have parented the particle system to the target).

However, the latter solution also has a problem: the effect breaks as soon as you move your device (camera) around, which is exactly what you describe with "It appears as if the emitted particles are glued to the screen."

The root cause is that the Cylinder Target only defines a pose with respect to the ARCamera (which is what makes the augmentation possible), so you need to assume that either the target is static (and you can move your camera around the target), or the camera is static (and you can move your target w.r.t. the camera).

In a scenario where you plan to move both the target and the camera (like the one you describe), you need a third input to define an "absolute" world reference frame, so that both the target and the camera can be tracked with respect to that absolute world reference.

To achieve this, however, you would need to accurately track the position and orientation of your device in 3D space; 

one way of achieving this is to use a second target (for example, an Image Target which you print and stick at a fixed location on your desk / floor / table), set the ARCamera World Center Mode to SPECIFIC_TARGET, and set the reference target to that Image Target.

By doing this, you will achieve the following:

- the Image Target defines your World reference frame 

- the Cylinder Target moves w.r.t. such world frame

- the ARCamera also moves w.r.t. such world frame

(make sure to set Max Simultaneous Image Targets to 2 in the ARCamera inspector)
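As a quick sanity check for this setup, a small script can report whether both targets are currently tracked. This is a hedged sketch: `TrackableBehaviour` and its `Status` enum come from the Vuforia Unity extension, and the field names here are illustrative:

```csharp
using UnityEngine;
using Vuforia;

// Logs when either target is lost. The world frame is only reliable
// while the Image Target (the world anchor) is being tracked.
public class TrackingMonitor : MonoBehaviour
{
    public TrackableBehaviour imageTarget;    // world anchor
    public TrackableBehaviour cylinderTarget; // the moving object

    void Update()
    {
        bool worldOk =
            imageTarget.CurrentStatus == TrackableBehaviour.Status.TRACKED ||
            imageTarget.CurrentStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
        bool cylinderOk =
            cylinderTarget.CurrentStatus == TrackableBehaviour.Status.TRACKED;

        if (!worldOk)
            Debug.Log("Image Target (world anchor) lost - camera pose unreliable");
        if (!cylinderOk)
            Debug.Log("Cylinder Target lost");
    }
}
```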

 

The caveat with the technique above is that you need to always keep both the Image Target and the Cylinder Target in view simultaneously; however, you could mitigate this by enabling Extended Tracking on the Image Target, so that you can have some tolerance in case the Image Target gets out of view for short time intervals.
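If you prefer to enable Extended Tracking from code rather than via the inspector, something like the following should work. This is an assumption-laden sketch: `StartExtendedTracking()` is exposed on the target object in recent versions of the Vuforia Unity extension, and the component reference is assigned in the inspector:

```csharp
using UnityEngine;
using Vuforia;

// Enables Extended Tracking on the Image Target that anchors the world,
// so its pose keeps being estimated even when it briefly leaves the view.
public class EnableExtendedTracking : MonoBehaviour
{
    public ImageTargetBehaviour imageTargetBehaviour; // the world-anchor target

    void Start()
    {
        if (imageTargetBehaviour != null && imageTargetBehaviour.ImageTarget != null)
            imageTargetBehaviour.ImageTarget.StartExtendedTracking();
    }
}
```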

Another possibility is to programmatically update the ARCamera position and orientation via a script, using the Android (or iOS) orientation sensors of your device; however, this would require implementing a native Android or iOS plugin and then integrating it into Unity. Also, while this would probably give you good orientation tracking, it will be challenging to get decent and reliable position (translation) tracking, as accelerometers typically suffer significant drift.
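As a side note, for the orientation part alone, Unity's built-in `Input.gyro` exposes the device attitude without a native plugin; a minimal sketch (the axis remapping below is a common convention, but the exact mapping depends on device and screen orientation, so treat it as a starting point):

```csharp
using UnityEngine;

// Drives a camera's orientation from the device gyroscope.
// Rotation only: as noted above, position tracking from
// integrated accelerometers drifts too much to be usable.
public class GyroCamera : MonoBehaviour
{
    void Start()
    {
        Input.gyro.enabled = true;
    }

    void Update()
    {
        // Remap from the sensor's right-handed frame to Unity's
        // left-handed frame (a commonly used conversion).
        Quaternion q = Input.gyro.attitude;
        transform.localRotation = new Quaternion(q.x, q.y, -q.z, -q.w);
    }
}
```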

 

A third option might be to use the Cardboard SDK for Unity, which exposes the device orientation in Unity, and update the ARCamera orientation based on the orientation reported by the CardboardHead prefab; but again, this will be OK for orientation tracking, not for position tracking.
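If you go the Cardboard route, the idea is simply to copy the head orientation onto the ARCamera every frame. A sketch, assuming both transforms are wired up in the inspector (the CardboardHead reference would come from the Cardboard SDK prefab):

```csharp
using UnityEngine;

// Copies the Cardboard-reported head orientation onto the ARCamera.
// Position is deliberately left untouched: this approach only
// provides usable orientation tracking, not translation.
public class CardboardOrientationSync : MonoBehaviour
{
    public Transform cardboardHead; // the CardboardHead from the Cardboard SDK prefab
    public Transform arCamera;      // the Vuforia ARCamera

    void LateUpdate()
    {
        arCamera.rotation = cardboardHead.rotation;
    }
}
```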

 

 
