Hi,
I have a really frustrating problem. I have a very simple app with just one frame marker (0). I load a JPEG dynamically from a URL and then render it as a texture on a plane game object. This works as expected on a MacBook Pro in the Unity Player (see attachment MacBook Pro), but displays as a magenta square on the iPad (see attachment iPad). It's running exactly the same code (see below).
Can someone tell me what I'm doing wrong? The frustrating part is that it works in Unity Player, but not on the device, which means I'm now hopelessly stuck. Can someone suggest an alternate way of doing this?
I'm using:
Vuforia 5.0.6
Unity 5.2.1
OS X El Capitan
Xcode 7.1
iPad 4th Gen iOS 8.3
Source code:
```csharp
using UnityEngine;
using System.IO;
using System.Collections;

namespace Vuforia
{
    /// <summary>
    /// A custom handler that implements the ITrackableEventHandler interface.
    /// </summary>
    public class DynamicTextureTrackableEventHandler : MonoBehaviour,
                                                       ITrackableEventHandler
    {
        #region PRIVATE_MEMBER_VARIABLES

        private TrackableBehaviour mTrackableBehaviour;
        private Texture2D texture;
        private string filePath;

        #endregion // PRIVATE_MEMBER_VARIABLES

        #region UNITY_MONOBEHAVIOUR_METHODS

        IEnumerator Start()
        {
            mTrackableBehaviour = GetComponent<TrackableBehaviour>();
            if (mTrackableBehaviour)
            {
                mTrackableBehaviour.RegisterTrackableEventHandler(this);
            }

            texture = new Texture2D(400, 400, TextureFormat.DXT1, false);
            filePath = "http://10.20.11.38/Server/Services/Media1.jpg";
            WWW www = new WWW(filePath);
            yield return www;
            this.texture = www.texture;
        }

        IEnumerator LoadImageUrl(WWW www)
        {
            yield return www;
        }

        #endregion // UNITY_MONOBEHAVIOUR_METHODS

        void OnGUI()
        {
            GUI.Box(new Rect(10, 20, 780, 35), filePath);
            GUI.Box(new Rect(10, 70, 600, 600), this.texture);
        }

        #region PUBLIC_METHODS

        /// <summary>
        /// Implementation of the ITrackableEventHandler function called when the
        /// tracking state changes.
        /// </summary>
        public void OnTrackableStateChanged(
            TrackableBehaviour.Status previousStatus,
            TrackableBehaviour.Status newStatus)
        {
            if (newStatus == TrackableBehaviour.Status.DETECTED ||
                newStatus == TrackableBehaviour.Status.TRACKED ||
                newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED)
            {
                OnTrackingFound();
            }
            else
            {
                OnTrackingLost();
            }
        }

        #endregion // PUBLIC_METHODS

        #region PRIVATE_METHODS

        private void OnTrackingFound()
        {
            Renderer[] rendererComponents = GetComponentsInChildren<Renderer>(true);
            Collider[] colliderComponents = GetComponentsInChildren<Collider>(true);

            // Enable rendering:
            foreach (Renderer component in rendererComponents)
            {
                component.enabled = true;
            }

            // Enable colliders:
            foreach (Collider component in colliderComponents)
            {
                component.enabled = true;
            }

            Debug.Log("Trackable " + mTrackableBehaviour.TrackableName + " found");

            GameObject go = GameObject.CreatePrimitive(PrimitiveType.Plane);
            go.transform.parent = mTrackableBehaviour.transform;
            go.transform.localPosition = new Vector3(0f, 0f, 0f);
            go.transform.localRotation = new Quaternion(0f, 90f, 0f, 0f);
            go.transform.localScale = new Vector3(0.15f, 0.15f, 0.15f);
            go.GetComponent<Renderer>().material.mainTexture = this.texture;
            go.SetActive(true);
        }

        private void OnTrackingLost()
        {
            Renderer[] rendererComponents = GetComponentsInChildren<Renderer>(true);
            Collider[] colliderComponents = GetComponentsInChildren<Collider>(true);

            // Disable rendering:
            foreach (Renderer component in rendererComponents)
            {
                component.enabled = false;
            }

            // Disable colliders:
            foreach (Collider component in colliderComponents)
            {
                component.enabled = false;
            }

            Debug.Log("Trackable " + mTrackableBehaviour.TrackableName + " lost");
        }

        #endregion // PRIVATE_METHODS
    }
}
```
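As an aside on the download path: to rule out the automatic texture compression that `www.texture` applies (DXT on desktop, PVRTC on iOS), one alternative is to decode the downloaded bytes into an uncompressed texture with `Texture2D.LoadImage`. This is only a minimal sketch under the Unity 5.x `WWW` API, using the same URL as in the code above; it is not part of the original post.

```csharp
using UnityEngine;
using System.Collections;

public class UncompressedTextureLoader : MonoBehaviour
{
    // Same URL as in the post above.
    private const string FilePath = "http://10.20.11.38/Server/Services/Media1.jpg";

    public Texture2D Texture { get; private set; }

    IEnumerator Start()
    {
        WWW www = new WWW(FilePath);
        yield return www;

        if (!string.IsNullOrEmpty(www.error))
        {
            Debug.LogError("Download failed: " + www.error);
            yield break;
        }

        // Decode the JPEG bytes into an uncompressed RGBA32 texture.
        // Unlike www.texture, LoadImage does not compress per platform,
        // so the texture is identical in the Editor and on iOS.
        Texture = new Texture2D(2, 2, TextureFormat.RGBA32, false);
        Texture.LoadImage(www.bytes);
    }
}
```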
Attachment | Size
---|---
(screenshot, image not available) | 1.32 MB
(screenshot, image not available) | 1.33 MB
Hi,
I've come back to this problem and have been trying to solve it, and it appears to be iOS specific. As described above, I have a very simple scene with an AR Camera (using the Vuforia extension for Unity) and one frame marker. When the frame marker is recognised (by Vuforia), the OnTrackingFound() method is called. In this method I load a JPG image, convert it to a texture (it doesn't matter whether I use PVRTC compression or not), create a 3D plane programmatically, and render the texture on the plane (see the code above).
Now this works fine in the Unity Editor. However, when I build for iOS and deploy to the iPad (via Xcode), the texture is not rendered: a magenta-coloured shape with the correct dimensions is rendered instead. I have definitively proven that this is NOT a texture loading/compression problem, because in Unity I can add the 3D plane at design time (rather than programmatically, as in my code) while still adding the texture at run-time, and that setup works on the iPad (i.e. the correct image is rendered).
So my question is: why does adding the 3D plane in the Unity Editor produce the correct texture on the iPad, while adding the 3D plane in code (as above) doesn't? Can someone tell me what is missing in my code?
Now I'm using:
Unity 5.3.4
Vuforia 5.5.9
iPad Air 2 (16GB)
iOS 9.2.1
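One detail worth checking, given the symptom described above: in Unity, a magenta surface on device usually means the shader the material needs is missing from the build. A primitive created at run-time with `CreatePrimitive` gets the built-in Default-Material (Standard shader), and if no scene object references that shader, Unity can strip it from the iOS build, while the Editor always has it available. The fragment below is only a hedged sketch of creating the plane with an explicitly assigned shader; it assumes the `mTrackableBehaviour` and `texture` fields from the handler posted earlier, and "Unlit/Texture" is just an example shader name.

```csharp
// Sketch: create the plane with an explicitly assigned shader rather than
// relying on CreatePrimitive's default material, which uses the Standard
// shader and may be stripped from device builds.
GameObject go = GameObject.CreatePrimitive(PrimitiveType.Plane);
go.transform.SetParent(mTrackableBehaviour.transform, false);
go.transform.localScale = new Vector3(0.15f, 0.15f, 0.15f);

// Shader.Find only sees shaders actually included in the build. Adding the
// shader under Project Settings > Graphics > Always Included Shaders (or
// assigning a material created in the Editor) avoids the stripping problem.
Shader shader = Shader.Find("Unlit/Texture");
if (shader == null)
{
    Debug.LogError("Shader not in build - check Always Included Shaders");
}
else
{
    Renderer renderer = go.GetComponent<Renderer>();
    renderer.material = new Material(shader);
    renderer.material.mainTexture = this.texture;
}
```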