Hello, sorry to post in such an old thread; feel free to tell me to start a new one. I just thought I should use this one as it provides clear context.
Basically I need some Unity water reflection in my latest AR project, but as everybody's no doubt aware by now, getting around the reflection bug isn't easy. Water4Example isn't the solution it appears to be at first glance: its default settings reflect very little of the other scene objects, and if you turn the reflection up you run into the same bug as with the Pro-only Daylight Water.
I've been implementing Kim's ingenious solution quoted below and have had some success.
This part is rather challenging, especially since you don't know the FOV of the camera (QCAR sets this under the hood). One approach is to do some raycasting to find the bounds of the far clip plane and fit the mesh to that. Here's a snippet that might help:
// Scale game object for full screen video image:
gameObject.transform.localScale = new Vector3(1, 1, 1 * (float)mTextureInfo.imageSize.y / (float)mTextureInfo.imageSize.x);

// Position the mesh at the far end of the perspective frustum
// Choose a point almost at the far clip plane
float dist = m_Camera.farClipPlane * 0.99f;

// Define a plane at the chosen distance
Plane farPlane = new Plane(m_Camera.transform.forward, m_Camera.transform.position + m_Camera.transform.forward * dist);

// Cast a ray along the lower left and upper right edges of the frustum
Ray ray1 = m_Camera.ScreenPointToRay(new Vector3(0, 0, 0));
Ray ray2 = m_Camera.ScreenPointToRay(new Vector3(m_Camera.pixelWidth, m_Camera.pixelHeight, 0));

// Find the points at which the rays intersect the plane
float rayDist = 0.0f;
farPlane.Raycast(ray1, out rayDist);
Vector3 p1 = m_Camera.transform.InverseTransformPoint(ray1.GetPoint(rayDist));
farPlane.Raycast(ray2, out rayDist);
Vector3 p2 = m_Camera.transform.InverseTransformPoint(ray2.GetPoint(rayDist));

// Position the video mesh at the chosen distance
gameObject.transform.localPosition = new Vector3(0, 0, dist);

// Scale the video mesh to stretch between the intersection points
gameObject.transform.localScale *= (p2.x - p1.x) / 2.0f;
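For anyone wanting to sanity-check the values that snippet produces, here's a language-neutral sketch of the same geometry in Python (assumptions on my part: a symmetric perspective frustum, camera at the origin looking down +z, and a 60° vertical FOV / 4:3 aspect / 2000-unit far clip as purely hypothetical numbers). It mimics the corner-ray intersection and compares the resulting half-width against the closed-form frustum size:

```python
import math

def corner_via_ray(dist, fov_y_deg, aspect, sx, sy):
    """Intersect the ray through a screen corner (sx, sy in [-1, 1]) with the
    plane z = dist, mimicking Camera.ScreenPointToRay + Plane.Raycast."""
    tan_half = math.tan(math.radians(fov_y_deg) / 2.0)
    # Ray direction through the corner in camera space.
    d = (sx * tan_half * aspect, sy * tan_half, 1.0)
    t = dist / d[2]  # parameter where the ray hits z = dist
    return (d[0] * t, d[1] * t)

# Hypothetical camera parameters (QCAR sets the real FOV under the hood).
dist, fov_y, aspect = 0.99 * 2000.0, 60.0, 4.0 / 3.0
p1 = corner_via_ray(dist, fov_y, aspect, -1.0, -1.0)  # lower-left corner
p2 = corner_via_ray(dist, fov_y, aspect, 1.0, 1.0)    # upper-right corner

# (p2.x - p1.x) / 2 is the half-width the snippet uses as its scale factor;
# it should equal dist * tan(fov_y / 2) * aspect.
half_width = (p2[0] - p1[0]) / 2.0
closed_form = dist * math.tan(math.radians(fov_y) / 2.0) * aspect
print(half_width, closed_form)
```

If you can log the world-space corner points in Unity, comparing them against this closed form is a quick way to confirm the ray fit is behaving.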
I currently have a plane rendering the camera feed just in front of the far clip plane of the only functioning camera in the scene, which is a perspective ARCamera (with solid colour clear flags). However, this only semi-works, and two issues prevent it from being a perfect solution:
1. The display ratio dictated by mTextureInfo (QCARRenderer.VideoTextureInfo) makes the plane the appropriate width for the screen but leaves gaps at the top and bottom, presumably because the camera is perspective rather than orthographic. I can resize the plane to get around this, but doing so stretches or obscures parts of the camera texture, which isn't a smart move.
2. Regardless of whether I resize the plane, the AR content, which renders perfectly at the centre of the screen, now drifts off target when the target moves towards the sides of the screen. I assume this is due to a disconnect between where the background is actually rendered and where the AR positioner thinks it is being rendered.
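The vertical gaps in issue 1 can be reasoned about with basic frustum geometry: at distance d a perspective camera sees a rectangle of height 2·d·tan(fov_y/2) and width height·aspect, so a plane scaled to fill the frustum width only fills its height if the video texture's aspect ratio matches the screen's. A minimal sketch, using entirely hypothetical numbers (60° vertical FOV, 4:3 screen, 16:9 video feed, 2000-unit far clip):

```python
import math

def frustum_size(dist, fov_y_deg, aspect):
    """Visible (height, width) of a perspective frustum at a given distance."""
    h = 2.0 * dist * math.tan(math.radians(fov_y_deg) / 2.0)
    return h, h * aspect

dist = 2000.0 * 0.99  # plane placed at 99% of the far clip plane
screen_h, screen_w = frustum_size(dist, 60.0, 4.0 / 3.0)

# A 16:9 video feed scaled to exactly fill the frustum width:
video_aspect = 9.0 / 16.0  # the feed's height/width ratio
plane_w = screen_w
plane_h = plane_w * video_aspect

# Because the feed's aspect ratio differs from the screen's, the plane is
# shorter than the frustum, leaving gaps at the top and bottom.
gap = screen_h - plane_h
print(screen_h, plane_h, gap)
```

So the choice really is between cropping the feed (scale the plane up to fill the height and lose the sides) or letterboxing (keep the gaps), unless the plane is scaled non-uniformly, which distorts the image as noted above.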
I'd be really grateful for any solution or advice on getting any kind of reflective water or MirrorReflection working with Vuforia AR. It seems a little crazy that this hasn't been solved in two years... I mean, think of all the cool AR things people could be doing with glasses of water that react to being poured, etc.