
Physically Based Planar Refraction Shader

A while ago I spent some time figuring out how to get glass “aquarium tank” style refraction working in a vertex shader targeting mobile VR and thought I’d share the results.

Background

I’m lucky enough to be working a great deal with desktop VR currently, both at Popup Asylum and my day job, but there have been a lot of mobile VR developments recently and I wanted to explore that area, thinking about what effects could be achieved on restricted hardware that might not have been tried before. With Popup Asylum having a fairly large library of mobile-ready underwater assets at my disposal, and PA Particle Field being fairly adept at handling large schools of fish, I began to consider a single-room non-Euclidean aquarium that would show off the assets but also fit well with the design and performance restrictions of mobile VR.

With that idea in mind I started googling aquariums for reference, and a common visual factor of aquarium videos was the way the glass refracted the contents of the tank. Refraction is usually considered a heavy effect, and I like a challenge, so I decided this would be a good place to start.

What struck me was that this kind of planar refraction could be achieved in the vertex shader, since each point in the real image maps to a single point in the refracted image, unlike rippled water or distorted glass, where multiple points in the refracted image can map to a single point of the real image. By defining a geometric plane with a refractive index in the shader we can achieve a ray-traced style of refraction. The Refract function below is commented, but I’ll go through it step by step.

Shader Function

The function takes 3 arguments: a vertex position (or any position) in world space, a float4 describing the normal and position of the plane, and 1/refractive index. It returns the refracted point for that position in world space.
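
Assembled from the steps below, the whole function looks roughly like this; the signature is my reading of the description above (and it assumes UnityCG.cginc is included for _WorldSpaceCameraPos), so treat it as a sketch rather than the exact original:

float3 Refract(float3 position, float4 surface, float refractionIndex)
{
	//ray origin is the camera position
	float3 viewerPosition = _WorldSpaceCameraPos.xyz;
	//ray end is the vertex's undistorted position
	float3 vertexPosition = position;
	//vector from the camera to the vertex, and its direction
	float3 worldRay = vertexPosition - viewerPosition;
	float3 worldRayDir = normalize(worldRay);
	//surface defines a plane: xyz is the normal, w is the distance along it
	float3 worldPlaneNormal = surface.xyz;
	float3 worldPlaneOrigin = worldPlaneNormal * surface.w;
	//refract the view ray as it crosses the plane
	float3 refraction = refract(worldRayDir, normalize(worldPlaneNormal), refractionIndex);
	//raycast from the vertex, backwards along the refraction vector, to the plane
	float denom = dot(-worldPlaneNormal, -refraction);
	float3 p010 = worldPlaneOrigin - vertexPosition;
	float t = dot(p010, -worldPlaneNormal) / denom;
	float3 intersection = vertexPosition + refraction * -t;
	//the perceived position lies along the camera-to-intersection direction, at the original ray length
	float3 originToIntersection = intersection - viewerPosition;
	return viewerPosition + normalize(originToIntersection) * length(worldRay);
}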

Step by Step

The function starts by setting up the variables that will define the initial ray, namely a position and a direction:


//ray origin is the camera position
float3 viewerPosition = _WorldSpaceCameraPos.xyz;
//ray end is the vertex's undistorted position
float3 vertexPosition = position;
//get the vector from the camera to the vertex
float3 worldRay = vertexPosition - viewerPosition;
//normalize it for direction
float3 worldRayDir = normalize(worldRay);

This takes the built-in _WorldSpaceCameraPos variable as the ray start and the normalized direction from the camera to the vertex as the ray direction.
Next the plane is defined, which consists of a normal and a position in world space along that normal:


//surface is a float4 that defines a plane: xyz is the normal, w is the distance along it
float3 worldPlaneNormal = surface.xyz;
//define a known position on the plane
float3 worldPlaneOrigin = worldPlaneNormal * surface.w;
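
As an example of how surface could be filled in (my own illustration, not from the original post), a horizontal surface at world height 1.5 with its normal pointing up would be:

//example only: a horizontal plane 1.5 units up
float4 surface = float4(0, 1, 0, 1.5);
//worldPlaneOrigin then works out to float3(0, 1, 0) * 1.5 = float3(0, 1.5, 0)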

[Figure: refract_1]

Now that the initial ray and plane are defined, the ray direction is refracted about the plane normal using the refractive index. Cg/HLSL/GLSL has a built-in function for this:

//get the vector result of the worldRay entering the water
float3 refraction = refract(worldRayDir, normalize(worldPlaneNormal), refractionIndex);
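
For reference, refract(I, N, eta) expects normalized vectors and takes eta as the ratio of refractive indices, which is why 1/refractive index is passed in (roughly 1.0/1.33 for air into water). The sketch below is purely an illustration of the standard formula the intrinsic computes; refractSketch is not a real function:

//what refract(I, N, eta) works out to for normalized I and N (illustration only)
float3 refractSketch(float3 I, float3 N, float eta)
{
	float k = 1.0 - eta * eta * (1.0 - dot(N, I) * dot(N, I));
	//a negative k means total internal reflection, in which case a zero vector is returned
	return k < 0.0 ? float3(0, 0, 0) : eta * I - (eta * dot(N, I) + sqrt(k)) * N;
}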

[Figure: refract_2]

This gives us the direction that a ray crossing the plane would take. Normally in a ray tracing engine this direction would be followed to find where it intersects some geometry, and the resulting pixel color would be returned. In this case we want to do the opposite: we already have the pixel color (it will be looked up in the fragment shader); what we need to know is where to draw it on screen. This can be achieved approximately by performing a ray-plane intersection from the vertex position back to the plane, along the reversed refracted ray direction.


//raycast from the vertex, backwards along the refraction vector, to the plane
float denom = dot(-worldPlaneNormal, -refraction);
//vector from the vertex to the known point on the plane
float3 p010 = worldPlaneOrigin - vertexPosition;
//distance along the reversed refraction vector to the plane
float t = dot(p010, -worldPlaneNormal) / denom;
//the point where the reversed refracted ray crosses the plane
float3 intersection = vertexPosition + refraction * -t;
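
The raycast above is just the standard ray-plane intersection: a point along the reversed ray is vertexPosition + s * (-refraction), and requiring it to lie on the plane, dot(vertexPosition + s * (-refraction) - worldPlaneOrigin, worldPlaneNormal) = 0, solves to s = dot(worldPlaneOrigin - vertexPosition, worldPlaneNormal) / dot(-refraction, worldPlaneNormal), which is the t computed above (the negated normals cancel out).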

[Figure: refract_3]

Finally, starting from the camera and moving along the direction to that intersection point by the length of the initial ray gives the refracted position:


//get the vector from the camera to the intersection, this is the perceived position
float3 originToIntersection = intersection - viewerPosition;
//starting from the camera, move along the perceived position vector by the original ray length
return viewerPosition + normalize(originToIntersection) * length(worldRay);

[Figure: refract_4]

This can then be fed into the rest of the vertex shader.
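
A minimal sketch of how that could be wired into a vertex function, assuming the Refract function above is defined in the same CGPROGRAM block and UnityCG.cginc is included (the property names _Surface and _RefractionIndex are just illustrative, not from the original):

//illustrative properties
float4 _Surface;         //xyz = plane normal, w = distance along the normal
float _RefractionIndex;  //e.g. 1.33 for water

struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

v2f vert (appdata v)
{
	v2f o;
	//object space to world space
	float3 worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
	//move the vertex to its refracted world position
	float3 refracted = Refract(worldPos, _Surface, 1.0 / _RefractionIndex);
	//project the refracted position instead of the original one
	o.pos = mul(UNITY_MATRIX_VP, float4(refracted, 1.0));
	o.uv = v.uv;
	return o;
}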

The result is a refraction that behaves realistically and displays the geometry from a slightly different viewpoint, as a real refraction does. This isn’t totally accurate, since the refracted ray is calculated from the initial ray direction; for full accuracy the refracted ray would need to be calculated using the vector from the camera to the intersection position, but that would require integration and I felt the result was close enough without it.

This produces a very clean refraction, like the glass of a fish tank, but I would still use this approach for a distorted refraction such as rippled water (still planar overall). Usually a distorted refraction is a render texture of the scene from the current camera, looked up with UV offsets sourced from a texture; it isn’t physically based at all, but it creates a nice mock refraction distortion effect. The style of refraction outlined above could be used to generate the render texture with some degree of realism, and the mock distortion effect could then be applied to that texture.
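
For what it’s worth, the usual mock-distortion lookup described above boils down to something like the following fragment sketch, reusing the v2f struct from the earlier example (_RefractionTex, _DistortTex and _DistortStrength are illustrative names, not from the original):

//illustrative properties
sampler2D _RefractionTex;   //render texture of the scene, e.g. generated using the refraction above
sampler2D _DistortTex;      //texture whose red/green channels encode a UV offset
float _DistortStrength;

fixed4 frag (v2f i) : SV_Target
{
	//unpack the offset from [0,1] to [-1,1] and scale it
	float2 offset = (tex2D(_DistortTex, i.uv).rg * 2.0 - 1.0) * _DistortStrength;
	//sample the refraction render texture at the distorted UV
	return tex2D(_RefractionTex, i.uv + offset);
}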

The only other thing to add is that it’s now quite easy for something to be unintentionally culled because it’s outside the camera’s regular field of view but still in view through the refraction. To prevent objects being culled I used a behaviour that modified the object’s position to its refracted position before the camera’s culling.
