
p5.js Off Axis Projection

I started playing with p5.js & off axis projection recently, and I wanted to put together my demo and thoughts. Using Off Axis Projection, we can render content on a screen that appears truly 3-dimensional to the viewer. To give the impression of true 3D, we need to match the content to the user's perspective. My goal here has been to do this as accessibly as possible, using portable code anyone can run in a browser.

My demo is down below! I created a scene in Blender, and baked the lighting & textures using the Cycles renderer. Then I exported the scene to render in p5.js. This means the content of the scene is more or less constrained to be static, but a regular browser should be able to explore relatively high-fidelity scenes. It's a work in progress.

You'll probably have to click on the sketch to get any response, and if you don't have a camera, you'll have to uncheck the 'tracking' box underneath to use your mouse instead.

Demo

Space image out the window: credit to Felix Mittermeier on Pexels

Details: The project contains a second canvas that displays an overview of the contents, along with a representation of the viewer position. It also contains 3 sliders and a checkbox.

  • The checkbox enables/disables head tracking; with tracking disabled, you can simulate the head position with the cursor.
  • The sliders are X, Y, and Z respectively, and let you offset the view from the head-tracked position (see the sketch below).
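
For a concrete picture of how controls like these wire together, here's a minimal p5.js sketch. The ranges, default values, and the headPos placeholder are illustrative, not my demo's actual code:

```javascript
let trackingBox, sliderX, sliderY, sliderZ;
let headPos = { x: 0, y: 0, z: 300 }; // updated by the head tracker (wired up below)

function setup() {
  createCanvas(400, 400, WEBGL);
  trackingBox = createCheckbox('tracking', true); // toggle head tracking
  sliderX = createSlider(-100, 100, 0); // per-axis offsets from the tracked position
  sliderY = createSlider(-100, 100, 0);
  sliderZ = createSlider(-100, 100, 0);
}

// Combine the tracked (or mouse-simulated) position with the slider offsets
function viewerPosition() {
  const base = trackingBox.checked()
    ? headPos
    : { x: mouseX - width / 2, y: mouseY - height / 2, z: 300 };
  return {
    x: base.x + sliderX.value(),
    y: base.y + sliderY.value(),
    z: base.z + sliderZ.value(),
  };
}

function draw() {
  background(30);
  const v = viewerPosition();
  // The real demo feeds v into the off-axis projection (shown later);
  // here we just nudge a box around to show the controls working
  translate(v.x * 0.2, v.y * 0.2, 0);
  box(60);
}
```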

I used the JavaScript library headtrackr (see dependencies), which is an older head tracking library. It cleverly aimed to make head tracking more performant in the browser by running full face detection only occasionally, and using a cheaper tracking step every tick in between to keep the motion smooth.
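
Hooking it up is pleasantly small. This is roughly the setup described in headtrackr's documentation; the element ids are placeholders, and headPos is the variable from the sketch above:

```javascript
// Video + (hidden) canvas elements that headtrackr uses for capture
const videoInput = document.getElementById('inputVideo');
const canvasInput = document.getElementById('inputCanvas');

const htracker = new headtrackr.Tracker();
htracker.init(videoInput, canvasInput);
htracker.start();

// headtrackr dispatches 'headtrackingEvent' with an estimated head
// position relative to the camera; feed it into the sketch
document.addEventListener('headtrackingEvent', function (event) {
  headPos = { x: event.x, y: event.y, z: event.z };
});
```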

What Is Projection

Projection in rendering is the process of creating a 2D representation of coordinates in a 3D world. On computers, we typically render 3D scenes with perspective or orthographic projection. As Wikipedia points out, these are really categories of projection, but I've seen these words used most often to describe the same basic ideas.

Perspective rendering is the one you're probably most familiar with. It represents objects more or less the way a camera does: there's usually one vanishing point, and distant things recede in size according to the field of view. Most video games use perspective rendering. More on Wikipedia

Orthographic is the second most common. It's like perspective, but with no scaling over depth. It's frequently used to generate to-scale plans, charts, diagrams, or even 3D decals: anything that wants an image without perspective warping. More on Wikipedia
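
In p5.js's WEBGL mode, both are a single call, so it's easy to see the difference for yourself. A minimal sketch (the field of view and clip distances are arbitrary):

```javascript
function setup() {
  createCanvas(400, 400, WEBGL);
}

function draw() {
  background(200);
  // Perspective: identical boxes shrink with depth toward a vanishing point
  perspective(PI / 3, width / height, 0.1, 2000);
  // Swap in the line below to render the same scene with no depth scaling:
  // ortho(-width / 2, width / 2, -height / 2, height / 2, 0.1, 2000);
  for (let z = 0; z > -600; z -= 150) {
    push();
    translate(0, 0, z);
    box(80);
    pop();
  }
}
```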

These aren't the only ways that people represent 3D space. There are other methods and subcategories of projection, as well as 'faked' 3D styles in games or rendering environments that don't truly support 3D.

Off Axis Projection

The Problem: How do you render something so that it looks 3D to the viewer? Typical projection is designed to make things appear 3D relative to each other, not relative to the medium itself. It also almost always treats the screen as the origin, or assumes that the viewer will be in the same basic position in front of the screen. As Robert Kooima points out in his paper Generalized Perspective Projection (see citations), this is usually good enough, though not in our case. He goes on to describe the concept of Off Axis Projection in general terms, with his use case being multiple screens for Virtual Reality.

The Solution: Anamorphosis, plus viewer tracking. Anamorphosis is creating an image that looks correct only from one position, or that requires other special viewing conditions (e.g. special glasses). One well-known and entertaining example of anamorphosis is The Ambassadors by Hans Holbein the Younger, painted in 1533. You've probably heard of it:

The Ambassadors, Hans Holbein the Younger, 1533
Image credit Wikipedia
When the painting is viewed from the correct position
Image credit Wikipedia

There's a fun (unsubstantiated) story about The Ambassadors being hung in a narrow stairway, so that the skull would seem to jump out when the viewer reached exactly the right position, and scare the daylights out of them. The painting's Wikipedia page mentions it.

So if we can create an image that looks 3D from the correct position, and match that correct position to the viewer's position, we can do it! The technique has been explored by Robert Kooima & others; the creators of headtrackr used it to make a game, as Øygard notes in his Tumblr post on the subject (see citations).

Off Axis Projection essentially unlinks the vanishing point from the center of perspective rendering, so that you can move the camera and perspective around to match the user while still rendering the same content.

Michal Szkudlarek & Maria Pietruszka's paper Head-Coupled Perspective in Computer Games (see citations) was integral to my understanding of how to set up the projection matrix I needed in the time I had available, and my related algorithms in the code are modified versions of theirs.
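
To make that concrete, here's a minimal sketch of head-coupled off-axis projection in p5.js, loosely in the spirit of those papers rather than a copy of my demo's code. The screen size, units, and clip planes are illustrative, and the mouse stands in for the tracked head:

```javascript
// Treat the canvas as a fixed "window" into the scene; all the values
// here share one arbitrary unit and are illustrative
const SCREEN_W = 400;
const SCREEN_H = 400;
const NEAR = 10;

let viewer = { x: 0, y: 0, z: 600 }; // head position relative to screen center

function setup() {
  createCanvas(SCREEN_W, SCREEN_H, WEBGL);
}

function draw() {
  background(0);
  // The mouse stands in for head tracking in this sketch
  viewer.x = mouseX - width / 2;
  viewer.y = mouseY - height / 2;

  // Asymmetric frustum: project the screen rectangle's edges, as seen
  // from the eye, onto the near plane (scaled by NEAR / viewer.z).
  // Depending on p5's y-down convention you may need to flip the y signs.
  const s = NEAR / viewer.z;
  frustum(
    (-SCREEN_W / 2 - viewer.x) * s, // left
    ( SCREEN_W / 2 - viewer.x) * s, // right
    (-SCREEN_H / 2 - viewer.y) * s, // bottom
    ( SCREEN_H / 2 - viewer.y) * s, // top
    NEAR,
    5000
  );
  // Put the eye at the viewer's position, looking straight at the screen
  // plane, so the vanishing point follows the viewer
  camera(viewer.x, viewer.y, viewer.z, viewer.x, viewer.y, 0, 0, 1, 0);

  // Something sunk "behind" the screen plane to show off the parallax
  translate(0, 0, -150);
  box(100);
}
```

The key move is that the frustum stays pinned to the physical screen rectangle while the eye moves, which is exactly the unlinked vanishing point described above.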

Why p5.js? Live Rendering

My goal with this project has been to make this runnable in as many places (and for as many people) as possible. The technique requires tracking the viewer and rendering to match their perspective as they move, so it demands real-time rendering. I believe p5.js fits that goal best for now. It's quick to load, accessible on many mobile devices, and relatively cheap in terms of performance. With this setup you can (in theory) tape a tablet to the wall, and you have a gallery installation.

To get this running on as many devices as possible, we process everything we can beforehand. Lighting & textures are baked and compressed, so all the script has to do is track the viewer, project the scene, and display it. This gives us a lot of flexibility in how we do these steps, and it leaves extra performance budget for head tracking, which isn't cheap. In addition, the actual content & quality of the scene is relatively independent of the way we display it, so it could be ported to other engines like three.js or even Unity.
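
As a sketch of what that buys us at runtime: with everything baked, the draw loop reduces to "position the camera, draw a textured mesh". The file names here are placeholders, and the projection setup from earlier is elided:

```javascript
let sceneMesh, bakedTex;

function preload() {
  // Geometry and baked lighting exported from Blender (placeholder names)
  sceneMesh = loadModel('assets/room.obj', true);
  bakedTex = loadImage('assets/room_baked.png');
}

function setup() {
  createCanvas(800, 600, WEBGL);
  noStroke();
}

function draw() {
  background(0);
  // ...off-axis frustum + camera setup from the viewer position here...
  texture(bakedTex); // lighting is baked in, so no live lights needed
  model(sceneMesh);
}
```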

Because of this work, my demo should hopefully run in your browser right now, although the head tracking might be a bit laggy on some phones. If it doesn't, please feel free to contact me with any information you can provide so I can try to address the issue.

Limitations

As I've noted elsewhere, this technology has several severe limitations.

  • Because it relies on head tracking, this only really works from the perspective of one viewer (and perhaps other viewers standing exceptionally close).
  • Head tracking is expensive, and limited to what the system's camera can recognize, which in turn depends on the lighting in the environment, the quality of the camera, etc. It works best at short-to-medium range.
  • The fidelity of the content and the quality of the head tracking are limited by the resources of the system.

Dependencies

This project would not have been possible without a pre-existing JavaScript head tracker I could pair it with. I was lucky to find @auduno's headtrackr, and lucky that it worked so well for my purposes.

Citations

Kooima, R. (2009). Generalized perspective projection. J. Sch. Electron. Eng. Comput. Sci., 6. src

Øygard, A. M. (2012, June 15). Head Tracking with WebRTC. auduno.tumblr.com. src

Szkudlarek, M., & Pietruszka, M. (2013). Head-coupled perspective in computer games. src

