
How It Works: Chris Milk's The Treachery Of Sanctuary

Chris Milk’s new collaboration with The Creators Project was a tremendous feat of technical prowess. Here’s how it works.

Director Chris Milk is a pioneer of interactive experiences, whether they live on the web, like The Wilderness Downtown and Three Dreams of Black, or in real life, like Arcade Fire’s Coachella 2011 performance, Summer Into Dust. Milk’s latest project, The Treachery of Sanctuary, debuted at The Creators Project: San Francisco 2012 and will be traveling the world to all our 2012 events. The new piece is a giant triptych that takes viewers through three stages of flight using Kinect cameras and infrared sensors. We spoke with Milk, Creative Director Ben Tricklebank, and the artwork’s programming team to find out how they brought everything together to ultimately let people lose themselves for a moment.


We asked the project’s Technical Director, James George, and creative software developers Aaron Meyers and Brian Chasalow to reveal a bit of the technical wizardry that went into making the installation possible.


We worked with The Creators Project and FakeLove to create the software behind Chris Milk's latest installation for their San Francisco event at Fort Mason.

The Treachery of Sanctuary, as conceptualized by Chris Milk, is three monolithic white frames towering above a still reflecting pool. Entering the space in front of the pool, you notice your shadow appears within the first frame, as if you've stepped in front of a bright light. A flock of birds swarms at the top of the panel. Reaching up to them, you are surprised as your shadow begins to dissolve, transforming into hundreds of small birds that flutter upwards to join the flock. Within just a few moments, your silhouette has completely disintegrated, leaving no trace of you.

Photo by Bryan Derballa.

Moving to the second panel, the flock above you is larger and more menacing than the last. When you enter, the birds begin to swoop down, attacking your shadow and taking away chunks of it in their claws. The onslaught continues until you've been almost completely devoured and all that remains is a pair of stubby legs.

Entering the third panel, you find your silhouette has returned. Swing your arms up and a massive pair of wings is bestowed upon you. The wings follow your gestures, swaying with the movement of your arms.


Technical Approach

The implementation of Treachery stitched together several different technologies. We needed a way to visualize the viewers as silhouettes in front of the display so that we could augment their shadows, selectively removing parts or attaching wings. This meant we needed both the outline of the viewers and data points describing their actual posture in terms of torsos, arms, and legs.

Screenshots from the custom software built for the installation.

In order to create a flock of birds, we needed a way of efficiently animating hundreds of 3D models flying together and interacting directly with the silhouettes.

We chose to use both a 3D game development environment called Unity and a creative coding platform called openFrameworks. openFrameworks was used to access a Kinect camera sensing the presence of people; that data was then passed to the game engine, which displayed the silhouettes interacting with the bird flocks.

Both environments have unique strengths that made them work well together. openFrameworks has powerful libraries for processing camera data, which made it easy to interface with the Kinect cameras (we used the Kinect for Windows SDK, since it provides the most advanced pose tracking out there). For displaying the visuals, Unity 3D worked well, since it's built for handling lots of animated models.

Video Tracking
The system begins when a viewer steps in front of the screen. The Kinect camera discreetly built into the wall behind them recognizes the outline of the person and infers their posture. The raw image from the Kinect camera is quite noisy and requires several image filtering techniques to smooth its edges before it’s displayed as a shadow.


Photo by Jason Henry.

We attempted several techniques to produce cleaner shadows, including Delaunay triangulation, contour finding, and edge detection. While many of those techniques generated interesting effects, they didn't result in the pure shadow look we were after. The solution we settled on used an erosion pass to remove small pixel glitches, followed by a dilation pass to fill any holes, and finally a median blur to smooth things out. We averaged consecutive frames in time together to further stabilize the silhouette edges and remove noise.
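As a rough illustration of that pipeline, a clean-up pass along these lines could be written with OpenCV (which openFrameworks can wrap); the kernel sizes, blend weight, and function name below are assumptions rather than the installation's actual values.

```cpp
// A minimal sketch of the silhouette clean-up pass described above, written
// against plain OpenCV (the installation itself used openFrameworks).
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

cv::Mat cleanSilhouette(const cv::Mat& rawMask, cv::Mat& accumulator) {
    cv::Mat mask = rawMask.clone(); // 8-bit single-channel Kinect user mask

    // 1. Erode to knock out small speckles of depth noise.
    cv::Mat kernel = cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(5, 5));
    cv::erode(mask, mask, kernel);

    // 2. Dilate to fill the small holes left inside the body.
    cv::dilate(mask, mask, kernel);

    // 3. Median blur to smooth the jagged contour edges.
    cv::medianBlur(mask, mask, 7);

    // 4. Average with previous frames to stabilize the edges over time.
    if (accumulator.empty()) {
        mask.convertTo(accumulator, CV_32F);
    }
    cv::accumulateWeighted(mask, accumulator, 0.25);

    cv::Mat smoothed;
    accumulator.convertTo(smoothed, CV_8U);
    return smoothed;
}
```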

Unity and openFrameworks Integration
Since the data from the Kinect was received and processed in openFrameworks, we needed a way to visualize the interaction in the game engine Unity. In the past, we had used OSC network communication to connect the two applications, but that was way too slow for this installation. We decided to experiment with the more direct approach of a native C++ plugin for Unity.

A plugin opened up the option of low level shared memory access between the openFrameworks and Unity processes. That way both applications could share the actual RAM location where the skeletons and shadows were stored. This was super fast.
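Here is a minimal sketch of that idea, assuming a named Windows file mapping shared by the two processes (the Kinect for Windows SDK implies a Windows host). The mapping name, buffer layout, and exported function are hypothetical, and a real implementation would also need synchronization or double buffering.

```cpp
// Writer lives in the openFrameworks process, reader in the Unity native
// plugin. Both map the same named region of RAM, so no data crosses a socket.
#include <windows.h>
#include <cstdint>
#include <cstring>

static const char*  kShareName = "TreacherySilhouetteShare"; // hypothetical name
static const size_t kWidth = 640, kHeight = 480;
static const size_t kBytes = kWidth * kHeight; // one 8-bit mask byte per pixel

// openFrameworks side: publish the latest silhouette mask into shared memory.
void publishSilhouette(const uint8_t* pixels) {
    static HANDLE mapping = CreateFileMappingA(
        INVALID_HANDLE_VALUE, nullptr, PAGE_READWRITE, 0, (DWORD)kBytes, kShareName);
    static void* view = mapping
        ? MapViewOfFile(mapping, FILE_MAP_ALL_ACCESS, 0, 0, kBytes) : nullptr;
    if (view) {
        std::memcpy(view, pixels, kBytes); // real code would guard against tearing
    }
}

// Unity side: exported from the native plugin and called from C# via P/Invoke.
extern "C" __declspec(dllexport) bool ReadSilhouette(uint8_t* dst) {
    static HANDLE mapping = OpenFileMappingA(FILE_MAP_READ, FALSE, kShareName);
    static void* view = mapping
        ? MapViewOfFile(mapping, FILE_MAP_READ, 0, 0, kBytes) : nullptr;
    if (!view) return false;
    std::memcpy(dst, view, kBytes);
    return true;
}
```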

Silhouette Masking Magic Trick
Once the silhouettes and skeletons were visible in Unity, we needed a way to make them dynamic and interactive. The design for the first two panels required us to progressively hide each viewer's silhouette as they interacted with the birds. To accomplish this we built a "particle suit" of small images and attached it to each viewer's skeleton data. The particles were never drawn directly; instead they served as a reveal mask that let the silhouette image show through. As the birds emerged from or ripped apart the silhouette, particles were progressively removed and those parts of the body disappeared. Because the particles were attached to the Kinect's skeleton data, it didn't matter where you waved your arm: if it had been ripped off, it was gone!
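Purely for illustration, the reveal mask can be thought of as something like the sketch below: each particle stores a fixed offset from a skeleton joint, and only surviving particles are stamped into the mask that gates the shadow. The structs and the CPU rasterization are hypothetical stand-ins for what the installation would more plausibly do on the GPU inside Unity.

```cpp
// Hypothetical "particle suit" reveal mask: particles ride along with skeleton
// joints, and the silhouette is only shown where living particles overlap it.
#include <algorithm>
#include <unordered_map>
#include <vector>

struct Joint { float x, y; };              // 2D joint position from the Kinect

struct Particle {
    int   jointId;                         // which joint this particle follows
    float offsetX, offsetY;                // fixed offset, so it moves with the body
    float radius;
    bool  alive = true;                    // set false when a bird rips this chunk away
};

// Stamp every surviving particle into an 8-bit mask; the shadow is multiplied
// by this mask, so removed particles make that body part vanish no matter
// where the viewer moves.
void buildRevealMask(const std::vector<Particle>& suit,
                     const std::unordered_map<int, Joint>& joints,
                     std::vector<unsigned char>& mask, int width, int height) {
    std::fill(mask.begin(), mask.end(), 0);
    for (const Particle& p : suit) {
        if (!p.alive) continue;
        const Joint& j = joints.at(p.jointId);
        const int cx = static_cast<int>(j.x + p.offsetX);
        const int cy = static_cast<int>(j.y + p.offsetY);
        const int r  = static_cast<int>(p.radius);
        for (int dy = -r; dy <= r; ++dy) {
            for (int dx = -r; dx <= r; ++dx) {
                if (dx * dx + dy * dy > r * r) continue;
                const int px = cx + dx, py = cy + dy;
                if (px < 0 || py < 0 || px >= width || py >= height) continue;
                mask[py * width + px] = 255;
            }
        }
    }
}
```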


Photo by Bryan Derballa.

Bird Animation and Flocking
Birds moved through the scenes using a custom flocking algorithm that simulates bird flight. Flocking works by applying the same simple set of movement rules to each bird in the flock. The rules are statements like "move toward the average center of all the birds," "head in the same direction as the birds near you," and "move away from any birds that get too close."

By applying these simple rules to all birds, organic flocking animations emerge in the entire group. In order to have fine control of where the flocks swarmed we positioned a series of invisible points around the 3D scene. The birds were assigned points to fly towards, which let us create paths for the flocks to follow.
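A compact, boids-style sketch of those rules plus the invisible waypoints might look like the following; the weights, radii, and names are made up for illustration, and the installation's Unity integration and tuning are not reproduced here.

```cpp
// Minimal flocking step: cohesion, alignment, separation, plus a pull toward
// an assigned invisible waypoint so the flock's path can be scripted.
#include <cmath>
#include <vector>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(float s)       const { return {x * s, y * s, z * s}; }
    float length() const { return std::sqrt(x * x + y * y + z * z); }
};

struct Bird {
    Vec3 position;
    Vec3 velocity;
    int  targetIndex;   // which invisible waypoint this bird flies toward
};

void stepFlock(std::vector<Bird>& flock, const std::vector<Vec3>& targets, float dt) {
    for (Bird& self : flock) {
        Vec3 center{0, 0, 0}, heading{0, 0, 0}, separation{0, 0, 0};
        int neighbors = 0;
        for (const Bird& other : flock) {
            if (&other == &self) continue;
            const Vec3 offset = other.position - self.position;
            const float dist = offset.length();
            if (dist > 5.0f) continue;            // only nearby birds influence us
            center = center + other.position;     // cohesion: average position
            heading = heading + other.velocity;   // alignment: average direction
            if (dist < 1.0f)
                separation = separation - offset; // separation: back away if crowded
            ++neighbors;
        }
        Vec3 steer{0, 0, 0};
        if (neighbors > 0) {
            const float inv = 1.0f / neighbors;
            steer = steer + (center * inv - self.position) * 0.01f;
            steer = steer + heading * inv * 0.05f;
            steer = steer + separation * 0.1f;
        }
        // Pull toward the assigned waypoint to create a scripted flight path.
        steer = steer + (targets[self.targetIndex] - self.position) * 0.02f;

        self.velocity = self.velocity + steer * dt;
        self.position = self.position + self.velocity * dt;
    }
}
```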

The birds were created in Maya, a program for 3D animation traditionally used to create films and video games. Each bird had a series of animations done by hand as short loops or gestures, such as a peck, a flutter, or a dive bomb. By mixing code-based animations like the flocking with hand-done animations, we achieved a dynamic and scriptable yet still organic-looking behavior.

But like any good artistic experience, the magic isn’t in the technology, but in the way people interact with it. Once all the elements are in place, the technology can sink into the background and let a seamless, more emotional experience emerge.

Watch the behind-the-scenes documentary on Treachery of Sanctuary with interviews from Chris Milk, Ben Tricklebank, and the tech team to find out more about the project.