As it turns out, the characters in 'Mad God' were created using CGI, claymation, and fur from Phil Tippett's cat.
You might call Phil Tippett the granddaddy of special effects for cinema. A longtime advocate for claymation, stop-motion, and other analog techniques, the Oscar-winning visual effects artist is the brilliant mind behind illusions like the holo-chess scene in Star Wars, the AT-AT walkers in The Empire Strikes Back, and the animated sequences in RoboCop. He’s also internet-famous for his role as “Dinosaur Supervisor” on Jurassic Park, and though his involvement in Spielberg’s 1993 dino flick spawned an enduring meme, it also marked Tippett's first exposure to realistic CGI.
“Silent film to sound, black-and-white to color, TV coming in and changing everything—from the beginning, technology has changed and will always change. But what was really cool about working with George Lucas, Steven Spielberg, and Paul Verhoeven was that it was a much more open kind of creative situation,” Tippett tells The Creators Project. “I prefer working on stuff where no one knows what the hell's going on and you're really free. That’s what drew me to working with VR: it was the Wild West.”
Mad God, a one-minute VR film produced in partnership with WeVR, is both the next technological jump in Tippett’s career and a continuation of the dark, dystopian sci-fi short film series he's been working on since the early 90s. Mad God was initially nothing more than a few scenes shot on 35mm, depicting a handmade world filled with monsters, mad scientists, and “shit men”—sinewy humanoids made of metal skeletons and hair from Tippett’s cat.
“About six years ago, I saw some old footage from Mad God and asked Phil, ‘Why is this not happening? Can we try to shoot it using digital cameras?’” VFX Supervisor Chris Morley says. “Phil gave us the go-ahead, so we built a staircase set and shot going down it. You could tell that it sparked something in Phil, because he started helping. We've been working on the first episodes of Mad God, and now this VR reboot, ever since.”
Mike Breymann of Kaleidoscope VR was the first to suggest converting Mad God to VR. With Tippett on board, they partnered with WeVR, a Venice, CA-based production company founded by Anthony Batt, Scott Yara, and Neville Spiteri, to bring it to life. WeVR creates story-driven VR experiences, pairing original creative voices, like Reggie Watts, Janicza Bravo, Jon Favreau, and the Gregory Brothers, with advances in VR technology. Their process focuses on “getting the first draft of an idea down, then shaping it into a VR story worth telling,” according to Batt.
“Once we start to ask, Why would I want to be in that specific scene?, we try to capture the original idea and feeling,” Batt explains. “If you look at Janicza Bravo's piece, [Hard World for Small Things], she's an independent filmmaker and creative person, but she didn't fully know how to think about being immersed in a VR story. So the process became having her write the story how she felt comfortable, and then we contemplated the conversations and nuances that she wanted wrapped around you, the viewer.”
To translate Mad God to VR, Tippett and his team similarly dove into ways to condense their story into a one-minute tale. They landed on a claustrophobic, 360-degree view of a hellish landscape, allowing the visitor to stand among the shit men as they answer to a creature called the She-It, before all comes to a crashing end. For Tippett, sound was a key aspect in guiding the viewer’s attention, replacing traditional transitions and cuts.
“I conceived of the action simultaneously as sound and picture and drew up a series of storyboards that marked out cardinal directions—North, South, Southwest, and so forth,” Tippett says. “For each quadrant, I then did a set of boards that timed out exactly to what happens in every second of the choreography. That's pretty much how we laid out the whole thing.”
As Tippett, Morley, and a team of animators—including Tom Gibbons and Chuck Duke—worked on realizing this dark, fantastical world, they homed in on a key challenge. “The initial idea was to have a 360-degree set with trap doors all around and have each frame be animated. That would've been 1,800 frames of animation. You know, Phil did that when he was like 30 years old on the AT-AT walkers in [The Empire Strikes Back], but us now—we’re not gonna be able to handle that anymore,” Batt says.
“To get around that problem, we shot 1,500 frames of the shit men on green screen sound stages, based on action cues that Phil described,” he continues. “Then, we used compositing software to lay them into a separately shot world. It was a hybrid approach, but we're not ‘pure’ in any sense of the word. We do whatever we have to, to make the best image, but it was so cool to use old-fashioned techniques with new technology.”
The team also shrank the distance between the left and right eye of the VR camera in post, sizing down from a human’s interpupillary distance to that of the shit men, attaining the scale needed to sell the film’s reality. It’s just one subtle example of the newly possible immersion afforded by VR, and based on the success of Tippett’s collaboration, Batt envisions more and more traditional filmmakers dabbling in the new technology, changing their cinematic vocabulary as the medium becomes increasingly interactive.
“I think that, in the case of filmmakers who’ve developed certain techniques that are ‘their thing,’ they’ll learn that in VR, there are all sorts of subtleties that can be used instead to create a moment,” Batt says. “So if you’ve got your Kubricks and Hitchcocks starting today, they won’t be using any ‘long takes.’ They’ll want to take you into a room and let you get real uncomfortable, until you realize that the environment is responding to you when you move, just like real life.”