Neil Huxley is back on The Art of VFX. He previously talked about his work at Prime Focus on AVATAR. He has since joined Mothership as a director. Accompanied by VFX Supervisor Aladino Debert, he explains his work on this DEAD SPACE 3 commercial.
Can you tell us about your work at Mothership?
I’ve been signed as a director at Mothership since January 2012 and have been getting more and more video game spots and cinematic jobs. The video game space is expanding rapidly. I love video games so it’s a great place to be right now.
How did you get involved on this show?
Draftfcb San Francisco wanted to work with us. They had their concept, so I gave them a visual pitch on how I thought their story could be told. I am a huge Dead Space fan. We all kept referencing the same movies, like ALIEN and THE THING, so we were on the same page from the beginning.
What was your approach on this commercial?
We try and implement our virtual production pipeline on CG projects whenever it’s appropriate, treating them like live-action film productions. This more sophisticated approach means that instead of just producing a bunch of shots that don’t really tie together, you end up with a cohesive visual piece that shares assets across the show, which lets us create a better end product. Creatively all of the artists have an investment in the project. The story here has three acts, and the characters have story arcs, the same as in a real film, so it engages everyone involved in the creation on a deeper level.
In broad strokes, our usual process consists of creating a series of storyboards and a board-o-matic to guide our motion capture session, which is done here in DD’s virtual production studio by Gary Roberts and his team. Once that session is completed, an edit of selects is created. With that edit in hand, we use the virtual camera system here in our virtual production studio to create realistic cameras for each shot.
From there, the production moves into Maya (where the initial model and rig were created). In Maya, our animators layer in all the nuanced animation that motion capture is not capable of producing (weight, hands, fur, helmet animation in this case, etc.). Once the animation is approved, all lighting is done with our custom HDR-centric pipeline in V-Ray. Concurrently, all effects are created using traditional particle systems in Maya. At the end, everything is put together in Nuke, with the final touches added in Flame.
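For readers curious what an HDR-centric dome-light setup can look like in practice, here is a minimal Maya Python sketch. It is an illustration only, not Digital Domain’s custom pipeline: the VRayLightDomeShape node type ships with V-Ray for Maya, but the attribute names and the HDR path are assumptions that may vary by version.

```python
# Minimal sketch (not DD's pipeline): drive a V-Ray dome light with an
# HDR environment map in Maya. The attribute names (useDomeTex, domeTex)
# follow V-Ray for Maya's dome light but may differ between versions.
import maya.cmds as cmds

def create_hdr_dome(hdr_path):
    """Create a V-Ray dome light lit by the given HDR image."""
    dome = cmds.shadingNode('VRayLightDomeShape', asLight=True)
    tex = cmds.shadingNode('file', asTexture=True)
    cmds.setAttr(tex + '.fileTextureName', hdr_path, type='string')
    cmds.setAttr(dome + '.useDomeTex', 1)
    cmds.connectAttr(tex + '.outColor', dome + '.domeTex', force=True)
    return dome

create_hdr_dome('/path/to/overcast_sky.hdr')  # hypothetical HDR path
```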
Did you receive specific directions and references from Electronic Arts?
Obviously we had to keep things in canon with the game so EA shared a lot of stuff with us like game assets, concept art etc. For example, EA wanted the skies toward the end of the piece to be angrier, more stormy, more like the game reference we had, but we still wanted to keep it in our realistic, photographic world. The team did a great job in mixing both the game reference and our own photographic, real-world reference together to create a very believable climax to the environment.
Can you tell us more about the motion capture sessions?
Motion capture for this show was pretty straightforward. One day, one actor, a couple of witness cameras and the crew, and you are off. The action was simple – it is Isaac walking through snow, pretty much. Dan Southworth was our mocap actor, and I’ve worked with Dan before on THE DARKNESS II. He’s great. He gets it and gives you exactly what you need. The ending of the piece was still not thoroughly fleshed out at this point, so we captured a couple of different endings. Aladino (VFX Supervisor) and I always wanted Isaac to jump into the pit at the end because we thought it was badass. I’m glad we got to keep that in there.
What were the advantages of the vCam for the shooting?
Well, it takes the responsibility for the camera away from the animators and gives it to the director, which is where it needs to be. That allows us to get the desired shot faster, in real time, with minimal cleanup required, rather than having me sit behind an animator noodling with chopsticks, which can be very tedious.
What is your work methodology with the actors?
I always try and get them in the headspace of what world they are in and what character they are playing. The actors have to imagine everything in front of them, which I am sure can be tough for some people. It reminds me a lot of being a kid, when you imagine these worlds in which you are a space soldier or something. That same imagination is required here. It’s my excuse for being a big kid!
Can you tell us in detail about the creation of the frozen environment?
Creating interesting snow scenes is tough, and the last thing we wanted was white scene after white scene without visual interest. The monochromatic look was a conscious design choice: raking light across the scenes would create harder shadows and contrast. The grey was a good lighting reference for us; we didn’t want bright blue skies. Tau Volantis is a hostile environment, and we wanted to represent that fully.
How did you manage the challenge of blending the live-action head and the CG armor?
Integrating the live action head was also a challenge, but one we relish. When we have a live-action component, it means we get a chance to push the CG to that photo-feel level so everything sits nicely together in our world. These are the shows I love working on the most.
What was the biggest challenge on this project and how did you achieve it?
Schedule is always a challenge. The team turns these shots around in about 6 weeks after we shoot mocap. We work with very aggressive schedules compared to some shops that get 6 months for a cinematic.
Was there a shot or a sequence that prevented you from sleeping?
Thanks to Aladino, no. I slept like a baby! (laughs)
What do you keep from this experience?
It was a great show and I worked with some great people. The team at Digital Domain is awesome. I loved the combination of live action and CG, and I look forward to my next chance to do more of this work. And of course, it was a huge thrill to work with Gunner Wright, the actor who plays Isaac Clarke in the game. He was awesome and it was a great experience for me.
What is your next project?
I’m involved in two big video game cinematics for E3 and the VGAs right now for Mothership, which is fantastic. PORTRAIT OF AN ASSASSIN is my passion project that has been bubbling away for a few years, and I could talk for hours about it. It’s a documentary about an ex-boxer by the name of Jimmy Flint, a.k.a. The Wapping Assassin, who should have been world champion and didn’t quite make it. It’s not really a sports biopic though. It deals with some interesting and tragic themes of loss and redemption that I think we can all relate to. Boxing is a metaphor for life; talk to any fighter and they will tell you the same thing.
How did your animators work with the mocap data and enhance it?
Mocap is always used as a strong starting point, with the idea of enhancing it in production. It gives you a priceless amount of natural performance, but there are things that cannot be captured or particular actions that might not have been recorded. For instance, any soft deformations of the body usually have to be added, as well as secondary animation for things like hands and fingers. Besides that, there’s always a certain amount of art direction that goes into every shot, and for that our artists layered keyframed animation on most shots. In this particular spot we did not have facial animation, but that would be something we also work on a lot after the initial mocap session.
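As an illustration of that layering idea (a generic Maya approach, not Digital Domain’s actual toolset), additive animation layers let an artist keyframe corrections on top of baked mocap curves without touching the capture itself. The control names below are hypothetical.

```python
# Sketch: keyframe corrections on an additive anim layer over baked
# mocap, leaving the original capture data untouched.
import maya.cmds as cmds

def add_correction_layer(controls, layer_name='mocap_polish'):
    """Create an anim layer and add the given rig controls to it."""
    layer = cmds.animLayer(layer_name)
    cmds.select(controls)
    cmds.animLayer(layer, edit=True, addSelectedObjects=True)
    return layer

# 'hand_L_ctrl' and 'hand_R_ctrl' are hypothetical control names.
layer = add_correction_layer(['hand_L_ctrl', 'hand_R_ctrl'])
cmds.setKeyframe('hand_L_ctrl.rotateZ', time=48, value=12.0, animLayer=layer)
```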
How did you create the shaders and the rendering for the armor?
Shaders were created with a combination of ZBrush and Photoshop textures, all compiled into V-Ray shader networks. All rendering was done in V-Ray.
Can you tell us in detail about the creation of the frozen environment?
We are quite proud of the development work we did on shaders for the environment. For instance, CG artist Casey Benn created a shader for the rocks that took into account the normals of the geometry as well as world rotation to calculate how much exposed rock vs. ice/snow to show. It gave us an immense amount of flexibility when art directing each environment. The rest of the environments were created with basic geometry and snow shaders, while the wide shots and some backgrounds featured incredible matte paintings by Shannan Burkley.
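The production shader was a V-Ray network, but the core idea, deriving snow coverage from how much a surface faces up in world space, can be sketched in a few lines of standalone Python. The threshold and softness values here are illustrative assumptions; rotating a rock changes its world-space normals, which is why coverage responds to orientation as described.

```python
# Illustrative math only, not the production V-Ray network: surfaces
# facing world-up accumulate snow, steep faces show exposed rock.
def smoothstep(edge0, edge1, x):
    """Hermite interpolation clamped to [0, 1]."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def snow_coverage(normal_ws, threshold=0.4, softness=0.25):
    """Return a 0..1 snow amount from a world-space unit normal."""
    facing_up = normal_ws[1]  # dot(normal, world-up (0, 1, 0))
    return smoothstep(threshold - softness, threshold + softness, facing_up)

print(snow_coverage((0.0, 1.0, 0.0)))  # flat top -> 1.0, fully snowed
print(snow_coverage((1.0, 0.0, 0.0)))  # vertical face -> 0.0, bare rock
```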
How did you create the storm and other FX such as the lasers and breath?
All the effects on the spot were generated with Maya particles by Erik Ebling and Karl Rogovin. In this spot, the snow blizzard was a character on its own, but it was important for us to make sure the snow and breath were as natural looking as possible so as to not take away from the rest. We experimented a lot with dynamic effects, wind, turbulence and the like to arrive at a library of elements that worked really well. The laser beams were actually rendered geometry that the compositors mixed with practical smoke elements.
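For a flavor of what such a setup involves, here is a bare-bones wind-and-turbulence rig using classic Maya particle commands. The rates, magnitudes and shapes are placeholder assumptions, far simpler than the library of elements described above.

```python
# Bare-bones blizzard sketch: a volume emitter feeds a particle object,
# then air (steady wind) and turbulence (gusting) fields drive it.
import maya.cmds as cmds

emitter = cmds.emitter(type='volume', volumeShape='cube', rate=2000)[0]
snow_tr, snow_shape = cmds.particle(name='snow')
cmds.connectDynamic(snow_tr, emitters=emitter)

wind = cmds.air(directionX=1, directionY=-0.2, directionZ=0,
                magnitude=30, speed=5)[0]
turb = cmds.turbulence(magnitude=15, frequency=0.5)[0]
cmds.connectDynamic(snow_tr, fields=[wind, turb])
```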
How did you manage the challenge of blending the live-action head and the CG armor?
That’s Digital Domain’s specialty, so we were well prepared for that challenge. The trick is to have a plan when you are on set shooting live action, and then a good team of compositors with experience combining plates with CGI. Our compositors were Barry Berman, Scott Hale and Arthur Argote.
Can you explain the creation of the helmet animation?
We had a rough idea of what the helmet needed to do from the game developer. We even got a basic animation from them, but it proved to be too simple for the intricate models we had or how close we were going to see it. So our rigger/animator Rick Fronek created individual Maya controls for pretty much every piece of the helmet and the backpack that receives the rear half of it. He took the same approach with the chest box. Then he animated it old style with keyframes.
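By way of illustration only (the actual rig was considerably more involved), giving each piece its own control in Maya might look like the sketch below; the piece names are hypothetical.

```python
# Sketch: one NURBS-circle control per helmet piece, driving the piece
# through a parent constraint. Piece names are hypothetical.
import maya.cmds as cmds

def add_piece_control(piece, radius=1.0):
    """Create a control at the piece's position and constrain it."""
    ctrl = cmds.circle(name=piece + '_ctrl', radius=radius,
                       normal=(0, 1, 0))[0]
    grp = cmds.group(ctrl, name=ctrl + '_grp')
    cmds.matchTransform(grp, piece)  # zero the control at the piece
    cmds.parentConstraint(ctrl, piece, maintainOffset=True)
    return ctrl

for piece in ['helmet_faceplate', 'helmet_jaw_L', 'helmet_jaw_R']:
    add_piece_control(piece)
```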
What was the biggest challenge on this project and how did you achieve it?
Realism. It’s always a challenge to raise the bar on game cinematics, and in this case that was tricky given that we had a relatively short schedule (about 8 weeks total). Also, since we were shooting live action elements for the face, we had to make sure everything else looked as real as possible. So how were we to achieve realism, or try to? It’s about camera work, animation performance and lighting. The animation has to be as natural as possible, the camera work has to feel realistic, we have to do our best to avoid “impossible” shots, and the lighting of the CGI has to reflect natural conditions. The first part is achieved by shooting motion capture and layering great animation over it. The camera work is greatly helped by using our proprietary Virtual Camera setup (vCam), and lastly we have an extremely talented team of artists, in this case led by CG Supervisor Ron Herbst.
Was there a shot or a sequence that prevented you from sleeping?
Sleep? What’s sleep?! Probably I’d say it was the precipice shot, when Isaac Clarke finds the giant Marker. The challenge was not necessarily a technical one but really had to do with the creative for that shot evolving. Originally we had planned on having an overcast look for the entire piece, but towards the end the look migrated toward a more dramatic sky, with a blood-red moon. That threw our lighting and compositing team for a loop initially, but we are quite happy with the results.
What was the team size at Digital Domain?
We had a team of seventeen people including our producer and coordinator.
A big thanks for your time.
// WANT TO KNOW MORE?
– Digital Domain: Dedicated page about DEAD SPACE 3 – TAKE DOWN THE TERROR on Digital Domain website.
– Mothership: Dedicated page about DEAD SPACE 3 – TAKE DOWN THE TERROR on Mothership website.
// DEAD SPACE 3 – TAKE DOWN THE TERROR – MAKING OF
© Vincent Frei – The Art of VFX – 2013