Paul Story began his career in visual effects in 1996 at Weta Digital. He has worked on many films, including THE LORD OF THE RINGS trilogy, AVATAR, DAWN OF THE PLANET OF THE APES and THE JUNGLE BOOK.

How did it feel to be working in the MCU?
This was my first time working in the MCU. I was fortunate we have a lot of fans working at Weta Digital, so I got a crash course on all the characters’ histories, which helped me to prep for the sequences we were involved with. It’s great to see how passionate people are about the Marvel Universe, and seeing that translate to their contributions to the film!

How was the collaboration with the Russo Brothers and VFX Supervisor Dan DeLeeuw?
The majority of our collaboration was with Dan DeLeeuw. He was great to work with for animation and always cheerful during our weekly cineSync calls where we talked through detailed notes to gain a better understanding of what the Russo Brothers were looking for. Once we ‘found’ the characters of Thanos, Iron Man & Spider-Man, we were able to have some fun exploring options for their more dynamic shots.

Which sequences did Weta Digital work on?
Our sequences were set on Thanos’ home planet, Titan. The ruined state of Titan helps tell the story of why Thanos is set on his path of destruction. In these sequences, the Guardians, Iron Man, Spider-Man and Dr Strange take on Thanos to try and stop him from wiping out half the universe.

Can you explain in detail the design and creation of Thanos?
Thanos’ design incorporated a lot more of Josh Brolin’s facial features in this film. We were mindful that Digital Domain were also working on Thanos, so both companies would often check in to make sure we had our characters in sync, even though our facial pipelines are quite different.

For Thanos’ bodybuilder design, we had to make sure someone so sculpted was capable of the quick, powerful performances needed for our sequences. This involved a full range-of-motion set of poses to stress-test the detailed muscle and tissue rigging, which ultimately informed both animation and the rigging team of the desired ranges and what was achievable. With the muscle detail so visible on Thanos, we needed ways to show his muscles firing from an already inflated state. So specific shapes were created for animation to control, along with finer details like veins enlarging, to give us the perceivable change in shape needed.
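
// To illustrate the idea of a muscle firing from an already inflated state, here is a minimal, hypothetical sketch (not Weta’s rig): an animator-driven activation blends in a sculpted flex shape, while a painted vein mask pushes vertices out along their normals to enlarge the veins.

```python
import numpy as np

def fire_muscle(base, flex_shape, normals, vein_mask, activation):
    """Blend a sculpted 'flex' shape over the base pose, then swell the
    veins by pushing masked vertices outward along their normals.

    base, flex_shape : (N, 3) vertex positions
    normals          : (N, 3) unit vertex normals
    vein_mask        : (N,) painted weights, 1.0 at vein centres
    activation       : 0..1 animator control for how hard the muscle fires
    """
    VEIN_SWELL = 0.002  # hypothetical vein relief (metres) at full activation
    flexed = base + activation * (flex_shape - base)
    return flexed + (activation * VEIN_SWELL) * vein_mask[:, None] * normals

# Toy data: four vertices, normals pointing up, one vein vertex.
base = np.zeros((4, 3))
flex = base + np.array([0.0, 0.01, 0.0])   # the flexed shape bulges upward
normals = np.tile([0.0, 1.0, 0.0], (4, 1))
vein = np.array([0.0, 1.0, 0.0, 0.0])
print(fire_muscle(base, flex, normals, vein, activation=0.8))
```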

How did you create and rig his face?
Once the base model for Thanos had final buy-off from the Russo Brothers and Marvel, we built Weta Digital facial puppets of both Josh Brolin and Thanos. These facial rigs are created with hundreds of muscle-based blend shape controls for the animators. Each controller works as though we are firing a specific muscle (or group of muscles) in the face. The shapes are derived from a FACS session with Josh, in which we capture the many face shapes needed to make a comprehensive facial puppet. Many combination shapes are reworked alongside animation tests until we have a solid puppet that keeps all the correct muscle movements intact, while giving the animators as much control as they need to sell a great performance.
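
// As a rough illustration of how muscle-based blend shape controls combine (a sketch, not Weta’s facial puppet), each control adds a sculpted offset to the neutral face, and corrective combination shapes fire when pairs of controls are used together:

```python
import numpy as np

def evaluate_puppet(neutral, deltas, weights, combos=None):
    """Evaluate a simple muscle-based blendshape face.

    neutral : (N, 3) neutral mesh
    deltas  : dict shape_name -> (N, 3) offset from neutral at full weight
    weights : dict shape_name -> 0..1 control value (a 'muscle firing')
    combos  : dict (name_a, name_b) -> (N, 3) corrective offset, driven by
              the product of the two weights so the pair reads correctly
    """
    face = neutral.copy()
    for name, w in weights.items():
        face = face + w * deltas[name]
    for (a, b), delta in (combos or {}).items():
        face = face + weights.get(a, 0.0) * weights.get(b, 0.0) * delta
    return face

neutral = np.zeros((4, 3))
deltas = {"jaw_open": np.tile([0.0, -0.02, 0.0], (4, 1)),
          "lip_corner_pull": np.tile([0.01, 0.005, 0.0], (4, 1))}
combos = {("jaw_open", "lip_corner_pull"): np.tile([0.0, 0.003, 0.0], (4, 1))}
print(evaluate_puppet(neutral, deltas,
                      {"jaw_open": 1.0, "lip_corner_pull": 0.5}, combos))
```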

Can you tell us more about the face animation?
The process for facial animation starts with our actor puppet of Josh Brolin. Tracked data from the face cameras is used to drive Weta Digital’s facial solver, FACETS, with the guidance and input of an animator, to make sure the correct levels of muscles are firing over one another. FACETS outputs animation curves to drive the muscle-based blend shapes in the facial puppet. These curves are then edited carefully by the animator, while referencing all the footage we have of Josh, to make sure we have a one-to-one match of Josh’s performance.
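
// FACETS itself is proprietary, but the core idea of solving muscle-shape weights from tracked markers can be sketched as a non-negative least-squares fit (a hypothetical illustration, not Weta’s solver):

```python
import numpy as np
from scipy.optimize import nnls

def solve_frame(neutral_markers, shape_marker_deltas, tracked_markers):
    """Find non-negative shape weights so the puppet's markers best
    match one frame of tracked face-camera data.

    neutral_markers     : (M, 3) marker positions on the neutral puppet
    shape_marker_deltas : (S, M, 3) marker offsets per shape at weight 1
    tracked_markers     : (M, 3) tracked marker positions for this frame
    """
    S, M, _ = shape_marker_deltas.shape
    A = shape_marker_deltas.reshape(S, M * 3).T       # (3M, S) basis matrix
    b = (tracked_markers - neutral_markers).ravel()   # observed offsets
    weights, _residual = nnls(A, b)                   # weights >= 0, like muscle firings
    return weights

neutral = np.zeros((2, 3))
shapes = np.array([[[0.0, -0.02, 0.0], [0.0, 0.0, 0.0]],
                   [[0.01, 0.0, 0.0], [0.01, 0.0, 0.0]]])   # (S=2, M=2, 3)
tracked = np.array([[0.005, -0.01, 0.0], [0.005, 0.0, 0.0]])
print(solve_frame(neutral, shapes, tracked))   # ~[0.5, 0.5]
```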

Both the Josh and Thanos puppets are worked on at the same time to make sure the muscle-based blend shapes work as closely together as they can, so that if the same muscles are fired on both you would get the same read on both puppets, even though the facial shapes and facial structures are quite different between the two. This means that when we transfer our Josh animation to the Thanos puppet, we know he is giving us the same performance read as how Josh performed it. This animation is then fine-tuned once more to make sure we have the correct lip detailing, with eye animation passes to really sell the realism! Having these two fully developed puppets driven from the inside out means they are both contained systems, so the final motion will always sit within what is possible for that character. Believability is a human construct, not a scientific one.
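
// The transfer step in miniature (a hedged sketch, not the production pipeline): because both puppets share the same muscle controls, the same weight curves drive each puppet’s own sculpted deltas, and only the face shapes differ:

```python
import numpy as np

def apply_weights(neutral, deltas, weights):
    """Drive a puppet: weights are shared per-muscle firings,
    deltas are this puppet's own (S, N, 3) sculpted shapes."""
    return neutral + np.tensordot(weights, deltas, axes=1)

weights = np.array([0.7, 0.2])   # e.g. jaw_open, lip_corner_pull at one frame
josh   = apply_weights(np.zeros((3, 3)), np.random.rand(2, 3, 3), weights)
thanos = apply_weights(np.ones((3, 3)),  np.random.rand(2, 3, 3), weights)
# Same performance read on both faces, even though the shapes differ.
```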

How does his purple skin tone affect your lighting work?
// CG Supervisor Sean Walker: Thanos’ purple skin was a challenge in our Titan environment. Purple skin in an orange/yellow environment desaturates to grey very quickly, so we had to find the right balance in light temperatures to retain his characteristic purple. The lighting on set was also very orange, which made any shot where he interacted with our actors especially challenging. We were acutely aware that too much deviation from the set lighting broke Thanos’ connection to the plate. Even though this was an alien planet, we still found it important to always start our lighting based in reality. We were able to do this using our Physlight system. This system takes into account all the settings and attributes of the camera used to shoot the plates and contains all the physical attributes of the lights used on set. This allows us to recreate the set lighting with extreme confidence. Knowing how Thanos would look on set allowed us to then shift colour temperatures around so that he would sit in the plate, feel like he was on another planet, AND also retain his characteristic purple colour.
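
// The flavour of a physically based camera model can be shown with the standard photographic exposure relationship (a simplified sketch; Physlight models far more, and the calibration constant here is hypothetical):

```python
def exposure_scale(f_number, shutter_s, iso, calibration=0.65):
    """Approximate the linear scale a camera applies to scene luminance:
    longer shutter and higher ISO brighten, a wider f-number darkens."""
    return (shutter_s * iso / 100.0) / (f_number ** 2) * calibration

# A CG light defined in physical units can then land in the render with
# the same brightness it had through the plate camera's settings.
print(exposure_scale(f_number=2.8, shutter_s=1 / 48, iso=800))
```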

Can you tell us more about your work on his eyes?
Animating the eyes on any character is key to selling a ‘living’ performance, and it’s an area where Weta Digital is careful to pay a lot of attention. So much of the emotion is captured in the eyes and the surrounding area, and it’s something the audience picks up on quickly if not done correctly. The animators strive for every eye nuance, every blink or half blink and every twitch in the surrounding muscles, to give as close a match to the original performance as possible.

Iron Man uses a new technology. Can you explain in detail this nano-particle armor and its weapons?
// FX Supervisors Ashraf Ghoniem and Gerardo Aguilera: Each weapon manifestation started from a mechanical animated transition that the modelling and animation departments worked up to get general timing. We then used that to generate many layers of particles and geometry to build up the weapon. We took a step back and tried to think through, if this was a real technology, how it would actually look and what we would expect it to do. We eventually came to the idea of creating a per-nano-particle animation of how it would build itself and replicating that across the whole weapon. This allowed us to retain the physicality of the effect and to ground it in realism. By taking all these per-particle animations that change states from liquid to solid and eventually grow into the final shapes, we were able to create the full armour.

Another aspect of this is that the client wanted a liquid metal feel, which we achieved by surfacing our particles like water as they transitioned into the next state. So it was not only a state change but a layered effect, which added a significant amount of visual complexity. The key to our work was to feel the transition come in across a band of the surface to help the eye understand that this is a complex mechanical thing that is being built up by a seemingly organic feeling material. Another detail that helped us retain realism was building an interior structure for the suit as if these weapons and armour were manufactured and included in the suit build. Adding this into our sims created further visual richness and made the concept of the suit feel very real.
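
// A minimal sketch of the build-up idea (hypothetical, not the production setup): particles change state as an advancing front passes, with a liquid transition band between “not yet built” and “hardened”:

```python
import numpy as np

def particle_states(positions, front_pos, band_width, axis=0):
    """Classify nano-particles against a build front moving along one axis.

    Returns 0 = not yet built, 1 = liquid transition band, 2 = solid suit.
    """
    behind = front_pos - positions[:, axis]   # distance behind the front
    states = np.zeros(len(positions), dtype=int)
    states[behind >= 0] = 1                   # inside the liquid band
    states[behind >= band_width] = 2          # fully hardened
    return states

pts = np.random.rand(10, 3)
print(particle_states(pts, front_pos=0.5, band_width=0.2))
```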

How did you manage Thanos’ facial animation work?
We had a small team of animators who worked specifically on the facial animation. This had to tie in to the body motion, so we had to be mindful of both body and facial animation working together to sell a complete performance.

With that in mind, we also had to add specific controls to Thanos’ neck. Animators could fire muscles for certain head movements, helping with overall expressiveness as well. The neck is usually the area where we transition from a creature rig to a facial rig, but for Thanos it was decided that this extra level of control was needed for animation, so we ended up making these controls part of the facial rig.

How did you handle the lighting work for those two challenging reflective armors?
// CG Supervisor Sean Walker: The reflective armours weren’t as much of a problem as we initially thought. We made the decision early on to create a fully renderable environment, including the set that was shot. Using our in-house path tracer, Manuka, we were able to naturally reflect the environment and Titan lighting. While this simplified our all-CG shots, any shot that included our actors involved a little bit of work in “hiding” the on-set lights. While these would be naturally reflected on set, we replaced them in our HDRI IBLs with naturally bright parts of the environment like clouds and moons. Titan in this story has two suns, so we also used that to our advantage, allowing us to keep continuity between the plate lighting and reflections and our fully CG environment.
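
// The light-hiding step can be pictured as a masked merge of two lat-long HDR images (a toy sketch of the idea, not Weta’s tools): painted lamp regions of the set IBL are replaced with the matching region of the CG environment:

```python
import numpy as np

def hide_set_lights(set_hdri, light_mask, env_hdri):
    """Blend painted lamp regions of the set IBL over to the CG sky.

    set_hdri, env_hdri : (H, W, 3) lat-long HDR images
    light_mask         : (H, W) 1.0 where an on-set lamp was painted out
    """
    m = light_mask[..., None]
    return set_hdri * (1.0 - m) + env_hdri * m

H, W = 8, 16
mask = np.zeros((H, W))
mask[2:4, 5:9] = 1.0                                  # a rotoed studio lamp
patched = hide_set_lights(np.full((H, W, 3), 5.0),    # hot set lighting
                          mask,
                          np.full((H, W, 3), 0.5))    # Titan sky render
```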

The fight involved a lot of full CG characters. Can you tell us more about their choreography?
We were lucky that Marvel provided us with a great base to work with, in the form of some well-thought-out previs. As we explored the motion in more detail, coming up with ideas on how to push it further, Marvel also supplied us with some great performances from Thanos’ stunt double, Greg Rementer. Not only is his name perfect for this type of movie, but his actions were quick and powerful, which we used as one-to-one reference for some shots and as a guide for our animators in others. Animators were also encouraged to use our reference room, where they can record their own inspired actions to help guide them through both subtle and dynamic actions.

During the fight, Thanos destroys a moon to launch its debris at the heroes. How did you create these large-scale FX elements?
// FX Supervisors Ashraf Ghoniem and Gerardo Aguilera: The close-up of the moon as it is destroyed by Thanos was achieved by a combination of techniques: rigid bodies, instanced geometry, particles and volumetrics. Our goals were to add as much visual complexity as possible and to help preserve the scale of this massive event. We approached the fracturing by individual material and organized it in layers.

Another key component was to choreograph the magical effects with the destruction. This was achieved by simulating in layers and passing data back and forth for each artist to use to enable interaction. The underlying rigid body sim was driven by hand-animated curves, allowing us to quickly iterate on the timing and performance of the event, and providing the backbone of all the effects, from the rigid bodies all the way to the final energy effects. In the end we had a large number of layers and passes, keeping the setup as modular as possible so we could quickly change aspects of the effect.
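
// The curve-driven backbone might look like this in miniature (a hypothetical sketch): each debris chunk is given a target position sampled from a hand-animated guide curve, and the rigid body solve is then constrained toward those targets:

```python
import numpy as np

def sample_curve(curve_pts, u):
    """Linearly sample a polyline guide curve at parameter u in [0, 1]."""
    t = u * (len(curve_pts) - 1)
    i = int(np.clip(np.floor(t), 0, len(curve_pts) - 2))
    return curve_pts[i] + (t - i) * (curve_pts[i + 1] - curve_pts[i])

def chunk_targets(curve_pts, offsets, frame, fps=24.0, speed=0.1):
    """Per-frame target positions: every chunk rides the animated curve
    with its own lateral offset, keeping the timing art-directable."""
    u = min(speed * frame / fps, 1.0)
    return np.array([sample_curve(curve_pts, u) + o for o in offsets])

curve = np.array([[0.0, 0, 0], [1, 2, 0], [3, 3, 0]])
offsets = np.random.rand(5, 3) * 0.2
print(chunk_targets(curve, offsets, frame=12))
```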

The falling meteors in the surrounding shots were done in collaboration with Animation, who handled the position and speed of the meteors so that motion and shot composition could be controlled independently of FX. In FX we created procedural workflows in Houdini to ingest the meteor animation and create particle trails off them. These particles then became the sources for simulations in our in-house distributed sparse volumetric solver, part of our simulation framework called Synapse.
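
// Trail sourcing in its simplest form (an illustrative sketch, not the production Houdini network): each frame, particles are seeded along the segment the meteor swept through, then jittered; those points feed the smoke sim:

```python
import numpy as np

def emit_trail(prev_pos, cur_pos, count=50, jitter=0.05, rng=None):
    """Seed smoke-source particles along a meteor's path for one frame."""
    rng = rng or np.random.default_rng()
    t = rng.random((count, 1))
    pts = prev_pos + t * (cur_pos - prev_pos)   # spread along the sweep
    return pts + rng.normal(scale=jitter, size=(count, 3))

p0 = np.array([0.0, 100.0, 0.0])
p1 = np.array([5.0, 92.0, 1.0])
sources = emit_trail(p0, p1)
```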

Because of the massive nature of the sims in terms of distance travelled, speed and detail needed, Synapse was ideal because we could distribute the sims over multiple machines. This also gave us the freedom to simulate the meteors together with very long smoke trails at very high voxel resolution, with the trails interacting and influencing each other when travelling close to one another and to the camera.
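
// Synapse is in-house, but the distribution idea can be sketched as slab decomposition with halo padding (a hypothetical illustration): each machine owns a slab of the voxel grid plus a few read-only ghost voxels so smoke can advect across slab boundaries between exchanges:

```python
import numpy as np

def split_with_halo(grid, n_parts, halo=2, axis=2):
    """Cut a voxel grid into slabs, each padded with halo voxels it
    reads but does not own."""
    size = grid.shape[axis]
    bounds = np.linspace(0, size, n_parts + 1).astype(int)
    slabs = []
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        a, b = max(lo - halo, 0), min(hi + halo, size)
        slabs.append(np.take(grid, np.arange(a, b), axis=axis))
    return slabs

slabs = split_with_halo(np.zeros((8, 8, 64)), n_parts=4)
print([s.shape for s in slabs])   # padded slabs along the split axis
```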

How did you create the vaporize effect?
// FX Supervisors Ashraf Ghoniem and Gerardo Aguilera: The blip effect was a unique and highly designed effect developed by our art department here at Weta, then further developed and refined with our FX Lookdev team in conjunction with Marvel. After a lot of back and forth we homed in on a specific concept: a mixture of the supernatural and the very naturalistic look of wood burning and turning to ash.

The challenge was to not have it feel like a simple vaporization effect, but to give it a unique character and complexity that communicate an other-worldly feel. The fundamental work was done within Houdini with a variety of particle and volumetric simulations. To create the organic nature of the effect we relied on growth algorithms, allowing the transition to propagate across the body in a more naturalistic way. It was also important that the “blip” effects did not appear hollow, so we built in layers of different particle distribution densities.
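
// One common way to get that propagation (a sketch of a generic growth algorithm, not Weta’s setup) is to assign each vertex a start time from its graph distance to a seed point, so the dissolve crawls across the body instead of wiping through it:

```python
import heapq
import numpy as np

def growth_times(positions, edges, seed):
    """Dijkstra over the mesh graph: each vertex's 'blip' start time is
    its (approximate) geodesic distance from the seed vertex."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, []).append(b)
        adj.setdefault(b, []).append(a)
    times = {v: float("inf") for v in range(len(positions))}
    times[seed] = 0.0
    heap = [(0.0, seed)]
    while heap:
        t, v = heapq.heappop(heap)
        if t > times[v]:
            continue
        for n in adj.get(v, []):
            nt = t + np.linalg.norm(positions[v] - positions[n])
            if nt < times[n]:
                times[n] = nt
                heapq.heappush(heap, (nt, n))
    return times

pos = np.array([[0.0, 0, 0], [1, 0, 0], [2, 0, 0], [2, 1, 0]])
print(growth_times(pos, [(0, 1), (1, 2), (2, 3)], seed=0))
```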

We used PBD solves to give all the flakes interconnectivity and flex, incorporating the motion complexity required to execute the effect in close-ups. A number of key characters blip out during long in-camera character performances. With the high fidelity of our digital doubles, we were able to create very complex and highly detailed simulations that got us to a point where the transition could be as seamless as possible.
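
// A bare-bones position-based dynamics step shows the flavour of the flake solve (an illustrative sketch under simple assumptions, not the production solver): integrate, then repeatedly project pairwise distance constraints so neighbouring flakes stay connected and flex together:

```python
import numpy as np

def pbd_step(pos, prev, pairs, rest_len, dt=1 / 24, iters=8, gravity=-0.5):
    """One PBD step for connected ash flakes (Verlet + distance constraints)."""
    vel = (pos - prev) / dt
    vel[:, 1] += gravity * dt
    prev, pos = pos, pos + vel * dt
    for _ in range(iters):
        for (i, j), r in zip(pairs, rest_len):
            d = pos[j] - pos[i]
            n = np.linalg.norm(d)
            if n < 1e-9:
                continue
            corr = 0.5 * (n - r) / n * d   # split the correction between both flakes
            pos[i] += corr
            pos[j] -= corr
    return pos, prev

pos = np.array([[0.0, 0, 0], [0.1, 0, 0], [0.2, 0, 0]])
prev = pos.copy()
for _ in range(3):
    pos, prev = pbd_step(pos, prev, [(0, 1), (1, 2)], [0.1, 0.1])
print(pos)
```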

Was there something specific that gave you some really short nights?
Early on in the show, it was working out what the directors and Marvel wanted: finding the character they were after and incorporating that into the direction we gave to the animators, making sure the facial rig was up to scratch, and making sure we were capturing every aspect of Josh’s performances in Thanos’ performances.

What is your favorite shot or sequence?
I loved working on the sequence of Thanos’ arrival on Titan. It involved a lot of facial animation challenges, which I really enjoyed.

It’s not often we get to work on a villain, let alone one with such an emotional range across our sequences – it made this uniquely challenging, but also made Thanos a really enjoyable character for the team to work on. The range from rage to sympathy, and even a trance-like state when Mantis takes control of him, meant the animators could use a larger set of controls to help portray Josh’s great performances.

What is your best memory on this show?
Working with a great team of enthusiastic animators and supervisors. I learnt a lot from Sidney Kombo-Kintombo, who was my co-animation supervisor. He inspired a lot of animators to push their work to the next level. Many animators were really excited to be working on some of their long-time favourite movie heroes. Seeing them relish finally getting the chance was pretty cool!

How long have you worked on this show?
10 months.

What is the VFX shot count?
398.

What was the size of your team?
The animation team was 30 strong, and over 600 Weta Digital crew worked on the show altogether.

What is your next project?
I’m now working on an unannounced project. It will be an exciting challenge doing something that Weta does best, but that’s as much as I can say for now.

A big thanks for your time.

// WANT TO KNOW MORE?

Weta Digital: Dedicated page about AVENGERS: INFINITY WAR on the Weta Digital website.





© Vincent Frei – The Art of VFX – 2018
