What is your background?
When I was a kid, I loved both drawing and painting and was always interested in nature. As I grew up I watched movies such as STAR WARS, and that kept me pretty busy for a while drawing spaceships and Jedis. Later on, when deciding what to study, my older brother introduced me to a software called Maya and I realized that one could make a living creating imaginary images. My initial idea was to have a job that didn’t involve computers… which clearly didn’t work out… I started at RISE in 2011 as a 3D Artist/Matte Painter and later became CG Supervisor. Marvel’s THE AVENGERS was the first show I worked on as a VFX Supervisor.
How did you get involved on this show?
RISE already has a long working relationship with Marvel, so it is always a pleasure to be part of the next project. Their production team approached us back in January asking for our help with the realization of the post-credits sequence – which, at that point, still had to be filmed.
What was your feeling to be back on the MCU?
I was still wrapping on their other production, BLACK PANTHER, when I began on this show. Having worked on several Marvel shows prior, coming onto this one felt like seeing old friends again – except instead of people, the old friends were the Wakanda plates. It was good to feel back at home!
How was this new collaboration with directors Russo Brothers and VFX Supervisor Dan DeLeeuw?
I first met Dan in person back on the CAPTAIN AMERICA: CIVIL WAR sets in Atlanta and have always enjoyed the opportunities to collaborate with him in the years since. Throughout this production, most of our coordination and dialogue was with his incredible Co-Supervisor Mårten Larsson, who, like Dan, really knows his craft and is a great guy to work with. His comments and feedback were always spot on and guided the production in the right direction. Dan and Mårten were our link to the directors, keeping them up to date as we rushed to the finish line.
How did you organize the work with your VFX Producer?
The VFX Producers on this show were Katrin Arndt and Florian Gellinger. Katrin is relatively new at RISE, but I had worked with Florian many times prior to INFINITY WAR, with him being the VFX Supervisor on previous shows. We are on the same wavelength when it comes to our sense of humor, so everything went pretty smoothly.
What are the sequences made by RISE?
RISE did the TAG sequence, a Central Park sequence at the movie’s opening, the Wakanda sequence where Bucky gets his new arm, some interior shots of the Quinjet and an establishing shot of the planet Vormir.
How did you create the Central Park environment?
The park creation itself was relatively simple from a CG point of view. The plates were shot in a park in Atlanta, which meant we mostly needed to add a background extension. For one part of the shots we recreated one of the reservoirs in Central Park with a CG shoreline and a far-distance DMP. The other shots then needed to look in the direction of 5th Avenue. Here we exchanged the Atlanta building skyline for the New York buildings in the vicinity of the park, done as a projected DMP.
For our compositing team, the Central Park sequence was one of the most challenging. This is mostly because the opening shot is nearly 800 frames long, beginning high up in the trees, looking through various branches and leaves towards the Central Park Lake extension. The camera then lowers down, revealing Pepper and Tony Stark in the middle of a conversation – all the while the set extension shifts in and out of focus behind the moving trees. Compositing veteran and Supervisor Oliver Hohn was always up to the task and came up with some pretty great solutions to deal with the extensive roto and patch work. In the end we replaced some parts of the plate with CG vegetation simply because it was faster than keying or rotoing every leaf.
Can you tell us more about the CG trees creation?
Our vegetation/tree pipeline at RISE is centered around using SpeedTree for builds and Houdini for the population of trees. On this particular project the 3D team had to create a lot of vegetation for different purposes. The biggest batch was probably the forest hills we had to build for the Wakanda environment. There were also the trees and bushes for the Central Park sequence, some of which had to match Lidar scans and on-set data to allow for their in-plate replacement. CG Supervisor Matthias Winkler led the team responsible for doing that and meticulously matched these plants to the reference until they fit.
The Wakandan trees for the second environment were matched to reference shots we got from ILM’s showdown sequence and were built as digital assets for our environment. The NYC Central Park trees obviously had to match the vegetation of the location.
How did you enhance the portal effects you have created for Doctor Strange?
With a portal effect from DOCTOR STRANGE already in place, we were able to hit the ground running on this task. The enhancements were partly a rebuild of our FX spark system in how it handles subframe motion blur. The addition of some more subtle turbulent forces and longer lifespans for parts of the particle simulations made the effect work better in these shots.
How did you create the Wakanda environments?
The Wakanda sequence was filmed on a grassy hill in the Atlanta area. We got a lot of data from other vendors on the BLACK PANTHER production, including parts of the Golden City done by ILM. This data included aerial photogrammetry geometry that we later used as a base for the environment layout. Building on the Lidar geometry of the actual hill, we could then readily map out the environment to cover 360 degrees of camera movement inside the set. We did some still-frame renders together with some Photoshop overpaint to settle on a look that both matched our overcast Atlanta plate and fit the story of being somewhere in a warmer sub-Saharan climate.
After getting a buy-off on the overall layout and lighting of this environment, we populated the whole terrain with thousands of trees using our in-house scattering pipeline. By defining multiple areas that could be worked on simultaneously, we could split up the work easily among several artists. One of our CG Supervisors, Andreas Giesen, took care of that, and within a very short amount of time we had a full 3D environment ready to be added into every shot.
Bucky is getting a brand new cool arm. Can you tell us more about it?
I would love to tell more about that in detail, but the fact is that we got this as an already developed and lookdev’d asset from ILM that we just needed to transfer and ingest into our Mantra-based rendering pipeline.
How did you manage the challenge of his reflective aspect?
As most of today’s PBR rendering pipelines are somewhat comparable, this aspect didn’t really cause too much trouble on our side. We closely matched our shading to the lookdev done by ILM. It is then just a matter of matching all parameters under the corresponding lighting conditions. The only change we really made to this arm was to make it look a little less dirty and scratched, to emphasize that the arm is brand new.
The post-credits scene has really long shots. How does that affect your work?
This indeed is true… those longer shots do require a bit more planning beforehand. The two post-credits shots together total around 1750 frames, and the first hurdle we had to tackle was to get the matchmove done as accurately and as fast as possible. We had an initial previz of that shot and we knew which effects happened at which point in time. However, from past experience I knew that the foundation of every successful VFX shot is a solid matchmove, because you never know what may happen. So we split up the matchmove among three artists in the 2D tracking phase, which was later recombined to get the final version.
With the previz, we broke this long shot apart into 5 subshots, each containing one major effect, to be able to split it up in compositing. We made sure to have enough overlap between those parts to compensate for timing changes.
Can you explain in detail about the design and the creation of the vaporizing effects?
We had done a somewhat similar effect in a previous commercial project, so we dug up what we already had and started development in-house. This look development was spearheaded by FX Supervisor Korbinian Hopfner, who did a phenomenal job, and not only on this particular effect. We knew that for this effect to work, the base would need a pretty good rotomation that not only matched the silhouette pixel-perfectly but also matched things like the cloth folding and facial expressions. We ramped up our animation pipeline to be able to handle shot-sculpting after rotomatching was done. For the initial look development we received digital assets of Cobie Smulders and Samuel L. Jackson from WINTER SOLDIER, to which we applied a standard rig and some generic motion capture data to start the effect development.
Pretty early on in the project we received lookdev from Weta Digital showing a ‘blip’, as it was called in its early stages. After a cineSync with Dan, we were asked to take this as a base look and start designing around it. At that point the effect consisted of brown, sand-colored flakes and some particle and volume passes. Since the impression of this blip effect needed to seem painless to those dissipating, all of the forces used to drive it were gentle, as if interacting with a subtle breeze. We then incorporated some string-like inner structures we called the “soul”, on which we instanced tiny point lights, giving us nice shadowing effects when later combined with an inner volume/sand layer that the characters’ interiors were made of. We experimented with a spectrum of colors, some resembling those of the Infinity Stones, which turned out quite nice. The outside shell of the crumbling characters was a mixture of RBD and FEM simulations. We also ran some tests where the flakes instead began drying out and deforming prior to shedding off the body.
As production progressed and the first rough rotomations from the animation department came in, we could start doing the lookdev in shot context – which proved very helpful, as a lot of the dynamics rely heavily on where the specific character is placed within the frame. The next part we incorporated was having the flakes inherit the surface color of the character for a few frames before turning to the intended sand-colored hue. There was a constant mix and match between what we had done and additional blip FX that was simultaneously being developed at Weta. In the end, a lot of the additional elements, like the inner soul element or the secondary dust and sand, were discarded in favor of the story’s intended direction of the decomposition looking painless and less cruel. What remained instead was mostly the outer shell layer with additional detail elements scattered on top. These layers consisted of several stages of flakes, each behaving differently, to give full control over the timing and appearance of the blip effect.
How did you handle the animation of this effect?
The animation of this effect was driven primarily by influence objects set to activate the different stages of the blip. This was done in the character’s T-pose and then transferred back to our animated character. Everything was broken up with noise to appear more natural. In the end, the blip of Nick Fury in particular proved to be a little more complicated when staged and timed to the camera’s movement, especially on body parts that should remain connected, to avoid appendages magically floating in the air.
Which sequence or shot was the most complicated to create and why?
Each sequence had its own set of challenges, but if I had to choose one it would probably be the TAG sequence. The sheer length and the complexity of the different effects alone were quite a task. For one thing, at a later point in the process the decision was made not to move forward with the filmed plate of Nick Fury – instead we would do a camera takeover and switch to a full CG shot. This included a full CG arm crumbling away in close-up together with a full CG environment. There were also the CG close-up pavement and the all-CG pager – which reveals the illuminated Captain Marvel logo at the end. Sequence Comp Lead Erik Schneider did a great job staying on top of all these different elements and delivering a successful shot even when time started to run out.
Is there something specific that gives you some really short nights?
Nothing too specific here, I think. Just the sheer number of different elements to keep track of, and the time that is always against you, gave me some short nights here and there.
What is your favorite shot or sequence?
The TAG sequence, I would say, but the Wakanda sequence also had some nice-looking shots.
What is your best memory on this show?
The best memory I have from this show is how well everyone and everything within our team performed, considering the enormous pressure we faced in delivering these shots on a very tight deadline. Also, the support of all my supervisors on this show – it being my first gig as VFX Supervisor – is something I will never forget. I can only say thank you very much!
How long have you worked on this show?
A little more than 2 months.
What’s the VFX shots count?
In the end we delivered 26 shots.
What was the size of your team?
We peaked at around 30 artists working on the show.
What is your next project?
While I can’t yet reveal the project’s title – more news about it should pop up later this year.
A big thanks for your time.
// WANT TO KNOW MORE?
RISE: Official website of RISE.
© Vincent Frei – The Art of VFX – 2018