Last year, Matt Kasmir explained to us in detail his work on George Clooney's series Catch-22. He is back today to tell us about his new collaboration with the director, The Midnight Sky.

Chris Lawrence began his career in visual effects some 20 years ago at Framestore. He then worked at Pixar and DNEG before returning to Framestore in 2009. As a VFX Supervisor, he has worked on films such as Edge of Tomorrow, The Martian, Kingsman: The Golden Circle and Christopher Robin. He won an Oscar for Best Achievement in Visual Effects for Gravity.

How was this new collaboration with Director George Clooney?

Matt Kasmir (MK): This is my third film with George now, and he trusts me to translate his direction into VFX.

How did you split the work between the two of you?

MK: I was the Overall VFX Supervisor, rolling on the show with a core team. It was an easy choice to bring on Chris Lawrence because of his experience on Gravity and The Martian. Chris dealt mainly with the zero G work and the digi doubles, while I dealt directly with George as well as with the other vendors: ILM, One of Us and Nviz. I would then be presented with all the shots for a last round of comments before I reviewed them with George.

Chris Lawrence (CL): I think we were quite a complementary team because of our different backgrounds. I was involved with a lot of the decisions surrounding shooting methodologies and testing / R&D around some of the complex problems that had to be solved at the beginning. Later on I was working closely with Framestore Animation Supervisor Max Solomon and VFX Supervisor Graham Page on the zero G characters. I also worked with Shawn Hillier and the Framestore team in Montreal who were doing the arctic environment work to solve some key creative challenges with them. People always say this, but in this case it was true: it was a big collaboration. In the end I was looking after about 450 of the 615 VFX shots in the movie.

How did you work with the art department to design the spaceship and the look of the planet?

MK: Framestore loaned us an Art Director from its Art Department, Jonathan Opgenhaffen. He worked closely with Production Designer Jim Bissell in developing the 'Aether' design. The main service tunnel was designed to look like a reuse of existing space technology, combined with 3D printed crew habitations and control pods.

CL: Yes, it's a cliché, but the Eames quote 'details make the design' really applies here. Jonathan made this fabulously rich model in Blender, rendered it with full global illumination, and did material and lighting studies with it. He also collaborated with the Nviz previs team to support the staging as the sequence blocking evolved. It was a really good example of VFX being involved from the outset and working together to achieve a designer's vision.

Can you elaborate about the creation of the spaceship Aether?

MK: We used Nviz and their virtual camera to help us design the 'Aether'. We would export it from Blender into Unreal to walk around and plan shots, and also to finesse the design and scale, as well as the layout for staging the action.

CL: It went through several stages. First was Jonathan's Blender model and the Unreal engine version used for previs. Then we had to build a finished CG model, ready for monitor playback on set. These were 6K-wide renders: loops up to a minute long with locked cameras but moving lighting and the spinning 'baton' going round. This was given to the graphics folks, Felicity Hickson at Territory, for in-camera playback on the huge monitors on the bridge, where a lot of the action takes place. This process was really helpful, because it locked in a lot of the final design decisions during production when people's minds were focussed. The model itself was finished in Maya under the expert hand of Modelling Lead Michael Balthazar, with a fantastic amount of texturing detail from Gianpietro Fabre's team and look development led by Jason Baker. This was quite a rush to the finish because we needed the Christmas holiday to get the massive amount of rendering through in time for the shoot. Next, the areas that weren't seen in those CCTV views were built out to sustain the same high level of detail. This was done in parallel with extensive optimisation to get the ship to render quickly in Framestore's proprietary renderer Freak. Our CG Supervisor JP Li had his work cut out to get it rendering in a reasonable amount of memory. Finally, we got to smash it up… always the fun part, with the damage this time led by Brent Droog and his FX team!
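
For readers curious about the Blender-to-Unreal handoff described above, here is a minimal sketch of how such an export could be scripted with Blender's Python API; the file path and collection name are purely illustrative, not taken from the production.

```python
# Minimal sketch: exporting a ship model from Blender to FBX for an
# Unreal previs session. The path and collection name are hypothetical.
import bpy

EXPORT_PATH = "/previs/aether/aether_ship.fbx"  # illustrative path

# Select only the objects in the (hypothetical) 'Aether' collection.
bpy.ops.object.select_all(action='DESELECT')
for obj in bpy.data.collections['Aether'].objects:
    obj.select_set(True)

# Export the selection as FBX with settings commonly used for Unreal:
# apply the unit scale and bake mesh modifiers so the geometry matches
# what was seen in the Blender viewport.
bpy.ops.export_scene.fbx(
    filepath=EXPORT_PATH,
    use_selection=True,
    apply_unit_scale=True,
    use_mesh_modifiers=True,
)
```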

Did you receive specific references and influences for the spaceship?

MK: Jim Bissell and George both had quite specific notes, and Jonathan Opgenhaffen really helped fill in a lot of the detail.

CL: As Matt said, there were two design languages. The central axis of the ship referenced contemporary spaceship design and the reuse of existing orbital pieces such as the ISS truss work; the idea was to communicate that it had been put together in a hurry. The living quarters were designed to spin, using centripetal force to provide the crew with 'gravity' on their long journey. The idea behind this part was that it had been designed from scratch using the latest current technology: 3D printing and 'topological optimisation' to create organic, mathematical-looking structures.
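
As an aside, the 'gravity' from a spinning habitat is just centripetal acceleration, a = ω²r. The Aether's actual spin radius is not given in the interview, so the quick calculation below uses a made-up radius purely to illustrate the idea.

```python
# Back-of-envelope check of spin-gravity: what rotation rate gives
# roughly 1 g at a given radius? The 12 m radius is a made-up example,
# not a figure from the film's design.
import math

g = 9.81          # target centripetal acceleration, m/s^2
radius = 12.0     # hypothetical spin radius of the crew ring, m

omega = math.sqrt(g / radius)          # rad/s, from a = omega^2 * r
rpm = omega * 60.0 / (2.0 * math.pi)   # revolutions per minute

print(f"{rpm:.1f} rpm gives ~1 g at r = {radius} m")
# -> about 8.6 rpm for this illustrative radius
```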

How does the light in deep space affect your lighting work?

MK: DP Martin Ruhe and I wanted to treat space as a real place, rather than go for the more traditional high-contrast sci-fi look of many films. Martin did this by using more lights than we needed on stage and exposing down. We researched the effects of radiation on film and digital cameras, then, working closely with Framestore London Supervisor Graham Page, we replicated some of these effects digitally. Rather than wait until the DI conform at Company 3, Martin paid numerous visits to Framestore London to guide us with the lighting. This resulted in the final conform and the Framestore final renders being more or less identical.

CL: The reality is that light in space is incredibly harsh. Martin overcame that by being very careful with exposure, creating cinematic shots that looked beautiful and felt authentic. He collaborated at every stage of the process, initially with the Art Department and previs on the design and staging. The story dictated where the sun would be relative to the ship's journey, so we had to plan carefully, as certain cheats would have been impossible to pull off. He also came into Framestore during post for reviews and briefs, where we were able to look at the latest work on the big screen in 4K. This was a huge treat because the shots looked gorgeous, and as so much careful planning had been done, his notes always guided us in concert with his footage.
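
Matt mentions researching how radiation affects film and digital sensors and replicating some of those artefacts digitally. The interview does not describe how this was done, so the snippet below is only a rough illustration of the general idea: salting random hot pixels and short streaks into a frame with NumPy.

```python
# Rough illustration only: salting random radiation-style hits into a frame.
# This is not Framestore's method, just a simple way to visualise the idea.
import numpy as np

rng = np.random.default_rng(seed=7)

def add_radiation_hits(frame, hits_per_frame=40, max_streak=4):
    """Return a copy of `frame` (H x W x 3 float image) with bright
    single-pixel hits and short horizontal streaks added."""
    out = frame.copy()
    h, w = frame.shape[:2]
    for _ in range(hits_per_frame):
        y = rng.integers(0, h)
        x = rng.integers(0, w)
        length = rng.integers(1, max_streak + 1)   # single hit or short streak
        intensity = rng.uniform(0.5, 1.5)          # over-bright values survive grading
        out[y, x:min(x + length, w)] += intensity
    return out

frame = np.zeros((1080, 1920, 3), dtype=np.float32)
noisy = add_radiation_hits(frame)
```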

How did you destroy some parts of the ships?

MK: This was all digital: FX sims and hero-animated rocks. The destruction was modelled.

CL: Yes, FX simulations were done in Houdini using a combination of rigid bodies and cloth. We tried to respect the construction of the ship and its materials to create contrasts and complexity in the movement. One of the things with zero gravity in a vacuum is that rigid body simulations can naturally look simplistic, so we added greebles and used variable materials to mix it up. The tearing and smashing of geometry started off procedurally, but extra detail was modelled in some areas to make it more compelling.
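
Chris's point about rigid bodies in a vacuum comes straight out of basic mechanics: with no gravity, drag or contact forces, every fragment keeps its linear and angular velocity, so trajectories are dead straight. The toy integration below is a generic illustration of that behaviour, not anything from the show's Houdini setups.

```python
# Toy sketch: free rigid fragments in a vacuum with no gravity or drag.
# Positions advance in straight lines and spins never decay, which is why
# extra detail (greebles, mixed materials, secondary cloth) is needed to
# keep the motion visually interesting.
from dataclasses import dataclass

@dataclass
class Fragment:
    pos: tuple        # (x, y, z) in metres
    vel: tuple        # (vx, vy, vz) in m/s, constant with no forces
    spin: float       # angular speed in rad/s, also constant

def step(frag: Fragment, dt: float) -> Fragment:
    x, y, z = frag.pos
    vx, vy, vz = frag.vel
    return Fragment(
        pos=(x + vx * dt, y + vy * dt, z + vz * dt),
        vel=frag.vel,    # unchanged: no gravity, no drag
        spin=frag.spin,  # unchanged: no torque
    )

frag = Fragment(pos=(0.0, 0.0, 0.0), vel=(0.3, -0.1, 0.05), spin=1.2)
for _ in range(24):          # one second at 24 fps
    frag = step(frag, 1 / 24)
print(frag.pos)              # a perfectly straight-line drift
```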

How did you manage the Zero G aspect for the FX simulations?

MK: We went through many iterations and R&D for the look of the blood in zero G. We carried out many FX sims, then we hero animated the larger blobs to interact with the set or each other.

CL: The look of the blood was closely matched to some fantastic reference from the ISS. There was one video in particular that we studied, of a blob of water with ink injected into it, shot in 4K on a Red camera. It was a challenge to get the surface tension wobbles to feel right, and that had to combine with the shading to look opaque enough, and bloody enough, without looking too red and graphic. Graham Page and Brent Droog spent a lot of time finessing the look. The whole sequence needed very specific choreography, so we used library simulations that we could place, as well as hero per-shot sims for things that interacted with sets or other objects.
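
To get a feel for the wobble timescale Chris describes, the classic Rayleigh result for the fundamental oscillation of a free liquid drop relates frequency to surface tension, density and drop size. The material values and blob radius below are assumptions chosen for illustration, not production figures.

```python
# Back-of-envelope wobble frequency of a free liquid blob (Rayleigh's
# inviscid drop result for the fundamental n = 2 mode). The material
# values and radius here are assumptions for illustration only.
import math

sigma = 0.06    # surface tension, N/m (roughly blood/water-like, assumed)
rho = 1060.0    # density, kg/m^3 (assumed)
radius = 0.02   # blob radius, m (assumed, roughly golf-ball sized)

n = 2  # fundamental oscillation mode
omega = math.sqrt(n * (n - 1) * (n + 2) * sigma / (rho * radius**3))
period = 2 * math.pi / omega

print(f"wobble period ~ {period:.2f} s")
# on the order of a second: slow enough to read clearly on camera
```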

Can you tell us more about the use of Stagecraft on this show?

MK: Jim Bissell was building this beautiful but reflective set, with huge windows onto a snowy tundra. Rather than hang a bluescreen and then have to suck a third of the colour information out, we decided to use ILM's StageCraft. I went to Iceland and shot a five-camera Alexa Mini array of the set location. As luck would have it, the day I shot the footage brought some of the only snow we saw on location. We also surveyed and mapped the area so the array footage could be projected in 3D. We did this for four different times of day, so on set we could choose time of day, weather and general grade. The camera was tracked using motion tracking cameras throughout the set, and this created parallax on the frustum area of the screen the camera was facing. This meant we had approximately 100 VFX shots in camera going into post.

CL: The huge advantage of this approach was that Martin was framing shots for reflections and everyone understood the scope and exposure of the window views. In the event that the views had to be fixed or changed, we had a great starting point with what was captured in camera.
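
As Matt explains, only the portion of the LED wall inside the tracked camera's frustum needs the parallax-corrected view; the rest simply has to read correctly in reflections and ambient light. The sketch below shows that inner-frustum test in generic terms; it is not ILM's StageCraft code, just the underlying geometry.

```python
# Generic sketch of the inner-frustum test used by LED-wall workflows:
# panel points inside the tracked camera's view cone get the
# parallax-corrected render, everything else keeps the static wraparound.
# Illustrative geometry only, not StageCraft's actual implementation.
import math

def in_camera_frustum(point, cam_pos, cam_forward, h_fov_deg):
    """Return True if `point` lies within a simple cone of half-angle
    h_fov_deg around the camera's forward axis (all inputs are 3-tuples
    except the angle)."""
    to_point = tuple(p - c for p, c in zip(point, cam_pos))
    dist = math.sqrt(sum(d * d for d in to_point))
    if dist == 0.0:
        return True
    fwd_len = math.sqrt(sum(f * f for f in cam_forward))
    cos_angle = sum(d * f for d, f in zip(to_point, cam_forward)) / (dist * fwd_len)
    return cos_angle >= math.cos(math.radians(h_fov_deg))

# Example: a panel point straight ahead of the camera is "inner frustum".
print(in_camera_frustum((0.0, 0.0, 5.0), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 30.0))
```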

How did you create the new planet and the devastated version of the Earth?

MK: We did this as a 3D Earth with cloud and destruction sims, then a beauty DMP pass over the top.

CL: Yes, this was a hybrid approach. Jonathan Opgenhaffen initially created some beautiful concept images to establish what the look of Earth should be. This took a few goes to get right because George didn't want what caused the destruction to be too obvious. From there we built out a CG Earth with a 2D fluid sim as a base for wind direction and volumetric cloud simulations to create the correct atmospheric lighting falloff. Framestore Montreal FX Lead Kaki Hudgens and environment artist Alejandro Lavrador collaborated to simulate and place all the clouds. Finally, it went back for final matte painting work, which was largely done static. Framestore's comp lead in Montreal, Laurianne Proud'hon, assembled the whole thing and added 2.5D movement in Nuke to create the swirling masses.
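
The '2.5D movement' Chris mentions is essentially parallax between flat layers placed at different depths: a small camera move offsets nearer layers more than distant ones, which brings a static matte painting to life. The toy example below illustrates the principle only and is not the Nuke setup used on the show.

```python
# Toy 2.5D parallax: flat cloud layers at different depths are offset by
# the camera translation scaled by 1/depth, so nearer layers drift faster
# and a static matte painting reads as a slowly swirling volume.
def layer_offset(camera_dx, layer_depth):
    """Screen-space offset (pixels) for a layer at `layer_depth`,
    given a lateral camera move of `camera_dx` pixels at depth 1.0."""
    return camera_dx / layer_depth

camera_dx = 40.0                      # hypothetical lateral drift over the shot
for depth in (1.0, 2.5, 6.0, 15.0):   # near clouds ... distant cloud deck
    print(f"depth {depth:>4}: offset {layer_offset(camera_dx, depth):6.1f} px")
```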

Can you elaborate about the recreation of the North Pole and its beautiful skies?

MK: George didn't want the plot to be driven by escaping the 'poisoned' skies; he wanted to illustrate that something was wrong while at the same time looking beautiful. So we used the plates wherever possible and added hits of atmosphere and colour where needed.

CL: The Iceland location was having a very warm autumn, so the locations needed rebuilding in CG to enhance the snowscape. This was in addition to the exterior architecture, which was all realised in CG based on Jim Bissell's designs. Lots of subtle simulated atmospheric effects and blowing snow were added to the shots to give life and scale to the harsh environment.

How did you create the blizzard and the ice cracking sequences?

MK: We shot the ice-breaking scene on stage at Shepperton on an SFX gimbal, and we surrounded the stage with a Rosco gel, backlit with sky panels, instead of a bluescreen. Atmosphere and wind were added in post to match levels we had shot in Iceland. All the underwater shots of George were shot clean at Pinewood on the U Stage with ILM Supervisor Malcolm Humphreys, who did a great job adding the Ski-Doo, icebergs and bubbles in CG.

CL: We also built a beautiful photoreal CG wolf for the blizzard scene, but in the end you only really see a silhouette of her stalking Augustine.

The movie is full of graphics elements and holograms. Can you elaborate about them?

MK: We tried to get as many screens in camera as possible; we had a great Art Director, Felicity Hickson, and Compuhire on set, and then in post Chris Lunney at Nviz did a great job filling in the rest. One of Us created all the looks for the holograms and map rooms, creating a magical look for some of the more intimate moments.

Can you explain in detail about their creation and animations?

MK: We built sets that would represent the holograms and projected them back onto themselves. For the interactive map rooms, we used controllable lights as placeholders and eyelines, which meant a fair bit of clean-up/paint-out.

How did you choose your various vendors?

MK: Framestore was an obvious choice, because of the beautiful work they had done on Gravity and The Martian. We used ILM for the sinking pod sequence as they were on board for the StageCraft LED Wall and Anyma. One of Us looked after all the ‘creative/subjective’ aspects such as the various hologram scenes, and the planet K23. We used Nviz for virtual cam, simul cam and graphics.

How did you split the work amongst these vendors?

MK: Framestore London were the obvious choice for the zero G scenes, and Framestore Montreal was a good match for the creatures and environments. ILM did all the StageCraft, Anyma and the pod sinking sequence. One of Us picked up all the subjective scenes, such as the holograms, the monkey computer game, and the look and design of K23.

Can you tell us more about your collaboration with their VFX Supervisors?

MK: I dealt mainly with Chris Lawrence, Graham Page and Max Solomon at Framestore, Shawn Hillier at Framestore Montreal, Oli Cubbage at One of Us and Mark Bakowski at ILM. I had a great relationship with them all, keeping a virtual open-door policy to the VFX cutting room during lockdown, with all of us collaborating together.

CL: We should also give a shout out to Kyle McCulloch who covered some of the shoot while I was finishing another show.

Which sequence or shot was the most challenging?

MK: The digi double work was the most complex as we needed believable CG faces.

CL: The spacewalk, taken as a whole, was probably the most challenging work. Right from planning the shoot to the final comps we had to get everything right or the illusion would be blown. Within that there were things that were especially difficult; the digital faces in particular took an enormous amount of effort from some of the finest craftspeople I have worked with. Two artists went the extra mile: Gabor Foner and Andrea di Martis. We were literally making tweaks of less than half a millimetre on the eyelids to make sure that expressions matched the plates from the facial capture shoot. Our eye rig simulated the asymmetrical squash of the eyeball and created a meniscus of tear fluid over it. Just to get through dailies I had to learn ophthalmological terminology; it was quite the endeavour! Our lookdev lead Stephen MacKershan was very patient with me while I learned all the anatomy.

Did you want to reveal any other invisible effects?

MK: The digi doubles, and the LED screens, which meant we also had invisible VFX shots in camera while we shot.

CL: I always found it quite funny that the pea fight has CG peas.

Is there something specific that gives you some really short nights?

MK: There was so much work going into certain sequences, such as the Anyma, Maya’s death and the look of the planet K23, that George and the studio just had to trust us when we said they were going to look great. The first full look at these sequences was so close to delivery, we had nowhere to hide. But this is where George’s trust was paramount, he accepted this and was incredibly happy with the results.

What is your favorite shot or sequence?

MK: Maya's death scene is my favourite: so many shots in it are fully CG that I even have trouble remembering which ones. I also love the blood; it was a 'ballet' that had to look beautiful, real and new. I think we achieved all of those criteria.

CL: My favourite shot is when they bring Maya into the airlock. This is a 100% CG shot with captured faces but it feels completely real. It's a great testament to the team who worked on it, including Animation Supervisor Max Solomon, because it just worked from the outset: we saw it and went 'yeah!'.

What is your best memory on this show?

MK: I have so many: living in a remote hotel in the middle of nowhere in Iceland, and working with the huge LED walls, which I loved.

CL: My daughter being born during prep 🙂 My second best moment was probably the first time since the outbreak of the pandemic that we were allowed back into the screening room at Framestore. We’d been reviewing everything at home and I really wanted to check that it was going to look good when projected big. I went in on my own and sat in this huge theatre with a 4k projector and watched this sequence that we’d been working on for almost a year and thought “phew! We did it…”.

How long have you worked on this show?

MK: I have been on the show from March 2019 till now.

CL: I first met with Matt and the Netflix folks in June 2019. I was part time to begin with while I finished off another show.

What’s the VFX shots count?

MK: There were 615 VFX shots and about 100 in-camera VFX shots.

What was the size of your team?

MK: 5 foot 8 inches 🙂

CL: Framestore had a crew of 382 people who worked on the show and I was working closely with a core team of 20 or so spread across the London, Montreal and India offices.

What is your next project?

MK: I am now on Eli Roth’s Borderlands, and still consulting with George on his current movie The Tender Bar.

CL: Helping out briefly on a Marvel film. I probably can’t talk about it.

A big thanks for your time.

// Inside The Midnight Sky’s Groundbreaking VFX

WANT TO KNOW MORE?
Netflix: You can now watch The Midnight Sky on Netflix.
Framestore: Dedicated page about The Midnight Sky on the Framestore website.
ILM: Dedicated page about The Midnight Sky on the Industrial Light & Magic website.

© Vincent Frei – The Art of VFX – 2021
