In 2015, Sheldon Stopsack told us about MPC's work on TERMINATOR GENISYS, and he then went on to work on PIRATES OF THE CARIBBEAN: DEAD MEN TELL NO TALES. Today, he talks about his first collaboration with director Michael Bay on TRANSFORMERS: THE LAST KNIGHT.
How did you and MPC get involved on this show?
I first met Michael Bay in early 2016 while he was in the U.K. scouting for the shoot. He visited MPC to discuss the idea of getting us involved in the show. At the time I was heavily engaged in the development of our ocean and water toolkit. Since Michael had some sequences planned that required water work, I was lucky enough to pitch him the idea of bringing us on board. This was the first time MPC worked on a TRANSFORMERS movie, and our first time working with Michael.
What was your feeling to be in the Transformers universe?
Very excited of course. The TRANSFORMERS universe is known for its visual richness and incredible scale. From a VFX point of view it has massive appeal as it’s guaranteed that you will be required to deliver your best and push the boundaries. But it’s also one of those opportunities that don’t come around very often. And I am very grateful that I was able to contribute to the franchise.
How was the collaboration with director Michael Bay?
We were fortunate to work directly with Michael Bay on the show. That obviously had the benefit of keeping feedback loops short and direct. At the same time it was a great experience to see Michael work first-hand. Unlike some directors, Michael is really hands-on and is involved in every step of the process. That can go as far as him stopping at an artist's desk and creating the camera move he wants right there on the spot. It's quite refreshing and certainly a unique experience for the artists on our team.
The flip side is that you get to a point in the production where Michael's time is really precious and you don't get to review the work as frequently as you might like. At that point you are left with a lot of trust to get on with the job at hand. It ended up being a fairly unconventional way of working where we didn't necessarily take the formal approval steps for every single shot. While it was a great new experience for me, it was also a challenge for production to track and schedule things accordingly.
What was his approach and expectations about the visual effects?
Needless to say, expectations were high. Michael is a very visual director and has an incredible eye. I remember showing Michael a pitch reel during our initial meeting with him. We played it through once and he instantly commented on specific details that many wouldn't notice. From there it was pretty clear that there would be no hiding. However, what is nice about working with Michael is that almost everything is judged in context. If something works in the edit, you have a good chance that it works for the movie. We showed almost all of our work in cut context. A massive help was the use of the Cisco video conferencing system. It basically allowed me to share my desktop with Michael and vice versa. The benefit was that I had the whole of MPC at hand and was able to show anything on the fly. For example, when Michael had a specific animation note for a shot, we had often already tried what he was suggesting. With the Cisco system and MPC's in-house "reviewTool" I was able to quickly pull things up in context and keep the ball rolling. It was a very dynamic and fast way of reviewing the work. That all being said, while Michael knows exactly what he wants, he was also very open to any suggestions and ideas we brought to the table. All in all, working on TRANSFORMERS was a really enjoyable experience.
What are the sequences made by MPC?
MPC worked on the underwater sequences, which included everything from the submarine chase to the DSV submersibles and the ancient alien ship resting on the seabed and rising to the surface. The other big chunk of work was the drone sequence where our heroes get chased through an abandoned city. We also did a good handful of shots for the London car chase sequence.
How did you organize the work at MPC?
MPC has a very established pipeline and workflow. We pretty much approached it in the same way we would approach any other show. Although I should add that the project was originally anticipated to be a lot smaller – what started as a little under a hundred shots grew significantly in size, so we had to adapt a little and spread the work across the various MPC sites around the globe. The team did incredibly well and reacted very flexibly to the demands and tasks we faced along the way.
The TRF are using drones to catch the heroes. Can you explain in detail about their design and the creation?
The drone design was pretty established by the time we got on board. Production design took place prior to principal photography, and life-sized drones were built and used on set. We inherited a basic CG build from ILM, which gave us a good starting point, but we had to rework a lot of detail from there. Michael was keen on a fairly organic appearance where the drones didn't look too pristine. So we had to beat them up a bit and make them look dirtier.
How did you simulate their presence on set and their interactions?
With life-sized drones being built and used on set, Michael was initially very keen to use as much as possible of what was in camera. But the better and more refined our CG drones got, the more we ended up replacing most of them. The benefit was obviously full control over their animation. As convincing as the rigged drones looked, they lacked a bit of the lightness that made you believe they were hovering. The on-set interaction was partly covered by practical FX: fans were used to swirl paper and dust around where the drones were meant to be. On one hand that was great to have in camera, but it certainly didn't always make the rig and crew cleanup easy. We ended up adding a whole lot more to most shots, as the practical FX didn't quite give us enough dynamics.
Did you receive specific direction and references for the drone animation?
We looked at a lot of existing drone footage. Nothing beats the real deal, so it seemed best to focus on the behaviour of real drones. The most important bit was to get the scale right. Drones are a lot smaller than, say, a helicopter, and that results in very specific physics and distinct, subtle secondary movements: from hovering when idle to distinct skidding when going full pace. It took us a little while to find the right balance that everyone liked. Michael wanted the drones to act as a team. They needed to feel well organized, as if there was an orchestration going on between them, either as a larger group flying in formation or as a team of two searching a room and watching each other's back.
How did you create the city and the huge environment?
The shots that required large set extensions and city builds came to us relatively late. By that point there had already been a few rounds of conceptual work on what the environment and city should look like, so to some extent we already knew what to do and what not to do. However, one of the first steps for us was to go through the sequence and create a mapping that would ensure good flow and continuity. Once we had our city plan laid out, it was all over to our environments team. We approached it very much as a 3D build combined with various projection setups using live-action photography. Production provided us with tons of aerial footage of the Fisher Building in Chicago, which was used during principal photography. We combined this with material we had in house as part of our library. That approach kept things flexible and let us quickly act on changes from editorial.
The sequence ends with Cade jumping on the main drone. Can you tell us more about these shots?
The shots where Cade (Mark Wahlberg) leaps onto the drones and finally lands on the Mother Drone were part of the batch that came to us pretty late. Despite the large background extension, the challenge was to bring these shots to life. We wanted to break the constraints of the original plate photography and allow for greater dynamism. For that we matchmoved the plates and roto-animated Cade's performance. With that data at hand we could create new layouts and cameras. Cade and the drones ended up being a combination of 3D cards and CG objects seen through our altered, more dynamic cameras.
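The idea described here – matchmove the plate, place the roto'd performance on a 3D card, then view it through a new, more dynamic camera – can be illustrated with a toy pinhole-projection sketch. This is purely illustrative (the `project` helper, card corners and camera positions are assumptions, not MPC's actual tools):

```python
import numpy as np

def project(points, cam_pos, focal=1.0):
    """Minimal pinhole projection: camera at cam_pos looking down -Z,
    no rotation. Returns normalized image-plane coordinates (N, 2)."""
    rel = points - cam_pos                    # points in camera space
    return focal * rel[:, :2] / -rel[:, 2:3]  # perspective divide

# A roto'd plate element mounted on a flat 3D card in front of the camera.
card = np.array([[-1.0, -1.0, -10.0],
                 [ 1.0, -1.0, -10.0],
                 [ 1.0,  1.0, -10.0],
                 [-1.0,  1.0, -10.0]])

plate_cam = np.array([0.0, 0.0, 0.0])   # matchmoved original camera
new_cam   = np.array([0.5, 0.0, -2.0])  # altered, more dynamic camera

orig_uv = project(card, plate_cam)  # where the card sat in the plate
new_uv  = project(card, new_cam)    # where it lands through the new camera
```

Because the card lives in 3D, moving the camera shifts and re-perspectives the plate element consistently with the CG objects around it, which is what makes the altered cameras hold up.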
Can you explain in detail about the creation of the various submarines?
The underwater sequence features two types of conventional submarines: an old World War II submarine and a modern US nuclear submarine. The builds started as shared assets with ILM. It is always an interesting challenge to share assets between two vendors, but in recent years we have seen a trend toward more standardised approaches, and it is arguably easier nowadays than just a few years ago. As with the drones, we picked up where ILM had gotten to and tailored the assets to our own needs.
Can you tell us more about the fight between the two submarines?
The submarine chase was an interesting piece of the underwater sequence. The original previs was done by MPC LA's team, so we had a good idea of where Michael wanted to take these shots when we started the VFX work. However, the concept of two subs chasing each other seemed a bit odd at first. Michael had the idea of this crazy upside-down stunt, and it required a fair amount of camera work to come up with shots that were dynamic enough while maintaining the scale and lag appropriate for such large objects traveling underwater.
How did you handle the FX and destruction work on this sequence?
For the destruction and FX work required in the underwater sequence we utilized a range of solutions. While some parts used Kali (MPC’s proprietary destruction toolkit), we also ended up using Houdini a lot. Special volumetric effects and large volumes of debris or floating sediment were done using that approach. Each shot ended up with a mix of FX solutions that all got channeled through our lighting pipeline using Katana.
The team discovers a huge ship at the bottom of the ocean. How did you create this ship?
The ancient alien spaceship was a build we shared with ILM. The sheer size of that ship was an interesting aspect. With a width of roughly a mile (1.7 km), it's an open invitation to add an endless amount of detail. Since ILM's and MPC's requirements were slightly different, we set ourselves a cut-off point up to which we ingested any model or texture updates coming from ILM. From that point on we tailored the build to our own needs. With the "CrossShip" resting at the bottom of the sea, we focused strongly on adding details that helped tell the story that the ship had been resting there for a very long time. We added tons of seaweed, corals, sediment and other overgrowth. Michael really liked these details, so we ended up sharing them back with ILM. Everything we did to the CrossShip was intended to help convey its scale and give the ship the appropriate size. On the shot level we ran cloth dynamics on the extra elements; it was important for us to achieve a graceful floating motion that helped sell the underwater look.
How did you handle the lighting in the deep sea?
The deep-sea lighting was a whole challenge in itself. It very much ended up a joint effort between our lighting, environment and comp teams. To keep as much flexibility as possible, we decided to break the lighting down into a multipass approach with two main components: the clear, clean beauty lighting of the vehicles and the environment itself, plus a set of volumetric passes that would help with the underwater murkiness and the play of the various light sources. Furthermore, we had to deal with two types of shots: full CG and plate-based. For the plate-based shots we often stayed very close to the live-action plate with our lighting, knowing that some parts of the plate would need to be treated to fit into the underwater world. It seemed easier to treat CG and plate components the same way to achieve a consistent look. While the initial idea was to maintain as much as possible of what was shot in camera, we ended up replacing a lot more with CG. Every detail of the asset was so refined in the end that it felt like no compromise to replace larger portions of the practical gimbal build.
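The two-component split described here – a clean beauty pass plus volumetric passes for the underwater murk – can be sketched as a simple comp operation: attenuate the beauty toward a water tint with distance (a Beer–Lambert-style falloff) and layer the volumetrics on top. This is a minimal NumPy sketch of the general technique, not MPC's actual pipeline; the function name, tint and coefficient are assumptions:

```python
import numpy as np

def underwater_comp(beauty, volumetric, depth,
                    water_tint=(0.1, 0.3, 0.4), attenuation=0.08):
    """Combine a clean beauty pass (H, W, 3) with a volumetric pass,
    using a per-pixel depth pass for exponential underwater falloff."""
    tint = np.asarray(water_tint)
    # Transmission drops exponentially with distance through the water.
    transmission = np.exp(-attenuation * depth)[..., None]
    # Distant pixels fade toward the murky water tint...
    murky = beauty * transmission + tint * (1.0 - transmission)
    # ...and the separately rendered volumetrics (light shafts,
    # floating sediment) are layered on top.
    return murky + volumetric
```

Keeping the beauty and volumetric renders as separate passes means the murkiness and light play can be rebalanced in comp without re-rendering the lighting.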
What was the main challenge on this show and how did you achieve it?
The underwater look was probably the most challenging. It was a matter of finding the right balance between what is natural and realistic and showing enough to tell the actual story. While we certainly had to show more than would naturally be visible at deep sea, we added a lot of extra detail to make up for the "clarity". Every shot has a large amount of floating sediment and plankton, as well as schools of fish and other bits that help maintain scale.
What is your favorite shot or sequence and why?
The two DSV submersibles approaching the entry points on the alien spaceship. It's two shots that cut back to back in the movie. We started both very early in production, and although there is a risk that "early shots" become a bit overworked in the end, I still love the result for both of them – especially in stereo.
What was your best moment on the show?
There have been plenty of memorable moments on this show – hard to just pick one out. Overall it’s been a very enjoyable project. We had a great team at MPC and I can only say a big thank you to everyone involved.
How long have you worked on this show?
From initial meetings with Michael Bay to final completion it was a little over a year.
What is your VFX shots count?
We had around 350 VFX shots in house. Roughly 220 of those made it into the cut.
What was the size of your team?
Worldwide we had 350 artists involved in the show, plus 50 members of the production team supporting them.
What is your next project?
I'm afraid that will remain a secret for the time being.
A big thanks for your time.
// WANT TO KNOW MORE?
MPC: Dedicated page about TRANSFORMERS – THE LAST KNIGHT on MPC website.
© Vincent Frei – The Art of VFX – 2017