How did you get involved in this series?
I got involved with 1899 in late August / early September 2020. I happened to be in Germany, where there was an opportunity to meet with Bo (Baran Bo Odar), Jantje [Friese] and Philipp Klausing in Berlin. It was an exciting opportunity, as I really enjoyed their work on Dark. I was invited to showcase Framestore’s latest tool-set for virtual production and explore possible creative and technological collaborations with Dark Ways. It turned out that we shared similar visions on how virtual production could be leveraged on their production.
How was the collaboration with creators Baran bo Odar and Jantje Friese?
The collaboration with Bo and Jantje was a great experience for me. When we met for the first time, we only had four pages for each episode. It was the perfect time for us to explore how we could leverage virtual production technologies and visual effects for some of the scenes while Jantje was writing the scripts. It was a great time to join the project, as it was far enough along to get a clear idea of what was required visually, but at the same time the schedule allowed for enough flexibility to explore different solutions for the VFX work.
Creatively, both Bo and Jantje had a clear idea of where they wanted to land, and it gave us the opportunity to discuss different approaches on how we could achieve the visuals they envisioned. Being involved so early on in the process meant we were able to work closely with Udo Kramer (production designer) and Nik Summerer (DoP) to ensure every aspect of Bo’s and Jantje’s storytelling went hand in hand.
We also had the pandemic to contend with, so a lot of decisions changed on a daily basis. A location shoot in Spain, for example, would be switched in favour of an LED volume shoot. It was a very intricate and exciting process to be part of.
What were their expectations and approach about the visual effects?
For the showrunners, storytelling is at the heart of creation. Like every other department, we were there to support the story and translate their vision by complementing the production design and cinematography. It was very clear from the beginning that everything we did should be done for the sake of narration and keeping the audience engaged in the world of 1899. It’s actually the work I enjoy most as a VFX supervisor, when we are able to visually support the show and hopefully the viewer will never notice we did any visual effects work – at least not in the moment.
How did you organize the work with your VFX Producer?
During the prep and virtual production process of the show, I worked closely with Nick King (VFX producer) to organize the shoot along with Martina Chakarova (Virtual Art Department producer, Framestore) and Manon Hartzuiker (Virtual Production producer, Framestore). Whilst a lot of the processes were familiar or similar to post-production, we obviously needed to adapt to accommodate the ever-changing fluidity of pre-production.
The post-production of 1899 was really not that different from any other visual effects show. Whilst Sophie Caroll (VFX Producer, Framestore) took over from Martina and Manon on the Framestore side, Marlene Nehls (Production VFX Producer) took over from Nick. We would be involved in the planning and bidding of the post-production. As the edits progressed and evolved we specified VFX methodologies and reevaluated the schedule. We used ShotGrid to keep track of our shot work.
Can you elaborate about the filming of the sequence on the boat?
Given the fact that a lot of scenes would take place on the decks of the ship, they naturally became a large part of the virtual production, with one huge benefit: we would be able to come back to the same weather conditions time and time again. Of course, we still had to be mindful of how we would best utilise the LED volume technology, but it was great to be able to return to the same sky when we needed to, even on different decks. This allowed us to schedule the scenes by location, rather than by weather condition.
We had several locations on the ship: the front and rear deck, the promenade deck, the bridge, the dining room and the engine room, some of which were retooled for scenes on the Prometheus. We always knew that the wide establishing shots of the ships on the ocean would become visual effects shots; at the same time, our goal was to get as many of the dramatic shots in camera as we could. We prepared on the assumption that most of the backgrounds should hold up for use in camera and should not require post-production VFX work by default. This meant that we didn’t use markers unless we absolutely had to, or it was an obvious VFX shot. Since every episode was a different day and weather condition, we had to make sure that our ship asset, as seen from the decks, would hold up in various lighting conditions.
For the skies we used HDR images which gave us a photorealistic look for a large part of the image, which was great, as we would naturally be able to retain fine hair details and get a very natural looking integration of the actors into the set and environment.
In terms of the individual sets, for the front and rear decks we were able to establish through VR scouting which areas of the ship asset would be visible from the various locations, and we could focus our virtual production efforts on those areas. At the same time Udo Kramer, the production designer, worked closely with us to ensure we would have a coherent design and visual language throughout the entire set. Since there were so many scenes on the decks, the idea of a turntable arose. We could rotate the set 360 degrees within minutes, which gave Bo freedom of movement and the flexibility to shoot in any direction.
The bridge and promenade deck became inevitable sets for the LED volume: we had perfect sky continuity on the main decks, and we needed to be able to offer the same for those two ‘outdoor’ locations. Tracking the cameras here was a little trickier, as the spaces were confined and we didn’t want the Vicon cameras to be visible in the shots, but in the end we managed to hide them well, and with careful framing we were able to get a lot of shots in camera for the bridge in particular.
The engine rooms were great interior sets within the depths of both ships: dark, gloomy, coal-filled engine rooms with vast furnaces. As with all the other sets, it was important not to have a separate foreground and background but to end up with one continuous set. One thing that helped here was having motion-captured crowd performances playing in the background, which brought life to the virtual extensions. We used a lot of atmospheric special effects within the practical set, but added the flames in post-production for safety reasons. Besides the addition of sparks and flames, and the occasional top-up, the engine room was a hugely successful virtual production build.
Lastly, and this is probably the most impressive interior set, is the dining room – particularly the Kerberos dining room in episode one, when Maura enters it. It works so well that we were actually able to capture a lot of the scenes in camera. The challenge here was that we were not only much closer to the walls and ceiling, compared to the sky and ocean on the decks, but also much brighter than in the engine room. Creating a photorealistic interior therefore became a real challenge. Whilst Unreal Engine can render very convincing interiors, running them at 24fps is an entirely different story. In the end we found the right balance between modeling detail, baked lighting and textures to turn the 270-degree LED wall into a grand dining hall. It was a pretty spectacular experience to shoot the dining room and to see the entire set become one entity once you looked through the camera. This is one of the things I’m most proud of.
What were the pros and cons of using a massive LED volume?
Shooting a TV series in an LED volume has some great advantages; at the same time, it does bring restrictions to the table. The volume gives you the opportunity to return to the same weather conditions for an environment time and time again. In the case of 1899, where a large portion of the eight episodes would play on the different locations of the ship, this was obviously ideal for narrative continuity. At the same time, it does limit you to lighting conditions that are favourable for the LED panels. Direct sunlight or harsh lighting in general are still tricky, even in large volumes. The amount of light spill and the inverse-square light falloff still make it difficult to get convincing results in the LED stages. Luckily, this wasn’t a huge problem on 1899, as the visual language was always going to be overcast, dark and gloomy.
Another factor to bear in mind is that the LED panels as a light source behave differently to regular cinematography lights, so there’s a learning curve for all departments to work with this type of light source.
Another impact is of course on the schedule. If you are seriously considering capturing visual effects in camera, you have to do a lot of work up front that would usually happen in the post schedule. It flips the VFX schedule on its head: all the asset builds have to happen and be signed off before you start shooting. This can be challenging, as not all scenes may have a final draft script, and production design may not be able to commit to certain designs because they depend on material or rental availability, so you end up doing as much as you can and readjusting your assets as the scripts and sets evolve. Once you get over those hurdles, though, LED walls can become an incredibly powerful production tool. When you have the right scenes or subject matter, have all other departments on board with the approach, and share a common desire to make it all work, you can achieve extremely convincing results.
Can you explain in detail about the design and creation of the two main ships?
The Kerberos and Prometheus play a key part in 1899’s narrative. Colour palette and symbolism played as much a part as the period accuracy when it came to designing the ships. We did a lot of research with Udo and his team to find accurate references for design and architecture of the time towards the end of the 19th century. Some aspects of the design were driven by the shooting requirements of the LED volume, for example, we needed to find “easy” ways for the cast to enter the deck as they could not just magically appear out of the LED wall. During the early stages of scouting and laying out the scenes in VR, we would identify the requirements and place visual blockers strategically or position the stairs in a way that the cast could easily enter the decks during a scene. We tried to stay true to the period, as much as possible, and only modified for technical or story purposes.
What kind of references and influences did you receive for the boats?
One of the ships we looked closely at was the RMS Lusitania. It matched closely in size to what Bo and Jantje needed for their story and gave us plenty of options for the design of the decks and exterior of the ship. Since we also needed wide open ocean vistas for the establishing shots, where accurate scale was essential, having a ship that existed as a base or reference point was a huge help in bringing it all together.
What was the real size of the boats?
The final design of the ship was 221m long and 22m wide. The draught is 9m and the total height to the top of the mast is 60m. The decks are approximately 11m above the waterline.
How did you enhance the various parts of the boat?
We needed two versions of the ships: one suitable for real-time display on the LED wall and one for the VFX. For the virtual production model we built the asset with VFX in mind, but then optimized it, reduced the poly count and deleted everything we didn’t need, so we would display what we needed but not calculate anything we didn’t. Because everything was modeled and textured with post-production in mind, we had a good starting point at the correct scale when we approached the visual effects. Once we started the visual effects asset build, we were able to add a lot more model, texture and surface detail without having to redesign anything, except of course the areas that were not fully built for virtual production. This approach made the whole process very efficient.
How did you handle the night sequences and how does that affect your work?
It became clear very early in the creative process that we would stay in a dark and gritty mood. Inevitably a lot of scenes would take place in very dark or night time lighting scenarios, which brings its challenges, but also makes it exciting to really work on the grade and lighting carefully. Whilst LED wall content and VFX had their own challenges to deal with when it came to night time sequences, in the end the solution for both is actually quite similar. You display, shoot and/or render your sequences brighter to avoid noise and then work with exposure, LUTs and finally DI to achieve the look that Bo was after. Obviously, it’s a bit more complicated to deal with a black painted ship in the middle of the Atlantic at night, but it is a common approach to avoid noise. We had to light the scenes very carefully to see enough of the ships, ocean and sky, but not break the illusion of a night time scene.
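The "display, shoot and/or render brighter, then pull exposure back down" approach described above boils down to simple linear-light arithmetic. The sketch below is purely illustrative (the function names are hypothetical, not part of any production pipeline): it shows how an exposure offset in stops maps to a pixel multiplier, and how pushing up N stops and pulling down N stops round-trips to the original values while the renderer works well above the noise floor.

```python
# Illustrative sketch of the "render bright, expose down" night workflow.
# Rendering N stops brighter keeps shadow detail above the noise floor;
# the exposure is then pulled back down in comp/DI by scaling linear values.

def exposure_scale(stops: float) -> float:
    """Linear-light multiplier for an exposure change in stops."""
    return 2.0 ** stops

def expose(pixels, stops):
    """Apply an exposure offset to a list of linear-light pixel values."""
    s = exposure_scale(stops)
    return [p * s for p in pixels]

# Push 2 stops over at render time, then pull down 2 stops in comp:
# the net result is unity, but the dark values were never near zero.
rendered = expose([0.02, 0.10, 0.45], +2.0)
final = expose(rendered, -2.0)
```

Because the multipliers are powers of two, the round trip is lossless in floating point; the grade and LUTs then shape the final night look on top of the clean signal.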
The ocean is playing a big part in the series. Can you elaborate about its creation and animation?
Just like everything else, the ocean was its own character and had to support the storytelling. Early in the story, the ocean is all of a sudden much calmer when Eyk and Maura row over to the Prometheus, and this isn’t a coincidence, but an early hint at the narrative. Therefore we needed an ocean that could be easily used in many shots, allowing for just enough customization whilst not turning every ocean shot into a complicated water simulation.
Ocean surfaces are still very tricky to handle, as surface detail has a huge impact on the perception of scale and therefore how far away you are from the water. It could very easily make the ship look like a miniature if the surface detail was incorrect. Some of the wide establishing shots required thousands of meters of ocean surface to be visible, so we needed something that would work visually but not break the RAM requirements. We ended up with a system that had the most amount of detail close to the camera and we reduced geometric detail the further we were from the camera. The calmer ocean surfaces were actually trickier to get across convincingly as there was less movement and the surface detail was even more important compared to the rougher ocean surface, where the dynamics would carry a larger part of the perceived scale.
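The "most detail near the camera, less detail far away" scheme described above is a classic distance-based level-of-detail falloff. As a minimal sketch (the function and its parameters are hypothetical, not the show's actual system), one common formulation drops one subdivision level for every doubling of camera distance beyond a base range, so memory stays bounded while near-field surface detail, which drives the perception of scale, stays high:

```python
# Illustrative distance-based LOD for ocean patches (not production code):
# subdivision drops one level per doubling of distance beyond a base range.
import math

def subdiv_level(distance_m: float, max_level: int = 8, base_m: float = 50.0) -> int:
    """Subdivision level for an ocean patch at a given camera distance."""
    if distance_m <= base_m:
        return max_level  # full detail close to camera
    # One level less for every doubling of distance beyond the base range.
    drop = int(math.log2(distance_m / base_m))
    return max(0, max_level - drop)
```

With these assumed defaults a patch 10 m away gets the full level 8, a patch 1,600 m away gets level 3, and the far horizon collapses to level 0, which is how thousands of metres of visible ocean can fit in memory.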
How did you create and animate the “tunnel of water”?
The swirl was another big challenge, as you cannot resort to physically plausible water simulation physics for art directed water surfaces. We spent quite some time in camera layout with low resolution geometry and made sure what we saw through the camera had the correct composition, before we carefully mapped ocean surface detail onto the deforming surface. Only then we added secondary FX simulations for the spray, foam and subsurface aeration.
Can you elaborate about the storm creation and its lighting challenges?
The big storm in Episode 07 was a massive challenge to achieve in such a short amount of time. Again, scale was of the essence here and since in the story the swells of the waves were beyond what you could imagine, the FX simulation of the water and the ship interaction became an additional challenge, as the physics had to be manually tweaked to work visually.
The most difficult shots for the storm were the establishing shots where we see the Kerberos conquering the storm. It became evident very quickly that if the physics and scale of the water didn’t play well together, the shot would not work. Therefore we did careful camera layout with a simplified version of the massive waves and animated the ship accordingly; only once we were happy with the composition and feel of the shot would we go into the various secondary FX simulations, which also needed careful physics manipulation to achieve the desired feel of the shot.
The vistas of the stormy ocean shots were also very large, and we quickly ran into technical limitations of traditional methods using height fields as displacement. Using the camera’s field of view and the ship's position, we were able to limit the area that required subdivision and could therefore increase the amount of geometry and, subsequently, surface detail. This way we were able to reduce our memory footprint and deform over 40 million polygons in the camera frustum. Once we had the camera layout and a stormy ocean surface, we used Houdini FLIP sims with proprietary modifications to achieve the water interactions, splashes, spray and aeration. These FLIP simulations were then merged with the massive storm waves and rendered together, so comp would receive seamlessly merged renders.
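The idea of spending subdivision only where the camera can see is straightforward to sketch. The following is a simplified 2D illustration under assumed names and parameters (it is not the show's proprietary tooling): patches whose centres fall outside the camera's padded horizontal field of view are skipped before any expensive displacement happens.

```python
# Illustrative frustum-limited subdivision test (2D top-down view).
# Only ocean patches inside the camera's padded field of view are kept
# for high-resolution displacement; everything else stays coarse.
import math

def in_frustum(cam_pos, cam_dir, fov_deg, point, margin_deg=5.0):
    """True if `point` lies within the camera's (padded) horizontal FOV.
    `cam_dir` is assumed to be a normalized 2D view direction."""
    vx, vy = point[0] - cam_pos[0], point[1] - cam_pos[1]
    dist = math.hypot(vx, vy)
    if dist == 0.0:
        return True  # patch at the camera position is trivially visible
    # Angle between the view direction and the vector to the patch centre.
    dot = (vx * cam_dir[0] + vy * cam_dir[1]) / dist
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return angle <= fov_deg / 2.0 + margin_deg

def patches_to_subdivide(cam_pos, cam_dir, fov_deg, patch_centres):
    """Keep only the patches that can appear in frame."""
    return [p for p in patch_centres if in_frustum(cam_pos, cam_dir, fov_deg, p)]
```

A real implementation would also account for patch extents and wave amplitude at the frustum edges (hence the margin), but the culling principle is the same: geometry budget is concentrated where the camera is actually looking.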
In addition to the full CG ocean establishers, we also had a large sequence of Olek and Ling-Yi on the front deck of the Kerberos. This sequence was not shot in the volume, but on the backlot of Babelsberg Studios in Berlin. The special effects team, supervised by Gerd Nefzer, set up a massive two-tonne water tipping tank, water mortars and tractor-powered wind machines to create all the necessary elements for the actors to be filmed on the deck. This footage was then integrated into our full CG stormy ocean and supplemented with additional rain, water and spray elements in compositing.
How did you create and animate the “virus” that contaminates the ship?
During the look development of the crystal virus, we explored several areas for the movement to find the right balance of organic and mechanical movement. We wanted something that would grow organically, but at the same time, could glitch or move mechanically. We had some really great results using Machine Learning generated animated textures as a base for surface displacements, which gave us these really unexpected and unusual happy accidents. This then became the base for our FX animation for the crystals as we needed to have a certain amount of control over the speed and direction of growth. We used traditional animation techniques for the blocking and distribution of the crystals before passing these simple geometries to the FX team who developed their own system to distribute and grow the crystals in the shots.
Can you explain in detail about the creation of the various simulation environments?
The simulation environments are a mixture of actual locations and CG-generated environments. Spain, Scotland, Poland and the Futuristic Landscape were virtual production environments, and each had a slightly different creation process. Scotland, for example, was based on an actual location in the Scottish Highlands that was discovered coincidentally during the ocean element shoot. During a flyover, Bo spotted a valley that he really liked and shot an establishing shot from the helicopter. Since our virtual build had to match the establisher, we sent a team up to the valley to do a detailed photoscan and drone photogrammetry. This material was then converted into simplified 360-degree geometry with highly detailed textures. We made some adjustments to the sky and vegetation to align with the mood that Bo liked and with the practical sets. The Mental Institute was also added into the valley, which was enhanced for a selection of shots in post-production. The Futuristic Landscape was based on Scotland with various modifications to the amount of snow, adding the Pyramid and crystal structures. Spain and Poland, on the other hand, were built completely from scratch in CG based on various reference materials.
How did you handle the transitions when the characters are moving between these sims?
The transitions or portals between the worlds were something that Bo had a very clear vision of from the beginning. He wanted these portals to open up randomly in the middle of thin air and have a three-dimensionality to them. Even though they were really gateways between simulations or memories, he wanted them to look like you could touch them. The movement needed to be quick and look believable, but it didn’t have to adhere to real physics: it was OK to fold geometry away and let it disappear. They were meant to be portals, not physical objects, even if they looked like one on a still frame.
We used a combination of projection setups and geometry deformations within Houdini to achieve this effect, which gave us control over the timing and how far a portal would close within any given shot.
Which environment was the most complicated to create?
I am going to name two, one for the VFX side and one for the VP side. For VFX, the storm was really the hardest environment to get right. The vast expanse of the view made it incredibly difficult to manage detail and scale within what we could simulate and render with today’s hardware.
For the VP side, none of the content we put up on the LED wall was easy by any means, but the dining room was probably our most complicated to get to run in real time with the look we were after. It required a lot of optimization, baking lights, and reducing geometry over and over again. In the end, I do think it turned out to be one of the most impressive virtual environments.
Is there any invisible work you want to reveal to us?
My mantra is, the less aware the viewer is of our work, the better we’ve done our job. I mean there are always obvious visual effects, because they go beyond the physically plausible, but the rest, if you can’t see what we did, I prefer to leave it that way and not spoil the illusion.
Which shot or sequence was the most challenging?
The storm sequence in Episode 07, which stretches across the whole episode, with various huge ocean establishing shots, Olek getting washed off the front deck, and Ling-Yi and the Kerberos being dragged into a giant swirl and emerging into the ship graveyard. Getting all of this CG water rendered and comped with the level of detail that we needed was without a doubt a big challenge. Nonetheless, I am incredibly grateful for the fantastic work all the artists were able to achieve.
Is there something specific that gives you some really short nights?
Despite some incredibly complex visual effects work, the thing that gave me the shortest nights was how we would set up, prep and shoot a virtual production show during the pandemic. That was a huge challenge.
What is your favourite shot or sequence?
Honestly, there are too many to choose from on this show. It’s incredible how everything has come together creatively. All production departments complement each other visually and that is very satisfying as a supervisor. If I had to choose a sequence, I would probably choose the opening scene of Maura entering the dining room.
As a single shot, one of my favourites is actually one shot in Episode 03, when the camera is pushing into the Prometheus in the fog, with the little row boat and its light attached to it. It’s just a very beautiful and cinematic shot.
What is your best memory on this show?
The amount of effort and care that has gone into the show. And this goes beyond the virtual production and visual effects work. You can see that every department put their best foot forward and made sure that one thing complements the other. I think it truly shows that everyone who worked on the show wanted to make it work. People cared – and it shows.
How long have you worked on this show?
A little over two years. I started September 2020 and we wrapped our final VFX work in October 2022. This was a huge project for all of us – it involved our art department, the Framestore Pre-production Services (FPS) team, our design team for the title sequences and VFX teams in London, Montreal, Vancouver and Mumbai.
What is the VFX shot count?
What is your next project?
Unfortunately, it’s not something I can disclose until the ink is dry.
A big thanks for your time.
© Vincent Frei – The Art of VFX – 2022