Stephane Paris began his career in Paris, working at studios such as Duran Duboi on films like ASTERIX & OBELIX: MISSION CLEOPATRE and IMMORTAL (AD VITAM). He then worked at MPC and Weta Digital before joining Cinesite in 2006, where he has supervised the CG on films such as CLASH OF THE TITANS, PRINCE OF PERSIA: THE SANDS OF TIME and THE CHRONICLES OF NARNIA: THE VOYAGE OF THE DAWN TREADER.
What is your background?
I began my career in Paris, where I worked as a digital modeller on numerous feature-film, television and commercial projects. I joined Cinesite in 2006 and have supervised an array of blockbuster work for the company, including CLASH OF THE TITANS (2010), PRINCE OF PERSIA: THE SANDS OF TIME (2010) and THE CHRONICLES OF NARNIA: THE VOYAGE OF THE DAWN TREADER (2010). I was also on the GENERATION KILL (2008) team, for which our work received Emmy and VES Award nominations.
How was your collaboration with director Rob Marshall and production VFX supervisor Charlie Gibson?
I never actually met with Rob Marshall; Charlie Gibson was our main contact throughout the production. I hadn’t worked with him before, but he has a long and prestigious track record in effects. I found him very easy to work with, and together we found creative solutions for the shots.
Can you explain the different steps for creating the huge environment during the sequence of the chase in London?
It was quite a task because the final shots had lots of composited layers. We relied heavily on our proprietary tool, csPhotoMesh, which enabled us to build CG geometry for the environment extensions. We used camera projection techniques set up in Maya and transferred for final application in Nuke. We also created fully shaded, photo-realistic models of the buildings to extend the Greenwich location and make it look like a convincing 19th century city.
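At its core, the camera-projection setup described above relies on the standard pinhole projection: each 3D point on the proxy geometry is mapped to a 2D image coordinate, where the photograph is sampled. A minimal NumPy sketch, with camera values invented for illustration (this is not Cinesite's actual Maya/Nuke setup):

```python
import numpy as np

def project_points(points_world, K, R, t):
    """Project 3D world-space points into image coordinates using a
    pinhole camera model: x = K [R|t] X."""
    X = np.asarray(points_world, dtype=float)      # (N, 3) world points
    cam = R @ X.T + t.reshape(3, 1)                # world -> camera space
    img = K @ cam                                  # homogeneous image coords
    return (img[:2] / img[2]).T                    # perspective divide -> (N, 2)

# Toy camera: 1000 px focal length, principal point at (960, 540)
K = np.array([[1000.0,    0.0, 960.0],
              [   0.0, 1000.0, 540.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)        # camera axis-aligned, looking down +Z
t = np.zeros(3)

uv = project_points([[0.0, 0.0, 10.0]], K, R, t)
print(uv)  # a point on the optical axis lands at the principal point
```

In a projection setup, the resulting UVs are then used to look up colour from the photographic plate onto the geometry.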
What references did you receive from the production for the buildings and streets?
We mainly managed the references for the sequence on our own – Cinesite has a team dedicated to that task. Our VFX Photographer Aviv Yaron went out on location on the streets of Greenwich and on set at Pinewood Studios, taking thousands of photographs which were key to the creation of the environments. These photos were used with our proprietary tool csPhotoMesh to create 3D scan-like geometries. We also used the photos to create high resolution textures to reapply to our models.
Can you explain in detail the creation of digital doubles?
We didn’t create any digidoubles, but we composited elements of the crowd using Nuke to populate the shots in the areas of the city we extended.
How did you create the FX elements such as coal burning?
In the sequence where Jack Sparrow is escaping through the streets on a horse and cart, burning coal spills onto the street. We used rigid body simulations to create the breakage as the coal hits the ground and Houdini to create the fire emanating from the pieces as well as the smoke it generated.
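As a rough illustration of the rigid-body side of such a spill, here is a minimal point-mass integrator with gravity and a damped ground bounce. All values are invented for illustration; a production simulation in Houdini handles full rigid bodies with rotation and inter-object collisions:

```python
import numpy as np

def simulate_fragments(positions, velocities, steps=100, dt=1/24.0,
                       gravity=-9.81, restitution=0.3):
    """Integrate falling fragments (translation only) with a simple
    ground-plane collision and a damped bounce."""
    p = np.array(positions, dtype=float)   # (N, 3) positions
    v = np.array(velocities, dtype=float)  # (N, 3) velocities
    for _ in range(steps):
        v[:, 1] += gravity * dt            # gravity acts on the y axis
        p += v * dt                        # explicit Euler step
        hit = p[:, 1] < 0.0                # fragments below the ground
        p[hit, 1] = 0.0
        v[hit, 1] *= -restitution          # reflect and damp the bounce
    return p, v

# One lump of coal spilling off the cart: 2 m up, moving forward at 1 m/s
p, v = simulate_fragments([[0.0, 2.0, 0.0]], [[1.0, 0.0, 0.0]])
print(p)  # the fragment has settled on the ground, carried forward in x
```

In practice the breakage itself would come from fracturing the geometry first, with each piece then simulated as its own rigid body and used as an emission source for the fire and smoke.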
Can you explain the creation of the frog?
The clients provided us with photographs of poisonous tree frogs. Using that photographic reference we created a model, completed look development in RenderMan, and rigged and animated the frog in Maya – this included subtle, realistic movements like blinking and breathing.
The main frog was red, but we created several coloured versions, most of which populate a jar which Barbossa holds up to inspect. The trickiest part was getting the shading right, but in the end we found a good balance between subsurface scattering and specular passes, which I think makes it pretty believable.
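In compositing terms, that balance between subsurface scattering and specular is essentially a weighted sum of render passes. A minimal sketch, with weights and pixel values invented for illustration:

```python
import numpy as np

def combine_passes(diffuse, sss, spec, sss_weight=0.6, spec_weight=0.4):
    """Blend render passes: diffuse base plus weighted subsurface and
    specular contributions, tuned per shot in the comp."""
    return diffuse + sss_weight * sss + spec_weight * spec

# One RGB pixel per pass: reddish diffuse, soft red SSS, white specular
pixel = combine_passes(np.array([0.20, 0.05, 0.05]),
                       np.array([0.50, 0.10, 0.10]),
                       np.array([1.00, 1.00, 1.00]))
print(pixel)
```

Keeping the passes separate like this lets the compositor rebalance the look without re-rendering.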
Can you tell us in detail about the creation of Barbossa's wooden leg?
We initially took photos of the wooden leg prop on location and the production provided us with a 3D scan. We used the photos and scan to create a full CG leg, from trousers and straps down to the wooden peg. Creating the entirety of the leg was easier, in many cases, than just adding the lower section. The top of the leg was rigged and animated to match the actor’s movement.
In some shots, where the actor’s performance had him touching the leg, we needed to recreate and animate his hand and sleeve digitally, to achieve a good interaction.
Did you develop specific tools for this project?
Yes, our Head of Visual Effects Technology, Michele Sciolette, led the efforts to build the stereo production pipeline and develop a number of new tools to meet these challenges. These included csStereoColourMatcher, a fully-automated tool designed to compensate for colour differences between stereoscopic image pairs, and csStereoReSpeed to determine the best respeed methodology for any given shot. We also used csPhotoMesh, which already existed, but we certainly developed it further throughout the course of this project.
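One simple way to compensate colour differences between stereo eyes – not necessarily how csStereoColourMatcher works – is a per-channel statistical match, shifting and scaling one eye so its mean and standard deviation line up with the other:

```python
import numpy as np

def match_colour(src, ref, eps=1e-6):
    """Per-channel statistical colour match: remap `src` so each channel's
    mean and standard deviation match those of `ref`."""
    out = np.empty_like(src, dtype=float)
    for c in range(src.shape[-1]):
        s_mu, s_sd = src[..., c].mean(), src[..., c].std()
        r_mu, r_sd = ref[..., c].mean(), ref[..., c].std()
        out[..., c] = (src[..., c] - s_mu) * (r_sd / (s_sd + eps)) + r_mu
    return out

# Simulate a mismatched right eye: brighter and lifted versus the left
rng = np.random.default_rng(0)
left = rng.random((8, 8, 3))
right = np.clip(left * 1.2 + 0.05, 0.0, 1.0)
matched = match_colour(right, left)
```

A production tool would work regionally and robustly to outliers rather than on global image statistics, but the principle is the same.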
What was the biggest challenge on this project?
The whole project went pretty smoothly from a CG perspective and there was very little to be concerned about. We had good direction from Charlie Gibson and our in-house VFX Supervisor Simon Stanley-Clamp. I’m pretty pleased with how it all went.
How long have you worked on this film?
I became involved in the testing around July 2010 and we delivered at the end of March 2011, so around 8 months in total.
What was the size of your team?
The size of the team fluctuated, but I’d say we had around 50 people, including everyone from tracking through to lighting.
What did you take away from this experience?
This was my first native stereo project. Aspects of the CG are greatly affected by the stereo aspect so it was definitely a learning experience for me.
What is your next project?
I’m working on a really exciting project, but I’m not allowed to say what it is at the moment!
What are the four movies that gave you the passion for cinema?
THE THING (John Carpenter)
RAIDERS OF THE LOST ARK
A big thanks for your time.
// WANT TO KNOW MORE?
© Vincent Frei – The Art of VFX – 2011