Neil Huxley worked for more than five years as a Flame operator at Digital Pictures Iloura before moving to the United States, where he was an art director at yU+co on movies such as GAMER and WATCHMEN, creating the latter's beautiful opening title sequence. In 2009, he joined Prime Focus.

Hi, can you explain your career path in VFX?
My first job after graduating was UI design in interactive media production. In 2002 I started in VFX as a Flame op at Digital Pictures Iloura in Melbourne, and then moved more into VFX design after art directing and designing the SALEM'S LOT title sequence for TNT. I moved to LA in '08, where I worked as an art director for yU+co. There I directed some cool broadcast projects, idents, title sequences etc., and art directed the title sequence to Zack Snyder's WATCHMEN. The Mark Neveldine and Brian Taylor-directed movie GAMER in 2008/09 was the first project where I really tackled interface design in a film context. That project then led me to AVATAR.

How did Prime Focus get involved on AVATAR?
I think it was Chris Bond and Mike Fink's relationship with the VFX producer Joyce Cox, and our showreel, that landed us the gig. We also had experience with stereoscopic movies like JOURNEY TO THE CENTER OF THE EARTH. I think we only had the Ops Center at the start of production, but as the project progressed we picked up more shots from other vendors who had too much on their plates, plus James Cameron really liked what he was seeing from us.

What are the sequences made at Prime Focus?
We worked on over 200 shots which included the Ops Centre, Biolab, and Hells Gate exteriors.

What elements did you receive from the production and Weta?
Well, we would receive a number of assets depending on the shot and the sequence. Everyone had a look to match, so we would share assets as much as production would allow. We were sent everything from on-set photography, concept art and in-progress renders from other vendors to 3D models, textures etc. It was very exciting for us to see what other vendors were working on.

How did you design the hologram? And the one with Home Tree in particular?
We worked with Jim on the basis that this table would display multiple satellite scans orbiting Pandora, so we looked at LIDAR imagery. We wanted the table projections to be particle-based to mimic LIDAR mapping, so we used our in-house particle renderer Krakatoa. A lot had to be modeled in house, or at least reworked, since the assets were previz-quality MotionBuilder files and not high-res enough. The Home Tree had to be rebuilt so we could generate Krakatoa PRT Geo Volumes, a particle grid (level set) representing geometry, to mimic the LIDAR-scan look Jim wanted. The Home Tree in particular was re-modeled based on production's concept art. We then added projection beams, icons, glows and dust motes for added detail.
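For readers curious about the general technique, here is a toy sketch (not Prime Focus's actual Krakatoa tooling, and the triangle data is purely hypothetical) of the core idea behind a scan-style hologram: converting mesh geometry into a point cloud by uniformly sampling points on each triangle's surface, which a point renderer can then draw as LIDAR-like dots.

```python
import random

def sample_triangle(a, b, c):
    """Uniformly sample one point on triangle (a, b, c) via barycentric coords."""
    u, v = random.random(), random.random()
    if u + v > 1.0:            # reflect into the valid barycentric region
        u, v = 1.0 - u, 1.0 - v
    w = 1.0 - u - v
    return tuple(w * a[i] + u * b[i] + v * c[i] for i in range(3))

def mesh_to_points(triangles, points_per_tri=100):
    """Turn triangle geometry into a point cloud for a scan-like render."""
    cloud = []
    for a, b, c in triangles:
        cloud.extend(sample_triangle(a, b, c) for _ in range(points_per_tri))
    return cloud

# A single hypothetical triangle standing in for a piece of the Home Tree mesh.
tri = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))]
cloud = mesh_to_points(tri, points_per_tri=500)
print(len(cloud))  # 500
```

A production tool would weight the sample count by triangle area so point density stays even across the whole mesh; this sketch keeps a fixed count per triangle for brevity.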

Were you able to propose ideas or did the artistic team of James Cameron already determine everything?
Jim and the production were always open to ideas. Some of the screens and animation design was nailed first time; other elements took a few variations and revisions. It was great to work with a director with such a strong creative vision; you know exactly the direction in which the captain is steering the ship, so to speak.

Can you explain to us the creation of an Operations Room shot?
The Ops Center and Bio Lab scenes in AVATAR included interactive holographic displays for dozens of screens and a ‘holotable,’ each comprising up to eight layers, rendered in different passes and composited. The Ops Center itself had over 30 practical plexes alone. To enable easy replacement of revised graphics across the massive screen replacement task, we developed a custom screen art graphic script, SAGI. This enabled us to limit the need for additional personnel to manage data, deliver the most current edit consistently, reduce error by limiting manual data entry and minimize the need for artists to assemble shots.

Our pipeline department built a back-end database to associate screen art layers with shot, screen and edit information, and a front-end interface to enable users to interact with it. The UI artists could update textures in layers, adjust the timing of a layer, select shots that required rendering, manage depth layers by adding and deleting as necessary and view shot continuity — while checking the timing of screen art animation across multiple shots.
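As a rough illustration of the kind of data model described (the class and field names here are hypothetical, not the actual SAGI schema), a screen-art database essentially keys an ordered stack of graphic layers to a shot/screen pair, and versioning texture updates lets the pipeline always resolve the most current edit:

```python
from dataclasses import dataclass, field

@dataclass
class Layer:
    name: str
    texture: str
    version: int
    start_frame: int = 0   # timing offset of this layer within the shot

@dataclass
class Screen:
    layers: list = field(default_factory=list)

    def update_texture(self, name, texture):
        """Swap in a revised graphic, bumping the layer's version number."""
        for layer in self.layers:
            if layer.name == name:
                layer.texture = texture
                layer.version += 1
                return layer.version
        raise KeyError(name)

# Back-end: (shot, screen id) -> Screen, so revised art propagates per shot.
db = {("ops_010", "holotable"): Screen([Layer("background", "pandora_v1.exr", 1)])}
v = db[("ops_010", "holotable")].update_texture("background", "pandora_v2.exr")
print(v)  # 2
```

The payoff of such a scheme, as the interview notes, is that a revised graphic only has to be registered once and every shot referencing that layer picks it up, instead of artists reassembling comps by hand.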

The immersive screens were treated as a special case because of the sheer size of the practical plex glass element. In the case of creating graphics for the immersive screens, there were several unique factors and challenges to consider:

–The large size and prominent placement of these screens
–Their curved, semi-circular shape that we see from both sides
–The background layer is displayed as a "window to the world", behaving like a world-space environment instead of a localized overlay

To ensure that the After Effects animation graphics would appear correctly once mapped onto 3D geometry modeled to match the practical immersive screens, special UV pre-distortion algorithms were applied to the source imagery. For the background layers, virtual environments, flight trajectories and icons were modeled and animated in 3D animation software, then rendered with a stereo camera rig. The resulting sequences were then processed via the SAGI/ASAR pipeline, with special attributes associated with the immersive screen type invoking a scripted UV mapping system to emulate a "virtual periscope" effect as the immersives rotated to match the action of the practical screens in the plates.
Additional passes were created by the lighting and rendering team to help better integrate the screens into the photography, such as reflections, lighting and clean plate elements.
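To give a feel for what UV pre-distortion means here, the following is a minimal sketch of one plausible approach (assuming a cylindrical arc of screen centred on the viewer; the function and parameter names are illustrative, not Prime Focus's algorithm): graphics authored flat are remapped so that straight lines in the design still read straight from the viewing position once the texture is wrapped onto the curved surface.

```python
import math

def predistort_u(u_flat, half_angle=math.radians(60)):
    """Map a flat-design U in [0, 1] to a cylindrical-screen U in [0, 1]."""
    # Position on the flat design plane, scaled so u = 0 and 1 hit the arc edges.
    x = (u_flat - 0.5) * 2.0 * math.tan(half_angle)
    theta = math.atan(x)                     # viewing angle toward that point
    return 0.5 + theta / (2.0 * half_angle)  # arc length is linear in angle

print(round(predistort_u(0.0), 3))  # 0.0
print(round(predistort_u(0.5), 3))  # 0.5
print(round(predistort_u(1.0), 3))  # 1.0
```

The centre and edges stay fixed while intermediate UVs shift outward, compensating for the foreshortening a viewer would otherwise see toward the edges of the curved plex.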

Did the stereo cause you trouble?
The stereo pipeline was already set up from the team's work on JOURNEY TO THE CENTER OF THE EARTH, so we were ready. We had stereo dailies at least three times a day, which really helped in pushing these shots through. The stereo problems were, I'm sure, the same as for anyone else doing stereo projects: left-eye/right-eye discrepancies, convergence issues etc., which all get ironed out as you go.

Have you worked with other studios like Hybride for the screens or Framestore for the Hells Gate exteriors?
We matched a look that was established in one of Dylan Cole’s amazing concept matte paintings for the Hells Gate exterior for a particular shot. I think Framestore did the bulk of that work so they provided us with some great reference too.

Prime Focus has many branches worldwide. Do you allocate sequences between them or all was centralized in Vancouver or Los Angeles?
Most of the work was done in LA under the guidance of Chris Bond, with some additional support from the Winnipeg studio.

How was the collaboration with James Cameron and Jon Landau?
Working with James Cameron and Jon Landau was an amazing experience for all of us, and one I would repeat without hesitation.

What did you keep from this experience?
We were a part of one of the biggest, most spectacular films of all time, and I got to live out a schoolboy dream of working with James Cameron.

What are the 4 movies that gave you the passion for cinema?
Tough question! There are so many films that have inspired me over the years. My brother and I would sit all day in front of the TV in our underpants on summer break and watch movies religiously. I think one summer we watched BIG TROUBLE IN LITTLE CHINA like 30 times! I think the movies that really affected me as a kid were BLADE RUNNER directed by Ridley Scott, THE TERMINATOR and ALIENS from James Cameron and John Carpenter’s THE THING.

Thanks for your time.

Prime Focus: dedicated AVATAR page on the Prime Focus website.

© Vincent Frei – The Art of VFX – 2010

