Tadao Matsuno has been working in visual effects for almost 20 years. He has worked on films such as CHARLIE’S ANGELS, PINGPONG, VEXILLE and TOMORROW’S JOE. He has been working at Oxybot since 2004.

What is your background?
I studied robotics engineering at university and got into the entertainment industry 20 years ago. I started in film VFX in 2000 when I joined the VFX team in LA for CHARLIE’S ANGELS, and then worked on PINGPONG in 2002. Since 2004, I’ve worked at Oxybot as a VFX supervisor on various film projects, including ASHITA NO JOE (2011), ANTARCTICA (2012, TV series) and VEXILLE (2007), which was the opening film of the 72nd Locarno film festival. Aside from working on a few film projects a year, I also supervise VFX production on TV series. My specialty is shader development and compositing. I have also won some awards in Japan and overseas.

How did you and Oxybot get involved on this show?
FULLMETAL ALCHEMIST was a project that Oxybot planned and produced. Oxybot is a production company that handles all aspects of film production, from conception to post-production, which is rather unique in Japan. I work with Oxybot as a VFX supervisor, and I was involved with this project from the beginning.

How was your collaboration with director Fumihiko Sori?
Sori is a film director/producer, and a VFX supervisor at Oxybot. Since we’ve been working together for years on many projects such as PINGPONG (2002) and VEXILLE (2007), we understand each other very well. So the collaboration on this project was great as usual.

What was his approach and expectations about the visual effects?
Unlike Hollywood, Japanese films had never had a truly photorealistic CG character. However, this film required one of the main characters, Alphonse, to be fully CG, and he had to look photoreal at Hollywood quality. Furthermore, the number of VFX shots in this film was among the highest ever for a Japanese film. So, naturally, it was very challenging to control everything.

How did you organize the work with your VFX Producer?
The main producer of the film doubled as the VFX producer, so there was no friction there. The VFX team was mainly made up of artists from Oxybot, so the teamwork was great as well. The real constraints were the budget and the schedule. Unlike a Hollywood production, we had to produce a huge number of shots in a short period of time with a small team and a small budget. I’d like to note that the pay scale for Japanese CG artists is not that different from Hollywood’s.

Can you tell us more about the previz and postviz work?
The director himself designed the VFX shots and prepared the storyboards, so the layout of the VFX shots went smoothly. As for the shooting, the director knew all the technical aspects of VFX, so we didn’t need to explain anything to him. The editing process was the same: he decided the timing of the VFX shots and edited them himself. Since he was a hands-on director at every step of the way, the production process went very efficiently. We only prepared previz for the complex shots, such as Ed running across the rooftops at the beginning of the film. That shot is entirely CG except for Ed, so the camera work and everything else had to be coordinated precisely.

How did you approach and create the opening sequence featuring young Ed and Al?
For the Human Transmutation sequence, we built the room as a practical set and reproduced it in CG so we could make it collapse. We thought about creating digi-doubles of the children, but the director wanted to shoot with the real actors, and it was all carried out safely. We used previz to simulate the camera moves and to figure out the scale of the set, so the actual shoot went without a hitch.

Can you explain in detail the creation of Al?
Alphonse was designed to look faithful to the original while also having a well-proportioned form in 3D, with physically correct articulated parts. The most challenging aspect was the camera angle for the shots where he interacts with Ed. We had to come up with the right design for the protruding part of Al’s chest so that it wouldn’t obscure Ed’s face, but if we made it too small, it looked different from the original. We struggled with it for almost a year and finally came up with a 3D model that worked well without straying from the image of the original.

How did you handle the challenge of conveying Al’s emotions?
Alphonse is supposed to be just a suit of armor, so he doesn’t have any facial expressions. We paid special attention to the angle of his neck and the posture of his body to convey his emotional states. When he’s just standing, he looks very strong and regal. To make him look gentle or sad, we shifted his shoulders slightly forward so that they would look rounder and smaller, which helped express his emotions. If we were successful in conveying his emotions, it was because the artists did a great job translating the performance of the actor into CG.

How did you simulate his presence on set and handle the interactions?
We captured Ed’s actor and Al’s performer together using a wireless motion capture system. This technique worked very well, especially for the brothers’ fight scene. It was a complex scene in which a live-action human gets into a scuffle with a CG character. We were able to synchronize their motions perfectly and give real weight to the action, all thanks to the wireless MoCap. It also helped make the timing of the interactions and the angles of the joints look natural. And of course, it wouldn’t have been possible without the outstanding performances of the actors. We were grateful to the actors playing Ed and Winry, who kept their eye lines on the 2.2m Al throughout the shoot.

Can you tell us more about the texture and shader challenges?
We augmented an existing renderer and developed a proprietary shader to render Alphonse with photoreal quality. Our development capability is what makes Oxybot different from other Asian studios.

How did you handle the various lighting conditions in which we see Al?
We used various techniques to reflect the on-set lighting environment. We recreated the on-set lighting accurately in the digital environment first, and then optimized it aesthetically without contradicting the laws of physics.

The heroes confront creatures in a village. Can you explain in detail their creation?
The director had an idea for a tiger- or lion-like creature made of stones and rocks that scattered pebbles and grit around as it moved. We dubbed it the Stone Creature. We spent some time creating prototypes, but in the end we went back to the very first one we did. That’s one of the good things about working with Sori: we can be flexible about revisiting earlier steps without thinking of it as a setback.

How did you handle their rigging and animation?
The rig was just an ordinary animal rig. We animated it by mimicking the motions of tigers and cats, then added ferocity. We also set up some effects parts on the body to go with the animation.

Later we discover the mysterious Truth. How did you design and create this character?
We designed The Truth to look great as a 3D character, while keeping in mind not to stray far from the original or the anime. The director conveyed his image of the character, with its enigmatic aura, to a Houdini artist and they designed it together.

How did you control and animate his FX simulations?
We used the wireless MoCap to capture the performer on set and used the data for animation. The particles were then driven by that animation and controlled in Houdini. We tested many different versions first, so once the look was locked, production went smoothly.

The Colonel can control fire. How did you create this beautiful FX element?
We created most of the fire effects in FumeFX rather than Houdini, and enhanced them with practical fire elements. The creation of the fire took some time, but the 3D tracking of the performer was even more challenging. We polished the shots in comp, and I think the fire looks real.

How did you manage the art direction for the fire?
The process of creating effects is always the same: you create it and test it in the shot; if it doesn’t work, you try again. The fire didn’t look quite real at first, and we couldn’t get it approved by the director for a long time. It took a lot of trial and error to get it right.

Can you explain in detail your work on the Homunculi?
For Lust, we created her lower arms and hands in CG so we could control her claws. We did a digital scan of the actress to build the 3D models of her arms and hands, which gave us the flexibility we needed to alter the claws. As for Gluttony, we altered the silhouette of his shoulders and body in 2D, since the actor wasn’t as big as Gluttony.

Did you receive specific directions for the Homunculus Envy?
For the transformation of Envy, we created tornado effects to match the look of the Transmutations, which we tried to keep consistent throughout the film. You can see that the Transmutations of a dummy horse, a pike and a tower all share that unified look.

The final sequence involves a terrible army. How did you create them?
For this film, all the non-human characters were modeled in 3D, and the one-eyed doll soldier was one of them. The very first one we modeled looked too scary, so we dialed it back a little and made it less frightening. We also did a muscle simulation, but the motion was too slow for it to be noticeable.

How did you handle the crowd animation?
We used Miarmy for some of the crowd scenes. We also used MoCap, but most of it was done by hand by the animators.

What is your favorite shot or sequence?
My favorite is the brothers’ fight scene. Alphonse looks as if he is a real person, not a CG character. His acting is very expressive even though he has no facial expressions. It is a beautiful scene, where the director’s vision, the actors’ performances and the CG techniques all come together.

What is your best memory on this show?
The plate shooting we did in Volterra, Italy. The whole town was very supportive of us, and we were deeply grateful to them. We thoroughly scanned the main streets and the plaza using a drone, then created a virtual town from the data we obtained there and used it extensively for the VFX shots. It wouldn’t have been possible without the cooperation of the people of Volterra.

How long have you worked on this show?
Six months for preproduction, three months for shooting, and about a year for postproduction.

What’s the VFX shots count?
About 1000.

What was the size of your team?
The main team was made up of 30 artists from Oxybot, and we worked with seven other VFX vendors. But about 90 percent of the VFX shots were done in-house.

What is your next project?
I’m currently working on a small Japanese film project and on some television series. I’m also working on a film project on a similar scale to FULLMETAL ALCHEMIST, but I’m not able to discuss it yet. If you are interested in Japanese VFX work, please check our website.

What are the four movies that gave you the passion for cinema?

A big thanks for your time.


The movie is streaming on Netflix here.


Oxybot: Official website of Oxybot.

© Vincent Frei – The Art of VFX – 2018

