VFX and animation now have more and more in common, not only in tools and skill-sets but also in methods. For Jordi Bares Dominguez, the rendering phase as it stands today needs to be completely rethought. Buf produced no fewer than 400 shots of snow under extreme conditions. Framestore, for its part, concentrated on simulating an arterial lava flow for Iron Man 3, while MPC detailed the effects of Prometheus, combining keyframe animation and motion capture to striking effect.
Moving Picture Company
Involved in the visual effects industry for the past 12 years, Ferran Domenech has worked for MPC on films such as Tomb Raider, Harry Potter and the Deathly Hallows, Ridley Scott's Prometheus and The Lone Ranger by Gore Verbinski.
Deputy Creative Director
A Director of 3D Creation and Visual Effects Supervisor, Jordi Bares Dominguez is responsible for innovation and creative excellence at Realise Studio, which specialises in post-production. His work has received many prizes, including BTAA Craft Awards, multiple Clio Awards, D&AD Yellow Pencils, and various accolades at the Australian Effects and Animation Festival, Australia's Young Guns and the London International Awards, for work such as the launch commercial for the Sony PlayStation 2, Mountain, and The Quest, promoting Tooheys' beer.
Isabelle Perin-Leduc is Head of Special Effects at Buf Compagnie. She has worked on many adverts as well as on such films as Alexander, Arthur and the Invisibles, Be Kind Rewind, Mr. Magorium's Wonder Emporium and The Darkest Hour.
A graduate of Toulouse Fine Arts school, Yann de Cadoudal began his career with Buf Compagnie as a CG Artist in 2002, and later worked as a Sequence Supervisor on films such as Alexander, Babylon A.D., Enter the Void and Thor. He was also Lighting Supervisor as well as On-Set Supervisor for films such as Babylon A.D., The Scapegoat and The Grandmaster.
CG FX Supervisor
Alexis Wajsbrot joined Framestore in 2009 as Lead FX Technical Director for Mike Newell's Prince of Persia: The Sands of Time. He has recently supervised the special effects on Gravity by Alfonso Cuarón and the CG imagery on Shane Black's Iron Man 3.
He also works on his own projects, such as the award-winning Red Balloon.
Thierry Barbier is an executive producer and expert in special formats (including large formats), 3D and augmented reality.
He is a founding partner of AmaK, a design and production studio devoted to the promotion of artistic and cultural content on digital supports. AmaK advises, designs and produces films as well as interactive, immersive environments.
Key words: Realise Studio, Bares, rendering, library, Renderman, Mental Ray, BRDF, Buf, Wong Kar-wai, Grandmaster, snow, Framestore, Iron Man, Extremis, Wajsbrot, lava, Prometheus, MPC, previsualization.
This now-traditional VFX conference concentrates on some of the finest work of the past year and examines the state of the art, both technical and artistic, as a source of inspiration.
"The work I'm presenting here today is based on four years of frustration in the field of rendering," says Jordi Bares Dominguez of Realise Studio by way of introduction. Having noted that nearly every camera today offers such classical settings as sensitivity, aperture and shutter speed, the creative director wondered why the same parameters could not be accessed for computer-generated images: "The answer is obviously cost-linked."
At present, the rendering market is split between two major tools: Renderman, on the one hand, used by the big studios and built around a "bake and use" approach that notably requires texture work upstream; and Mental Ray, on the other, which is easier to use and better suited to smaller studios. Jordi Bares Dominguez notes how chaotic the rendering phase remains within the production process.
He then enumerates many problem points which may explain this situation. "For example, preproduction is separated from production, so it's not possible to integrate – upstream – certain constraints which occur later on down the line. The visual development side is often disconnected from the production context. In addition, production processes suffer from numerous repeats, with many variables making the process less efficient, including, among others, divergence on the pertinence of this or that solution. Everyone holds forth with his/her own idea. The outcome? Creativity is caught in a stranglehold, and more production problems ensue."
In his view, too many rendering approaches end up killing the render: "real, photo-realistic, hyper-realistic, pseudo-realistic, etc. Why not simplify?"
Returning to the camera as a variable in the rendering phase, he explains that "it's possible to do without it, since the only thing needed is to pick up the data the DOP has shot on the backgrounds or set. As for the lighting, it always follows the same natural principles and is therefore very easy to compute. We can capture data using an HDR ball, the data can be tweaked… in short, this variable could also be done away with." It is often said that materials themselves present substantial difficulties for rendering. Jordi Bares Dominguez retorts that "you just need to work from the periodic table, which lists the elements and all their types of physical and chemical behavior. It's essential for the behavior (of materials) to be based on real, standardized measurements, and not on opinions."
The BRDF, for Bidirectional Reflectance Distribution Function, has been used for many years (but not known under this term) by painters or photographers interested in re-transcribing what they observed under various angles of light and atmospheric conditions. Today BRDF is everywhere, modifying perceptions of the "real", depending on where the person (or camera) is located, the position of the sun (or light sources), especially with scattering variables.
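As a minimal illustration of the idea (not any production shader), the simplest BRDF is the Lambertian model, a constant albedo/π; even this constant BRDF makes the outgoing energy depend on the light's angle of incidence, which is why the same surface point reads differently as the sun moves:

```python
import math

def lambertian_brdf(albedo):
    """Constant BRDF of an ideal diffuse surface: albedo / pi."""
    return albedo / math.pi

def reflected_radiance(albedo, light_intensity, cos_theta_i):
    """Radiance leaving the surface for one light direction:
    L_o = f_r * L_i * cos(theta_i), clamped to front-facing light."""
    return lambertian_brdf(albedo) * light_intensity * max(cos_theta_i, 0.0)

# The same point looks different as the light source moves:
# grazing light (cosine near 0) returns far less energy than overhead light.
overhead = reflected_radiance(0.8, 10.0, 1.0)
grazing = reflected_radiance(0.8, 10.0, 0.1)
```

Measured BRDFs go further, varying with both the incoming and outgoing directions, which is what the scattering variables mentioned above capture.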
Given this fact of nature, Jordi Bares Dominguez then questions the choice between Renderman and Mental Ray: "All said and done, doesn't this just complicate something which ought to be more objective?" His suggestion for a more efficient rendering phase is to establish a number of libraries: market-available cameras (equipped with all the abovementioned parameters), including optical simulations; the possible types of light; and the materials existing in nature, with their physical and chemical properties.
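A sketch of what one entry in such a material library might look like, in Python; the structure and the albedo values are our illustration, not Bares Dominguez's actual proposal (the refractive indices are commonly published measurements):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Material:
    """One entry in a hypothetical physically based material library:
    behaviour comes from measured constants, not artist opinion."""
    name: str
    ior: float        # index of refraction (a measured quantity)
    albedo: float     # diffuse reflectivity in [0, 1] (illustrative here)
    metallic: bool

# A few entries; the IOR values are standard published measurements.
LIBRARY = {
    "water": Material("water", ior=1.33, albedo=0.02, metallic=False),
    "glass": Material("glass", ior=1.52, albedo=0.04, metallic=False),
    "gold":  Material("gold",  ior=0.47, albedo=0.85, metallic=True),
}

def lookup(name):
    """Shaders would read standardized constants instead of hand-tuned sliders."""
    return LIBRARY[name]
```

The point of such a table is that two studios querying "glass" get the same physical behaviour, removing the divergence of opinion he describes.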
The creative director of Realise Studio is fully aware that this type of approach will not necessarily be adopted unanimously or immediately. However, he remains hopeful that, if adapted to today's existing workflows, "this one decision can have repercussions along the entire line."
The Grandmaster (Jet Tone Production) by Wong Kar-wai is a 2013 feature film on the beginnings of the golden age of Chinese Kung-fu in the 1930s. In 2004 Buf had already worked on the director's film entitled 2046.
For The Grandmaster, Isabelle Perin-Leduc was in charge of the post-shoot VFX work, whereas Yann de Cadoudal was present during the entire film shoot: "This film required nearly three years of postproduction, beginning with a test on one snow sequence. At the time the idea was to turn out the film in stereoscopic 3D, meaning that all the snow of the sequence would have to be erased, replaced by a digital equivalent and, finally, a second eye would have to be created, since the camera shoot had been done in mono," Isabelle Perin-Leduc reminds us. Other tests to stylize the fight scenes were also done, but in both instances these were unsuccessful, and it was decided to drop the motion blur and stereoscopic effects.
Yann de Cadoudal recalls: "The shoot lasted two years and required constant presence, since the film kept changing. In addition, Wong Kar-wai is someone very precise who can modify any detail that does not suit him, and the scene must then be redone." Much of the action takes place during a snowfall, but although the shoot began under perfect weather conditions, its duration meant that artificial snow had to be added to the set. "Here again, the result was not convincing, and we had to redo some backgrounds, erasing the artificial snow to recompose an entire environment in computer-generated imagery. For simple shots, however, only color grading was needed."
Out of the 512 effects shots that Buf handled, 400 included snowfall. "We implemented falling-snowflake simulations, but with no interaction with the actors," Isabelle Perin-Leduc mentions. "Next, we shot at different speeds – 24, 48, 72 and 96 frames per second – and for each speed, a set of references was established. This allowed us to reuse these elements both as a basis and as a production element. The phase of interaction with the actors and their movements also required designing a system with baseline adjustments for each snowfall speed, while integrating parameters of density, momentum and interaction modes: the actors' movements, particularly in the fight scenes, affected the erratic drift of the falling flakes."
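The per-speed baselines Buf describes can be pictured with a toy particle loop; everything below (names, parameter values) is our assumption, not Buf's pipeline, but it shows why each shooting speed needs its own reference: the per-frame step shrinks as the frame rate rises.

```python
import random

def simulate_snowfall(num_flakes, frames, fps, base_fall_speed=1.5,
                      turbulence=0.3, seed=0):
    """Advance 2D snowflake positions frame by frame.  Shooting at
    48/72/96 fps shrinks the per-frame time step, so every speed
    needs its own baseline adjustment."""
    rng = random.Random(seed)
    dt = 1.0 / fps  # per-frame time step, set by the shooting speed
    flakes = [[rng.uniform(0, 100), rng.uniform(50, 100)]
              for _ in range(num_flakes)]
    for _ in range(frames):
        for f in flakes:
            f[0] += rng.uniform(-turbulence, turbulence) * dt  # erratic drift
            f[1] -= base_fall_speed * dt                       # steady fall
    return flakes
```

Interaction with the actors would then be layered on top, perturbing the drift term locally around each fight move.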
The second major portion of VFX work concerned the fight sequence which takes place in a railroad station, with a train in the background. The latter was filmed from every possible angle so that it could later be 3D modeled. "We had both the set with the train," Yann de Cadoudal recalls, "and a set with a green screen. On the latter, we picked up all the DOP's camera parameters again but with a tiny bit wider focal length and slightly less out of focus, to get the light right and be as realistic as possible." The train was then mapped on the 3D model that Buf had designed.
Meticulous in the extreme, Wong Kar-wai noticed that the lit windows of the train passing in front of the two adversaries gave off an intermittent light that the green-screen shots did not provide: "We therefore integrated light flashes at the same pace as the train, for consistency between the shots. Unfortunately, we noticed that the light in the green-screen studio shots was stronger than that of the location shoot. Consequently, we were asked to recreate the light of the set coming from the train windows and falling on the actors so that it would all fit together…"
Besides working on these two important areas, Buf also handled the fur effects on the actors' hats, and any stuffing coming out of the slashed clothing.
Even though all the shots of the film were effects-processed between February and May 2011, "six months later, the director wasn't happy with the train sequence and we had to take it up again. Several months after that, he decided it was the train itself he didn't care for, so we had to rework this as well. But since the editing had changed in the interim, the long and short of it was that several other shots also had to be redone," Isabelle Perin-Leduc relates.
In all, 63 graphic specialists worked on this movie for the full three years of postproduction.
Don't all of these back-and-forth efforts and shot modifications necessarily have an impact on the cost of your work? How did you manage this?
It's true that we redid many elements, some of them very late on, and that comes at a definite cost. Yes, the service-provision contract was renegotiated. But what counted first and foremost was the artistic worth of creating these VFX.
The British studio Framestore handled several types of visual effects for the Iron Man 3 movie, notably the effects of the Extremis drug which gives the starring bad guys their super powers. "Mainly we needed to model the anatomy of the characters taking the drug, to create a skeleton but also muscles, veins and capillaries. Next, we created the rigging of the entire anatomy, including the venous system, then we tracked the actors in filmed shots before adding the lighting and effects," explains Alexis Wajsbrot, special effects and CGI supervisor.
For this production, Framestore hired 97 graphic artists for five months, split between ten departments: tracking, rigging, modeling, animation, FX, shaders, lighting, editorial, compositing and production. Roughly 178 shots were VFX-altered, of which 113 made it into the final cut, with one shot alone requiring three months of work and 231 dailies. Twenty different types of software were used, including Maya, Arnold, Houdini, Nuke, 3D Equalizer, ZBrush and Photoshop. The studio also relied on proprietary tools, not only a volumetric mesh deformer but others such as fRibgen, Flush, Baselight, etc.
What are the effects of the Extremis drug? It gives superhuman strength to those who take it, along with a self-regenerating power. To depict this energy, the graphic choice was to show a flow of energy speeding through the arteries, visible under the skin and thereby revealing the character's full anatomy. Since the drug burns, the skin heats, cracks and steams. "We had three main challenges: establishing the visual look of Extremis, which meant a visible, and therefore realistic, anatomy, while portraying this lava-like energy flow. This flow also had to appear to run deep within the body, not just skin-deep. Last of all, we had to be able to reproduce Extremis' effects on four different characters, in seven sequences."
For the modeling portion, a scan was done of the actors concerned and this was then conformed, to integrate the studio pipeline. For each actor, the anatomy, arteries, lymph system, muscles, respiratory tract, skeleton, veins and skin were completely modeled, with light "blockers" at each step. "The energy flow courses through the veins. For this, we modeled specific arteries which would then enable certain areas of the body to heal, through the drug's self-regenerating power. We agreed that the energy flow would be guided by the UVs of the artery meshes."
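Guiding the flow by the artery meshes' UVs can be pictured as mapping a travelling front onto the V coordinate running along each vessel; the sketch below is a hypothetical illustration of that idea (the falloff width and function names are ours, not Framestore's tools):

```python
def energy_front(v_coords, front_pos):
    """Given per-vertex V coordinates along an artery mesh (0 at the
    entry point, 1 at the far end), return per-vertex glow intensity
    for a front that has travelled to front_pos along the vessel."""
    width = 0.1  # soft leading edge of the lava-like flow (assumed value)
    out = []
    for v in v_coords:
        if v <= front_pos - width:
            out.append(1.0)                      # already filled: full glow
        elif v <= front_pos:
            out.append((front_pos - v) / width)  # fading leading edge
        else:
            out.append(0.0)                      # flow has not arrived yet
    return out
```

Tying regeneration to the same parameter is then straightforward: a body zone starts healing once the front has passed its V value.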
The rig of each scanned actor contained not only the full anatomy but also the skin. Since some parts of the anatomy had to be deformed, "we relied on our in-house volumetric mesh deformer," Alexis Wajsbrot continues. The memory-heavy models easily exceeded 2 gigabytes, beyond the limits Maya sets for this type of file.
Tracking was a crucial point of the production. The environment of each shot was modeled, then the Framestore team did camera tracking, with enough reference cameras for the actors' positions to be triangulated. As for the 3D Equalizer software, it was used for the body tracking and facial deformations. "We used the same technique as for Red Skull's face in the Captain America movie; in other words we tracked the markers in 2D then re-projected onto a base mask to be able to deform the skin at will."
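Triangulating an actor's position from several reference cameras amounts to finding the point of closest approach of the viewing rays; a minimal two-ray sketch (a textbook construction, not MPC's actual solver):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def add(a, b, s=1.0):
    return tuple(x + s * y for x, y in zip(a, b))

def triangulate(p1, d1, p2, d2):
    """Point closest to two viewing rays (camera position p, direction d).
    With several reference cameras tracking the same marker, the midpoint
    of the rays' closest approach recovers its 3D position."""
    w0 = add(p1, p2, -1.0)                  # offset between camera centers
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b                   # zero only for parallel rays
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    q1, q2 = add(p1, d1, t1), add(p2, d2, t2)
    return tuple((x + y) / 2 for x, y in zip(q1, q2))
```

Noisy real-world tracks make the rays skew rather than intersecting exactly, which is why the midpoint (or a least-squares fit over all cameras) is used.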
For the FX and lighting steps, "We needed to create this arterial energy flow which simultaneously lit up the bodies from the inside. Overall, we chose two types of flow: one more in depth, the other closer to the skin surface, with both done through a set-up in Houdini. In addition, we added a little steam and regeneration effects. For this last point, the idea was that the flow to a given zone would bring about the regeneration. The TD team (technical directors), all Maya FX specialists, used fluid simulations in 2D to show the different phases of damage to the skin."
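A 2D simulation driving skin damage can be pictured, in highly simplified form, as diffusion on a damage map: heat deposited where the flow passes spreads to neighbouring texels. This is an illustration of the principle only, not the TDs' actual Maya fluid setup:

```python
def diffuse(grid, rate=0.2, steps=1):
    """Explicit 2D diffusion on a scalar 'damage' map.  Values spread
    from hot texels to their four neighbours; thresholds on the result
    could then select the cracked / steaming damage phases."""
    h, w = len(grid), len(grid[0])
    for _ in range(steps):
        nxt = [row[:] for row in grid]
        for i in range(1, h - 1):
            for j in range(1, w - 1):
                lap = (grid[i - 1][j] + grid[i + 1][j] +
                       grid[i][j - 1] + grid[i][j + 1] - 4 * grid[i][j])
                nxt[i][j] = grid[i][j] + rate * lap  # spread toward neighbours
        grid = nxt
    return grid
```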
The lighting team had to juggle with these very unusual shots to avoid overdoing it, without neglecting the physical reality of the burns. Globally, "we carried out four types of render to depict the four major phases of damage."
Ridley Scott contacted MPC (The Moving Picture Company) to produce 420 visual effects shots for his film Prometheus, including: the hammer-headed snake alien (the Hammerpede), which coils around a crew member's arm before nestling inside his body; the spacecraft with its earthling crew – the USCSS Prometheus – landing on the planet of the Aliens; and, last of all, the collision between the Juggernaut and the Prometheus. The previsualization was done at MPC, following the director's many and often very precise drawings.
For the sequence with the Hammerpede, "the team on the shoot worked with an animatronic for enhanced interaction between the set and the actors," Ferran Domenech, animation supervisor, recalls. As always, the development involved extensive research work on forms (worms, snakes, etc.) but also on textures (chickens, jellyfish, etc.). "For the rigging, you had to think through it, anticipate its movements, see how it could open and shape itself into that sort of hammer that makes it look so threatening."
When the Hammerpede winds around the actor's arm, it presses so fast and hard that the arm breaks: "We developed a muscle system, with deformers, so we could simulate this pressure. A subsurface scattering pass provided realistic lighting to highlight the organic feel of the creature. Next, we modeled and keyframe-animated the point in time where it spirals inside of the helmet and slides into the mouth. At the start, this movement looked unnatural and we had to adjust it to fit the curvature of the glass. Likewise, to add an even more organic twist, we added layers of fluids which stick to the visor, with certain density parameters for the final result."
While the Prometheus is mostly digital, the backgrounds team built one of its four landing struts to judge scale. All the rest sprang from the imaginations of the artists working under the director. Conversely, the Juggernaut had a solid basis, since its design dated back to 1978, when it was created by Hans Ruedi Giger for the first Alien. "Unlike the Prometheus, the Juggernaut's rigging was simple, as there were no moving parts," since this spaceship was essentially an enormous open ring.
For the sequence showing the crash between the two vessels, "we used a previsualization step, especially to determine the running speed of the two actresses dashing across the planet surface as the Juggernaut rolls up behind them," like a runaway tire. Working with seven animators for two weeks, MPC also produced a postvis after doing the FX, in order to see how to combine the two – actresses and FX – as best possible. "All of the effects were created with Maya Particle."
Last of all, digital stand-ins were created in two steps: keyframe animation and motion capture at MPC's London studio.
Drafted by Stéphane Malagnac, Prop'Ose, France
Translated by Sheila Adrian
The Annecy 2013 Conferences Summaries are produced with the support of:
under the editorial direction of René Broca and Christian Jacquemart