Avengers: Infinity War has just hit Blu-ray and DVD, which means it’s time to sit down and spend multiple hours watching your favorite Marvel movie heroes get beaten into bloody pulps by Thanos all over again. At least they get pulverized beautifully, thanks to the work of Weta Digital—and we sat down with Visual Effects Supervisor Matt Aitken to learn more. We’ve also got an exclusive look at the team’s fantastic before-and-after work.
Weta Digital was one of several visual effects studios that brought the epic action and grand scope of Infinity War to life. The studio focused primarily on the scenes set on Titan—the scrap between Thanos, the Guardians of the Galaxy, Spider-Man, Doctor Strange, and Iron Man—which meant not just creating (and destroying various bits of) the planet’s vistas, but also bringing Thanos himself to life, animating elements like the additional metallic legs of the Iron Spider costume, and realizing effects like Doctor Strange’s mystic arts. And, of course, the most important piece of CGI magic of all: the brutal effect of our beloved heroes being turned to dust by the power of the Infinity Gauntlet.
To get just a sample of some of the work Weta did on the film, check out this exclusive reel of before-and-after shots highlighting the studio’s work, as well as our full interview conducted via email with Aitken below.
io9: Tell us a little about the elements of the film the Weta team worked on.
Matt Aitken: Weta Digital was responsible for the third-act battle sequences on Planet Titan: the Q-Ship crash; Tony Stark, Dr. Strange, Spidey, Quill and company planning the attack on Thanos; the extended fight with Thanos on Titan; and the deaths of most of the heroes on Titan once Thanos gets the last Infinity Stone.
io9: Multiple FX studios worked on Infinity War. What was that collaborative experience like, breaking all the required shots down not just among Weta, but across multiple teams?
Aitken: In one sense our work was reasonably self-contained, in that we didn’t share shots with other visual effects facilities and we were the only facility working on the Titan environment. But there were opportunities to collaborate with other facilities on the characters of Thanos, Iron Man and Spider-Man, which were shared assets. Infinity War shots with Thanos were handled either by us or by Digital Domain. We each developed our own versions of Thanos in parallel, working to designs created by the Marvel Look Dev department. Dan Deleeuw, the Overall Visual Effects Supervisor for Marvel Studios on the show, made sure that our Thanos and Digital Domain’s Thanos were kept in line with each other. Early on this included Dan sending examples of our Thanos to Digital Domain as reference and vice versa. We also each had Josh Brolin’s onset performance to base Thanos’s character on, which made a huge difference in keeping his character consistent between the two facilities.
io9: The Thanos we see in this movie has much more screentime and closeup time than he has in any previous Marvel movie. What changes had to be made to his model this time to hold up to that increased screentime? What did Weta contribute to that process, specifically?
Aitken: Thanos had made several appearances in previous MCU movies, particularly in a scene in the first Guardians of the Galaxy movie. But his previous appearances had all been quite brief. In Infinity War, Thanos is the main protagonist of the film, appearing in over 600 shots, so he underwent a re-design to make sure he would hold up to the increased level of scrutiny in this movie. This consisted of refining his form to add detail and also tweaking his appearance to make him look more like Josh Brolin in places, while keeping the key characteristics of his appearance: his purple skin and his furrowed chin. Both Weta Digital and Digital Domain were involved in this redesign exercise, with Marvel choosing the aspects of each facility’s Thanos that they liked the most; these were rolled into both versions of the character. Our work involved making the fine detail of Thanos’s face more human, keeping his overall character while making him more relatable. We were also involved in making his brow, cheeks and upper lip more like Josh Brolin’s, which helps make his facial performance more believable because the performance synchs with the shape of his face.
io9: What was the biggest technical challenge you faced bringing Thanos to life in the scenes you worked on with him?
Aitken: We made further refinements to our digital character pipeline for Thanos. A lot of this work was around making sure we could accurately re-create Josh Brolin’s performance on Thanos. We built two facial puppets: one of Josh Brolin and one of Thanos, each with a complete facial animation system. Our process involves solving the captured performance motion onto the Josh Brolin actor puppet first, which allows us to check that we are matching the live action performance accurately. Once we are happy with the facial animation on the actor puppet we migrate it across to the Thanos puppet. We carefully calibrated the two puppets to allow for a seamless migration of the actor’s performance on to the character. Getting this calibration right was challenging technically, and involved the use of forensic data to accurately capture the density of facial tissue between the skin and the skull at various points across the head. But once that calibration was complete the process of migrating the performance onto the character puppet was relatively trivial.
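For readers curious what that actor-to-character migration looks like in principle, here is a toy blendshape sketch in Python. The rig control names, shape deltas, and weights are invented for illustration; Weta’s actual facial pipeline (and its forensic tissue-density calibration) is far more sophisticated.

```python
# Toy sketch of performance retargeting between two facial rigs.
# Both rigs expose the same named blendshape controls; the captured
# performance is solved into weights on the actor rig, then those
# same weights drive the calibrated character rig.
# All names and numbers here are illustrative, not Weta's data.

ACTOR_SHAPES = {"jaw_open": [0.0, 1.0], "brow_raise": [0.5, 0.2]}
CHAR_SHAPES  = {"jaw_open": [0.0, 1.4], "brow_raise": [0.7, 0.3]}

def apply_weights(shapes, weights, neutral=0.0):
    """Linear blendshape evaluation: neutral + sum(w_i * delta_i)."""
    n = len(next(iter(shapes.values())))
    result = []
    for i in range(n):
        value = neutral
        for name, delta in shapes.items():
            value += weights.get(name, 0.0) * delta[i]
        result.append(value)
    return result

def retarget(weights):
    """Because both rigs share control names and are calibrated to
    correspond, solved actor weights migrate directly onto the
    character rig."""
    return apply_weights(CHAR_SHAPES, weights)

performance = {"jaw_open": 0.8, "brow_raise": 0.5}
actor_pose = apply_weights(ACTOR_SHAPES, performance)  # check vs. live action
char_pose  = retarget(performance)                     # the Thanos pose
```

The point of the two-puppet step is visible even in this toy: the actor pose can be checked against the live-action plate before the same weights are trusted on the character.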
io9: Tell us a little bit about the process of creating Titan’s look for the film. Were you given much leeway in creating the aesthetic of the world?
Aitken: The look of planet Titan was developed and refined as we worked on the show. The Production Designer Charlie Wood and his team produced a huge portfolio of stunning concept art for the look of Titan which was the basis for the initial look of the planet, including the live-action set. The design of the giant windmill structures that litter the planet came from this concept art, and the planet had a distinct other-worldly feel. But as we got into the specifics of the shot work the filmmakers asked us to explore making the lighting and atmospherics of Titan more earth-like, particularly in early sequences before the fight really took off. We replaced the set with our CG version of Titan in most shots so that we could re-light it to match this new lighting direction. We also enhanced the design of Titan to include more evidence of the previous civilization that had existed there; we referenced ancient Greek and Mayan ruins for inspiration.
io9: Part of what made the Titan action sequences in the film so visually interesting was the gravity affecting both the characters and debris floating around the planet. What are some of the challenges behind effectively showing that change in gravity when you’re planning FX shots out?
Aitken: We considered simulating the variable gravity on Titan early on in our work, but in the end it was all cheated via keyframe animation. This was more appropriate because as our work on the Titan sequences progressed the variable gravity was used more and more purely as a story-telling device: present when the story required it and absent otherwise. So Thanos could jump and land hard on a giant floating rock if that was what the story required. The challenge was to selectively incorporate this effect in shots without it appearing weird to the audience and taking them out of the moment; that was the fine line our animators had to tread.
io9: At one point in the Titan battle, Thanos uses the gauntlet to bring down a meteor storm on his enemies. Were there any specific challenges in managing the logistics and physics of a scene like that for you?
Aitken: The challenge in creating the sequence where Thanos destroys a moon and pulls it down on Iron Man and the Guardians was the huge scale of the event: the Russos referred to this sequence as a ‘global destruction’ event, and the atmosphere on Titan is filled with dust and debris for the rest of the movie. Our destruction pipeline is set up to facilitate animation-driven timing and overall shape of events like this, so our animators are able to define things like the speed and scale of the meteors and the speed of the shockwave that propagates when the largest meteor hits the ground. Then the FX team takes over to add simulations for the heat shield effects, fire, smoke and destruction, using the distributed sparse volumetric solver within our in-house simulation framework called Synapse. Synapse is ideal because it allows us to distribute our sims over multiple machines, which means we can preserve a very high level of local detail across this large event. This also gave us the freedom to simulate meteors together with very long smoke trails that can interact and influence each other when traveling close to each other and the camera at a very high voxel resolution.
io9: The Titan battle sequence is where the audience really gets to see the enhanced elements of Iron Man and Spider-Man’s new suits in action. Starting with Spider-Man’s mechanical legs—was any part of them practical, or did Weta create them completely digitally? How did you go about making them feel like a natural extension of Spider-Man’s actions in the film?
Aitken: The legs that are a feature of Spider-Man’s new Iron Spider suit were entirely CG. The key to making these work effectively was to design them so that they had complete freedom of movement, because the animators had to get them into many different complex poses: from the elevated stance, drawn from comic reference art, that Spidey adopts when he has caught Mantis in mid-air; to the spidery scuttling motion when Spidey uses the legs to move around quickly as he webs the unconscious Guardians to safety. So we had to rig the puppet for the legs so that they were able to articulate fully at each joint, and model the joints so that all that articulation was believable. Then our animators were able to keyframe the action of the legs so that it tied in very closely to the animation on the rest of his body.
io9: Spider-Man’s suit has a more metallic look this time around. When recreating the character digitally for action and dialogue scenes, how did you simulate the way the new suit and its mask moves around on Peter’s face and body?
Aitken: Spidey is CG from the neck down when his head is exposed, and fully CG the rest of the time. The key to getting the suit fully integrated into shots where we can see Peter Parker’s face is to get a very accurate digital copy of Peter’s body motion, a matchmove. Once we have this matchmove of Peter’s onset motion, in many cases we can simply wrap the Iron Spider suit onto this and render the result. Tom Holland wore a motion capture suit on set which got us a long way toward the finished matchmove. Rendering a reflective metallic suit can cause issues when the surrounding environment is only available as a 2D plate. But as we created the Titan environment entirely in CG in almost every shot, it was simple for our renderer Manuka to calculate the reflections off the suit by ray-tracing into the surrounding environment. In approaching the facial animation requirements of Spider-Man when he is wearing the Iron Spider mask, rather than simply sculpting jaw-open and jaw-close targets into the suit we took the approach of building a full facial animation rig for our Peter Parker digital double. We then ran that rig through the complete set of FACS targets with a panel of digital cloth stretched over the face. We derived the facial blend-shape targets for the Iron Spider mask from this cloth simulation. This extra level of detail in the suit’s facial rig meant that it was possible to lip-read Spidey’s dialogue as he was talking through the mask. The exception to this was the Iron Spider eyes, which had specific eye-shape targets sculpted based on the mechanical multi-blade iris structure of the suit’s eyes.
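The mask-target derivation Aitken describes can be sketched in miniature: run the face rig through each FACS pose, drape a cloth layer over the result, and store the draped shape as the mask’s matching target. In this toy version the cloth solve is stood in for by a constant-thickness offset, and all names and numbers are invented for illustration.

```python
# Illustrative sketch of deriving mask blendshape targets from face
# FACS targets. A real pipeline runs a cloth simulation over each
# posed face; here the "wrap" is approximated as a fixed offset
# along an implied surface normal. All data is made up for the demo.
FACE_FACS_TARGETS = {
    "jaw_open":   [0.0, 1.0, 0.3],
    "lip_pucker": [0.2, 0.1, 0.6],
}
MASK_THICKNESS = 0.05  # stand-in for the cloth layer's rest offset

def wrap_cloth(face_shape, thickness=MASK_THICKNESS):
    """Toy shrink-wrap: offset every point outward by the cloth
    thickness, so the mask shape tracks the underlying face pose."""
    return [v + thickness for v in face_shape]

# One mask target per face FACS target: this per-pose correspondence
# is what makes the masked dialogue lip-readable.
mask_targets = {name: wrap_cloth(shape)
                for name, shape in FACE_FACS_TARGETS.items()}
```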
io9: Moving on to Iron Man—his new armor uses nanotechnology to summon weapons and extra attachments out of, seemingly, thin air. What processes did you use to create the effect of Tony summoning his new weapons? Was that something Weta specifically worked on, or was it a collaborative process with other studios?
Aitken: Weta Digital did a lot of the early development work designing the look of Tony’s nano-tech particle Iron Man suit. Other facilities subsequently contributed shots that included this effect as well, and the end result was a collaboration of several facilities. Weta Digital built 12 different bleeding edge weapons to designs that we originated in-house, based on concept art that we received for a handful of them, and previs. We developed procedural modeling techniques to animate the nano-tech particles forming these weapons, and also to show the suit redistributing its nano-tech material from one part of Iron Man to another in the final stages of the conflict. A key aspect of this was showing the nano-tech material moving first as a liquid, which then forms the mechanical structures of the under-suit, and then finally showing the exterior shell of the suit forming over this framework.
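At its simplest, a particles-forming-a-shape effect like this can be thought of as each particle traveling from a scattered start position to an assigned slot on the target surface. The sketch below shows that core idea; the target shape, the random scatter, and the linear travel are all illustrative simplifications of what a procedural modeling system would actually do.

```python
import random

# Toy sketch of nano-particles "forming" a shape: each particle
# interpolates from a scattered start position toward its assigned
# slot on the target shape. The shape and motion are invented for
# illustration; a production system layers simulation on top.
random.seed(7)
TARGET = [(float(i), 0.0) for i in range(5)]  # slots on the target shape
starts = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in TARGET]

def form_shape(t):
    """t in [0, 1]: 0 = fully scattered, 1 = shape fully formed."""
    return [((1 - t) * sx + t * tx, (1 - t) * sy + t * ty)
            for (sx, sy), (tx, ty) in zip(starts, TARGET)]

scattered = form_shape(0.0)  # particles at their random starts
assembled = form_shape(1.0)  # particles locked into the shape
```

Animating `t` over time (or per-particle, with staggered offsets) gives the liquid-then-solid build-up Aitken describes, with later structural layers forming over earlier ones.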
io9: Doctor Strange’s spells play a major part in the Thanos battle on Titan. Given that his solo movie established a very unique visual style for Strange’s magical abilities, what was it like for Weta to build on that style for its own work in Infinity War?
Aitken: When Dr. Strange and Thanos square off on Titan, Strange unleashes everything he has, swapping one magical weapon for another as each fails to defeat Thanos. Some of the effects were based closely on effects that had been developed for the original Dr. Strange movie: the mandalas that Strange can conjure up with his hands and the Eldritch Whip; for these we referenced the look of the original movie closely. One aspect of this work that I always appreciated was how these magical effects were solidly grounded in the physical world: for example, the sparks spinning off the mandalas had a sense of mass, like welder’s sparks. This physicality was something we tried to carry over to the new effects we created for Infinity War. So when Strange stops Thanos closing his fist on the gauntlet by unleashing the Bands of Cyttorak, we referenced imagery of glowing forged steel to ground the bands in the real world. And later when Strange blasts Thanos with the Golden Lightning, the energy in the bands melts the rock it collides with into giant splashes of molten lava, which we created as hi-res fluid simulations.
io9: After the film’s release, the Russo Brothers mentioned that some digital shots were created specifically for use in the trailers to throw fans off before the film opened. Given that the Titan scenes get a good bit of screentime in the official trailers, did Weta create any shots specifically not meant for the movie? If so, which shots were made for trailer use, and is that something the team has had to do before as part of its movie work?
Aitken: Most of the shots that Weta Digital created for the film’s trailers were either trailer-only, or ended up being modified extensively for their subsequent use in the film. This isn’t the first time that we have encountered this. In some cases the changes come about because we are still designing the look of the environments, refining the appearance of the suits and developing the FX simulations at the time when the trailer shots need to be completed. The version of the shot that gets seen when the movie opens has had the benefit of all that extra time refining and locking in the look of the work.
io9: Did Weta have a part in creating the digital effect for the characters killed by Thanos’ snap in the final moments of the film? If so, how did the design of that effect come about, and was it a collaborative effort with other FX teams on the film?
Aitken: When Thanos snaps his fingers at the end of the film, half of all life in the universe is killed in an effect we called the ‘blip’. Early on in our involvement with the project Marvel asked us to develop a concept for what the blip might look like. Part of the brief from the Russo Brothers was that the effect had to look final, like there was no coming back from it. They also wanted it to look painful, with a component that was quite violent; they didn’t want it to be too gentle or lyrical. We initially presented a series of 2D concept frames and two of those were selected to be combined into a full CG simulation test. We chose Drax’s blip-out for the test because he doesn’t have hair; we wanted to hold off on developing the hair component of the blip until we had the rest of the effect somewhat in place. The blip effect relied on a highly detailed digital double of the character blipping out. This digital double provided a canvas for the blip to play out on and allowed for the effect to come on seamlessly: we transition to these digital doubles earlier than might be apparent, with the bodies, clothes and hair typically CG from the start of the shot. Complex growth algorithms were developed to propagate the effect across the characters in an organic way. The simulation ran in a volumetric way so that the characters didn’t appear to turn into shells at the moment that they blipped away.
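The core of a growth propagation like the one Aitken mentions can be illustrated with a breadth-first spread over mesh connectivity: seed the effect somewhere on the character and assign each vertex a “blip time” based on how far the front has traveled. The tiny adjacency graph and uniform step below are invented for the demo; a production version would weight edges by distance, add noise, and drive a volumetric sim from the resulting timing field.

```python
from collections import deque

# Toy growth propagation: starting from seed vertices, the effect
# front spreads across mesh connectivity, assigning each vertex a
# "blip time". The 5-vertex graph here is illustrative only.
ADJACENCY = {
    0: [1, 2],
    1: [0, 3],
    2: [0, 3],
    3: [1, 2, 4],
    4: [3],
}

def propagate(seeds, step=1.0):
    """Breadth-first growth: each hop away from a seed blips later."""
    times = {v: 0.0 for v in seeds}
    queue = deque(seeds)
    while queue:
        v = queue.popleft()
        for n in ADJACENCY[v]:
            if n not in times:
                times[n] = times[v] + step
                queue.append(n)
    return times

# Seed the effect at vertex 0; the disintegration sweeps outward.
blip_times = propagate(seeds=[0])
```

The timing field is what makes the effect read as organic rather than a uniform dissolve: each part of the character crosses the threshold at its own moment as the front sweeps past.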
Avengers: Infinity War is available digitally and on Blu-ray and DVD now.