The Avengers – Hulk


Marvel’s The Avengers was a highly anticipated blockbuster, and it was no doubt a big hit in cinemas when it was released last year. One of the main characters in the movie is the Hulk, played by Mark Ruffalo. In the movie, he turns from a normal human into a buff, green, shirtless killing machine. Industrial Light & Magic (ILM) was responsible for this CGI, and they did such a great job that they were recently nominated for an Academy Award for Best Visual Effects.


Here’s kinda how they did the digital double. ILM used motion capture to catch the emotions Mark Ruffalo portrayed on screen. Every bit of Hulk stems directly from Mark, from the pores on his skin, to the grey hair at his temples, right down to using a dental mold of Mark’s teeth as a basis for Hulk’s teeth. Their strategy was to work out the rendering and texture issues on the Banner digital double (Banner is the human form the Hulk turns back into) until it looked indistinguishable from Mark Ruffalo.

The realism of this digital double is fucking awesome!




As Banner and Hulk share the same topology, ILM was able to transfer textures, material settings, and the facial animation library between them. This gave them a decent base to start from, but with the characters’ significantly different proportions, there was a lot of retargeting work to be done. They tried to be economical with their poly counts, but with Hulk they made a conscious decision that his mesh was going to be extremely dense. By working like this, they never came up short on resolution for all of the close-ups and the detailed shape work required to represent the anatomy under Hulk’s skin. They then invested in a robust multi-resolution pipeline so that the model stayed manageable for the artists to work with.
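To give a feel for what a multi-resolution pipeline buys you, here’s a toy sketch of the core idea: artists work on a light base cage, and a subdivision level is chosen per shot so close-ups get the full-density mesh while wide shots stay cheap. All the function names and numbers here are my own illustration, not ILM’s actual tools.

```python
# Hypothetical sketch of a multi-resolution mesh pipeline: pick a
# subdivision level per shot based on how close the camera is, so
# close-up anatomy work gets the dense mesh and wide shots stay light.

def pick_subdiv_level(camera_distance_m, max_level=4):
    """Closer camera -> higher subdivision level (denser mesh)."""
    if camera_distance_m < 1.0:       # extreme close-up: full detail
        return max_level
    if camera_distance_m < 5.0:
        return max_level - 1
    if camera_distance_m < 20.0:
        return 2
    return 1                          # wide shot: light proxy is enough

def face_count(base_faces, level):
    """Catmull-Clark-style subdivision quadruples quad count per level."""
    return base_faces * 4 ** level

base = 25_000                         # made-up base cage resolution
for dist in (0.5, 3.0, 50.0):
    lvl = pick_subdiv_level(dist)
    print(f"{dist} m away -> level {lvl}, {face_count(base, lvl):,} faces")
```

The point of the pipeline is that nobody animates the millions-of-faces mesh directly; the dense levels are derived from the same manageable base the artists actually touch.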



Here’s an interesting behind-the-scenes video!


Terminator 2: Judgment Day


Canadian director James Cameron, well known for his use of cutting-edge visual effects technology, directed The Terminator (1984), his first groundbreaking sci-fi blockbuster. He pushed the boundaries of special effects with it, at a time when Hollywood was experimenting with new kinds of visual effects through films that fused the genres of science fiction and horror.

Seven years later, Cameron came back to direct Terminator 2: Judgment Day, which returned even bigger than before in terms of CG. It was the first film to feature a computer-generated main character. The VFX in the film were completely top notch for that period. Not only was there a CGI Terminator, it also morphed and regenerated body parts, and on top of that it could turn into a mercury-like liquid metal that seeped through little cracks. The movie paved the way for all the VFX-laden movies that followed.

Most of the effects were provided by ILM, and the creation of the visual effects took 35 people altogether, including animators, computer scientists, technicians, and artists. It took ten months to produce, for a total of 25 man-years. And despite the large amount of time spent, the CGI sequences totalled only about five minutes on screen. All this work was worth it, though: the visual effects team won the 1992 Academy Award for Best Visual Effects.


For the scene featuring Sarah Connor’s nuclear nightmare, the people from 4-Ward Productions constructed a cityscape of Los Angeles using large-scale miniature buildings and realistic roads and vehicles. After studying actual footage of nuclear tests, they simulated the nuclear blast by using air mortars to knock over the cityscape, including the intricately built buildings. 4-Ward also created a large layered painting of the city, augmented with a radiating blast dome and disintegrating buildings created with an Apple Macintosh program called Electric Image. They contributed a number of shots showing molten steel spilling out of a trough onto the floor, and used real mercury directed with blow dryers to create the eerie shots of the shattered T-1000 pieces melting into droplets and running back together.

Pirates of the Caribbean: Dead Man’s Chest


Davy Jones stars as the antagonist in the second installment of the Pirates series. He is completely CGI, and everything about him is so believable it’s crazy! Of course, the team responsible for this had to be none other than Industrial Light & Magic.

The production shot real actors on set and digitally replaced them. To do this, each actor was scanned and modelled, and they wore motion capture suits that enabled them to be replaced in post production. ILM couldn’t rely on traditional MoCap or hand animation, as traditional MoCap has multiple issues: it has to be done in special studios with multiple cameras, the cameras and tracking markers are expensive specialized equipment that only works in a calibrated environment, and the data stream contains both noise and errors that need tremendous cleanup. The whole process is complex to set up, expensive, and highly specialized, so it wasn’t used. Instead, ILM created an innovative new system called Imocap that allowed on-set and on-location motion capture, to elicit the most believable look and performance possible out of actor Bill Nighy.

He wore a pair of gray ‘pajamas’ with reference dots placed around the suit and his face, and his performance was captured entirely on set as he interacted with other actors. This improved the performances of the other actors, as they had someone ‘real’ to interact with, and it also gave the animators a highly detailed reference.


Being ILM, they made a breakthrough with Imocap: they only had to film with a single on-set film camera instead of the multiple cameras normal MoCap requires. A single camera removes many of the restrictions of the traditional motion capture process, which is what let the capture happen on set. The approach is to model the actor’s range of motion, then use an elaborate system to fit the range of motions the actor could possibly make to the data from the single camera source.

Besides Imocap, the other big challenge ILM faced with the character of Davy Jones was his 46 flopping tentacles. ILM wanted the tentacles’ curling and movement to reflect Davy Jones’ mood rather than just bob around lifelessly, but they didn’t want an animator to have to manually manipulate each and every one frame by frame. To solve this, their programmers added a sort of inter-tentacle motor to move them around automatically. Mathematical expressions and/or keyframed motion fed to motors in the joints between the cylinders making up the 46 tentacles caused them to bend, curl, writhe, and perform in lifelike ways, while “stiction” kept the tentacles from sliding.
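To make the “expressions driving joint motors” idea concrete, here’s a toy version of it: each tentacle is a chain of joints, and a per-joint expression (here a travelling sine wave scaled by a “mood” parameter) sets the bend angle, so curl propagates down the tentacle without anyone keying every joint. The function, the wave, and the mood scaling are all my own made-up illustration of the concept, not ILM’s rig.

```python
import math

# Toy "joint motor" expression: a travelling sine wave, phase-offset per
# joint so the curl ripples down the tentacle, scaled by a mood parameter.
def tentacle_joint_angles(n_joints, t, mood=1.0, wave_speed=2.0):
    """Return a bend angle (radians) for each joint at time t."""
    return [
        mood * 0.3 * math.sin(wave_speed * t - 0.8 * j)  # per-joint phase lag
        for j in range(n_joints)
    ]

# An angrier Davy Jones -> larger mood -> stronger curl at every joint.
calm  = tentacle_joint_angles(8, t=1.0, mood=0.5)
angry = tentacle_joint_angles(8, t=1.0, mood=2.0)
```

The appeal of this approach is exactly what the paragraph describes: an animator dials one high-level control (mood) and 46 tentacles respond, instead of hand-keying hundreds of joints per frame.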


Because the computer knows what the actor’s limbs could do from any one frame to the next, it can ignore a lot of mathematical possibilities and narrow in on a solution. Once the solution is constrained by this virtual range of possible motion, a single camera can produce a very powerful motion capture data stream. While the motion capture system worked extremely well, the lip sync was not done this way; it was hand animated instead.
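Here’s a deliberately tiny illustration of that constraint idea: instead of trying to recover a full 3D pose from one view (which is ambiguous), you only search the angles the limb can physically reach and pick the one whose projection best matches the tracked on-screen dot. This is a one-joint schematic of the principle, not ILM’s actual solver; every name and number is assumed.

```python
import math

ELBOW_RANGE = (0.0, 2.6)   # radians; an elbow can't hyperextend backwards

def project(angle, limb_len=1.0):
    """Trivial orthographic 'camera': x-coordinate of the hand."""
    return limb_len * math.cos(angle)

def fit_elbow(observed_x, steps=1000):
    """Search only the physically reachable angles for the best match."""
    lo, hi = ELBOW_RANGE
    best, best_err = lo, float("inf")
    for i in range(steps + 1):
        a = lo + (hi - lo) * i / steps
        err = abs(project(a) - observed_x)
        if err < best_err:
            best, best_err = a, err
    return best

print(fit_elbow(0.0))  # hand at x=0 implies the elbow is near 90 degrees
```

With more joints the ambiguity explodes, which is why restricting the search to the actor’s measured range of motion is what makes a single-camera solve tractable at all.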

For the tentacles, an articulated rigid body dynamics engine was utilized to achieve the desired look. Each tentacle was built as a chain of rigid bodies, with articulated point joints serving as the connections between them. This simulation was performed independently of all other simulations, and the results were placed back on an animation rig that would eventually drive a separate flesh simulation.
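The chain-of-rigid-bodies setup can be sketched with simple forward kinematics: each segment is a rigid link attached to its parent at a point joint, and given per-joint bend angles (whether from simulation or keyframes) you recover the positions that would get baked back onto the rig. This 2D sketch is purely schematic; segment lengths and angles are invented.

```python
import math

def chain_positions(joint_angles, seg_len=0.2):
    """Forward kinematics for a 2D chain of rigid links with point joints."""
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for a in joint_angles:
        heading += a                    # point joint: rotate relative to parent
        x += seg_len * math.cos(heading)
        y += seg_len * math.sin(heading)
        points.append((x, y))
    return points

straight = chain_positions([0.0] * 5)   # no bend: tentacle lies flat
curled   = chain_positions([0.5] * 5)   # uniform bend: tip curls inward
```

Because the joints only rotate and the links never stretch, the segment lengths are preserved no matter what the simulation feeds in, which is the property that makes the results safe to transfer onto the animation rig.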



I recently re-watched Battleship, and what I thought was pretty cool about the film was the rigid body and water simulation! There’s a scene where the aliens come out of the water and destroy everything in their path, and that looked pretty cool; I know it takes a shit load of hard work to do. Of course, the awesome people behind the VFX of Battleship were none other than Industrial Light & Magic (ILM). There were over a thousand VFX shots to be done, and a good part of them involved some sort of water simulation.

Three years before Battleship’s release, ILM had already started discussing the water sims pipeline. They had a well-developed water pipeline which had been used on Poseidon as well as Pirates of the Caribbean. However, they had to step it up a notch and reinvent their water sims due to time constraints. Thus, ILM started what they would internally call the Battleship Water Project. Together with their R&D team, they came up with a new water pipeline and advanced tools to improve their workflow. The following picture shows the layer breakdown.

It would probably bore you if I explained exactly how they went about building the water sims pipeline, so I shan’t go into too much detail. Basically, their problem was that since the setting was the open sea, they needed a large-scale water simulation using a level-set, particle-based process, so they broke everything down into grids and optimized them. However, when the millions of cells of the alien ship interacted and collided with the water geometry, a lot of fine detail in the complex water structures was lost from the simulation, because each grid cell covered perhaps a two-foot square of real-world water.

To counter this problem, ILM added a FLIP/PIC-style solver on top for particle-based simulation, which allowed a finer, more detailed solution and let the traditional approach work across wider scales. Each of these particle groups would then have a grid placed around it. Developed by ILM themselves, these secondary grids added to the particles are adaptive in size, calculated based on how close the camera is to the particle simulation. With the particle secondary solution, the imagery could be resolved down to pixel resolution.
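The camera-adaptive sizing is the interesting trick here, so here’s a toy version of just that piece: the base ocean sim keeps a coarse fixed cell size, and each particle cluster gets a refinement grid whose cell size shrinks toward the footprint of a pixel as the camera gets closer. All the constants and the formula are illustrative assumptions, not ILM’s numbers.

```python
BASE_CELL_M = 0.6          # coarse open-ocean grid cell, roughly two feet

def secondary_cell_size(camera_distance_m, pixel_footprint=0.001,
                        max_refine=64):
    """Shrink the secondary grid cell toward what a pixel covers at this
    distance, clamped between the base cell and a max refinement factor."""
    target = max(pixel_footprint * camera_distance_m,
                 BASE_CELL_M / max_refine)
    return min(BASE_CELL_M, target)

for d in (2.0, 50.0, 2000.0):
    print(f"camera {d} m away -> cell {secondary_cell_size(d):.4f} m")
```

The nice property is that far from the camera the secondary grid simply collapses back to the base resolution, so you only pay for fine detail where the audience can actually see it.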

Check out the rendered product below!