Artificial images have been around almost as long as movies. As computing power has grown and digital photography has become commonplace, special effects have increasingly been created digitally, and have become much more realistic as a result.
ACM's A.M. Turing Award for 2019, presented to Patrick M. Hanrahan and Edwin E. Catmull, recognized in part their contributions to computer-generated imagery (CGI), notably at the pioneering animation company Pixar.
CGI is best known in science fiction or other fantastic settings, where audiences presumably already have suspended their disbelief. Similarly, exotic creatures can be compelling when they display even primitive human facial expressions. Increasingly, however, CGI is used to save money on time, extras, or sets in even mundane scenes for dramatic movies.
To represent principal characters, however, filmmakers must contend with our fine-tuned sensitivity to facial expressions. Falling short can leave viewers in the "uncanny valley," distracted or even repulsed by a zombie-like representation. "Trying to do realistic humans is still the most difficult aspect of visual effects," said Craig Barron, creative director at visual development and experience company Magnopus. Barron shared the 2008 Academy Award for Best Visual Effects for The Curious Case of Benjamin Button, in which the title character ages backward from an old man to an infant.
In the last decade, many films have included short flashbacks with younger versions of their characters. Within the last year, however, some films have used new techniques to create feature-length performances by convincingly "de-aged" actors. Artificial intelligence also increasingly will augment the labor-intensive effects-generation process, allowing filmmakers to tell new types of stories.
Figure. In Ang Lee's 2019 action movie Gemini Man, Will Smith (right) is confronted by a younger clone of himself (left). The filmmakers went through a complicated process to "de-age" Smith for the younger role.
"The idea of de-aging has been around for a while, and companies like Lola Visual Effects have been doing an amazing job of using 2D tools to basically track young patches onto old faces to make people look convincingly a different age," said Guy Williams of Weta Digital in Wellington, New Zealand. Williams was visual-effects supervisor for Ang Lee's 2019 action movie Gemini Man, in which Will Smith played an assassin confronted by a younger clone of himself. In that film, filmmakers took the process a step further, he said. "Instead of modifying an image to make a performance look younger, we erase the image, create a synthetic young performance, and place it into the shot."
"In many ways it felt less about a visual effect and more like we were creating a reality and aiming to work out exactly how the human face works, operates, but also how it ages over time," added Stuart Adcock, facial animation supervisor at Weta. Because of this wholesale replacement, "we were not limited to any photography," Adcock said. "We had full freedom, so when we did need to take it somewhere else, then we were able to do that."
To build its model, the team first ran Smith through various facial exercises, such as raising his upper lip. Such "action units," which have overlapping, interacting effects in various regions of the face, constitute the elements of the widely used facial action coding system (FACS), originally based on a detailed understanding of the skull and its muscles. After a year or so of tuning a "puppet" to reproduce "current-age Will," they turned their attention to his younger clone.
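A face rig driven by action units is commonly approximated as a linear blendshape model: a neutral mesh plus a weighted sum of per-unit vertex displacements. The following is a minimal illustrative sketch of that general idea, not Weta's proprietary puppet; the meshes, deltas, and unit indices here are invented toy data.

```python
import numpy as np

def blend_face(neutral, deltas, weights):
    """Linear blendshape model: neutral mesh plus a weighted sum of
    per-action-unit displacement deltas.

    neutral: (V, 3) array of vertex positions for the resting face
    deltas:  (A, V, 3) array; deltas[a] is the displacement of every
             vertex when action unit a is fully activated
    weights: (A,) activation levels, clamped to [0, 1]
    """
    weights = np.clip(weights, 0.0, 1.0)
    # Contract the action-unit axis: result is a (V, 3) deformed mesh
    return neutral + np.tensordot(weights, deltas, axes=1)

# Toy example: two action units acting on a four-vertex "face"
neutral = np.zeros((4, 3))
deltas = np.zeros((2, 4, 3))
deltas[0, 0, 1] = 1.0   # unit 0 raises vertex 0 along y (e.g. a lip raiser)
deltas[1, 3, 1] = -0.5  # unit 1 lowers vertex 3
face = blend_face(neutral, deltas, np.array([0.5, 1.0]))
```

Because action units interact in overlapping facial regions, production rigs add corrective shapes on top of this purely linear combination.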
Adcock likens this process to creating a musical score derived from a performance on one instrument. This representation allows the same musical piece to be performed on another instrument, which imparts its own characteristic sound.
To construct the "score," the team built on its previous expertise animating fantastic characters like Gollum in The Hobbit, equipping Smith with facial markers and a head-mounted camera rig that recorded his expressions. These rigs have been getting sleeker and lighter all the time, Adcock said. Such "performance capture," as contrasted with "motion capture" of bodily movements, provides extremely high-quality facial data for the animators.
Supporting the Storytellers
For renowned director Martin Scorsese, however, headgear and facial markers (dots on the actors' faces) were still unacceptably disruptive to the human interactions he sought in the 2019 epic The Irishman. The film features Robert De Niro, Al Pacino, and Joe Pesci, all now in their late 70s, in roles spanning many ages across several decades, so a strategy for de-aging was critical. The film was nominated for Best Visual Effects and nine other Academy Awards.
Beginning several years earlier, Pablo Helman and his team from Industrial Light and Magic (ILM), the special effects house founded by George Lucas, worked with Scorsese to develop ways to create the desired images much less intrusively. To capture the performances, they built a bulky apparatus in which a normal camera was flanked by two infrared cameras and associated infrared lighting that provided synchronized, high-quality stereoscopic information needing neither visible facial markers nor changes in the lighting desired by the filmmakers.
The size and weight of this equipment posed some challenges; for example, it precluded the popular hand-held Steadicam shots. Moreover, although the infrared light did not interfere with the main camera, the infrared cameras were sensitive to cigarette smoke, and could not see through vintage car windshields.
In parallel with the hardware, the ILM team developed a new software pipeline to construct the de-aged images. In addition to FACS capture from the current actors, the team acquired huge numbers of images from their previous performances over the decades, allowing them to select a facial model for each of the years in the film. Rather than reproduce the exact look of De Niro from Taxi Driver (1976) or Goodfellas (1990), however, they strove to create a de-aged version of his Irishman character. Moreover, instead of using animators to build a puppet with De Niro's facial motions, their FLUX software morphed the captured performance and lighting data to create the new image.
The result was a compelling de-aged character that let the actors act—and interact.
"If you're going to get the top actors of our time, you need to support their acting process and not create a technology or impose a technology that would somehow diminish that performance. That's what they were able to achieve," said Barron, who added that the less-intrusive technology "allows more adoption among a wider group of people into different kinds of films."
Under the Hood
Unlike the latest superhero movie, a film like The Irishman succeeds when the audience forgets about its technical sophistication. Nonetheless, achieving this realism requires an enormous, diverse team innovating in both hardware and software (and a reported $160-million budget).
"There are people that are creative, they're thinking about design and lighting and composition and performance of storytelling, and there's the people under the hood that have to make that all work. It's really a collaboration between the technologist and the artist," said Barron, whose current work at Magnopus aims to extend storytelling to the realm of virtual and augmented reality. "It's the collaboration that determines whether the project is successful or not."
For example, the Weta team worked together to improve the "facial solver" that represents the captured positions of face markers and other features in terms of underlying action units. Although there may be different ways to break down the various training movements, they used deep learning to ensure their description matched the way their animators think about those movements. With that training, the solver could then decide, for each frame of the footage, which muscles were firing, Adcock said. "It gave us a good starting point."
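In its simplest form, a facial solver can be cast as an inverse problem: find the action-unit weights whose combined marker displacements best explain the tracked marker positions in a frame. Below is a minimal least-squares sketch of that formulation; Weta's learned solver is far more sophisticated, and all the data here (marker counts, basis, weights) is synthetic.

```python
import numpy as np

def solve_action_units(markers, neutral, basis):
    """Least-squares facial solver: recover action-unit weights from
    tracked marker positions for a single frame.

    markers: (M, 3) observed marker positions
    neutral: (M, 3) marker positions on the neutral face
    basis:   (A, M, 3) displacement of each marker per action unit
    Returns an (A,) weight vector, clipped to the plausible [0, 1] range.
    """
    A = basis.shape[0]
    # Flatten to a standard linear system: B @ w ≈ (markers - neutral)
    B = basis.reshape(A, -1).T          # (3M, A)
    d = (markers - neutral).ravel()     # (3M,)
    w, *_ = np.linalg.lstsq(B, d, rcond=None)
    return np.clip(w, 0.0, 1.0)

# Round trip on synthetic data: markers generated from known weights
rng = np.random.default_rng(0)
neutral = rng.normal(size=(6, 3))
basis = rng.normal(size=(3, 6, 3))
true_w = np.array([0.2, 0.8, 0.5])
markers = neutral + np.tensordot(true_w, basis, axes=1)
w = solve_action_units(markers, neutral, basis)   # recovers true_w
```

A learned solver replaces the fixed linear basis with a model trained on an actor's captured range of motion, which is how the team could make its decompositions match the action units the animators actually work with.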
"The power of AI being applied to visual effects is relatively new and there are huge potentials for that," Barron said. "To harness the power of the computer to teach it to simulate reality, whether it's through creating a performance or a synthetic human or an environment, I think will have a lot of benefit to creating more and more credible illusions."
Bringing Back the Dead
Creating lifelike representations of actors in roles they never played does raise challenging ethical issues. One is the ability to put words in the mouths of politicians, or to put celebrities into pornographic scenes. Fortunately, such "deepfakes" are unlikely to have the Hollywood-level resources and actor cooperation used to make truly convincing fakes, although they are already good enough to cause trouble. (Some critics noted that The Irishman repeated some implausible claims from the book on which it was based, but worries about movies distorting historical facts are nothing new.)
Another concern is reuse of actors who are unavailable, or even dead. The 2016 "Star Wars story" Rogue One, for example, included a brief but controversial appearance by Peter Cushing, who had died in 1994. The most ambitious re-animation so far is the reported casting of James Dean, who died in 1955, as a costar in the film Finding Jack, scheduled for release late this year.
The filmmakers obtained legal permission to use the actors' likenesses in these films. Still, some commenters worry that recycling actors from previous eras will make it harder for current actors to find roles, except as blank slates onto which more famous faces are mapped.
Ironically, one of De Niro's breakout roles was as a young Vito Corleone in The Godfather Part II. With today's technology, he might have been demoted to a body double for a de-aged Marlon Brando.
Further Reading
De-Aging the Irishman, fxguide, December 2019, https://bit.ly/2XZV1DE
The Lies of The Irishman, Slate, August 2019, https://bit.ly/3cATzwF
Pioneers of Modern Computer Graphics Recognized with ACM A.M. Turing Award, ACM, March 18, 2020, https://awards.acm.org/about/2019-turing
The Hobbit: An Unexpected Journey VFX | Breakdown - Gollum, Weta Digital, https://bit.ly/2UbKgg
Dr. Ekman explains FACS (Facial Action Coding System), https://bit.ly/3gRRakx
How The Irishman's Groundbreaking VFX Took Anti-Aging To the Next Level, Netflix, https://bit.ly/2AJFU99
©2020 ACM 0001-0782/20/8
The Digital Library is published by the Association for Computing Machinery. Copyright © 2020 ACM, Inc.