When Ed Catmull earned his Ph.D. from the University of Utah in 1974, with a thesis on three-dimensional (3D) computer graphics, he applied for jobs in academia. He did not get a single one. His academic interests eventually led him to co-found Pixar Animation Studios, create the breakthrough 1995 animated film Toy Story, and receive the Association for Computing Machinery's 2019 A.M. Turing Award, along with his colleague Pat Hanrahan.
At the time, nobody thought computer graphics were worth much of anything. Catmull, 74, remembers trying to convince Hollywood studio executives that the technology he was working on would be important to the future of movies. "We were so irrelevant that we couldn't even get in the door to explain anything whatsoever," he says.
Hanrahan, 66, now a professor at Stanford University, agrees. "At that time, computers were about number crunching, sorting, databases, and all sorts of cool topics," Hanrahan says. Graphics were viewed as too playful for such a utilitarian machine. "They didn't see this as something you would interact with like we do today."
This is only the second time the Turing Award has been given for computer graphics. The first, in 1988, went to Ivan Sutherland, who taught the first computer course Catmull took at Utah. At Utah, working in a program funded by the federal Advanced Research Projects Agency, Catmull tackled the problem of how to create images of curved surfaces. Computer images were built from flat polygons, and he had to figure out how to bend them, how to generate a better type of curved surface, and how to determine which parts of a 3D image would be visible. His solution to that last problem was the z-buffer, which stores the depth (z-axis) coordinate of each point in the image so hidden surfaces can be discarded.
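The z-buffer idea can be sketched in a few lines. This is an illustrative toy, not Catmull's actual implementation; the names and the tiny frame size are my own for demonstration:

```python
# Toy z-buffer sketch (illustrative only, not Catmull's original code).
# For each fragment landing at pixel (x, y), keep it only if it is closer
# to the camera than whatever was previously drawn at that pixel.

import math

WIDTH, HEIGHT = 4, 4
depth = [[math.inf] * WIDTH for _ in range(HEIGHT)]  # z-buffer: nearest depth so far
color = [[None] * WIDTH for _ in range(HEIGHT)]      # frame buffer: visible colors

def draw_fragment(x, y, z, c):
    """Record color c at (x, y) only if its depth z is nearer than the stored depth."""
    if z < depth[y][x]:        # smaller z means closer to the camera in this sketch
        depth[y][x] = z
        color[y][x] = c

# Three surfaces cover the same pixel; only the nearest one stays visible.
draw_fragment(1, 1, 10.0, "red")    # far surface drawn first
draw_fragment(1, 1, 3.0, "blue")    # nearer surface overwrites it
draw_fragment(1, 1, 7.0, "green")   # farther than blue, so it is discarded
print(color[1][1])  # → blue
```

The key property is that surfaces can be drawn in any order; the per-pixel depth comparison sorts out visibility automatically.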
After finishing his doctorate, Catmull initially took a job with a small company in Massachusetts that was working on computer-aided design, but within a few months was recruited by the New York Institute of Technology (NYIT) to start a computer graphics laboratory. In 1979, Hollywood finally noticed, and Star Wars creator George Lucas hired him to develop computer graphics at LucasFilm. The graphics division became Pixar in 1986 when Apple co-founder Steve Jobs bought it; he later sold it to Disney.
Hanrahan originally studied nuclear physics, and after a stint working with Catmull in the graphics lab at NYIT, earned a Ph.D. in biophysics from the University of Wisconsin in 1985. He was trying to map the nervous system of a nematode, a parasitic worm, which led him into the field of data visualization and, ultimately, to computer graphics.
He joined Pixar shortly after Jobs bought it. The studio was trying to create an industry standard for handling graphics, and Hanrahan was the architect of the system, leading a collaboration of 19 different companies. His software to generate 3D images and determine their shading and texture eventually became RenderMan, a popular image rendering program that earned the two a Scientific and Engineering Academy Award in 1993. RenderMan was used to create the silver, liquid T-1000 Terminator from the future in Terminator 2, as well as the 3D waltz scene in Beauty and the Beast, which led Disney to commission the first Toy Story movie. The two have been awarded additional Oscars for their achievements with computer animation.
Hanrahan also developed a language for describing the shading of the 3D images to show how they would look as lighting changed, and made it accessible to the artists creating the scenes. Engineers at LucasFilm and Pixar also tackled other challenges, such as reducing the image blur produced by motion. While the state of the art had been to create images with 40,000 polygons, they wanted their systems to handle 80 million, so bits of animation would appear realistic enough to be used in live-action films.
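The core idea behind such a shading computation can be illustrated with Lambert's cosine law: a surface's brightness depends on the angle between its normal and the light direction. This is a generic sketch of the concept, not RenderMan Shading Language, and the function names are my own:

```python
# Hedged sketch of diffuse (Lambertian) shading: the same surface appears
# brighter or dimmer as the light direction changes. Illustrative only;
# not RenderMan code or Hanrahan's shading language.

def normalize(v):
    """Scale a 3D vector to unit length."""
    m = sum(c * c for c in v) ** 0.5
    return tuple(c / m for c in v)

def diffuse_shade(normal, light_dir, base_color):
    """Scale base_color by how directly the light strikes the surface."""
    n, l = normalize(normal), normalize(light_dir)
    lambert = max(0.0, sum(a * b for a, b in zip(n, l)))  # clamp back-facing to 0
    return tuple(c * lambert for c in base_color)

# Same surface, two light positions: brightness changes with the lighting.
print(diffuse_shade((0, 0, 1), (0, 0, 1), (1.0, 0.2, 0.2)))  # light head-on: full color
print(diffuse_shade((0, 0, 1), (1, 0, 1), (1.0, 0.2, 0.2)))  # grazing light: dimmer
```

Making this kind of computation programmable, and exposing it in a form artists could use directly, was part of what set the shading-language work apart.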
Hanrahan says there were hundreds of innovations and thousands of people involved in improving animation. SIGGRAPH, the annual Special Interest Group on Computer Graphics and Interactive Techniques conference, was a niche event when it launched in 1974; by the mid-1990s, Hanrahan says, it was probably the largest ACM conference. The goal set by pioneers in the field of actually making a computer-animated movie, and not merely developing technology, fired people's imaginations. "It just became this catalytic intellectual endeavor, like this moonshot kind of thing, and all these people started improving the technology," Hanrahan says.
It was a goal he did not expect would be achieved quickly, perhaps not even in his lifetime. Hanrahan believed a movie would be a more complex version of a Turing test for artificial intelligence. Instead of creating a chatbot that an interrogator could not distinguish from a human, he imagined creating an artificial world that people could not tell from the real world. "I thought we'd have to solve the AI problem in order to actually make a full-length movie," he says. "I was wrong."
Catmull was more optimistic. He was already imagining a computer animated movie back in 1974. "I figured at the time it would take 10 years, because I knew we didn't have nearly enough processing power and we hadn't solved nearly enough problems," he says. "I was wrong, incidentally. It took 20 years, not 10."
In the early 1980s, the researchers calculated how much of the then-available computing power it would take to make a full-length computer-generated film. The answer: 100 Cray supercomputers, which at the time cost about $10 million apiece, roughly $1 billion in hardware. Catmull recalls framing his approach at that time as, "Let's focus on algorithms and making the pictures look good and various techniques, and at some point the processing power will catch up with what we need, which happened around 1990."
Before giving up on hardware, LucasFilm built the Pixar Image Computer to composite two-dimensional images together at a rate far faster than anything then available. That technology wound up spurring the development of volumetric medical images from MRI and CT scans.
The graphics algorithms also created a feedback loop with chip designers at companies such as Nvidia. The engineers would learn about new algorithms from SIGGRAPH conferences and incorporate them into their chips, allowing them to make high-powered graphics processing units (GPUs) for the gaming industry. Those GPUs, in turn, gave graphics researchers more powerful equipment they could use to improve their algorithms, which led to further improvements in the chips, until GPUs became so powerful they spilled into other areas. "Deep learning and neural networks all of a sudden had enough power, so this field then exploded into a whole new field," Catmull says.
These days Catmull is semi-retired, though he does some consulting work and is revising his 2014 book on creativity. He says the secret of a successful career is choosing a problem that is neither so easy as to bore you nor so impossible it cannot be achieved. "Taking stuff where you kind of get what it is but it's really hard, that's where you want to be," he says.
Hanrahan continues to teach at Stanford, and his favorite course teaches freshmen how to build an operating system from the ground up. His advice: "Just find that thing that really interests you and do the hard work to become skilled in it."
Both men say they will likely donate their shares of the $1-million Turing Award prize money to charity. Both also say winning a Turing Award was more exciting than winning an Oscar, because it comes from their own community.
As for which Toy Story they prefer, Hanrahan says he likes the second one best. For Catmull, the question is like asking which is his favorite child. "I can't really pick one," he says, "because each had a special meaning."
Figure. Watch the Turing recipients discuss their work in the exclusive Communications video. https://cacm.acm.org/videos/2019-acm-turing-award
©2020 ACM 0001-0782/20/6
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and full citation on the first page. Copyright for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or fee. Request permission to publish from [email protected] or fax (212) 869-0481.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2020 ACM, Inc.