
What Did Steve Jobs Do For Computer Science?


It’s only fitting that a man who set out to build a better computer actually changed people—and the way we think about computers. When Steve Jobs died on October 5, it unleashed an avalanche of sentiment and tributes normally reserved for a much-loved president, legendary movie star, or world-champion athlete. In a world where intellectual and scientific achievement is increasingly marginalized and trivialized, Jobs broke the bounds of gravity. It’s safe to say he was the Thomas Edison of our time.

Apple is the most admired brand in the world, and the company’s impact and influence are indisputable. Gleaming, silver MacBook Pros and impossibly cool iPads are everywhere. Trendy iMacs adorn desktops while the iPhone has singlehandedly transformed society and business. In the corporate world, information technology is no longer driven from the top down. Increasingly, employees decide which technologies and tools the organization will use.

Thank Steve Jobs for all of this. Along with a revolution in the way we listen to music and access different media, Jobs took existing devices and technology—things like the computer mouse and multitouch gestures—and transformed them into part of a highly integrated system. It’s clear that he had a deep understanding of how things get used and knew how to build an entire ecosystem. Apple products are not only easy to use, they are fun to use. "Steve Jobs was brilliant at transforming computers and digital technology into something usable, accessible, and highly coveted by ordinary consumers," notes Joseph Konstan, a professor of computer science and engineering at the University of Minnesota.

The evidence is everywhere. Alarm clocks and automobiles include iPod connectors. Visual voicemail, introduced on the iPhone, is fast becoming a standard. And industries as diverse as travel and retail have repackaged their products, created entirely new offerings, and altered sales channels to cater to the evolving desires of iPhone- and iPad-wielding consumers. Other computer manufacturers have not escaped the Apple tractor beam either. They have scrambled to build better and more beautiful products in response to the Cupertino company’s example.

Jobs’ influence extends into the technical computing field as well. Web inventor Sir Tim Berners-Lee used a NeXT computer to write the first Web browser. "Programming the World Wide Web client was remarkably easy on the NeXT," he notes. Jobs co-founded the firm in 1985 and introduced the computer in 1988. Apple purchased NeXT in 1996, and when Jobs returned to Apple the next year, he used NeXT’s technology to lay the foundation for today’s OS X operating system.

"Steve was a champion of usable technology—even sexy technology," Berners-Lee observes. "Intuitive on the outside and extensible and cool engineering on the inside." Adds Don Norman, co-founder and executive consultant at Nielsen Norman Group and former director of Apple’s Technology Lab: "Apple stands out as perhaps the greatest innovator of our time."

Beyond Computing
Beyond the mind-bending array of accomplishments and accolades, there’s an often-overlooked side to Steve Jobs and Apple. The company’s contributions to the computer science field are sparse—particularly in recent years. When Jobs returned to Apple as CEO in 1997, he immediately axed the company’s research lab, the Advanced Technology Group, as a cost-cutting measure. Apple never reinstituted the group—even as the company returned to financial health. (It now sits on more than $75 billion in cash.) "Since that time," states Norman, "Apple hasn’t advanced the development of the science."

Norman doesn’t criticize the initial move—even though it cost him his job. "The company was in dire financial straits and it was the right decision at the time," he says. But when Jobs eliminated the group that developed QuickTime and helped set HDTV and Wi-Fi standards, Apple set out on an entirely different course. "The ‘R’ disappeared from R&D and Apple got very good at the ‘D,’ " Norman says.

A perfect example is multitouch technology. "When the technology became affordable for small screens, Apple jumped on it and soon controlled the sources. When others said, ‘We should do the same,’ they discovered that everyone was booked with Apple." Steve Jobs was also good at creating de facto standards, Norman adds. "Apple was the first to introduce small floppy disks and the first to eliminate them. It was also one of the first to get rid of hard drives when it brought the MacBook Air to market."

None of this changes the fact that Apple is conspicuously absent from academic and industry conferences and events. While the likes of Dell, Intel, IBM, Microsoft, and Hewlett-Packard support and sponsor academic conferences and exchange research papers, Apple remains hunkered down in Cupertino. And while other tech heavyweights participate in science, technology, engineering, and mathematics (STEM) programs at schools and contribute to industry events designed to entice young people into computer science, Apple is nowhere to be seen.

Intel spends untold dollars each year sponsoring science competitions and takes out ads inspiring kids to enter the science and engineering fields. Microsoft holds student competitions in robotics and programming with the purpose of getting young people interested in STEM. And Google runs its Summer of Code program, in which students are inspired and, yes, paid to code.

Norman attributes Apple’s actions to a "cult of secrecy" and believes that the practice can be traced to Apple’s early days when leaks were the norm and executives sometimes first learned about new products through the press. Jobs instituted strict controls partly as a "revolt" against the past, Norman believes. However, "there’s a give and take that’s part of science and, on a certain level, there’s an obligation to share basic information and knowledge," he says. "Everyone suffers when researchers don’t talk to one another."

Insular thinking and a belief that industry events provide little or no benefit are at the center of Apple’s stance, argues Konstan, the chair of CHI 2012, who says past attempts to involve Apple have consistently fallen on deaf ears. "The sense is that Apple likes to control its own presence and hold its own events."

Norman argues that Apple has strayed so far from a research-oriented approach that "there’s no longer an understanding of the role of scientific research in the advancement of the field. Apple has become a product-centric company that’s afraid of leaking any hint of what direction it might go in the future."

Facing the Future
What path Apple takes in the post-Jobs era is anyone’s guess. But when an undisputed industry leader such as Apple de-emphasizes R&D and eschews participation in industry events, it has a potentially negative impact on the scientific community, Norman and others argue. It is no coincidence, he says, that today’s corporate research labs are a mere shadow of what they once were, and that a growing number serve only as vehicles for company-specific, product-based research.

Many wonder whether Apple will change its approach under CEO Tim Cook, who holds an undergraduate degree in industrial engineering, as well as an MBA. Another question is to what extent Jobs’ wife, Laurene Powell Jobs, will embrace philanthropic causes, including computer science education. Steve Jobs did not, at least publicly, make any significant charitable contributions, despite a reported $6.5 billion fortune. However, Powell Jobs—who holds an MBA from Stanford and formerly worked in asset management at Merrill Lynch and Goldman Sachs—has an admirable record of supporting education causes and contributing financially to organizations and institutions.

In the end, it’s nothing short of ironic that the same quality that made Steve Jobs great may also have been his biggest flaw. His nontechnical background and focus on design allowed him to view computing in a way that no one else did—and perhaps no one else ever will. But it also left him blind to the needs of the research community, as well as those of young people, including women and minorities. These groups are not entering the computer science field in sufficient numbers, a shortfall that threatens the future of the entire field.

Nevertheless, the legacy of Steve Jobs will endure. Frank Rose, author of West of Eden: The End of Innocence at Apple Computer, says the biggest takeaway from Jobs is that his "uncompromising nature and insistence on quality," combined with Apple’s enormous success, have greatly influenced the way designers, scientists, business leaders, and government officials make decisions about aesthetic and technical issues. "Steve Jobs proved," says Rose, "that you can make technology personal and accessible, that you can combine form and function to create truly great products."

 

Samuel Greengard wrote this article on a 27-inch iMac running OS X Lion.
 
