Making Connections

By Neil Savage

Communications of the ACM, Vol. 66 No. 6, Pages 8-10


When he was a student at the Massachusetts Institute of Technology (MIT), Ethernet inventor Bob Metcalfe briefly considered pursuing a career in tennis. He was captain of the 1968–1969 MIT tennis team, which had a record of 15 wins and 4 losses, and he was ranked sixth in New England in doubles, even while taking classes and holding a programming job at defense contractor Raytheon. That, unfortunately, was not enough to make a go of it.

"There's playing pros and there's teaching pros," Metcalfe says. "I could easily be a teaching pro, but that just seemed boring. And for being a playing pro, I wasn't good enough."

The tennis world's loss was the computer world's gain, however, as Metcalfe went on to become an Internet pioneer, develop Ethernet, and help get it named a networking standard, actions that earned him the 2022 ACM A.M. Turing Award on the 50th anniversary of the invention of the technology.

Metcalfe was working at Xerox's Palo Alto Research Center (PARC) in 1973, having just earned his Ph.D. from Harvard University. At that time, the standard model for computer users was to have dumb terminals with no processing power at their desks, all linked to a central computer. PARC scientists decided to build a personal computer so users could run software at their desks, but needed a way to network the machines. At the same time, Xerox was building laser printers, which could handle input at a rate of 20Mb/s; the dumb terminal network operated at only 300b/s.

Metcalfe teamed up with electrical engineer David Boggs, who died last year, to come up with a way to build a fast, reliable local area network. Their only constraint was that it had to fit on a card that held 60 medium-scale integration (MSI) chips, each of which held hundreds of transistors, far fewer than chips hold today.

The pair set out to design a system that could use as few wires as possible—preferably zero. At the time, though, radios were too bulky to make wireless networking feasible, so they settled on one connection, a coaxial cable. They chose the name "Ethernet" to represent the idea that the system would be agnostic about the medium that carried the signal. In the 19th century, physicists had believed that light traveled through a medium they called "the luminiferous ether," but in the early 20th century the Michelson-Morley experiment proved that no such thing existed. That left the term "ether" up for grabs, and Metcalfe grabbed it.

The original Ethernet worked on three basic technologies, none of which are still in use. One was the Jerrold tap, a method used by the cable television industry to tap into a coaxial cable. The second was Manchester encoding, in which each bit cell contained either a 1-to-0 or a 0-to-1 transition. That had the advantage of embedding a clock into each data packet. Boggs and Metcalfe still needed a way to get nodes of the network to take turns retransmitting packets in the event two of them interfered with each other.
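The clock-embedding property of Manchester encoding can be sketched in a few lines. This is an illustrative encoder/decoder, not Ethernet's actual circuitry, using the IEEE 802.3 convention (a 0 bit is a high-to-low pair, a 1 bit a low-to-high pair); other systems use the opposite convention.

```python
def manchester_encode(bits):
    """Encode each bit as a pair of half-bit symbols (IEEE 802.3
    convention): 0 -> (1, 0), 1 -> (0, 1). Every bit cell therefore
    contains a mid-cell transition the receiver can recover a clock from."""
    out = []
    for b in bits:
        out.extend((0, 1) if b else (1, 0))
    return out

def manchester_decode(symbols):
    """Recover bits from symbol pairs; a pair with no transition means
    the line is idle or the clock has been lost."""
    bits = []
    for hi, lo in zip(symbols[0::2], symbols[1::2]):
        if (hi, lo) == (0, 1):
            bits.append(1)
        elif (hi, lo) == (1, 0):
            bits.append(0)
        else:
            raise ValueError("no mid-cell transition: clock lost")
    return bits

line = manchester_encode([1, 0, 1, 1])
print(line)  # [0, 1, 1, 0, 0, 1, 0, 1]
assert manchester_decode(line) == [1, 0, 1, 1]
```

Because at most two identical half-bit symbols can ever appear in a row, the receiver never drifts more than one bit cell before seeing a transition to resynchronize on.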

Metcalfe's friend Steve Crocker suggested he look at the retransmission scheme developed by ALOHAnet, a radio-based network created by the University of Hawaii to connect computers on different Hawaiian islands. Metcalfe spent a month in Hawaii studying their system and determined that randomized retransmission was the right method to use when modified for cable.
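The cable-adapted form of that randomized retransmission became Ethernet's truncated binary exponential backoff: after each successive collision, a node doubles the range from which it draws a random waiting time. The sketch below is a minimal illustration of that idea, not production driver code; the function name and the cap of 10 doublings are choices made here for illustration.

```python
import random

def backoff_slots(collisions, max_exponent=10):
    """Pick a random retransmission delay, measured in slot times,
    after the given number of collisions. The range [0, 2^k - 1]
    doubles with each collision, so competing nodes rapidly spread
    their retries apart; the exponent is capped (truncated) so the
    wait does not grow without bound."""
    k = min(collisions, max_exponent)
    return random.randrange(2 ** k)

# After one collision a node waits 0 or 1 slots; after three, 0-7 slots.
for c in (1, 2, 3):
    delay = backoff_slots(c)
    assert 0 <= delay < 2 ** c
```

The randomization is the key inheritance from ALOHAnet: if two colliding nodes waited a fixed time, they would simply collide again forever.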

Back to Top

A Deep Background

Metcalfe had long been interested in technology. His father was a gyroscope technician and had a basement room filled with electronic junk that the two of them used to build a train set. In 1959, Metcalfe's eighth-grade teacher encouraged him to create something out of the material, so he made a rudimentary computer capable of adding any number from 1 to 3 to another number from 1 to 3. "I built it out of relays, toggle switches, and neon lights," he recalls.

A few years later he went to MIT to study electrical engineering, which was what the school called 'computer science' in those days. His advisor was Marvin Minsky, who pioneered the field of artificial intelligence (AI). "Marvin Minsky didn't think I was smart enough to do AI, so I went into networking instead, and it worked out wonderfully," he says. "So, thank you, Marvin." Metcalfe wrote his undergraduate thesis on a bus coming back from a tennis match and submitted it to Minsky at the last possible moment.

After leaving MIT in 1969, he went to Harvard to earn a master's degree and a Ph.D. in applied mathematics. He focused his research on the then-new ARPANET, the first public, packet-switched network, created by the U.S. Department of Defense's Advanced Research Projects Agency. He offered to build the interface that would connect Harvard to ARPANET, but "Harvard said no. 'The interface between Harvard and the ARPANET is too important for a grad student,'" he said he was told. Instead, Metcalfe was hired by MIT to build the interface.

In 1979, he left Xerox PARC and founded 3Com, which made Internet hardware and software. He also worked as a consultant for Digital Equipment Corporation (DEC), then the second-largest computer maker after IBM. DEC's vice president of engineering, Gordon Bell, asked him to design a networking protocol for that company. Metcalfe recalled, "I told Gordon that I wouldn't do that for him, both because I felt some sort of loyalty to Xerox, but also I felt that Ethernet was already the best one I could come up with."

Instead, Metcalfe and Bell decided in February 1979 that Xerox and DEC should combine their technologies so Xerox printers could work with DEC computers. Metcalfe ran into an Intel engineer looking for a standard to apply to their chip technology, so he got the three companies together, aiming to make Ethernet a standard. "I was the marriage broker—that's their term—that brought DEC, Intel, and Xerox (DIX) together," Metcalfe says.

To satisfy anti-trust laws, DIX pushed for an open standard, IEEE Project 802. Soon, IBM and General Motors showed up to their meetings, each pushing their own standards. The IEEE decided to standardize all three proposals, but only Ethernet remains. Bell says that is because it proved superior to other approaches. "I have to put Ethernet as probably one of the most elegant designs I know of," he says, which is why it has become essentially the core of the modern Internet.

As for Metcalfe, Bell says, "Bob is a joy to work with. Very good sense of humor and a really good view of the world."

Figure. Before inventing Ethernet, Metcalfe built this Interface Message Processor (IMP) board at MIT to connect a PDP-10 host computer to ARPANET.

Part of the way Ethernet built the networking industry was by allowing for much higher transmission speeds. The original Ethernet carried 2.94Mb/s, nearly 10,000 times faster than the dumb terminal network it replaced. Ethernet has since grown to far higher speeds, allowing video to dominate the Internet.

Back to Top

Six Careers

Metcalfe is not one to stick with a single job forever. "When you've been at something for 10 years, it's time to move on to something new," he says. After 3Com, where he was CEO for a time and also had "a bunch of jobs" until he left in 1989, he went into journalism, publishing InfoWorld and writing a weekly column.

In 2001, he became a venture capitalist and a partner at the VC firm Polaris Partners. His interest in entrepreneurship was not new, though. As an undergraduate at MIT, he had spent a fifth year earning a degree in industrial management from the Sloan School of Management, during which time he founded three companies.

Ten years later, spurred by the desire of his wife, Robyn Shotwell Metcalfe, to get away from the cold New England winters, he became a professor of innovation and entrepreneurship at the University of Texas at Austin, where he worked to make Austin "a better Silicon Valley."

Now he is embarking on his sixth career, as a research affiliate in computational engineering at MIT, where he's working on modeling the configuration of geothermal wells for energy. "It's different, but it's also hard because I have to learn all this stuff that I haven't really learned over the years I've been doing other things, like thermodynamics," he says.

Metcalfe's advice to young people: Not everyone should be an entrepreneur, but if you want to do it, go on and do it. "It's a very high calling. It's how innovation gets done," he says. He advises against dropping out of school, and suggests people go work for someone else first. A first job can serve as a sort of post-doctoral fellowship, he says.

Last year, Metcalfe endowed a professorship of entrepreneurship at MIT Sloan. The foundation he and his wife created also has funded a writing professorship and an Internet professorship at MIT.

While he does not yet know what he will do with the $1 million prize that comes with the Turing Award, Metcalfe has other goals. For one, he would like to "fix" science. Among his proposals are reforming the peer review system, eliminating tenure and accreditation, and dissolving the "silos" that separate scientific disciplines, which he says create a lot of unnecessary friction.

While he may have given up a career in tennis, he points out that because MIT did not have a football team at the time he was a student, at 77 he still has four years of NCAA eligibility in that sport. "You know, I weigh 220 pounds, and I got wheels and hands, but I've yet to attract the attention of the Texas football coach. Hook 'em, Horns!"

Figure. Watch Metcalfe discuss his work in the exclusive Communications video. https://cacm.acm.org/videos/2022-acm-turing-award

Back to Top


Neil Savage is a science and technology writer based in Lowell, MA, USA.

©2023 ACM  0001-0782/23/06

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and full citation on the first page. Copyright for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or fee. Request permission to publish from [email protected] or fax (212) 869-0481.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2023 ACM, Inc.
