
Why the Simplex Method, at Age 75, is Still the Go-To Algorithm

[Photo: Daniel A. Spielman. Since its creation more than two decades ago by Daniel Spielman (above) and Shang-hua Teng, smoothed analysis has been used to analyze the performance of algorithms other than the simplex method, including interior-point methods for linear programming.]

In 1947, mathematical scientist George Dantzig invented the simplex method, a powerful and practical means of solving the linear programming problems that arise in optimization. Scientists lost no time putting the simplex method to use in a variety of applications across government, industry, science, and engineering.

Half a century later, when Daniel A. Spielman was a Ph.D. student at the Massachusetts Institute of Technology, the simplex method stood at the top of the pantheon of important algorithms. However, research had uncovered a puzzle: in theory, the simplex method ought not to perform as well as it did in practice. What was going on?

Spielman solved the mystery in 2001, in joint work with Shang-hua Teng, now University Professor and Seeley G. Mudd Professor of Computer Science and Mathematics in the Viterbi School of Engineering at the University of Southern California. Pioneering a technique called smoothed analysis, their work provided an original and compelling explanation of the power of the simplex method and suggested a new paradigm for gauging the effectiveness of algorithms.

This work was recognized in 2008 by the Gödel Prize, sponsored jointly by ACM's Special Interest Group on Algorithms and Computation Theory (SIGACT) and the European Association for Theoretical Computer Science. Now Spielman, Sterling Professor of Computer Science and a professor of Statistics and Data Science and of Mathematics at Yale University, has been awarded the 2023 Breakthrough Prize in Mathematics, in part for this work in optimization.

Spielman has a great ability "to come up with new approaches," said Lance Fortnow, dean of the College of Computing at the Illinois Institute of Technology. "It wasn't like someone else had invented smoothed analysis and Spielman said, 'Let's try it on linear programming'. This was, 'I want to understand linear programming. What's the right way to do it?'"

Simplex Bests Polynomial-time Competition

Dantzig formulated the concept of linear programming as a way to model optimization problems. The model produces a polyhedron, possibly of very high dimension, whose corners represent candidate solutions. The simplex method provides a highly efficient way of moving along the polyhedron's edges toward an optimal corner. The algorithm was tailor-made for the computing machines that were just beginning to appear when Dantzig did this work.
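
To make this concrete, here is a minimal sketch in Python, assuming SciPy is available; the tiny two-variable instance is purely illustrative, and SciPy's "highs-ds" method is a dual-simplex solver, a modern descendant of Dantzig's algorithm rather than his original code.

    # Minimal sketch: solve a tiny linear program with a simplex-type solver.
    # SciPy's linprog minimizes, so the objective is negated to maximize.
    from scipy.optimize import linprog

    # Maximize x + y  subject to  x + 2y <= 4,  3x + y <= 6,  x, y >= 0.
    c = [-1, -1]                        # negated objective (maximize x + y)
    A_ub = [[1, 2], [3, 1]]             # inequality constraint matrix
    b_ub = [4, 6]                       # right-hand sides
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None), (0, None)],
                  method="highs-ds")    # HiGHS dual-simplex solver
    print(res.x, -res.fun)              # optimal corner (1.6, 1.2), value 2.8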

In the 1970s, the rise of complexity theory brought new precision to the study of efficiency of algorithms. Generally, an algorithm is considered efficient if it runs in polynomial time, meaning that even for the worst-case inputs to the algorithm, the running time is always bounded by a polynomial function of the input size. This is in contrast with algorithms whose running time can increase exponentially with input size.

Soon a curious fact arose: despite its excellent performance in practice, the simplex method does not run in polynomial time. Examples were found on which simplex ran in exponential time. Eventually, polynomial-time algorithms for linear programming were found, such as the ellipsoid method and interior-point methods, but the simplex method continued to be used, and in many situations it outperformed its polynomial-time competitors.

Why does simplex work so well?

This question was in the air when Fortnow was a Ph.D. student at the Massachusetts Institute of Technology (MIT) in the 1980s, several years before Spielman became a graduate student there.  However, Fortnow recalls, few were attempting to resolve it. "The simplex method had already been around for 40 years," said Fortnow. The general attitude was, "Well, it works well in practice."

When Spielman and Teng came up with smoothed analysis, it was a big surprise. "It's a whole different way of looking at worst-case complexity," said Fortnow, "and they could apply it to linear programming. Both of these pieces were very exciting."

Worst Case versus Random

Spielman was a graduate student when he and Teng started working together; Teng was then an instructor at MIT.

The pair's initial goal was to improve the simplex method by finding a new way to proceed around the polyhedron, in the hope the improved algorithm would run in polynomial time.  "We failed in that effort," said Spielman, "but when we were working on it, we learned something."

The polynomial-time standard focuses on how an algorithm performs on the worst cases, regardless of whether such cases are typical. The cases for which the simplex method requires exponential time, such as the famous example found by Victor Klee and George Minty in 1972, "were known to be artificial," said Spielman. "They didn't tend to occur in practice."
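
For readers who want to see such an artificial example, the Klee-Minty cube can be written down in a few lines. This is a sketch of the standard textbook formulation; the function name and parameterization are illustrative, not drawn from the article or from Spielman's work.

    # The Klee-Minty cube, a classic exponential-time example for
    # Dantzig's pivot rule:
    #   maximize    2^(n-1) x_1 + 2^(n-2) x_2 + ... + x_n
    #   subject to  2^i x_1 + 2^(i-1) x_2 + ... + 4 x_(i-1) + x_i <= 5^i,
    #               x_j >= 0.
    def klee_minty(n):
        c = [2 ** (n - j) for j in range(1, n + 1)]            # objective
        A, b = [], []
        for i in range(1, n + 1):
            row = [2 ** (i - j + 1) for j in range(1, i)]      # 2^(i-j+1) for j < i
            row += [1] + [0] * (n - i)                         # x_i term, then padding
            A.append(row)
            b.append(5 ** i)
        return c, A, b

    # On this instance, a greedy pivot rule can visit all 2^n corners of the
    # (distorted) cube before reaching the optimum at (0, ..., 0, 5^n).
    c, A, b = klee_minty(3)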

An alternative way to look at algorithmic performance is to examine how an algorithm performs on random inputs. Around 1980, German computer scientist Karl-Heinz Borgwardt proved that when fed random inputs, the simplex method runs in polynomial time on average. This result brought new insights, but did not tell the whole story.

Random inputs, far from being typical, are in fact "very special," Spielman said. "Every single part of the input being random is actually a very strong condition. That doesn't usually happen." In practice, inputs to the simplex method are from real-world systems, which are not purely random but contain structure.

What Spielman and Teng found is that if they took a worst-case input for the simplex method and introduced a small random perturbation, the simplex method would suddenly solve the perturbed input in polynomial time. "We learned that [the worst cases] were really very fragile," said Spielman. "The way Shang-hua put it is that, if you are wearing boxing gloves, you couldn't build one of them. You needed great precision in building an example that would be bad for the simplex method."
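
The perturbation itself is simple to describe. The sketch below is an illustration under assumed conventions, not the authors' actual experiment: it jitters every coefficient of a hard instance with Gaussian noise whose scale is set by a parameter sigma.

    # Sketch of the smoothed-analysis perturbation: jitter every coefficient
    # of a worst-case instance (c, A, b) with small Gaussian noise. The noise
    # parameter sigma and the relative scaling below are illustrative choices.
    import numpy as np

    def perturb(c, A, b, sigma, seed=0):
        rng = np.random.default_rng(seed)
        c, A, b = map(np.asarray, (c, A, b))
        scale = sigma * np.abs(A).max()      # noise relative to coefficient size
        return (c + rng.normal(0.0, scale, c.shape),
                A + rng.normal(0.0, scale, A.shape),
                b + rng.normal(0.0, scale, b.shape))

    # Feeding perturb(*klee_minty(n), sigma=0.01) to a simplex solver destroys
    # the fragile structure that makes the unperturbed instance slow.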

This observation led Spielman and Teng to develop smoothed analysis, which blends the worst-case and random viewpoints, yielding a compelling explanation of why the simplex method works so well. The "smoothed complexity" of an algorithm is the maximum, over all inputs, of the algorithm's expected running time when that input is slightly perturbed at random. For an algorithm of low smoothed complexity, every input has a neighborhood containing inputs on which the algorithm performs well.
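
In symbols, one standard formalization of this definition (with Gaussian perturbations of standard deviation sigma, and T(x) denoting the running time on input x; the notation is a common convention, not drawn from the article) is:

    C_{\mathrm{smoothed}}(n, \sigma) \;=\; \max_{\|x\| \le 1} \; \mathbb{E}_{g \sim \mathcal{N}(0,\, \sigma^2 I)} \bigl[ T(x + g) \bigr]

As sigma tends to zero this recovers worst-case complexity, while large sigma approaches the average-case analysis of purely random inputs.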

Since its birth more than two decades ago, smoothed analysis has been used to analyze the performance of algorithms other than the simplex method, including interior-point methods for linear programming. It also has guided the design of new algorithms.

Recently, smoothed analysis has been used in machine learning, where one often searches for clusters in data. "Data usually comes from the real world or from something natural, so you expect there to be noise in it," Spielman remarked. "So in clustering or other machine learning tasks, smoothed analysis should be better than the worst-case analysis."

The Right Questions

Sometimes when a newcomer does great early work, the pressure to excel at that level can become crippling. However, "It was the opposite with Dan," Fortnow said. "It just got him started. He went on to do bigger and better things." The citation for Spielman's Breakthrough Prize mentions his contributions in spectral graph theory, the Kadison-Singer problem, numerical linear algebra, and coding theory.

Spielman has brought to all his work a highly original viewpoint that can guide new research directions. "There are the people who can find the right question to ask, and there are the people who can answer the questions," Fortnow said. "It's rare to be both.  And Dan's both."

Allyn Jackson is a journalist based in Germany who specializes in science and mathematics.
