Future Internets Escape the Simulator

By Mark Berman, Piet Demeester, Jae Woo Lee, Kiran Nagaraja, Michael Zink, Didier Colle, Dilip Kumar Krishnappa, Dipankar Raychaudhuri, Henning Schulzrinne, Ivan Seskar, Sachin Sharma

Communications of the ACM, June 2015, Vol. 58 No. 6, Pages 78-89


Standardization of basic underlying protocols such as the Internet Protocol (IP) has enabled rapid growth and widespread adoption of the global Internet. However, standardization carries the attendant risks of reducing variability and slowing the pace of progress. Validation and deployment of potential innovations by researchers in networking, distributed computing, and cloud computing are often hampered by Internet ossification, the inertia associated with the accumulated mass of hardware, software, and protocols that constitute the global, public Internet [24]. Researchers simply cannot develop, test, and deploy certain classes of important innovations into the Internet. In the best case, the experimental components and traffic would be ignored; in the worst case, they could disrupt the correct behavior of the Internet. Cloud computing researchers confront a similar dilemma. To maintain uniformity and efficiency in their data centers, commercial cloud providers generally do not provide "under the hood" controls that permit modification of the underlying network topology or protocols that comprise the cloud environment.

A clear example of the challenge is apparent to anyone tracking the pace of adoption of IPv6, a relatively modest revamping of IP. Because IPv6 deployment affects components throughout the Internet, years of extensive review, planning, and coordination have been required to ensure a smooth, if slow, transition. For researchers contemplating more fundamental innovations, such as non-IP protocols or new routing approaches, the barriers are correspondingly higher. Accordingly, researchers have been forced to employ compromise measures, such as validating their novel concepts only in simulation, or in modest, isolated laboratory configurations. These environments permit a wide range of experiments, but at the expense of the realism that comes with a large-scale physical deployment.
