By John Muratore, Troy Heindel, Terri Murphy, Arthur Rasmussen, Robert McFarland
Communications of the ACM,
Vol. 33 No. 12, Pages 18-31
Perhaps one of the most powerful symbols of the United States' technological prowess is the Mission Control Center (MCC) at the Lyndon B. Johnson Space Center in Houston. The rooms at Mission Control have witnessed major milestones in the history of American technology, such as the first lunar landing, the rescue of Skylab, and the first launch of the Space Shuttle. When Mission Control was first activated in the early 1960s, it was truly a technological marvel. The facility, however, has received only modest upgrades since the Apollo program. Until recently it maintained a mainframe-based architecture that displayed data and left the job of data analysis to flight controllers. The display technology in this system was monochrome and presented primarily text, with limited graphics (photo 1). An example display of 250 communication parameters is shown in Figure 1.

The mainframe processed incoming data and displayed it to the flight controllers; however, it performed few functions to convert raw data into information. The job of converting data into information upon which flight decisions could be made fell to the flight controllers. In some cases, where additional computational support was required, small offline personal computers were added to the complex. Flight controllers visually copied data off the console display screens and manually entered it into the personal computers, where offline analysis could be performed.

Although this system was technologically outdated, it contained years of customizing effort and served NASA well through the early Space Shuttle program. Several factors are now driving NASA to change the architecture of Mission Control to accommodate advanced automation.
First is the requirement to support an increased flight rate without major growth in the number of personnel assigned to flight control duties.

A second major concern is the loss of corporate knowledge due to the unique bimodal age distribution of NASA staff. Hiring freezes between the Apollo and Shuttle programs have left NASA composed of two primary groups: approximately half are Apollo veterans within five years of retirement, and the other half are personnel under the age of 35 with Shuttle-only experience. NASA considers it highly desirable to capture the corporate knowledge of the Apollo veterans in knowledge-based systems before they retire. Because the mainframe complex is oriented primarily toward data display, it is a poor environment for capturing and utilizing that knowledge.

These factors have led NASA's Mission Operations Directorate to pursue aggressive efforts to deploy a distributed system of Unix engineering-class workstations running a mix of online real-time expert systems and traditional automation, allowing flight controllers to perform more tasks and capturing the corporate knowledge of senior personnel. Starting with the first flight of the Space Shuttle after the Challenger accident, the Real-Time Data System (RTDS) has played an increasingly significant role in the flight-critical decision-making process.
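To make the shift concrete: the older mainframe displayed raw parameter values and left interpretation to the controller, whereas the workstation-based approach encodes controller judgment as machine-checkable rules. The following is a minimal illustrative sketch of that idea in Python; the parameter names, limit values, and advisory texts are entirely hypothetical, not actual Shuttle telemetry or RTDS code.

```python
# Illustrative sketch of rule-based telemetry monitoring: turning raw
# parameter values into actionable advisories, the kind of judgment an
# expert system can capture from a senior flight controller.
# All parameters, limits, and advisories below are hypothetical examples.

from dataclasses import dataclass

@dataclass
class LimitRule:
    parameter: str   # telemetry parameter name (hypothetical)
    low: float       # lower acceptable bound
    high: float      # upper acceptable bound
    advisory: str    # action a controller would recommend

# Knowledge captured as rules rather than left in a veteran's head.
RULES = [
    LimitRule("comm_signal_strength_db", -90.0, -30.0,
              "Check antenna pointing and station handover"),
    LimitRule("fuel_cell_temp_c", 60.0, 95.0,
              "Monitor fuel cell thermal trend"),
]

def evaluate(sample: dict) -> list:
    """Convert a raw telemetry sample into a list of advisories."""
    advisories = []
    for rule in RULES:
        value = sample.get(rule.parameter)
        if value is None:
            continue  # parameter not present in this sample
        if not (rule.low <= value <= rule.high):
            advisories.append(f"{rule.parameter}={value}: {rule.advisory}")
    return advisories

# One out-of-limits value triggers an advisory; the in-limits one does not.
sample = {"comm_signal_strength_db": -95.0, "fuel_cell_temp_c": 80.0}
for advisory in evaluate(sample):
    print(advisory)
```

The point of the sketch is the division of labor: the rules encode expertise once, and the workstation applies them continuously against the live data stream instead of a controller scanning screens of numbers.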