United States General Accounting Office

GAO

Report to the Chairman, Subcommittee on Telecommunications and Finance, Committee on Energy and Commerce, House of Representatives

June 1989

COMPUTER SECURITY
Virus Highlights Need for Improved Internet Management

GAO/IMTEC-89-57

Contents

EXECUTIVE SUMMARY

CHAPTER 1  INTRODUCTION
    Internet Evolves From an Experimental Network
    Rapid Growth of the Internet
    Management in a Decentralized Environment
    Future of the Internet
    Internet Virus Spread Over Networks to Vulnerable Computers
    Objectives, Scope, and Methodology

CHAPTER 2  VIRUS FOCUSES ATTENTION ON INTERNET VULNERABILITIES
    Impact of Virus
    Vulnerabilities Highlighted by Virus
    Actions Taken in Response to Virus
    Conclusions
    Recommendation

CHAPTER 3  FACTORS HINDERING PROSECUTION OF COMPUTER VIRUS CASES
    No Statute Specifically Directed at Viruses
    Technical Nature of Virus-Type Incidents May Hinder Prosecution
    Proposed Legislation on Computer Viruses and Related Offenses
    Conclusions

APPENDIXES
    APPENDIX I    History of Computer Viruses
    APPENDIX II   Research Aimed at Improving Computer and Open Network Security
    APPENDIX III  Major Contributors to This Report

Abbreviations

CERT    Computer Emergency Response Team
DARPA   Defense Advanced Research Projects Agency
FCCSET  Federal Coordinating Council on Science, Engineering and Technology
FRICC   Federal Research Internet Coordinating Committee
GAO     General Accounting Office
HHS     Department of Health and Human Services
IMTEC   Information Management and Technology Division
MIT     Massachusetts Institute of Technology
NASA    National Aeronautics and Space Administration
NCSC    National Computer Security Center
NIST    National Institute of Standards and Technology
NSF     National Science Foundation
OSTP    Office of Science and Technology Policy
PC      personal computer

EXECUTIVE SUMMARY

PURPOSE

In November 1988, a computer program caused thousands of computers on the Internet--a multi-network system connecting over 60,000 computers nationwide and overseas--to shut down. This program, commonly referred to as a computer virus or worm, entered computers and continuously recopied itself, consuming resources and hampering network operations. Concerned about Internet security and the virus incident, the Chairman, Subcommittee on Telecommunications and Finance, House Committee on Energy and Commerce, asked GAO to

-- provide an overview of the virus incident,

-- examine issues relating to Internet security and vulnerabilities, and

-- describe the factors affecting the prosecution of computer virus incidents.

BACKGROUND

The Internet, the main computer network used by the U.S. research community, comprises over 500 autonomous unclassified national, regional, and local networks. Two of the largest networks are sponsored by the National Science Foundation and the Department of Defense. In addition, three other agencies operate research networks on the Internet. Over the past 20 years, the Internet has come to play an integral role in the research community, providing a means to send electronic mail, transfer files, and access data bases and supercomputers. 
There is no lead agency or organization responsible for Internet-wide management. Responsibility for computer security rests largely with the host sites that own and operate the computers, while each network is managed by the network's sponsor, such as a federal agency, university, or regional consortium. Plans are for the Internet to evolve into a faster, more accessible, larger capacity network system called the National Research Network. The initiative to upgrade the Internet-- described as a "super highway" for the research community--stems from a report by the Office of Science and Technology Policy. This Office, headed by the President's Science Advisor, has a broad legislative mandate to coordinate and develop federal science policy. In recent years, the public has become increasingly aware of computer virus-type programs that can multiply and spread among computers. The Internet virus differed from earlier viruses (which primarily attacked personal computers) in that it was the first to use networks to spread, on its own, to vulnerable computer systems. There is no federal statute that specifically addresses virus-type incidents. Forty-eight states have enacted laws dealing with computer crime. _____________________________________________________________________ RESULTS IN BRIEF Within hours after it appeared, the Internet virus had reportedly infected up to 6,000 computers, clogging systems and disrupting most of the nation's major research centers. After 2 days, the virus was eradicated at most sites, largely through the efforts of university computer experts. After the virus incident, multiple intrusions (not involving viruses) at several Internet sites added to concerns about security. These incidents highlighted such vulnerabilities as (1) the lack of an Internet focal point for addressing security issues, (2) security weaknesses at some sites, and (3) problems in developing, distributing, and installing software fixes (i.e., repairs to software flaws). While various agencies and groups have taken actions to enhance security, GAO believes that many of the vulnerabilities highlighted by the virus and subsequent intrusions require actions transcending those of individual agencies or groups. For this reason, GAO believes a security focal point should be established to fill a void in Internet's management structure. Several factors may hinder successful prosecution of virus-type incidents. For example, since there is no federal statute that specifically makes such conduct a crime, other laws must be applied. In addition, the technical nature of such cases may make it difficult to proceed to trial. PRINCIPAL FINDINGS Internet Virus Incident The onset of the virus was extremely swift. Within an hour after it appeared, the virus was reported at many sites, and by early morning, November 3, thousands of computers were infected at such sites as the Department of Energy's Lawrence Livermore National Laboratory, the National Aeronautics and Space Administration's Ames Research Center, the Massachusetts Institute of Technology, Purdue University, and the University of Maryland. The virus spread over networks largely by exploiting (1) two holes (flaws) in systems software used by many computers on the networks and (2) weaknesses in host site security policies, such as lax password management. The primary effects of the virus were lost computer processing and staff time. 
However, while apparently no permanent damage was done, a few changes to the virus program could have resulted in widespread damage and compromise of sensitive or private information. Vulnerabilities Highlighted The lack of an Internet security focal point created difficulties in responding to the virus. For example, problems were reported in communicating information about the virus to sites, coordinating emergency response activities, and distributing fixes to eradicate the virus. The virus also exploited security weaknesses at some sites. For example, the incident showed that some sites paid insufficient attention to security issues, such as proper password usage, and lacked system management expertise for dealing with technical issues. In addition, problems were highlighted in developing, distributing, and installing software fixes for known flaws. For example, vendors are not always timely in repairing software holes that may create security vulnerabilities. Further, even when fixes are available, sites may not install them, through either neglect or lack of expertise. In the subsequent intrusions, intruders entered several computer systems by exploiting a known software hole. In one case, the vendor had not supplied the fix for the hole, and in the other, the fix was supplied but not installed. Since the virus incident, agencies and groups have taken actions, such as creating computer emergency response centers and issuing ethics statements to heighten users' moral awareness. These actions are an important part of the overall effort needed to upgrade Internet security. However, GAO believes that a focal point is needed to provide the oversight, coordination, and policy-making capabilities necessary to adequately address Internet's security vulnerabilities. Since no one organization is responsible for Internet-wide management and the Office of Science and Technology Policy has taken a leadership role in initiating plans for a National Research Network, GAO believes that the Office would be the most appropriate body to coordinate the establishment of a security focal point. Prosecution Problems To prosecute computer virus-type incidents on the federal level, such laws as the Computer Fraud and Abuse Act of 1986 (18 U.S.C. 1030) or the Wire Fraud Act (18 U.S.C. 1343) may be used. However, the 1986 act, the law most closely related to computer virus-type cases, is relatively new, untried with respect to virus- type offenses, and contains terms that are not defined. Also, the evidence in such cases tends to be highly technical, requiring prosecutors to devote much time and resources preparing for them. _____________________________________________________________________ RECOMMENDATIONS To help ensure the necessary improvements to Internet-wide security are achieved, GAO recommends that the President's Science Advisor, Office of Science and Technology Policy, coordinate the establishment of an interagency group, including representatives from the agencies that fund research networks on the Internet, to serve as the Internet security focal point. This group should -- provide Internet-wide security policy, direction, and coordination; -- support ongoing efforts to enhance Internet security; -- obtain input and feedback from Internet users, software vendors, technical advisory groups, and federal agencies regarding security issues; and -- become an integral part of whatever structure emerges to manage the National Research Network. 
AGENCY COMMENTS

As requested, GAO did not obtain official agency comments on this report. However, the views of officials from the Defense Department, National Science Foundation, and the Office of Science and Technology Policy were obtained and incorporated in the report where appropriate.

CHAPTER 1
INTRODUCTION

On Wednesday, November 2, 1988, a virus** appeared on the Internet, the main computer network system used by U.S. researchers. The virus reportedly infected up to 6,000 computers, consuming resources and hampering network operations. The Internet, an unclassified multi-network system connecting over 500 networks and over 60,000 computers nationwide and overseas, has come to play an integral role within the research community. A user on any one of the thousands of computers attached to any Internet network can reach any other user and has potential access to such resources as supercomputers and data bases.

This chapter presents an overview of the Internet--how it evolved, how it is used and managed, and what plans there are for its further development--as well as a description of the events surrounding the Internet virus.

** Although there is no standard definition, technical accounts sometimes use the term "worm" rather than "virus" to refer to the self-propagating program introduced on November 2. The differences between the two are subtle, the essential one being that worms propagate on their own while viruses, narrowly interpreted, require human involvement (usually unwitting) to propagate. However, their effects can be identical. We have chosen to use the term virus in deference to popular use.

INTERNET EVOLVES FROM AN EXPERIMENTAL NETWORK

The Internet began as an experimental, prototype network called Arpanet, established in 1969 by the Department of Defense's Defense Advanced Research Projects Agency (DARPA). Through Arpanet, DARPA sought to demonstrate the possibilities of computer networking based on packet-switching technology.** Subsequently, DARPA sponsored several other packet-switching networks. In the 1970s, recognizing the need to link these networks, DARPA supported the development of a set of procedures and rules for addressing and routing messages across separate networks. These procedures and rules, called the "Internet protocols," provided a universal language allowing information to be routed across multiple interconnected networks.

** Packet switching is a technique for achieving economical and effective communication among computers on a network. It provides a way to break a message into small units, or packets, for independent transmission among host computers on a network, so that a single communication channel can be shared by many users. Once the packets reach their final destination, they are reassembled into the complete message.

From its inception, Arpanet served as a dual-purpose network, providing a testbed for state-of-the-art computer network research as well as network services for the research community. In the 1980s, the number of networks attached to Arpanet grew as technological advances facilitated network connections. By 1983 Arpanet had become so heavily used that Defense split off operational military traffic onto a separate system called Milnet, funded and managed by the Defense Communications Agency. Both Arpanet and Milnet are unclassified networks. Classified military and government systems are isolated and physically separated from these networks. 
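The packet-switching idea described in the footnote above lends itself to a brief illustration. The following Python sketch is purely illustrative: the packet size, message text, and function names are invented for the example and are not drawn from any actual Internet protocol. It shows only the basic idea of breaking a message into small, numbered units that can travel independently and be reassembled at the destination.

    # Minimal sketch of packet switching (illustrative only; not a real protocol).
    PACKET_SIZE = 8  # characters of message text carried per packet (arbitrary)

    def split_into_packets(message: str) -> list[tuple[int, str]]:
        """Break a message into small, independently transmittable units.

        Each packet carries a sequence number so the destination can
        reassemble the original message even if packets arrive out of order.
        """
        return [
            (seq, message[i:i + PACKET_SIZE])
            for seq, i in enumerate(range(0, len(message), PACKET_SIZE))
        ]

    def reassemble(packets: list[tuple[int, str]]) -> str:
        """Sort packets back into order at the destination and rebuild the message."""
        return "".join(text for _, text in sorted(packets))

    if __name__ == "__main__":
        original = "Packets share one channel among many users."
        packets = split_into_packets(original)
        packets.reverse()  # simulate packets arriving in a different order
        assert reassemble(packets) == original
        print(f"{len(packets)} packets reassembled into: {reassemble(packets)!r}")

Because each packet is self-describing, many users' packets can be interleaved on the same communication channel, which is the economy the footnote refers to.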
Building on existing Internet technology, the National Science Foundation (NSF), responsible for nurturing U.S. science infrastructure, fostered the proliferation of additional networks. In 1985, NSF made the Internet protocols the standard for its six supercomputing centers and, in 1986, funded a backbone network--NSFnet--linking the six centers.** NSF also supported a number of regional and local area campus networks whose network connections were facilitated through NSF funding.*** As of September 1988, there were about 290 campus networks connected to NSFnet through about 13 regional networks. Many of these networks also connect to Arpanet.

** A backbone network is a network to which smaller networks are attached. Arpanet and Milnet are also backbone networks.

*** Regional networks include partial-statewide networks (e.g., Bay Area Regional Research Network in northern California), statewide networks (e.g., New York State Educational Research Network), and multi-state networks (e.g., Southern Universities Research Association Network).

Other federal agencies fund research networks. The Department of Energy, the National Aeronautics and Space Administration (NASA), and the Department of Health and Human Services (HHS) operate networks on the Internet that support their missions.

This loosely organized web of interconnected networks--including Arpanet, Milnet, NSFnet, and the scores of local and regional networks that use the Internet protocols--makes up the Internet. The Internet supports a vast, multi-disciplinary community of researchers, including not only computer scientists but physicists, electrical engineers, mathematicians, medical researchers, chemists, and astronomers.

Researchers use the Internet for a variety of functions; electronic mail, which provides a way of sending person-to-person messages almost instantaneously, is the most frequent use. Using electronic mail, researchers separated by thousands of miles can collaborate on projects, sharing results and comments daily. Other uses of the Internet include file transfer and remote access to computer data banks and supercomputers. Access to supercomputers has had a dramatic impact on scientific endeavors; experiments that took years to complete on an ordinary computer can take weeks on a supercomputer. Currently, use of the Internet is generally free-of-charge to individuals engaged in government-sponsored research.

RAPID GROWTH OF THE INTERNET

The Internet's transition from a prototype network to a large-scale multi-network has been rapid, far exceeding expectations. In the past 5 years, its growth has been particularly dramatic. For example:

-- In late 1983, the Internet comprised just over 50 networks; by the end of 1988, the number had grown to over 500.

-- In 1982, about 200 host computers were listed in a network data base; by early 1987, there were about 20,000, and by early 1989 the number exceeded 60,000.**

** Host computers, which include supercomputers, mainframes, and minicomputers, are the machines, attached to the networks, that run application programs.

-- An October 1988 NSF network publication estimated that there were over half a million Internet users.**

** "NSF Network News", No. 5, NSF Network Service Center, Oct. 1988.

Funding for Internet operations comes from the five agencies (DARPA, NSF, Energy, NASA, and HHS) involved in operating research networks and from universities, states, and private companies involved in operating and participating in local and regional networks. 
A 1987 Office of Science and Technology Policy (OSTP) report estimated federal funding to be approximately $50 million. A national information technology consortium official estimated that university investments in local and regional networks are in the hundreds of millions of dollars; state investments are estimated in the millions and rapidly growing.** ** Industry also invests in local and regional networks; however, the amount of that investment could not be determined. MANAGEMENT IN A DECENTRALIZED ENVIRONMENT Management of the Internet is decentralized, residing primarily at the host site and individual network levels. Early in the Internet's development, responsibility for managing and securing host computers was given to the end-users--the host sites, such as college campuses and federal agencies, that owned and operated them. It was believed that the host sites were in the best position to manage and determine a level of security appropriate for their systems. Further, DARPA's (Arpanet's developer and the major federal agency involved in the Internet in its early years) primary function was in fostering research in state-of-the-art technology rather than operating and managing proven technology. At each host site, there may be many host computers.** These computers are controlled by systems managers who may perform a variety of security-related functions, including -- establishing access controls to computers through passwords or other means; -- configuration management, enabling them to control the versions of the software being used and how changes to that software are made; -- software maintenance to ensure that software holes (flaws) are repaired; and -- security checks to detect and protect against unauthorized use of computers. ** For example, at the University of California, Berkeley, there are over 2,000 host computers. Operational Management at the Network Level Each of the Internet's more than 500 networks maintains operational control over its own network, be it a backbone network (such as NSFnet), a regional network, or a local area network. Distributed responsibility allows for use of different technologies as well as different types of administration. Each network is autonomous and has its own operations center that monitors and maintains its portion of the Internet. In addition, some of the larger networks maintain information centers that provide information on network use and resources. No Internet-wide Management No one agency or organization is responsible for overall management of the Internet. According to a DARPA official, decentralization provided the needed flexibility for the Internet's continuing growth and evolution. Within the Internet, networks operated by government agencies serve as backbones to connect autonomous regional and local (campus) networks. Agency backbone networks were established with agency missions in mind, and their structures and modes of operation generally reflect individual agency philosophies. In the fall of 1987, representatives of the five federal agencies--DARPA, NSF, Energy, NASA, HHS--that operate Internet research networks joined forces to form the Federal Research Internet Coordinating Committee (FRICC). The objectives of this informal group include coordinating network research and development, facilitating resource sharing, reducing operating costs, and consolidating requirements for international connections of the participating agencies. 
Currently, FRICC is involved in developing plans to upgrade the Internet and improve services.

FUTURE OF THE INTERNET

The Internet, long characterized by growth and change, is evolving into an enhanced, upgraded system to be called the National Research Network. Plans are for the enhanced network system to serve as a superhighway that would run faster, reach farther, and be more accessible than any other computer network system in the world.

The National Research Network will include a number of high-speed networks, including NSFnet, Defense Research Internet, and other research networks funded by NASA, Energy, and HHS.** The networks will use a shared, cross-country, high-capacity link called the Research Interagency Backbone.

** Within the next few years, Arpanet will be replaced as an all-purpose network by NSFnet. A Defense Research Internet will be created for experimental work in computer networking.

The initiative for an upgraded network stemmed from two high-level studies prepared by the Office of Science and Technology Policy and an ad hoc committee of the National Research Council.** OSTP has a broad mandate to coordinate and develop federal science policy. Within OSTP, the Congress established the Federal Coordinating Council on Science, Engineering and Technology (FCCSET) to initiate interagency consideration of broad national issues and coordinate government programs.

** "A Research and Development Strategy for High Performance Computing", Office of Science and Technology Policy (Washington, D.C., Nov. 1987), and "Toward a National Research Network", National Network Review Committee, National Academy Press (Washington, D.C., 1988).

Both studies noted the critical importance of a modern, high-speed research network in providing for research and technology development. They concluded that current network technology did not adequately support scientific collaboration and that U.S. networks, commercial and government-sponsored, were not coordinated, had insufficient capacity, and did not assure privacy. The studies recommended that a national research network be established to improve network capabilities.

The Chairman of the FCCSET Subcommittee on Networking has asked FRICC to develop a coordinated, multi-agency implementation plan for the National Research Network. FRICC has taken some initial steps toward upgrading the Internet. FRICC's NSF representative has agreed to take the lead in organizing the National Research Network, coordinating multi-agency efforts and the development of long-term management plans. In early 1989, NSF sent out a request for proposals to provide and manage the Research Interagency Backbone.

INTERNET VIRUS SPREAD OVER NETWORKS TO VULNERABLE COMPUTERS

The Internet virus, which entered computers and continuously recopied itself, was not the first virus-type program to infect computers. However, it differed from earlier viruses in several key respects. First, previous viruses were almost always limited to personal computers (PCs), whereas the Internet virus infected larger systems, such as minicomputers, workstations, and mainframes. In addition, the Internet virus was the first to spread over a network automatically (i.e., without requiring other programs or user intervention to transmit it). The networks themselves (i.e., the communications hardware and software that connect the computer systems) were not infected by the virus; rather, they served as a roadway enabling the virus to spread rapidly to vulnerable computers. 
In transit, the virus was indistinguishable from legitimate traffic and, thus, could not be detected until it infected a computer. The principal symptoms of the virus were degradation of system response and loss of data storage space on file systems.

How the Virus Spread

The Internet virus spread largely by exploiting security holes in systems software based on the Berkeley Software Distribution UNIX system and by taking advantage of vulnerabilities in host site security policies.** UNIX is the most commonly used operating system on the Internet--a University of California, Berkeley, researcher estimated that about three-quarters of the computers attached to the Internet use some version of UNIX. The machines infected were VAX and Sun-3 computer systems.***

** UNIX is a registered trademark of AT&T Laboratories. Berkeley distributes its own version of UNIX, and a number of other systems manufacturers have selected the Berkeley UNIX version as the basis for their own operating systems. The virus did not attack the operating system's "kernel" that manages the system; rather, it exploited flaws in peripheral service or utility programs.

*** VAX and Sun-3 computers are built by Digital Equipment Corporation and Sun Microsystems, Inc., respectively.

The virus propagated by using four methods of attack:**

** See appendix I for a more detailed account of the security flaws the virus exploited.

Sendmail: A utility program that handles the complex tasks of routing and delivering computer mail. The virus exploited a "debug" feature of sendmail that allowed a remote operator to send executable programs. After issuing the debug command, the virus gave orders to copy itself.

Fingerd: A utility program that allows users to obtain public information about other users, such as a user's full name or telephone extension. A hole in the program allowed the virus to propagate to distant machines.

Passwords: The virus tried different methods to guess user passwords. Once the virus gained access through a correct password, it could masquerade as a legitimate user and exercise that user's privileges to gain access to other machines.

Trusted hosts: Trusted host features provide users convenient access to each other's resources. This is not a software hole; it is a convenience sometimes used on local networks where users frequently use services provided by many different computers. By using these features, the virus spread quickly within local networks once one computer had been penetrated.

Chronology of the Virus

The onset of the virus was extremely swift. The first reports of the virus came from several sites at 9 p.m., Eastern Standard Time, on Wednesday, November 2. An hour later, the virus was reported at multiple Internet sites, and by early morning, November 3, the virus had infected thousands of computer systems. Most of the nation's major research centers were affected, including Energy's Lawrence Livermore National Laboratory; NASA's Ames Research Center; the University of California, Berkeley; the Massachusetts Institute of Technology (MIT); Carnegie Mellon University; Cornell University; Purdue University; and many others. The virus also affected sites on Milnet and several overseas sites. As noted earlier, the Internet is an open, unclassified network; the virus did not affect classified government or operational military systems. 
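Before continuing the chronology, the password-guessing method described under "How the Virus Spread" can be illustrated with a short sketch. The Python example below is illustrative only: the account names, passwords, hashes, and word list are invented for the example, and the standard hashlib.sha256 routine stands in for the UNIX crypt() function the virus actually worked against. It shows the style of attack, trying each account name, a few simple variations of it, and a small list of common words.

    # Illustrative sketch of simple password guessing (all data below is invented).
    import hashlib

    def hash_password(password: str) -> str:
        # Stand-in for the UNIX crypt() routine; not the actual mechanism.
        return hashlib.sha256(password.encode()).hexdigest()

    # A pretend password file: account name -> stored hash (hypothetical accounts).
    PASSWORD_FILE = {
        "asmith": hash_password("asmith"),      # password identical to account name
        "jdoe":   hash_password("eodj"),        # account name reversed
        "rchen":  hash_password("wombat"),      # a common dictionary word
        "lkim":   hash_password("Tq9#x!vR2m"),  # not guessable by these methods
    }

    SMALL_WORD_LIST = ["password", "wombat", "secret", "computer"]

    def candidate_guesses(account: str):
        """Yield the kinds of obvious guesses described in this chapter."""
        yield account                 # the account name itself
        yield account[::-1]           # the account name reversed
        yield account + account       # the account name doubled
        yield from SMALL_WORD_LIST    # a short list of common words

    def guess_passwords(password_file: dict[str, str]) -> dict[str, str]:
        """Return the accounts whose passwords were guessed, and the guesses."""
        cracked = {}
        for account, stored_hash in password_file.items():
            for guess in candidate_guesses(account):
                if hash_password(guess) == stored_hash:
                    cracked[account] = guess
                    break
        return cracked

    if __name__ == "__main__":
        for account, guess in guess_passwords(PASSWORD_FILE).items():
            print(f"{account}: guessed password {guess!r}")
        # Three of the four hypothetical accounts fall to these trivial guesses.

A password guessed this way let the virus masquerade as a legitimate user, which is why the lax password practices discussed in chapter 2 mattered so much.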
Once the virus was detected, many sites disconnected their computers from the Internet, leaving only one or two computers running to communicate with other sites and to permit study of virus activity. By Thursday, November 3, the sendmail and fingerd holes had been identified, and by late that night, the Computer Systems Research Group at the University of California, Berkeley, had posted patches on network bulletin boards to mend the holes.** By Friday evening, the virus had been eliminated at most sites. At a November 8 virus post-mortem conference, hosted by the National Security Agency's National Computer Security Center (NCSC), attendees concluded that the virus had been analyzed and eradicated by computer science experts located primarily at university research institutions, with U.S. government personnel playing a small role. ** A patch is a modification made to an object program. Patches to the sendmail hole had been posted on Thursday morning. OBJECTIVES, SCOPE, AND METHODOLOGY In response to an October 14, 1988, request of the Chairman, Subcommittee on Telecommunications and Finance, House Committee on Energy and Commerce, and subsequent agreements with his office, the objectives of our review were to -- describe the virus incident, -- examine issues relating to Internet security and vulnerabilities, and -- discuss factors affecting the prosecution of computer virus incidents. In addition, we sought to identify federal research directed specifically at viruses and to provide an overview of research that may improve security on open networks, such as the Internet. To understand the nature, structure, and management of the Internet and to determine events surrounding the Internet virus and related security issues, we reviewed: -- Reports, analyses, and briefings prepared by NCSC, DARPA, the Defense Communications Agency, NSF, NASA, and the Department of Energy. -- Academic analyses prepared by individuals associated with MIT, Purdue University, and the University of Utah. -- Accounts of the virus and its aftermath in scientific publications, industry journals, and newspapers. We discussed the virus incident, implications of an open network environment, security issues, the need for increased centralized management, and the National Research Network with -- Officials from the agencies listed above as well as from the National Institute of Standards and Technology (NIST), OSTP, FCCSET, FRICC, the Office of Management and Budget, and the General Services Administration. -- Officials representing systems software vendors, including the Computer Systems Research Group of the University of California, Berkeley; Sun Microsystems, Inc.; and Digital Equipment Corporation. -- Network users representing federal and academic sites, including Harvard University, MIT, NASA's Ames Research Center, Energy's Lawrence Livermore National Laboratory, and the University of California, Berkeley. -- Officials from private sector security companies in the Washington, D.C., area and California and from SRI, International, which operates the Defense-funded Network Information Center. To obtain a perspective on factors affecting the prosecution of computer virus offenses, we discussed the relevant laws with officials of the Federal Bureau of Investigation, Department of Justice, and Secret Service. We also discussed these issues with representatives of the Colorado Association of Computer Crime Investigators and the University of Colorado's Computer Law Center. 
We discussed research aimed at improving computer and open network security with officials from government agencies and systems software vendors cited above; with members of the Internet Activities Board, a technical group concerned with Internet standards; and with officials from Bolt, Beranek, and Newman, Inc., which maintains Arpanet's Network Operations Center. We did not develop a complete inventory of current research nor did we evaluate its potential effectiveness. Our work was performed in accordance with generally accepted government auditing standards. We performed our work primarily between November 1988 and March 1989 in Washington, D.C., and at research institutions and vendor locations in Massachusetts and California. We discussed the contents of a draft of this report with DARPA, NSF, and OSTP officials, and their comments have been incorporated where appropriate. However, as requested, we did not obtain official agency comments. CHAPTER 2 VIRUS FOCUSES ATTENTION ON INTERNET VULNERABILITIES Although the virus spread swiftly over the networks to vulnerable computers, it apparently caused no permanent damage. However, the virus highlighted vulnerabilities relating to (1) the lack of a focal point for responding to Internet-wide security problems, (2) host site security weaknesses, and (3) problems in developing, distributing, and installing software fixes. A number of agencies and organizations have taken actions since the virus to address identified problems. However, we believe that these actions alone will not provide the focus needed to adequately address the Internet's security vulnerabilities. IMPACT OF VIRUS The virus caused no lasting damage; its primary impact was lost processing time on infected computers and lost staff time in putting the computers back on line. The virus did not destroy or alter files, intercept private mail, reveal data or passwords, or corrupt data bases. No official estimates have been made of how many computers the virus infected, in part because no one organization is responsible for obtaining such information. According to press accounts, about 6,000 computers were infected. This estimate was reportedly based on an MIT estimate that 10 percent of its machines had been infected, a figure then extrapolated to estimate the total number of infected machines. However, not all sites have the same proportion of vulnerable machines as MIT. A Harvard University researcher who queried users over the Internet contends that a more accurate estimate would be between 1,000 and 3,000 computers infected. Similar problems exist in trying to estimate virus-related dollar loss. The total number of infected machines is unknown, and the amount of staff time expended on virus-related problems probably differed at each site. The Harvard University researcher mentioned earlier estimated dollar losses to be between $100,000 and $10 million. Estimated losses from individual sites are generally not available. However, NASA's Ames Research Center and Energy's Lawrence Livermore National Laboratory, two major government sites, estimated their dollar losses at $72,500 and $100,000, respectively. These losses were attributed primarily to lost staff time. Although the virus is described as benign because apparently no permanent damage was done, a few changes to the virus program could have resulted in widespread damage and compromise, according to computer experts. 
For example, these experts said that with a slightly enhanced program, the virus could have erased files on infected computers or remained undetected for weeks, surreptitiously changing information on computer files. VULNERABILITIES HIGHLIGHTED BY VIRUS In the aftermath of the virus, questions have been raised about how the virus spread, how it was contained, and what steps, if any, are needed to increase Internet security. These questions have been the subject of a number of post-virus meetings and reports prepared by government agencies and university researchers.** ** Major meetings included (1) a November 8 NCSC-hosted meeting to review the virus attack and its aftermath, attended by over 75 researchers and administrators from government and academia and (2) a December 2 meeting of UNIX vendors and users, hosted by NCSC, NIST, and a users group. Based on these assessments, we believe that the virus incident revealed several vulnerabilities that made it easier for the virus to spread and more difficult for the virus to be eradicated. These vulnerabilities also came into play in later intrusions (not involving a virus) onto several Internet sites in November and December. The vulnerabilities--lack of a focal point for addressing Internet-wide security problems; security weaknesses at some host sites; and problems in developing, distributing, and installing systems software fixes--are discussed below. Lack of a Focal Point to Address Internet-wide Security Problems During the virus attack, the lack of an Internet security focal point made it difficult to coordinate emergency response activities, communicate information about the virus to vulnerable sites, and distribute fixes to eradicate it. A Defense Communications Agency account of the virus cited a series of problems stemming from the lack of a central, coordinating mechanism. For example: -- Although the virus was detected at various sites, users did not know to whom or how to report the virus, thus hindering virus containment and repair. -- There were no plans or procedures for such an emergency situation. People used ad hoc methods to communicate, including telephone or facsimile. In many instances, sites disconnected from the Internet. While effective in the short run, this action also impeded communications about fixes. -- It was unclear who was responsible for protecting networks from viruses, resulting in confusion among user, network, and vendor groups. The confusion surrounding the virus incident was echoed by many Internet users. For example: -- A Purdue University researcher concluded that user response to the virus was ad hoc and resulted in duplicated effort and failure to promptly disseminate information to sites that needed it.** -- At Energy's Los Alamos National Laboratory, researchers reported that they received conflicting information on fixes. Because they did not have a UNIX expert on site, they had difficulty determining which fix was reliable. -- At Harvard University, researchers expressed frustration at the lack of coordination with other sites experiencing the same problems. ** Eugene H. Spafford, "The Internet Worm Program: An Analysis", Department of Computer Sciences, Purdue University, Nov. 1988. 
In a report resulting from NCSC's post-mortem meeting, network sponsors, managers, and users from major sites--including Defense's Army Ballistic Research Laboratory, Energy's Lawrence Livermore National Laboratory, DARPA, Harvard, MIT, and the University of California, Berkeley--called for improved communications capabilities and a centralized coordination center to report problems to and provide solutions for Internet users. Host Security Weaknesses Facilitated Spread of Virus Key to the Internet's decentralized structure is that each host site is responsible for establishing security measures adequate to meet its needs. Host computers are frequently administered by systems managers, typically site personnel engaged in their own research, who often serve as systems managers on a part-time basis. According to virus incident reports as well as network users, weaknesses at host sites included (1) inadequate attention to security, such as poor password management, and (2) systems managers who are technically weak. Inadequate Attention to Security Discussions of computer security frequently cite the trade- offs between increased security and the sacrifices, in terms of convenience, system function, flexibility, and performance, often associated with security measures. In deciding whether to establish additional security measures, systems managers must often be willing to make sacrifices in these areas. According to Internet users from academia, government, and the private sector, systems managers at research sites often are not very concerned with security. One example of a trade-off between security and convenience involves trusted host features on UNIX that allow users to maintain a file of trusted computers that are granted access to the user's computer without a password. The trusted host features make access to other computers easier; however, they also create potential security vulnerabilities because they expand the number of ways to access computers. The virus took advantage of the trusted host features to propagate among accounts on trusted machines. Some sites discourage use of the trusted host features; however, other sites use them because of their convenience. One Internet user observed that users do not like to be inconvenienced by typing in their password when accessing a trusted computer, nor do they want to remember different passwords for each computer with which they communicate. Another example involving inadequate attention to security is in password management. According to an NSF official, a major vulnerability exploited by the virus was lax password security. The official stated that too few sites observe basic procedures that reduce the risk of successful password guessing, such as prohibiting passwords that appear in dictionaries or other simple word lists and periodically changing passwords. The relative ease with which passwords can be guessed was discussed in an analysis of the Internet virus done by a University of Utah researcher.** He cited a previous study demonstrating that out of over 100 password files, up to 30 percent were guessed using just the account name and a couple of variations. ** Donn Seeley, "A Tour of the Worm", Department of Computer Science, University of Utah, Nov. 1988. Unpublished report. Careful control over passwords often inconveniences users to some degree. 
For example, an article in Computers and Security, an international journal for computer security professionals, notes that computer-generated passwords tend to be more secure than user-selected passwords because computer-generated passwords are not chosen by an obvious method easily guessed by an intruder. However, computer-generated passwords are generally more difficult to remember.** ** Belden Menkus, "Understanding the Use of Passwords", Computers and Security, Vol. 7, No. 2, April 1988. Systems Managers Who Are Technically Weak A number of Internet users, as well as NCSC and Defense Communications Agency virus reports, stated that the technical abilities of systems managers vary widely, with many managers poorly equipped to deal with security issues, such as the Internet virus. For example, according to the NCSC report, many systems managers lacked the technical expertise to understand that a virus attacked their systems and had difficulty administering fixes. The report recommended that standards be established and a training program begun to upgrade systems manager expertise. Problems in Developing, Distributing, and Installing Software Fixes Systems software is generally very complex. A major problem programmers face in software design is the difficulty in anticipating all conditions that occur during program execution and understanding precisely the implications of even small changes. Thus, systems software often contains flaws that may create security problems, and software changes often introduce new problems. Internet users and software vendors frequently cited problems relating to inadequacies in developing, distributing, and installing corrections to identified software holes. Holes that are not expeditiously repaired may create security vulnerabilities. The Internet virus incident and two later Internet intrusions highlighted problems in getting vendors to develop and distribute fixes and in having host sites install the fixes. Problems With Vendors A number of network users representing major Internet sites said that vendors should be more responsive in supplying patches to identified software holes. For example, more than 1 month after the virus, several vendors reportedly had not supplied patches to fix the sendmail and fingerd holes. Most vendors, when notified of a hole, send users a patch to repair the hole or wait until their next software revision, at which time the hole (as well as any other identified flaws) will be corrected. However, since a revision may take up to 6 to 9 months to release, the latter approach may leave systems vulnerable to security compromise for long periods. According to Internet users, critical security patches should be provided as quickly as possible and should not be delayed until the next release of the software.** ** According to a Defense official, this problem is compounded by the fact that sites not subscribing to software maintenance/support may not receive any new releases. Officials of one major vendor pointed out the problems they faced in distributing patches expeditiously. According to these officials: -- Their company sells computers with three or four different architectures, each with several versions of the UNIX operating system. When a fix is needed, they have to distribute about 12 different patches, making it difficult to develop and release patches quickly. -- Patches have to be carefully screened so that new holes will not be inadvertently incorporated. 
The officials noted that the quality assurance this screening provides is an important part of their business because their reputation depends on the quality of their software. -- Vendors have a hard time keeping track of customers who do not have service maintenance contracts. In addition, some systems are sold through contractors and the vendors may not know the contractors' customer bases. -- Disseminating a patch to thousands of users can cost a company millions of dollars. The vendor officials said they considered these factors in determining how to implement a patch. Berkeley's Computer Systems Research Group, which distributes its version of UNIX, has a software policy that differs from that of many other vendors. Berkeley generally provides source code along with the UNIX object code it sells to users.** However, Berkeley's policy is unusual--most vendors treat source code as proprietary and it is typically not provided to users. With source code, an experienced systems manager may be able to fix holes without waiting for the vendor to supply a patch or a system revision. ** Source code is the program written by the programmer. It is translated (by a compiler, interpreter, or assembler program) into object code for execution by the computer. Berkeley routinely transmits fixes to UNIX users and vendors through networks and bulletin boards. While this may result in timely fixes, it can also create security vulnerabilities. In particular, when a fix is widely disseminated, information about a vulnerability is also made apparent. Thus, there is a race between intruders seeking to exploit a hole and systems managers working to apply the fix. This dilemma was highlighted in multiple intrusions, which occurred in November and December 1988, at several Internet sites, including Lawrence Livermore National Laboratory and Mitre Corporation. In these instances, intruders exploited vulnerabilities in a UNIX utility program, called FTPD, that transfers files between Internet sites.** ** As discussed, the Internet virus exploited vulnerabilities in two other UNIX utility programs, sendmail and fingerd. Berkeley had sent out patches for the FTPD hole in October 1988. However, other UNIX vendors had not released patches for the hole. Mitre officials reported that their systems managers applied the Berkeley patch on many of their computers, but not on the computer penetrated by the intruders. Lawrence Livermore officials reported that they applied patches to computers that use Berkeley UNIX. However, the vendor for its other computers had not supplied a patch before the intrusion. Lawrence Livermore did not have source code for the other vendor's machines, so they had to wait for the vendor's patch. According to a Defense official, the intruders most likely tried to gain access to many machines until they found those machines to which patches had not been applied. Once the intruders penetrated the FTPD hole, they installed "trap doors" by adding new accounts and modifying systems routines, which allowed them continued access after the FTPD holes were closed. Officials from the Federal Bureau of Investigation and from sites involved in the intrusions said that the intruders have been identified and the case is under investigation. Reportedly, aside from the trap doors, no files were altered, and no classified systems were affected. Problems in Installing Software Fixes Even when a vendor distributes fixes, there is no assurance that sites will install them. 
Internet users and managers at several major university research and government sites cited the following reasons why fixes were not expeditiously installed:

-- Systems managers vary in their ability and motivation to manage their systems well.

-- System managers often serve on a part-time basis, and time spent on systems management takes away time from research.

-- System revisions may contain errors, so some systems managers are reluctant to install the revisions.

-- System revisions may be expensive if the system is not on a maintenance contract.

-- Some sites do not know who their system managers are and, thus, have problems ensuring that fixes get distributed and installed.

As discussed earlier, problems and confusion resulted when sites had to respond to the Internet virus. Although Berkeley posted a fix to both the sendmail and fingerd holes within 2 days after the onset of the virus and Sun Microsystems reportedly published a fix within 5 days, almost a month after the virus a number of sites reportedly still had not reconnected their host computers to the Internet.

ACTIONS TAKEN IN RESPONSE TO VIRUS

In response to the Internet virus, DARPA, NIST, NCSC,** and a number of other agencies and organizations have taken actions to enhance Internet security. These actions include developing computer security response centers, coordinating meetings, preparing publications to provide additional guidance, and publishing statements of ethics.***

** NIST is responsible for developing standards and guidelines for the security of unclassified federal computer systems. It performs these responsibilities with the National Security Agency's technical advice and assistance. The National Security Agency (of which NCSC is a part) is responsible for the security of classified information in the defense and national security areas, including that stored and processed on computers.

*** In addition, agencies are engaged in ongoing research aimed at improving network and computer security. An overview of these activities is presented in appendix II.

Computer Security Response Centers Established

In the wake of the virus, many Internet users, site managers, and agency officials have voiced concerns about problems in responding to and preventing emergency situations, such as the Internet virus. To address these concerns, some agencies are developing computer security response centers to establish emergency and preventative measures.

The first center, the Computer Emergency Response Team (CERT), was established by DARPA in mid-November 1988. CERT's mandate is broad--it is intended to support all of the Internet's research users. DARPA views CERT as a prototype effort for similar organizations in other computer communities. Also, CERT is seen as an evolving organization whose role, activities, and procedures will be defined as it gains experience responding to Internet security problems.

According to DARPA, CERT's three main functions are to provide

-- mechanisms for coordinating community response in emergency situations, such as virus attacks or rumors of attacks;

-- a coordination point for dealing with information about vulnerabilities and fixes; and

-- a focal point for discussion of proactive security measures, coordination, and security awareness among Internet users.

CERT has no authority, although it can make recommendations. CERT officials recognize the need to establish credibility and support within the Internet community so that its recommendations will be acted upon. 
CERT's nucleus is a five-person coordination center located at the Software Engineering Institute at Carnegie Mellon University in Pennsylvania.** CERT has enlisted the help of over 100 computer specialists who are on call when problems arise in their areas of expertise. In addition, CERT is developing working relationships with government organizations, including NCSC, NIST, Energy, and the Federal Bureau of Investigation, and with vendor and user groups. CERT expects to rely on DARPA funding until its value is recognized by the Internet community and alternate funding mechanisms are established--probably within 3 to 5 years. ** The objective of the institute, which is a Federally Funded Research and Development Center, is to accelerate the movement of software technology into defense systems. The Department of Energy began setting up a center at Lawrence Livermore National Laboratory in February 1989. This center is to focus on proactive preventive security and on providing rapid response to computer emergencies within the agency. The center plans to develop a data base of computer security problems and fixes, provide training, and coordinate the development of fixes. In addition, the center is considering developing software to assist in network mapping and to assure proper system configuration. Meetings Held and Guidance Issued NIST is coordinating interagency meetings to (1) draw on agency experience and develop a model for agencies to use in setting up response/coordination centers and (2) educate others on the model that is developed. NIST has also set up a computer system that may be used as a data base for computer problems and fixes and as an alternate means of communication in case the Internet's electronic mail system becomes incapacitated. In addition, NIST is planning to issue guidance this summer that will discuss threats inherent to computers and how such threats can be reduced. NCSC plans to distribute three security-related reports discussing (1) viruses and software techniques for detecting them, (2) the role of trusted technology in combating virus- related programs, and (3) security measures for systems managers. NCSC is also providing an unclassified system to serve as an alternate means of communications in case the Internet's electronic mail system is not working. Ethics Statements Released The Internet Activities Board, a technical group comprising government, industry, and university communications and network experts, issued a statement of ethics for Internet users in February 1989. Many Internet users believe there is a need to strengthen the ethical awareness of computer users. They believe that a sense of heightened moral responsibility is an important adjunct to any technical and management actions taken to improve Internet security. The Board endorsed the view of an NSF panel that characterized any activity as unethical and unacceptable that purposely -- seeks to gain unauthorized access to Internet resources; -- disrupts the intended use of the Internet; or -- wastes resources, destroys the integrity of computer- based information, or compromises users' privacy. The Computer Professionals for Social Responsibility and various network groups have also issued ethics statements encouraging (1) enforcement of strong ethical practices, (2) the teaching of ethics to computer science students, and (3) individual accountability. 
CONCLUSIONS In the 20 years in which it evolved from a prototype DARPA network, the Internet has come to play an integral role in the research and development community. Through the Internet, researchers have been able to collaborate with colleagues, have access to advanced computing capabilities, and communicate in new ways. In providing these services, the Internet has gone beyond DARPA's original goal of proving the feasibility of computer networking and has served as a model for subsequent public data networks. Since there is no lead agency or organization responsible for Internet-wide policy-making, direction, and oversight, management on the Internet has been decentralized. We believe this is because, at least in part, Internet developments were driven more by technological considerations than by management concerns and because decentralized authority provided the flexibility needed to accommodate growth and change on an evolving network. However, we believe that the Internet has developed to the point where a central focus is necessary to help address Internet security concerns. These concerns will take on an even greater importance as the Internet evolves into the National Research Network, which will be faster, more accessible, and have more international connections than the Internet. The Internet virus and other intrusions highlighted certain vulnerabilities, including -- lack of a focal point in addressing Internet-wide security issues, contributing to problems in coordination and communications during security emergencies; -- security weaknesses at some host sites; and -- problems in developing, distributing, and installing systems software fixes. Since the virus, various steps have been taken to address concerns stemming from the incident, from creating computer security response centers to issuing ethics statements to raise the moral awareness of Internet users. We support these actions and believe they are an important part of the overall effort required to upgrade Internet security. Host sites may need to take additional actions to heighten security awareness among users and to improve identified host level weaknesses, such as lax password management. However, many of the vulnerabilities highlighted by the virus require actions beyond those of individual agencies or host sites. For this reason, we believe that a security focal point should be established to fill a void in the Internet's management structure and provide the focused oversight, policy-making, and coordination necessary at this point in the Internet's development. For example, we believe that concerns regarding the need for a policy on fixes for software holes would be better addressed by a security focal point representing the interests of half a million Internet users than by the ad hoc actions of host sites or networks. Similarly, a security focal point would better ensure that the emergency response teams being developed by different Internet entities are coordinated and that duplication is lessened. There are no currently available technical security fixes that will resolve all of the Internet's security vulnerabilities while maintaining the functionality and accessibility that researchers believe are essential to scientific progress. Similarly, there is no one management action that will address all of the Internet's security problems. However, we believe concerted action on many fronts can enhance Internet security and provide a basis for security planning on the National Research Network. 
FRICC, an informal group made up of representatives of the five agencies that operate Internet research networks, is attempting to coordinate network research and development, facilitate resource sharing, and reduce operating costs. However, no one agency or organization has responsibility for Internet-wide management and security. The Office of Science and Technology Policy, through its Federal Coordinating Council on Science, Engineering and Technology, has, under its mandate to develop and coordinate federal science policy, taken a leadership role in coordinating development of an interagency implementation plan for the National Research Network. Therefore, we believe that the Office, through FCCSET, would be the appropriate body to coordinate the establishment of a security focal point. RECOMMENDATION We recommend that the President's Science Advisor, Office of Science and Technology Policy, through FCCSET, coordinate the establishment of an interagency group to serve as an Internet security focal point. This group should include representatives from the federal agencies that fund Internet research networks. As part of its agenda, we recommend that this group: -- Provide Internet-wide policy, direction, and coordination in security-related areas to help ensure that the vulnerabilities highli