28-May-86 22:21:02-PDT,16307;000000000000
Mail-From: NEUMANN created at 28-May-86 22:19:00
Date: Wed 28 May 86 22:19:00-PDT
From: RISKS FORUM (Peter G. Neumann, Coordinator)
Subject: RISKS-2.55
Sender: NEUMANN@SRI-CSL.ARPA
To: RISKS-LIST@SRI-CSL.ARPA

RISKS-LIST: RISKS-FORUM Digest, Wednesday, 28 May 1986  Volume 2 : Issue 55

FORUM ON RISKS TO THE PUBLIC IN COMPUTER SYSTEMS
ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  Culling through RISKS headers; SDI (Jim Horning)
  Blind Faith in Technology, and Caspar Weinberger (Herb Lin)
  Risks of doing software quality assurance too diligently
    (PGN from Chris Shaw and the Torrance Daily Breeze)
  Collegiate jungle (Mike McLaughlin)
  Decease and Desist -- Death by Computer (Deborah L. Estrin)
  The Death of the Gossamer Time Traveler (Peter G. Neumann)
  Computer Ethics (Bruce A. Sesnovich)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in
good taste, objective, coherent, concise, nonrepetitious.  Diversity is
welcome.  (Contributions to RISKS@SRI-CSL.ARPA, Requests to
RISKS-Request@SRI-CSL.ARPA.)  (Back issues Vol i Issue j stored in
SRI-CSL:RISKS-i.j.  Vol 1: MAXj=45.)

----------------------------------------------------------------------

Date: Tue, 27 May 86 11:51:06 pdt
From: horning@src.DEC.COM (Jim Horning)
To: RISKS-REQUEST@SRI-CSL.ARPA (RISKS FORUM, Peter G. Neumann, Coordinator)
Subject: SDI; Culling through RISKS headers

[Message entirely edited]

[[Jim and several others called my attention to an article in the NYTimes
of 27 May 86, page 9.  I have excerpted from the article, as follows.  PGN]

"Feasible Computer Control For Missile Shield Doubted" by Charles Mohr
(Special to the New York Times):

"An expert [Jim Horning] in computer programs who was asked to advise on
research into defense against long-range nuclear missiles says he is
skeptical that a reliable computer system to control such a defense can
ever be devised."

The article quotes from a letter from Jim Horning to Douglas Waller (on the
staff of Senator William Proxmire):

"To date no system of this complexity has performed as expected (or hoped)
in its first full-scale operational test; no one has advanced any reason to
expect that an S.D.I. would either.  A huge system that is intended to be
used at most once, and cannot be realistically tested in advance of use,
simply cannot be trusted."

The article also quotes a statement signed by 36 of the 61 experts who
attended a workshop on computing March 16-19 at Pacific Grove CA:

"The effective defense from nuclear annihilation of the lives, homes and
property of the American people, as embodied by the Strategic Defense
Initiative (Star Wars), requires highly reliable computer systems of
unprecedented complexity.  As experts in reliable computing, we strongly
believe that a system meeting these requirements is technologically
infeasible."

The article notes Dave Parnas' role in the ongoing discussions, and also:
"Lieut. Gen. James A. Abrahamson, the director of the missile defense
organization, has said that computer programming was probably the most
difficult technical problem faced by his group.  But he stresses an
optimistic view that it can be solved and argues that Mr. Parnas has
applied 'unrealistically high criteria'."

Mr. Horning, who like Mr. Parnas has written computer programs for weapons
systems, is supportive of Mr. Parnas, observing that "there has been a
movement toward Parnas' position" among those knowledgeable about
technology.
The article also quotes from Jim Horning's "trip report" on his
participation in a meeting of the Strategic Defense Initiative Organization
(see RISKS-1.2, 28 August 85).  END of PGN excerpting.]

  [Wow, it is 9 months to the day since RISKS-1.2, and we've had 99 issues
  (not counting the "pilot issue", RISKS-1.1, on 1 Aug 85).  I hope we are
  not overwhelming you, but I also hope we can keep up the generally good
  quality of contributions.  PGN]

------------------------------

Date: Sun, 25 May 1986 17:45 EDT
From: LIN@XX.LCS.MIT.EDU
To: RISKS@SRI-CSL.ARPA
Subject: Blind Faith in Technology, and Caspar Weinberger

On the subject of blind faith in technology, it is interesting to note that
when initial reports after the bombing of Libya indicated that U.S. bombers
had hit the French Embassy, Weinberger said, "That's impossible.  They
weren't ordered to do that."

------------------------------

Date: Wed 28 May 86 21:02:44-PDT
From: Peter G. Neumann
Subject: Risks of doing software quality assurance too diligently
To: RISKS@SRI-CSL.ARPA

From the Torrance Daily Breeze, 19 May 1986, page 1, courtesy of Chris Shaw:

  Death threats dog fired whistleblower (by James Hart, Aerospace writer)

  Finding a new job after getting fired can be hard enough, but Edward F.
  Wilson never expected the death threats.  Wilson, a computer software
  engineer fired nearly a year ago from a small Hawthorne-based aerospace
  company, says he's paying the price for speaking out against government
  contracting abuses.  The threats -- anonymous, of course -- have come
  over the telephone twice in recent weeks at his Long Beach home...

  "Whistleblowing, I'm afraid, is not very popular," he said with a sigh.
  He said that soon after being asked ... to draw up software
  quality-assurance programs required by the government, he realized that
  Amex Systems officials were doing it strictly for show.  "They said to me
  on several occasions that they had no intention of implementing them," he
  said.

The article goes on to document Wilson's memo to his employer, his being
fired for "being a troublemaker", his filing a wrongful-discharge suit, the
ensuing criminal investigation currently under way on unnamed government
programs, various denials, etc.

Dina Rasor, director of the Project on Military Procurement, a self-styled
watchdog agency in Washington, D.C., spoke about the situation: "I've heard
of whistleblowers being blackballed from the industry and of government
whistleblowers put in 'do-nothing' jobs, but in five years of working with
these people I've never had anyone receive a death threat.  ...  What I've
found so unusual about Ed Wilson is that he made his complaints known to
the company well before he was fired.  He hasn't brought all this up later
as sour grapes."

Wilson said he remains optimistic that he will eventually find a job, but
admits his "faith in the system is diminishing."  "I did what I thought was
in the best interests of the country," he said.

------------------------------

Date: Tue, 27 May 86 08:42:00 edt
From: mikemcl@nrl-csr (Mike McLaughlin)
To: risks@sri-csl.ARPA
Subject: Collegiate jungle

Darwinian selection will solve the backup problem on campus.  Them that
backs up will survive; them that don't, won't.  Permission is granted to
delete "campus" and insert any other sphere of computer-supported activity
presently known or yet to be discovered.

Mike McLaughlin

------------------------------

Date: Mon, 26 May 86 18:43:45 pdt
From: estrin%usc-cseb@usc-cse.usc.edu (Deborah L. Estrin)
To: neumann@sri-csl%usc-cseb@usc-cse.usc.edu
Subject: Decease and Desist -- Death by Computer
ReSent-To: RISKS@SRI-CSL.ARPA

An editorial by Forman Brown, on the subject of computer error, appeared in
yesterday's (Saturday's) LA Times.  Following are a few excerpts:

  "I first became aware of my death last May when my checks began to
  bounce.  Never having experienced bouncing checks before, and knowing
  that I had quite a respectable balance at the bank, I was both shocked
  and angry.  When I examined the returned checks and found, stamped over
  my signature on each of them, in red ink, 'Deceased', I was mystified.
  Then, when one of the recipients of my checks, a utility company,
  demanded that I appear in person, cash in hand, plus $10 for their
  trouble -- their trouble -- I was shocked, angry and mystified.  I
  wondered just how they expected us deceased to acquiesce."

To paraphrase the rest: Brown went to the bank, where a series of tellers
could not believe such a thing had happened, said it was probably the
computer's fault, and sent him home to write new checks and explanations --
including one to a friend who thought he was dead because of the "Deceased"
notice on the bounced check.  The next month he found that his Social
Security payment had not been credited to his account.  On investigation he
found that whatever troubled the bank's computers "had spread to those of
the Social Security system as well."  This went on for a couple of months
despite visits to Social Security.  Finally the bank agreed to credit the
amount to his account until Social Security started payment again -- which
it did several months later.

Brown thought the story was over until his physician contacted him recently
to say that Medicare had refused to accept his bill for services rendered,
because the date of the service was six months later than the date of the
patient's decease...

He concludes by saying that if he were 20, all this might merely be
irritating, but since he is 85 the prospect of death is too near to be
treated lightly.

------------------------------

Date: Wed 28 May 86 22:08:47-PDT
From: Peter G. Neumann
Subject: The Death of the Gossamer Time Traveler
To: RISKS@SRI-CSL.ARPA

Dr. Paul MacCready has had some marvelous successes, including the first
and only human-powered flight across the English Channel, in 1979 on his
Gossamer Albatross.  His Time Traveler, a short-winged model of the
prehistoric Quetzalcoatlus northropi from 65 million years ago, had made
something like 43 consecutive safe flights and starred in a film, "On the
Wing", replicating the original appearance and flying style of QN.
Weighing in at 44 pounds, it includes battery-operated motors, a
computerized autopilot, and ground-based radio controls.

Unfortunately, the bird chose the day of its first public appearance, 17
May 86 at Andrews Air Force Base, to have its head break off.  Computer
archaeologists of the future will of course try to ascertain whether the
accident was due to human error in overtaxing the creature, to a computer
program bug in the safety controls that might otherwise have prevented
flight instability, or to some other cause.  We hope that the head crash
can be repaired.  The construction cost, variously reported as $500,000 and
$700,000, was funded by the National Air and Space Museum and the Johnson
Wax Company.  [Maybe this was inspired by its more modern precursor, the
"one-SEATER WAX-WING".]

Your roving [raving or raven'?] reporter, PGN

------------------------------

Date: Tue, 27 May 86 13:36:39 edt
From: rti-sel!dg_rtp!rtp41!dg_rama!bruces%mcnc.csnet@CSNET-RELAY.ARPA
To: rtp41!dg_rtp!rti-sel!risks@SRI-CSL.ARPA
Subject: Computer Ethics

The following is a copy of a review I wrote for a recent newsletter of the
Boston chapter of Computer Professionals for Social Responsibility (CPSR).
Readers of RISKS may be interested as well.

METAPHILOSOPHY is a British journal, published three times yearly, that is
dedicated to considerations about particular schools, fields, and methods
of philosophy.  The October 1985 issue, Computers & Ethics (Volume 16,
Issue 4), is recommended reading [...].  The issue's articles attempt to
define and delimit the scope of Computer Ethics, and they examine several
emerging and current concerns within the field.

One current concern is responsibility for computer-based errors.  In his
article on the subject, John W. Snapper asks "...whether it is advisable to
...write the law so that a machine is held legally liable for harm."  The
author invokes Aristotle's "Nicomachean Ethics" (!) in an analysis of how
computers make decisions, and of what is meant by "decision" in this
context.  On the same subject, William Bechtel goes one step further,
considering the possibility that computers could one day bear not only
legal but also moral responsibility for decision-making: "When we have
computer systems that ...can be embedded in an environment and adapt their
responses to that environment, then it would seem that we have captured all
those features of human beings that we take into account when we hold them
responsible."

Deborah G. Johnson discusses another concern: ownership of computer
programs.  In "Should Computer Programs Be Owned?," Ms. Johnson criticizes
utilitarian arguments for ownership, as well as arguments based upon
Locke's labor theory of property.  The proper limits of extant legal
protections, including copyrights, patents, and trade-secrecy laws, are
called into question.

Other emerging concerns include the need to educate the public on the
dangers and abuses of computers, and the role of computers in education.
To this end, Philip A. Pecorino and Walter Maner present a proposal for a
college-level course in Computer Ethics, and Marvin J. Croy addresses the
ethics of computer-assisted instruction.  Dan Lloyd, in his provocative but
highly speculative article "Frankenstein's Children," envisions a world in
which cognitive-simulation AI succeeds in producing machine consciousness,
resulting in a possible ethical clash between the rights of artificial
minds and human values.

The introductory article, James H. Moor's "What Is Computer Ethics?," is an
ambitious attempt to define Computer Ethics and to explain its importance.
According to Moor, the development and proliferation of computers can
rightly be termed "revolutionary": "The revolutionary feature of computers
is their logical malleability.  Logical malleability assures the enormous
application of computer technology."  Moor goes on to assert that the
Computer Revolution, like the Industrial Revolution, will transform "many
of our human activities and social institutions," and will "leave us with
policy and conceptual vacuums about how to use computer technology."

An important danger inherent in computers is what Moor calls "the
invisibility factor."  In his own words: "One may be quite knowledgeable
about the inputs and outputs of a computer and only dimly aware of the
internal processing."
These hidden internal operations can be intentionally employed for
unethical purposes -- what Moor calls "invisible abuse" -- or can contain
"invisible programming values": value judgments of the programmer that
reside, insidious and unseen, in the program.

Finally, in the appendix, "Artificial Intelligence, Biology, and
Intentional States," editor Terrell Ward Bynum argues against the claim
that "intentional states" (e.g., belief, desire, expectation) are causally
dependent upon biochemistry and thus cannot exist within a machine.

If you're at all like me, you probably find that reading philosophy can be
"tough going," and METAPHILOSOPHY is no exception.  References to
unfamiliar works and the use of unfamiliar terms occasionally required me
to read passages several times before extracting any meaning from them.
The topics, however, are quite relevant, and their treatment is, for the
most part, lively and interesting.  With its well-written introductory
article, diverse survey of current concerns, and fairly extensive
bibliography, this issue of METAPHILOSOPHY is an excellent first source for
those new to the field of Computer Ethics.

  [METAPHILOSOPHY, c/o Expediters of the Printed Word Ltd.,
   515 Madison Avenue, Suite 1217, New York, NY 10022]

Bruce A. Sesnovich          mcnc!rti-sel!dg_rtp!sesnovich
Data General Corp.          rti-sel!dg_rtp!sesnovich%mcnc@csnet-relay.arpa
Westboro, MA

"Problems worthy of attack prove their worth by hitting back"

------------------------------

End of RISKS-FORUM Digest
************************
-------