RISKS-LIST: RISKS-FORUM Digest  Monday 11 December 1989  Volume 9 : Issue 53

        FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  Computerized public records boon to private eyes probing suitors
    (Jay Elinsky, Jon von Zelowitz)
  Should computers be legally responsible? (A. Lester Buck)
  Automatic toll systems (Jerry Harper)
  Software Development (Bill Murray)
  Newsgroup posting rejected, rejected, rejected, ... (Earle Ake)
  Comments on Unix INDENT program (Simson L. Garfinkel, Nick Lai,
    David McAllister)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in
good taste, objective, coherent, concise, and nonrepetitious.  Diversity is
welcome.  CONTRIBUTIONS to RISKS@CSL.SRI.COM, with relevant, substantive
"Subject:" line (otherwise they may be ignored).  REQUESTS to
RISKS-Request@CSL.SRI.COM.
TO FTP VOL i ISSUE j: ftp CRVAX.sri.com, login anonymous, AnyNonNullPW,
cd sys$user2:[risks], get risks-i.j .  Vol summaries now in risks-i.0 (j=0)

----------------------------------------------------------------------

Date: Sun, 10 Dec 89 21:44:06 EST
From: "Jay Elinsky"
Subject: Computerized public records boon to private eyes probing suitors

From "Boy Meets Girl, '89, Can Be a Detective Story" by Dirk Johnson, in the
New York Times, 10-Dec-89, Page 1:

"Computerizing of public records in recent years has proved a boon to
investigators, who say they can find out almost anything simply by keying a
Social Security number into a computer...  `It's usually very easy', said Ed
Pankau, the president of Inter-Tect [a Houston investigative agency], who is
the author of `How to Investigate by Computer'."

The context of the article, from the lead paragraph: "Eager to trust but
determined to verify, many single women in this age of risky romance are
hiring private detectives to check the backgrounds of their suitors."  A few
paragraphs later, "Women are far more likely than men to hire an
investigator, and usually their suspicions are on the mark, detectives said."

In this case I'm tempted to call easy access to records (assuming it's
legal) a benefit instead of a risk.  Then again, I got married ten years
ago, and my fiancee's investigation of my background was more traditional,
like meeting my parents and using her "feminine intuition".

And the article ends with a non-computer-related risk: The woman who is the
subject of the article had three men investigated and found "skeletons in
their closets".  The fourth man she investigated was a-ok, and she was so
thrilled that she told him he had passed the investigation.  He wasn't
thrilled to hear that he had been investigated.  "`He kind of freaked out',
she said.  `But then, as I tried to explain why I did it, he understood,
kind of'.  She added, `We're not dating anymore.'"

Jay Elinsky, IBM T.J. Watson Research Center, Yorktown Heights, NY

------------------------------

Date: Mon, 11 Dec 89 00:10:10 PST
From: vonzelow@adobe.com (Jon von Zelowitz)
Subject: Don't Give Social Security Numbers to Girlfriends

[...]  For $500, the Inter-Tect investigative agency in Houston promises to
verify within a week a person's age, ownership of businesses, history of
bankruptcies, if there are tax liens, appearance in newspaper articles, as
well as divorces and children.  Some clients have paid the detective agency
as much as $10,000 to unearth secrets.  "People want to find a quality
partner," said Mr. Pankau, who is a former investigator for the Internal
Revenue Service.
[!-jvz] "I wouldn't say they're paranoid, but they're very cynical." ...sun!adobe!vonzelow vonzelow@adobe.com Jon von Zelowitz ------------------------------ Date: 11 Dec 89 07:21:12 GMT From: buck@siswat.UUCP (A. Lester Buck) Subject: Should computers be legally responsible? Recently I ran across a copy a paper in my files entitled "Are There Senses in which a Computer may Properly by Held Responsible for its Actions?" by J.P.A. Race from Brunel Univeristy, UK, in the volume "Information Technology for the Eighties", ed. R.D. Parslow (Heyden,1981). I include some extended excerpts from his article. Several of the points Race proposes have fascinating RISKS implications, and I would be quite interested in more recent references on this subject. I also have not yet heard of insurance policies for the actions of a computer. Do such policies exist? "...at the moment our natural reaction [to a wrong brought about by the agency of a computer], to say `It was the computer's fault', will be laughted at. The sophisticated will [...] turn immediately to the human being involved, the programmers, operators, compiler writers, maintenance engineers, sponsors, consultants, hardware designers... and try to apportion liability among them." "Such critics of our `It was the computer's fault' are themselves naive. We are quite quite correct in our instincts to start by blaming the computer, for the following reasons: "The computer system may be indeed liable, and no one else. The program may have been correctly designed and arranged to adapt to circumstances, but on this occasion the adaptation led it astray. The human beings involved acted in good faith to the best of their ability, and are not liable. Yet a blame-worthy thing happened. Therefore the non-human system must be blamed, unless we call the happening an Act of God, which we would never do if a human being had been involved instead of a computer. "The computer system may evade responsibility wholly or in part, because of negligence or deliberate action on the part of one or all of the human beings involved. But we have to start somewhere, and by calling to account the agent -- the computer -- which is prima facie responsible, the process of finding a culprit fails safe. We shall see later how the computer may be expected to defend itself by inculpating others, and how it can make amends if found wholly or partly liable. Race defines a "computer system" as "one particular combination of hardware, software, and data, such that its functional behaviour is different from any other system", the data being particularly important to distinguish between initial identical twins that have experienced different learning sets for their expert systems. He then draws a distinction that a responsible computer is not at all the same as an "intelligent" computer. "No, the characteristic of the computer systems under discussion is not so much intelligence, as the ability to rationalise, that is, to give an account of their actions, and to construct courses of action using powerful planning procedures, like Terry Winograd's SHRDLU..." "We are talking about responsibility: no need to put [the computer system] to Turing's test to show a human level of intelligence. In fact, as pointed out by Turing's brother, a computer like [this responsible system] will make a poor showing at writing a sonnet, but that doesn't stop us from treating it as a responsible agent. 
Few submarine captains write good sonnets either, or clerks, or public
corporations, yet all these entities have responsibilities in law."

"There are three aspects of punishment:

"Retribution: the need for society to get its revenge on the wrongdoer: a
basically irrational (but quite understandable) emotional response.  If this
means pushing a computer system over a cliff after it has driven someone to
suicide through sending them wrong electricity bills, we can understand it.
[...]

"Rehabilitation: In this sense, the aim of punishment is to improve the
individual for his own sake and that of society.  In the case of a computer,
this may involve re-programming or the indication to the computer that its
previous response had been wrong, so that this reprimand is stored as a new
parameter value to adjust its future behaviour.  The fact that we did not
use a cat-o'-nine-tails or the brig does not mean what we did was not
punishment, any more than it is not punishment - of a severe kind - if a
court-martial reprimands an officer who runs a frigate aground.

"Deterrence: The fact that [this computer system] is punished should be
communicated to other computer systems working on similar things.  [...]

"Lastly in this section we should consider the case where a computer is
involved as a principal in a civil suit: punishment is not involved, but
restitution of damages is.  In the past one would have thought it bizarre
for the mechanical agent to be the actual defendant, yet as was said at the
outset, our big problem with computers is that unlike hammers, it is very
hard for a plaintiff to find any human being to accept responsibility for
their bad behaviour.  So he should be able to sue them, just as a
corporation may be sued, and for much the same reason.

"PROPOSALS

"Computer systems should be designed to include the ability to give an
account of the bases of their actions, as `expert systems' do now, and, at a
more mundane level, commercial systems do with an inbuilt `audit trail'.

"Our instinct to `blame the computer' when it makes errors should be
elevated to a policy.  It should be possible to take civil or criminal
action against it in circumstances in which a human agent would also have
been taken to court.  Based on the system's own account of events, and other
evidence, responsibility will be assigned and judgement given.  Where the
responsibility is laid at the computer's door, the basis for its behaviour
(program or data) will be altered.  To provide for restitution of damages
when a system is successfully sued in a civil action, it should not be a
`man of straw' but would hold money reserves or insurance."

A. Lester Buck    buck@siswat.lonestar.org    ...!texbell!moray!siswat!buck

------------------------------

Date: Fri, 8 Dec 89 19:33:16 GMT
From: Jerry Harper
Subject: Automatic toll systems

An automatic toll system is currently in operation in Southern Norway,
installed by Philips if my memory serves me adequately.  The system is
extremely straightforward in operation.  Regular users of the road simply
buy a playing-card-sized reflective disk which is mounted on the nearside
rear window.  As a vehicle approaches the toll point it passes over a sensor
(a loop, I think) which activates a low-intensity microwave transmitter.  A
beam from the transmitter strikes the upper nearside of the vehicle and a
proportion of rays are reflected from the disk onto a receiver.  Now as each
reflective disk has the ID of the vehicle owner embossed on it, the
reflected pattern also presents the ID, and thus the beginning and end
points of a journey on a toll road can be accurately determined.
Furthermore, if the system receiver fails to register an ID, a video camera
is activated which photographs the back end of the suspect vehicle.  The
owner can then be traced through the license plate.  Road users can either
be billed directly every month or may pay a certain amount in advance.
Vehicles can pass through the electronic "gate" at up to 70 km/h with ID
detection accuracy remaining up in the early nineties (?) (I am quoting from
memory but I remember being stunned by the figures at the time).  Only
extreme environmental conditions affect the system adversely (microwave is
not as sensitive to climatic changes as infra-red radiation).

How do I come to have this knowledge?  Well, we are designing an automated
policing system here which should keep us busy for a few years.

John G. Harper, Computer Science Dept., University College, Dublin 4, IRELAND

------------------------------
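[To make the gate's decision logic in the preceding item concrete, here is a
toy sketch of the sequence Harper describes: loop sensor, ID read, camera
fallback, billing record.  The function names, the simulated 90% read
success rate, and the output format are assumptions made up for
illustration; they are not details of the actual Philips installation.]

  /* Toy simulation of the toll-gate logic described above.  Illustration
   * only; no detail here comes from the real system.                     */
  #include <stdio.h>
  #include <stdlib.h>

  /* Stand-in for the microwave reader: returns the owner ID embossed on
     the reflective disk, or 0 when no usable reflection comes back.      */
  static long read_disk_id(void)
  {
      return (rand() % 100 < 90) ? 1000L + rand() % 50 : 0L;
  }

  static void record_passage(int gate, long id)
  {
      /* Entry/exit points are logged so the journey can be billed
         monthly or charged against a prepaid amount.                     */
      printf("gate %d: vehicle %ld logged for billing\n", gate, id);
  }

  static void photograph_rear_plate(int gate)
  {
      /* Fallback: no ID was read, so photograph the rear of the vehicle
         and trace the owner through the license plate.                   */
      printf("gate %d: no ID read, camera triggered\n", gate);
  }

  int main(void)
  {
      int i;
      for (i = 0; i < 10; i++) {     /* ten vehicles trip the loop sensor */
          long id = read_disk_id();
          if (id != 0L)
              record_passage(1, id);
          else
              photograph_rear_plate(1);
      }
      return 0;
  }

------------------------------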
Date: Sat, 9 Dec 89 12:14 EST
From: WHMurray.Catwalk@DOCKMASTER.NCSC.MIL
Subject: Software Development (Re: Curtis Jackson, RISKS-9.50)

Curtis Jackson writes in RISKS-9.50:

>We insisted on writing the design spec before writing any code, and
>finalizing the design spec (after initial review) down to the
>individual bit level.  We then wrote pseudo-code for all modules, and
>peer-inspected those.  Finally we wrote the code with strict commenting
>standards and assembled it, then peer-inspected that.  Finally we wrote
>module tests, simulated those, then string tests, simulated those, and
>one day the hardware was off the drawing boards and in the lab.

Note the order in which code and test data were prepared.  (Note also the
two "finally"s.)

This scenario illustrates part of the problem that we have in software
development.  Test data are part of the specification and the acceptance
criteria of any product.  In all other engineering activities they are
prepared before, rather than after, the product.  In software, not only are
they prepared after the product, but often as an afterthought.  For the most
part, they are prepared by the same person who produced the code.  Thus, the
product exits test "when the programmer can no longer find any more of his
own errors."

Before anyone tells me that you cannot prepare the test data until after the
code because you do not know what either will look like until the code is
written, let me say that I have heard that argument before.  My answer
(retort) is that if you started preparing the code before you knew what the
product would look like and exactly how it would behave, then you clearly
started too soon.

Now the contributor quoted clearly thought that he and his colleagues were
proceeding in a rigorous and disciplined manner, in accordance with the very
best of practice.  That their practice could be so far from good engineering
practice is evidence of how far we have to go.

Note that he said nothing about building a prototype.  How would you like to
fly in an airplane built to a new design that did not include a prototype?
(Do not build a plane without a prototype; do not fly passengers in the
prototype.)

How long will we tolerate this practice and the quality that results?  Must
we reinvent engineering?  Are we so wrapped up in our own mythology and
metaphors that we cannot learn from other disciplines?

William Hugh Murray, Fellow, Information System Security, Ernst & Young
2000 National City Center, Cleveland, Ohio 44114
21 Locust Avenue, Suite 2D, New Canaan, Connecticut 06840

------------------------------
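[Murray's point, that test data belong to the specification and should exist
before the code does, can be illustrated with a small sketch.  Everything
below, the function, the chosen behaviour, and the test vectors, is a
made-up example rather than anything from the original posting; the only
point is that the expected input/output pairs are fixed first and the
implementation is then written to satisfy them.]

  /* "Test data first": the acceptance table below is treated as part of
   * the specification and is written before days_in_month() itself.      */
  #include <stdio.h>

  struct test_case { int year, month, expected; };

  /* Specification by example, fixed before any implementation exists.    */
  static const struct test_case spec[] = {
      { 1989,  1, 31 },
      { 1989,  2, 28 },
      { 1988,  2, 29 },   /* leap year                     */
      { 1900,  2, 28 },   /* century year, not a leap year */
      { 1989,  4, 30 },
      { 1989, 12, 31 },
  };

  /* Implementation written afterwards, to satisfy the table above.       */
  static int days_in_month(int year, int month)
  {
      static const int len[] = { 31,28,31,30,31,30,31,31,30,31,30,31 };
      int leap = (year % 4 == 0 && year % 100 != 0) || (year % 400 == 0);
      if (month == 2 && leap)
          return 29;
      return len[month - 1];
  }

  int main(void)
  {
      int i, failures = 0;
      for (i = 0; i < (int)(sizeof spec / sizeof spec[0]); i++) {
          int got = days_in_month(spec[i].year, spec[i].month);
          if (got != spec[i].expected) {
              printf("FAIL: %d-%02d gave %d, expected %d\n",
                     spec[i].year, spec[i].month, got, spec[i].expected);
              failures++;
          }
      }
      printf("%d failure(s)\n", failures);
      return failures != 0;
  }

------------------------------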
Date: 11 Dec 89 12:57:50 EST
From: fac2@dayton.saic.com (Earle Ake)
Subject: Newsgroup posting rejected, rejected, rejected, ...

There is a new newsgroup that I subscribe to called
vmsnet.announce.newusers.  I decided to post a message to it.  That is when
the fun began.  The newsgroup is unmoderated.  After I posted to the
newsgroup, I received a mail message from a site scolding me for posting to
a moderated newsgroup and telling me to post directly to the moderator
instead.  Here is a copy of the message.

  "This newsgroup is moderated, and cannot be posted to directly.  Please
   mail your article to the moderator for posting."

It then included my original message to make sure I knew what I had done.  I
was going to ignore it until I received 5 more messages from the same site
complaining about the same message.  I started to wonder, do we have a loop
here?  I fired a message off to the postmaster at the offending site to stop
the messages.  A day passed and no reply.  The messages kept coming in.  I
started to keep track of how many and how often they were mailed.  They had
been mailed every half hour since I first posted the message.

I then looked up the administrator's name in the UUCP maps.  I sent him a
message directly to have him stop these things.  He responded by saying he
wasn't sure what was happening and asked me to send him a sample of the
message so he could fix it.  Now we are up to 85 messages.  I finally got a
response from him saying he had shut them off.  It seems one of the sites
that he is connected to accepted the message and then tried to hand it off
to his site.  His version of NEWS thinks that any newsgroup that has the
word 'announce' in it IS MODERATED no matter how you have it set up!  The
remote site tried every half hour to hand the message off to his site.  His
site would in turn reject the message and send a nastygram back to me.  We
finally got all this straightened out after 140+ messages bounced back to
me.  I don't think I will post to that newsgroup in the near future!!!!!

Earle Ake     fac2@dayton.saic.com     uunet!dayvb!fac2

------------------------------

Date: Wed, 6 Dec 89 3:43:25 EST
From: simsong@prose.CAMBRIDGE.MA.US (Simson L. Garfinkel)
Subject: comments on Unix INDENT program (Lai, RISKS-9.50)

Actually, indent is merely changing the old-style C code (x =- 1) to the new
style (x -= 1).  Careful programmers always put whitespace between
assignment operators and variables for this reason.  Some compilers will
flag (x=-1) with a warning because it is ambiguous.

Careful programmers also always use lots of parens to make their intentions
clear, such as ((x<3) && (y>2)).

------------------------------

Date: Wed, 6 Dec 89 08:29:02 PST
From: lai@east.Berkeley.EDU (Nick Lai)
Subject: More on indent

Of course.  I alluded to the old style / new style conversion (K&R A.17) in
my previous note.  I am a careful programmer.  The whole point is that
sometimes I run across code written by idiots who insist on two-space
indentation, no spaces between elements of comma-separated lists
("x=foo(9,34,&hoho)"), and writing code like "int x=-1".  Indent held out
the promise of being able to convert that crap into something readable.  But
it was just a boulevard of broken dreams.

Nick

------------------------------
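[For readers who never met the old-style operators behind the last two
items, here is a small illustration of the ambiguity.  It is a sketch only:
the exact behaviour of any particular old compiler or version of indent
varied, so treat the comments as describing the general hazard rather than
one specific implementation.]

  /* The old compound assignment operators (the ones covered by Lai's
   * K&R A.17 reference) were spelled =- =+ =* and so on, which is why a
   * line like "x=-1" was ambiguous: assignment of -1, or the old spelling
   * of x -= 1?  Tools that "modernise" such code can guess wrong.        */
  #include <stdio.h>

  int main(void)
  {
      int x, y;

      y = 3;

      x = -y;          /* unambiguous in modern C: x becomes -3           */
      printf("x = -y   gives %d\n", x);

      x = 10;
      x -= y;          /* what an old compiler, or a careless conversion
                          of "x=-y", would have produced: x becomes 7     */
      printf("x -= y   gives %d\n", x);

      /* Garfinkel's advice: whitespace and parentheses leave no room for
         either a compiler or a reformatter to misread the intent.        */
      x = (-y);
      printf("x = (-y) gives %d\n", x);

      return 0;
  }

------------------------------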
Date: Mon, 11 Dec 89 15:03:16 -0700
From: dmcallis%albion@cs.utah.edu (David McAllister)
Subject: Problem with indent revisited

Just so you know, the problem with indent swapping "x = -y" to "x -= y" also
shows up with "x = *y" becoming "x *= y".  Pretty silly, huh?

DMc

------------------------------

End of RISKS-FORUM Digest 9.53
************************