Subject: RISKS DIGEST 11.32
REPLY-TO: risks@csl.sri.com

RISKS-LIST: RISKS-FORUM Digest  Thursday 21 March 1991  Volume 11 : Issue 32

        FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  A further lesson from DeTreville's cautionary tale (Alan Wexelblat)
  Another anecdote about automatic transfer systems (Ken Mayer)
  Re: "What the laws enforce" (Bob Johnson, TK0JUT1, Ernesto Pacas-Skewes)
  Re: California, driving, and privacy, again (Caveh Jalali, Flint Pellett,
      Jurjen NE Bos)
  Re: Pilot Error - an impartial assessment? (Christopher Stacy)
  Fast Food and locked cash registers (Jonathan Leech)
  RISKS of digital voice forgery exaggerated (Fernando Pereira)
  Report on ACM's position on privacy (Barbara Simons)
  ISM Workshop Announcement (Brian S. Hubbard)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in
good taste, objective, coherent, concise, and nonrepetitious.  Diversity is
welcome.  CONTRIBUTIONS to RISKS@CSL.SRI.COM, with a relevant, substantive
"Subject:" line; others ignored!  REQUESTS to RISKS-Request@CSL.SRI.COM.
For vol i issue j, type "FTP CRVAX.SRI.COM", "login anonymous",
"AnyNonNullPW", "CD RISKS:", "GET RISKS-i.j" (where i=1 to 11, j always TWO
digits).  Vol i summaries are in j=00; "dir risks-*.*" gives a directory;
"bye" logs out.  ALL CONTRIBUTIONS CONSIDERED AS PERSONAL COMMENTS; USUAL
DISCLAIMERS APPLY.  Relevant contributions may appear in the RISKS section of
regular issues of ACM SIGSOFT's SOFTWARE ENGINEERING NOTES, unless you state
otherwise.

----------------------------------------------------------------------

Date: Thu, 21 Mar 91 14:57:16 est
From: wex@PWS.BULL.COM
Subject: A further lesson from DeTreville's cautionary tale (RISKS-11.30)

One important lesson that John might have noted was:

  - Use Paper and Pencil.

Long ago I learned that people will try to fix obvious problems before they
read their email.  Therefore, whenever I report a serious problem by email, I
also leave handwritten notes taped to the appropriate doors and keyboards, so
that anyone who sits down to fix the problem will see my note and be aware
that something is not as they expect.  Of course, this is not perfect, as
physical notes are sometimes not seen or are ignored, but it has saved me
many times when email didn't work or wasn't read until "too late."

--Alan Wexelblat, Bull Worldwide Information Systems   phone: (508) 294-7485

------------------------------

Date: Wed, 20 Mar 91 12:25:25 -0500
From: ken@visix.com (Ken Mayer)
Subject: Another anecdote about automatic transfer systems

Several years ago, when I began working for a new employer, I was given the
option of direct deposit for my weekly paycheck.  I also opted to have a
certain amount of money transferred from my checking account to my savings
account on a monthly basis.  Due to a clerical error, the automatic transfer
started one month earlier than I expected, causing undue embarrassment when
many of my checks bounced.  When I complained (not only was I billed
overdraft charges, I had to pay a returned-check fee at my other bank, to one
of my credit card carriers, and to the electric company), the bank droid
shrugged her shoulders and said (basically), "Tough luck, bozo."  Infuriated,
I immediately closed my account and took my business elsewhere.

Here's where things get interesting: even though the account was closed, the
automatic transfer was not turned off!
Every month for the next two YEARS I received a statement from this bank from
hell telling me that my account was overdrawn, that my automatic savings
transfer had not gone through, and that I would be billed for insufficient
funds.  Every quarter I got a letter stating that my account balance was
negative and that I should call the local branch to straighten it out.
Speaking with the bank manager, I got a lot of apologies and explanations of
how the computer needed the right incantation and he didn't know it.  (He
really was a nice fellow; it was just that this particular bank's data
processing system was written before electric power was popular.)  The
letters stopped coming when I moved to another state.

Ken Mayer, Technical Support Engineer, Visix Software Inc.  703.758.8230
...!uunet!visix!ken   ken@visix.com

------------------------------

Date: Wed, 20 Mar 91 11:35:39 -0600
From: robjohn@logdis1.oc.aflc.af.mil (CDC Contractor Bob Johnson;SCSS;)
Subject: Re: "What the laws enforce" (TK0JUT1, RISKS-11.30)

Begging your pardon, but there is a great difference between trespassing on
my property and breaking into my computer.  A better analogy might be finding
a trespasser in your high-rise office building at 3 AM, and learning that his
backpack contained some tools, some wire, a timer, and a couple of detonator
caps.  He could claim that he wasn't planting a bomb, but how can you be
sure?  As a prudent office-building owner, wouldn't you call in the police
bomb squad, and deny access to the tenants until the whole building had been
inspected and declared safe?

My system has over 2,000 users and well over 70,000 files.  When we have a
break-in (and we have had a couple), how much time is it going to cost me to
do a complete audit of the operating system executables and configuration
files, have all the users change their passwords and inspect their files for
damage, analyze the intruder's activity and plug the security hole, document
the intrusion for law enforcement agencies, and pursue prosecution (if we so
decide)?  Just counting the direct cost of manpower, the sum involved is many
thousands of dollars.  Under federal law (as I understand it), any break-in
that causes more than $5,000 of damage is a FELONY.  This includes the
incidental costs mentioned above.

I am for making the penalties for computer trespass extremely painful to the
perpetrator.  Perhaps in this fashion we can encourage these people to find a
more productive use of their time, and can avoid the cost of cleaning up and
verifying our systems after these events.  Most administrators who've had to
clean up and audit a system of this size probably think that a felony rap is
too light a sentence.  At times like that, we tend to think in terms of
boiling in oil, being drawn and quartered, or maybe burying the intruder up
to his neck in an anthill.

------------------------------

Date: Tue, 19 Mar 91 16:23 CST
From: TK0JUT1@NIU.BITNET
Subject: Re: "What the laws enforce" (Leveson, RISKS-11.31)

In RISKS-11.31, Nancy Leveson apparently takes exception to my "analogy" of
computer hacking to trespassing on grass and argues with passion that
computer trespass is uncool.  Sorry, but that analogy wasn't mine; I was
responding to it.  The point isn't whether we approve or disapprove of
hacking or computer trespass.  Most of us agree it's at best tacky, at worst
dangerous.  Most of us agree that some social response is needed to curtail,
both proactively and reactively, trespass and other predatory behavior in all
its forms.
The question is: what are the most appropriate legal responses to computer
trespass, and what are the problems with current attempts to invoke criminal
penalties for it?  Those of us who have followed the recent Secret Service
cases are concerned with the application of comfortable legal definitions to
new forms of offense for which those laws may not be appropriate.  The
current metaphor of hacking as "home entry," with sanctions comparable to
those for breaking and entering, seems neither accurate nor just.  Equating
credit card fraud and other forms of rip-off with hacking only adds to the
confusion.  By accepting the trend to apply old metaphors to new conditions,
we risk setting precedents that affect how computer behavior, access to
information, and other emerging problems faced by computer hobbyists will be
handled in the coming decades.

Few objected to the enactment of RICO laws, and fewer still to the laws
allowing confiscation of the property of drug suspects.  The attitude seemed
to be that harsh measures were justified because of the nature of the
problem.  Yet those and similar laws have been expanded and applied to people
suspected of computer abuse, as we see in the cases of Steve Jackson Games,
RIPCO BBS, the "Hollywood Hacker," and others who have been raided under
questionable circumstances.

The Hollywood Hacker case illustrates some of these problems.  Stuart
Goldman, an investigative journalist, appears to have been set up and caught
accessing the computers of the Fox network by using an account for which he
apparently was not fully authorized.  In a media-event-style raid in March
'90 (Fox cameras were present), the Secret Service and Los Angeles police
seized his equipment, and he now faces a five-year sentence for what appears,
according to the indictment, to be at worst a trivial offense, at best a
peccadillo for which an apology, not a sentence, would be appropriate.

I'm wondering: What does the law think it's enforcing?  What is the
appropriate metaphor for computer trespass?  What distinctions should be made
between types of offense?  Please remember, nobody is justifying trespass, so
continual harangues on its dangers miss the point.  I am only suggesting that
there is a greater risk from misapplication of law, which--like a virus--has
a historical tendency to spread to other areas, than from computer hackers.
It's easier to lock out hackers than police with guns and the power of the
state behind them, and we have already seen the risks to people that result
from over-zealous searches, prosecution, and sentencing.

    [Still trying to be semianonymous?  PGN]

------------------------------

Date: Wed, 20 Mar 91 12:14:58 CST
From: skewes@CAD.MCC.COM (Ernesto Pacas-Skewes)
Subject: Re: "What the Laws Enforce" (Leveson, RISKS-11.31)

> ... Although such laws may not discourage true criminal behavior, they do
> discourage potentially destructive "play" by essentially law-abiding people.

They also discourage potentially constructive "play" by essentially
law-abiding people.  Knowing that you will be severely punished if you (maybe
unintentionally) hurt somebody else tends to discourage initiative.  Knowing
that you are in an environment where you cannot hurt anybody else tends to
encourage it.  Your caution is also affected, in opposite directions.  The
relative benefits of (and relation between) initiative and caution are
debatable; the key, as with most anything else, is to strike the "right"
balance.
Many other qualities and values come into play; I am only trying to
illustrate that severe laws by themselves don't cut it, and that laws that
are "too" severe may even be counterproductive.

> In fact, personal and business privacy and property is extremely important
> in a complex, crowded society such as ours.

I completely agree; I would even drop the "complex, crowded society"
qualification.

> ... an important system that may save lives (or cost them if done wrong)
> because a company with which I need to deal has had to severely
> restrict outside computer access because of security fears.

I fail to see how doing it right or doing it wrong is related to access to
secured data, unless doing it right means doing it on time.  If the
right/wrong doing is determined by the accessing of secured data, there may
be a major hole in that system.

> ... Draconian security
> measures to prevent frivolous access and pranks (in situations where it would
> not otherwise be necessary because there is nothing of value to steal) will
> hurt us all and cost our society untold dollars and perhaps worse.

The value, I think, is determined by whoever decides to impose the draconian
security measures; if there is none, why bother?  Well ..., maybe a lawyer
would be able to find the "appropriate" value, and involving lawyers almost
always hurts and may cost untold dollars.  The draconian security measures
imposed by the company you refer to at least warn you that somebody already
places value on the data you need to access, and these measures may very well
save you from getting bitten by the severity of the laws that you are rooting
for.  You request the data, the company's system gives it to you, the
company's lawyer finds out you got the data and decides you are a good money
maker, and you pay more for crimes than for misdemeanors.  Or is the lack of
protection an implied authorization?  I value my privacy; I try to protect it
(if the law helps, even better).

Ernesto

------------------------------

Date: Tue, 19 Mar 91 14:53:38 -0800
From: Caveh Jalali
Subject: Re: California, driving, and privacy, again (RISKS-11.31)

The major concern I have about Automatic Vehicle Identification (AVI) systems
is that they might make life too easy for our friends at the law enforcement
agencies.  Photo radar is bad enough -- now our car could turn into a
credit-card-on-wheels for anyone who needs to balance their budget for that
month!  Instead of taking a picture, the camera would simply emit the query
signal and record your car's ID.  The speeding ticket, parking ticket, etc.,
could be in the mail before you even realize you did anything illegal.

------------------------------

Date: 20 Mar 91 17:43:18 GMT
From: flint@gistdev.gist.com (Flint Pellett)
Subject: Re: California, driving, and privacy, again (Hibert, RISKS-11.31)

Sometimes people can't seem to see the forest for the trees.  If the roads
were paid for out of general funds like income tax money (where there is
already a mechanism and bureaucracy in place for collecting it), then there
would be no need to build expensive toll booths, invest in transponders for
cars, or buy bar-code readers and a new sticker every month, or any of that
other stuff: you wouldn't have to hire people to install and maintain the
equipment and collect the money.  You could actually spend all the money that
is going toward that technology and bureaucracy on building roads instead!
And nobody would have to complain about waiting in line at the toll booth
ever again!
(But wait: then people might be able to see how much tax they are really
paying!)  IMHO, the main RISK this demonstrates isn't the risk to privacy
involved in having toll booths able to track your movements; it's the risk of
inventing technology that is going to create more problems (and expense) when
we wouldn't need that technology at all if we just addressed the social and
political problems (taxes that are too high, so we disguise them as tolls,
etc.) that we started with.  But technological problems seem to be easier to
solve than political ones.

Flint Pellett, Global Information Systems Technology, Inc., 1800 Woodfield
Drive, Savoy, IL 61874   (217) 352-1165   uunet!gistdev!flint

------------------------------

Date: 21 Mar 91 10:10:08 GMT
From: jurjen@cwi.nl (Jurjen NE Bos)
Subject: Re: California, driving, and privacy, again

There is still a better solution:
  - The user is fully anonymous.
  - The box in the car is owned by the user, not by the government.
  - The user has a smart card containing his money.
  - Opening the smart card allows only limited damage to the system.
  - Payment is fast (20 ms) over IR.
  - The system is extendible to phones, public transport, shops, etc.
The system is called SmartCash and is being developed by our neighbors,
DigiCash.

------------------------------

Date: Tue, 19 Mar 1991 17:09-0500
From: Christopher Stacy
Subject: Re: Pilot Error - an impartial assessment? (Hollombe, RISKS-11.31)

> If the pilot's dead, it's his fault.

In a regulatory sense, this would generally be true, because the Federal
Aviation Regulations are written that way.  That is, the FARs can be
interpreted to say, basically, "it's always the pilot's fault."  In a legal
sense, sometimes the aircraft manufacturer or someone else is held partly or
totally responsible.  NTSB reports almost always cite multiple contributing
factors, often putting some of the blame on controllers, airline practices,
poor FAA regulations, and pilots.  There is almost always something the pilot
"could have" done, if he had thought of it, and such things are at least
useful hindsight.

Those are three common ways of finding fault, and they often come up with
different answers.  "Fault" is a slippery concept, and it's risky to make
broad generalizations about a complicated domain based on simple bottom-line
analyses that don't make their motivations explicit.

------------------------------

Date: 21 Mar 91 17:10:46 GMT
From: leech@cs.unc.edu (Jonathan Leech)
Subject: Fast Food and locked cash registers

In RISKS-11.30, Dwight McKay says ``I can see it now, "Sorry, we cannot give
you a drink right now, our computer is down."''  A similar incident happened
to me a few weeks ago.  While getting lunch (loosely speaking) at Taco Bell,
a fire down the street cut power.  I happened to be about to pay at the time.
Not only could they not take any further orders, they couldn't accept payment
because the cash register would not open.  At least I got lunch for free.

------------------------------

Date: Wed, 20 Mar 91 14:45:31 EST
From: pereira@research.att.com (Fernando Pereira)
Subject: RISKS of digital voice forgery exaggerated

It is the opinion of colleagues of mine working on speech recognition and
speech synthesis that the risk suggested by David Turner of digital voice
forgery from small speech samples is negligible.  As everyone who has dialed
up a modern voice mail system or directory assistance service knows,
sentences constructed by concatenating prerecorded words sound very
unnatural.
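
  [Editorial aside: as a rough, purely illustrative sketch of what
  "concatenating prerecorded words" means here, the code below splices stored
  word clips end to end with no modeling of inter-word transitions, durations,
  or intonation -- which is exactly why such output sounds unnatural.  The
  vocabulary and file names are hypothetical; this is a minimal sketch under
  those assumptions, not any particular vendor's system.

      # Naive concatenative "synthesis": splice prerecorded word clips
      # end to end.  Nothing here models coarticulation, word durations,
      # or sentence intonation.
      import wave

      # Hypothetical prerecorded clips, one WAV file per vocabulary word.
      WORD_FILES = {w: w + ".wav"
                    for w in ("your", "call", "cannot", "be", "completed")}

      def speak(words, out_path="sentence.wav"):
          out = None
          for w in words:
              with wave.open(WORD_FILES[w], "rb") as clip:
                  if out is None:
                      out = wave.open(out_path, "wb")
                      # Assume all clips share one sample format.
                      out.setparams(clip.getparams())
                  # Append the clip verbatim; adjacent words simply abut.
                  out.writeframes(clip.readframes(clip.getnframes()))
          if out is not None:
              out.close()

      speak(["your", "call", "cannot", "be", "completed"])

  Even with perfect recordings, each word keeps the prosody of its original
  context, so the spliced sentence comes out with flat, mismatched rhythm and
  pitch.]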
More sophisticated methods, which to some extent handle co-articulation
(inter-word transitions), require much greater amounts of speech data, and
they still fall far short of natural speech, particularly in the correct
modeling of speech durations and intonation.  My colleague David Talkin says:
``It is MUCH more likely that a human mimic could listen to the short
passages and subsequently perform successful voice forgery.''

Fernando Pereira, 2D-447, AT&T Bell Laboratories, 600 Mountain Ave,
Murray Hill, NJ 07974

------------------------------

Date: Tue, 19 Mar 91 18:50:11 PST
From: SIMONS@IBM.COM
Subject: Report on ACM's position on privacy (in response to Lotus Marketplace)

The following statement was passed by ACM Council and will be issued as a
press release:

  Whereas the ACM greatly values the right of individual privacy;

  Whereas members of the computing profession have a special responsibility
  to ensure that computing systems do not diminish individual privacy;

  Whereas the ACM's Code of Professional Conduct places a responsibility on
  ACM members to protect individual privacy; and

  Whereas the Code of Fair Information Practices places a similar
  responsibility on data holders to ensure that personal information is
  accurate, complete, and reliable;

  Therefore, be it resolved that:

  (1) The ACM urges members to observe the privacy guidelines contained in
      the ACM Code of Professional Conduct;

  (2) The ACM affirms its support for the Code of Fair Information Practices
      and urges its observance by all organizations that collect personal
      information; and

  (3) The ACM supports the establishment of a proactive governmental privacy
      protection mechanism in those countries that do not currently have such
      mechanisms, including the United States, that would ensure individual
      privacy safeguards.

========================

Here is some information on how to join ACM.  The RISKS forum is an
ACM-sponsored activity.  ACM is also getting more involved in the kinds of
issues represented by RISKS and the above statement.  If you support these
activities and are not currently a member of ACM, I urge you to demonstrate
your support by joining.  You can obtain a membership application from any
issue of CACM.  If you cannot get hold of CACM, you can obtain an application
from:

  ACM, P.O. Box 12114, Church Street Station, New York, NY 10257

The costs are:

  $71  Voting Member (You are asked to have a Bachelor's degree, an
       equivalent level of education, or four full-time years of experience.
       The Bachelor's degree does not necessarily have to be in computer
       science; I don't know if it has to be in a related area.)
  $71  Associate Member (no membership requirements)
  $21  Student Member (You must be a registered student at an accredited
       educational institution, and a faculty member must certify your
       status.)
  $66  Joint member of the IEEE Computer Society
  $57  Member of one of the following overseas computing societies:
       ACS (Australia), AFCET (France), AICA (Italy), BCS (United Kingdom),
       BIRA/IBRA (Belgium), CIPS (Canada), CSZ (Zimbabwe), GI (Germany),
       HKCS (Hong Kong), ICS (Ireland), IPA (Israel), IPSJ (Japan),
       NGI (Netherlands), NZCS (New Zealand), SCS (Shanghai)

  Spouse members:
       Voting Members:   1st person + CACM  $71;  2nd person, no CACM  $48
       Student Members:  1st person + CACM  $21;  2nd person, no CACM  $14

  $35  Retired Member (annual income from part-time and consulting work does
       not exceed $2500; age + years of ACM membership must exceed 75)

One can also join a SIG without joining ACM.
While that would be less expensive than joining both the SIG and ACM, it
would not be as effective in demonstrating support for the activities listed
above.

Barbara Simons, National Secretary, ACM

------------------------------

Date: Wed, 20 Mar 91 15:42:55 -0500
From: Brian S. Hubbard
Subject: ISM Workshop Announcement

Sponsored and Administered By:  TIS, Trusted Information Systems, Inc.
In Coordination With:  DIS, Defense Investigative Service

     The 1991 Industrial Security Manual: A Workshop on Satisfying NEW
     Requirements for Site Approval of Automated Information Systems

     Washington, D.C., 7-9 May 1991
     Los Angeles, California, 14-16 May 1991

In order to process classified information using automated information
systems (AISs), a contractor site must receive approval from the Defense
Investigative Service (DIS).  The requirements for such site approvals are
stated in Chapter 8 of the Industrial Security Manual (ISM), DoD 5220.22.  At
the invitation of DIS, Trusted Information Systems, Inc. participated in the
development of the 1991 Industrial Security Manual, which was promulgated by
the Director of DIS in January 1991.  This revision of the ISM reflects the
requirements of DoD Directive 5200.28.  The process of receiving site
approval has been administratively streamlined; however, the requirements
themselves have been made technically more sophisticated and exacting.  The
revised requirements also offer a more realistic approach to addressing
threat and risk.  Part of this latest revision requires contractors to meet
the requirements of DoD 5200.28-STD, the Trusted Computer System Evaluation
Criteria (TCSEC), commonly known as the "Orange Book."

In order to explain the new requirements of the ISM and their application to
specific processing environments, Trusted Information Systems, Inc., in
coordination with DIS, is sponsoring and administering a comprehensive
three-day workshop.  This workshop is being developed by the developers of
the new requirements.  Lecturers include Stephen T. Walker (TIS), Carole
Jordan (Defense Investigative Service), Marvin Schaefer, Charles P. Pfleeger,
and William C. Barker (TIS).

For further information, contact Brian at hubbard@TIS.COM, or Trusted
Information Systems, Inc., Attn: WORKSHOP COORDINATOR, 3060 Washington Road,
Glenwood, MD 21738.  Phone: (301) 854-6889   FAX: (301) 854-5363

------------------------------

End of RISKS-FORUM Digest 11.32
************************