Subject: RISKS DIGEST 14.68
REPLY-TO: risks@csl.sri.com

RISKS-LIST: RISKS-FORUM Digest  Tuesday 1 June 1993  Volume 14 : Issue 68

   FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  Error on California Unemployment Checks (PGN)
  Re: Fake ATM Machine Steals PINs (Dan Franklin)
  Re: CHI & the Color-blind? (John R. Levine)
  Defending Crypto with the 2nd Amendment (Peter K. Boucher, David A. Honig,
    Jim Purtilo)
  Get yer RSA encryption now! (more on CLIPPER) (Jay Schmidgall)
  Re: Clipper (Carl Ellison)
  AIS BBS (Kim Clancy, Paul Ferguson, Vesselin Bontchev, Jim Thomas)

The RISKS Forum is a moderated digest discussing risks; comp.risks is its
Usenet counterpart.  Undigestifiers are available throughout the Internet,
but not from RISKS.  Contributions should be relevant, sound, in good taste,
objective, cogent, coherent, concise, and nonrepetitious.  Diversity is
welcome.  CONTRIBUTIONS to RISKS@CSL.SRI.COM, with appropriate, substantive
"Subject:" line.  Others may be ignored!  Contributions will not be ACKed.
The load is too great.  **PLEASE** INCLUDE YOUR NAME & INTERNET FROM:
ADDRESS, especially .UUCP folks.  REQUESTS please to RISKS-Request@CSL.SRI.COM.

For Vol i issue j, type "FTP CRVAX.SRI.COM<CR>login anonymous<CR>AnyNonNullPW<CR>
CD RISKS:<CR>GET RISKS-i.j<CR>" (where i=1 to 14, j always TWO digits).  Vol i
summaries in j=00; "dir risks-*.*" gives directory; "bye" logs out.  The COLON
in "CD RISKS:" is essential.  "CRVAX.SRI.COM" = "128.18.10.1".
<CR>=CarriageReturn; FTPs may differ; UNIX prompts for username, password.

For information regarding delivery of RISKS by FAX, phone 310-455-9300 (or
send FAX to RISKS at 310-455-2364, or EMail to risks-fax@vortex.com).

ALL CONTRIBUTIONS CONSIDERED AS PERSONAL COMMENTS; USUAL DISCLAIMERS APPLY.
Relevant contributions may appear in the RISKS section of regular issues of
ACM SIGSOFT's SOFTWARE ENGINEERING NOTES, unless you state otherwise.

----------------------------------------------------------------------

Date: Tue, 1 Jun 93 17:26:46 PDT
From: "Peter G. Neumann"
Subject: Error on California Unemployment Checks

The California Employment Development Department sent out 75,000 duplicate
unemployment checks --- which it attributed to a "computer error" [you
guessed it, right?].  "Those who cash their checks will be asked to
reimburse the state or will have future benefits docked," said a
spokesperson.  [Source: San Francisco Chronicle, article by Jonathan
Marshall, p. A15.]

  [I wonder what the increased workload will be in trying to recover from
  this accident, and whether they will actually catch all of those who
  cashed the extra check or will do so at some point in the next six
  months!  PGN]

------------------------------

Date: Tue, 1 Jun 93 14:59:45 EDT
From: "Dan Franklin"
Subject: Re: Fake ATM Machine Steals PINs

Brinton Cooper suggests several good ways to reduce the RISK that you're
dealing with a fake ATM machine.  Another way to "authenticate" an ATM
machine is to use it to ask for your bank balance--a piece of information
only a genuine ATM would (presumably) have.  Since you can't do this without
giving the machine your PIN, a failure means you have to check your bank
balance every few days thereafter to see if someone made a withdrawal--or
just change your PIN.  But if you believe the risk of the machine being fake
is low, and a way to phone the bank whose name appears on the machine is not
apparent, this is another possibility.

Dan Franklin

------------------------------

Date: 20 May 93 21:12:50 EDT (Thu)
From: johnl@iecc.cambridge.ma.us (John R. Levine)
Subject: Re: CHI & the Color-blind?

The question of whether color-blind people can see signs and controls isn't
particularly a computer-related one, but it is certainly made more of an
issue by the complex interfaces common in computerized systems.  The number
of people affected is surprisingly large.

I'm not very color blind and have no trouble telling red from green, so long
as they are fairly bright, but I have a real problem with black-on-red or
red-on-black signs and displays.  They look like, say, maroon on crimson,
with practically no contrast.  (For some reason, green doesn't seem to have
this problem.)  I believe this is a common problem for the large number of
slightly color blind people.  For the more severely color blind there is
also the better-known problem of not being able to distinguish between reds
and greens of similar brightness.

For designers of computer displays, I'd think it would be straightforward to
test them for color-blind robustness by replacing the color palette used in
the program with a monochrome one that maps each color to a suitably bright
shade of grey.  Or if that's hard, one could always point a video camera at
the sign or display and show the image on a black-and-white TV.
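A minimal sketch of such a palette-to-grey mapping (in Python, assuming
8-bit RGB palette entries and the usual luminance weights; the names are
only illustrative):

    # Map each (r, g, b) palette entry to a grey of the same perceived
    # brightness, so colors that differ only in hue collapse together.
    def to_grey_palette(palette):
        grey = []
        for r, g, b in palette:
            y = int(round(0.299 * r + 0.587 * g + 0.114 * b))
            grey.append((y, y, y))
        return grey

    # Black-on-red keeps little contrast once hue is gone, while the same
    # intensity of green stays usefully bright:
    print(to_grey_palette([(0, 0, 0), (200, 0, 0), (0, 200, 0)]))
    # -> [(0, 0, 0), (60, 60, 60), (117, 117, 117)]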
Regards,
John Levine, johnl@iecc.cambridge.ma.us, {spdcc|ima|world}!iecc!johnl

------------------------------

Date: Tue, 1 Jun 93 16:13:52 -0700
From: "Peter K. Boucher"
Subject: Defending Crypto with the 2nd Amendment

There's a risk in associating your cause with the 2nd Amendment.  Some
constitutional protections are "more equal" than others.

hunter@ncbi.nlm.nih.gov (Larry Hunter) writes:
> ... One consequence of that balance is that you cannot legally possess
> nuclear weapons or even, say, a .50 caliber machine gun privately.

"A well regulated militia, being necessary to the security of a free state,
the right of the people to keep and bear arms, shall not be infringed."

Everyone I know who has studied the history of this statement agrees that it
means that "the people" must have unrestricted access to all types of arms,
in order to a) obviate the need for a standing army, and b) defend the
country in the absence of a standing army.  No one I know expects (or
desires) that private ownership of nuclear arms be allowed (although private
ownership of the most powerful weapons of the day, such as cannons, was
common in the 18th century, and is exactly what the framers of the
constitution sought to protect).  Also, a .50 caliber machine gun can be
purchased legally in most states, if you pay the appropriate federal tax.

Paul Robinson writes:
> Judges frown heavily on laws mandating prior restraint, ...

This is good if you use some other amendment (not the 2nd) to back up your
case.  The Brady Bill (which mandates a 5-day waiting period on the purchase
of a handgun) is pure prior restraint.  There is no empirical evidence
whatsoever that it will have any beneficial impact on the level of violent
crime in the U.S.  Such laws have been enacted in several states, but none
has ever been shown to have reduced any aspect of violent crime.  Even
though the premise of such laws is "government must restrain the people
prior to the purchase of a handgun, on the chance that the purchaser might
be planning to commit a crime with it," none of them has ever been struck
down for this reason.

Pick another amendment with which to defend crypto, because the 2nd is risky.

Peter K. Boucher
------------------------------

Date: 31 May 93 20:34:21 GMT
From: "David A. Honig"
Subject: Re: Crypto and the 2nd Amendment (Hunter, RISKS-14.65)

PGN writes:
> [Incidentally, at last week's IEEE Symposium on Research in Security
> and Privacy, a rump group decided that because crypto falls under
> munitions controls, the right to bear arms must sanction private uses of
> cryptography!  PGN]

While this may amuse some, it actually addresses a profound and often
overlooked intent of the 'Founding Fathers'.  The People are guaranteed the
right to bear arms, not just for personal defense (which was obvious to
them), but also because: politicians prefer unarmed peasants.  An unarmed
populace is much easier to dominate.  And so is a populace without the
ability to have privacy.

David Honig

------------------------------

Date: Tue, 1 Jun 93 18:25:51 -0400
From: purtilo@cs.UMD.EDU (Jim Purtilo)
Subject: Re: Crypto as "Right to Bear Arms" issue, followup

Just a point of information concerning Larry Hunter's post on the analogy
between encryption as a "fundamental right" and other rights, such as those
recognized by the US Bill of Rights.  Larry observes:

# ... I am completely convinced that framers of the Constitution would have
# wholeheartedly endorsed citizen access to effective encryption as a
# fundamental right. ...

Unfortunately, Larry then reasons that "the `right to keep and bear arms'
strategy for defending encryption doesn't seem likely to succeed
practically" based upon some erroneous assumptions concerning the second
amendment.

# .... there are several practical problems with the idea.  First of all,
# constitutional rights must be balanced against each other.  Your right to
# bear arms is balanced against the rights of your neighbors to pursue their
# happiness in an orderly society.  One consequence of that balance is that
# you cannot legally possess ... , say, a .50 caliber machine gun privately.

Of course, some thoughtful citizens would observe that there is no balancing
act in Larry's statement.  Your neighbors' ability to pursue their happiness
in an orderly society is enhanced by both your and their ability to accept
responsibility for personal safety in this otherwise unsafe world.  And this
includes responsibility for protection from potentially corrupt governments.

Regardless, by the analogy to personal weaponry (machine gun), Larry weakens
his point unnecessarily, since in fact these *can* be owned and used in this
country.  They are taxed and regulated, but nevertheless legal to own by any
honest citizen not otherwise subject to local restrictions.  The real
balancing act in his analogy is between too-powerful government and the
threat posed by an armed citizenry -- and in fact our history has many
examples where the government has backed off or altered potentially unfair
policies based upon the fear of the consequences when We the People
reasserted control via arms.

Information is the ultimate in personal "arms".  In today's age, how else
than by knowing the citizenry's thoughts and plans can a government preserve
and protect its power base?  That is to say, encryption technology becomes a
powerful protection of citizens from too-big government, and *control* of
encryption is that big-government's counter.

Fortunately, Larry concludes with a message many of us soundly support:

# We should be fighting the claim that cryptography is useful primarily to
# criminals (and is therefore threatening) for precisely the same reason
# [that we fight claims that speech, arms, etc.
# are only useful to criminals.]

Jim Purtilo

------------------------------

Date: Thu, 20 May 1993 08:16:07 -0500 (CDT)
From: Jay Schmidgall
Subject: Get yer RSA encryption now! (more on CLIPPER)

In RISKS DIGEST 14.64, drand@osf.org writes:

|> That the crooks could always create more effective crypto gear is also a
|> red herring.  Maybe they could.  But the law is being structured so that
|> this itself would be considered probable cause of a crime.  We'll have to
|> see if (how much) this is abused.  One hopes that just the unlicensed
|> crypto gear would not be sufficient to indict honest people.

Ok, I guess I must have skimmed over this part.  Let me see if I understand
this properly: If I have got more secure crypto gear, probable cause exists
that I have committed a crime.

Hmmm.  Does this include any crypto gear that may have been purchased before
the corresponding CLIPPER-enabled gear became available?  Or am I allowed to
deduct the cost of the gear I previously bought as well as the cost of the
new CLIPPER gear I must buy to fall into line with the law?

Of course, this becomes a powerful tool for anyone interested in secured
international information exchange.  Since it is probable that most
countries that use an existing (more secure?) encryption standard are not
likely to switch to CLIPPER, anyone exchanging such information must own
gear capable of understanding that encryption.  Hence, probable cause and
you are tapped.

Hmmm again.  Does knowledge of the encryption algorithm also count as
probable cause?  After all, one can always do it in software...

Hmmm again.  Do they have to be able to find a program which implements it,
or merely some text which describes the algorithm?

I expect this will become worse than the witch hunts of old; however,
instead of starting a rumour that someone you don't like is a warlock or
witch, the rumour will be that "Jay knows RSA!"  Somehow, to me this seems
the most horrific part of this whole mess.  I do hope I've misunderstood.

jay@vnet.ibm.com   (c) Copyright 1993.  All rights reserved.

------------------------------

Date: Wed, 19 May 93 21:48 EDT
From: Carl_Ellison@vos.stratus.com
Subject: Re: Clipper (Bidzos, RISKS-14.64)

Jim Bidzos writes:
>The only way to make that advantage "disappear" is to publish everything
>about Capstone, including the algorithm that the keys you manage are used
>with, and wait a few years and a few hundred papers before proposing it as
>a standard.

Whit Diffie also asked for publication of the Skipjack algorithm.  That's
good, as far as it goes, but it's possibly more important to publish papers
giving design rules and cryptanalysis methods so that the readership can
judge the quality of your cipher design methods.  This occurs in the private
sector (especially in academia) but is not something we're likely to see
from the NSA.

Although I would love to learn their methods, I wouldn't want the NSA to
publish these details.  I want my tax dollars to be of some good and NSA
secrecy is part of that package.  So -- I just don't want any NSA
involvement in commercial cryptography.  We'll limp along on our own.
Perhaps, if we do something really stupid (e.g., use RSA when the NSA knows
that the PRC's cryptanalysts know how to factor 1000-bit numbers in seconds)
they'll be nice enough to tell us, but that's the limit of the interaction I
would hope to have with the NSA.

In particular, I hope this round of examination of export policy exempts
commercial cryptography (especially freely available code) from controls.
It's not even good gallows humor that my company is restricted from shipping
DES subroutine code to a country where there is DES subroutine code already
on BBSs.  Then again, maybe that is how we will have to develop code from
now on: write code assuming that the customer will find and install his own
public domain subroutine packages of DES, RSA, etc. ....
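A rough sketch of what that development style might look like (in Python;
the module and function names here are hypothetical, and this is only one
way to arrange the fallback -- the product ships with no cipher of its own
and picks up whatever the customer has installed locally):

    # Try a list of customer-installed, public-domain cipher packages and
    # use the first one found; ship nothing export-controlled ourselves.
    import importlib

    def load_cipher(candidates=("customer_des", "customer_rsa")):
        for name in candidates:
            try:
                return importlib.import_module(name)  # customer-supplied
            except ImportError:
                continue
        return None  # no local cipher package; run without encryption

    cipher = load_cipher()
    if cipher is None:
        print("No local cipher package found; install one to enable encryption.")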
------------------------------

Date: Mon, 17 May 93 10:02:00 EDT
From: Kim Clancy
Organization: National Institute of Standards and Technology (NIST)
Subject: Re: AIS BBS (RISKS-14.61)

I am the sysop for the AIS BBS mentioned earlier.  I would like to submit
two pieces of information for your readers.  One, the bbs number is
304-420-6083, although that will be changing soon; details are on the bbs.
The second is that the screen captures displayed in the message to RISKS
were done at a time when Mary Clark was taking care of the administrative
function of upgrading users' access.  She is a trainee and has no
decision-making role in the management of the bbs; that is my job.  I don't
want to see her name tied to this, as she is only functioning as directed.
Her name was only on the bbs for a short period of time and has since been
removed.  It is unfortunate that this screen capture occurred during the
very short period of time her name was listed; at least, that is how I view
it.  Any questions about the bbs should be directed to me.  Thanks much.

Kim Clancy

------------------------------

Date: Mon, 17 May 93 09:55:57 EDT
From: fergp@sytex.com (Paul Ferguson)
Subject: Re: Questioning AIS purpose and defending anonymous posting
         (Friedman, RISKS-14.60)

This reference was extracted from Computer Underground Digest (CuD) #5.36
(May 16, 1993), File 2--Building Bridges of Understanding in LE & Comp.
Community -

> The problem with many of these formats is that they tend to exclude
> the average computer user or law enforcement agent.  There's now an
> alternative.  Kim Clancy, a security specialist for the Dept. of
> Treasury's Office of Public Debt, has begun the "round-table" forum
> on Mindvox to bring a variety of views into open dialogue.  The intent
> is to increase the understanding by the public of the legitimate tasks
> of law enforcement, and to expand an awareness of the civil liberties
> concerns of the computer public for investigators and others.  Law
> enforcement personnel are understandably hesitant to engage in such
> discussions.  But, from what I've seen, there is no ranting, the
> discussions are generally of high quality (although an occasional
> topic drift does occur), and those participating are sincere in their
> attempts to stimulate discussion.  [...]
> Kim's [Clancy] credentials for moderating this type of a forum are
> impressive.  In addition to her security and anti-virus skills, she
> set up the AIS BBS, a BBS run by the Dept. of Treasury, Bureau of
> Public Debt.  Run by the Computer Security Branch, AIS BBS is intended
> as a resource for security specialists, scholars, or others seeking
> information about the varieties of computer abuse and how to combat
> them.  The files range from CERT advisories, documents on viruses,
> and "underground" files to simple public domain/shareware utilities,
> such as virus checkers.  For those lacking ftp access, AIS BBS is
> an excellent source of information and a public service of value
> to a broad range of computer professionals and researchers.  The
> AIS number is currently (304) 420-6083; in late May it will
> change to (304) 480-6083.
[ remainder deleted ]

Since the proverbial cat is out of the bag, I think the original poster was
justified in posting the alert anonymously.  I'd heard about this "service"
being run by Clancy a while back, but had actually verified that virus
disassemblies were available on AIS.  Since that time (and quite possibly
since public disclosure, or because of a change of direction), the SysOp of
that system who was once listed as "Mary Clark" has been replaced by Clancy
as the point of contact on the system.  Also, it would appear that they have
removed the virus disassembly from public access.

I don't think that it's proper to question a poster's integrity simply on
the basis that the message was posted anonymously.  I think that there are
instances where anonymity on the network can be a good (and sometimes
necessary) thing.  Whether an anonymous posting is appropriate depends
solely upon the language, content, and context of the information contained
within the message, and in these interesting days of Big Brother, it might
even be in some folks' best interest to post anonymously any information
that may be considered derogatory with regard to Uncle Sugar.

Power to the "little" people.

Paul Ferguson, Network Integrator, Centreville, Virginia USA   fergp@sytex.com

------------------------------

Date: Wed, 12 May 93 22:45:02 +0200
From: bontchev@informatik.uni-hamburg.de (Vesselin Bontchev)
Subject: Risks of anonymity and credulity (Friedman, RISKS-14.60)

I don't know who the anonymous submitter was, but I do know who the second
person who wished to remain anonymous is, for the simple reason that he sent
me the full file non-anonymously.  In fact, I know this person personally
and he is a very respectable person.  BTW, the part that has been published
here is less than 1% of the full file.  I can send it to you, if you still
don't believe it.  [...]

> I fully believe that our government sometimes does
> things that are stupid, immoral, and illegal, but this isn't the kind of
> stupidity that they do.

Well, maybe you shouldn't be so confident about the things that your
government can do...  Anyway, I do not know whether the US Government really
supports the BBS in question.  All I know is that the BBS claims to be
"official".

> In short, we need to be critical thinkers.  In addition, we need to think
> about the way in which anonymous posting lets things like this get widely
> disseminated without exposing the original poster to embarrassment and
> ridicule.  The next hoax, lie, or distortion from an anonymous source may
> not be this obvious.  Is the ability to anonymously make this kind of
> claim a risk?

Actually, the anonymous poster did a service to all of us by bringing up
such a topic for discussion.  What are the RISKS of ignoring valuable
information, just because the person who has posted it wished to remain
anonymous?  Another issue is what the RISKS are of misusing the freedom of
speech and officially spreading viruses around...  I am Bulgarian and my
country is known as the home of many productive virus writers, but at least
our government has never officially distributed viruses...

bontchev@fbihh.informatik.uni-hamburg.de   Vesselin Vladimirov Bontchev
Virus Test Center, University of Hamburg, Vogt-Koelln-Strasse 30, rm.
107 C, D-2000 Hamburg 54, Germany   +49-40-54715-224

------------------------------

Date: Thu, 13 May 93 02:19 CDT
From: Jim Thomas (tk0jut1@niu.bitnet)
Subject: Re: RISKS DIGEST 14.60

In RISKS (Vol 14 #58) appeared a post that makes us appreciate the freedom
of speech and information exchange we enjoy in the U.S.  The primary risk
I've learned after reading the post is that anonymous posters with an axe to
grind are potential threats to freedom of expression.  Two anonymous posters
falsely depict AIS BBS, a bulletin board run as a public information service
by Dept of Treasury/Office of Public Debt personnel, as a board engaged in
"unethical, immoral, and possibly illegal activities"  [...]  The remainder
of the anonymous post presents screen captures of directories and files to
which the poster objects.  Especially troublesome for the anonymous accusers
are virus-oriented files.

AIS is a reputable and professionally run open-access BBS.  It has one of
the most extensive collections of text and other files related to all
aspects of security in the country.  Some may object to some of the
materials, just as some might object to RISKS DIGEST or CuD being "funded"
with taxpayers' money.  It strikes me as reprehensible to take selected
material out of context and piece together an image of immorality or worse
by presenting a misleading image of the materials on the BBS and the
purposes for which those materials are intended.  That the accusers make
their claims while hiding behind the cloak of anonymity strikes me as the
type of cowardice associated with witch hunts.

The anonymous posters seem to be bothered by the existence of virus source
code on the board.  I wager one would learn far more about virus writing and
distribution tactics from VIRUS-L than from the AIS files, but the two
anonymous posters seem to be part of a handful of strident pseudo-moral
entrepreneurs who feel that only the information they judge as appropriate
for public consumption should be made available.  I'm surprised that the
anonymous critics did not also demand that public libraries be closed.

It is one thing to disagree with the position of another and raise the
contentious issues as a matter of public debate.  It is quite another to
engage in the cowardly act of anonymously distorting the function of a
legitimate and widely-used BBS by insinuating "unethical, immoral, and
possibly illegal activities."

CuD ran an interview with the AIS BBS personnel (CuD 4.37, 1992), and a few
excerpts may put the purposes of AIS BBS in perspective:

*** begin excerpts ***

Q: What is this Board?  (name, number, who runs it (dept & sysop).  What
kind of software are you using?  When did the Board go on-line?

A: The Bulletin Board System (BBS) is run by the Bureau of the Public Debt's
Office of Automated Information Systems Security Branch.  The mission of the
Bureau is to administer Treasury's debt finance operations and account for
the resulting debt.  The OAIS security branch is responsible for managing
Public Debt's computer systems security.  The AIS BBS is open to the public
and the phone number for the Board is (304) 420-6083.  There are three
sysops, who manage the Remote Access software.  The BBS operates on a
stand-alone PC and is not connected to any other Public Debt systems.  The
Board is not used to disseminate sensitive information, and has been up and
operating for the past 15 months.

Q: What are the goals and purposes of the Board?

A: The BBS was established to help manage Public Debt's security program.
Security managers are located throughout Public Debt's offices in
Parkersburg, WV and Washington DC.  The security programmers saw a need to
disseminate large amounts of information and provide for communication
between program participants in different locations.  Because the Board was
established for internal purposes, the phone number was not published.
However, the number was provided to others in the computer security
community who could provide information and make suggestions to help improve
the bureau's security program.  Gradually, others became aware of the
Board's existence.

Q: What kinds of files and/or programs do you have on the Board?  Why/how do
you choose the files you have on-line?

A: There is a wide variety of files posted.  In the beginning, we posted
policy documents, newsletter articles from our internal security newsletter,
bulletins issued by CERT, such as virus warnings, and others for internal
use.  I located some "underground" files that described techniques for
circumventing security on one of the systems we manage.  The information,
from Phrack magazine, was posted for our security managers to use to
strengthen security.  When we were called by others with the same systems,
we would direct them to those files as well.  Unexpectedly, the "hacker"
that had written the file contacted me through our BBS.  In his article he
mentioned several automated tools that had helped him take advantage of the
system.  I requested that he pass on copies of the programs for our use.  He
agreed.  This is how our "hacker file areas" came to be.  Other hackers have
done the same, and we have also received many files that may be useful.  It
is, indeed, an unusual situation when hackers and security professionals
work together to help secure systems.  However, this communication has been
beneficial in strengthening an already secure system.

Q: How did you get the idea to set it up?

A: The security branch accesses many BBSs on a daily basis for research
purposes, information retrieval, and communication with others.  Since our
security program is decentralized, the BBS seemed to be an effective way of
communicating with program participants in diverse locations.

*** end excerpts ***

Perhaps the anonymous accusers are correct: Some types of information may
pose a risk if abused.  But, in an open democracy, the potential for abuse
has been neither a necessary nor a sufficient justification to silence those
with whom we disagree.  If potential for abuse were a primary criterion for
suppressing the flow of information and freedom of expression, we would live
in a rather silent world, and there would likely be no RISKS digest (which
arguably subverts the national interest by undermining faith in computers
and in government, all of which is largely done with public funding).

Hiding behind anonymity to reduce the risks of accounting for their
accusations, the anonymous posters call not only for silencing, but for
sanctions against the sysops.  This suggests several risks:

1) Posters who are unwilling to accept responsibility for their claims are
   more able to distort information in ways that leave the target vulnerable
   and unable to face their accusers.

2) Anonymous posters who call for silencing and sanctions on the basis of
   unexamined and questionable claims create a chilling effect on freedom of
   expression.

3) Anonymous posters with an apparent axe to grind contribute to poisoning
   the well of free information and reduce the opportunity to openly discuss
   and debate issues.
Our society can far more readily tolerate the existence of information that
some may find inappropriate than we can risk the censorship of information
because it offends a few zealots engaged in a form of cyber-guerilla warfare
by making anonymous claims.

Jim Thomas, Cu-Digest, Sociology/Criminal Justice
Northern Illinois University, DeKalb, IL 60115

   [OTHER MESSAGES ON THIS SUBJECT WERE LARGELY REDUNDANT TO THE PRECEDING
   CONTRIBUTIONS, AND ARE OMITTED.  PGN]

------------------------------

End of RISKS-FORUM Digest 14.68
************************