Subject: RISKS DIGEST 13.35
REPLY-TO: risks@csl.sri.com

RISKS-LIST: RISKS-FORUM Digest  Saturday 4 April 1992  Volume 13 : Issue 35

   FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents: [Loose-end clean-up]
Some details on Patriot problem (Frederick G. M. Roeber via Stanley Chow)
Re: War Games II (Les Earnest)
More on Gibson electronic book virus (Tom Maddox via Blake Sobiloff)
Re: Neuromancer (Keith Bierman)
Risks of faked-up software advertising (John Lupien)
Xerox PaperWorks; Imprecise Interrupts (Barry Johnson)
Imprecise FP traps (Gideon Yuval)
CMOS RAM for security (Tom Brusehaver)
Subject of the Data as "Owner" (Bill Murray)
Re: FBI v. digital phones (Cris Pedregal Martin)
On Electronic Privacy... (Peter Wayner)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in
good taste, objective, coherent, concise, and nonrepetitious.  Diversity is
welcome.  CONTRIBUTIONS to RISKS@CSL.SRI.COM, with relevant, substantive
"Subject:" line.  Others may be ignored!  Contributions will not be ACKed.
The load is too great.  **PLEASE** INCLUDE YOUR NAME & INTERNET FROM: ADDRESS,
especially .UUCP folks.  REQUESTS please to RISKS-Request@CSL.SRI.COM.

Vol i issue j, type "FTP CRVAX.SRI.COM<CR>login anonymous<CR>AnyNonNullPW<CR>
CD RISKS:<CR>GET RISKS-i.j" (where i=1 to 13, j always TWO digits).  Vol i
summaries in j=00; "dir risks-*.*" gives directory; "bye" logs out.  The
COLON in "CD RISKS:" is essential.  "CRVAX.SRI.COM" = "128.18.10.1".
<CR>=CarriageReturn; FTPs may differ; UNIX prompts for username, password.

ALL CONTRIBUTIONS CONSIDERED AS PERSONAL COMMENTS; USUAL DISCLAIMERS APPLY.
Relevant contributions may appear in the RISKS section of regular issues of
ACM SIGSOFT's SOFTWARE ENGINEERING NOTES, unless you state otherwise.

----------------------------------------------------------------------

Date: 3 Apr 92 17:58:00 EST
From: Stanley (S.T.H.) Chow
Subject: Some details on Patriot problem (from comp.realtime)

This is the first time I have seen a detailed description of the
round-off/accumulation problem.

Stanley Chow   (613) 763-2831

Article 74 of comp.realtime:
Path: bqnes23!bmerh2!bnrgate!stl!uknet!mcsun!dxcern!vxcrna.cern.ch!roeber
From: roeber@vxcrna.cern.ch
Newsgroups: comp.realtime
Subject: Re: Design question about Patriot
Message-ID: <1992Apr3.104714.1@vxcrna.cern.ch>
Date: 3 Apr 92 09:47:14 GMT
References: <11204@mindlink.bc.ca> <1992Apr3.062551.7306@sequent.com>
Sender: news@dxcern.cern.ch (USENET News System)

In article <1992Apr3.062551.7306@sequent.com>, jjb@sequent.com (Jeff
Berkowitz) writes:
> [...]
> The article states "...this particular Patriot battery had been
> running continuously for about 100 hours...[and]...had built up
> a timing lag of 0.3433 second."  This timing error was sufficient
> to shift the "range gate" enough to cause the system to disregard
> the Scud.
>
> My "comp.realtime" question: what algorithm in the Patriot system
> would cause an ABSOLUTE time variation like this to change a
> relative computation, like the predicted course of the incoming
> missile?

I just received my copy of the US General Accounting Office's "Report to the
Chairman, Subcommittee on Investigations and Oversight, Committee on Science,
Space, and Technology, House of Representatives: Patriot Missile Defense -
Software Problem Led to System Failure at Dhahran, Saudi Arabia" (phew!).
Since I can't find a copyright notice on it anywhere, I'll quote the
appropriate paragraph:

"The range gate's prediction of where the Scud will next appear is a function
of the Scud's known velocity and the time of the last radar detection.
Velocity is a real number that can be expressed as a whole number and a
decimal (e.g., 3750.2563...miles per hour).  Time is kept continuously by the
system's internal clock in tenths of seconds but is expressed as an integer
or whole number (e.g., 32, 33, 34...).  The longer the system has been
running, the larger the number representing time.  To predict where the Scud
will next appear, both time and velocity must be expressed as real numbers.
Because of the way the Patriot computer performs its calculations and the
fact that its registers are only 24 bits long, the conversion of time from an
integer to a real number cannot be any more precise than 24 bits.  This
conversion results in a loss of precision causing a less accurate time
calculation.  The effect of this inaccuracy on the range gate's calculation
is directly proportional to the target's velocity and the length of time the
system has been running.  Consequently, performing the conversion after the
Patriot has been running continuously for extended periods causes the range
gate to shift away from the center of the target, making it less likely that
the target, in this case a Scud, will be successfully intercepted."

Interestingly, when the Israelis were running their Patriot systems, they
studied the heck out of them, and noticed this problem after merely 8 hours
of operation.  (8 hours corresponds to a range-gate shift of 55 meters; 100
hours corresponds to 678 meters.)  They sent this data back, and the
Americans responded with something like, "This was designed for the European
theater, where it would be run for a few hours before being moved.  Who in
his right mind would run a system continuously for eight hours?  Or more?"

They did write a software patch to fix the problem, but it took two weeks to
arrive in Saudi Arabia because of transport difficulties.  It arrived the day
after the failure.

Frederick G. M. Roeber | CERN -- European Center for Nuclear Research
e-mail: roeber@cern.ch or roeber@caltech.edu | work: +41 22 767 31 80
r-mail: CERN/PPE, 1211 Geneva 23, Switzerland | home: +33 50 42 19 44
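   [An illustrative aside, not part of Roeber's posting: the quoted timing
   figures can be checked with a few lines of arithmetic.  The sketch below
   (Python, used purely for illustration) assumes that the integer count of
   tenths of seconds is converted to seconds by multiplying by the constant
   1/10 chopped to 23 binary fraction bits -- that is, that one bit of the
   24-bit register is not available for the fraction.  That detail is an
   assumption made here to reproduce the published figures; it is not stated
   in the excerpt above.

      # Error accumulated by converting an integer count of 0.1 s ticks to
      # seconds using a chopped fixed-point representation of 1/10.
      TICK = 0.1                                    # nominal clock tick, seconds
      FRACTION_BITS = 23                            # fraction bits assumed usable
      stored = int(TICK * 2**FRACTION_BITS) / 2.0**FRACTION_BITS
      err_per_tick = TICK - stored                  # ~9.5e-8 s lost on every tick

      for hours in (8, 100):
          ticks = hours * 3600 * 10                 # elapsed 0.1 s ticks
          drift = ticks * err_per_tick              # accumulated clock error
          print(f"{hours:3d} hours: clock drift = {drift:.4f} seconds")

   This prints about 0.0275 seconds of drift after 8 hours and 0.3433 seconds
   after 100 hours, matching the timing lag quoted above; the range-gate
   shift is roughly this drift multiplied by the closing velocity of the
   target.]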
------------------------------

Date: Thu, 2 Apr 92 20:28:47 -0800
From: Les Earnest
Subject: War Games II (Raymond, RISKS-13.33)

I found Eric Raymond's account of NORAD telephone indirection amusing but not
at all unusual -- I recently encountered a more elaborate runaround in
dealing with the county bureaucracy that manages the bus system here.

Eric is lucky that he did not get the treatment that we used when I had an
office at C.I.A. headquarters.  There we answered the phone with "Hello" and,
unless the calling party immediately named a person who shared the office, we
were programmed to hang up without another word.

As one who helped design the initial computer system that went into the
Cheyenne Mountain facility, and who much later provided some input to the
screenwriters who wrote "War Games," I will assert that there is less there
than meets the eye (and imagination).  That facility was intended to
integrate and control a number of other so-called command-control-
communication (C3) systems, but suffered from the same fundamental design
flaws as its predecessors.  (See my C3 series in RISKS, which I suspended two
years ago but intend to resume as soon as I get time.)

Let me acknowledge that the Cheyenne Mountain facility is not totally
useless.  If a nuclear holocaust does occur and if the environmental security
systems there function as advertised, the residents of that hole may have an
opportunity to repopulate the Earth after a time.

Fortunately, the even more elaborate and senseless proposal to develop a
successor system, dubbed the Strategic Defense Initiative by Ronald Reagan,
now seems to be fading away despite attempts to revive it as a Killer
Meteorite Initiative.

Les Earnest, 12769 Dianne Drive, Los Altos Hills, CA 94022   415 941-3984
Les@cs.Stanford.edu   ...decwrl!cs.Stanford.edu!Les

------------------------------

Date: Fri, 3 Apr 1992 02:53:26 -0500
From: ccb@rac2.wam.umd.edu (Blake Sobiloff, Human-Computer Interaction Lab)
Subject: More on Gibson electronic book virus (from alt.cyberpunk)

Subject: Re: William Gibson's next novel....
From: tmaddox@milton.u.washington.edu (Tom Maddox)
Date: 2 Apr 92 19:29:44 GMT
Organization: The Evergreen State College, Olympia, Washington
Newsgroups: alt.cyberpunk,rec.arts.sf.written
References: <1992Mar31.231258.4312@pasteur.Berkeley.EDU>
Keywords: William Gibson
Xref: umd5 alt.cyberpunk:9967 rec.arts.sf.written:6007
Sender: news@u.washington.edu (USENET News System)
Path: umd5!haven.umd.edu!darwin.sura.net!jvnc.net!yale.edu!spool.mu.edu!uwm.edu!ogicse!henson!news.u.washington.edu!milton.u.washington.edu!tmaddox
Message-ID: <1992Apr2.192944.9444@u.washington.edu>

Given the recent confusion (mine included) about a few matters Gibsonian, I
thought I might post a clarification.  I talked to him last night, and he
said: the virus-loaded thingie does or will soon exist, but it's certainly
not his next novel; in fact, it's a couple of thousand words that he refers
to as "the text," and it's poetry, accompanying "a bronze booklike object"
that weighs a bunch and contains etchings by Dennis Ashbaugh and a diskette
with the Gibson poem; the poem, if the software works the way it's intended
to, will self-destruct after a single reading ("but you can videotape it");
this is all aimed at the art collector's market, and I get the impression
Gibson's part in it occurred as a favor to the artist; in re the draft-dodger
matter -- he was never actually drafted but went to Canada to evade the
possibility; he says he would not claim the moral credit that goes with being
an actual draft dodger.

This report brought to you by Citizens for More Accurate Cyberfacts.

Tom Maddox, tmaddox@u.washington.edu

------------------------------

Date: Thu, 2 Apr 92 15:06:07 PST
From: Keith.Bierman@eng.sun.com (Keith Bierman fpgroup)
Subject: Re: Neuromancer (RISKS-13.33)

>William Gibson, well known for his "Neuromancer" (which in 1986 anticipated
>what is today known as virtual reality), has a new book, ...

Gibson was far from the first.  The details are best left to the
rec.arts.sf.* groups, but it is serious revisionist history to claim that
Gibson is the creator of VR.

   [Gibson's new Book of the Dead might be titled NECROMANCER, to contrast
   it with his earlier NEUROMANCER!
   PGN]

------------------------------

Date: Fri, 20 Mar 92 15:14:23 EST
From: lupienj@hpwarq.wal.hp.com (John Lupien)
Subject: Risks of faked-up software advertising

Brian Kantor's article about the wrong zip codes on the addresses used in an
advertisement for a text-processing package brought to mind another faked-up
advertisement that is potentially much RISKier.  COMPUTER LANGUAGE carries a
recurring advertisement for a software product that uses a compound bow with
an arrow as its illustration.  Perhaps the intent is to indicate that the
product is high-tech, accurate, powerful, and easy to use (all of which might
be said of compound bows), but if you look closely you notice that the arrow
is on the wrong side of the bow and could not possibly be nocked on the
bowstring.  If the bow were loosed in this configuration, the most likely
result would be embarrassment on the part of the operator, but if the arrow
were to partially catch the string, it could do considerable damage to the
operator and/or anyone else around.  The target, however, would not be
exposed to significant risk of being hit... and if I were the intended target
of the advertisement, I would have to say that it was rather wide of the
mark...

To put the risk more succinctly: it is important to get the details right in
your advertisements, because people who notice a lack of attention to detail
in advertisements may well assume that it is indicative of the product as
well.

John R. Lupien, lupienj@hpwarq.hp.com

------------------------------

Date: Fri, 3 Apr 92 09:38:00 EST
From: BARRY JOHNSON
Subject: Xerox PaperWorks; Imprecise Interrupts

I had assumed a couple of recent items would have been answered by someone
more informed than I, but since they haven't, let me offer my bit ...

Security v. Xerox Corp.'s PaperWorks

The March 30, 1992 issue of _InfoWorld_ (p.37) shows the top couple of inches
of the form that one uses to request stuff.  Right below a 'Please do not
write above this line' header is a SECURITY block.  It has 26 boxes across
the page with a letter A-Z above each.  So, there is something ...

Imprecise Interrupts

As surprised as I was to hear that DEC's ALPHA chip may not return an
interrupt tagged to the instruction that caused it, I was even more concerned
by the subsequent discussion that seemed to imply that, since an IBM
mainframe did this years ago, it is probably OK.

As far as I know, the Unisys (Burroughs) A Series has been pipelining in its
higher-performance machines for years, but it has always reported an
interrupt against the instruction that caused it.  This is despite, I
believe, the possibility that it may have over a dozen instructions
concurrently in the pipe.

Not being a hardware person, I am left wondering: is it possible to properly
manage inter-instruction dependencies in the pipeline without being able to
track an interrupt to the instruction that caused it?  Wouldn't it be
necessary to track the offender so that one could determine the dependent
instructions that will also need to be cleaned out?

The bottom line is that just because it has been done before doesn't mean it
is OK to keep doing it (cf. Microsoft Windows 3.1: just because NOT checking
parameters has been done before doesn't mean they should keep doing it!).

Barry Johnson
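   [An illustrative aside in answer to the question above, not a description
   of any particular machine: a pipelined processor can deliver precise
   interrupts by tagging every in-flight operation with the address of the
   instruction that issued it and committing results strictly in program
   order; a fault is then reported against that tag, and everything younger
   is squashed.  Machines that report imprecise interrupts track dependencies
   internally but do not preserve enough of that bookkeeping to name the
   offender at trap time.  The toy Python sketch below (all names and
   structure invented for illustration) shows the minimal bookkeeping
   involved.

      class PrecisePipeline:
          """Toy model: in-order retirement with a per-operation PC tag."""
          def __init__(self):
              self.in_flight = []           # issued but not retired, oldest first

          def issue(self, pc, op):
              self.in_flight.append((pc, op))

          def retire(self):
              """Commit oldest-first; on a fault, name the PC and squash the rest."""
              while self.in_flight:
                  pc, op = self.in_flight[0]
                  try:
                      op()                  # execute/commit the oldest operation
                  except ArithmeticError as e:
                      younger = [p for p, _ in self.in_flight[1:]]
                      self.in_flight.clear()
                      return f"trap at pc={pc} ({e}); squashed {younger}"
                  self.in_flight.pop(0)
              return "retired cleanly"

      p = PrecisePipeline()
      p.issue(100, lambda: 1 + 1)
      p.issue(104, lambda: 1 // 0)          # faults: divide by zero
      p.issue(108, lambda: 2 * 3)           # younger instruction, must be squashed
      print(p.retire())                     # trap at pc=104 (...); squashed [108]

   Dropping the per-operation tag, or letting younger results commit before
   the fault is noticed, is what turns this into the imprecise case.]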
------------------------------

Date: Fri, 1 Mar 92 13:04:40 PST
From: gideony@microsoft.com (Gideon Yuval 1.1114 x4941)
Subject: Imprecise FP traps (Klossner, RISKS-13.27)

In RISKS-13.27, andrew@frip.wv.tek.com says "the only practical difference
between precise & imprecise exceptions is that you can't report the PC of the
offending instruction when imprecise".  I think this is overoptimistic:

To implement the default (denormalizing) mode of IEEE754/854 underflow
handling, most systems trap on underflow and let the trap handler do the
denormalizing.  This becomes difficult in an imprecise-exception environment,
where the original operands may already have been overwritten by the time the
exception occurs (even if the "offending" instruction can be determined).

Since a system that breaks the IEEE defaults is obviously one that breaks
IEEE, this seems a real issue.

Gideon Yuval, gideony@microsoft.com, 206-882-8080 (fax:-883-8101;TWX:160520)
Microsoft, 1 Microsoft Way, Redmond, WA 98052-6399 (home: -232-2119)
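   [An illustrative aside, not part of Yuval's posting: "denormalizing" here
   refers to IEEE 754 gradual underflow, in which a result smaller than the
   smallest normal number is delivered as a denormalized (subnormal) number
   with reduced precision rather than being flushed to zero -- and that is
   the default behaviour a trap handler has to reproduce.  A few lines of
   Python (which uses IEEE 754 double precision on essentially all current
   platforms) show what the default looks like:

      import sys

      tiny = sys.float_info.min     # smallest positive *normal* double, ~2.2e-308
      sub = tiny / 4                # result falls below the normal range

      print(sub)                    # a subnormal, about 5.6e-309 -- not zero
      print(sub > 0.0)              # True: gradual underflow, not flush-to-zero
      print(tiny / 2**52)           # 5e-324, the smallest subnormal, still nonzero
      print(tiny / 2**53)           # 0.0 -- only here does underflow reach zero

   A system that silently flushed `sub' to zero would be breaking the IEEE
   default, which is the point above about the cost of imprecise exceptions.]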
------------------------------

Date: Fri, 3 Apr 92 09:49:18 CST
From: intran!clam!tom@uunet.UU.NET (Tom B.)
Subject: CMOS RAM for security

Over the past 5 years our little division has used the Premis accounting
software package.  About 3 years ago Xerox bought our company, and it has
more or less left us to be our own little company; we continued to use the
Premis software.  Last fall Xerox started cutting back and was thinking of
rolling our accounting into Xerox's accounting, so we stopped paying Premis
for support on software we were about to stop using.

Changes happen, and our office moved.  During the move, the accounting PC
lost the battery that preserves its CMOS RAM (it's an original IBM AT).  Once
I found out what drive type it had, reset the date and time, and got
everything running, the accounting guy tried his software.  Well, the
software now says we are not licensed.

We thought a simple phone call was all it would take to get the license going
again, but the Premis folks informed us there would be a fee, including all
the months of support we hadn't paid for but (by their assumption) were still
using.  Then, when pressed, they claimed the software license was
"non-transferable" and that, since Xerox is now our company name, the
software must be re-licensed (Xerox had been sending them checks for most of
three years, and they never complained).

A more wizardly PC guy who works here looked at things and found that the
Premis software lives on its own non-DOS partition, and that the original
Premis floppies contain only a DOS program that can load this non-DOS
partition.  The backup procedures used here only back up DOS partitions
(Premis charges extra for its backup program, so someone decided we would use
our own).

Maybe these aren't risks, but they sure are dangerous to people who are not
computer wizards, in that no one had informed anyone that this was really
special software (non-DOS), using non-standard hardware (CMOS RAM) in ways
not intended.  What if some new piece of software had needed the same CMOS
RAM to store its license?

Tom Brusehaver   uunet!intran!tom

------------------------------

Date: Sat, 4 Apr 92 18:04 EST
From: WHMurray@DOCKMASTER.NCSC.MIL
Subject: Subject of the Data as "Owner" (Postpischil, RISKS-13.34)

>Finally, there is a lesson to be learned about database records and privacy.
>Although the Department of Safety keeps these records, we should not consider
>the Department to be the owner of the records.  Each record is owned by the
>person whose record it is, and the owner has a right to know what is in the
>record and when changes are made.  The owner has a right to control their...

It is a gross over-simplification to assert that, because a record is about
me, I "own" it.  While I clearly have an interest in it, and while the holder
may have a responsibility to me to protect that interest, that interest may
be far short of ownership.

Ownership may be defined as the exclusive right to use.  This is the
definition that we usually have in mind when we talk about chattels.  With
regard to real property, such ownership may be granted by the sovereign and
subject to eminent domain, but the definition still works reasonably well.
Ownership of information is similar in that most of my rights, for example
copyrights, are granted by the sovereign or the legislature.  Information is
different in that some use of it by others may not infringe my rights or
intent.

However, if we were to use this definition of ownership and to assert that
such ownership automatically or by default rests with the data subject, then
we would expect all use of personal data to be by delegation of the subject
owner's rights.  Clearly this is not the case, and the driving license record
and the driving license itself are good examples of the point.  If under this
definition I were the owner of my driving license record, then the state
would be able to use it only under a license from me.  Clearly that is not
the case.  The record and its uses are authorized by law.  It is kept at
state expense for a legitimate state purpose.  While some state laws do make
explicit mention of the rights of the subject of the data, most are silent.
None suggest that the state uses the record at the sufferance of the subject.

Other examples are less clear.  A copy of the pay record is kept by the
employer to fulfill his legal obligations to the employee, the employee's
bargaining agent, the state, and the owners; many of these obligations he
could not meet in the absence of such records.  They are kept at his expense
for a legitimate purpose.  While it may well be that his right to keep such
records is granted by the employee as part of the employment agreement, I am
not aware of any such agreement that speaks explicitly to the issue.  Perhaps
more to the point, I am not aware of any challenges to the rights of
employers to keep such records.  While equity and fair information practice
both clearly suggest that the employee has a protectable interest in the
record, such interest would appear to be far short of "the exclusive right to
use."

Balancing the rights and interests of record keepers against the rights of
record subjects is a complex issue.  It will eventually be clarified, and
perhaps even codified, at law.  To date the law is mostly silent.  In the
presence of such silence I suppose anyone can assert what he would like the
outcome to be as though it were a fait accompli.  I would suggest that it is
less than helpful, perhaps even destructive, to do so.

William Hugh Murray, 21 Locust Avenue, Suite 2D, New Canaan, Connecticut 06840
203 966 4769, WHMurray at DOCKMASTER.NCSC.MIL

------------------------------

Date: Fri, 3 Apr 92 9:49:01 EST
From: pedregal%unreal@cs.umass.edu
Subject: Re: FBI v. digital phones (Dobkin, RISKS 13.33)
Reply-To: pedregal@cs.umass.edu

Daniel Dobkin offers a justification for the FBI's demand for the ability to
wiretap without Telco cooperation.
(This follows the reasoning of Brian Kantor, carefully avoiding its disputed
last step.)  Dobkin says:

> In fairness to the FBI, there are other possibilities, such as when a
> Telco employee is himself the subject of an investigation.

I find this extremely naive.  If, in order to investigate a Telco employee,
one must stay outside of Telco COs and such, there is a serious security
problem at the Telco regardless of other technical issues.  The other side of
the coin (misuse of such capabilities) has been discussed already.

I agree with Kantor that any conceivable (legal) wiretap can be carried out
with the Telco's cooperation.  This is enough, and it has been well argued
that more than this (i.e., what the Bureau demands) poses risks from both
civil-rights and technical perspectives.

The risks in Dobkin's good-natured remark are ascribing (pure) "motivation"
or moral qualities to institutions (in this case, government ones), and
forgetting Occam's razor: Telco wiretapping is sufficient, effective, and
cheap.  Why more?

Cris Pedregal Martin -- Computer Science Dept., UMass, Amherst.

------------------------------

Date: Sat, 28 Mar 92 10:12:57 -0500
From: wayner@cs.cornell.edu (Peter Wayner)
Subject: On Electronic Privacy...

I read William Sessions' piece in the NYT on why the FBI wants to maintain
the ability to tap phones, and I think he makes good points.  If we want to
defend cryptography and electronic privacy, I think we need to argue that
privacy is not just a philosophical shield used by child pornographers and
drug dealers.  It is an essential tool that normal people need to protect
their interests.  I wrote this down in an op-ed piece on Senate Bill 266 (the
one that would ban encryption) that never saw the light of day.  Rather than
re-invent the squeal, I'll just send it along.

Everyone knows the problem with postcards.  The mailman hands over the mail
and says, ``Too bad about your brother.  Paid all that money to go skiing in
Switzerland just to break his arm on the first day.''  Right now, the Senate
is considering an anti-crime bill that effectively requires all communication
to be as easily readable to government officials as postcards.  The reason,
they say, is that drug dealers and other criminals are using secret codes to
do business.  The sad truth is that the bill will hardly deter criminals.  If
anything, it will make their lives easier by making it impossible for
American companies and citizens to keep anything confidential.

The point of Section 2201 of Senate Bill 266 is to prevent people from using
secret codes.  The practice is not widespread now because people communicate
on paper, and sealed envelopes are enough to do the trick.  In the coming
years more and more letters and documents will travel electronically, and in
electronic communication encryption is the equivalent of an envelope.
Although you can't write ``sealed with a kiss'' across it, you can still make
sure that no electronic postmaster is perusing your love letters.  There may
be no seal, but banks and other businesses can use encryption to keep
financial statements confidential.

Although the language of the bill makes it more a non-binding resolution than
a real law with teeth, it specifically asks that all manufacturers of
communication equipment ensure that the ``plaintext'' of a message is easily
available to the state.  No one knows what this will entail, but there is a
real danger that the executive branch might make regulations based on these
non-binding wishes of Congress.
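   [An illustrative aside, not part of Wayner's op-ed: the "envelope" analogy
   is easy to make concrete.  The toy Python sketch below seals a message by
   XORing it against a freshly generated random key of the same length (a
   one-time pad); without the key the sealed message is indistinguishable
   from noise.  This is a teaching toy, not a recommendation of any
   particular cipher, and the message text is invented for the example.

      import secrets

      def seal(message):
          """'Envelope' a message: XOR it with a one-time random key."""
          key = secrets.token_bytes(len(message))
          ciphertext = bytes(m ^ k for m, k in zip(message, key))
          return ciphertext, key

      def unseal(ciphertext, key):
          """XOR with the same key recovers the original message."""
          return bytes(c ^ k for c, k in zip(ciphertext, key))

      note = b"Too bad about your brother's arm."
      sealed, key = seal(note)
      print(sealed.hex())              # looks like random noise without the key
      print(unseal(sealed, key))       # the original note comes back intact

   Swap in any real cipher for the XOR step and the analogy is the same:
   whoever holds the key can open the envelope; everyone else sees noise.]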
In any case, any piece of communication that can be read by the government
can also be read by 16-year-old hackers or, more importantly, by foreign
companies spying on their American competitors.

The problem, in the eyes of the bill's authors, is that computers make it too
easy to jumble a letter or a ledger record into an incomprehensible mess of
seemingly random noise.  Only the person with the correct password can
resurrect the file.  The government is worried that this computer power will
be a tremendous hindrance to investigations.  They won't be able to produce
any evidence of wrongdoing because everything will be encrypted in code.

While this may be a problem, it's nothing new.  Criminals have been using
secret codes since the beginning of time.  One of the classic cliches of
Hollywood is the doublespeak of gangsters.  One asks, ``Did the shipment of
tomatoes come in yet?''  The other responds, ``Yes, and they will cost you
10,000 bananas.''  Henry Wadsworth Longfellow made the encryption technology
of the American Revolution famous when he set it to rhyme: ``One if by land,
two if by sea, and I on the opposite shore shall be.''  Although there
weren't any computers involved, the message got through.

It is easy to create an innocuous system that doesn't look like a code.  When
a major New York jeweler was caught evading sales tax by sending empty boxes
to addresses out of the state, the police found that they were making a
little dot next to the illegal entries in the books.  When Eliot Ness brought
Al Capone down for tax evasion, he got the bookkeeper to decipher the cryptic
entries in the books.  Would prohibiting encryption stop this behavior?
Hardly.  What's another law when you're already taking on the IRS?

Almost by definition, this bill will touch sensitive subjects.  If people
consider something a private matter, they usually have a reason for feeling
that way.  There may not be a right to privacy specified in the Constitution,
but many feel that there should be strict limitations on the bounds of
government.  In a sense, requiring all messages to travel on the equivalent
of postcards isn't much different from getting a search warrant for every
envelope in the country.

But the people on the other side of the spectrum, who feel that ``privacy''
is just a philosophical shield for suspicious activity, should be just as
wary of the bill.  Criminals are notorious for using confidential data to
defraud.  My credit card company assumes that if I tell them my account
number and the billing address over the phone, then I am probably who I say I
am.  But when every bill or letter must be readable by the government, who
knows which criminal will find a way to discover my credit card number and
billing address?  Companies won't be able to protect themselves or their
customers with encryption.

This bill should alarm any business with trade secrets.  Encryption is to
computers what locks are to filing cabinets and security guards are to the
front door.  The Soviet Union, France, and Iraq, to name just a few, already
devote a substantial effort to stealing American non-military technology.  It
doesn't make sense to prevent businesses from protecting themselves.

The fact is that individual people are the best guardians of their own
information.  Privacy is not just an important legal tradition -- it is also
a good crime deterrent.  The police will always dream of listening to the
hidden thoughts of criminals, but gaining the ability to scan every letter in
the country won't make a difference.
Even the stupidest crooks know enough not to spell out their plans on a
postcard.  The rest of us who have completely legal secrets, though, will be
left with no protection by Senate Bill 266.

------------------------------

End of RISKS-FORUM Digest 13.35
************************