Subject: RISKS DIGEST 11.65
REPLY-TO: risks@csl.sri.com

RISKS-LIST: RISKS-FORUM Digest  Friday 10 May 1991  Volume 11 : Issue 65

   FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  RISKS Backlog (PGN)
  Draft International Standard on the safety of industrial machines (Martyn Thomas)
  Netware 286 Trojan Problem (John Graham-Cumming)
  Big Brother in the air (Andrew Koenig)
  "Bugs" (William Ricker)
  Now which train am I part of? (Mark Brader)
  Where justice is done ... (Herman J. Woltring)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in
good taste, objective, coherent, concise, and nonrepetitious.  Diversity is
welcome.  CONTRIBUTIONS to RISKS@CSL.SRI.COM, with relevant, substantive
"Subject:" line.  Others ignored!  REQUESTS to RISKS-Request@CSL.SRI.COM.
For vol i issue j, type "FTP CRVAX.SRI.COM<CR>login anonymous<CR>AnyNonNullPW<CR>
CD RISKS:<CR>GET RISKS-i.j<CR>" (where i=1 to 11, j always TWO digits).
Vol i summaries in j=00; "dir risks-*.*" gives directory; "bye" logs out.
The COLON in "CD RISKS:" is essential.  "CRVAX.SRI.COM" = "128.18.10.1".
<CR>=CarriageReturn; FTPs may differ; UNIX prompts for username, password.
ALL CONTRIBUTIONS CONSIDERED AS PERSONAL COMMENTS; USUAL DISCLAIMERS APPLY.
Relevant contributions may appear in the RISKS section of regular issues of
ACM SIGSOFT's SOFTWARE ENGINEERING NOTES, unless you state otherwise.

----------------------------------------------------------------------

Date: Fri, 10 May 91 10:34:38 PDT
From: "Peter G. Neumann"
Subject: RISKS Backlog

Well, I don't know what combination of circumstances caused it, but recently
there have been hundreds of messages pending in the queueueueueueue that I
had either flagged for consideration or in a few cases not even read, on
ATMs, cashless gas pumps, red and green clocks, electronic cash, automated
highways, droids, self-parking cars, SSNs, Dutch crackers, one-time
passwords, credit cards, etc., all of which appear to be further iterations
on previous iterations on iterations ... .  I trust you will bear with me for
arbitrarily cutting them off.  The RISKS mailbox had become enormous
(hovering around 500 undeleted messages), and because MM apparently REWRITES
the whole file at the drop of a hat, that consumes ghastly amounts of waiting
time and makes reading mail somewhat unpleasant.  Sometimes mail comes in
faster than MM can handle it and prevents me from even reading for minutes on
end -- even with a SPARCstation!  So, it was time.

Thanks to all of you who responded so diligently on those subjects and went
for so long without hearing what might have happened to your contributions.
If you feel your contribution would still be valuable and timely, please
revise and resubmit.  And thanks to all of you who have expressed
appreciation for my being such an effective filter so that YOU don't have to
filter through all of the third- and fourth-order interstitiations!  But the
no-pass filter is too antisocial, so I hope we can return to a more even
flow.  PGN

------------------------------

Date: Fri, 10 May 91 11:33:12 BST
From: Martyn Thomas
Subject: Draft International Standard on the safety of industrial machines

IEC TC4 WG3 has developed a draft international standard on electrical
equipment of industrial machines [sic].  This is being extended to cover
professional, leisure and domestic use of machines - so it could have very
wide significance.
I quote some comments on the use of software, from the latest draft (April
1991).  [Henceforth, my comments are in [], all else is text from the draft
standard.]

12.3.4 Software verification

Equipment using reprogrammable logic shall have means for verifying that the
software is in accordance with the relevant program documentation.

12.3.5 Use in safety-related functions

Programmable electronic equipment shall not be used for category 0 emergency
stop functions ...  [This is where stopping is achieved by immediate removal
of power to the machine actuators - each machine *must* be so equipped.]

For category 1 emergency stop functions [where power is available to the
actuators to achieve the stop, and is then removed] and all other
safety-related stopping functions, the use of hardwired electromechanical
components is preferred ...  These requirements shall not preclude the use of
programmable electronic equipment for monitoring, testing, or backing-up such
functions but this equipment shall not prevent the correct operation of these
functions.

NOTE: It is believed at present that it is difficult, if not impossible, to
determine with any degree of certainty in situations when a significant
hazard can occur due to the maloperation of the control system, that reliance
on correct operation of a single channel of programmable electronic equipment
can be assured.  Until such time that this situation can be resolved, [!!] it
is inadvisable to rely on the correct operation of such a single channel
device.

[So that is the perceived state of the art in standards: the problem is the
software (no mention of complexity being the root issue), and multiple
channels are seen to solve the problem (although the sections on redundancy
and diversity make no reference to software, so no guidance is given on their
relative benefits).  12.3.1 says that the IEC 65A standards shall be
followed, which may help, but I find it very depressing that our technology
should be so apparently immature that new standards cannot find a way to
define where and how it may be used.]

[Exercise for the reader: compare and contrast the approach of this standard
with UK Def Stans 00-55 and 00-56.]

Martyn Thomas, Praxis plc, 20 Manvers Street, Bath BA1 1PX UK.
Tel: +44-225-444700.   Email: mct@praxis.co.uk

------------------------------

Date: 8 Mar 91 20:49:08 GMT
From: John Graham-Cumming
Subject: Netware 286 Trojan Problem

I'm managing (at least for the moment) a Novell network running Netware 286.
I've recently realised that it is possible to pipe a file into the LOGIN
command.  This has the rather unfortunate effect that it is possible to write
a Trojan horse which simulates login (it was quite easy in GW-BASIC plus a
batch file on my system) and which does not need to print the almost-standard
"Access denied." type of message while pretending that a user has incorrectly
typed his password; it doesn't need to pretend at all, since it is possible
to write a program that steals the password and then performs a successful
login.  This makes the Trojan horse very hard to detect.

Has anyone else had a similar problem with piping and LOGIN?  Any simple
solutions?
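The crux is that LOGIN reads a redirected standard input just as happily as
it reads the keyboard.  Purely as an illustration -- this is not Novell's
code, and whether Netware's own LOGIN can be made to behave this way is
precisely the open question above -- here is a minimal sketch of the kind of
"is my input really a keyboard?" check a login-style program could apply.  It
assumes a POSIX-style isatty(); under DOS the equivalent IOCTL
character-device query would be used instead.

    /*
     * Minimal sketch, not Novell's LOGIN: one way a login-style program
     * can refuse scripted (piped or redirected) input.  isatty()/fileno()
     * are POSIX; a DOS/Netware utility would need the corresponding DOS
     * IOCTL "is this handle a character device?" call.
     */
    #include <stdio.h>
    #include <unistd.h>     /* isatty() */

    int main(void)
    {
        if (!isatty(fileno(stdin))) {
            /* Standard input is a pipe or a file, not a keyboard. */
            fprintf(stderr, "LOGIN: interactive terminal required.\n");
            return 1;
        }

        /* ... normal interactive username/password prompting follows ... */
        puts("stdin is a terminal; proceeding with interactive prompts.");
        return 0;
    }

A Trojan that captured a password and then tried to replay it into a LOGIN
protected by such a check would be stopped at the first test, and would have
to fall back on the detectable "Access denied." trick.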
John Graham-Cumming, Oxford University Computing Laboratory, Programming
Research Group, 8 - 11 Keble Road, Oxford UK

------------------------------

Date: Wed, 8 May 91 23:54:21 EDT
From: ark@europa.att.com (Andrew Koenig)
Subject: Big Brother in the air

The May 1, 1991 issue of The Aviation Consumer (a newsletter for pilots and
airplane owners published by Belvoir Publications in Greenwich, Connecticut)
has a remarkable article about transponders in general aviation aircraft.

Airplanes, especially little ones, are mediocre radar reflectors because
they're so far away from the radar transmitter and have lots of smooth
curves.  To make them easy to track, people install transponders.  A
transponder contains a receiver that picks up a radar pulse from the ground
and a transmitter that blasts out a much stronger pulse in response.  It also
sends out twelve bits of data that the pilot can set by twiddling four knobs
to select octal digits.  This is called a "squawk code" and is a way of
ensuring positive identification by ATC.  When a pilot first calls a
controller on the radio, the controller usually responds with something like
"squawk 1776," which is a request for the pilot to set 1776 into the
transponder dials.  Presumably no other aircraft in the area is squawking
1776 (you're supposed to squawk 1200 if not talking to ATC), so the ATC radar
can unambiguously track that particular airplane.  Indeed, transponders are
required in high-volume airspace.

What the Aviation Consumer article says is that some pilots have found that
their transponders have been modified so that they continue to operate even
when turned off.  Moreover, when in that state, they squawk a particular code
unrelated to what's in their dials.  Further research revealed that the
modifications were made by US Customs agents as part of the "war on drugs."
The idea is that the Customs agents find airplanes that they think are likely
to be used for drug smuggling and then modify those airplanes' transponders
to make the airplanes easier to track.  In particular, if the pilot flies the
airplane with the transponder switched off, the magic squawk code sets off
all kinds of alarm bells in the ATC office.

They were unable to find out whether these clandestine transponder
modifications were pursuant to court orders, or whether Customs was just
targeting, say, a randomly selected population of Cessna 210 aircraft in
southern Florida.

[One point of interest to RISKS that might not be obvious is its close
philosophical connection to the current debate about trapdoors in encryption
systems.]

------------------------------

Date: Thu, 9 May 91 13:26:58 EDT
From: wdr@wang.com (William Ricker)
Subject: "Bugs" (was Re: Fences, bodyguards, and security)

"We" /qua/ NATO may trail Fleming's Q's department.  However, the erstwhile
DDR had a very high-tech bug factory, which is one of the few eastern German
factories successfully converting to the world market, I hear.  The secret
police's bug factory looked at its capabilities when unification came (so the
radio reports say -- BBC? Christian Science Monitor? NPR? I forget) and
decided to go into the hearing-aid business, since their miniaturization
technology, previously classified, beat anything the West's hearing-aid
businesses had -- so they had some chance of maintaining their salaries &
volume.

Given some of the really small things already on the US market in hearing
aids, if they really had *those* beat, their bugs *would* have been
comparable to Bond's.  I haven't heard if they made a go of it.
/s/ Bill Ricker    wdr@wang.wang.com

------------------------------

Date: Thu, 9 May 1991 23:43:00 -0400
From: msb@sq.com (Mark Brader)
Subject: Now which train am I part of?

[I am forwarding this exchange to Risks; I don't have permission to identify
the original writers.  The second writer drives trains for a major North
American railway -- Mark Brader]

> The coal unit trains that operate around here (Cleveland OH) use
> radio controlled mid-train helpers.  When it's working right, the
> radio control keeps everything in synch.  When it's not, it makes
> quite an interesting mess, viz. Rockport yard a year or two ago -
> something went askew, and the helpers kept on doing their thing when
> they shouldn'ta!  Ground through the ball of the rail, among other things...

When Locotrol I was first introduced to [railway], the system required a code
number on the head-end to match the receiver on the slaves.  The code number
was any number chosen by the shop staff (that was the theory anyway).

One day, a coal train was in the siding somewhere near [place in mountains].
He was sitting there, luckily, with a full train brake set.  As it happened,
someone had picked the same code number on two sets of remote equipment, and
the guy approaching the siding was in Throttle 8 climbing a hill.  The slaves
on the guy in the siding picked up the Throttle 8 signal and tried to push
the train out of the siding!  They eventually (after grinding a few moons in
the rail) got an overspeed and shut down... long day for the crew on that
one...

------------------------------

Date: Fri, 10 May 91 18:14 MET
From: "Herman J. Woltring"
Subject: Where justice is done ...

The recent commotion about the Dutch crackers/hackers might have been placed
in a better context at the outset if the following news analysis in the Dutch
daily "NRC Handelsblad" of 23 April 1991 had been quoted.  It was written by
Frank Kuitenbrouwer, standing legal commentator with that journal and
(former) visiting professor of Law at Utrecht University, The Netherlands
[and included here with his knowledge].  Under the heading "Computer Crime
Bill gives the Justice Department too much elbow room", Mr Kuitenbrouwer
wrote as follows [in Dutch -- the translation is my responsibility -- HJW]:

Amsterdam, 23 April.  Are The Netherlands lagging behind in the fight against
computer crime?  This suggestion was made in an article in the New York Times
daily of last weekend, where the alarm bell was sounded about Dutch "hackers"
(i.e., crackers of computer systems) who allegedly had burglarized American
Defense systems by remote entry from the university system SURFnet
["Samenwerkende Universitaire Reken Faciliteiten" -- Cooperating University
Computation Facilities, a private foundation with central offices in Utrecht,
albeit not at Utrecht University -- HJW].  This contemporary student exercise
featured on Dutch TV a few months ago without causing too much commotion.

"You see?", the Americans must think, "computer crime in The Netherlands is
handled in the same way as drugs", but that is -- like the situation with
drugs -- too simple a form of reasoning.  Indeed, Dutch Law is silent on the
item of "computer peace disturbance", while Germany, France, and recently
also the U.K. have introduced new provisions in their penal codes.
However, these were rather incidental adjustments of the Penal Code, while
The Netherlands are working on a Total Program to guide the Dutch Penal Code
into the digital era in one move, including new authorizations for the police
and for the Department of Justice.  The relevant Bill is already under
consideration in Parliament, including a proposal to declare Computer
Cracking a criminal offence.

Whether such a special rule will have much effect is another matter.  Many
years ago a student at the former Twente University of Technology [now a full
University in the North-East of The Netherlands -- HJW] explained man's
irrepressible tendency to browse around in each other's computer homework in
terms of "the instinct that leads man to play Mastermind".  In 1987, an
American criminologist, Erdwin H. Pfuhl from Arizona State University,
qualified the 46 computer crime laws enacted between 1975 and 1985 in the
U.S.A. as "symbolic legislation".  This was no exaggeration: by 1987, not a
single prosecution had been initiated on the basis of the new provisions.
However, Pfuhl found 58 criminal cases on computer misuse based on existing
rules such as forgery.  Also in The Netherlands, this has proven possible.

Pfuhl mentioned some interesting reasons for this lack of impact of the Penal
Code:

-- Incidentalness: Computer misuse continues to be a rather rare event;
   curiosity battles with disapproval for priority.

-- Positive image of the perpetrators.  The term "Rust effect" is sometimes
   used (after the young German sports aviator who outwitted Russian Air
   Defense); the serious nature of the reproach is tempered by appreciation
   for the stunt.  Actually, many a serious informatician started out as a
   hacker: "Experience as a hacker is a valuable asset on a CV", according to
   the Dutch computer journal Computable.

-- The victims are usually institutions and thus don't strongly appeal to
   one's imagination.  Besides, although most publicity is directed at the
   hackers, it would appear that by far most computer misuse originates from
   within, from one's own organisation or enterprise.  In this way, all
   computer misuse is brought within the gray zone of white-collar crime and
   of the concomitant company cultures.

-- The element of play.  Already the Dutch author Johan Huizinga taught in
   his "Homo Ludens" [the Playing Man (1938); Huizinga was a historian and
   professor, first in Groningen and later in Leyden -- HJW] that playing has
   no moral function: it resides *outside* the conventional antitheses: wise
   or stupid, true or false, good or bad.  (Government) automatization a
   game?  Indeed, one representative spoke about "young branches of sports"
   during the parliamentary deliberations of 15 February 1986.  Progress in
   automatization is often said to depend on unconventional solutions, on
   probing borderlines, including normative ones.

-- Own fault.  Lack of elementary precautions in the systems under attack.
   Last year, the German Accounting Office investigated an army computer
   facility.  On the roof, an antenna was found that led to the basement, but
   nobody could explain the thing's purpose.  Even though unauthorised access
   was duly logged by the system, those signals were swamped in more than
   1000 reports per 24 hours, according to system personnel.

The Netherlands wish to make penalization of computer peace disturbance
dependent on the requirement of a "clear threshold"; this offense can only
lead to a conviction if the attacked system is secure [cf.
"Computers at Risk -- Safe Computing in the Information Age", US National Research Council, National Academy Press, Washington DC 1991, ISBN 0-309-04388-3 -- HJW]. Thus, the ball is bounced back in the case of the attacked American computers. Is this not again one of those typical Dutch idiosyncracies? Not at all: back in 1985, the Federal Republic of Germany included the requirement of "special security" in its penalization of computer peace disturbance. In that country, the relativity of the computer crime provisions became most obvious in the case of the three hackers who "by order of the KGB" [the quotes are mine -- HJW] had burglarized into American networks. Early 1990 they were convicted to conditional punishments only [cf. Clifford Stoll's "Stalking the Wily Hacker", Comm. of the ACM 31(5), 484-497, May 1988, reprinted in Dunlop & Kling, "Computerization and Controversy -- Value Conflicts and Social Choices, AP 1991 -- HJW]. The judge did not even consider computer peace disturbance and confined himself to third-rate espionage only. Even in the U.S.A., Robert Morris -- who virtually brought a large research network to a stand-still in November 1988 because of of a virus prank that went out of control -- got off with a conditional punishment. Precisely because of the symbolic nature of many computer crime laws, there is rather much reason to fear that the ambitious way the matter is tackled in The Netherlands may be worse than the problem. Certainly, the investigatory authorizations for the authorities go much further than, e.g., the recommendations of the Council of Europe. There is a real danger that these authorizations will be used by the Justice Department for some extensive fishing in the electronic data processing pond [this alarm was already rung on 20 May 1988 in the same journal by Richard de Mulder, professor of criminal law at Erasmus University in Rotterdam, chairperson of the task force on computer crime of the Netherlands Society for Computers and Law -- HJW]. Such fishing does not have to be confined to direct misuse of information technology but could concern anything that the Justice Department finds interesting. Even the most spectacular computer hack threatens to pale before the fishing expeditions made possible under the Dutch Computer Crime Bill. ---------- I might add that the latter point becomes particularly striking with the recent RISKS posting on allegedly illegitimate house searches and impoundings by the USA's Secret Service, even more so as the Dutch Bill provides the gouvernment with extensive rights to protect its own secrets. At a previous occasion, I intimated that those provisions remind more of the U.K.'s Official Secrets Acts (where everything is secret unless officially released, with official `D-notices' sent to editors etc. if something has leaked out) than of the USA's Freedom of Information Act pursuant to its First Amendment under the Bill of Rights, Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech or of the press; or the right of the people peaceably to assemble and to petition the Government for a redress of grievances (Art. 1). In my mind, `freedom of speech and of the press' presupposes equitable access to relevant information: without such access, this freedom becomes an empty shell. 
While I am not an admirer of some of the ways things are run in North America
(where I lived for two years, with one daughter born there who now has dual
citizenship), I find this provision particularly appealing.  Fortunately, the
constitutions of both countries provide sufficient leeway to partake in the
legislative debate before the government goes fully out of control.  However,
denouncing the authorities as `the bad guys' should not diminish concern for
other, less public invaders of our electronic privacy!

Going back to the networking situation, there are a few datasets on
(im)proper rules of conduct; those who are interested in scanning them may
wish to send the following line to NETSERV@BITNIC.BITNET in New York:

   GET CONDUCT CODE

For more commercially oriented rules, you might send the following three
lines to LISTSERV@BITNIC.BITNET:

   GET LEGAL COMMERCE
   GET LEGAL GTDA
   GET LEGAL COUNSEL

In particular, the LEGAL COMMERCE file from the US Department of Commerce
states that remote computer access in BITNET is possible for file transfer
only (using commands like the above GET filename filetype), but not for
remote login.  Thus, hacking on BITNET seems possible only for those files
that have not been protected against remote retrieval, similar to ftp on the
Internet.  I am not certain about the EARN situation, but SURFnet certainly
provides the possibility of remote login via telnet etc., both nationally and
internationally, leaving it to the individual nodes' discretion to inhibit
this feature locally, either in general or from specific sources.  This, in
my mind, is the proper approach; however, some care may be necessary in the
long run to prevent hackers from outsmarting the protection schemes.
Otherwise, one may become exposed to tricks similar to those that have been
successful in obtaining free long-distance calls on the telephone system.

Herman J. Woltring, Eindhoven/NL

------------------------------

End of RISKS-FORUM Digest 11.65
************************