4-Dec-85 17:21:08-PST,12726;000000000000
Mail-From: NEUMANN created at 4-Dec-85 17:16:59
Date: Wed 4 Dec 85 17:16:59-PST
From: RISKS FORUM (Peter G. Neumann, Coordinator)
Subject: RISKS-1.26
Sender: NEUMANN@SRI-CSL.ARPA
To: RISKS-LIST@SRI-CSL.ARPA

RISKS-LIST: RISKS-FORUM Digest   Wednesday, 4 Dec 1985   Volume 1 : Issue 26

FORUM ON RISKS TO THE PUBLIC IN COMPUTER SYSTEMS
Peter G. Neumann, moderator

Contents:
  Humility (Matt Bishop)
  Reliable Computer Systems (Jim Horning)
  Electromagnetic Interference (Peter G. Neumann)
  Hackers (Thomas Cox)
  "The Hacker Game": Is it simulating security of *REAL* machines? (Ted Shapin)
  Unexpected load on telephone trunks (Ted Shapin)

Summary of Groundrules: The RISKS Forum is a moderated digest. To be distributed, submissions should be relevant to the topic, technically sound, objective, in good taste, and coherent. Others will be rejected. Diversity of viewpoints is welcome. Please try to avoid repetition of earlier discussions.
(Contributions to RISKS@SRI-CSL.ARPA, Requests to RISKS-Request@SRI-CSL.ARPA)
(FTP Vol 1 : Issue n from SRI-CSL:RISKS-1.n)

----------------------------------------------------------------------

Date: 2 Dec 1985 0926-PST (Monday)
From: Matt Bishop
Organization: Research Institute for Advanced Computer Science
Address: Mail Stop 230-5, NASA Ames Research Center, Moffett Field, CA 94035
Phone: (415) 694-6363 [main office], (415) 694-6921 [my office]
Current-Disease: Chocoholism
To: RISKS@SRI-CSL.ARPA
Subject: Humility

In Risks 1.25, you wrote a very good article pleading for greater humility. I'd like to add a little to that.

Very often a solution is proposed which alleviates the symptom, but aggravates the cause, of the problem. (Draw your own examples, folks -- the best ones are political, and I'm not touching THOSE with a ten-foot pole!) Unfortunately, those are often the most appealing because they let us forget, for a time, that the problem exists. When it returns, the symptoms are different but the root cause is still there -- and more rotten than ever.

As another thought, I've found that in order to ask the question that leads to a solution for a problem you have to know most of the answer already -- it's merely a matter of synthesizing the various parts into a whole. (As an example, Riemannian geometry existed before Einstein put it to use; it was a mathematical toy, done to prove the Fifth Postulate was just that, a postulate.) But for all non-technical problems, science alone cannot provide the answers -- it can provide techniques for solving the technical components, but no more. And when people forget this, disaster follows, because science is used to treat the result, rather than the cause.

(Incidentally, "science" is not the culprit. The same thing happens in spheres where science takes a back seat to ethics and morality -- and what I said still applies. No one discipline can provide a complete answer to any non-technical problem. Unfortunately, an incomplete, but complete-looking, answer can usually be obtained from any discipline -- and this is what we must avoid doing!)

Matt

------------------------------

From: horning@decwrl.DEC.COM (Jim Horning)
Date: 2 Dec 1985 1354-PST (Monday)
To: RISKS@SRI-CSL.ARPA
Subject: Reliable Computer Systems

Although reliability is only part of risk assessment, it is an important one. I would like to bring to the attention of this forum a book to which I made a modest contribution.

``Reliable Computer Systems: Collected Papers of the Newcastle Reliability Project,'' edited by Santosh K.
Shrivastava, Springer-Verlag, 1985, xii + 580 pages, ISBN 0-387-15256-3 (New York) and 3-540-15256-3 (Berlin).

This volume brings together in one place more than 30 papers by more than 20 authors reporting more than a decade of research on reliability. It contains papers that survey the issues, define terminology, propose partial solutions, and assess the state of the art.

Jim H.

-------------------------

From the introduction by Brian Randell:

"The origins of the project can be readily traced back ... to my participation in the 1968 NATO Conference on Software Engineering. Quite a number of the attendees have since remarked on the great influence this conference had on their subsequent work and thinking. This was certainly true in my case. ... One major theme of the conference was the great disparity between the level of reliance that organizations were willing to place on complex real time systems and the very modest levels of reliability that were often being achieved -- for example, it was also at about this time that there was considerable public debate over the proposed Anti-Ballistic Missile System, which we understood was to involve relying completely on a massively complicated computer system to position and detonate a nuclear device in the upper atmosphere in the path of each incoming missile!

"At the NATO Conference there was thus much discussion about improved methods of software design, though there was a mainly implicit assumption that high reliability was best achieved by making a system fault-free, rather than fault-tolerant. Another much-debated topic concerned the practicality of attempting to provide rigorous correctness proofs for software systems of significant size and complexity. Such discussions, I am sure, played a large part in ensuring that ... I was seeking to do something constructive about the problems of achieving high reliability from complex computing systems, and yet, was feeling rather pessimistic about the practicality of proving the correctness of other than relatively small and simple programs. ...

"From the start, our aim was to study the general problems of achieving high reliability from complex computing systems, rather than concentrate on problems specific to a particular application area or make of computer. Quoting from the original project proposal: `The intent is to investigate problems concerned with the provision of reliable service by a computing system, notwithstanding the presence of software and hardware errors. The approach will be based on the development of computer architecture and programming techniques which facilitate the structuring of complex computing systems so that the existence of errors can be detected and the extent of their ramifications be determined automatically, and so that uninterrupted service (albeit probably of degraded quality until the faulty hardware or software is repaired) can be provided. ... It is clear that for the foreseeable future, the designers of large-scale computing systems will not be able to achieve adequate system reliability by depending entirely on the reliability of the hardware and software components which make up their system.' ...

"We started by studying the problems of difficult faults in (relatively) simple systems and then gradually increased the difficulty of the systems that we were prepared to consider."
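By way of illustration -- the following sketch is mine and is not taken from the book -- one technique closely associated with the Newcastle project is the "recovery block": a primary routine is tried first, its result is checked by an acceptance test, and if the test fails the saved state is restored and an alternate routine is run, so that service continues, perhaps in degraded form. The sorting routines and the acceptance test below are hypothetical stand-ins, written in C:

    /* A minimal sketch of a recovery block (illustrative only): try the
     * primary routine, apply an acceptance test, and on failure restore
     * the saved state and fall back to an alternate routine.            */
    #include <stdio.h>
    #include <string.h>

    #define N 5

    /* Acceptance test: is the table in nondecreasing order? */
    static int acceptance_test(const int *a, int n)
    {
        int i;
        for (i = 1; i < n; i++)
            if (a[i - 1] > a[i])
                return 0;
        return 1;
    }

    /* Primary routine: stands in for a fast but possibly faulty algorithm.
     * Here it deliberately does nothing, so the acceptance test will fail. */
    static void primary_sort(int *a, int n)
    {
        (void) a;
        (void) n;
    }

    /* Alternate routine: a simple, trusted insertion sort used as fallback. */
    static void alternate_sort(int *a, int n)
    {
        int i, j, t;
        for (i = 1; i < n; i++)
            for (j = i; j > 0 && a[j - 1] > a[j]; j--) {
                t = a[j];
                a[j] = a[j - 1];
                a[j - 1] = t;
            }
    }

    int main(void)
    {
        int data[N] = { 3, 1, 4, 1, 5 };
        int saved[N];
        int i;

        memcpy(saved, data, sizeof data);      /* establish a recovery point */
        primary_sort(data, N);                 /* try the primary routine    */
        if (!acceptance_test(data, N)) {       /* primary failed its test... */
            memcpy(data, saved, sizeof data);  /* ...restore the prior state */
            alternate_sort(data, N);           /* ...and try the alternate   */
        }

        for (i = 0; i < N; i++)
            printf("%d ", data[i]);
        printf("\n");
        return 0;
    }

The point is the structure rather than the sorting: the acceptance test only has to be good enough to reject clearly wrong results, which is a much weaker requirement than proving the primary routine correct.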
------------------------------

Date: Wed 4 Dec 85 17:11:18-PST
From: Peter G. Neumann
Subject: Electromagnetic Interference
To: RISKS@SRI-CSL.ARPA

On page 30 of the Dec 85 issue of the IEEE Spectrum is an article entitled "Taming EMI in microprocessor systems". It begins as follows.

It was late one summer afternoon in 1983, near the end of the day shift at a steel plant in the eastern United States. An operator using a new radio link was guiding the last ladle of molten steel as it moved along an overhead track from the blast furnace to the ingot molds. Soon the end-of-shift horn would sound. Without warning, the ladle tipped prematurely as it neared the molds, pouring hot steel on the floor and on some of the workers. One worker was killed, and four were seriously injured in this accident.

After an investigation, electromagnetic interference (EMI) was blamed for the tipping of the ladle. Reflections from a scaffolding reinforced the field produced by the transmitter's antenna, producing a signal that was received by a drop cord acting as an antenna and interpreted by the cord's switch circuitry, which triggered the switch circuit to pour the molten steel. This sounds a little like Rube Goldberg, but is another example of the EMI problems discussed in RISKS-1.19 and revisited in .23 and .24.

PGN

------------------------------

Date: Tue, 3 Dec 85 18:58:11 cst
From: ihnp4!gargoyle!sphinx!benn@ucbvax.berkeley.edu (Thomas Cox)
To: ihnp4!risks
Subject: Hackers

Over the course of the year I have been reading thousands of articles from all print sources that relate to computers. Hundreds have been on crime and security. Dealing with these articles is part of a job I hold.

The "cracking" done by so-called hackers, i.e. young computer hobbyists, falls into a very few categories:

  1. a non-password-secured system is "broken into".
  2. a password is stolen.
  3. a credit card number is transmitted VIA COMPUTER for illegal use.
  4. a particular string of crackings was perpetrated by some young people who systematically searched for computer mainframes of a certain make and model. They then used the factory-installed password "system" (or some such) that was not removed by the end-user support staff.
  5. bypassing telephone company billing circuits to make "free" calls.

Please notice that

  1. no password-protected system is EVER likely to be broken into by so-called hackers. They can sit and guess, just like they can try and guess the combination to my bike lock. I'm not worried about it.
  2. most so-called computer crime has been nothing other than the TRANSMISSION of illegally obtained credit card numbers, Sprint account numbers, and the like.

The claims regarding "encouraging kids to hack" are simply garbage. Encourage them all you want. They will eventually give up, because it isn't going to work. That is what the behavioral scientists call "extinction" of a behavior: it never works, and eventually isn't repeated any more. Like trying to jump so you can fly.

Sincerely,
Thomas Cox
...ihnp4!gargoyle!sphinx!benn
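As a rough illustration of the arithmetic behind point 1 -- the alphabet size, password length, and guessing rate below are assumptions chosen for the example, not figures from the message above -- even a short, lowercase-only password presents a guesser with an enormous search space:

    /* Back-of-the-envelope sketch of exhaustive password guessing.
     * All figures are illustrative assumptions.                    */
    #include <stdio.h>

    int main(void)
    {
        double alphabet = 26.0;        /* lowercase letters only (assumed)   */
        int    length   = 6;           /* six-character password (assumed)   */
        double rate     = 1.0;         /* guesses per second, by hand        */
        double combos   = 1.0;
        double seconds_per_year = 365.0 * 24.0 * 3600.0;
        int i;

        for (i = 0; i < length; i++)
            combos *= alphabet;        /* 26^6 is about 3.1e8 possibilities  */

        printf("possible passwords: %.0f\n", combos);
        printf("years to try half of them: %.1f\n",
               (combos / 2.0) / rate / seconds_per_year);
        return 0;
    }

Of course, as the moderator's note below observes, exhaustive guessing is hardly the only way into a system; default passwords and unprotected systems (two of the categories listed above) require no search at all.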
  [This is likely to generate various responses -- from hackers whose good name is being besmirched, from serious crackers and secure operating systems folks who know that most operating systems are so vulnerable that they can be cracked in many ways other than those mentioned above, and more. But you should all beware of the head-in-the-sand view that everything is just fine <"There are no real risks!">, which you might continue to hold until you are ostrichized for shortsightedness following a technologically based break-in.  PGN]

------------------------------

Date: Mon 2 Dec 85 10:34:02-PST
From: Ted Shapin
Subject: "The Hacker Game": Is it simulating security of *REAL* machines?
To: CMP.WERNER@R20.UTEXAS.EDU
cc: risks@SRI-CSL.ARPA, human-nets@RED.RUTGERS.EDU
Phone: (714)961-3393; Mail: Beckman Instruments, Inc.

No, I heard the game is a maze-type game, not a simulation of security on any real system. The advertisement is just hype to sell the game.

Ted.

------------------------------

Date: Tue 3 Dec 85 14:07:40-PST
From: Ted Shapin
Subject: Unexpected load on telephone trunks
To: risks@SRI-CSL.ARPA, telecom@MIT-XX.ARPA
Phone: (714)961-3393; Mail: Beckman Instruments, Inc.
Mail-addr: 2500 Harbor Blvd., X-11, Fullerton CA 92634
Message-ID: <12164247581.25.BEC.SHAPIN@USC-ECL.ARPA>

In the previous posting, which I forgot to include here, a complaint was raised about BYTEnet Listings, BYTE magazine's BBS system, indicating that the person was unable to get through.

Only too true! BYTEnet Listings was responsible for shutting down the long-distance access to the ENTIRE state of New Hampshire some months ago, due to the enormous number of calls they received. To get the Public Domain HOPE or PROLOG, you should first try your local BBS systems, or the BBS run by Computer Languages magazine...

  Mike Farren
  uucp: {dual, hplabs}!well!farren
  Fido: Sci-Fido, Fidonode 125/84, (415)655-0667
  USnail: 390 Alcatraz Ave., Oakland, CA 94618

------------------------------

End of RISKS-FORUM Digest
************************
-------