RISKS-LIST: RISKS-FORUM Digest   Wednesday 7 September 1988   Volume 7 : Issue 45

        FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  Cheater software (Rodney Hoffman)
  Re: COMPASS REPORT (Nancy Leveson)
  Re: Risks Digest 7.44 (Jerome H. Saltzer)
  Display of telephone numbers (Bruce O'Neel)
  Telephones and privacy (C.H. Longmore)
  Gambling with video arcade machines (Mike Blackwell)
  Video Games (Ed Nilges)
  Wembley-on-the-Motown (Jeffrey R. Kell)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in
good taste, objective, coherent, concise, and nonrepetitious.  Diversity is
welcome.  CONTRIBUTIONS to RISKS@CSL.SRI.COM, with relevant, substantive
"Subject:" line (otherwise they may be ignored).  REQUESTS to
RISKS-Request@CSL.SRI.COM.
FOR VOL i ISSUE j / ftp kl.sri.com / login anonymous (ANY NONNULL PASSWORD) /
get stripe:risks-i.j ... (OR TRY cd stripe: / get risks-i.j ...
Volume summaries in (i, max j) = (1,46),(2,57),(3,92),(4,97),(5,85),(6,95).

----------------------------------------------------------------------

Date: 6 Sep 88 08:06:50 PDT (Tuesday)
From: Rodney Hoffman
Subject: "Cheater software"

From a story by Kim Murphy in the Sept. 3 'Los Angeles Times':

  General Dynamics Corp. was accused of using "cheater software" and other
  fraudulent practices to falsify tests and supply defective components for
  the U.S. Navy's Phalanx anti-missile gun system and the Standard Missile
  program.  In a lawsuit filed in Los Angeles federal court, one former and
  four current General Dynamics employees -- technicians, supervisors, and
  quality-control specialists -- accused the St. Louis-based defense
  contractor of encouraging its employees to engage in widespread test
  falsifications that may have compromised the integrity of the two weapons
  systems....

  A "cheater software" computer program allows company technicians to begin
  running a test, then abort it and obtain a passing reading, the suit
  contends....

  [more on other, non-software-related means used to falsify tests]

------------------------------

Date: Tue, 06 Sep 88 16:58:42 -0400
From: leveson@electron.LCS.MIT.EDU
Subject: "COMPASS REPORT in RISKS 7.40 (Bev Littlewood via Brian Randell)"
         (Re: RISKS-7.44)

Wait a minute.  I have been tarred with an opinion that I do not hold and
have never espoused.

>Nancy Leveson's remarks are just as bad, although possibly more amusing.
>Nancy told the story of her encounter with an engineer who wanted to know
>what number to fill in for the probability of the event labeled "software
>failure" on a fault tree.  Leveson tried to explain that there was no number
>but he persisted.  Finally she answered, "Just write 1.0."
>
>Leveson is LITERALLY correct - the software will ultimately fail with
>certainty - but the engineer is asking a responsible question.  He wants to
>know how frequently the software will fail so that he can make
>scientifically informed judgments to aid in the engineering decisions he
>must take.

I did not say his question was irresponsible, only that it is unanswerable at
this time with the confidence that he wants.  Since we cannot measure such
low reliability numbers, these types of systems should not be built to depend
on the correct operation of the computer; i.e., the fault tree should take
the conservative view of assigning 1.0 to the probability of failure of the
software, and the builders will therefore be required to show that the system
is safe even if the software fails.
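
  [A brief worked illustration of the conservative assignment described
  above -- the numbers and the small Python sketch below are hypothetical
  additions for clarity, not part of Leveson's message.  If the hazardous
  top event requires both the software and an independent non-software
  backup to fail, an AND gate in the fault tree gives

      P(hazard) = P(software fails) * P(backup fails).

  Assigning P(software fails) = 1.0 reduces this to P(hazard) = P(backup
  fails), so the safety argument must be carried entirely by the backup:

      # Hypothetical fault-tree AND gate, for illustration only.
      def and_gate(p_software_failure, p_backup_failure):
          # Independent basic events: both must occur for the hazard.
          return p_software_failure * p_backup_failure

      # Conservative view: treat software failure as certain.
      p_hazard = and_gate(1.0, 1.0e-4)
      print(p_hazard)   # 0.0001 -- the backup alone carries the safety case
  ]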
For example, a fully automated nuclear power plant safety system or
fly-by-wire aircraft might be considered safe enough if there are usable and
reliable back-up systems that do not rely on proper operation of a computer.
The problem with the Airbus A320 appears to be that they are relying on the
computers and not taking the need for back-up systems seriously.  The same,
unfortunately, is also true for the nuclear power plant design that the
engineer I was speaking with was evaluating.

>Thus far I suppose we are not in too great disagreement with the likes of
>Cullyer and Leveson.

How did my opinions on the subject (which were never stated in the original
message from Jon Jacky) and John Cullyer's get lumped together?  I am not
usually accused of agreeing with anyone :-).

>They seem to think that this is due in some way to defects in statistical
>methodology and that we should therefore have no more truck with statistics
>(and statisticians?).

I have never made such a statement or implied this.  I am in favor of
research in software reliability (and safety) measurement and of its use
currently for systems that do not involve loss of life and therefore do not
require very low numbers and high confidence.

>Improvement of statistical methodology will not be able to dent this
>problem.  But that does not mean that Cullyer-type 'certification' can be
>used instead (unless he means by this an assurance that the failure rate is
>ZERO - and I do not believe this is the case).  Rather it means that we are
>in a genuine impasse, and perhaps ought to face the unpalatable view that we
>should not be building systems which require a reliability which is not
>assurable.

Most systems are not 100% safe, nor does society require them to be.  We
usually require only an acceptable level of safety.  The problem is in Bev's
equating failure-free and safe.  Making systems failure-free would certainly
work, but it is not necessary.  For example, aircraft are not failure-free,
but they seem to have a safety level that is acceptable to society.  What we
need to eliminate is catastrophic failure modes, not necessarily ALL failure
modes.  If we had to do the latter, we would need to abandon much of current
technology.  I personally feel that there are other solutions (which I have
written about extensively) than just not building systems with computers.
That is, I believe we can build adequately safe systems without requiring
10**-9 reliability.  Unfortunately, not many software engineers know enough
about system safety to build such software systems, and the system safety
engineers do not understand computers.

------------------------------

Date: Tue, 6 Sep 88 13:47:33 EDT
From: Jerome H. Saltzer
Subject: Automatic Number ID: Great Idea! (RISKS-7.44)

In "Automatic Number ID: Great Idea!", Patrick Townson makes several good
arguments favoring Automatic Number Identification (ANI).  I agree that on
balance ANI will be a good thing once the novelty wears off and people become
accustomed to the new rules of the game.  But Townson may be carrying a good
argument a little too far when he says,

> As for legitimate reasons to not want your number displayed to
> the called party, I can't think of any.

I assume that he took that somewhat polar position in order to draw out
suggestions for legitimate reasons, so here are a couple of cases in which
maintaining the privacy of the caller does seem to make some sense:
1. Hotlines (e.g., drug-abuse and suicide) and police department tip numbers
depend on anonymity of the caller to perform a function that is usually
considered to have some value to society.  Some police departments maintain a
line separate from 911 (which often has an ANI feature) just for this
purpose.  If the caller of a hotline knew that the calling number would be
automatically recorded, at least some of the information that flows in this
way would dry up, and some of the help dispensed this way would not be.  (The
technique of dialing a prefix code to block automatic number identification
caters to this requirement.  I doubt that many hotlines would take Townson's
hard-nosed approach and refuse to accept a call from a prospective suicide
who has blocked ANI.)

2. When a private party calls on a "big organization" (for example, making
ten queries to stock trading companies about their commission rates in
anticipation of opening one account), there is an understandable preference
for not leaving one's number, simply to avoid unwanted followup calls (e.g.,
from hungry brokers).  Again, the ANI-blocking prefix satisfies this
requirement, because no hungry stockbroker is going to refuse a call that
sounds like it comes from a promising prospect.

Townson's polar position might be plausible if you assume telephones are
answered only by private individuals.  He is well-advised to refuse anonymous
calls to his bulletin board and welcome to refuse them at his private phone.
But I believe that the need for blocking ANI remains for other situations.

                                        Jerry

------------------------------

Date: Tue, 06 Sep 88 17:30:25 EDT
From: Bruce O'Neel
Subject: Re: Display of telephone numbers on receiving party's phone

I much prefer using a prefix (*21, say) only when you WANT the number to be
known, rather than when you DO NOT want the callee to see it.

                                        bruce

------------------------------

Date: Tue, 6 Sep 88 20:28+0100
From: C H Longmore
Subject: Re: Telephones and Privacy

Patrick Townson's article in RISKS 7.44 states:

> Having ANI implemented will simply make it too inconvenient for most of the
> low-life scum who hide behind their telephone to continue their practices.
> As for legitimate reasons to not want your number displayed to the called
> party, I can't think of any.  Again, you have to make the analogy of going
> to see someone in person.  It is completely unfair and unrealistic to say
> that you have the right to disturb someone at whatever they were doing and
> that they in turn have no right to demand to know who you are.

How could you apply this to a situation where [as in the UK] certain police
forces operate systems whereby people can give information to the police
*anonymously* by calling a device as simple as an answering machine?  How
could you apply it to a situation where a potential customer wishes to obtain
a quote by phone *without* running the risk of that company using the
information so gained to apply the hard-sell?  Can you imagine someone using
a confidential medical advice line (such as an AIDS advisory service) if
there was a possibility of the call being easily traced?  How many people
would telephone up the Samaritans if their number wasn't confidential?

In the UK these are not problems.... yet.  Our current telephone network is
not capable of supporting these features.... yet.

It *should* be possible to conceal your own telephone number from the person
you are calling....
however, it is also the right of the person receiving the call to refuse to
communicate with anybody who does not want his/her telephone number revealed.
The latter is easy enough to implement.... a simple user-settable switch on
the telephone is all that is needed.

The 'privacy' argument has two sides.... it is the right of an individual
*not* to have their phone number displayed, but it is also the right of the
individual *not* to answer anonymous calls.  A problem to which the solution
seems easy enough.... (now prove otherwise!)

Conrad H Longmore, Computer Science Dept, University of Birmingham,
Birmingham B15 2TT, UK.   email: CCAse7-16%multics.bham.ac.uk@cunyvm.cuny.edu

------------------------------

Date: Tue, 6 Sep 1988 17:34-EDT
From: Mike.Blackwell@ROVER.RI.CMU.EDU
Subject: Gambling with video arcade machines

I was once called to be an expert witness in a case involving gambling with
video poker machines.  The case never went to trial, but I did gain some
insight into how they work.

In a typical video card machine, you put in your quarter and earn one card
hand (five-card draw or blackjack are the most common).  You play the hand
(discarding and drawing new cards - no betting here, you just get one shot),
and at the end win so many points for your final hand (from zero for a bad
hand, one for two pair, two for three-of-a-kind, up to maybe 500 for a royal
flush).  In a real gambling machine (like in Vegas), you'll get quarters for
your points - they spit right out of the machine.  In a non-gambling arcade
machine, you just get free games for your points.  This is legal, no
different than winning free games or extra time at Space Invaders (in
Pennsylvania, at least, playing video poker is considered to require
skill...).

What makes the arcade-style card machine a gambling device is the addition of
a "knock-off" switch.  This is a switch, either directly on the machine or
wired from the bar, which zeros out any free games.  How it works is you rack
up your zillion free games, find the barkeeper (or whoever, but this seems to
usually take place in bars), and he'll pay you one quarter for each free game
you've won.  Then he'll clear the free games from the machine.  It is solely
the presence of this knock-off switch that classifies the machine as a
gambling device.  Apparently (though I was never able to confirm this),
simply power cycling the machine does not clear free games.  Even if a bar is
not caught paying off, the game distributor can be tried for providing
machines with knock-off switches.

                                        -m-

------------------------------

Date: Tue, 06 Sep 88 13:27:04 EDT
From: Ed Nilges
Subject: Video Games

How many computer professionals have noticed the continual technical
improvement of video games in the past couple of years, and the concomitant
decline of their social and moral content?  Nintendo and other Japanese
companies, with little knowledge of or care about the effect of racial
tensions on American cities, market games such as Ninja Warriors vs. Bad
Dudes, which feature a Caucasian-looking (or Japanese) hero fighting black
villains.  The video game user manipulates the characters in a seedy
back-alley environment featuring garbage cans and rats... an offensive
comment, on the face of it, on the state of the inner city.  Other games
allow players to act as contra rebels or air force pilots, without teaching
them either the misery of dying in a jungle or the difficulty of qualifying
to be an Air Force enlisted man, let alone a pilot.  The graphics are
beautiful: the content is vile.
Surely it's our responsibility as computer professionals to protest this
application of a technology which could improve lives.  While the Japanese
are to be applauded for making technology affordable, they are to be
condemned for taking technology developed in the US and using it to degrade
people in this way.  Perhaps the most effective thing for computer people to
do is to approach restaurant owners in their own community who operate such
machines and explain their concerns as professionals and, if applicable, as
parents.  Video game parlor operators are obviously not going to listen, but
restaurant owners may.

------------------------------

Date: Tue, 06 Sep 88 10:11:27 EDT
From: Jeffrey R Kell
Subject: Re: Wembley-on-the-Motown (RISKS 7.42)

(I play keyboards/synthesizer part-time aside from my "real" job...)

Any electronic musical equipment, especially anything constantly moved with a
traveling show, is far from being highly reliable.  The newest electronic
keyboards are probably the most sensitive of all.  You are fortunate to get
predictable results in a controlled (studio) environment, let alone have a
delicate piece of equipment be thrown around by the road crew, plugged into
unstable power sources, and subjected to varying temperature conditions.

Original synthesizers (Moog, Arp) with their knobs, switches, and patch cords
took some time to set properly, but they did STAY SET, other than perhaps
small tuning adjustments.  The second generation could record these settings
in small internal memories, thus making the first synthesizer "programs".
The third generation was practically all digital (notably Korg, Yamaha) with
settings programmed as parameters, and internal memory grew to several
K-bytes of RAM, often with battery backup and cassette backup.  The latest
generation recreates sounds from digital samples, much like a CD player.  The
Synclavier mentioned is one of the top-of-the-line of these types.

I own a Korg DSS-1 (standard disclaimers) which has 768K RAM and a 3.5-inch
diskette drive, and can turn out a 48 kHz sample rate at 14-bit resolution.
Very nice, but after one minor power glitch the memory is lost and the
machine must be rebooted (about 30 seconds, but rather annoying in the middle
of a song).  I was once stuck on an out-of-town job with two third-generation
Poly-800's which were packed by a member of the road crew and inadvertently
left turned on (they can operate from batteries alone), resulting in loss of
memory (with the cassette backup at home).

The risk of high-tech music can also be heard in CDs.  They sound superb when
they work, but go far off into the ozone when they fail.  Quite a far cry
from the brief bump of a phonograph needle skipping a groove.

I have caught myself longing for the days of the trusty piano which, with a
tuning or two a year, always got the job done.  At least in that respect, I
find it easier to relate to the techno-phobes who distrust automation.

Jeffrey R Kell, Dir Tech Services, Admin Computing, 117 Hunter Hall,
Univ of Tennessee at Chattanooga, Chattanooga, TN 37403

------------------------------

End of RISKS-FORUM Digest 7.45
************************