Date: Wed 10 Sep 86 08:27:56-PDT
From: RISKS FORUM (Peter G. Neumann -- Coordinator) <RISKS@CSL.SRI.COM>
Subject: RISKS-3.53
Sender: NEUMANN@CSL.SRI.COM
To: RISKS-LIST@CSL.SRI.COM

RISKS-LIST: RISKS-FORUM Digest, Wednesday, 10 September 1986  Volume 3 : Issue 53

        FORUM ON RISKS TO THE PUBLIC IN COMPUTER SYSTEMS
  ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  Hardware/software interface and risks (Mike Brown)
  More on Upside down F-16s (Mike Brown)
  "Unreasonable behavior" and software (Gary Chapman)
  Re: supermarket crashes (Scott Preece)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in
good taste, objective, coherent, concise, nonrepetitious.  Diversity is
welcome.  (Contributions to RISKS@CSL.SRI.COM, Requests to
RISKS-Request@CSL.SRI.COM.)
(Back issues Vol i Issue j available in CSL.SRI.COM:<RISKS>RISKS-i.j.
Summary Contents in MAXj for each i; Vol 1: RISKS-1.46; Vol 2: RISKS-2.57.)

----------------------------------------------------------------------

Date: Tue, 9 Sep 86 09:48:31 edt
From: mlbrown@nswc-wo.ARPA
To: risks@csl.sri.com
Subject: Hardware/software interface and risks

In RISKS 3.51 Bill Janssen writes of errors made in failing to consider the
interaction of the hardware and the software under design.  This failing was
all too common in the writing of assembly and machine code during the early
days of programming.  Discrete-wired machines often had OP codes that were
not generally well known (i.e., the computer designers kept them secret).
Interestingly, these unknown OP codes were carried along when more modern
machines emulated the original discrete designs; an excellent example is the
old IBM 4-PI CP-3 and its emulation on the IBM 360.

The hardware/software interface within the machine can create significant
problems from both a software-safety and a software-security standpoint.
Software designers will have to have an increasingly detailed knowledge of
the total system to produce the safe, secure software that critical systems
will require.

------------------------------

Date: Tue, 9 Sep 86 10:00:00 edt
From: mlbrown@nswc-wo.ARPA
To: risks@csl.sri.com
Subject: More on Upside down F-16s

In RISKS 3.52 Jon Jacky writes:

> ... it sounds like the right solution is to remind the pilots not to attempt
> obviously destructive maneuvers.  ... if you take the approach that the
> computer is supposed to check for and prevent any incorrect behavior, then
> you have saddled yourself with the task of enumerating every possible thing
> the system should not do.

Perhaps a solution is to remind the pilots not to attempt obviously
destructive maneuvers; however, relying on procedures to eliminate or reduce
the risk of hazards is the least acceptable approach.  Pilots are human and
as such are prone to making errors.  Look at the safety records for general
aviation and the Navy -- both are dismal, and the accidents are often
attributed to pilot error.  It's fine to tell the pilot "Lower your wheels
before you land, not after," but we still have gear-up landings.

We should not concern ourselves with checking for and preventing any
incorrect behavior, but we should preclude the behavior that will result in
damage to or loss of the aircraft or the pilot.  Nor do we need to anticipate
every possible mistake the pilot can make in this regard; all we need to do
is identify the hazardous operational modes and prevent their occurrence.
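To make the distinction concrete, here is a minimal sketch in C of the
"hazardous operational mode" approach.  Every name and threshold below is
invented for illustration and is not drawn from any actual flight software;
the point is only that the checks name the hazardous states themselves, not
the pilot errors that might produce them.

    /* Illustrative sketch only -- hypothetical state variables and
       limits, not real avionics code. */

    #include <stdbool.h>

    struct aircraft_state {
        double roll_deg;        /* bank angle; near +/-180 = inverted */
        double radar_alt_ft;    /* height above ground                */
        double sink_rate_fps;   /* positive = descending              */
        bool   gear_down;
    };

    /* Hazardous mode 1: releasing stores while inverted drops them
       back onto the aircraft.  Inhibit release in that state; do not
       try to enumerate the maneuvers that could lead to it. */
    bool release_permitted(const struct aircraft_state *s)
    {
        return s->roll_deg > -90.0 && s->roll_deg < 90.0;
    }

    /* Hazardous mode 2: a gear-up touchdown.  Again the check names
       the hazardous state, not the reason the pilot forgot. */
    bool gear_up_landing_imminent(const struct aircraft_state *s)
    {
        return !s->gear_down
            && s->radar_alt_ft < 500.0
            && s->sink_rate_fps > 10.0;
    }

Two small predicates cover the two known hazards; no attempt is made to
anticipate every possible pilot mistake.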
Mike Brown, Chairman, Triservice Software Systems Safety Working Group

------------------------------

Date: Tue, 9 Sep 86 14:28:24 pdt
From: Gary Chapman
Subject: "Unreasonable behavior" and software
To: RISKS@CSL.SRI.COM

Jon Jacky wrote:

    I detect a notion floating about that software should prevent any
    unreasonable behavior.  This way lies madness.  Do we have to
    include code to prevent the speed [of an F-16] from exceeding
    55 mph while taxiing down an interstate highway?

I certainly agree with the thrust of this.  But we should note that there is
plenty of evidence that coding in prohibitions on unreasonable behavior will
be required, particularly in the development of "autonomous" weapons that are
meant to combat the enemy without human "operators" on the scene.

Here's a description of a contract let by the United States Army Training and
Doctrine Command (TRADOC), Field Artillery Division, for something called a
"Terminal Homing Munition" (THM):

    Information about targets can be placed into the munitions
    processor prior to firing along with updates on meteorological
    conditions and terrain.  Warhead functioning can also be selected
    as variable options will be available.  The introduction of VHSIC
    processors will give the terminal homing munitions the capability
    of distinguishing between enemy and friendly systems and finite
    target type selection.  Since the decision of which target to
    attack is made on board the weapon, the THM will approach human
    intelligence in this area.  The design criteria is pointed toward
    one munition per target kill.

(I scratched my head along with the rest of you when I saw this; I've always
thought that if you fire a bullet or a shell out of a tube, it goes until it
hits something, preferably something you're aiming at.  But maybe the Army
has some new theories of ballistics we don't know about yet.)

As Nancy Leveson notes, we make tradeoffs in design and functionality for
safety, and how many and what kinds of tradeoffs are made depends on ethical,
political, and cost considerations, among other things.  Since, as Jon Jacky
notes, trying to prohibit all unreasonable situations in code is itself
unreasonable, one wonders what sorts of things will be left out of the code
of terminal homing munitions.  What sorts of things will we have to take into
account in the code of a "warhead" that is supposed to find its own targets?
What level of confidence would we have to give soldiers (human soldiers -- we
may have to get used to using that caveat) operating in close proximity to
THMs that the things are "safe"?

I was once a participant in an artillery briefing given by a young, smart
artillery corps major.  This officer told us (a bunch of grunts) that we no
longer needed "forward observers," the guys attached to patrols to call in
the ranges on artillery strikes.  In fact, said the major, we didn't need to
call in our artillery strikes at all -- his methods had become so advanced
that he would simply know where and when we needed support.  We all looked at
him as if he had gone stark raving mad.  An old grizzled master sergeant, who
had been in the Army since Valley Forge I think, got up and said, "Sir, with
all due respect, if I find out you're in charge of the artillery in my
sector, I will personally come back and shoot you right between the eyes."
(His own form of THM "approaching human intelligence," no doubt.)  (I
wouldn't be surprised if this major wrote the language above.)

What is the "unreasonable" behavior to take into account in coding software?
The major's or the sergeant's?
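One way to see what these questions mean in practice: even the simplest
friend-or-foe gate forces someone to choose a fail-safe default and a
confidence threshold, and both choices are exactly the ethical and political
tradeoffs discussed above.  A hypothetical sketch in C -- every name and
number here is invented, and no actual THM design is being described:

    /* Hypothetical sketch of one coded prohibition; the interface and
       the threshold are assumptions made for illustration. */

    enum target_class { TGT_FRIENDLY, TGT_UNKNOWN, TGT_ENEMY };

    struct classification {
        enum target_class kind;
        double confidence;      /* 0.0 .. 1.0, from the onboard processor */
    };

    #define ENGAGE_THRESHOLD 0.95   /* who picks this number, and on
                                       what grounds?                   */

    int engagement_permitted(const struct classification *c)
    {
        /* Fail-safe default: FRIENDLY, UNKNOWN, and low-confidence
           ENEMY classifications all refuse to fire.  The sergeant's
           notion of "unreasonable" lives in that default and in the
           threshold above. */
        return c->kind == TGT_ENEMY && c->confidence >= ENGAGE_THRESHOLD;
    }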
-- Gary Chapman

------------------------------

Date: Mon, 8 Sep 86 09:54:06 cdt
From: "Scott E. Preece"
To: RISKS@CSL.SRI.COM
Subject: Re: supermarket crashes

> From: mogul@decwrl.DEC.COM (Jeffrey Mogul)
> I don't often shop at that market, partly because the markets I do use
> have cashiers who know what things are rather than relying on the
> computer.  Some day, just for fun, I might mark a pound of pecans with
> the code number for walnuts, and see if I can save some money.
----------
Does the word "fraud" mean anything to you?  Even if your pet cashier can
tell at sight a pound of pecans from a pound of walnuts, I don't see any
reason to assume he would know what the correct price of either was, or even
which was more expensive on a particular day.  The cashier is just as
dependent on the price stickers in a price-marked store as the scanner is on
the UPC label in a scanner store.

If I were designing a cash register, I'd make sure it could retain the
current session through a power outage (no re-ringing the stuff already in
the bags), but I don't think I'd require it to work while the power was off.
Personally, if I were in a store when the power went out, I would leave
quickly.

If power loss is COMMON in the area where the store is built, the designers
should work around it (perhaps by providing battery-powered scanners or
emergency backup power); in my neighborhood I think it's reasonable to write
it off as a minor inconvenience -- the speed and efficiency of the scanners
when the power is on is a more than reasonable trade for the inconvenience
during the tiny fraction of the time it isn't.

--
scott preece
gould/csd - urbana
uucp:  ihnp4!uiucdcs!ccvaxa!preece
arpa:  preece@gswd-vms

------------------------------

End of RISKS-FORUM Digest
************************