RISKS-LIST: RISKS-FORUM Digest  Sunday 13 March 1988  Volume 6 : Issue 42

FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  A legal problem -- responses sought (Cathy Reuben)
  Computers on Aircraft (Robert Dorsett)
  High-Tech Trucking (Rick Sidwell)
  Re: Programs crying wolf (Peter da Silva)
  Pay cut (Martin Taylor)
  Dangers of Wyse terminals (A. Cunningham)
  Burnt-out LED (G. L. Sicherman)
  Re: Display self-test (Peter da Silva)
  Calculator Self-tests: HP34C has a full functional self-test (Karl Denninger)
  Trying harder on complex tasks than on simpler tasks (Robert Oliver)
  Police using computers - License plate matches - etc, etc. (Ted G. Kekatos)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in
good taste, objective, coherent, concise, nonrepetitious.  Diversity is
welcome.  Contributions to RISKS@CSL.SRI.COM, requests to
RISKS-Request@CSL.SRI.COM.  For Vol i issue j, FTP SRI.COM, CD STRIPE:,
GET RISKS-i.j.  Volume summaries in (i, max j) = (1,46),(2,57),(3,92),
(4,97),(5,85).

----------------------------------------------------------------------

From: Cathy Reuben
Subject: A legal problem -- responses sought
Organization: Harvard Law School

[Forwarded-From: John W Manly]

I am writing a law school paper on the proper allocation of rights in
software between programmers and their employers. I am curious to know how
well the legal standards I've uncovered line up with the way people in the
industry perceive the equities of the situation.

Below is a hypothetical which lays out the basic problem. Please send me
your reactions. I don't need anything extensive, just a short statement of
where you personally come out and why, and from what perspective (i.e.,
programmer, employer, student, etc.) you're approaching the problem. I'm
not interested in what you think the law is, only in what you feel it
should be. Many thanks!
(Please be sure to respond directly to me [and NOT TO RISKS]:
Cathy Reuben, Harvard Law School, REUBEN@HULAW1.BITNET)

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

In 1981 Mr. John Allan receives a Master's degree in computer science from
the University of Massachusetts. At that time, Allan delivers a paper
entitled "No More Manuals: The Use of Touch and Sound Sensitive Hardware
to Promote Accessibility to Computer Technology."

Shortly after that time, Allan is recruited by a representative from
Medicomp, Inc., a small company servicing hospitals. Medicomp's primary
product is MEDSTORE, a database for storing patient information. Medicomp
seeks to enhance MEDSTORE with an on-line, touch-sensitive help system.

Allan accepts a programming position with Medicomp. During his four years
there, he develops modules for a touch-sensitive help facility. These
modules are incorporated into MEDSTORE. Largely due to MEDSTORE's
remarkable ease of use, Medicomp quickly becomes the leading supplier of
patient information database systems for hospitals.

In 1985, Allan leaves Medicomp. At that time, he teams up with a lawyer to
create TAXELF, do-it-yourself tax preparation software for small
businesses. TAXELF utilizes Allan's now famous touch-sensitive help
utility, and is projected to be a huge commercial success.

Shortly before TAXELF is due to be released, Medicomp files suit against
Allan. Their underlying argument is simple: "As the investor in
touch-sensitive help, Medicomp deserves the fruits of its success. You,
Allan, basically stole something that belongs to us."

Allan's answer to Medicomp's argument is also straightforward and
compelling: "You hired me as an expert in help utilities, and you got what
you paid for. Any further benefits from the system should flow to me as
creator."

Questions: (for use as a guide only)

Should Allan have the right to reuse the touch-sensitive help utility he
developed while at Medicomp?
   a. Right to copy the actual code?
   b. Right to rewrite the code from memory?
   c. Right to use the program structure and organization?
   d. Right to use touch-sensitive help in general?

What rights, if any, should Medicomp retain in the utility which they
hired Allan to produce?
   a. Right to use the utility in MEDSTORE?
   b. Right to use the utility in other Medicomp products?
   c. Right to prevent Allan from using the utility?
   d. Right to prevent Allan from using touch-sensitive help?

Should Allan's rights to use the modules, or the ideas they embody, be any
greater than those of the general public?

Has the act of answering these questions changed your first impression of
what is just in this case? If so, why did you back down?! Should you have?

        [I trust that Cathy will share her results with RISKS.  PGN]

------------------------------

Date: Sun, 13 Mar 88 04:47:05 CST
From: mentat@louie.cc.utexas.edu (Robert Dorsett)
Subject: Computers on Aircraft [RISKS-6.41]

> I don't believe that pilots are expected to believe computers over
> indications given by other sources.

What other sources are they supposed to use? Consider the standard
navigational equipment on the 747-200:

  Horizontal Situation Indicator -- computer-processed display.
  Flight Director -- computer-generated flying instructions.
  Autopilot -- analog/digital computer.
  Flight Performance Computer/Flight Management System -- computer used
    for flight management, calculating fuel consumption, etc.
  Inertial Navigation System -- computer used for "blind" navigation.

The INS is usually linked to the HSI and autopilot; there are a variety of
configurations that the pilot may select. The FMS, when installed, can
link into the network as well, and fly the airplane efficiently from
takeoff to landing.
On the 747-400, Airbus A320 (and the forthcoming A340), MD-11 (the DC-10
derivative) and, to a lesser degree, the Boeing 757 and 767, the pretense
of electromechanical instruments has been done away with altogether and
replaced with CRT displays, under the assumption that the CRT displays are
less prone to failures. The problem here is that the *means* of display
may in itself contribute to error: for example, the current vogue is to
replace the traditional line of instruments (a "clock" airspeed indicator,
artificial horizon, and altimeter) with a computer-displayed "tape"
airspeed and tape altimeter bracketing the horizon. The immediate
sacrifice is the loss of "trend" information: tape instruments are only
marginally better than a digital LED display. Research on these issues is
continuing, but what I've read indicates that NASA is advising caution,
while Boeing and Airbus are producing their own, contrary figures.

The point must be made that, in modern aircraft, all of the pilot's inputs
are preprocessed by computers. The Boeing philosophy thus far has been to
simplify overall design and efficiency by introducing automation; the
Airbus philosophy has been to redefine the role of the pilot in the
cockpit while simultaneously changing the way information is displayed. It
is clear that Boeing considered following in Airbus' footsteps during the
design phase of the (suspended) 7J7.

On the navigation issue: airlines have little say in how their pilots
actually navigate: it's largely up to the background of the individual
pilot. While one pilot may double- or triple-check sources, another might
prefer to read the newspaper: consider the worst-case scenario, the
incompetent captain and the resentful, uninterested first officer. There
is a great tendency in modern airplanes to rely on the INS/autopilot link,
to great detriment, as evidenced by the China Airlines flip over
California in 1985, or the KAL 007 tragedy.
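The sacrificed "trend" information can be made concrete: a moving needle
shows at a glance where the airspeed is heading, while a tape or digital
readout shows only where it is now. The following sketch is illustrative
only (it is not from any avionics system, and every name in it is
hypothetical); it shows one way a trend cue can be recovered from recent
samples.

```python
# Illustrative sketch: deriving a "trend" cue (where the needle is
# heading) from sampled airspeed, the information a bare tape or
# digital readout discards. All names here are hypothetical.

def airspeed_trend(samples, dt, lookahead=10.0):
    """Given evenly spaced airspeed samples (knots) taken dt seconds
    apart, predict the speed lookahead seconds from now using a
    least-squares linear fit over the samples."""
    n = len(samples)
    if n < 2:
        return samples[-1] if samples else 0.0
    times = [i * dt for i in range(n)]
    mean_t = sum(times) / n
    mean_v = sum(samples) / n
    # Slope of the best-fit line, in knots per second.
    num = sum((t - mean_t) * (v - mean_v) for t, v in zip(times, samples))
    den = sum((t - mean_t) ** 2 for t in times)
    slope = num / den
    return samples[-1] + slope * lookahead

# A steadily decaying airspeed, sampled once per second: the trend
# marker points well below the current value, which a plain digital
# readout would never reveal.
decel = [250, 248, 246, 244, 242]
print(airspeed_trend(decel, 1.0))  # -> 222.0
```

The design point is the same one made above: the computation is trivial,
so the loss of trend information is a display-design choice, not a
technical necessity.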
A recent conference sponsored by the Flight Safety Foundation, held in
Tokyo, advocated a return to the attitudes of the early 1960's, and a
return to basic skills. It is clear that highly automated cockpits serve
to insulate the pilot from the airplane, and thus increase boredom and
stress. Design engineers, on the other hand, see the pilot-error problems,
and try to insulate the pilot yet further, creating more automated and
"safe" systems. Modern cockpits such as the A320's run contrary to the
recommendations of organizations such as the Flight Safety Foundation; the
reasons most often cited are minimising training and maintenance costs and
reducing "pilot workload", all at the expense of long-term pilot welfare.

Robert Dorsett          Internet: mentat@walt.cc.utexas.edu
UT Austin               UUCP: {ihnp4,allegra}!ut-emx!walt.cc.utexas.edu!mentat

------------------------------

Date: Sat, 12 Mar 88 08:13:37 -0800
From: Rick Sidwell
Subject: High-Tech Trucking

Here is an article from a report sent by California State Senator John
Seymour to all of his constituents. The issue has been discussed before in
RISKS; this is a fresh example.

"HIGH TECH TRUCKING

"Under state and federal law, truck drivers are required to keep
handwritten logs to record the number of miles and hours they're on duty.
These logs are easily tampered with and are often a work of fiction as
some drivers try to circumvent highway safety laws designed to prevent
accidents.

"The result has been a dramatic increase in truck-related accidents,
injuries and deaths on our highways. According to the California Highway
Patrol, last year alone, 678 Californians died and more than 16,000 were
injured in truck-related accidents. Since 1982, truck-involved fatalities
are up over 40 percent and truck-related injuries are up more than 32
percent.

"In his continued leadership role in highway safety, Senator Seymour has
introduced legislation to require large commercial trucks to install
'black boxes.'
The 'black box' is an onboard computer that automatically records drive
time, speed, and distance traveled, as well as other important functions
that reveal how a driver handles his rig.

"'More and more, truck drivers are pushing themselves and their equipment
beyond their limits,' said Seymour. 'Driver fatigue, equipment failure and
speeding are killing hundreds of innocent people every year on our
highways. By requiring the use of "black boxes," heavy commercial truck
drivers will be forced to more closely adhere to highway safety laws.'"

When I first read this, I noticed that there was a potential invasion of
privacy, in that a highway patrolman could look at the electronic log, see
whether the trucker had been speeding, and give him a ticket if so. Then
it dawned on me that this is the very purpose of requiring the "black
boxes" to be installed! It would be interesting to know what the "other
important functions that reveal how a driver handles his rig" are.

------------------------------

From: nuchat!sugar!peter@uunet.UU.NET
From: peter@sugar.UUCP (Peter da Silva)
Date: 11 Mar 88 08:48:29 GMT
Subject: Re: Programs crying wolf (RISKS DIGEST 6.38)
Organization: Sugar Land UNIX - Houston, TX

Once upon a time a programmer who regularly used both MS-DOS and UNIX
systems sat down at an MS-DOS system and typed "format". The program
replied:

        PLEASE INSERT FLOPPY DISK IN DRIVE C: AND HIT RETURN

The programmer stuck the floppy in the machine, hit RETURN, and formatted
his hard disk. What's wrong with this picture?

(1) The UNIX format program took a reasonable default if executed with no
    parameters: the floppy drive. The MS-DOS format program took a stupid
    default: the current drive.

(2) The MS-DOS format program printed an incredibly stupid "warning"
    message: "Please insert floppy disk in this hard drive".

I understand that the situation has been corrected since then.
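The defensive pattern the anecdote argues for can be sketched. The
following is illustrative only, in Python rather than anything MS-DOS
actually shipped, and the drive configuration and function names are
assumptions: a destructive command should default away from the device
most likely to hold irreplaceable data, and its confirmation should name
what will actually be destroyed rather than accept a bare RETURN.

```python
# Hypothetical front end for a "format" command, showing two defensive
# rules: (1) with no argument, default to a removable drive, never the
# current one; (2) for a fixed disk, require the operator to echo the
# target drive letter -- a bare RETURN (or anything else) aborts.

FLOPPY_DRIVES = {"A", "B"}   # assumed machine configuration

def choose_format_target(requested=None, prompt=input):
    """Return the drive letter to format, or None to abort."""
    if requested is None:
        requested = "A"      # sensible default: a removable drive
    drive = requested.strip().upper().rstrip(":")
    if drive not in FLOPPY_DRIVES:
        # Fixed disk: spell out the consequence and demand an exact echo.
        answer = prompt("WARNING: drive %s: is a FIXED disk and ALL DATA "
                        "ON IT WILL BE LOST. Type '%s' to proceed: "
                        % (drive, drive))
        if answer.strip().upper().rstrip(":") != drive:
            return None
    return drive
```

Called with no argument it formats the floppy, as the UNIX program in the
story did; asked to format C: it refuses unless the operator deliberately
types the drive letter back.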
Peter da Silva  `-_-'  ...!hoptoad!academ!uhnix1!sugar!peter

------------------------------

Date: Fri, 11 Mar 88 17:29:25 est
From: Martin Taylor
Subject: Pay cut

I'm not sure for whom this is a risk, but today's Toronto Globe and Mail
reports that an ex-cabinet minister was placed in charge of a new agency
which was expected to be quite important. But the politics of the
situation changed and the agency had very little to do, so the minister
asked that his pay be halved. The possibility of reducing someone's pay
had not been programmed, and the computer reported, and someone
publicised, that his pay had been doubled. Very embarrassing for him and
for the government of the day. (This happened some years ago.)

Martin Taylor (mmt@zorac.arpa)

------------------------------

Date: Fri, 11 Mar 88 15:46:08 GMT
From: A. Cunningham
Return-Path: <@CUNYVM.CUNY.EDU:cstjc@ITSPNA.ED.AC.UK>
Subject: Dangers of Wyse terminals
Organisation: Dept of Computer Science, University of Edinburgh

The department of computer science at Edinburgh University has a
collection of Sun workstations for use by first year undergraduates.
Connected to the Suns via pads are a number of Wyse75 terminals. Recently
mail was sent to users which had the following effect:

1). The user's keyboard was locked and his screen blanked.
2). His terminal was put into reflect mode (input to the terminal was
    reflected back to the host).
3). The nasty bit: file permissions were changed and processes were
    killed.

The first year students involved were caught and now face disciplinary
proceedings. A few questions were raised that may be of interest to other
users of these terminals.

1). Why are the features in the terminal in the first place? I can only
    assume that Wyse put them in as security features: if a hacker
    accesses your system, you lock out the terminal.
2). Has anyone had similar experiences? I've only been reading this group
    for a year, while we've known of the possibilities of the Wyse for at
    least two.
    At first it was limited to changing a friend's screen to inverse
    mode. We never envisaged it being used so destructively.
3). Is there a modification to the Wyse to stop it? We need this to stop
    next year's CS1 from doing the same thing again.

   [This is another tip-of-the-iceberg problem. All of the control
   characters, escape sequences, and function keystrokes that are used
   (constructively) by software driving your terminal can also be MISUSED
   by any programs running as if they were you, Trojan horses, etc.
   Recall that an early example was Trojan Messages, which when READ (not
   interpreted) would GETCHA.  PGN]

------------------------------

Date: 12 Mar 88 05:43:00 GMT
From: harvard!necntc!decuac!hjuxa!uucp@rutgers.edu
From: gls@odyssey.ATT.COM (g.l.sicherman)
Subject: Burnt-out LED (Re: RISKS-6.39)
Organization: AT&T Bell Laboratories, Middletown, NJ

Al Stangenberger's lament points up the vulnerability of LED digits to
burnout errors. Maybe we should redesign the digits to look like this?

   [diagram of the redesigned seven-segment digits]

It's ugly, but at least it detects single errors. (Surely somebody has
thought of this already? Are arabic numerals technologically obsolete?)

A recent issue of _Industrial Design_ (Jan. 1974) presents an entire
alphabet in this format. Imagine the potential for transmission errors!
(In fact, the article goes even further: it presents a four-stroke
alphabet. How's that for low resolution?)

Col. G. L. Sicherman  ...!ihnp4!odyssey!gls

   [The visual confusion between 6 and 8 is a bit awesome, as is the
   unnaturalness of 1 and 7. (The GE check code is a little easier to
   deal with -- people can ignore it.) But putting in display self-checks
   that try to GET-THE-LED-OUT seems much more acceptable.  PGN]

------------------------------

Date: 13 Mar 88 15:45:26 GMT
From: nuchat!peter@uunet.UU.NET (Peter da Silva)
Subject: Re: Display self-test (RISKS-6.39)
Organization: Public Access - Houston, TX

Many calculators [have some sort of self-test]. They come up with all
segments lit. That way you can tell when they're bad. Gas pumps do this
too... ever noticed digital gas pump displays showing 8888.88 before you
start pumping?

------------------------------

Date: Fri Mar 11 11:05:24 1988
From: ames!lll-crg!lll-winken!ddsw1!karl@ucbvax.berkeley.edu (Karl Denninger)
Subject: Calculator Self-tests: My HP34C has a full functional self-test
Organization: Macro Computer Solutions, Inc., Mundelein, IL

The HP34C has a key sequence which runs a full functional self-test. You
get all segments lit if all is ok, or an error code (or a dead unit) if it
fails. The manual claims that it is a full computational and functional
test (and it does take a couple of seconds to run). I use it every time I
power the thing on.

Karl Denninger                   | Data:  +1 312 566-8912
Macro Computer Solutions, Inc.   | Voice: +1 312 566-8910
...ihnp4!ddsw1!karl              | "Quality solutions for work or play"

------------------------------

Path: pyramid!cbmvax!hutch!rabbit1!robert
From: rabbit1!robert@csl.sri.com (Robert Oliver)
Date: 10 Mar 88 20:45:26 GMT
Subject: Trying harder on complex tasks than on simpler tasks
Organization: Rabbit Software Corp., Malvern PA

My experience indicates that we often DO try harder on complex tasks than
on simple ones. In working on a large on-line transaction processing
system, it was observed by various people (notably those responsible for
testing and quality assurance) that whenever we completed major overhauls
of the system, it often passed the tests with little trouble and did not
"crash" when eventually run live.
New versions which contained simple fixes or minor modifications
inevitably acted mysteriously during testing, or catastrophically when put
on-line. What this implied was that complex changes garnered more of our
attention than simple changes when we were analyzing the problem,
designing and implementing the change, and testing the final product.

This is not to imply that we were simply careless when making simple
changes. On the contrary, we were much more careful than most software
groups I have seen. However, the simple changes did not elicit that keen
level of awareness needed to adequately foresee hidden problems and to
test for such possible cases. Careless, no. Less careful, less alert, less
interested, maybe.

It's not only a very gray area, but it's also a tough problem to correct.
One can state that "when making simple changes, remember to be just as
alert and think just as clearly as when making complex changes," but the
very nature of the problem will often undermine this maxim.

Robert Oliver                      (215) 647-0440
Rabbit Software Corp.              ...!ihnp4!{cbmvax,cuuxb}!hutch!robert
7 Great Valley Parkway East        ...!psuvax!burdvax!hutch!robert
Malvern, PA 19355

------------------------------

Date: 9 Mar 88 22:27:44 GMT
From: moss!ihuxv!tedk@rutgers.edu (Ted G. Kekatos)
Subject: Police using computers - License plate matches - etc, etc.
Organization: AT&T Bell Laboratories - Naperville, Illinois

All this talk about innocent people vs. police computers reminds me of the
movie "Brazil". If you have not seen it, it is available on video tape.
The same RISKS question comes up again: if the "computer system" helps the
police to find one (1) truly "bad" person, and also finds one (1) truly
innocent person, are we willing to deal with the consequences?

Ted G. Kekatos  backbone!ihnp4!ihuxv!tedk  (312) 979-0804
AT&T Bell Laboratories, Indian Hill South, IX-1F-460
Naperville & Wheaton Roads, Naperville, Illinois 60566 USA

   [If you are looking for one person and you find two, you have some
   incentive to probe further. The problem is when you get only one, and
   it is the wrong person. But ultimately it is how the query response is
   handled that matters.  PGN]

------------------------------

End of RISKS-FORUM Digest
************************