Subject: RISKS DIGEST 11.19
REPLY-TO: risks@csl.sri.com

RISKS-LIST: RISKS-FORUM Digest  Friday 1 March 1991  Volume 11 : Issue 19

        FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  About Risks in Believing AI Gurus (Klaus Brunnstein)
  Re: Jim Horning's note on software warranties (Alan Wexelblat)
  Faxing a horse (Jerry Bakin)
  But the computer person said it was OK! (Steve Bellovin)
  Automatic Patching (Larry Nathanson)
  Risk from workstations with built-in microphones (Steve Greenwald)
  Specs for special equipment (Jim Purtilo)
  Re: worse-is-better for the 1990s (Tim Chambers, Mark McWiggins,
    Flint Pellett, Dan Franklin)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in
good taste, objective, coherent, concise, and nonrepetitious.  Diversity is
welcome.  CONTRIBUTIONS to RISKS@CSL.SRI.COM, with relevant, substantive
"Subject:" line.  Others ignored!  REQUESTS to RISKS-Request@CSL.SRI.COM.
For Vol i Issue j, ftp CRVAX.sri.com, login anonymous (any non-null
password), CD RISKS:, GET RISKS-i.j (where i=1 to 11, j is always TWO
digits).  Vol i summaries are in j=00; "dir risks-*.*" gives a directory;
"bye" logs out.
ALL CONTRIBUTIONS CONSIDERED AS PERSONAL COMMENTS; USUAL DISCLAIMERS APPLY.
Relevant contributions may appear in the RISKS section of regular issues of
ACM SIGSOFT's SOFTWARE ENGINEERING NOTES, unless you state otherwise.

----------------------------------------------------------------------

Date: 1 Mar 91 17:45 +0100
From: Klaus Brunnstein
Subject: About Risks in Believing AI Gurus (M.Minsky)

On February 19-20, 1991, on the occasion of its 10th anniversary, Borland
assembled a rare collection of gurus, experts, engineers, and artists in
Munich (title: European Software Festival).  On the program:

 - Niklaus Wirth on his Oberon language concept (lecture, workshop)
 - Bjarne Stroustrup (AT&T) with 2 lectures on C++/Object-Oriented Programming
 - Marvin Minsky: a lecture on Personal Software and Programs Who Know You
   (but he really gave a survey of AI history), plus a lecture on
   Artificial Animals
 - Philippe Kahn: Back to the Future
 - Joseph Weizenbaum against overestimation in research
 - Izumi Aizu: Hypernetwork Society

Some more could not come (Alan Kay was not supposed to use a plane); others
really did not come (Cyberspace guru Jaron Lanier was only virtually present,
in a video).

One of the most stimulating (and generally incomparable) events was a concert
by Tod Machover (composer, director MIT Media Lab), who demonstrated his
"conductor-aiding handglove" in a new composition, after having demonstrated
his concept of "hyperinstruments" with a piece from his opera "Valis".

Also some native German speakers:

 - Computer Art professor Herbert W. Franke on Experimental Esthetics
 - Thomas von Randow on Cryptosystems ("If Mary Stewart had applied
   cryptology...")
 - my own contribution was on Risky Paradigms in Informatics' Box of Pandora
   (starting from J.v.Neumann's assumption that his EDVAC is equivalent to
   the human brain, with peripheral devices analogous to "organs", I analysed
   risks in misconceptions, errors in realisations, misunderstandings on the
   users' side, and malicious misuse, with examples well known to RISKS Forum
   readers).
For the roughly 2,000 participants in the gigantic Gasteig Philharmonic hall,
this must have been a stimulating experience, but at least some of them will
(hopefully) have corrected their world model of trustable gurus when Marvin
Minsky, in a `press conference', was asked whether he would like to connect
his nerves and brain to a `desktop brain' machine.  Following his lecture, in
which he argued that the human brain consists of about 40 neural machines,
each of which may have a specific frame architecture (he represented the
theories of Darwin, Freud, and Piaget as specific search trees, and he
discussed several types of frames representing brain activities), he then
exclaimed: Imagine that you install 400 neural machines, which will give 10
times the power of a brain!

While Joseph Weizenbaum sat on the other side very quietly, Marvin became
even more explicit when he discussed the possible role of philosophers (at
most, those of the last 200 years seemed relevant to him) and religion (which
he suggested forbidding, as religion does not contribute to problem-solving).
No surprise that philosophers such as Socrates are not discussed in AI
circles: following the relativity of knowledge (Oida ouden eidos! = I know
that I do not know!), a knowledge base is a self-contradiction!  (I now
understand why MIT students were forbidden philosophy in Marvin's time as
dean, as Joe Weizenbaum remembers.)

From a European point of view, the rational aspects, Human Information
Processing and its Informatics equivalent "Artificial Intelligence", were so
overemphasized that it became very difficult to believe that Marvin Minsky
really believed what he said.  With his special inspiration, Joseph
Weizenbaum brought this to the real point when (summing up his experiences in
discussions at MIT) he mentioned a dream: "I dreamt that Marvin leaves at his
end - which is hopefully very far in the future - a letter saying: I never
meant what I said!"

Personally, I would feel much more comfortable if I knew that Marvin Minsky
were more conscious of the impact of his words; but I think that he just
means what he (and many others) says: that there is only a difference in
material (a dry silicon realisation of "intelligence" versus the evolved wet
carbon-based brain), and that it is only a lack of contemporary knowledge
that makes today's machine intelligence seem inferior to the human kind.  And
that the vanishing difference between machine and brain will evolve into
intelligence amplifiers in which the human nervous system may be connected to
a desktop machine.

In my analysis, it is this kind of misconception that is an essential reason
for contemporary computer accidents - misconceptions of scientists
(uncritically following paradigms and unverified assumptions), top- and
medium-level misconceptions, misimplementations, misunderstandings on the
users' side, conscious use of side effects, and other misuse...

Klaus Brunnstein, University of Hamburg (March 1, 1991: 5 p.m. GMT)

PS: apologies to those who regard such philosophical discussions as mere
speculations; I honestly do not wish to distract anybody from analysing real
accidents on a realistic (that is, non-philosophical) basis.

------------------------------

Date: Fri, 1 Mar 91 13:21:57 est
From: wex@PWS.BULL.COM
Subject: Re: Jim Horning's note on software warranties (RISKS DIGEST 11.16)

While I don't want to turn this into a comp.software-eng flamefest, I would
like to disagree with something I think Horning is implying.
He says:

> Software doesn't wear out, and hence doesn't need "maintenance" to cope
> with physical wear.  Any defects were present at the outset.

This is true only in the most trivial sense.  "Defects" can be one of two
things.  Either:

 - a defect is a difference between the actual operation of the program and
   some formal specification of its intended behavior; or
 - a defect is a difference between the actual operation of the program and
   some expectation on the part of the users/designers/clients.

Now, in the first sense it's true that software doesn't come to have more
defects over time (though more may be discovered, obviously).  However, it's
important to remember that the latter definition is the one more commonly
used, and under that definition, programs will indeed "wear out."

That is, I assert we cannot think of a program only as having a static set of
defects which prevent it from meeting its intended purpose.  Rather, there is
a dynamic gulf between the expectations of the users and the operation of the
program.  This is why software engineers are taught "the specification is
*never* frozen."  This is why complex systems which take years to be
delivered so often fail.  It's not that they don't do what they were
originally intended to do -- it's that they can't cope with all the
additional things they were asked to do during development (q.v. any number
of military systems, and command & control systems of all sorts).

As an aside, I don't like the analogy between software and books, because
books are primarily inert conveyors of information; people expect programs to
*do* things.  It's one thing to complain that your on-line encyclopedia is
"wrong" and another thing entirely to complain that your aircraft-landing
system is "wrong."

--Alan Wexelblat, Bull Worldwide Information Systems    phone: (508)294-7485

------------------------------

Date: Fri, 1 Mar 1991 00:40 PST
From: JERRY@HMCVAX.CLAREMONT.EDU
Subject: Faxing a horse

Today I attended an "Adobe Technical Conference."  Contrary to its name, this
was mainly a marketing meeting.

One discussion examined PostScript Level 2.  One addition is the ability to
create forms that are cached in memory (or on disk).  The idea is that one
will be able to call up a form that will have already been imaged and thereby
speed printer output.

Another discussion focused on the integration of PostScript and FAX
technology.  Adobe would like to see printers which have built-in fax
send/receive capabilities.  Their idea is that with such a printer/fax device
one could have remote printing on any fax machine in the world.  They have
planned extensions to PostScript Levels 1 & 2 to permit a user to print a
file with a phone number and have the printer fax the file to that phone
number.  In many ways, this is an attractive idea.

To push it further, they would have these PostScript printer/fax machines
recognize when they had connected with another PostScript printer/fax device
and in those cases send PostScript files.  This is also an attractive idea
when you consider a) the visual improvement PostScript can add to faxes and
b) the data compression gained by sending the PostScript rather than the
scanned bits (Adobe claimed reductions of transmission times by factors of 2
to 25).  Adobe underscored this benefit by stating that 1/2 of all phone
calls from Japan to the United States are made by fax machines.  You can
imagine the money saved by reducing the length of these phone calls by a
factor of two.
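[A rough back-of-envelope check on the claimed reduction.  The page sizes
below are illustrative assumptions of mine, not Adobe's figures:

    # Rough comparison: sending a scanned page vs. its PostScript source.
    # All sizes are illustrative assumptions, not Adobe's data.
    raw_bits = 1728 * 1100          # one standard-resolution Group 3 page
    g3_compression = 10             # typical Group 3 coding gain for text
    scanned_kbit = raw_bits / g3_compression / 1000

    postscript_bytes = 8_000        # a few KB of PostScript for the same page
    postscript_kbit = postscript_bytes * 8 / 1000

    print(f"scanned page:    ~{scanned_kbit:.0f} kbit on the wire")
    print(f"PostScript page: ~{postscript_kbit:.0f} kbit on the wire")
    print(f"reduction:       ~{scanned_kbit / postscript_kbit:.1f}x")

With these numbers the scanned page costs about three times as much to send;
a sparser page or a smaller PostScript file moves the ratio toward the upper
end of the claimed 2-25x range.]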
The Adobe representative also portrayed what seemed to him an excellent idea:
a company (a manufacturer, for example) might cache its purchase order form
on a supplier's printer/fax machine's disk and then reduce its fax
transmittal times to the order itself, with no time spent transmitting the
form.

Later on, I cornered the Adobe representative in a hallway and spoke of my
concerns that some manufacturer could play tricks by ordering parts for a
competitor and use a competitor's purchase order form to make it all seem
very legitimate.

Well, I didn't know the half of it.  The representative admitted that the
security issues were great and offered a better scenario: imagine thieves
faxing a trojan horse into another printer/fax machine.  The receiver would
be looking at an innocuous message, something like "Hello World." or "Joe's
Pizza is now open", but his printer/fax would contain a PostScript program
that copies future output to disk and late at night faxes that output back to
the thieves' machine.

It's good to know that Adobe realizes the potential problems before producing
the product and is trying to install features to prevent the abuse.  Still,
Adobe has many "unsophisticated" users, and it will be interesting to see how
they teach these users that their printer/fax machines can become a corporate
spy.

Jerry Bakin.

------------------------------

Date: Thu, 28 Feb 91 20:26:15 EST
From: smb@ulysses.att.com
Subject: But the computer person said it was OK!

That reminds me of an incident that happened to me a few Saturdays ago.  I
needed to pick up some 12-gauge electrical cable.  I browsed past the
14-gauge coils, then picked up a 25' coil of 12-gauge.  The price was about
twice what the 14-gauge cost.  Odd, and unreasonable; I checked further.  The
25' coil of 12-gauge cost exactly the same as a 50' coil; that price in turn
was slightly more than the same length of 14-gauge.  Ah -- someone put the
wrong sticker on the 25' coil, I thought.  There was a clerk standing nearby;
I pointed out the error to him.  He consulted a handy printout.  ``Nope; it's
priced correctly; see?''  Sure enough, the printout showed that 25' and 50'
of 12-gauge wire cost the same thing.  I tried pointing out that that was
unreasonable; he shrugged and walked away.

Having a bit of time, I decided to tell the manager.  It took a bit of
explaining to get my point across; one of the two individuals I was speaking
to didn't understand anything I was saying.  After all, the computer had the
same price as the sticker; what was the problem?  I finally made myself
understood, but to no avail -- store prices were set from the regional
computer center a few towns away, and nothing could be done until the office
re-opened on Monday.  The price was wrong, and absurd -- but there was no way
for them to fix it.  (That they didn't do anything else, such as pulling the
erroneously-marked coils from the shelf, is another matter, of course.)

--Steve Bellovin

P.S.  The next time I was in, a few days later, the pricing had been
corrected.

------------------------------

Date: Wed, 27 Feb 91 22:29:00 -0500
From: lan@bucsf.bu.edu (Larry Nathanson)
Subject: Automatic Patching

When I read the first half of the article on the Hi-Track Dynamic Microcode
Downloading, my thoughts turned immediately to America Online.  I can vouch
for the fact that AOL is self-patching - I've seen Macintosh and Apple //e
code come down the pike, into my computers.  Sometimes it's icons, sometimes
the inherent menu structure changes - sometimes new modules appear.

I don't think there is any limit to what they can send over the lines -
after all, they have game modules available on an optional basis (for the
//e anyhow) which contain much code.  Nowadays, the patches are rather
infrequent.  During beta testing they were almost daily.

The risks of accepting executable code, straight off the line, and executing
it without giving the user a chance to virus-check it seem nerve-racking, and
my first impulse was that it is a risk.  However, after some thought I
decided that it should be relatively safe.
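[A minimal sketch of the kind of integrity check a client could apply to a
downloaded module before executing it.  The function and the idea of a digest
published through a separate, trusted channel are my own illustration, not a
description of AOL's actual mechanism:

    import hashlib

    def verify_module(module_bytes, expected_sha256):
        """Return True only if the downloaded module matches a digest the
        vendor published through a separate, trusted channel."""
        return hashlib.sha256(module_bytes).hexdigest() == expected_sha256

    # Hypothetical usage: install only when the check passes.
    patch = b"example module contents"
    published = hashlib.sha256(b"example module contents").hexdigest()
    assert verify_module(patch, published)
    assert not verify_module(b"tampered bytes", published)

Such a check defends against a corrupted or spoofed download, though not
against a malicious module shipped by the vendor itself.]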
There are 2 ways viral code could get through - an outside job, and an inside
job.  For an outside job, the hacker would have to figure out the
communication code (as a beta tester of the original AppleLink Personal, let
me tell you that this is NOT easy to do), AND disassemble the entire
communications software - to find out where to patch.  Then our illustrious
hacker would have to establish a link to the user's PC.  To do this, either
1) the user would have to call the hacker's machine, which would emulate a
Telenet node, or 2) the hacker would have to take over one or more Telenet
nodes.  Neither is easy to accomplish without either the user knowing, or
Telenet getting VERY upset (i.e., litigious).  While anything is possible,
I'm willing to bet that this is sufficiently improbable that AOL is safe from
external hacks using the patch command to give me a virus.  Remember that
nothing is 100% safe.  Also - if they DID give me this virus, then SAM should
grab it as soon as it starts to go after another application.

As for inside jobs, well, I don't know.  One would hope that a disgruntled
employee couldn't do anything alone that the 'gruntled' employees wouldn't
stop.  I know that 'freedom of hacking' is kept rather tight around Quantum -
sometimes the people who are supposed to be doing things run into trouble.
My feeling is that they are aware of the risk, and have safeguards to prevent
such a thing - it would be the end of a very promising online system if such
a scandal were to occur.  And the same caveat - while nothing is 100% safe, a
runtime infection checker should catch anything while it's on the move.

(NOTE - I am not an employee of Quantum Computer Inc, makers of America
Online, and other nifty things.  I am a former beta-tester and online guide,
with no current formal affiliation, except for lots of friends.)

--Larry Nathanson    lan@bucsf.bu.edu

------------------------------

Date: Fri, 1 Mar 91 12:46:49 -0500
From: "Steve Greenwald"
Subject: Risk from workstations with built-in microphones

A lot of new workstations and personal computers now come with a built-in
microphone and hardware to digitize the microphone input.  Example
applications are digital signal processing and voice-annotation of software
for documentation purposes.  Many of these workstations are designed to be
used as stations on local area networks (LANs).  Additionally, it is quite
common in some organizations to have workstations in the offices of
individuals.

One risk of this technology seems clear: it is entirely possible for a
determined person to write software that will activate the microphone and
digitizer without the knowledge of the workstation user.  The digitized
acoustic data could then either be stored locally in the workstation (say, on
a hard disk) for later retrieval, or, if the workstation is attached to a
network, the data could even be sent to the eavesdropper's location.  If the
network is used, it would be quite possible (given the throughput of modern
LANs) to send digitized voice.  Using common techniques, digitized voice
requires a data rate in the neighborhood of 56 kilobits per second, while
most LANs have data rates in the area of megabits per second.
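[A quick calculation makes the point concrete.  The 10 Mbit/s figure below is
my own assumption of a typical Ethernet rate, not something from the posting:

    # Rough share of a LAN consumed by one eavesdropped voice channel.
    voice_rate = 56_000        # bits/s, telephone-quality digitized speech
    lan_rate = 10_000_000      # bits/s, assumed 10 Mbit/s Ethernet

    print(f"voice stream uses {100 * voice_rate / lan_rate:.2f}% of the LAN")
    # -> voice stream uses 0.56% of the LAN

At roughly half a percent of the LAN's capacity, such a stream would be easy
to overlook.]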
Of course, this method would require that the eavesdropper somehow acquire
access to the workstation (perhaps with a trojan horse, or even physical
access).  Naturally, this would allow someone remote in time and/or space to
eavesdrop on the area around the workstation.

Some possible solutions:

1) Eliminate the microphones when not absolutely required.

2) Have a switch (either manual or software controlled) to turn the
   microphone circuitry on or off.  A problem with this solution is that it
   requires the user to remember to turn the microphone off when it is not in
   use, and it would not eliminate eavesdropping during periods when the
   microphone was turned on but the user was not using it.  Additionally, if
   the switch were software controlled, the eavesdropper's software could
   simply turn it on if it detected it as being turned off.

3) Have some sort of indicator (say, an LED) which would clearly show when
   the microphone circuitry was active.  The indicator should be under the
   control of hardware only, to prevent its being disabled by the software of
   the eavesdropper.

4) Have an automatic logging function which would record the time and
   duration of each use of the microphone circuitry (a sketch of such a log
   follows this list).  A problem is that such a log would have to be
   examined frequently by the user, unless some sort of automated exception
   reporting were used.

5) Have some sort of sound-blocking cover which the user can put over the
   microphone when it is not being used.  A possible problem with this
   solution is that if the eavesdropper has physical access to the
   workstation, it would be possible to replace the cover with a fake one
   which is not sound-blocking.  Therefore it would be desirable to have some
   sort of built-in facility which could test the operation of the cover.

6) Require some sort of user authentication whenever the microphone is to be
   used.  Even something as simple as a "dead man's switch" would work
   (although that could be annoying in practice).
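[For solution 4, a minimal sketch of what such a log could record.  The log
location and the idea of calling this from user code are hypothetical; a real
implementation would have to hook the audio driver itself:

    import time

    LOG_PATH = "/var/log/microphone.log"   # hypothetical location

    def log_microphone_use(start, end):
        """Append one line per microphone session: start time and duration."""
        with open(LOG_PATH, "a") as log:
            log.write(f"{time.ctime(start)}  duration={end - start:.1f}s\n")

    # Hypothetical usage around whatever call actually opens the audio device:
    t0 = time.time()
    # ... microphone active, samples being read ...
    log_microphone_use(t0, time.time())

Automated exception reporting could then amount to scanning this file for
sessions the user does not remember initiating.]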
Steve Greenwald, graduate student, Computer and Information Sciences,
University of Florida, Gainesville, Florida

   [The microphone problem was previously discussed in the context of NeXT
   by E. Loren Buhle, Jr. in RISKS-10.65.  PGN]

------------------------------

Date: Wed, 27 Feb 91 00:54:53 -0500
From: purtilo@cs.UMD.EDU (Jim Purtilo)
Subject: Specs for special equipment

I have just checked into the hotel wherein a large meeting of software folks
is being held.  The meeting announcement stated "if you plan on bringing a PC
for use in the hotel room, then be sure to tell the hotel this when you make
reservations.  They will have the special equipment for you to hook your
modem to the hotel phone system."  I did this when making reservations.

There are specs and then there are specs, of course.  At the desk, I asked
"do you have the equipment I asked for?", to which I received a cheery "of
course, Dr. Purtilo!  We have your request entered right here in our
computer."

Indeed.  I find my room is spacious, and quite special.  It is well suited
for a person who is environmentally disadvantaged.  The front desk's
reservation system, it seems, allows a flag for "special equipment" to be
set, which refers to a room with all furniture placed for easy wheelchair
access.  Mirrors, bath gear, toilets, etc., are all placed and oriented for
someone with much different patterns of mobility than I exhibit.

I have not quite decided whether this is a software design flaw (desired
state information not expressible in the system), a user interface error, or
just my usual luck.  (I am, obviously, overcoming my own handicap in this
room, namely, that a phone cord is permanently fixed to the wall without a
nice modular jack for nethacking.  Screwdrivers, 'gator clips, and an
attitude ... don't leave home without them!)

Jim

------------------------------

Date: Tue, 26 Feb 91 16:03:20 mst
From: Tim Chambers
Subject: Re: worse-is-better for the 1990s (Newcomer, RISKS-11.16)

>From: "Joseph M. Newcomer"
> There is a risk of accepting the barely adequate; you may have to
> live with it for a long time.

I think the author misses Dick Gabriel's point.  By the way he chooses his
words, he is lamenting the fact that the world suffers from living with
"barely adequate" implementations.  The issue seems to be more one of
standards than of what is The Right Thing.

The company I work for has a long-standing tradition of trying to Do the
Right Thing with technologies in our products.  A funny thing happened when
we ventured into larger and larger markets -- the slogan "standard is better
than better" began to catch on with engineers battling with competitors for
shares of billion-dollar markets.  We began to seek out standards and promote
them in our products.

I'd like to know if examples exist of cases where Right Thing technology *has
been* compatible with mass markets.  I can think of plenty of
counter-examples: VHS versus Beta; multi-process, high-cost-of-entry
computers (UNIX workstations) versus single-process, low-common-denominator
computers (PCs); and technologies used in television and power transmission.
In all cases, the poorer candidate for being the Right Thing has more
economic clout (i.e., it thrives in a larger market than Right Thing
alternatives).  (Perhaps FM radio is closer to the Right Thing than AM -- I
welcome comments from an expert on what an ideal radio broadcast system would
be, so AM and FM could be compared to it.)

I don't see this as much of a lesson at all.  It's the natural order of
things in a competitive world.

------------------------------

Date: Wed, 27 Feb 91 18:34:43 GMT
From: mark@intek01.UUCP (Mark McWiggins)
Subject: Re: worse-is-better for the 1990s (Newcomer, RISKS-11.16)

"Joseph M. Newcomer" writes:
> ... There is a risk of accepting the barely adequate; you may have to
> live with it for a long time.

Hear, hear.  Admiral Grace Hopper was quoted as saying "Well, it's not really
what we want, but we'll fix it in the next release" on the occasion of the
release of the original COBOL.

Mark McWiggins, Integration Technologies, Inc. (Intek), 1400 112th Ave SE
#202, Bellevue WA 98004   +1 206 455 9935   mark@intek.com

------------------------------

Date: 27 Feb 91 21:48:09 GMT
From: flint@gistdev.gist.com (Flint Pellett)
Subject: Re: worse-is-better for the 1990s (Gitomer, RISKS-11.17)

> Perhaps what we are seeing is Gresham's Law as applied to computers: [...]
> Now if I could only figure out how to hoard an operating system or
> high-level language :-)

That's easy: you overprice it.  When you can get BASIC for free but have to
pay $100 for a C compiler, you end up with things that should have been in C
written in BASIC.  When it costs $400 for a bare-bones UNIX vs. $100 for DOS,
the lesser system pushes out the greater one quite often.
Flint Pellett, Global Information Systems Technology, Inc., 1800 Woodfield
Dr., Savoy, IL 61874   (217) 352-1165   uunet!gistdev!flint
flint@gistdev.gist.com

------------------------------

Date: Wed, 27 Feb 91 10:53:41 -0500
From: dan@BBN.COM
Subject: Re: worse-is-better for the 1990s (Gitomer, RISKS-11.17)

If you're a hardware manufacturer, you can hoard your software by just not
letting it run on anybody else's hardware!  This is one reason that UNIX
drove out better, or at least better-adapted, operating systems: they
couldn't run on anything except their own vendor's hardware, and the vendor
wasn't interested in changing that situation since the greater intrinsic
value was a competitive advantage in selling hardware.

If you're not a hardware manufacturer, you can still accidentally "hoard" an
OS or high-level language by specializing it to a particular architecture so
that you can't easily move it onto newer hardware.  Multics and ITS are
examples.  In this case the problem is that operating systems and languages
of greater intrinsic value usually end up requiring hardware of greater
intrinsic value or specialization in order to do their job.  Multics had
protection rings and true dynamic linking, with specialized hardware to make
them fast; UNIX has neither, so it can run on machines without the extra
hardware.

Fortunately, Gresham's Law need not apply if the creator of the software
isn't interested in hoarding it.  GNU Emacs and gcc are good counterexamples.

Dan Franklin

------------------------------

End of RISKS-FORUM Digest 11.19
************************