RISKS-LIST: RISKS-FORUM Digest  Wednesday 15 February 1989  Volume 8 : Issue 26

   FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  "$15 Million Computer Dud Baffles Udall" (Joseph M. Beckman)
  Re: Computer blamed for 911 system crash (Rodney Hoffman, Paul Blumstein)
  Selling who-called-the-800-number data (Bob Ayers)
  PIN?  Who needs a PIN? (Alan Wexelblat)
  Door Sensors and Kids (Eddie Caplan)
  Risks of misunderstanding probability and statistics (Tom Blinn)
  Why you can't "flip" bits on a WORM disc (Daniel Ford)
  Credit Checker & Nationwide SS# Locate (David Andrew Segal)
  Re: Authenticity in digital media (Pete Schilling)
  Re: multi-gigabuck information "theft" (Jeff Makey)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in
good taste, objective, coherent, concise, and nonrepetitious.  Diversity is
welcome.
* RISKS MOVES SOON TO csl.sri.com.  FTPable ARCHIVES WILL REMAIN ON KL.sri.com.
CONTRIBUTIONS to RISKS@CSL.SRI.COM, with relevant, substantive "Subject:" line
(otherwise they may be ignored).  REQUESTS to RISKS-Request@CSL.SRI.COM.
FOR VOL i ISSUE j / ftp KL.sri.com / login anonymous (ANY NONNULL PASSWORD) /
get stripe:risks-i.j ... (OR TRY cd stripe: / get risks-i.j ...)
Volume summaries in (i.j)=(1.46),(2.57),(3.92),(4.97),(5.85),(6.95),(7.99).

----------------------------------------------------------------------

Date: Wed, 15 Feb 89 16:45 EST
From: "Joseph M. Beckman"
Subject: "$15 Million Computer Dud Baffles Udall"

Summarized from the Washington Times (2-15-89):

The US Office of Surface Mining has spent some $15 million on a computer
system to prevent strip mine law violators from obtaining new permits.  The
GAO is calling it a failure.  The system apparently has a high error rate
because it uses lists of names and addresses that are not complete.

Arizona Democrat "Mo" Udall was quoted as saying "I'm really baffled.  We
have computer systems in this country to keep track of everything from
missiles to kindergarten kids who are sick or absent.  But the Interior
Department can't develop a system, even with the help of $15 million, to
keep violators out of the coalfields."

By using the phrase "missiles to kindergarten kids" he seems to imply that
systems are handling things as complex as missiles to as simple as...  Of
course, the fact that the subjects of the systems may be very complicated
says nothing about what the system is actually doing.

Joseph

------------------------------

Date: 15 Feb 89 09:38:31 PST (Wednesday)
From: Rodney Hoffman
Subject: Re: Computer blamed for 911 system crash -- more (RISKS-8.24)

On Saturday, 11 Feb 89, the Los Angeles city emergency 911 telephone system
crashed twice.  The initial story, summarized in RISKS 8.24, blamed "a power
failure in the computer's signalling mechanism."  The `Los Angeles Times'
(14 Feb 89) carried a follow-up story by Frederick M. Muir and Paul Feldman,
with the following new information.

The crash was caused by one power converter board, an SL/1 positron switch,
that helps control power fed to a complex switching system.  It failed for
still unknown reasons.  It's an off-the-shelf part that involves a low
degree of technology and sells for $1000, according to Pacific Bell service
manager Mike Fink.  The switch receives incoming 911 calls and routes them
virtually instantaneously to the first open phone line.  "Fink said the
board that failed is usually so reliable and simple that no backup was
designed into the system.
It is virtually the only part of the system -- which cost $1.6 million to
install -- without a backup."

Asked for past failure statistics, Pacific Bell and General Telephone, which
between them operate hundreds of 911 systems across California, reported
only two other failures in the past two years, neither of which was linked
to the part that failed Saturday.

------------------------------

Date: Wed, 15 Feb 89 09:29:23 PST
From: Paul Blumstein (paulb@ttidca.tti.com)
Subject: Computer blamed for 911 system crash -- more (Re: RISKS-8.24)

... The Los Angeles 911 system has had continual overload problems since its
inception because it was expected that only 30-40% of emergency callers
would use the system.  The actual number turned out to be 75%.  In addition,
the system has received a large number of non-emergency calls.  The overload
has caused a several-minute delay during peak periods before a 911 operator
could be reached.

Paul Blumstein, Citicorp/TTI, Santa Monica, CA
{philabs,csun,psivax}!ttidca!paulb   or   paulb@ttidca.TTI.COM

------------------------------

Date: Mon, 13 Feb 89 12:59:53 PST
From: ayers@src.dec.com (Bob Ayers)
Subject: Selling who-called-the-800-number data

Those who liked the idea of states selling driver info will really love this
one.  As reported in the 20 February Forbes magazine, a new company,
Strategic Information Inc ...

   will collect, analyze and resell information on everything from retail
   prices in grocery stores to the premiums charged by insurance companies
   ... [it] intends to offer custom tailoring of such data to meet the needs
   of individual clients ...  One feature, available this spring through a
   160-million-name database that Strategic recently purchased, will be
   marketed to companies with toll-free phone lines: For a fee, the
   companies can check the origins of any calls they receive through 800
   numbers -- even those that don't go through -- enabling them to target
   the dialers for follow-up mailings or sales pitches.

------------------------------

Date: Wed, 15 Feb 89 10:39:44 CST
From: wex@radiant.csc.ti.com (Alan Wexelblat)
Subject: PIN?  Who needs a PIN?

Last night I had a rather frightening experience with my bankcard.  Using
one of the network of machines which is supposed to accept my card, I tried
to make a withdrawal.  The machine accepted my card, printed a message on
its screen saying "Hello Alan Wexelblat, welcome to " and gave me the
standard menu of options (withdraw, deposit, transfer, balance).  At no time
did it ask for my PIN.

I didn't notice this until I had already tried to make a withdrawal.  The
transaction was denied because my bank's computer was down, but the
implication is really fairly scary: anyone with my card can walk up to this
machine and get $400 of my money without either him or me doing anything to
compromise the "highly private" PIN.

I'm not sure if this is the normal mode of operation for this bank's
machines, or was a peculiar isolated failure, or was due to a system-wide
fault in the operating software.  Anyone else had a similar experience?
(The machine maker is Diebold; does anyone know if all machines by a given
manufacturer run the same software?  I've used other Diebold machines before
and never had anything like this happen.)

--Alan Wexelblat, TI Application Tools, Austin, TX

   [If you bank in a Diebold Cave, you say "NO PIN, OH SESAME!"
    PGN]

------------------------------

Date: Wed, 15 Feb 89 16:42:36 EST
From: eddie.caplan@H.GP.CS.CMU.EDU
Subject: Door Sensors and Kids

While reading back issues of RISKS, I ran across the discussion here about
automatic sensors for controlling doors.  This made me recall that when we
would bring our 2-year-old son into work, he was not tall enough to trip the
electronic eyes on the elevator doors.  Consequently, we always had to be
sure to hold the door until he passed through or he would get bonked.  The
doors never closed hard enough to cause him any serious damage, but that's
the RISK of the doors' hardware working properly.

------------------------------

Date: 15 Feb 89 08:37
From: blinn%dr.DEC@decwrl.dec.com (Dr. Tom @MKO, CMG S/W Mktg, DTN 264-4865)
Subject: Risks of misunderstanding probability and statistics

As a person who has earned a doctorate in statistics, with emphasis on its
practical applications (although I no longer work in that field), I have
been both amused and appalled by some of the recent contributions focusing
on probabilistic and statistical analysis of the risks of aircraft engine
failures.

Some of the contributions assume, for example, that there really is such a
thing as "the probability that one engine will fail", and that therefore you
can compute the probability that two engines will fail (assuming that the
failures are independent) by simply squaring this "p".  This is such an
incredibly simplistic way of looking at the problem that I'm amazed that
anyone would offer it for consideration.

Clearly, on any given aircraft, the engines share some subsystems in common;
for example, they draw fuel from a common supply, possibly with a common
fuel pump, possibly using two or more independent pumps.  Certain failures
in the common subsystems could cause both (or all) engines to fail.

On the other hand, the engines have other subsystems that are not shared.
While these unique subsystems may have been equivalent (and thus, have a
common propensity to fail) at the time of manufacture, they almost certainly
are not equivalent after any period of maintenance in the field.
Consequently, even if we disregard the failures of common subsystems, the
remaining engines almost certainly don't share a common probability of
failure.  Assuming they do can be an interesting and useful stratagem for
thinking about joint probability of failure, but it's a dangerous
oversimplification.

In RISKS-FORUM Digest Volume 8 : Issue 24, it is asserted by Barry Redmond
that

>If someone makes a mistake on one engine at any of these times, there is a
>high probability that they will make the same mistake on the other engine(s).

That may be true, but it may not be, because the same person may not be
working on all the engines.  I would agree that an incompetent mechanic
working on all the engines is likely to make the same mistakes on all of
them, but the reality of aircraft engine repair is different.

>The probabilities of failure are not independent because if one engine fails
>it immediately increases the probability of another failing.

This is a very interesting assertion.  It seems to be saying that there is a
causal relationship between a first engine failure and the likelihood of a
second.
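As a toy illustration of the shared-subsystem point, here is a small Python
sketch with invented numbers (they are not drawn from this contribution or
from any aircraft data): two engines that can each fail on their own, or
fail together because of a fault in a shared subsystem such as the fuel
supply.

    # Toy model with invented numbers: each engine can fail independently,
    # or both can fail together because of a shared-subsystem fault.
    p_indep  = 1e-4   # hypothetical chance an engine fails on its own
    p_common = 1e-5   # hypothetical chance of a shared-subsystem fault
                      # (e.g., fuel supply) that takes out both engines

    # Probability that a given engine fails (its own fault or the common one):
    p_one = p_indep + p_common - p_indep * p_common

    # Naive estimate of both failing, assuming independence:
    p_both_naive = p_one ** 2

    # Probability both fail in this model: the common fault occurs, or
    # (if it doesn't) both independent faults occur.
    p_both = p_common + (1 - p_common) * p_indep ** 2

    # Conditional probability the second engine fails, given the first did.
    # Note: this can rise without any causal link between the engines.
    p_second_given_first = p_both / p_one

    print(f"P(one engine fails)       = {p_one:.3e}")
    print(f"naive P(both fail) = p^2  = {p_both_naive:.3e}")
    print(f"model P(both fail)        = {p_both:.3e}")
    print(f"P(2nd fails | 1st failed) = {p_second_given_first:.3e}")

With these invented numbers the shared fault dominates: the squared figure
understates the joint risk by roughly three orders of magnitude, and the
conditional probability of a second failure given a first is far higher than
the unconditional figure, even though nothing in the model makes one
engine's failure cause another's.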
Now, I would agree that if I were on an aircraft where one engine had just
failed, I'd worry lots more that a second would fail as well than I usually
would worry about engine failure when no engines had failed, but this
doesn't mean that the probability of failure of the other engines has
changed in any way.  (It also doesn't mean that it hasn't, and if it has
changed, it could be less or greater.)

It's unfortunate that a thorough grounding in probability theory and in
statistical inference (and in risk analysis) isn't a part of the technical
curriculum.  Failures happen.  They usually are not independent.  Knowing
how to analyse the risks of failures can help in making the tough decisions
about where to put resources to "prevent" or "protect against" failures.

Tom

Dr. Thomas P. Blinn, Marketing Consultant, Application Platforms,
U. S. Channels Sales, Digital Equipment Corporation,
Continental Blvd. -- MKO2-2/F10, Merrimack, New Hampshire 03054

Opinions expressed herein are my own, and do not necessarily represent those
of my employer or anyone else, living or dead, real or imagined.

------------------------------

Date: Wed, 15 Feb 89 11:23:41 EST
From: Daniel Ford
Subject: Why you can't "flip" bits on a WORM disc

Some contributors have noted that there are risks in trusting the integrity
of data stored on indelible storage devices such as WORM type optical discs.
These types of devices are often employed to store archival data that is
never legitimately altered (bank records, school transcripts, transaction
logs, etc.).

There seem to be two risks in trusting this technology.  The first is "How
can you be sure that the disc you are reading is the original and not some
altered copy?" and the second is "How can you be sure that some bits have
not been 'flipped' by overwriting a disc sector with a new value that
happens to burn a pit in the right spot?"  The first concern is valid, but
the second is not.  There are two reasons for this.

Firstly, each sector on a WORM disc (and on other types of optical discs) is
protected with a sophisticated error correction code.  These codes are very
robust and are used because the very high storage densities of optical discs
tend to give them correspondingly high error rates.  So, if a bit (or
several) was somehow "flipped", the ECC would either "correct" the change or
report a read error.

The second reason has to do with how data is actually encoded on the disc
surface.  Contrary to what might first be thought, "pits" (the holes) and
"lands" (space in a track between pits) do not correspond directly to 1's
and 0's.  Rather, their lengths and transitions form a sequence that encodes
the data.  Many codes have been developed, but a common one is NRZM
(non-return-to-zero mark).  Basically, in this code the lengths of both pits
and lands record sequences of 0's, and the transitions between the two
record individual 1's.  Certain minimum and maximum lengths of pits and
lands must be respected for clocking and detection purposes.  In such a
scheme, you cannot just flip one bit (by making a pit longer); you must flip
two or more.

So, even if you could get past the ECC, it would be quite difficult to get
something specific and meaningful (i.e., not some weird control character in
the middle of someone's name) by overwriting a WORM disc sector.  Further,
each sector overwrite will also overwrite the ECC and change its encoded
value, which is burned into the disc along with the data, to some other
value.
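A minimal Python sketch of that idea, with a toy transition code and a CRC
standing in for the far stronger sector ECC (both are simplifications; this
is not the channel code or ECC of any particular drive):

    import zlib

    # Toy "NRZ-M style" channel: a 1 is recorded as a transition between
    # pit and land, a 0 as no transition.  A CRC stands in for the ECC.

    def encode(bits):
        """Return the pit/land pattern (1 = pit, 0 = land) for the data bits."""
        level, channel = 0, []
        for b in bits:
            if b == 1:              # a 1 is a transition
                level ^= 1
            channel.append(level)
        return channel

    def decode(channel):
        """Recover data bits by comparing each cell with the previous one."""
        bits, prev = [], 0
        for level in channel:
            bits.append(1 if level != prev else 0)
            prev = level
        return bits

    data = [0, 1, 1, 0, 0, 0, 1, 0]
    channel = encode(data)
    checksum = zlib.crc32(bytes(data))      # burned in alongside the data

    # "Overwrite": on a WORM you can only turn a land into a pit, never back.
    tampered = channel[:]
    tampered[4] = 1                         # burn one extra cell

    recovered = decode(tampered)
    changed = [i for i, (a, b) in enumerate(zip(data, recovered)) if a != b]
    print("data bits changed:", changed)    # two bits change, not one
    print("stored checksum still matches?",
          zlib.crc32(bytes(recovered)) == checksum)   # False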
As such, it is unlikely that the ECC and the sector contents will remain
consistent after an overwrite (giving subsequent read errors).  It would be
much easier to forge a disc and substitute it for the real thing than to try
to alter the original.  But safeguards against that can be developed as
well.

Dan Ford

   [Thanks for the elaboration.  But remember that even if you have an
   N-error detecting code, many (N+1)-bit falsifications will go undetected.
   Similar problems exist with ECC.  PGN]

------------------------------

Date: Wed, 15 Feb 89 16:52:28 EST
From: dasegal@brokaw.LCS.MIT.EDU (David Andrew Segal)
Subject: Credit Checker & Nationwide SS# Locate

A member of my research group received the following "comforting"
advertisement in the mail (comments in [] are my editorial remarks...):

   CREDIT CHECKER & NATIONWIDE SS#-LOCATE just got BETTER!

   PROFESSIONAL CREDIT CHECKER has always offered:
   * Consumer Credit Reports from thousands of credit sources
     coast-to-coast.
   * Social Security Number tracing anywhere in the country.
   * Driver's License reports from every state but Massachusetts
     [See Risks 8.20]
   * Financial reports on over 9,000,000 businesses all across the USA.

   and now, PROFESSIONAL CREDIT CHECKER offers an exciting NEW service:
   [oh, boy]

   NATIONAL ADDRESS/IDENTIFIER UPDATE!
   -----------------------------------
   With NATIONAL ADDRESS/IDENTIFIER UPDATE you can enter either a name and
   address or a Social Security Number.  The Network will search all over
   the nation and get a complete report back to you in just seconds!  You
   can get such information as all current names, aliases, social security
   numbers and/or variances, date of birth, present and past employers and
   past and/or present addresses.  You can find people anywhere in the
   country without having to access a full Credit Report.

   No permissible purpose under the Federal Law is required to run NATIONAL
   ADDRESS/IDENTIFIER UPDATE...and NO RECORD of an inquiry will be logged on
   the consumer's credit report! ...
   [Boy, it isn't illegal and no one will ever know you invaded their
   privacy!]

   -----END OF ADVERTISEMENT------

I think the ad says it all.

David Andrew Segal, MIT Laboratory for Computer Science

   [And don't forget the on-line National Credit Information Network
   mentioned in RISKS-8.11.  PGN]

------------------------------

Date: 15 Feb 89 14:51:00 EST
From: "ALBTSB::SCHILLING1"
Subject: Re: Authenticity in digital media (RISKS-8.25)

Seeing hasn't been believing for a long time.  Remember Fred Astaire dancing
on the ceiling in the movie "Royal Wedding"?  And the newsreel footage
showing Hitler dancing a little jig in front of the Eiffel Tower after the
French surrender in WWII was a good piece of 1940 film editing, not an
accurate motion picture.  Counterfeit paintings in the style of well-known
artists have been around for at least four hundred years.  The Shroud of
Turin was recently found to date from the 13th instead of the 1st century
A.D.  Counterfeit coins were a problem in the Roman Empire.

Computers haven't cut us off from history.  They just provide new tools with
which human beings can fool one another.

Pete Schilling, Alcoa Laboratories

------------------------------

Date: 14 Feb 1989 2127-PST (Tuesday)
From: Jeff Makey
Subject: Re: multi-gigabuck information "theft"

In RISKS DIGEST 8.23 Mark Brader paraphrases a recent article from the
Toronto Star:

>A password belonging to [a large Canadian] company was used to steal
>information which the company values at $4 billion (Canadian) ...

This report isn't news.
The "computer files" are nothing more than the source code for AT&T's UNIX
operating system, copies of which may be easily obtained for a license fee
on the order of a few thousand dollars -- a far cry from $4 billion.  I
suspect that AT&T's lawyers are at the root of this sensationalism.

Jeff Makey

------------------------------

End of RISKS-FORUM Digest 8.26
************************