 1-Dec-85 16:15:03-PST,12776;000000000000
Mail-From: NEUMANN created at  1-Dec-85 16:13:36
Date: Sun 1 Dec 85 16:13:35-PST
From: RISKS FORUM (Peter G. Neumann, Coordinator)
Subject: RISKS-1.25
Sender: NEUMANN@SRI-CSL.ARPA
To: RISKS-LIST@SRI-CSL.ARPA

RISKS-LIST: RISKS-FORUM Digest  Sunday, 1 Dec 1985  Volume 1 : Issue 25

FORUM ON RISKS TO THE PUBLIC IN COMPUTER SYSTEMS
Peter G. Neumann, moderator

Contents:
  Some Thoughts on Unpredicted Long-Term Risks (Peter G. Neumann)
  Computer snafu halts treasury (Peter G. Trei)
  "Hacker" Game (Ken Brown; Keith F. Lynch; Werner Uhrig)

Summary of Groundrules: The RISKS Forum is a moderated digest.  To be
distributed, submissions should be relevant to the topic, technically sound,
objective, in good taste, and coherent.  Others will be rejected.  Diversity
of viewpoints is welcome.  Please try to avoid repetition of earlier
discussions.  (Contributions to RISKS@SRI-CSL.ARPA, Requests to
RISKS-Request@SRI-CSL.ARPA)  (FTP Vol 1 : Issue n from SRI-CSL:RISKS-1.n)

----------------------------------------------------------------------

Date: Sun 1 Dec 85 16:00:32-PST
From: Peter G. Neumann
Subject: Some Thoughts on Unpredicted Long-Term Risks
To: RISKS@SRI-CSL.ARPA

This is a short note on how little we understand -- or are willing to
admit -- about long-term effects (whether eventually very obvious effects
or long-hidden side-effects), and on how, even when a risk is known, it may
be suppressed.

The front page of today's NY Times (Sunday 1 Dec 85) has two articles on
Bhopal, one year after.  Stuart Diamond's article begins, "Medical studies
conducted in the year since the chemical leak ... indicate that the
chemical responsible for the accident causes serious long-term health
problems that were unknown before the disaster."  Furthermore, the Bhopal
problems appear to have been due not just to the pesticide ingredient
methyl isocyanate, but to an unforeseen chemical reaction that transformed
some of it into hydrogen cyanide.
(An antidote to the latter chemical was therefore not used until months
afterwards.)

In various past issues of the RISKS Forum we have noted risks such as
side-effects of pacemaker interference, auto microprocessor bugs, and
command and control computer problems -- perhaps ad nauseam to some of
you -- and the dangers of relying on a finite set of assumptions that
underlie proper system behavior.  I have also alluded occasionally to
lessons that we might learn from environmental risks such as toxic
substances in our food, drink, and environment; some of those risks were
known in advance but ignored -- e.g., for commercial reasons; others came
as "surprises" (thalidomide, for example), but probably represented a lack
of care and long-term testing.  In some cases the risks were thought of,
but considered minimal.  In other cases, the risks were simply never
considered.

At the beginning of the holiday season, this note merely adds a plaintive
cry for greater humility.  Science (especially computer science) does not
have ALL the answers.  Furthermore, the absence of any one answer (and
indeed ignorance of a question that should have been asked) can be
damaging.  But, as we see from the nature of the problems to date, some of
us too often keep our heads in the sand -- even after being burned once (or
many times).  Eternal vigilance is required of all of us.  Bureaucrats and
technocrats who say "don't worry, nothing can go wrong" must be exposed.
But technocrats who say "we can't do it at all" also need to be very
careful in their statements -- locally, anything is possible.  However, we
must remember that it is in global system integration and in operation
under conditions of stress that things tend to break down -- and also where
rational arguments tend to break down.

I am now finally back in California after 7 weeks on the road.  It is good
to be back (I think), but it is time to get RISKS rolling again.
I hope that this forum is helping to increase our awareness of the problems
and of what we can (and cannot) do to improve our computer systems and
their use.  But the useful perpetuation of RISKS -- and the application of
your knowledge in real systems -- depends on you.

                                                        PGN

------------------------------

Date: Fri 29 Nov 85 00:43:47-EST
From: Peter G. Trei
Subject: Computer snafu halts treasury
To: risks@SRI-CSL.ARPA

From the Wall Street Journal, Monday 25 November 1985
[quoted without permission]

   A Computer Snafu Snarls the Handling of Treasury Issues
   by Phillip L. Zweig and Allanna Sullivan
   Staff Reporters of the Wall Street Journal

NEW YORK -- A computer malfunction at Bank of New York brought the Treasury
bond market's delivery and payment systems to a near-standstill for almost
28 hours Thursday and Friday.

Although bond prices weren't affected, metals traders bid up the price of
platinum futures Friday in the belief that a financial crisis had struck
the Treasury bond market.  However, Bank of New York's problems appeared to
be more electronic than financial.

The foul-up temporarily prevented the bank, the nation's largest clearer of
government securities, from delivering securities to buyers and making
payments to sellers -- a service it performs for scores of securities
dealers and other banks.  The malfunction was cleared up at 12:30 p.m. EST
Friday, and an hour later the bank resumed delivery of securities.  But on
Thursday the bank, a unit of Bank of New York Co., had to borrow a record
$20 billion from the Federal Reserve Bank of New York so that it could pay
for securities received.  The borrowing is said to be the largest
discount-window borrowing ever from the Federal Reserve System.  Bank of
New York repaid the loan Friday, Martha Dinnerstein, a senior vice
president, said.

Although Bank of New York incurred an estimated $4 million interest expense
on the borrowing, the bank said any impact on its net income "will not be
material."
For the first nine months of this year, earnings totaled $96.7 million.
Bank of New York stock closed Friday at $45.125, off 25 cents from
Thursday, as 16,500 shares changed hands in composite trading on the New
York Stock Exchange.

Bank of New York said that it had paid the cost of carrying the securities
so that its customers wouldn't lose any interest.

Bank of New York's inability to accept payments temporarily left other
banks with $20 billion on their hands.  This diminished the need of many
banks to borrow from others in the federal funds market.  Banks use the
market for federal funds, which are reserves that banks lend each other,
for short-term funding of certain operations.  The cash glut caused the
federal funds rate to plummet to 5.5% from 8.375% early Thursday.

The electronic snafu is by far the largest of the computer problems that
have periodically bedeviled the capital markets.  Almost all government
securities transactions are settled electronically through the New York
Federal Reserve Bank.  In this system, the computers of the clearing banks
are linked to one another through a central computer, enabling banks to
settle purchases and sales of securities by customers.

According to Wall Street sources, the malfunction occurred at 10 a.m.
Thursday as Bank of New York was preparing to change software in a computer
system and begin the day's operations.  Until Friday afternoon, Bank of New
York received billions of dollars in securities that it couldn't deliver to
buyers.  The Fed settlement system, which officially closes at 2:30 p.m.,
remained open until 1:30 a.m. Friday in the expectation that technicians
would be able to solve the problem.

Rumors about bank problems often send commodity traders scurrying to buy
precious metals.  In the platinum pit at the New York Mercantile Exchange,
the price for January delivery surged $12.40 an ounce to $351.20 Friday on
volume of 11,929 contracts, a 29-year record.
Reports that the Fed was investigating transfer problems at Bank of New
York prompted the platinum buying.

[end of quotation]

I talked to a friend of mine who was peripherally involved in the recovery
from this 'snafu', and it seems that the primary error occurred in a
messaging system which buffered messages going in and out of the bank.  The
actual error was an overflow in a counter that was only 16 bits wide,
instead of the usual 32.  This caused a message database to become
corrupted.  The programmers and operators, working under tremendous
pressure to solve the problem quickly, accidentally copied the corrupt copy
of the database over the backup, instead of the other way around.

One thing I have often noticed is that the 'normal run' code of software
packages tends to get much more thorough testing than the code for error
recovery; not only is the latter more difficult to test, but the general
feeling of 'this code will never execute' demotivates programmers.  In this
case, it sounds as if the people at BONY never held a 'fire drill' to
figure out how to handle a corrupt primary database.

Does anyone else have examples where attempts at error recovery magnified
problems?

Peter Trei
oc.trei@cu20b

------------------------------

Date: Thu, 21 Nov 85 14:27:34 pst
From: decwrl!Glacier!oliveb!felix!birtch!ken@ucbvax.berkeley.edu (Ken Brown x254)
To: Risks@SRI-CSL
Subject: "Hacker" Game

--------------------
This is in response to Ted Shapin's article regarding the 'irresponsible'
game, HACKER (RISKS-1.23).
--------------------

The game HACKER is just a game.  It has nothing to do with trying to make
people (read: kids) break into a computer system.  I know.  I helped write
the IBM PC version of the game.  The name of the game, HACKER, refers only
to the initial 'screen' of the game, where one simulates trying to log on
to a system.  It is not a very realistic rendition (MY_VIEW) of an actual
logon sequence.  The rest of the game has NOTHING to do with 'hacking', as
I view the term.
However, I do agree with you about the packaging blurb regarding the "not
caring" attitude, but I am not the one who wrote that blurb.  The game is
strictly for entertainment purposes.  It will not teach people how to break
into remote (or local) systems.  Just about anything you type will get you
past the "security system."

Ken Brown [... !trwrb!scgvaxd!felix!birtch!ken]

These ramblings are my own, and probably do not reflect those of my
employer or fellow employees.

------------------------------

Date: Sat, 23 Nov 85 11:28:22 EST
From: "Keith F. Lynch"
Subject: Hacker game
To: BEC.SHAPIN@USC-ECL.ARPA
Cc: risks@SRI-CSL.ARPA

  [...]
    Date: Mon 18 Nov 85 11:54:52-PST
    From: Ted Shapin

    Activision HACKER
    Makes you feel like you've unlocked someone else's computer system! ...
    This "product" is socially irresponsible!  It leads young people to
    think breaking into unknown systems is OK.  The "world" they discover
    may be the world of the penal system!

I don't see what's wrong with this.  This is better than cracking for real,
and I doubt that anyone will learn any useful cracking techniques from this
game.  Do you also think that toy guns should be banned?  What about
Adventure, Zork, and Dungeons and Dragons, which teach people to kill and
to steal?  I think fantasy role-playing games are of great benefit.  They
give people of all ages a chance to 'get it out of their system' in a
harmless way.

                                                        ...Keith

------------------------------

Date: Sat 23 Nov 85 13:28:39-CST
From: Werner Uhrig
Subject: "Hacker" Game: Is game simulating security of *REAL* machines?
To: BEC.SHAPIN@USC-ECL.ARPA
cc: risks@SRI-CSL.ARPA

  [...]

I wouldn't be surprised if this game actually simulates the security
features (or lack thereof) of some real-life systems ...

... in which case, it's *REALLY* time to be alarmed.
On the other hand, this just might cause a lot of sites to decide to pay
attention to improving their security, or spur efforts that advance the
state of the art of security -- which wouldn't be that bad, when you think
about it.

Has someone with access to the game, and knowledge of the security features
of different minis/mainframes, checked this out yet?

------------------------------

End of RISKS-FORUM Digest
************************
-------