From <@vm42.cso.uiuc.edu:owner-cudigest@VMD.CSO.UIUC.EDU> Mon Jul 4 00:02:35 1994
Date: Sun, 3 Jul 1994 22:36:00 CDT
Reply-To: TK0JUT2@MVS.CSO.NIU.EDU
Sender: CU-DIGEST list <CUDIGEST%UIUCVMD.bitnet@vm42.cso.uiuc.edu>
Subject: Cu Digest, #6.60
To: Multiple recipients of list CUDIGEST <CUDIGEST%UIUCVMD.bitnet@vm42.cso.uiuc.edu>

Computer underground Digest    Sun  June 30, 1994   Volume 6 : Issue 60
                           ISSN  1004-042X

       Editors: Jim Thomas and Gordon Meyer (TK0JUT2@NIU.BITNET)
       Archivist: Brendan Kehoe
       Retiring Shadow Archivist: Stanton McCandlish
       Shadow-Archivists: Dan Carosone / Paul Southworth
                          Ralph Sims / Jyrki Kuoppala
                          Ian Dickinson
       Coptic Idolator: Ephram Shrewdlieu

CONTENTS, #6.60 (Sun, June 30, 1994)

File 1--Open Letter to Veep Al Gore in re New Computer Standard
File 2--PDC'94 CFP-Artifacts session (revised)
File 3--ACM Releases Crypto Study

Cu-Digest is a weekly electronic journal/newsletter. Subscriptions are
available at no cost electronically.

CuD is available as a Usenet newsgroup: comp.society.cu-digest

Or, to subscribe, send a one-line message:  SUB CUDIGEST  your name
Send it to LISTSERV@UIUCVMD.BITNET or LISTSERV@VMD.CSO.UIUC.EDU
The editors may be contacted by voice (815-753-0303), fax (815-753-6302)
or U.S. mail at:  Jim Thomas, Department of Sociology, NIU, DeKalb, IL
60115, USA.

Issues of CuD can also be found in the Usenet comp.society.cu-digest
news group; on CompuServe in DL0 and DL4 of the IBMBBS SIG, DL1 of
LAWSIG, and DL1 of TELECOM; on GEnie in the PF*NPC RT libraries and in
the VIRUS/SECURITY library; from America Online in the PC Telecom
forum under "computing newsletters;" on Delphi in the General
Discussion database of the Internet SIG; on RIPCO BBS (312) 528-5020
(and via Ripco on internet); and on Rune Stone BBS (IIRGWHQ)
(203) 832-8441. CuD is also available via Fidonet File Request from
1:11/70; unlisted nodes and points welcome.

EUROPE:  from the ComNet in LUXEMBOURG BBS (++352) 466893;
         In ITALY: Bits against the Empire BBS: +39-461-980493

  UNITED STATES:  etext.archive.umich.edu (141.211.164.18) in /pub/CuD/
                  ftp.eff.org (192.88.144.4) in /pub/Publications/CuD
                  aql.gatech.edu (128.61.10.53) in /pub/eff/cud/
                  world.std.com in /src/wuarchive/doc/EFF/Publications/CuD/
                  uceng.uc.edu in /pub/wuarchive/doc/EFF/Publications/CuD/
                  wuarchive.wustl.edu in /doc/EFF/Publications/CuD/
  EUROPE:         nic.funet.fi in pub/doc/cud/ (Finland)
                  ftp.warwick.ac.uk in pub/cud/ (United Kingdom)
  JAPAN:          ftp.glocom.ac.jp /mirror/ftp.eff.org/

COMPUTER UNDERGROUND DIGEST is an open forum dedicated to sharing
information among computerists and to the presentation and debate of
diverse views.  CuD material may be reprinted for non-profit as long
as the source is cited. Authors hold a presumptive copyright, and
they should be contacted for reprint permission.  It is assumed that
non-personal mail to the moderators may be reprinted unless otherwise
specified.  Readers are encouraged to submit reasoned articles
relating to computer culture and communication.  Articles are
preferred to short responses.  Please avoid quoting previous posts
unless absolutely necessary.

DISCLAIMER: The views represented herein do not necessarily represent
            the views of the moderators.  Digest contributors assume all
            responsibility for ensuring that articles submitted do not
            violate copyright protections.
----------------------------------------------------------------------

Date: Thu, 23 Jun 1994 17:12:16 -0500 (CDT)
From: Wade Riddick <riddick@JEEVES.LA.UTEXAS.EDU>
Subject: File 1--Open Letter to Veep Al Gore in re New Computer Standard

        An Open Letter To Al Gore,
        Vice President of the United States of America

        A New Computer Standard:
        Fixing the Flats on the Information Highway

     The U.S. must manage the early adoption of industrywide
     standards that render emerging technologies compatible with
     each other and speed commercial acceptance. Such standards
     make it easier for purchasers to experiment with equipment
     embodying new technology and reduce the risk of committing
     to a technology that quickly becomes obsolete . . . In the
     U.S., technological standards are set with little regard to
     such issues. Large companies or government agencies set de
     facto standards... Unfortunately, none of these sources of
     standards has explicit responsibility for managing the
     standards process to best promote a new technology.
                                         - Robert Reich1

One important roadblock often missed by policymakers as they work to
lay the foundations of the information superhighway is the
incompatibility that exists among the operating systems and
microchips that will form the highway's roadbed. When the Clinton
Administration opened the telecommunications industry to competition,
its goal was not to limit consumer choice, but rather to broaden it
by weakening narrow, monopolistic controls over technology and
allowing small private companies to move technology in many different
directions. None of this will be possible without a common standard
that allows these diverse innovations to interact. Just as the
national economy needs a common currency and a common language in
which to conduct business, so too does the information superhighway
need a standard through which its components can interact.

Since the development of the U.S. Department of Defense's Advanced
Research Projects Agency Network (ARPANET) in the 1960s, the federal
government has done an admirable job establishing network protocols,
the rules needed for seamless long-distance data transmission between
computers. Without such standards, today's international computer
network, known as the Internet, would not exist.

The U.S. government, however, has not done a good job of
standardizing the basic commands needed to operate computers -- the
languages, compilers, operating systems and other instructions
governing the microprocessor (the central processing unit, or CPU,
that serves as a computer's "brain"). These forms of programming
instructions are the most valuable types of electronic data because
they tell computers how to handle information. If an application
(program) can be transmitted between two different computers but
cannot run on both machines -- the current norm in the industry --
the application's value is limited.

Companies like Apple, IBM, Microsoft, Intel and Novell have little
incentive to create truly open or common standards for operating
systems or microchip instructions because each company in one way or
another competes successfully on the basis of differences in its
products.
Proprietary standards (where all rights to the standard are retained
by one firm) are one way these companies can protect their research
and development (R&D) costs from reengineering by competing firms.2

The Problem

Just as the mercantilist nations of the last century forced their
currency on their colonies and used tariff barriers to discourage
trade with other powers, computer makers in the twentieth century
have set standards governing the internal commerce of their products
to the detriment of the competition.3 In the same way that
19th-century Britain bucked the mercantilist trend, maintained a free
trading regime, and lost ground to "freeloading" traders as a result,
IBM defined an open PC standard and bore the costs of maintaining it
while clone makers got a free ride. With no need for heavy R&D
expenses, these companies could undercut IBM's prices by a
significant margin.

In the past, proprietary standards have acted as unfair exchange
standards, making it unnecessarily expensive for consumers to move
their investments in data -- and particularly software -- from one
platform (operating system) to another. This deters investment, just
as the asset-trapping nature of a command economy or a
non-convertible currency deterred foreign investment in Eastern
Europe for many years. Consumers have started demanding more
compatibility between systems, but companies have been slow to react.
As _The Economist_ put it, "every firm wants a monopoly -- and every
firm wants to call it an open standard."4

Recently, corporations have begun establishing interfirm alliances to
allow their systems to support "multiple personalities" (multiple
operating systems). Future IBM computers will be able to run Mac
software, while Apple's new PowerPC will run Windows and OS/2, thanks
to the use of translation and emulation software.5 John Sculley, the
ex-CEO of Apple, points out in _Defying Gravity_ that computer
designs can no longer be based solely on the engineers' experience of
using the system. No one company has the business expertise to design
an entire system in a world where more diverse products have to be
brought to market faster than ever. That speed requires higher levels
of coordination, cooperation and standardization between companies.

The current proliferation of cross-licensing agreements falls short
of a universal standard. The incentive to sell incompatible platforms
is still there; companies have simply decided to rely for their
profits on translation software of their own making, called
microkernels, instead of on full-blown operating systems. They have
failed to break the operating system into individual components that
can be built by different companies according to comparative (instead
of historical) advantage. Someday, as happened with railroads and
automobiles, a standard for interchangeable software parts will
emerge, either through government intervention or through the natural
evolution of a monopoly out of the market.6 This monopoly will,
however, require government regulation at some point to prevent
abuse, as was necessary with the railroad and telephone empires. It
is often forgotten why, how, and at what cost the national railroads
were unified.
According to John Browning, "like railroads, new information networks
are increasingly being built in large, monolithic chunks, starting
from the long distance links and working down to the local one."7
Long-distance links were the last part of the national rail system to
be built, because it took an immense effort to integrate incompatible
regional networks -- particularly in the South, where there were only
spur lines.8 In fact, railroads, highways and even computers9 have to
a certain extent been built up regionally with government stimulus
and later coordinated through national structures. Regional and local
monopolies had to be granted so that proposed standards would be
self-enforcing: where there is an incentive to compete, there is an
incentive to deviate from the standard and affect the distribution of
market share.

Railroads were easy to standardize because the tracks were originally
built with iron rails that wore out quickly. Tracks had to be rebuilt
often, so it was not difficult -- given adequate financial incentive
-- to rebuild the gauges to a particular width.10 The advent of
steel, because of its durability, might actually have threatened this
standardization. Fortunately, just as steel was replacing iron in the
1870s and '80s, local railroad companies came together in regional
alliances to standardize gauges and policies for transcontinental
shipping, ending decades of chaos in the industry. These alliances
greatly reduced costs to the consumer and spurred investment in new
railroad technology.

Some railroad companies concerned with standardization feared the
emergence of a monopoly and tried to preserve their independence by
confederating. They borrowed from the American federalist model of
government to create their own tripartite government with a
legislative assembly, an executive branch, and a judiciary for
settling disputes. This structure balanced competing regional
interests against one another and produced an efficient, egalitarian,
state-of-the-art continental transportation system.11 Since the
governing convention created by these small cartels neither included
all rail companies nor addressed all of the public interest, it
collapsed when Jay Gould and others began forming large
conglomerates. New, antidemocratic giants emerged, which Congress
then stepped in to regulate.

Either through market evolution or government intervention, such a
standardization of CPUs and operating systems is inevitable.
According to _The Economist_, the computer industry is rapidly
becoming "a commodity business"12 with all the accompanying
industry-wide conventions. This is occurring in an industry producing
goods with the highest intellectual property content in history
(hardly characteristic of most commodities). It is possible for
government to move in now, avoid further costs of incompatibility and
establish a forward-looking, flexible standard that will preclude the
development of a monopoly and reshape the way value is created in the
software industry. In the process, the hyper-competitive aspects of
the computer industry that have served society so well could be
preserved. As the National Performance Review prescribes, government
can set clear goals and act as a catalyst while allowing private
actors to move the ball down the field. Because of the peculiar
nature of information, such a standard need not be autocratic, nor
would setting one be risky.
The Japanese and European efforts to set High-Definition Television
(HDTV) standards flopped because they locked industry into analog
hardware before superior digital technology was ready. Immature
technologies have never been successfully pushed on society. The
software industry has almost the opposite problem -- not so much
inventing the wheel or prematurely setting it in stone as constantly
having to reinvent it (in order to run applications under different
systems).13

A computer's instructions are vastly different from the ordinary
objects that come to mind when standards are discussed. The
instructions CPUs use are virtual; they are not materially dependent
on any particular piece of hardware. As symbols, they can always grow
and be reinterpreted, unlike manufactured products such as metal
pipe, whose dimensions cannot be changed once cast. Corporate
planners, long resistant to the adoption of a standardizing
framework, are beginning to see the adaptability of computer code as
an advantage upon which a new standard could be based. As the senior
technical editor of _BYTE_ put it, "the battle is no longer about
whether to layer object-oriented services and emulation systems . . .
on a small kernel . . . nor whether to build an operating system in
this style but how to do the job right."14 The remaining problem is
one of coordination between corporations in getting these new systems
to work together.

The Solution

The essential features of such a system are easily described. The
system could be called DNA, after its biological counterpart, which
binds all organic matter into the same competitive framework. While
object orientation15 -- the pairing of commonly used types of data
with the instructions needed to manipulate that data -- makes data
transportable and software highly extensible *within* a platform, DNA
would make the operating system and processor themselves object
oriented, so that both data *and* software would be transportable
across platforms. In other words, when a processor receives a
standard DNA message telling it to do something like add two numbers
or draw a line, it will have a library available to translate the
instruction into the host language of that particular processor (a
sketch of such a library appears below). Under this system, it would
be up to the CPU's manufacturer to supply the most basic translation
libraries, but other firms could supply add-ons or extensions for
functions too complex for the CPU to execute.

This way, market competition could be used to set standards for new
forms of data, instead of having the government mandate standards for
immature technologies. A company marketing a product which uses a
completely novel form of data -- say, a device for producing certain
odors16 -- would have an opportunity to create its own standard by
marketing a new extension for the DNA system. A competitor might also
market a similar plug-in, and both companies could compete to gain
supporters for their mini-standard. In the end, the best solution
would likely win out. Companies would not have to worry about
maintaining compatibility with an existing base because no previous
software could produce odors. The uniform interface of DNA would
allow individual firms to use their expertise to replace inefficient
system components easily, thereby broadening the market for their
products.
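To make the translation-library idea concrete, here is a minimal
sketch in C of how such dispatch might work. It is illustrative only:
the names (dna_op, dna_handler, dna_register and so on) are invented
for this letter, and a real standard would have to specify the actual
opcode set, argument encoding and calling conventions.

  /* A hypothetical sketch of a DNA-style translation library.
   * Portable "DNA" opcodes are dispatched through a table of
   * handlers supplied by the CPU maker; vendors can register
   * extension handlers for data types the base standard omits. */

  #include <stdio.h>

  typedef enum {
      DNA_ADD = 0,       /* add two numbers                        */
      DNA_DRAW_LINE,     /* draw a line on the display             */
      DNA_RESERVED_ESC,  /* reserved: switch to a future standard  */
      DNA_MAX_OPS = 256  /* room left for vendor extensions        */
  } dna_op;

  /* Each handler translates one portable instruction into
   * whatever the host CPU and operating system actually provide. */
  typedef void (*dna_handler)(const void *args);

  static dna_handler dispatch[DNA_MAX_OPS];

  /* Base handlers shipped by the CPU manufacturer. */
  static void host_add(const void *args) {
      const int *p = args;
      printf("%d + %d = %d\n", p[0], p[1], p[0] + p[1]);
  }
  static void host_draw_line(const void *args) {
      (void)args;
      puts("(native line-drawing routine would run here)");
  }

  /* Extension mechanism: any vendor may claim an unused opcode,
   * e.g. for a hypothetical odor-producing device. */
  int dna_register(dna_op op, dna_handler h) {
      if (op >= DNA_MAX_OPS || dispatch[op] != NULL)
          return -1;      /* opcode taken: competing mini-standards */
      dispatch[op] = h;
      return 0;
  }

  /* Executing a DNA message is one table lookup, then native code. */
  void dna_execute(dna_op op, const void *args) {
      if (op < DNA_MAX_OPS && dispatch[op] != NULL)
          dispatch[op](args);
  }

  int main(void) {
      dna_register(DNA_ADD, host_add);
      dna_register(DNA_DRAW_LINE, host_draw_line);

      int nums[2] = { 2, 3 };
      dna_execute(DNA_ADD, nums);      /* runs the host's add routine */
      dna_execute(DNA_DRAW_LINE, NULL);
      return 0;
  }

The per-instruction table lookup is the "slight performance
degradation" discussed below, and the reserved opcode is one way the
standard could leave itself an escape hatch for a successor.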
If DNA contained a standard driver for reading keyboard input, for
example, and someone wanted to market a new voice-recognition device
that would be compatible with past software, that company could make
a substitute for the keyboard interface that instead uses the firm's
voice-recognition hardware (see the sketch at the end of this
section). DNA would increase the marketability of the
voice-recognition device, because customers could buy the physical
device without having to upgrade their entire software library.
According to _The Economist_, "today all firms need a niche"17 in the
computer market -- and universal standards can provide the necessary
framework. DNA would not pick winners, but would instead make it
easier for winners to emerge. Systems would be built component by
component on the basis of efficiency, rather than through political
or alliance considerations. Much DNA code may have to be interpreted
on each platform, but with a common object code standard each
platform could do this in the most efficient manner. If the
standard's basic design is flawed or technology passes it by (since
technology moves faster than anyone's capacity to plan ahead),
certain instructions could be reserved in advance to switch to a
completely new, but as yet unspecified, standard.

In the past, companies have objected to the slight performance
degradation caused by interpretation. Yet the Macintosh has been
successful precisely because of the huge "toolbox"18 of standard
commands it makes available to applications. Because programs "call"
these functions in the system, instead of carrying them in the
application itself, Apple has managed to reduce program size and
smoothly maintain the system's evolutionary growth path. Apple's new
PowerPC is the first example of a "multiple personality" PC capable
of running under more than one operating system. The PowerPC uses a
new platform and microprocessor, the 601. To run old software, which
is written for the 68000 microprocessor, the PowerPC interprets that
code and translates it for the 601. Reinterpreting the old 68000
instructions slows things down, but by rewriting the toolbox to run
on the faster new 601, Apple makes up for the loss. Users see no
performance degradation with old software and tremendous gains with
new software. Most of Apple's competitors are planning similar
interpretation schemes for their new systems.

Since an open standard requires some sort of monopolistic power, it
is clear that if DNA is implemented, companies will no longer profit
from the creation of monolithic operating systems. The way value is
created in the software and hardware industries would be radically
altered under DNA, as shown in Figure 1 -- but who wants to make
money reinventing the wheel? Real money is made on the cutting edge
of technology, and this technological advancement should continue to
be driven by the free market. U.S. policymakers must think seriously
now about how to keep American industries globally competitive for
the next fifty years. By 2040, no software power will make money
reinventing the wheel.

In a world where microprocessor architectures are proliferating
instead of unifying, and where technical progress is speeding up in
all areas of science, a DNA-type standard is needed, if for no other
reason than to coordinate the diffusion of technical expertise. Only
by making new technology generic, so that a user can plug it in and
go, will the learning curve needed to use new technologies
efficiently be conquered. Technology transfer needs to become more
automatic.
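Returning to the keyboard example above, here is a minimal sketch,
again in C with invented names (input_driver, dna_set_input,
dna_read_key), of how one standard interface would let a
voice-recognition module stand in for a keyboard driver without
touching the software above it.

  /* A hypothetical sketch of DNA-style component substitution.
   * Old software calls one standard input interface; a vendor's
   * voice-recognition module replaces the keyboard driver without
   * any change to the applications above it. */

  #include <stdio.h>

  typedef struct {
      const char *name;
      int (*read_key)(void);  /* returns the next character of input */
  } input_driver;

  /* Stock keyboard driver supplied with the base system. */
  static int keyboard_read_key(void) { return getchar(); }
  static input_driver keyboard = { "keyboard", keyboard_read_key };

  /* A vendor's replacement: same interface, different hardware.
   * Here the "recognizer" is faked with a canned string. */
  static int voice_read_key(void) {
      static const char *speech = "hello\n";
      static int i = 0;
      return speech[i] ? speech[i++] : EOF;
  }
  static input_driver voice = { "voice", voice_read_key };

  /* The single driver slot every application reads through. */
  static input_driver *current = &keyboard;
  void dna_set_input(input_driver *d) { current = d; }
  int  dna_read_key(void)             { return current->read_key(); }

  int main(void) {
      dna_set_input(&voice);       /* the "upgrade" is one call      */
      int c;
      while ((c = dna_read_key()) != EOF)
          putchar(c);              /* old code, new input device     */
      return 0;
  }

Because the application never learns where its characters come from,
the customer's existing software survives the hardware upgrade; the
same pattern would generalize to the disk, screen and memory
components shown in Figure 1. This is the automatic technology
transfer described above.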
Many writers, James Dearing among them, have thought of technology
transfer as a "difference-reduction"19 problem -- one of trying to
get users and inventors to share the same knowledge about an
invention, so that the person in the field knows how to apply it as
well as the inventor does. In fact, really useful technology gets put
to uses never dreamed of by its inventors. The problem is how to
insulate the information needed to use a new technology from the
knowledge of how it works, which confuses most consumers.

The historical trend in U.S. technological development is clear:
either government or industry will eventually take steps to stop this
continual rebuilding of operating systems from the ground up. The
real issue to be decided in the telecommunications debate is not who
owns the virtual asphalt or builds the on-ramps. The question is who
will own the resulting computer standard governing the packaging of
information. Any firm which wins control will have a power not unlike
the government's ability to print money: the firm will control the
currency of day-to-day electronic transactions. This fact is becoming
increasingly apparent and important to policymakers. According to
Admiral Bobby Inman and Daniel Burton, "arcane topics like technical
standards . . . that once were viewed as the responsibility of
obscure bureaucrats will increasingly engage public officials at the
highest levels."20

There is already a consensus in the industry as to what features
computers will incorporate in the next decade. It is also clear that
some sort of standard for object code will emerge. Government,
though, has several options for the role it can play in this process:
(1) the Commerce Department, perhaps with some authorizing
legislation, could call industry heads together and order them to set
a common object code standard; (2) Commerce could accept bids from
various companies and groups for such a standard; or (3) the federal
government could itself craft a standard with the help of qualified
but disinterested engineers, and then try to force it upon the
industry through government procurement rules, control over the flow
of research and development money, or other economic levers.
Microsoft's recent court victory against Stac Electronics over
protection of its operating system indicates that some reform of the
intellectual property laws may be needed as well.

Given the acrimony in the current debate over the definition of a
much-needed encryption (data security) standard, it is difficult to
identify the most politically feasible path for policymakers to
follow in developing common object code standards. But there is
enough of a consensus in the industry and among users now to begin
the search for a solution. A serious effort should also be made to
reach a consensus with other industrialized nations, for computers
are globally interconnected to a degree that no other mass consumer
product has been. Government can prevent a monopoly if it moves now.
The unique nature of information technology would allow a common
standard to develop without locking the industry into risky, immature
technologies, and it would accelerate rather than hinder innovation.
According to Nicholas Negroponte, director of MIT's Media Lab, "an
open systems approach is likely to foster the most creative energies
for new services and be a vehicle for the most rapid change and
evolution."21 Such an approach would simply provide a stable
framework within which businesses could compete on the basis of their
expertise and not on their historical advantage. This is what
America's founding fathers designed federalism to do from the start:
balance competing sectoral and regional interests against one another
to spur competition and development for the benefit of all.

By Wade Riddick

Author Biography

Wade Riddick is a graduate student and National Science Foundation
Fellow in the Department of Government at the University of Texas. He
received his B.A. in English from Louisiana State University. He can
be reached at RIDDICK@JEEVES.LA.UTEXAS.EDU.

Figure 1

Traditional

  Microsoft Windows -> Disk / Screen / Memory / Audio / ... -> User
  IBM OS/2          -> Disk / Screen / Memory / Audio / ... -> User
  Apple Macintosh   -> Disk / Screen / Memory / Audio / ... -> User

Currently users have to pick one complete operating system to run.
__________________________________________________________________

New Systems

                                    / Microsoft Windows
  Microsoft Windows NT -> kernel  --  IBM OS/2           -> User
                                    \ Apple Macintosh

                                    / Microsoft Windows
  Apple/IBM PowerPC    -> kernel  --  IBM OS/2           -> User
                                    \ Apple Macintosh

In systems being introduced this year, users have to pick one
company's kernel and then another company's operating system(s).
___________________________________________________________________

DNA Common Standard

      Microsoft     Apple      IBM
          |           |         |
  Disk + Screen + Memory + ..... -> User

Under DNA, no one company will make *the* operating system.
___________________________________________________________________

Notes

1 Robert Reich, "The Quiet Path to Technological Preeminence,"