In the Beginning was the Command Line (1999)

Author: BerislavLopac

Score: 177

Comments: 56

Date: 2020-11-05 14:08:50

Web Link

________________________________________________________________________________

kmeisthax wrote at 2020-11-05 17:49:53:

"In other words, the first thing that Apple's hackers had done when they'd got the MacOS up and running--probably even before they'd gotten it up and running--was to re-create the Unix interface, so that they would be able to get some useful work done. At the time, I simply couldn't get my mind around this, but: as far as Apple's hackers were concerned, the Mac's vaunted Graphical User Interface was an impediment, something to be circumvented before the little toaster even came out onto the market. "

This is Unix revisionism. Most of the early development of the Macintosh is documented on folklore.org, and they certainly didn't rebuild Unix to fit on a Mac. They bootstrapped the Macintosh using the Lisa development environment, which itself was bootstrapped with Apple ][s. Unix was far too large and unwieldy for microcomputer hardware of the time, and Apple didn't have a license for it anyway. There are plenty of stories of early/hobbyist Mac buyers realizing their $2k computer had no development tools, calling up Apple, and being told that they'd need to buy $10k Lisa machines if they wanted to do real app development. MPW didn't come out until two years later.

(If you just wanted an easy-to-use development environment, Apple HAD worked on a GUI-capable BASIC for the Macintosh. But Bill Gates got wind of this and refused to renew their Apple ][ BASIC license unless they canned the project. Since no BASIC license meant no more Apple ][s, Apple caved, and the version of BASIC that Microsoft did ship on the Mac had no GUI support whatsoever. They would eventually ship HyperCard three years later, of course.)

arexxbifs wrote at 2020-11-05 19:42:47:

> This is Unix revisionism (...) they certainly didn't rebuild Unix to fit on a Mac

I think you're reading too much into this.

MPW, which was made and used by Apple developers and sold by Apple, featured a command-line interface, which the Mac otherwise didn't have. They implemented this because a command line is a powerful tool for software development, and most developers know this and will eventually want one. That's what Stephenson is saying here.

The Lisa Workshop, which was used for software development on the Lisa, was also a text-based interface.

> Unix was far too large and unwieldy for microcomputer hardware of the time

No it wasn't. Xenix was released for the Lisa in 1984.

homarp wrote at 2020-11-05 23:07:22:

https://macintoshgarden.org/apps/macintosh-programmers-works...

describes Macintosh Programmer's Workshop (MPW) as "a Unix-like integrated development environment for the Mac OS."

http://www.math.columbia.edu/~om/mpw-88.pdf

is a presentation by two of the authors of MPW (Richard Meyers and Jeff Parrish).

okareaman wrote at 2020-11-05 21:44:54:

I loved MPW back in the day but I was an outlier compared to my other Mac developer friends

mistrial9 wrote at 2020-11-06 01:55:49:

I wrote the CodeWarrior tooling integration for MPW and presented the finished work to the CEO of MetroWerks :-) Given the engineering quality on both sides of that wide divide, I felt some honor at the time -- I was paid but not that much.

dkarl wrote at 2020-11-06 00:32:41:

I think Neal Stephenson was overly optimistic about the audience for this book, and I think he would regard it as a failure of the book that the only people reading it are people like us who can make the distinction you just made. He seemed to be writing for people who knew nothing that he hadn't already told them, and identifying CLIs with Unix was a simplification that I think was supposed to make it easier for his target audience to follow.

I did give the book to my computer-phobic father to read, because he was a historian who was interested in social and cultural changes and also _kind_ of curious about what I did for a living, and he said it was interesting, but he also said he didn't really understand it at a concrete level and was relying on the vivid metaphors to get any meaning out of it. I don't think he really considered it worthwhile for someone like him to read, which I think meant it was a fundamentally ill-conceived book. Still a fun read, though.

bawolff wrote at 2020-11-06 02:14:36:

I don't know, to me this short story practically screams preaching to the choir. It's almost like a patriotic rant to operating system fanboys. I don't think the audience was ever going to be non-computer nerds.

enriquto wrote at 2020-11-05 20:14:11:

> Unix was far too large and unwieldy for microcomputer hardware of the time,

I think the article means that they implemented a Unix-like command line, but not a full, POSIX-compliant Unix. Just a mock sh and a handful of text-based utilities to work with. At least that's how I read this paragraph.

isx726552 wrote at 2020-11-05 17:55:56:

The story of Mac Basic getting canceled is indeed a terribly sad moment in Mac history, documented here:

https://www.folklore.org/StoryView.py?story=MacBasic.txt

pinewurst wrote at 2020-11-05 23:08:04:

There was a native FORTH environment that came out fairly soon after the 1984 Mac announcement.

RodgerTheGreat wrote at 2020-11-06 04:54:52:

As fungi terraformed land on early Earth, blazing a trail for plant life to follow, Forth is often the first colonizer of new computing platforms.

WalterBright wrote at 2020-11-05 20:10:40:

I'm a bit surprised. BASIC is not that hard to implement. Apple could make their own.

Jtsummers wrote at 2020-11-05 20:19:21:

The folklore article in a sibling comment to yours clarifies it a bit. Apple did make their own, but it wasn't running on the Apple ][ or compatible with what was running on it. That was still their biggest revenue stream, and they couldn't afford MS withholding a license for BASIC on it or fragmenting the community (two versions of BASIC on the same platform depending on date of purchase). So they made a pragmatic choice to avoid getting screwed by MS, who went and did what they always did (especially back then): screwed them anyways.

WalterBright wrote at 2020-11-05 20:38:45:

Simply starting a credible project to do their own could have caused Microsoft to soften its terms.

As for a different version of BASIC, nothing stopped Apple from making a workalike BASIC. After all, that's how the Compaq was made, along with plenty of other workalikes.

From a modern point of view, the software in those days looks pretty simple. I'm surprised there weren't a lot more clones.

Jtsummers wrote at 2020-11-05 20:49:41:

I can only take the folklore article as accurate here, but it seems that it boiled down to timing and cost. They probably could've done it, but their existing BASIC project had already taken a couple years. If they'd elected to replace MS's BASIC implementation with their own, they'd have had a year or so to get it done. And failure would've been very costly.

Corporate risk tolerance comes into play at that point.

jonsen wrote at 2020-11-05 20:13:48:

> Unix was far too large and unwieldy for microcomputer hardware of the time

“UniFLEX was very similar to Unix Version 7”:

https://en.m.wikipedia.org/wiki/UniFLEX

prepend wrote at 2020-11-05 17:10:52:

I love this essay. I’ve been hoping that Stephenson would rewrite it with 20 years of updates.

It describes the landscape better than anything else I’ve found as Stephenson is a user and a great writer, I think. Most other accounts are by people who make their living in journalism, or hardware, or software.

Stephenson is also an example of someone who is really into computers, and programming I suspect, but has a primary goal of writing. I like it when non-programmers program (e.g. Jake VanderPlas [0] wrote chunks of SciPy even though he's an astronomer, though he works as a programmer now).

[0]

http://vanderplas.com/media/pdfs/CV.pdf

bityard wrote at 2020-11-05 21:24:21:

If you like programmers who write fiction, you might also be interested to know that Mark Russinovich (of Windows Sysinternals fame) writes tech thriller novels.

(I haven't read them but they have been well-received.)

bawolff wrote at 2020-11-06 02:17:24:

Vernor Vinge is another good author in the category of people really into computers who write fiction (I remember one of his novels had a side plot about interplanetary Usenet having routing failures).

lproven wrote at 2020-11-06 15:46:08:

It is _A Fire Upon the Deep_ and the interplanetary Usenet is a core plot point, IMHO.

theandrewbailey wrote at 2020-11-05 21:30:04:

There was a competing bicycle dealership next door (Apple) that one day began selling motorized vehicles--expensive but attractively styled cars with their innards hermetically sealed, so that how they worked was something of a mystery.

In retrospect, I think Neal was referring to a specific aspect of Apple's products when writing 'hermetically sealed', but I view almost any Apple product, as a whole, that way. (Apple doesn't want you to service them yourself, or know how the software works.) Even after 20 years, some things never change.

kordlessagain wrote at 2020-11-05 16:26:04:

Reagan would describe the scene as he saw it in his mind's eye: "The brawny left-hander steps out of the batter's box to wipe the sweat from his brow. The umpire steps forward to sweep the dirt from home plate." and so on.

It continues to "blow my mind" that people can do this. What a gift, and possibly a curse!

lqet wrote at 2020-11-05 17:58:24:

> When the cryptogram on the paper tape announced a base hit, [Reagan] would whack the edge of the table with a pencil, creating a little sound effect, and describe the arc of the ball as if he could actually see it. His listeners, many of whom presumably thought that Reagan was actually at the ballpark watching the game, would reconstruct the scene in their minds according to his descriptions.

ddingus wrote at 2020-11-05 19:03:26:

That is telepathy. Amazing use of language.

Story telling in real time. Love it.

Stierlitz wrote at 2020-11-05 15:51:49:

“But around the same time, Bill Gates and Paul Allen came up with an idea even stranger and more fantastical: selling computer operating systems”

This article must be coming from some parallel universe. As I recall, Microsoft got a contract from IBM to supply an OS for their low-spec personal computer. They didn't have one, so Microsoft bought in 86-DOS from Seattle Computer Products, using the IBM money to pay for it up front. Rather than sell it to IBM outright, Microsoft persuaded IBM to license a copy of DOS for each IBM PC sold. Later on, with Columbia Data Products, Compaq and others figuring out how to clone the PC without paying IBM, Microsoft was more than happy to license DOS to them as well.

“Columbia_Data_Products”

https://en.wikipedia.org/wiki/Columbia_Data_Products

“Joint Development Agreement between International Business Machines Corporation and Microsoft Corporation”

“With respect to Phase I Output, to the extent such joint ownership is prevented by operation of law each party hereby grants to the other a non-exclusive, royalty-free, worldwide and irrevocable license to use, execute, perform, reproduce, prepare or have prepared Derivative Works based upon display, and sell, lease or otherwize transfer of posession or ownership of copies of, the Phase I Output and/or any Derivative Works thereof.”

http://edge-op.org/iowa/www.iowaconsumercase.org/011607/0000...

prepend wrote at 2020-11-05 17:03:03:

The novelty was in selling individual licenses of operating systems rather than bundling them only with hardware. Computers had OSes, and of course they were bought and sold among companies, but they were included with computer purchases.

So while IBM contracted with Microsoft to provide the OS (and MS just bought DOS from someone else), Microsoft sold the same OS to lots of others as well, and even at retail for upgrades and changes that didn't come from the hardware vendor.

This is the same universe we’re all in.

jecel wrote at 2020-11-05 17:15:59:

CP/M and UCSD-Pascal, two operating systems available for the IBM PC besides DOS, had already been sold to individual users for many years.

Microsoft itself launched Xenix a year before the PC (August 1980).

prepend wrote at 2020-11-05 17:51:46:

I think, like pretty much everything Microsoft did, they didn't innovate by doing it first; they innovated by popularizing it.

I can’t find sales figures for CP/M and UCSD-Pascal, but I imagine they weren’t anywhere near what Microsoft started generating from its OS.

jecel wrote at 2020-11-06 03:54:32:

A Byte magazine editorial praised the newly launched PC as being the "Rosetta Stone" of computing for offering such a choice of operating systems. In practice, with PC-DOS being five times cheaper than the other two, it was the standard from the start.

I mostly used QNX with PCs myself until Linux came along. So though it took a long time, PCs did eventually run nearly all known OSes.

dboreham wrote at 2020-11-06 03:06:15:

"History" is always horribly wrong as written, if you were there at the time, I suspect. Add OS/9 and Flex to the list. There were countless others.

lproven wrote at 2020-11-06 15:49:21:

That's OS-9, with a hyphen:

https://en.wikipedia.org/wiki/OS-9

_Not_ to be confused with the IBM family of OSes with a slash, notably OS/2 but also including OS/390 and OS/400.

walshemj wrote at 2020-11-05 21:57:53:

Actually, no: DEC PDPs had multiple OSes, and not all of them were bundled (RT-11 vs. RSX-11 or RSTS/E).

You bought the system that suited you; we (as a lab) ran RT-11.

Someone wrote at 2020-11-05 17:02:27:

I think that refers to

https://en.wikipedia.org/wiki/Open_Letter_to_Hobbyists

If you could say microcomputers in the ‘70s had an OS, BASIC was it.

marcosdumay wrote at 2020-11-05 16:58:07:

Microsoft was selling software (licenses) way before the IBM deal happened. That phrase is about BASIC.

fit2rule wrote at 2020-11-05 17:23:52:

MS-BASIC was an operating system that predated DOS.

mauvehaus wrote at 2020-11-05 18:25:25:

2004 commentary reflecting developments in computing between the original writing and then:

http://garote.bdmonkeys.net/commandline/index.html

Written with Neal Stephenson's permission.

homarp wrote at 2020-11-05 22:57:42:

per Wikipedia page:

With Neal Stephenson's permission, Garrett Birkel responded to "In the Beginning...was the Command Line" in 2004, bringing it up to date and critically discussing Stephenson's argument. Birkel's response is interspersed throughout the original text, which remains untouched.

http://garote.bdmonkeys.net/commandline/index.html

rchase wrote at 2020-11-05 15:18:46:

Great book. Little dated, but still worth a read.

Love the HOLE HAWG analogy about tools that do what you tell them to, immediately and sometimes dangerously, regardless of whether what you told them to do was right.

mds wrote at 2020-11-05 16:18:26:

And here is the mandatory AvE Hole Hawg tear-down:

https://www.youtube.com/watch?v=qoR59rzqlxw

ThrowawayR2 wrote at 2020-11-05 17:27:45:

Though he sang the praises of the Hole Hawg, it's worth noting that he later switched to OS X. Usability still matters.

"_You guessed right: I embraced OS X as soon as it was available and have never looked back. So a lot of "In the beginning was the command line" is now obsolete. I keep meaning to update it, but if I'm honest with myself, I have to say this is unlikely._"

From question #8 of an interview with him in 2004 at

https://slashdot.org/story/04/10/20/1518217/neal-stephenson-...

His responses to the other questions are entertaining and worth a read as well.

vram22 wrote at 2020-11-05 18:47:35:

That can describe Unix command-line tools. No "are you sure (y/n)?" prompts unless explicitly asked for via a flag, unlike DOS.
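
For illustration, here is that default in action with POSIX `rm` (a quick sketch; `-i` is the standard opt-in confirmation flag):

```shell
# Sketch: POSIX rm deletes without asking; -i is the opt-in prompt.
touch /tmp/demo-file

# With -i, rm asks before deleting; piping "n" declines, so the file survives.
echo n | rm -i /tmp/demo-file
[ -e /tmp/demo-file ] && echo "still there"

# Without -i, no questions asked.
rm /tmp/demo-file
[ -e /tmp/demo-file ] || echo "gone"
```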

lproven wrote at 2020-11-06 15:51:09:

This is true, but the concept was taken further in the Oberon operating system, which has a design goal of _never_ asking the user questions at any point.

vram22 wrote at 2020-11-06 19:55:55:

Interesting. Another thing I vaguely remember reading about Oberon was that any subroutine in the OS could be used from any program, or something like that, for high code reuse. I'm sketchy on the details; I read it quite a while ago, maybe in a BYTE article about Oberon. Not sure if that implies that all programs were in one address space, or what.

sumtechguy wrote at 2020-11-05 16:09:11:

Hehe, this has become my favorite saying for computers lately: "do what I want, not what I told you to do!" Computers have a lovely way of merrily going along and breaking things at a fairly fast pace.

prepend wrote at 2020-11-05 17:14:17:

Lately I find myself saying “do what I told you to do, not what you think I want to do”

Mainly this is due to the autocorrect, autocomplete on most devices nowawadys. I’m sure it’s very helpful, but I seem to notice the mistakes more than the successes. (Eg, trying to type “nowadays,” I had to break out of typing on my iPhone 3 times to backspace and stop it from changing it to other words and expressions)

sumtechguy wrote at 2020-11-05 18:17:16:

Hehe, that is awesome! It is the opposite of mine but also so true. I turned off autocorrect on my phone. Suggest is fine, but just changing it... not so much.

Jtsummers wrote at 2020-11-05 16:11:40:

Many CLI tools have a dry run option for expensive (time/resource wise) or risky commands (one way, irreversible or reversible only with a lot of effort). It would be interesting to see this become the default for some of them, with a separate flag `--now-i-mean-it` to actually execute.

vram22 wrote at 2020-11-06 19:56:59:

Yes, like

make -n

https://man7.org/linux/man-pages/man1/make.1.html

In fact, the above man page shows that one long form of the -n option is named --dry-run :)

sumtechguy wrote at 2020-11-05 18:18:52:

I wish more tools had the option of a dry run. I've been using it with Ansible quite a bit in the past few weeks. Look ma, I can mess up 50 computers all at once!

prepend wrote at 2020-11-05 17:15:44:

I’ve spent so much time with rsync’s -n (I think it supports --dry-run as well).

stainforth wrote at 2020-11-05 20:55:04:

Shouldn't dry run be the default, with the "prod" run requiring the extra switch?
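
As a sketch of that suggestion, a script can default to dry run and only act when an explicit flag is passed. The `--now-i-mean-it` flag name is borrowed from Jtsummers's comment above and is purely illustrative:

```shell
#!/bin/sh
# Dry-run-by-default sketch: print actions unless --now-i-mean-it is given.
DRY_RUN=1
for arg in "$@"; do
  if [ "$arg" = "--now-i-mean-it" ]; then
    DRY_RUN=0
  fi
done

# Wrap any risky command in run(); in dry mode it echoes instead of executing.
run() {
  if [ "$DRY_RUN" -eq 1 ]; then
    echo "would run: $*"
  else
    "$@"
  fi
}

run rm -rf /tmp/scratch-dir
```

Invoked with no arguments, the script only prints `would run: rm -rf /tmp/scratch-dir`; adding `--now-i-mean-it` actually runs the command.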

SarikayaKomzin wrote at 2020-11-06 05:20:06:

This is one of those essays that pops up on Hacker News every so often, and I'm always happy to reread it. Regardless of how the technologies have changed or whether the prognostications were correct, this article is about timeless and relevant themes.

Abstraction is the most powerful force within the human mind. It is perhaps solely responsible for the world we've built around us. But it is also absolutely terrifying how increasingly reliant we are on it.

"Contemporary culture is a two-tiered system, like the Morlocks and the Eloi in H.G. Wells's The Time Machine, except that it's been turned upside down. In The Time Machine the Eloi were an effete upper class, supported by lots of subterranean Morlocks who kept the technological wheels turning. But in our world it's the other way round. The Morlocks are in the minority, and they are running the show, because they understand how everything works."

We are more and more surrounded by technology that the majority of us don't understand even at a fundamental level (for the record, I include myself among the Eloi). More often than not, these technologies are essentially taken for granted as magic. While no reasonable person should expect everyday people to understand the inner workings of their handheld supercomputers or their cable TV box, we would all be better off if we better understood the fundamental building blocks of the technologies we are surrounded by -- whether that's basic logic gates, the simple patterns of conditional statements in programming languages, what caching and cookies are on the web, or how a hard drive works at 30,000 feet.

"So GUIs use metaphors to make computing easier, but they are bad metaphors. Learning to use them is essentially a word game, a process of learning new definitions of words like "window" and "document" and "save" that are different from, and in many cases almost diametrically opposed to, the old."

Like Stephenson so vividly describes, the way our technology mixes metaphors is not instructive to what's actually happening on the metal. These lossy abstractions don't seem harmful at face value because they aren't, but, as they compound and more complex technologies are adopted in our homes and places of work, they threaten to make us less efficient at our jobs, more reliant on manufacturers for repair and troubleshooting, more susceptible to disinformation and encroachments on our privacy, and, in my opinion most importantly, at risk for critical failures in our infrastructure (what if there aren't enough Morlocks?)((the IoT, machine learning and social media algorithms are what really frighten me)).

And we haven't even mentioned how we are now increasingly reliant on fragile systems that no single person can understand, and the dynamic nature of software means that many, many applications out there are essentially ships of Theseus that could sink at any time.

This isn't hyperbolic Doomerism. I don't think this is the Decline and Fall or anything. It's also not a condemnation of super-abstractions or casual technology use. I, like Stephenson, am a paying Disney World customer if you will. I just believe we need to invest more into the right kinds of high-level technocratic education for the general populace (and continue to combine it with liberal arts, of course), and our technologists need to invest in redundancies and stable technologies.

Luckily, we have built the Library of Alexandria 2.0 in the internet; we just need to use it.

Some fun, relevant links off the shelves of that library:

https://reasonablypolymorphic.com/book/preface.html

https://blog.nelhage.com/post/computers-can-be-understood/

https://www.nand2tetris.org/

https://mcfunley.com/choose-boring-technology

https://cs.stanford.edu/people/nick/how-hard-drive-works/

https://singularityhub.com/2016/07/17/the-world-will-soon-de...

https://en.wikipedia.org/wiki/The_Machine_Stops

https://medium.com/message/everything-is-broken-81e5f33a24e1

P.S. I love this related thought experiment -->

https://www.scientificamerican.com/article/rebooting-civiliz...

jll29 wrote at 2020-11-05 18:22:45:

Wow, a ~38k-word essay about the command line.

dgritsko wrote at 2020-11-05 18:25:44:

I take it you've not read Neal Stephenson before. I often describe him as someone who uses 1000 words where others would use 500 - but I love him for it; he's probably my favorite author.

the__alchemist wrote at 2020-11-06 00:13:42:

Same! I love the degree of _plausibility_ in his novels. Even the most outlandish elements have been well-researched, and thought out.

NateEag wrote at 2020-11-05 23:31:16:

I'd describe it as an essay about complexity, computing history, open vs. closed systems, and personal responsibility.

thewakalix wrote at 2020-11-05 19:45:52:

It's about more than just the command line.