

90-02/BeingInNothingness

BEING IN NOTHINGNESS
Virtual Reality and the Pioneers of Cyberspace

By John Perry Barlow
Published in Microtimes Magazine

"Cyberspace.  A consensual hallucination experienced daily 
by billions of legitimate operators, in every nation...A 
graphic representation of data abstracted from the banks of 
every computer in the human system.  Unthinkable 
complexity.  Lines of light ranged in the nonspace of the 
mind, clusters and constellations of data.  Like city lights, 
receding..."
			--William Gibson, Neuromancer

Suddenly I don't have a body anymore.  

All that remains of the aging shambles which usually constitutes my 
corporeal self  is a glowing, golden hand floating before me like 
Macbeth's dagger.  I point my finger and drift down its length to the 
bookshelf on the office wall.

I try to grab a book but my hand passes through it.  

"Make a fist inside the book and you'll have it," says my invisible 
guide.  

I do, and when I move my hand again, the book remains embedded in 
it.  I open my hand and withdraw it.  The book remains suspended 
above the shelf.

I look up.  Above me I can see the framework of red girders which 
supports the walls of the office...above them the blue-blackness of 
space.  The office has no ceiling, but it hardly needs one.  There's never 
any weather here.  

I point up and begin my ascent, passing right through one of the 
overhead beams on my way up.  Several hundred feet above the office, 
I look down.  It sits in the middle of a little island in space.   I 
remember the home asteroid of The Little Prince with its one volcano, 
its one plant.

How very like the future this place might be: a tiny world just big 
enough to support the cubicle of one Knowledge Worker.  I feel a wave 
of loneliness and head back down.  But I'm going too fast.  I plunge 
right on through the office floor and into the bottomless indigo below.  
Suddenly I can't remember how to stop and turn around.  Do I point 
behind myself?  Do I have to turn around before I can point?  I flip into 
brain fugue.

"Just relax," says my guide in her cool clinical voice.  "Point straight 
up and open your hand when you get where you want to be."

Sure.  But how can you get where you want to be when you're coming 
from nowhere at all?  

And I don't seem to have a location exactly.  In this pulsating new 
landscape, I've been reduced to a point of view.  The whole subject of  
"me" yawns into a chasm of interesting questions.  It's like Disneyland 
for epistemologists.  "If a virtual tree falls in the computer-generated 
forest..?"  Or "How many cybernauts can dance on the head of a 
shaded solid?"  Gregory Bateson would have loved this.  Wittgenstein, 
phone home. 

At least I know where I left my body.  It's in a room called Cyberia in a 
building called Autodesk in a town called Sausalito, California.  Planet 
Earth.  Milky Way.  So on and so forth.  My body is cradled in its usual 
cozy node of space-time vectors.

But I...or "I"...am in cyberspace, a universe churned up from computer 
code by a Compaq 386 and a pair of Matrox graphics boards, then fed 
into my rods and cones by  VPL Eyephones, a set of goggles through 
whose twin, parallax-corrected video screens I see this new world.  

When I move my head, the motion is tracked by a Polhemus 
magnetic sensor and the imaging engine of cyberspace is instructed to 
alter what I see accordingly.  Thus, having made a controlled ascent 
back up through the floor of the "office," I turn to the left and I see a red 
chair with a desk behind it.  I turn to the right and I see a door leading 
out onto the floating platform.  

The configuration and position of my right hand is fed into the system 
by a VPL DataGlove, also with a Polhemus attached to it.  The 
relationship between my hand and the eyephones is precisely 
measured by the two trackers so that my hand appears where I would 
expect it to.  When I point or make a fist, the fiber optics sewn into the 
DataGlove convert kinesthetics into electronics.  For a decisecond or so, 
my hand disappears and then reappears, glowing and toon-like, in the 
appropriate shape.
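The loop described here — one tracker driving the camera, another driving the virtual hand, and the glove's flex readings classified into gestures like pointing and fisting — can be sketched in a few lines.  Everything below is illustrative: the names, thresholds, and data shapes are guesses for the sake of the sketch, not VPL's actual Body Electric software.

```python
# A sketch of the cyberspace frame loop: the head tracker drives the
# camera, the glove tracker drives the virtual hand, and per-finger
# flex readings are classified into gestures.  All names and
# thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class Pose:
    """Position and orientation as a magnetic tracker might report it."""
    x: float
    y: float
    z: float
    yaw: float
    pitch: float
    roll: float

def classify_gesture(flex):
    """Map per-finger flex readings (0.0 = open, 1.0 = curled) to a gesture."""
    if all(f > 0.8 for f in flex):
        return "fist"        # grab: close the hand inside an object
    if flex[1] < 0.2 and all(f > 0.8 for f in flex[2:]):
        return "point"       # index out, others curled: fly along the finger
    return "open"            # release whatever the hand holds

def frame(head, hand, flex):
    """One simulation frame: camera follows the head, hand follows the glove."""
    return {"camera": head, "hand": hand, "gesture": classify_gesture(flex)}

# Index finger extended, everything else curled: the "point" gesture.
state = frame(Pose(0.0, 1.7, 0.0, 0.0, 0.0, 0.0),
              Pose(0.3, 1.2, -0.4, 0.0, 0.0, 0.0),
              [0.9, 0.1, 0.9, 0.9, 0.9])
```

Each frame the imaging engine would re-render the scene from the camera pose and draw the glowing hand at the hand pose, in whatever shape the gesture dictates.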

Despite the current confines of my little office-island, I know that I 
have become a traveller in a realm which will be ultimately bounded 
only by human imagination, a world without any of the usual limits of 
geography, growth, carrying capacity, density or ownership.  In this 
magic theater, there's no gravity, no Second Law of Thermodynamics, 
indeed, no laws at all beyond those imposed by computer processing 
speed...and given the accelerating capacity of that constraint, this 
universe will probably expand faster than the one I'm used to.

Welcome to Virtual Reality.  We've leapt through the looking glass.  
Now what?  Go ask Alice.


      


The Next Big Thing
Money from Nuthin'

"I think this is the biggest thing since we landed on the Moon," says 
Jaron Lanier, the dread-locked CEO of VPL Research (who was nine 
years old at the time).  I don't choke on that one.  Indeed, I'd take it a 
bit farther, guessing that Columbus was probably the last person to 
behold so much usable and unclaimed real estate (or unreal estate) as 
these cybernauts have discovered.

At Autodesk, the Sausalito publisher of AutoCAD drafting software, 
they spent the summer of '89 in product development heaven, talking 
telephone, automobile, airplane, computer.  They invoked Edison, Bell, 
Ford, and Jobs.  And there was that loincloth-and-machete sense of 
enterprise which one might have experienced in the Wright Brothers' 
bicycle shop in Dayton or Paul Jobs' garage in Los Altos...as well as 
countless less-chronicled shots at perpetual motion or baldness cures.

Neil Armstrong's small step ran about 70 Billion Real Dollars, but 
when John Walker, the Hacker King of Autodesk, committed his 
company to creating the first commercially-available "world in a can," 
he figured that the prototype "gizmo" could be built for about $25,000.     

VPL, the other trading post on the VR frontier, isn't much fatter, although 
internal synergy seems to magnify output.  Since their incorporation in 
1985, they've had two Scientific American covers and produced the 
DataGlove, DataSuit, the PowerGlove, Swivel 3-D and VPL 
EyePhones, the only commercially available head-mounted display.  
They've been in a couple of big lawsuits (one, just concluded to their 
satisfaction, with Stanford University), and create, at a distance, the 
mirage of a fair-sized company going at it pretty hard.  

But up close, one can get on a first-name basis with every VPL 
employee in the course of an afternoon. They have yet to outgrow the 
third floor of their slightly tacky building at the Redwood City yacht 
harbor.

While Apple's research gazillions yield such dubious fruit as 
multimedia and the AppleFax Modem, while IBM replicates methods 
for chaining bureaucrats to its mainframes,  it begins to appear that the 
Next Big Thing will begin its commercial evolution as humbly as the 
personal computer.

As usual, the Big Guys have neither the means nor the desire to engage 
in such open-ended creation as settling the virtual universe will 
require.  Like the Union Pacific Railroad awaiting the fact of empire, 
they prefer to let the rag-tag pioneers die all over the frontier before 
they come out to claim it.  

When the Altairs and Osbornes of Virtual Reality have made their fatal 
errors and are headed for Chapter 11, IBM probably will issue forth the 
SolutionStation VR Network or some such and accelerate natural 
selection in the field.

But as I write this, VPL and Autodesk still have  it to themselves.  
Actually, they are not the first to make virtual landfall. They are only 
the first at financial risk.  Unlike the first automobiles or telephones, 
their commercial fledglings had the advantage of long incubation by 
government and Academia.

Virtual Reality, as a concept, found first form at the University of Utah 
over twenty years ago in the fecund cranium of Ivan E. Sutherland, the 
godfather of computer graphics and the originator of about every  Big 
Computer Idea not originated by Alan Kay or Doug Engelbart.  In 
1968, he produced the first head-mounted display.  This was the 
critical element in VR hardware, but it was so heavy that it had to be 
suspended from the ceiling...at some peril to its wearer.  Damocles was 
mentioned.

Besides, once you got it on, there wasn't much to see in there.  There 
wasn't a computer in existence which could churn out enough 
polygons per second to simulate a reality much more full-bodied than 
a game of Pong. 

So Virtual Reality passed a generation waiting for the equipment to 
arrive.  In 1985 the Japanese finally (and unintentionally) provided us 
with the right video displays when NASA's Mike McGreevy happened 
to notice that the Citizen Watch Co. LCD displays in a Radio Shack 
mini-TV were small enough to fit two in a head-mounted display.  

I hardly need to detail what happened to CPU horsepower during that 
period.  By 1985, graphics engines of appropriate juice were almost 
within financial range of entities not involved in the defense of our 
nation.

Also by this time, NASA had made a strong commitment to VR 
research, though mostly in the service of "telepresence," the ability to 
project one's judgement and actions into a robot located some real 
place you'd rather not be, like space.  They were less persuaded by the 
attractions of unreal places.

The Air Force was also conducting research at Wright-Patterson under 
the direction of Tom Furness, but most of this was directed at the usual 
dismal purpose, simplifying the annihilation of non-virtual humans.  
Heads-up displays and looks that kill were their specialty.

For all this expenditure of tax dollars, Virtual Reality still lacked two 
critical elements: a sense of whimsy and a fluid, three-dimensional 
method for "grabbing" and manipulating the furniture of cyberspace.  
VPL was on the case.  

VPL's Tom Zimmerman had always wanted the ability to actually play 
air guitar.  It was the sort of desire his "boss," Jaron Lanier, could 
understand. Jaron had only gotten into computers after concluding 
that musical composition was not a reliable day job.  And his 
ownership of more than 300 musical instruments might indicate, if 
nothing else, a probing dissatisfaction with the limits of each one.  

Over a two year period, Zimmerman and Young Harvill (also of VPL) 
created the DataGlove, a hand with which to strum those invisible 
strings.  While they were creating this hardware interface (though the 
Spandex feel of the DataGlove makes "leisureware interface" seem like 
a more appropriate term), Jaron and Chuck Blanchard were writing 
Body Electric, the software necessary to map the actual movements of 
the DataGlove and eyephones onto the virtual landscape.

The commercial colonization of cyberspace was beginning.  VPL's 
strategy was to build the most powerful simulations current 
technology would allow, without regard to hardware cost, selling the 
spin-offs at increasingly affordable prices.  One such item, the 
PowerGlove, is a Nintendo game controller based on the DataGlove 
which VPL has licensed to Mattel.  (Available this Christmas at a store 
near you for $85.00.)  

Another VPL spin-off product is Swivel 3-D, odds on the best 3-D 
modeler for the Macintosh.  Young Harvill wrote it as a tool to create 
an artificial reality quickly and easily on Mac before integrating it into 
Body Electric and sending it over the twin Silicon Graphics CPUs 
which blow it up to full size.       

In September of 1988, John Walker wrote an internal Autodesk white 
paper called Through the Looking Glass: Beyond "User Interfaces."   In it 
he proposed an "Autodesk Cyberpunk Initiative" to produce within 16 
months a doorway into cyberspace...available to anyone with $15,000 
and a 386 computer.  The project's motto:  "Reality Isn't Enough Any 
More."  (I wondered if they considered: "I'd rather have a computer in 
front of me than a frontal lobotomy...")

Since NASA's Virtual Realities were running in the millions and VPL's 
in the middle hundreds of thousands, Walker envisioned a significant 
discount over previous models, but he knew that his customers, if any, 
would be more bargain-conscious than, say, the U.S. Air Force.  

Autodesk's Cyberia Project was running hard by Christmas, 1988, 
staffed by William and Meredith Bricken, Eric Gullichsen, Pat Gelband, 
Eric Lyons, Gary Wells, Randy Walser, and John Lynch.  When I 
arrived on the scene in May, they had been keeping hacker's hours for 
a long time.        

And they were ready to make a product.  They'd made a promo video 
starring Timothy Leary.  Gullichsen had even registered William 
Gibson's term "cyberspace"  as an Autodesk trademark, prompting an 
irritated Gibson to apply for trademark registration of the term "Eric 
Gullichsen."  By June, they had an implementation which, though 
clearly the Kitty Hawk version of the technology, endowed people 
with an instantaneous vision of the Concorde level.  

Meanwhile, back in the real world, things were getting complicated.  
While everyone who went to Autodesk's Cyberia agreed that Virtual 
Reality was something, there was less agreement as to what.  

Part of the problem was the scale of possibilities it invoked.  They 
seemed to be endless and yet none of them was anywhere near ready 
to return an investment.  But when something has endless possibilities, 
each of them is liable to dilute down to a point where people start to 
say things like, "Sure, but what's it really good for?"  At which point the 
devoted cybernut might lapse into random syllables, his tongue heavy 
with all that golden potential.  

Virtual Reality induces a perception of huge potency underlying 
featureless ambiguity.  There is a natural tendency to fill this gap 
between power and definition with ideology.  And the presence of 
such unclaimed vastness seems to elicit territorial impulses from 
psychic regions too old to recognize the true infinity of this new 
frontier.  Disputes appeared like toadstools in the rich new soil of 
cyberspace.

Thus, by mid-November, the Autodesk half of the Next Big Thing was 
down to one full-time hacker: Randy Walser.  The Brickens had headed 
to Seattle to join Tom Furness in a (non-lethal) VR research program at 
the University of Washington.  Eric Gullichsen and Pat Gelband had 
formed their own VR company, Sense 8.  (Get it?)

Within, VPL's soulful band remained as tightly bonded as a Hell's 
Angels chapter.  Without, they found themselves increasingly tangled 
in legal hassles.  They were in court with AGE (a group of New York 
toy developers who are not just in it for their health), trying to protect 
their rights to the PowerGlove.  They'd just settled a suit with Stanford 
University.  In general, they were having experiences which made me 
question the axiom that you can't cheat an honest man.           

Still, everyone realized that a baby this size would be bound to 
occasion some labor pains.  As the general media began to pick up on 
Virtual Reality, its midwives were preparing themselves for interesting 
times.  It would be worth it.  But why?

To the people who will actually make the future, such a question is 
beside the point.  They will develop cyberspace because, like Mallory's 
mountain, it's there.  Sort of.


      


There are some practical reasons for the settlement of cyberspace.  They 
aren't as much fun to think about as the impractical ones, but they 
exist.  First among them is that this is the next logical step in the quest 
to eliminate the interface...the mind-machine information barrier.

Over the last twenty years, our relations with these magic boxes have 
become intimate at a rate matched only by the accelerating speed of 
their processors.  From the brutal austerity of batch-processed punch-
cards to the snuggly Macintosh, the interface has become far less 
cryptic and far more interactive.  

There have remained some apparently unbreachable barriers between 
us and the CPU.  One of them was the keyboard, which even with the 
graphical interface and the accompanying infestation of mice, 
remained the principal thoroughfare from human perception to RAM.  
The thin alphanumeric stream which drips from our fingertips and 
into the computer is a pale reflection of the thoughts which produce it, 
arriving before the CPU at a pace absurdly mis-matched to its 
chewing/spitting capacities.

Then there is the screen itself.  While a vast improvement on the 
flickering LED's of the Altair or even the amber text of DOS, the 
metaphorical desktop remains flat as paper.  There is none of the depth 
or actual spatiality of experience.

After we get past what few documents we can keep on the screen at 
one time, we are back to the alphabetized hierarchy.  We can't pile it, 
as most of us tend to do in real life.  We have to file it.  And this is not 
the way the mind stores information.  One doesn't remember the 
names of his friends alphabetically.  When looking for a phrase in a 
book, you are more likely to look for its spatial position on the page 
than its intellectual position in context.

The actual operation of human memory works on a model more like 
the one Saint Thomas Aquinas used.  Aquinas, who carried around in 
his head almost all the established knowledge of his simpler world, is 
said to have imagined a mind-castle with many different rooms in 
which varying kinds of ideas dwelled.  The floor plan increased with 
his knowledge.       

Nicholas Negroponte recreated a modest version of Aquinas' castle in 
the 70's.  He came up with a virtual office, represented in cartoon form 
on the screen.  One could mouse around to the "piles" of  "paper" 
stacked on the "desk" or "filing cabinet," leafing through them not by 
the first letter of their subject name but by their archaeological layer of 
deposition.  

The problem was the screen.  Negroponte created a flat picture of an 
office rather than something more like the real thing because that was 
all one could display on a screen.  In two dimensions, the image of 
desktop seemed a lot more natural than the image of the desk.  Thence 
the Macintosh. 

I used to think that the only way around these narrow I/O apertures 
lay in such heroic solutions as brain implants.  I think I was about 14 
when it occurred to me that this was the answer.  Brain surgery 
seemed a minor nuisance if it left one with the ability to remember 
everything.  

I suppose I'd still be willing to put a Cray in my cranium, but my faith 
in technology has moderated since early adolescence.  I'm more 
comfortable with the possibility of an interface which fills the gap 
between keyboarding and neurological hardwiring and involves no 
cortical knife-play.  Virtual Reality is almost certainly that.

And indeed, Virtual Reality may be so close to the implant side of the 
continuum that, as Randy Walser of Autodesk insists, it's not even 
appropriate to call it an interface.  It's more a place...kind of like Fibber 
McGee's Ultimate Closet...than the semi-permeable information 
membrane to which we're accustomed.

Whatever you want to call it, Autodesk's John Walker puts it this way, 
"If cyberspace truly represents the next generation of human 
interaction with computers, it will represent the most profound change 
since the development of the personal computer."  Right.

But that still doesn't tell us what it's good for besides extending human 
quirkiness to the storage of immaterial stuff.  After all, most of what 
humans do with computers is merely an improvement over what they 
did with other keyboard-bound devices, whether typewriters or 
calculators.  Word processing and numerical analysis will be no easier 
"inside" the machine than they were outside.   

But let's quit being giddy for a moment.  We're talking bucks here.  
Right now a good working platform costs almost as much as a CAT 
scanner.  Who's going to buy one without something like Blue Cross 
footing the bill?  And why?

All right, there is a reason why Autodesk is involved in this enterprise 
besides some daydream of the Ultimate Hack.  Whatever adventures 
they might entertain they afford by selling AutoCAD, the dBASE III of 
architecture.  How many architects have dreamed of the ability to take 
their clients on a walk inside their drawings before their 
miscommunications were sealed in mortar?

Virtual Reality has already been put to such use at the University of 
North Carolina.  There Sitterman Hall, the new $10 million home of 
UNC's computer science department, was designed by virtual means.

Using a head-mounted display along with a handlebar-steerable 
treadmill, the building's future users "walked through" it, discovering, 
among other things, a discomforting misplacement of a major interior 
wall in the lobby.  At the point of the discovery, moving the wall out 
was cheap.  A retrofit following the first "real" walk-through would 
have cost more by several orders of magnitude.  Thus, one can imagine 
retrofit savings from other such examples which could start to make 
DataSuits as common a form of architectural apparel as chinos and 
tweed.  

Given the fact that AutoCAD is already generating about  a hundred 
seventy million dollars a year even without such pricy appurtenances 
as cyberspace design tools, it isn't hard to imagine a scenario in which 
developing workstations for virtual architecture comes to look like 
very shrewd business.

Then there is the burgeoning scientific market.  Computers are the new 
microscopes.  Increasingly, they allow us to see into worlds which 
were, until now, not only too small but too weird to bring to human scale.   For 
example, they are showing us the infinitely detailed order of chaos, 
never before observable, in a form which makes it possible to 
appreciate its simplicity as well as its complexity.

Virtual Reality promises the ability to not only see but to "touch" 
forbidden realms.  Again at UNC, work is already quite advanced in 
which one can assemble complex molecules like Tinkertoys, the 
attraction or repulsion between individual atoms in the assembly 
modelled to the scale of human tactile perceptions.  The drug industry 
alone could have uses for such capacity sufficient to sustain a lot of 
CyberBiz.  

One can imagine a lot of heretofore inaccessible "places" in which 
one's presence might be scientifically illuminating.  A Fantastic Voyage 
through the circulatory system will become possible (with or without 
Raquel Welch).  Or travel to alien worlds.  (Thanks to JPL, I have 
already taken an extremely convincing helicopter ride down the Valles 
Marineris on Mars.)

Then there are all the places which have never before had physical 
existence on any scale: the rolling plains of mathematical topologies, 
the humming lattice of quantum states, cloud chambers in which mu 
mesons are the size of basketballs and decay over weeks rather than 
picoseconds.           

The possibility for less sober uses seems equally fertile.  One can 
imagine VR salons, video game parlors for big kids with Gold Cards, 
in which a central supercomputer provides the opportunity for a score 
of people to be Ms. Pacman.  Or whatever.  Nolan Bushnell, the 
founder of Atari and something of an expert on the subject of video 
games, is already at work on something like this.   

The list of possibilities is literally bounded only by the imagination.  
Working bodies for the damaged.  Teleconferencing with body 
language.  Virtual surgery.  Hey, this is a practical thing to do!        

And yet I suspect that something else altogether, something not so 
practical,  is at the root of these yearnings.  Why do we really want to 
develop Virtual Reality?  There seems to be a flavor of longing here 
which I associate with the desire to converse with aliens or dolphins or 
the never-born.  

On some level, I think we can now see the potential for technology, 
long about the business of making the metaphorical literal, of reversing 
the process and re-infecting ordinary reality with luminous magic.  

Or maybe this is just another expression of what may be the third 
oldest human urge, the desire to have visions.  Maybe we want to get 
high. 

      

Drugs, Sex, & Rock 'n' Roll
Boot Up, Jack In, Get Virtual

Technology is the new drugs.

			Jerry Garcia

Knowing that Garcia is a sucker for anything which might make a 
person question all he knows, I gave him a call not long after my first 
cyberspace demo.  Hell yes, he was interested.  When?  If I'd told him 
6:00 AM, I think he'd have been there on time.

He adapted to it quicker than anyone I'd watched other than my 4-year-old 
daughter Anna (who came home and told her sisters matter-of-factly 
that she had been to a neat "place" that afternoon).  

By the time he crossed back over to our side of Reality Horizon, he was 
pretty kid-like himself.  "Well," he finally said, "they outlawed LSD.  
It'll be interesting to see what they do with this."

Which brings me to a point which makes Jaron Lanier very 
uncomfortable.  The closest analog to Virtual Reality in my experience 
is psychedelic, and, in fact, cyberspace is already crawling with 
delighted acid heads.

The reason Jaron resents the comparison is that it is both inflammatory 
(now that all drugs are evil) and misleading.  The Cyberdelic 
Experience isn't like tripping, but it is as challenging to describe to the 
uninitiated and it does force some of the same questions, most of them 
having to do with the fixity of reality itself.  

While you can hardly expect people to lay down $15,000 for something 
just because it shakes their basic tenets, that's enough to make it worth 
the trip for me.  I think the effort to create convincing artificial
realities will teach us the same humbling lesson about reality which 
artificial intelligence has taught us about intelligence...namely, that we 
don't know a damned thing about it.    

I've never been of the cut-and-dried school on your Reality Question.  I 
have a feeling VR will further expose the conceit that "reality" is a fact. 

It will provide another reminder of the seamless continuity between 
the world outside and the world within, delivering another major hit to 
the old fraud of objectivity.  "Real," as Kevin Kelly put it, "is going 
to be one of the most relative words we'll have." 

And that's just fine with me, since so much of what's wrong in 
America is based on the pathological need for certainty and the idiotic 
delusion that such a condition can even exist.

Another reason for relating this to acid is the overwhelming sense of its 
cultural scale.  It carries with it a cosmic titillation I haven't 
experienced since 1966.  There is also the same dense shower of 
synchronicities surrounding it.  (I must have run into William and 
Meredith Bricken ten times, always unexpectedly and sometimes in the 
strangest of places.  Today, as I was typing his name, Jaron called me 
for the first time in three weeks.  Then I felt strangely moved to call 
Eric Gullichsen after a couple of months of silence.  He told me that 
yesterday had been his last day at Autodesk.  Etc. Etc. Etc.)  
     
Finally, Timothy Leary is all excited again.  Now I don't endow every 
one of his pronouncements with oracular qualities...I remember the 
Comet Starseed... but I have always thought that Uncle Tim is kind of 
like a reverse of the canary in the coal mine.  Whenever the culture is 
about to make a big move, he's the first canary to start jumping up and 
down.  

He's also, like Zelig, a kind of Zeitgeist chameleon.  He spent the 40's in 
the Army.  In the 50's, he was a tweedy young college professor, a 
Jules Feiffer cartoon.  In the 60's, he was, well, Timothy Leary.  In the 
70's, he became, along with H. R. Haldeman, a political prisoner.  He 
lived it up in the material 80's in Beverly Hills.  Whatever America is about 
to do, Tim starts doing it first.  

When I visited him recently, he was already as cyberpunk as he had 
been psychedelic when I last saw him at Millbrook 22 years ago.  Still, his 
current persona seems reasonable, even seraphic.  He calmly scored a 
long list of persuasive points, the most resonant of which is that most 
Americans have been living in Virtual Reality since the proliferation of 
television.  All cyberspace will do is make the experience interactive 
instead of passive. 

"Our brains are learning how to exhale as well as inhale in the data-
sphere," he said.  Like our finny ancestors crawling up on land, we are 
about to become amphibians again, equally at home in visceral and 
virtual frames. 

The latest bus is pulling out of the station.  As usual, Leary has been on 
it for a while, waiting patiently for it to depart.

      

Then there is the...uhhhm...sexual thing.  I have been through eight or 
ten Q. & A. sessions on Virtual Reality and I don't remember one 
where sex didn't come up.  As though the best thing about all this will 
be the infinite abundance of shaded polygonal party dolls.  As though 
we are devising here some fabulously expensive form of Accu-jac.  

This is strange.  I don't know what to make of it, since, as things stand right 
now, nothing could be more disembodied or insensate than the 
experience of cyberspace.  It's like having had your everything 
amputated.  You're left mighty under-endowed and any partner 
would be so insubstantial you could walk right through her without 
either of you feeling a thing.  (In fact, when people play tag in Jaron's 
Reality Built for Two, one strategy is to hide inside the other person's 
head.)

And I did overhear the word "DataCondom" at one point...  Maybe the 
nerds who always ask this question will get a chance to make it with 
their computers at long last.  (I prefer not to think too much of how 
anyone who would want to make it with a machine might treat the 
women in their lives...if any there be.)

Fortunately, I think these dreams of cybersex will be thwarted by their 
own realization.  Yes, it will work for that purpose and it will be easy.  
But the real point of Virtual Reality, as with life itself, is contact.  
Contact with oneself alone is certainly a laudable enough goal, but the 
presence of half a million dollars worth of equipment between that 
subject and object is neither necessary nor desirable.

Even if Virtual Reality turns out to provide the format for the ultimate 
pornographic film...a "feelie" with a perfect body...it will serve us 
better as the ultimate telephone.     



Life in the DataCloud
Scratching Your Eyes Back In

There was a man who lived in town
And he was wondrous wise.
He jumped into a bramble bush
And scratched out both his eyes.
And when he saw what he had done,
With all his might and main,
He jumped back in the bramble bush
And scratched them in again.

				Old English Nursery Rhyme 

Information is alienated experience.

				Jaron Lanier

Since the Sumerians started poking sticks into clay and claiming that 
the resulting cuneiform squiggles meant something, we've been living 
in the Information Age.  Only lately did someone come up with a name 
for it.  I suppose that was because we quit making anything else of 
value.  Before that, they just called it civilization.    

Indeed, one could make a pretty good case that consciousness, as we 
define it, arose simultaneously with the ability to communicate its 
products symbolically.  (See The Origin of Consciousness in the 
Breakdown of the Bicameral Mind by Julian Jaynes for related 
conclusions.)

The Sumerians had a pretty clear perspective on what this stuff was 
good for.  The preponderance of their cuneiform tablets turns out to be, on 
translation, calendars, inventories, and mnemonic devices for such 
data as one might need to remember but which was too trivial to merit 
conversion into the other storage form of the era, epic poetry.  They 
didn't use it to describe anything.

Perhaps they recognized that even the most mundane experience 
would beggar any effort to describe it if one were serious about 
creating a genuine simulation.

The Egyptians didn't have any such illusions either, but, in addition to 
keeping track of cubits and high water, they found symbols useful for 
their elaborate liturgical purposes.  With so many dramatis personae in 
the pantheon, some method was required for sorting out each one's 
ritualistic preferences.

The Greeks, as was their wont, expanded the envelope further.  To the 
previously established (and sensible) uses for writing, they added 
commentary, philosophy, calculation and drama.  

Still, they restrained themselves from attempting to simulate 
experience on paper (or whatever it was they wrote on).  One might 
argue that drama was an effort to do that, but I think that the likes of 
Sophocles probably just found it easier not to have to personally teach 
his actors all their lines. 

As early as the 5th Century B.C. we hear the first warnings that 
information might constitute an abuse of experience.  Socrates 
suggested that writing things down might damage your ability to 
remember them in their proper, full-bodied form.  (An admonition we 
know about since Plato went ahead and wrote it down as soon as 
Socrates was hemlocked out of the ability to stop him.)

It wasn't until the 17th Century that things really got out of hand.  
Cervantes wrote Don Quixote and fiction was born.  From that point, 
any experience could be plucked from its holy moment in time and 
pressed like a flower in a book, to be reconstituted later in the 
imagination of the reader.   

The thin, alphanumeric trickle that is language was suddenly thought 
to be an acceptable surrogate for the boiling torrent of shapes, smells, 
colors, sounds, memories, and context which amalgamate in the 
cauldron of a human skull and become there something called Reality.  
No longer did one have to "be there."  One could read about it and get 
the flavor well enough.    

This absurd delusion is now universal.  The only reason anyone 
believes it is that everyone does.  

I, on the other hand, began to have my doubts around the time I 
started trying to create some of this magical information myself.  
Sometime in the 4th Grade, I began to write about the things that 
happened to me.  For a while, the approval others showed my efforts 
was enough to inspire their continuation.

Gradually, however, the effort became painful.  The inadequacy of my 
word-replicas for experience was increasingly clear.  I tried poetry.  
This seemed to work until I realized that it did so because a poem is 
about itself and thus has no "real thing" to be compared to.

Writing about something continues to cause me nothing but anguish.  
The symbolic tools are hopelessly mismatched to their three-
dimensional analogues.  For example, the word "chair" is in no way 
like any chair.  

Nor does it begin to imply the vast range of dissimilar objects to which 
one might apply it.  You can hop it up with adjectives... "big red 
chair"...or additional phrases... "big red chair that Washington sat 
in"...but the result is usually bad writing without much advancement 
of your cause.  I mean, "the big, deeply red, densely-brocaded, 
Georgian love seat that Washington sat in while being bled by leeches" 
is still, for all its lugubrious mass, not a chair.

And if it were, it wouldn't move in the way that real things do even 
when they're standing still.  Words just sit there.  Reality vibrates and 
hums.  I have a pet phrase for this element of the mismatch:  Using 
words to describe an experience is like using bricks to build a full-
sized, operational model of a fog bank.

Perhaps it was a subliminal recognition of this fact that caused 
America to fall in love with statistics.  As a descriptive tool, numbers 
are even worse than words.  They are very purely themselves and 
nothing else.  Nevertheless, we now put everything from flowing 
water to the human psyche into these rigid numerical boxes and are 
especially straight-faced as we claim it fits in them.

In doing this, we usually follow a rule I call, with characteristic 
modesty, Barlow's Law of Real Numbers.  This states that the 
combination of any two speculative numbers by any arithmetic 
operation will always yield a real number.  The more decimal places 
the better.

Computers have hardly been part of the solution in this area.  We pass 
our measuring grids over pulsating reality, shovel the results into our 
machines, thrash them with micro-circuits, and pretend that what 
floats up to the screen is "real."

Horseshit.

What computers can do, and have done to a fare-thee-well, is to 
provide us with a hyper-abundance of such processed lies.  Everything 
from U.S. News and World Report to Penthouse is now a dense thicket 
of charts, tables, graphs, and %'s.  All purporting to tell us something 
about what is.  

But it's all just information.  Which, apart from the fact that it's not to 
be confused with experience,  has several problems which Jaron Lanier 
succinctly enumerated for me: "The first problem is that it's 
information.  The second problem is that it's linear information.  And 
the third problem is that it's false information." 

Or, as we say in Wyoming, "Figures don't lie, but liars can figure."

Virtual Reality is probably not going to cure this nonsense any more 
than television, its one-way predecessor, has done.  The global supply 
of words, numbers, statistics, projections, analyses, and gossip...what I 
call the DataCloud... expands with thermonuclear vigor and all the 
Virtual Reality we can manufacture isn't going to stop that.

But it may go a long way toward giving us means to communicate 
which are based on shared experience rather than what we can 
squeeze through this semi-permeable alphanumeric membrane.  If it 
won't contain the DataCloud, it might at least provide some 
navigational aids through it.

Maybe it can scratch our eyes, blinded by information, back in again.