Do you even compute, bro?
-------------------------

Personal computing, broadly construed in such a way that it includes
smartphones and tablets and whatnot - basically, computing done on a
single-user device which is the personal property of the person doing
the computing - has never been bigger than it is today. The average
person owns more personal computers, buys personal computers more
frequently, and spends more time using a personal computer than even
a lot of hardened computing enthusiasts would have ten years ago.
This is not just more common and more socially accepted than it used
to be, it is, in a weird and quite rapid reversal of social norms,
socially *expected*, to the extent that now you're a social misfit
weirdo if you *aren't* on a computing device all the time.

It follows from all this that the modern average person must do an
awful lot of computing, right?

Right?

Many people will say "Yes, that's right, by definition. Anything
you do on a computer is computing and I won't engage in any kind of
silly No True Scotsman argument to the contrary, you tedious bore!".
That's fair enough, and if you really feel strongly like that you
probably want to stop reading right here, because in the rest of this
post I'm going to attempt a vague partition of things people do on
computers into "intrinsic computation" and "incidental computation".

This whole line of thought has fallen out of my ongoing efforts to
grapple with the issue of sustainable computing, permacomputing,
whatever we're calling it.

One possibly quite informative way to get at what ordinary people
do with the "supercomputer in their pocket", as smartphones are so
often called these days, is to look at the long list of devices,
services and institutions which they have more or less replaced.
Twenty years ago, the average Western household contained or consumed
at least one and possibly multiple instances of most of the items
on the list below:
- Address books
- Alarm clocks / wristwatches
- Answering machines
- Calculators
- Cameras
- Fax machines
- Letters and postcards from, or soon to be sent to, friends
and family
- Magazines and newspapers
- Printed dictionaries and encyclopedias
- Printed photo albums
- Printed recipe books
- Printed street directories or other maps
- Record/cassette/CD/MiniDisc player
- Radio receiver
- Television
- Typewriter
- Video/DVD/Blu-ray player

(I construe "television" here to mean not just a large screen pointed
toward a couch, but a thing that actually receives images remotely
via radio or cable).

Nothing in this list is completely extinct, but they are all
strongly marginalised compared to twenty or even ten years ago.
It would not be difficult to find an affluent Western household which
contained literally zero of these things in their prototypical forms.
Many people still using the above things regularly are either
collectors or enthusiasts doing something the average person feels
no special affinity for, or else are using them out of some
kind of necessity rather than desire. For every one such person,
there are many who own none or almost none of these things and will
happily and confidently assert that they are obsolete, and that they
have replaced *all* of these things with a *single* device, or a
small number of very similar devices. That device will be something
that almost anybody will happily admit is a personal computer.

So we've used what we'd all happily call computers to replace
all of those things above. But how many of the replaced things are
themselves things we'd all happily call a kind of computer? How many
of the tasks those things achieve are tasks we would happily call
computational tasks? Aside from the calculator, probably none. Heck,
some of the things on that list are purely mechanical! And many of
the electronic devices once existed as purely analogue devices, with
no bits or bytes in sight, no processors, no memory. Of course,
mechanical computers exist, and analogue computers exist, too.
But most people wouldn't call a typewriter a mechanical computer
or a transistor radio an analogue computer. These things aren't
computers, and they're not doing computational tasks. Maybe I'm
late to the party here, but I have honestly only just recently come
to appreciate the sheer extent to which the following is true:

The dominant use of personal computers in the 21st century is the
functional simulation of non-computers.

In retrospect, this feels obvious. It explains, perfectly, why
personal computing went from being something that most people only
did at work to playing a central role in daily life only after, and
almost immediately after, computers started being connected to the
internet 24/7 and having cameras, microphones and speakers built
right into them. Modern personal computing is first and foremost
about communication and about multimedia, which is what pre-PC
consumer appliances were also first and foremost about.

This is obvious in retrospect, but when the penny dropped,
I was shocked. *This* is what ubiquitous personal computing
has brought us? Okay, we can solve all those communication and
multimedia problems with one small, light, robust, portable device
instead of twenty big heavy fragile electromechanical contrivances,
we can do it faster, we can do it with higher fidelity, we can do it
with less power consumption (sure as heck not less embodied energy,
but we'll get to that), and it's cheaper and more accessible and
access is more democratised. Those things sure aren't nothing,
and they're not bad things in and of themselves, but...really?
This is it? Shouldn't computers be used for all kinds of
amazing really qualitatively different things that we couldn't
imagine before they came along? We are talking, after all, about
thinking machines! Electronic brains! Computers are a fundamentally
new kind of machine, they do something that non-computers don't
do, something we had to invent brand new formalisms to properly
reason about. Surely it isn't asking too much to expect them to
deliver something genuinely, ground-breakingly new? What happened
to bicycles for the mind?!

Of course it's not true that computing has not impacted the average
person's daily life. It has, and it does. Every time you check the
weather forecast you are consuming the output of a supercomputer!
A real one, not the glorified thin-client in your pocket.
And computers model the climate and pandemics and economies and
demographics for us so we can have better situational awareness
about our world and make more informed decisions. And computers
design more aerodynamic shapes than any unaided human engineer
could conceive of. And so much more besides. We really *are*
living in the computer age. But the computing that has impacted us
most happens far away, in universities and government institutions
and corporate R&D labs. It's done by specialists, professionals.
It's not personal computing. It's not in our homes. We are not
active participants.

If most people have nothing better to do with a personal computer
than simulate non-computers on it, how did we end up in a world where
some of the richest and most powerful companies in the world are in
the personal computer business? I think the answer is pretty simple:
we live in a world where there is a tremendous, overwhelming demand
for the ability to exchange text and sound and images and videos and
interactive multimedia experiences with people far away from us.
If you respond to that demand and you optimise your solution for
minimum size, minimum weight, minimum energy consumption, minimum
manufacturing costs, maximum speed, maximum fidelity, maximum device
convergence, and, if you're cynical enough, maximum ability of device
manufacturers to control or restrict the abilities of device users,
then personal computers and 24/7 high-speed, low-latency global
computer networks fall out naturally.

But the computing that goes on as a result of that optimisation isn't
"intrinsic computing". It's "incidental computing". A thought
experiment makes this clear: suppose that tomorrow, personal
ownership by private individuals of devices with more than 1,000
transistors in them is outlawed. Smartphones and tablets as we
know them become impossible. How do Apple and Google and friends
respond to this? Do they embark on a huge, long, expensive project
to design and mass produce hyper-miniaturised versions of Konrad
Zuse's electromechanical relay computers or Charles Babbage's purely
mechanical analytical engines, and then port iOS and Android to those
clicking, whirring new platforms? Or do they instead abandon the
computational substrate entirely and use modern material science
and highly automated, miniature manufacturing technology to build
a new generation of souped-up analogue consumer appliances which,
while not as good as what we are used to in 2023, are likely still
considerably better than what we had in 1993? Which of these two
courses of action do you think is the most likely to produce a viable
commercial product in the least time at the lowest cost? These are
rhetorical questions; the first option is obviously absurd, a fool's
errand, and the second option wins: computing would be discarded
as soon as it stopped bringing practical benefits. Our everyday
digital devices really are only incidentally computers. We don't
build them as computers because we all need to automatically and
accurately process large quantities of data according to specified
sets of rules every day, we build them as computers because digital
microelectronics are really small and quiet and run cool and you can
smack 'em around a bit and let them get dusty without it mattering
too much (those are not properties of computers in general!).

Meanwhile, the scientists and the engineers, if the transistor limit
were extended to them, might actually take the Neo-Zuse-Babbage
gambit, because a lot of their work isn't practical without it.
Their work is intrinsically computational.

The starting point for any kind of really serious sustainable
computing movement has to be to look at what people are actually
doing with computers, and figure out which of those things can and
can't be done "well enough" in a more sustainable way if we do them
without computers at all. If we try to address the tremendous demand
for long-distance, high-speed communication and multimedia exchange
as best we can by optimising not for all the stuff listed earlier,
but instead for maximum device lifespan, maximum repairability,
maximum recyclability, minimum embodied energy, minimum supply chain
length, and maximum user autonomy, there might still be some use
cases where personal computing provides the best solution. But I
suspect that for a great many of them, general purpose computers will
start to look like square pegs in a world of round holes. In which
case, we should gleefully abandon them! And apply ourselves then
to the question of how best to design and build and use computers
specifically for that subset of tasks where they are indispensable
- which might be done very differently to how we'd do it when just
attempting to replicate the status quo.

I could be quite wrong about the square peg thing. Maybe computers
really can still maximise that very different utility function
better than non-computers can. And one can also argue that as long
as there's just one single lonely use case where computers win, then
everybody's gonna have them anyway, and then in that context adding
additional non-computers into the mix for other use cases can only
increase the overall footprint. I'm not 100% sure that argument is
bulletproof, but it's not easily dismissed, either. Even if I am
wrong, I bet that the kind of personal computing that emerges from
maximising sustainability looks quite different from the kind that
emerges from maximising what essentially boils down to convenience.

Because I'm publishing this via Gopher and Gemini, it's going
to be read predominantly by "computer people", and a lot of
computer people are going to instinctively react negatively to me
characterising personal computing as being "just" about communication
and multimedia. Yeah, sure, I agree, Lisp and FORTH are divinely
beautiful intellectual constructs, code is poetry, generative
computer art (the kind where the artist writes the code, not these
trendy pre-packaged "AI" things everybody is talking about) lets
humans express their inner thoughts and feelings in ways they couldn't
even dream of without computers, and all that stuff is deeply,
undeniably intrinsically computational. I really get it! I'm one
of you. But we have to realise and accept that when considering the
destructive ecological footprint of the modern computing landscape,
*that* kind of personal computing is a tiny fraction of a percent
of the whole. To a first order approximation, nobody on Earth
does that kind of computing. That's not to say it has no meaning
or value, I really believe it does. That kind of stuff is good
for the soul, if you have the right kind of soul. But making that
kind of computing sustainable is barely even a difficult problem.
If that kind of computing is your thing, you can have a lifetime
of fun playing with discarded junk from the 80s! Heck, here's a
flippant final remark: the idea that that kind of computing would
ever be everybody's thing, as opposed to the thing of a very small
group of very weird people, is itself a largely discarded idea
from the 80s - writing the "Orphans of Commodore" companion to
"Orphans of Netscape" is left as an exercise to the reader. :)
[UPDATE 2023-01-23] adiabatic reckons I've underestimated just how
much mind-cycling people do[1]. Maybe I have exaggerated by writing
it off as a rounding error. I still don't think it's as much as 10%
of computing, especially not when weighted by power consumption rather
than time. I agree that it's a lot more common at work than at home,
even work that's not science/engineering as mentioned above. If I
remember rightly, Jobs' original "bicycle for the mind" talk did speak
pretty explicitly at times about "worker productivity", so personal
computing at work is fairly within scope. I have been thinking mostly
in terms of private use at home, but to be fair to adiabatic I barely
even tried to make that explicit in the above.

[1] gemini://gemini.circumlunar.space/users/adiabatic/scrawlspace/