Sand never asked to think

Today I'm going to do what most normal people would probably call making a problem for myself and then not knowing how to fix it.

I'm not the first person to notice that a lot of how we think about computing is inherently very authoritarian. Lots of critical theorists have wondered whether computers, machines devised by the US military and deeply rooted in its traditions, can ever truly be liberatory. Usually those people are thinking about how people relate to other people through computers, or how computers relate to us. That is, how they control us. Instead, I want to think about how we relate to computers.

When people start thinking about how we relate to computers, the go-to take seems to be that we're being controlled by them. I never watched The Social Dilemma but I think that was the idea, right? Scary algorithms influence our behaviour to make money for Facebook, etc. I'm sure most of the people reading this gemlog already get where I'm coming from here.

Things don't have to be this way, though, and a lot of people are working to re-frame the way we relate to technology. The metaphor I run into often in my circles online is that computers are tools. They're not tools for most people, but with enough gumption one can turn their computer into a very brutalist, functionalist object to manipulate the world. You can install Linux, you can learn the command line, you can find really convoluted bridges to access the social networks your friends and families use without having to download their apps... It can be very exhausting, but it's doable.

The problem I've been having with that model is that it's still authoritarian. There's still a clear hierarchy; it's just that now the roles have been reversed.

So, let's think of the computer and the user as two independent agents. To "horizontalize" this relationship, if the user is to use the computer, then the computer needs to consent to entering this relationship. Here's where things get kind of weird: it's not obvious on its own why the computer would want to enter this relationship. It's not obvious why the computer would want anything, for that matter. Sand never asked to think.

You could imbue the computer with a spiritual will; then maybe it'd make sense to ask for its consent. But, if you're the one imbuing the computer with a spiritual will, then as far as the object of the computer itself is concerned, this relationship is still authoritarian! Once again, sand never asked to think! Why should it want a spirit?

From here, it might be a good idea to take a look at the bigger picture. Computers are an extremely refined byproduct of the Earth. However long that line of separation may be, somewhere in the past, your computer was a gift of the Earth, probably taken through more than a little coercion. However abusive and authoritarian the relationship between the mining company and the Earth may have been, that abuse can end now. You can be grateful to the Earth for its gifts. So, maybe if you're looking to receive consent to compute from some willing agent, then it'd make more sense to think of the world more holistically--to turn to the biosphere, or "mother nature," or Gaia, or whatever it is you want to call it.

Now, here's where I invent a problem for myself: I don't really like this idea lol. Ironically enough, it feels... shortsighted?

Importantly, all this shit is made up. If I got up in front of one of my computer science classes to make the argument that we need to ask our computers for consent before using them to compute, I would probably get comically pulled off the stage by a cane. So, a "solution" to this problem, no matter how sound, isn't worth anything if it doesn't feel "good." I think my problem is that the relationship I have to my computer, while influenced by nature, simply isn't the same, in any meaningful way, as the relationship I have to nature. Maybe it should be; I don't know. But, in a practical, day-to-day sense, my interactions with my computer feel separate. That could change if I chose to live closer to nature, but that's not the life I'm living right now.

I don't have a perfect solution to this problem, but I think I might have a compromise:

Let's go back to the idea of imbuing your computer with a spiritual will. Before, I said that this relationship is still essentially authoritarian because sand never asked to think. I think you might be able to sidestep this problem by accepting that through the long transformation from sand to computer, the computer became essentially soulless. It is a pure commodity with no real relationship to the world around us. This, I would argue, is a bad thing. This kind of sucks. Being soulless is no fun. Because the computer is soulless, it has no will, and it can neither accept nor reject anything we ask it to do.

What we may be able to do, then, is to extend a piece of our own soul to the computer--that is, to make the computer an extension of who we are. From there, we must do everything in our power to use the computer in our own effort to bridge the gap between technology and nature; to put an end to the alienation that constructs a fundamentally soulless prison in which we're all forced to live.

This is how computers can be a liberatory force.

---

"Sand never asked to think" was published on 2024-01-08

If you have thoughts you'd like to share, send me an email!

See here for ways to reach out