Content warnings: existential doubt, allusions to climate nihilism and Overwatch.
When I was in high school, my friends and I really liked playing a game called Overwatch¹.
Now, before I scare my entire audience away with a tangent about a first-person shooter game, I should say that this article isn't about Overwatch. But, if you need to know, Overwatch is a game where you form teams of six people and shoot at other teams of six people. Whoever dominates the territory more wins. What's important to know for our purposes is that Overwatch is, like many games produced today, entirely online, and entirely proprietary. There's no decentralized network of fan-operated servers like the FPS titles of the early 2000s. If you intend on playing Overwatch, then you ought to expect to be connecting to a server owned by Activision Blizzard, possibly rented from Amazon Web Services. But again, really trying not to get sidetracked here, I don't want to talk about Overwatch. Instead, I want to talk to you about something that happened in the Overwatch community a little while ago, and what it tells us about the way software is built today.
On the 3rd of October, 2022, Overwatch shut down.
There was technically an offline portion where you could play against the computer, but there's always been an imperative to connect to Activision Blizzard's servers first. So, when I say "shut down," I mean if you somehow managed to download Overwatch (not through Battle.net since it isn't available anymore), you *would not* be able to play it anymore.
Years of development, more lines of code than I could even fear writing, and a horrifyingly formative part of my time in high school, all gone.
I haven't followed Overwatch much over the last few years. I understand that the sequel to the game is free to play, and that cosmetics from the first have been transferred into the second. One could even speculate that lots of the original code base has been incorporated into Overwatch 2. What does "gone" really mean? Gone is a pretty slippery word when it comes to most things. Most things in life don't just disappear. Matter is neither created nor destroyed, so they say. Is it really gone, or has it just taken on a new form? Nonetheless, this kind of "gone," the way a piece of software can be "gone," is a lot different. Gone, as in digitally forgotten. When a piece of software is gone, it's gone forever.
This is a pretty weird phenomenon. We're trained to believe that once something is put out on the Internet, it's there forever. This is a helpful assumption, but also very wrong. Internet infrastructure is fragile. Data is fragile. Links rot, and services disappear when they fail to provide monetary value to their shareholders. Data can be preserved, however. There's plenty of work being done to preserve information in digital archives. That's also not what I'm looking to talk about in this article. In this article, I want to talk about tools that only work so long as their creators feel it's meaningful to allow you to use them.
My father is really invested in tooling. It seems like we have this conversation every time we meet. Back in the day, things were built differently. Tools were made with love. There is a sort of hand-crafted quality to utilities created before the 21st century, when developers worked under stricter constraints, often much closer to the bare metal. There was more discipline involved in the process, which fuelled creativity that simply cannot be replicated with modern abstractions. Today, tools are cheap, mass-produced, unconfigurable, and designed to be replaced rather than repaired.
My father is a traditional woodworker, for the record.
Maybe you thought I was talking about software. Well, I kind of was. My father used to be a Linux user, but now he's a die-hard Apple fan. And I get it. There's a convenience to using Apple products that lasts as long as the lithium-ion battery doesn't wear out. As I've talked about in a previous article, if you don't have skin in the game, it's hard to appreciate the nuances. I'll probably never have the same appreciation for 19th century hammers and axes as he does, and until he starts writing code, or otherwise gets really into customizing his devices, he probably won't switch back to Linux.
A self reflection on the aestheticization of low-tech and doing it anyway
There's a good chance I've inherited some of this mentality from him, however. Or at least, transcribed it into the realm of computing.
My laptop, which I've probably talked about a few times on this blog by now, is a Dell Latitude 3310. It was probably released in 2019², making it at least four years old. I got it for free from my school when they decided everyone needed a new computer going into distance education during the pandemic. It was the cheapest one they could find. The specs are very lacklustre. The screen is extremely flimsy and it already has serious pressure damage. The thing is a clunky, bevelled sheet of plastic, but I love it, because it does everything I need it to.
It runs Fedora with LXQt wonderfully.
I didn't like it all that much when I first got it. I wanted a nicer laptop--one that I could play video games on and maybe run Blender--but I was very averse to spending money, and it's hard to compete with free. Over the years, however, I've grown to appreciate it for everything it lacks.
One thing it notably doesn't lack, however, is a replaceable battery.
Repairability has come to be something I care quite a bit about in my devices. It had never fully occurred to me until I recently took interest in low-tech circles that while microprocessors might be monetarily cheap, that cheapness does not extend to their impact on the environment and the people who produce them. I've always known extracting rare earth metals is pretty damaging, but like, it would be substantially less bad if we didn't throw them in the trash every two years. Microprocessors are precious. The planet has been irreparably damaged to produce them, and more workers than I could ever imagine have been abused to put them together.
So, if your computer works, keep it. Use it until it physically rots under your fingertips as you type your incoherent tirade onto your monochromatic blog. The thing is, for many people, that literally isn't an option. By some twisted anti-miracle of supply-chain ludicrousness, for most, it's more convenient to throw computers away and replace them than to get them fixed. Sometimes, it costs too much. Oftentimes, all the parts are proprietary and the manufacturers would rather go bankrupt than let you see past the aluminum case. People are systematically disallowed access to the requisite knowledge and even simply the confidence to break open their devices to swap out parts. We've spent a lifetime being told that computers are these scary black boxes that can only be safely touched by someone with a degree in electrical engineering, and so when it comes time to download the latest version of Photoshop or Microsoft Excel, we all line up at the electronics store and buy whatever just landed on the shelf.
You can still find easily repairable computers on the market. Often, you have to get them from hobbyist companies like System76. And, of course, all the big names like Dell and HP still produce computers you can fix yourself, but not too long ago, every computer was designed to be fixed. Nowadays, the imprint repairable computers leave on the world is being quickly overtaken by proto-e-waste that gets churned out at a low cost to the corporation but at a very high price in the labour abuse and environmental degradation that go into its manufacture.
I often facetiously describe non-repairable devices as e-waste. While they work now, they're all but guaranteed not to in the near future. If something can be fixed, then its life can be extended for as long as it matters, so long as reliable parts exist. If it can't, then it exists for exactly as long as its producers want it to. Usually, this is just in time for a shiny new replacement to drop at CES.
Old devices aren't necessarily the solution either. First of all, the second law of thermodynamics dictates that they're growing rarer with each passing year. Secondly, most of these devices have proprietary components. Eventually, manufacturers stop making them, and then you can only get them on the second-hand market. Lithium-ion batteries, for one, start degrading as soon as they come off the assembly line. It's just a matter of time until you're out of options.
Modern computers are extremely complicated. I'd conjecture that there isn't a single human on Earth who fully understands how a given computer manufactured in 2023 works. Everything requires increasingly specialized knowledge to reproduce as a function of time. As such, in a civilizational collapse, if we're lucky, we might live out the rest of our days like we're playing No Man's Sky, strapping strange alien technologies to our exosuits and praying it doesn't blow up in our face. But even that will be short lived, again, because things just wear out. Modern computing is fundamentally incompatible with improvised hardware.
what makes collapse inevitable and imminent (collapseos.org)
It's bad now, and it's only going to get worse.
An easy way to talk about the way software has changed with the reification of the internet³ is to frame it in terms of ownership. I don't want to do that. I think this conversation can be equally if not more productive if we ignore ownership for a moment. After all, what it means to own something as intangible as information is a pretty abstract question that I strongly believe the legal institution has gotten wrong.
Instead, I think it might be more helpful to talk about it geographically. In the past, the main mode of delivery of software was on punch cards you fed into the computer. Later, it was floppy disks, and then CDs. I still have some very old copies of Ubuntu laying around my parents' basement from when I was growing up. In the earlier days of the internet, software was distributed online. You'd look it up in your search engine, download it, and then it was there, represented in your device's storage, as though it'd been installed via CD. This is a very unabstract way of distributing software, because it closely mirrors what physically happens when software is distributed: code is written by a developer, compiled, copied, and physically transferred from their device to yours by some physical medium.
That is still what physically happens when we use software today, but increasingly, the relationship we have with developers is changing, becoming more abstract, and in effect, territorializing our relationship with the software itself.
A clear example of this is web applications. Web apps are extremely transient. Yes, technically, the JavaScript code is being downloaded onto your machine, and in some cases, you can save that code and run it offline, but that's generally not the relationship we have with web applications. A web application is downloaded and run on the spot when you connect to a website, and when you're done, it's gone. Sometimes it's cached, but eventually, the software is disposed of, until the next time you connect to the website and the process starts over again.
But, this is only the first stage of the process. Most web apps cannot be downloaded and run offline. Usually, they rely on server infrastructure that's completely out of your hands, and they'll stop and complain if you disconnect from the internet. This is probably the purest form of transient software. You, the end-user, run a piece of software on your computer (in this case, a web browser) that serves as an interface to other software. The server runs another piece of software (a web server) that transmits other software over a communication channel to your web browser. Thus, we're using software that exists in no one place in particular. It might be better, then, to think of the software as a continuous exchange on top of the internet infrastructure⁴. We're not accessing the developer's computer to run the software; it's still being downloaded to our device, kind of, but not really. Today, we download access to software, not the software itself.
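To make that shape concrete, here's a minimal sketch in TypeScript of the pattern I'm describing. Everything in it is hypothetical--the endpoint, the function name--but it captures the anatomy of a thin client: the code running locally is only an interface, and the moment the server stops answering, there's nothing left to run.

```
// A hedged sketch of the "download access, not software" pattern.
// The URL and names below are hypothetical, not any real service's API.
async function openDocument(id: string): Promise<string> {
  // The software we think we're "using" lives behind this URL, on
  // infrastructure we neither own nor control. If the network is down,
  // this call rejects outright; there is no local copy to fall back on.
  const response = await fetch(`https://app.example.com/api/documents/${id}`);
  if (!response.ok) {
    // The server answered but refused us -- or the service no longer exists.
    throw new Error(`Service unavailable (HTTP ${response.status})`);
  }
  const doc = (await response.json()) as { body: string };
  return doc.body; // All we ever held was access, never the program itself.
}
```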
One of my favourite examples of this comes, again, from the triple-A gaming industry. The most charitable interpretation of video game DLC is that it's extra content developers have created for games so that they can keep making money. Video games suffer from a problem that has led much of the tech industry to move towards the subscription model: once you release and sell a piece of software, you stop making money. This causes software companies to go bankrupt despite having a strong user base. If you're not continuously attracting new users, eventually, money will run out, and you'll stop being able to afford to maintain it. DLC is a way for video game companies to get around this problem: release a game, and then when you need more money, release extra content to keep people engaged.
When we're talking about the triple-A gaming industry, it's usually safe not to bother with charitability. Lots of games come with all the DLC pre-installed and locked with a key. The keys are then distributed either with purchases or through events that come and pass with time. This is quite a bit different from the web app example because all the content is in fact there, fully represented on your hard drive. And yet, the relationship is the same: you can only interact with it using the transient "access" provided to you by the service provider--access that lasts only as long as the service does. This is a problem I run into whenever I try to play old Nintendo games whose online components were shut down over a decade ago, rendering large portions of those games completely inaccessible without cheating, if they can be accessed at all.
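For illustration, here's a rough sketch of what that key-gating looks like structurally (every name here is hypothetical, not any particular game's actual API): the content is fully present on the local disk, but the decision about whether you may touch it is made somewhere else entirely.

```
// A hedged sketch of entitlement-gated local content; all names are
// hypothetical. The asset bytes sit on the player's own disk.
interface EntitlementServer {
  ownsContent(playerId: string, contentId: string): Promise<boolean>;
}

// Stand-in for content already shipped with the game.
const localContent = new Map<string, Uint8Array>([
  ["skin-gold-armour", new Uint8Array([/* ...asset bytes... */])],
]);

async function loadContent(
  server: EntitlementServer,
  playerId: string,
  contentId: string,
): Promise<Uint8Array> {
  // The check, not the data, is the scarce resource. When the server
  // goes dark, the bytes below become permanently unreachable by design.
  if (!(await server.ownsContent(playerId, contentId))) {
    throw new Error("Content locked: no valid key for " + contentId);
  }
  const asset = localContent.get(contentId);
  if (asset === undefined) throw new Error("Unknown content: " + contentId);
  return asset;
}
```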
The content is still all there, sitting dormant, never to be accessed, because accessing it isn't meaningful when half of the puzzle pieces have been lost to time. This is a pertinent example of an issue that affects quite a bit of software, and which has motivated the creation and usage of open source software. So long as any part of the software remains proprietary, you will necessarily remain at the whim of the corporation that provides it to you. You retain access only so long as you provide value.
What does it feel like to write software that doesn't exist?
Speaking as someone who's released every line of code they've ever written of their own volition online under an open source license, it seems kind of lonely, casting your creations into a marketplace that will inevitably be consumed by the void. As someone who's worked on a proprietary code base on the job, my perspective has changed quite a bit. It's made things feel quite a bit more real to see the development process of an entirely transient project unfold from the inside. At work, I don't develop the same kind of relationship with the code I write as I do when I'm working for the commons, because when I'm on the clock, code is written as a small part of a larger transaction I'm making with an organization. While code written as a part of a transactional relationship may be written with passion, it can never be written with love.
So, I'd say that in my personal experience, to write software for ghosts is to feel nothing at all.
In her book *Braiding Sweetgrass*, Robin Wall Kimmerer talks about growing up surrounded by wild strawberries. She understood the wild strawberries to be a gift given to her and her family by the Earth. However, in the Anishinaabe tradition, to give someone a gift has a very different meaning than the settler-colonial understanding of gifts as "things given free of charge":
From the viewpoint of a private property economy, the "gift" is deemed to be "free" because we obtain it free of charge, at no cost. But in the gift economy, gifts are not free. The essence of the gift is that it creates a set of relationships. The currency of a gift economy is, at its root, reciprocity. In Western thinking, private land is understood to be a "bundle of rights," whereas in a gift economy property has a "bundle of responsibilities" attached. (24)
With the gift of wild strawberries from the Earth comes the responsibility to nurture the plants from whence they came. It's a symbiotic relationship: the Earth is cared for and the people get really good tasting strawberries. When you buy strawberries from the store, that relationship is quite a bit different:
It's funny how the nature of an object--let's say a strawberry or a pair of socks--is so changed by the way it has come into your hands, as a gift or as a commodity. The pair of wool socks that I buy at the store, red and grey striped, are warm and cozy. I might feel grateful for the sheep that made the wool and the worker who ran the knitting machine. I hope so. But I have no *inherent* obligation to those socks as a commodity, as private property. There is no bond beyond the politely exchanged "thank yous" with the clerk. I have paid for them and our reciprocity ended the minute I handed her the money. (22)
Like strawberries and wool socks, I like to think of free software as a gift in this sense. When I publish my code online, usually under the GPL, I'm offering it as a gift, and I'm inviting people to form a relationship with me. When I use a piece of code distributed as open source software, I do what I can to reciprocate the gift to the developers. That can take many forms: monetary donations, bug reporting, documentation writing, contributing code, or even just incessantly begging my friends to try it out. So long as that relationship is reciprocated in some way, the software lives on.
Then, it becomes pretty clear how proprietary software is the metaphorical pair of wool socks bought at the department store. The difference is that my wool socks last as long as I'm willing to keep them around. Like my Dell Latitude 3310, they last for as long as I'm individually willing to maintain them. I can repair them when they tear, and until we run out of sheep to exploit, there will always be more wool. The life of my socks can be extended arbitrarily far into the future. It should be even more clear, then, how transient software hollows out this relationship. Traditional proprietary software is transactional, but transient proprietary software is *continuously* transactional. If I really really wanted to, I could keep running Windows 95 on my computer, and if I were a bit more masochistic, then I could probably even fix any software bugs I encounter by... decompiling it, I guess? The same can't be said for the Adobe Creative Cloud, for which you make a transaction every single time you log on⁵. That relationship starts when the connection is initiated and ends immediately when the connection is terminated, leaving nothing but a fading memory.
So it does feel a little lonely developing software for ghosts. But when you put it like this, it's hard not to face the reality that this problem extends far beyond the scope of software development.
How often do you think about the fact that we're currently living in a dark age?
I don't think about it too often. I try not to. It seems as though now, more than ever, we can rest assured that our memories will be preserved well into the future, what with the sheer volume of information we're pumping onto the seemingly immutable Internet. But the Internet is not immutable, and IPFS won't save us from link rot either. Digital storage has got to be one of the worst storage innovations of all time, ranking higher in read and write speeds than any other technology to date, but with a lifespan measured in years to decades at best--a far cry from the centuries to millennia that stone tablets and now antiquated paper managed to accomplish. Digital storage feels like it was specifically designed to enable a massive boom in knowledge accumulation, followed by an almost inevitable and far worse bust.
There's a certain existential dread you've got to encounter sooner or later when you start overthinking these things. Sure, if you're a forward-thinking prepper, you might take this time to write your most valuable data onto simple, analog storage solutions. Printers are everywhere, and hey, if you're really ambitious and looking for a new hobby, you could probably find a nice set of chisels at your nearest second-hand store. But this isn't about you. This is about everything. We don't live in a world designed to operate on analog storage solutions. We live in a world built on waste and impermanence.
When I was in middle school, I got access to an institutional computer network for the first time. With that came a 10 gigabyte slice of my school district's networked storage system. Though today I store my most important files on a one terabyte drive, at the time, 10 gigabytes was more than I could ever imagine filling, and fill it I did. From the sixth grade to my high school graduation, I filled that drive with Scratch games, portable applications, assignments, personal projects, email backups, and whatever else I produced while online at school. When I graduated, I lost access to my school drive. It's been a few years now and there's a pretty good chance it's been erased (but knowing my old school district's tech department, that's a bit hard to say). I was younger and more naive than I am now, and I didn't have the foresight to back up my data. By that point, I hadn't logged into a school laptop in a year or two, and most of my new files were being stored in my personal OneDrive. A single, innocent mistake and years of data were lost forever.
Of course, this is a problem that I created. People have always made bad decisions when it comes to preserving information, and when they do, they pay the price. The thing is, now it's easier than ever to annihilate vast swaths of information with the click of a button (or lack thereof). The world we live in today is much more fragile. It's impermanent.
What does it mean to live and to produce in a world where everything being temporary means you can hardly make it a decade through life before everything you once cared about is lost to time? Where am I even going with this article? Well, dear reader, I'm not planning on ditching you with my existential angst without at least giving you a somewhat satisfying conclusion where I conjecture about the fundamental unity of all things or whatever. Have patience and you will be rewarded.
Something that I find comes up a lot when I commiserate with friends about how awful computers are is that anti-features exist because anti-patterns are fundamentally baked into the way we think about computing. My favourite example of this is cookies and consent. By law, websites are required to ask for consent to use cookies, and they (usually) do, so long as your definition of "consent" is as simple as literally asking if it's okay. That definition is legalistic and completely ignores what consent looks like in a healthy relationship. Consent should be clear, enthusiastic, and revocable, and if acquiring consent means insistently asking someone until they crack under the pressure, that's not real consent. Cookie banners hide their true intent, are difficult to opt out of, and, of course, if you don't accept cookies, if that's even an option, then there's no way for the website to remember that you aren't interested. That last point is key to understanding something quite subtle that also shows up everywhere in the tech industry if you look closely enough: on a fundamental level, the cookie paradigm is not designed in such a way as to make healthy relationships possible. The companies that use them aren't all that interested in establishing healthy relationships with their customers. This isn't a gift economy we live in, after all. You don't have to look too far to figure out why this sort of thing is so essential to the way we structure web services.
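To see how the paradigm itself forecloses the healthier relationship, consider this minimal browser-side sketch (the cookie name is hypothetical, and it takes a deliberately strict reading of "decline"): the only mechanism available for remembering a refusal is the very mechanism being refused.

```
// A minimal sketch of the consent paradox, under a strict reading of
// "decline": remembering the refusal would itself mean storing state
// on the visitor's machine. The cookie name here is hypothetical.
function handleConsentChoice(choice: "accept" | "decline"): void {
  if (choice === "accept") {
    // Acceptance is easy to remember -- with a cookie, naturally.
    document.cookie = "consent=granted; max-age=31536000; path=/";
  } else {
    // A refusal taken literally leaves us nothing to write, so the
    // site forgets it ever asked. The banner returns on every visit.
  }
}
```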
There's nothing fundamental to Alan Turing's Turing Machine that necessitates that the relationship we have with it should be purely extractive. The fact that people created these patterns implies that people can recreate them. With enough humility, patience and desire to break free from the rigid computational traditions we've inherited, I choose to believe there's still hope that computers can be used for good. I choose to believe there is a future in which downloading a piece of software means developing a meaningful, reciprocal relationship with the community that creates it. One where things are real because we make them real together. One where things persist because they continue to be meaningful to us.
When I write code, I want to feel something. It's quite a beautiful feeling, and when I feel it, it's a gift I'd love to share with you. That is, if you choose to accept it.
¹ Now, in case you know what Overwatch is, maybe I need to preface this by saying that I don't like Overwatch anymore. Well, I can't say I didn't enjoy playing it at the time, but I feel like saying "I play Overwatch" carries with it the same kind of stigma as League of Legends, or any other video game you were instructed to hate. Don't get me wrong; Overwatch isn't a great game. Activision Blizzard has a fraught history with labour rights. In a hilariously anti-competitive move, the company is currently trying to merge with Microsoft. But labour abuse and greed are characteristic of the entire triple-A gaming industry. Overwatch itself is just a fancy wrapper around a virtual cosmetic marketplace. The gameplay is pretty much identical to every other class-based shooter.
That wasn't even really the problem, though. The reason I don't like Overwatch is because the people who played it always seemed to be so full of hatred. When I played with my friends, I saw a different, more hate- and spite-driven side of them that I never wanted to see. It made me a more hateful person as well, and while it proved to be a good exercise in self-restraint (anger is the worst emotion to approach a team-based strategy game with), it was that kind of outrage porn that kept people coming back to it, buying more digital products.
² Trying to figure out when this thing was released was a nightmare. I've thrown out all the relevant documentation and the 2019 figure comes from a product compliance data sheet I found on the Internet.
³ If you've found your way to this article I wouldn't be surprised if you'd already agree with me when I say that the internet is Real, and that we're all living in an episode of *Serial Experiments Lain* that's disguised to not look like a psychological thriller. If you don't, then this statement might sound a little weird, and maybe deserves its own article. In short, I'm talking about the sort of way that the internet has colonized reality in the Global North such that they're virtually inseparable.
⁴ You may not agree with me on this. Intuitively, it still seems like we're using software that's on our computer. We're definitely *using* a web browser. But when we access a web app, what are we *really* using? The web app, or the browser? Is the web app an extension to the browser? We are technically using a browser, but the objective of using a web app is not to use a browser; it's to use a web app. The distinction is important, because what we can do with transient web apps as end-users is very different from what we can do with the more "tangible" software that is a web browser.
⁵ The actual payments are made on a monthly basis, but what I understand to be the "transaction" made when logging on is the concession of personal information to confirm one's right to continue accessing the software. That, and of course, the monthly payments, when they come due.
---
"Transience-orientated programming, or how to make software for ghosts" was published on 2023-02-19
If you have thoughts you'd like to share, send me an email!