2020-09-17
Something about InvisibleUp's thoughts on low-power computing set me thinking about the future of computers and alternative technologies; here I blather on about those thoughts.
The first thing that came to mind was Jeff Fox's stories about iTV and systems developed entirely in Forth: stories of from-scratch computing encompassing TCP/IP, GUIs, GIF/JPEG decoding, file systems, email, and browsers, all running on custom-made Forth chips in a few hundred K of code space. Part of me loves the story (and really, most stories about Chuck Moore), but I can't help feeling a little bitterness in retrospect. I can only speculate as to why iTV never made it off the ground, but it is curious to look at other cool projects in the same realm. GreenArrays, another of Moore's ventures, develops some pretty fringe technology with impressive capabilities: there's a video of a 144-core board, powered by an actual potato, doing gesture recognition with its on-board accelerometers. There are plenty of fascinating stories of the same sort, people using innovative technology to accomplish almost unimaginable feats, but nothing really changes. My own computers might become billions or trillions of times faster, but the software becomes commensurately slower.
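For a sense of why a Forth system can fit in so little space, here is a minimal sketch of my own (an illustration only, nothing to do with iTV's actual code): a toy Forth-style stack machine in C. A real Forth adds a dictionary, an outer interpreter, and I/O on top, but the core execution loop really is about this simple.

```
/* Toy Forth-style stack machine (hypothetical illustration).
   A real Forth layers a dictionary and compiler over a loop
   much like this one. */
#include <stdio.h>

enum op { PUSH, ADD, MUL, DUP, PRINT, HALT };

struct instr { enum op op; int arg; };

static int stack[64];
static int sp = 0; /* next free slot */

static void run(const struct instr *prog) {
    for (const struct instr *ip = prog; ip->op != HALT; ip++) {
        switch (ip->op) {
        case PUSH:  stack[sp++] = ip->arg;            break;
        case ADD:   sp--; stack[sp - 1] += stack[sp]; break;
        case MUL:   sp--; stack[sp - 1] *= stack[sp]; break;
        case DUP:   stack[sp] = stack[sp - 1]; sp++;  break;
        case PRINT: printf("%d\n", stack[--sp]);      break;
        case HALT:  break;
        }
    }
}

int main(void) {
    /* Equivalent of the Forth phrase: 6 7 * dup + .  (prints 84) */
    struct instr prog[] = {
        { PUSH, 6 }, { PUSH, 7 }, { MUL },
        { DUP }, { ADD }, { PRINT }, { HALT }
    };
    run(prog);
    return 0;
}
```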
We get miniaturized servers in our pockets, stripped of most of their intended uses (can you imagine multi-user computing, in the Multics sense of the word, running from a smartphone?). Of course, none of this is very efficient; it is all very much "worse is better" and an accident of history. So we end up strip-mining continents for the material to pack ever more power into these devices, until they pose a literal flight risk for their explosive potential.
I sit here grappling with what is "enough", and while it might vary a little from person to person, whatever the average person currently has is a plausible estimate. And yet the only answer I can come up with for "How much computer is enough?" is that it is never enough.
It is possible to ensconce ourselves in a bubble, here in Gemini-space and in technology communities on the broader internet, where everyone agrees that "JavaScript is to blame". I have made the argument myself, but the real reason we need an Intel i9 to read the news is that people want it that way. There are plenty of low-impact sources for media; people don't use them because they don't want to. Above all else, people have chosen convenience; any proposal that imposes additional friction on them is dead on arrival, no matter what you put on the other end of the scale.
Without rewriting any software or making any real effort, it is entirely possible to boot a computer from twenty years ago and accomplish a significant portion of day-to-day computing. So why don't we? Inevitably it is because of the non-essential parts that don't really work without the latest hardware: people want to stream movies, run eight browser instances, and use video chat, all simultaneously.
Without rambling on much further I'll just say: I like the idea of sustainable computing. I love reading Low-Tech Magazine, whose website runs on solar power. I wish everyone would seriously consider efficiency and the true cost of materials, but I don't think they will.