---
date: 2022-07-21T20:52:00Z
---
The first time I heard about solene's Old Computer Challenge was through a great conversation with wsinatra, as well as his detailed blog post at lambdacreate recounting his experience with the first challenge held the previous year. The rules for this year's challenge included a maximum of 1 hour of internet connectivity per day, in a nod to the days of dialup modem internet.
The parameters:
* RAM: 512M of 2G, soft-limited via earlyoom
* CPU: 1 of 4 cores, soft-limited via a ulimit on CPU time (1 minute at 100% CPU)
* OS: Alpine Linux
* Messaging/calls over a mobile device
* IRC messaging over SSH (Day 3 onwards)
My participation was mostly symbolic. Life caught up a few days before the challenge was set to start and I didn't get around to dusting off any cool old hardware for the occasion, but eventually decided to try it with my humble Chromebook, which had already been in use for almost 5 years, though its factory age is closer to 9 years counting from its first release in 2013.
One thing people might notice is that none of the specifications were hard-limited. On day 1 I did a little searching for a way to limit RAM and CPU usage on a per-user basis. The closest thing was cgroups, which were apparently managed by systemd, while Alpine uses openrc. Settled on soft limits that would theoretically kick in if the resource bounds were exceeded for too long, though ideally usage would be reined in so it never came to that. It would also have been possible to enforce the connection time limit with a timer-controlled firewall, but for my purposes it seemed unnecessary. Manually noting how much time was spent on certain online activities would offer some awareness of how and where the time was going.
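For reference, the soft limits boil down to something like this (a rough sketch from memory; the earlyoom flag value and the conf.d variable name are assumptions, not a tested recipe):

```
# earlyoom watches available memory and kills the largest process when it runs low
apk add earlyoom
rc-update add earlyoom default
rc-service earlyoom start
# something like EARLYOOM_ARGS="-m 75" in /etc/conf.d/earlyoom would have it act
# once less than ~75% of the 2G is still free, roughly matching the 512M budget

# per-shell cap on CPU time: 60 CPU-seconds, about a minute at 100% of one core
ulimit -t 60
```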
Overall I liked being able to jump in without a lot of preparation besides adding a new user and pulling in some config files to make it usable for the week. For the occasion I briefly thought about running a different Linux from a list of distros known to work on the Chromebook that I hadn't tried yet, but left that for another day, and instead took the challenge as a chance to check and adjust an approach I had been considering for some time: lighter resource usage and exploring the smol web. A few questions I had were:
The exemptions to the internet time limit were made for social reasons. Some people would probably not count talk/messaging time on mobile devices, but the challenge mentioned the 1 hour was shared across all devices, so the exceptions were noted in the parameters. My view of the limit was that it should show how the internet can be used resourcefully or in other positive ways, e.g. to connect people across different platforms and protocols, not to make people feel isolated as the week passed because they were unable to take calls or text their friends.
wsinatra's blog post at lambdacreate
list of distros known working on the Asus C201 Chromebook
* 20m - looked up resource limit settings, added system-wide fonts
* 10m - downloaded podcasts from a mobile device
* 10m - checked mail, prefetched RSS feeds, checked IRC
* 5m - checked local fediverse
Since it was my first time doing the challenge, one initial thing to do was to take stock of current resource usage. According to `htop`, the most memory-consuming applications were:
Baseline RAM usage was in the range of 146M - 181M using mainly CLI applications:
Including mail and IRC clients would increase the baseline usage by 40M (12M and 28M respectively) to around 221M, with all the applications mentioned running in the background. Given that most of the clients require an internet connection to update content, and given the online time limit, a few of them, like the RSS and Mastodon clients, I might check once daily and then close when not in use to recover some RAM. Multiplexer sessions can group applications together and be re-launched with one command.
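For example, something along these lines brings the usual set back up in one go (a minimal sketch; aerc and newsboat are placeholder client names here, not necessarily what I run):

```
# a detached session with one window per client, then attach to it
tmux new-session -d -s daily -n mail 'aerc'
tmux new-window -t daily -n rss 'newsboat'
tmux new-window -t daily -n fedi 'tut'
tmux attach -t daily
```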
CPU-wise, the list was shorter, mostly because the system had no video acceleration (which should be fixable by custom-packaging the video drivers), so even when I had time to play video games, the majority of modern titles wouldn't run well on it. (Text adventures were fine though, as was retro games emulation.) Applications that tapped noticeably into CPU were:
Typically my internet time might begin with checking mail, RSS, IRC and sometimes the fediverse, the latter of which I could check a little more often after finding a TUI client that I could keep open for longer stretches. Due to the time limit, I switched to downloading podcasts in the background for later instead of streaming.
Spent significantly less time in IRC, from roughly 1-2h to 5m. The client was left open and connected on a server, acting like a bouncer to save conversations in scrollback and reattached as sketched below; only the time actually spent reading and replying was logged. Also spent less time on the fediverse relative to other networks such as IRC. I like the idea of a decentralised social network, but unfortunately I'm also very selective about what kind of posts I'd like to read or follow. It's also easier to have casual real-time conversations on IRC that switch topics or go on at some length, without worrying about potentially overloading other people's timelines.
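The reattach itself is just something like this (host name is a placeholder; keeping the client inside a tmux session named "irc" is one way to do it):

```
# -t forces a pseudo-terminal so the full-screen client can draw itself
ssh user@remote -t 'tmux attach -t irc'
```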
Couldn't access the official Forth website from Lynx, got a `403 Forbidden` error. The web server probably misidentified it as a bot. Not a problem to load it in a GUI browser quickly to download the PDF book for offline reading, but did wonder how many other websites have a similar block.
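One quick way to check whether a block like this keys off the user agent would be to retry the same URL with a browser-like UA string, e.g. with curl (the URL is a placeholder):

```
# if this succeeds where Lynx gets a 403, the server is filtering on user agent
curl -A "Mozilla/5.0 (X11; Linux x86_64)" -O https://example.org/starting-forth.pdf
```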
* 5m - downloaded podcasts
* 5m - checked mail, prefetched RSS feeds and local fediverse timeline
* 50m - checked IRC
Easily spent most of the internet hour conversing with people on IRC. One of the servers I lurk in, affectionately known as "casa" among the regulars, is typically quieter on weekends and livens up with banter during weekdays. In leaving after time was up, I had to cut a conversation short and felt bad about it, even though leaving wasn't compulsory, more an arbitrary personal choice stemming from the challenge. Decided to add IRC (currently connected over SSH anyway) to the messaging exemptions from the internet time limit, to take effect on day 3. For the remainder of the day, I wanted to see what else I would miss without internet access. It's great being able to connect with people anytime despite distance and timezones, and have all sorts of interesting, funny and productive conversations. Being on IRC with a friendly, mutually supportive crowd has a positive effect on my day and was an aspect of internet connectivity I'd like to keep.
Would have to look for a way to download Gemini sites tomorrow. Ran out of internet time and wanted to download the Braxon stories by Joneworlds to continue reading offline after following the series over multiple episodes of the Tilde Whirl podcast.
Another side effect of using up the internet time for the day was turning to other activities I had wanted to try for a while. Played a little solo tabletop tea shop sim called Whistling Wolf Café with the instructions PDF open on the screen and an Android app for dice rolls. The description estimated gameplay at 10-20 minutes, though my first full game took 1.5 hours. It's easy to play, with short rounds that make it similarly easy to pause and resume.
* 10m - downloaded podcasts, mail and RSS
* 30m - looked into saving Gemini content, downloaded PDF viewers
* 10m - looked up tabletop games licensed under CC-BY/CC-BY-SA
Went looking for a download manager for the Gemini protocol and didn't find a suitable utility in the Gemini software list. The closest thing was gemini-fetch, which was more a library than a downloader like wget. One possibility was to use wget to fetch files from a Gemini proxy, but it was not as straightforward as pointing wget at a subdirectory under the proxy URL (it reported the error `disallowed by robots.txt`, and the `-e robots=off` flag didn't work). It might work by taking the list of URLs indexed by wget from the log output and then passing them back to wget inside a shell script loop to fetch each page separately, something like the sketch below. Fortunately in the case of Braxon, the author included an ebook of all the journal entries to date in a gopherhole, which could be downloaded with Lynx and saved the trouble of parsing the wget log.
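For reference, the loop would look roughly like this (untested; the log file name and the proxy base URL are placeholders):

```
# collect the URLs wget discovered during the aborted recursive run
grep -o 'https://proxy\.example/[^ ]*' wget.log | sort -u > urls.txt

# fetch them one at a time; single (non-recursive) fetches don't consult
# robots.txt, and -x recreates the directory structure locally
while read -r url; do
    wget -x "$url"
done < urls.txt
```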
For viewing ebooks, I had been using a local build of Bookworm. The interface for managing its book collection is a bit buggy, but the viewer does work. A lighter option, if the lack of formatting was tolerable, was to convert the EPUB to plain text and either pipe it to a pager like `less` for reading or save it to a file:
epub2txt file.epub - | less
epub2txt file.epub > file.txt
The same idea could be used for text-only PDF files with pdftotext: `pdftotext file.pdf - | less`. However, I regularly browse PDFs that are partly or entirely composed of images, which text conversion doesn't handle. In search of a lighter PDF viewer, I tried a few different applications with a 2-page text-only PDF and a 9-page image-based PDF, just two examples of files I might typically open. These were:
Bookworm also supported PDFs and its RAM usage was close to Evince's, but because it automatically added any opened file to its collection (regardless of whether it could actually render it), I preferred a separate PDF viewer.
* 10m - downloaded podcasts, mail, RSS and fediverse timeline
* 25m - looked up tabletop games under open game licensing
Used the GUI web browser a bit during a search and got up to 4 tabs open with static pages (about 329M) before exceeding the resource limits and having to close the extra tabs. The browser had a keybinding configured to save sessions, which could be restored if the browser closed abruptly or was terminated by earlyoom.
Another category of applications I should probably check is graphics programs such as Inkscape. Previously its memory usage had shot up to 511M with a file left open for 1-2 days. While making a simple cover design over a few hours, usage gradually inched up from 114M with a blank document to 196M, which fortunately was still workable.
Online searches felt slower to complete while checking the clock frequently to pace myself within the internet time available. For example, looking into the topic of tabletop games with open game licensing has taken two days so far, with leads but somewhat scattered results. Following links took time and was more cumbersome with only 4 browser tabs open. Also reserved some internet minutes in case I needed access for something important, but was too tired by the end of the day to make use of the remaining time.
* 10m - downloaded podcasts, mail, RSS and fediverse timeline
* 10m - browsed links from the fediverse
* 35m - looked up a few Gemini clients, browsed Gemini capsules
On the quest for a smol web client. Among the GUI browsers were Lagrange and Castor, which used 73M and 19M respectively with 1 window open. Both were good options visually, but I preferred a client with some keyboard operability. Of the CLI options, I liked Amfora's interface, with colours and tabs. The only drawback was that currently only the right-most tab could be closed, which was a bit annoying. Bombadillo had a web mode for http/https (disabled by default) that could make navigating between protocols more seamless. RAM usage was moderate for both: 30M (Amfora) and 38M (Bombadillo). Also wanted to try Asuka, but it was unavailable in the Alpine repos and I didn't get around to packaging it locally.
Read *Braxon* and a few other stories at Joneworlds. Just noticed I didn't know how to select and copy text from an EPUB file within mupdf; right-click dragging as in PDFs didn't work.
* 5m - downloaded podcasts, mail, RSS and fediverse timeline
* 35m - looked up licenses for tabletop game titles, checked a Gemini message board
* 20m - browsed Gemini capsules
Followed the trail recommended by geminiquicksta.rt and found a link-aggregating message board. My initial impression of the smol web is of an ecosystem where people can focus on telling stories, reading and communicating without a load of elements all vying for attention at once, and without persistent tracking. There are still pockets of the HTTPS web that fill a similar role, but increasingly they seem to be a smaller part of a web dominated by large silos. For browsing in general, 20-30m time segments worked better for me, providing enough time to do longer searches before moving on to other tasks (context switching takes a bit of time).
Also began reading *Starting Forth* with an interpreter open beside the book to try the examples in it.
* 5m - downloaded podcasts, mail, RSS and fediverse timeline
* 45m - looked up tabletop game licenses
Exceeded the resource limits today. I forgot to launch the browser with Javascript disabled, and usage spiked rapidly with 4 tabs open and the other CLI applications running in the background. Quickly got usage back within range and allowed myself 2 tabs with Javascript enabled. There are GUI browsers that use less memory, at the cost of pages not rendering fully and some basic interactive elements not working at all, sometimes combined with crashes and instability. It's a bit like an internet kiosk, except it only has one user. Trying to rein in my sarcasm here.
* 5m - downloaded podcasts, mail, RSS and fediverse timeline
* 30m - searched for a tabbed Gemini browser, tested another CLI browser
* 10m - fetched dependencies to compile a package
* 5m - looked up command flag options
A bit sad that I couldn't play in LeoCAD, a toy bricks CAD program: RAM use was a manageable 117M, but it would hit 348% CPU bursts when dragging parts from the parts selection window to the model view. It might work if I could hard-cap the CPU to 1 core; as it was, it would have been too much like cheating.
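For the record, a hard cap along these lines might have done it (untried; cpulimit throttles a process to a percentage of a single core, and the binary name is a guess):

```
# keep LeoCAD to roughly 100% of one core
cpulimit -l 100 leocad
```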
While looking through the list of Gemini clients on the official Gemini website, I came across Fafi and tried to compile it again. In the previous attempt, the version of racket in the repos was too old (7.x); according to one of the issue reports, the application needed racket >= 8.2. When the resulting executable ran, it would shortly exit with an error like this:
class*: superclass does not provide an expected method for override
  override name: on-close-request
  class name: custom-tab-panel%
In the meantime, racket had been updated to 8.5 in the repos, so I retried a simple APKBUILD I had prepared earlier. Initially got an error, which may have been due to the process being terminated for running out of memory:
raco setup: making: <pkgs>/compiler-lib/compiler/commands
raco setup: in <pkgs>/compiler-lib/compiler/commands
raco setup: in <pkgs>/compiler-lib/compiler/private
SIGSEGV MAPERR si_code 1 fault on addr 0x207
Aborted (core dumped)
Re-ran `abuild -r` and got a different error:
Linking current directory as a package
Compiling bytecode... done.
Building executable...
find-exe: can't find GRacket executable for variant 3m
This issue seemed to be related to the racket compiler, not just Fafi. A workaround was to do:
raco pkg install
raco exe --3m main.rkt
This was probably missing optimisations or other things, but the resulting binary worked. Wasn't keen on the 212M it used with only 1 window open, but it looked very nice, and a healthy ecosystem could use more choices.
TIL w3m has buffers. This was very relevant to my search for a usable web browser that had multiple tabs/views. Would definitely take a closer look at w3m in the coming days.
With internet time almost up for the day, played another solo tabletop game, this time A Day at the Crystal Market. Recently I've been looking at exploratory tabletop games that don't require a lot of materials to play (instructions, maybe a deck of playing cards and 1-2 d6). Hadn't been interested in tabletop games before, but seeing some of the smaller indie games helped me appreciate the wide variety of things that can be done with the genre beyond Dungeons & Dragons and rogue-like dungeon-crawling games.
To revisit the questions I had on day 1:
At this point in time, I look for experiences more focused on people and expression.
From the start, I wanted to set aside more time to explore the smol web because it's creative and interesting in its own way, and not only a refuge from the deteriorating usability of the mainstream web. Didn't get in as much Gopher/Gemini browsing as I'd have liked, with what should have been simple web searches taking up a portion of the allotted hours. IRC would have filled up much of the allotted time (pleasantly) had I not caved and made an exemption. A time limit certainly made me consider where and how to put time towards things I enjoy.
Given the fairly small file sizes of many pages on the smol web, I think there was a missed opportunity in not having a Gopher/Gemini application that could index and cache links up to 2-3 hops away, to be browsed offline later. It might already be possible with existing clients and I didn't know it then. Better preparation next time.
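The rough shape of what I had in mind, assuming a hypothetical `gmi-fetch URL` command that prints a gemtext page to stdout (nothing here was tried during the week):

```
#!/bin/sh
# Depth-limited caching of a capsule: fetch a page, save it, then follow its
# gemini:// links up to MAXDEPTH hops. `local` works in busybox ash and dash.
MAXDEPTH=2
CACHE=./gemcache

fetch() {
    local url="$1" depth="$2" out
    out="$CACHE/$(printf '%s' "$url" | tr -c 'A-Za-z0-9._-' '_')"
    [ -f "$out" ] && return                 # already cached
    gmi-fetch "$url" > "$out" || return     # hypothetical fetch command
    [ "$depth" -ge "$MAXDEPTH" ] && return
    # gemtext links start with "=>"; keep only absolute gemini URLs
    grep '^=>' "$out" | awk '{print $2}' | grep '^gemini://' |
    while read -r link; do
        fetch "$link" $((depth + 1))
    done
}

mkdir -p "$CACHE"
fetch "gemini://example.org/" 0
```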
The main thing was viewing images, e.g. media attachments in fediverse timelines. Previously in tut, image attachments could be viewed with the default GUI image viewer via xdg-open, but I switched to toot for a while because tut stopped loading a significant chunk of thread replies, and in the move lost poll voting and in-client media viewing (faster than loading the toot URL in a GUI web browser). This is one of various inconveniences I'd like to rectify.
Related to this was an undercurrent of slight dissatisfaction with the configuration, whereby I either haven't found the most suitable application with the features I'd like (while still being fairly light!), or haven't found the settings and hacks to make things work as desired. That being said, it still looks like a good direction, and re-discovering w3m makes me more optimistic about sorting out the rough edges eventually. In the days since the challenge, I adjusted the w3m configuration, and with some warming up on the key bindings it has improved my web browsing and reading enough to use it alongside GUI browsers.
Overall, it was a mildly unpleasant week for someone used to accessing the internet in short bursts anytime throughout the day to look up one thing or another, who now had to briefly plan ahead to make the most of the time blocks. However, it was also not a hard time, as there were plenty of other things I could do that didn't require an active internet connection. I wouldn't want to do this every day, but a week is roughly enough time to begin seeing patterns, including what worked and what didn't work so well, for future reference.