Archived View for gemini.circumlunar.space › users › adiabatic › scrawlspace › 2024, captured on 2024-02-05 at 09:35:21. Gemini links have been rewritten to link to archived content.
-=-=-=-=-=-=-
This is scrawlspace. I scrawl in this space. Do not expect coherence or permanence here.
Prior reading:
JeanG3nie, "When a walled garden becomes a preserve"
Money graf:
At this point, Apple's refusal to allow another browser engine on its platforms might be the only thing keeping Chrome from being able to fully dictate the direction of the web.
I'd certainly say the same, but I prefer Apple things to Google things. I'm not a neutral third party.
John Gruber has a look into what changes under the Digital Markets Act over in the EU:
John Gruber, "Apple's Plans for the DMA in the European Union"
The relevant bit:
One point of confusion is that some aspects of Apple's proposed DMA compliance apply to the App Store across all platforms (iPhone, iPad, Mac, TV, Watch, and soon, Vision), but other aspects are specific to the iOS platform — which is to say, only the iPhone.
And then thereâs Appleâs relevant page:
Apple, "Using alternative browser engines in the European Union"
iOS 17.4 introduces new capabilities that let iOS apps use alternative browser engines — browser engines other than WebKit — for dedicated browser apps and apps providing in-app browsing experiences in the EU.
Two things of note:
iOS
(as opposed to iOS and iPadOS)
in the EU
So unless you're ignoring iPhone users outside the EU, you, as a website developer, can't just tell your iPhone-using visitors to download Chrome-with-Blink-in-it and come back. Even if you'd happily do the work, the more business-minded people up the chain of command won't have a net financial incentive to say "let the Apple people download Chrome and then they can visit our site". You'll have to put in the time to make the site work right in Safari.
This state of affairs largely preserves Apple's ability to defend its ecosystem and users from Google's snooping. After all, if you have to use Google's browser to do almost anything on the web other than browse a handful of indie sites, that's a clear-cut monopoly, and it makes real consumer choice all but impossible. Anti-consumer-choice monopolies, of course, are the kind of thing governments say they're against, at least when they're in the private sector.
I first encountered the phrase "let him cook" on a Twitch stream where the streamer speedruns The Legend of Zelda: Breath of the Wild. When you cook food in this game during a speedrun, it's generally to make up large batches of food, and you can't un-make an omelet, so there'll be a chorus of LETHIMCOOK in chat to get other viewers to hold off, at least temporarily, on trying to get the streamer's attention.
My second encounter with the phrase, or something like it, was close, but a bit less literal.
Linkus7 on YouTube – Can you beat every dungeon without the paraglider in Tears of the Kingdom?
If you avoided getting the paraglider, then there are more than a few places where your options to continue on are basically one of these two:
The second of these options is way less entertaining, so a guy whose day job is "entertainer", and who entertains by playing games, naturally tries for the first option.
It's in this context that he says "let me cook" — but here, he's not asking Twitch chat not to try to get his attention. He's asking them to hold their horses while he tries to work out a solution to falling down a 2,000-meter hole without dying from the sudden stop at the end.
⁂
…and then I saw "let X cook" on X, coming from the HTMX account:
@htmx_org on stored procedures driving UI
i don't like the idea of stored procedures driving UI mainly due to the mechanics of updating them (version control, etc) but i'm willing to let them cook because eliminating the app server/db hop is one of the last big, obvious perf wins in most web apps...
(This is in the context of a hypothetical "React Database Components". If you don't want to click through, imagine a stored procedure in your database that returns a snippet of JSX, and inside that is a list of todo items all wrapped in li elements, and the bundle is wrapped in a ul element.)
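If it helps to see the shape of the idea, here's a rough sketch in TypeScript rather than SQL: a plain function standing in for the hypothetical stored procedure, taking todo rows and handing back the rendered snippet directly. The names, the row shape, and the choice to return an HTML string instead of actual JSX are all my own illustration, not anything from the HTMX thread.

```typescript
// Stand-in for a hypothetical "React Database Components" stored procedure:
// instead of returning rows for an app server to template, the database
// itself would hand back the rendered snippet. All names here are invented.

interface TodoRow {
  id: number;
  text: string;
  done: boolean;
}

// What the stored procedure would conceptually compute: each todo wrapped
// in an <li>, the whole bundle wrapped in a <ul>.
function renderTodoList(rows: TodoRow[]): string {
  const items = rows
    .map((row) => `<li${row.done ? ' class="done"' : ""}>${row.text}</li>`)
    .join("");
  return `<ul>${items}</ul>`;
}
```

The perf win the HTMX account alludes to is that this snippet could be computed where the data already lives, skipping the app-server/database hop; the version-control worry is that this template logic would now live inside the database.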
Still, thereâs the phrase
let them cook
If you find yourself looking for a way to reserve judgement on an idea until implementations of the idea get better fleshed out and/or better-spaded so the upsides and downsides are better understood, you could do worse than to haul out this turn of phrase.
Background information:
I played Diablo 3 for a bit.
One of the things I noticed was that once I got to endgame content, I could mostly shut my brain off while killing demons. However, I had to pause podcasts and give my full attention to selling all the loot I had accumulated, because "do I keep this or do I sell this?" took all of my decision-making faculties and wasn't something I could just outsource to my brain stem.
I thought about this for a bit when pondering the process of cooking in The Legend of Zelda: Breath of the Wild and Tears of the Kingdom. If you want a particular effect, or a particular level of an effect, you can't just shut your brain off — oftentimes you have to look up specific ingredients and their potencies, and maybe use an online calculator to find out whether you can make something that will give you a level-3 buff for as long as you think you'll need it.
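A toy model of why this takes a calculator: ingredients contribute potency and duration, and the buff level depends on where the potency total lands. To be clear, the numbers and thresholds below are invented for illustration — the real games use per-ingredient tables that players have to look up, which is exactly the point.

```typescript
// Toy model of effect-cooking math. Potency values and level thresholds
// are invented; the real games' tables are what the online calculators encode.

interface Ingredient {
  name: string;
  potency: number;     // effect strength this ingredient contributes
  durationSec: number; // effect duration this ingredient contributes
}

function cookedBuff(ingredients: Ingredient[]): { level: number; durationSec: number } {
  const potency = ingredients.reduce((sum, i) => sum + i.potency, 0);
  const durationSec = ingredients.reduce((sum, i) => sum + i.durationSec, 0);
  // Invented thresholds: 7+ potency for level 3, 4+ for level 2, else level 1.
  const level = potency >= 7 ? 3 : potency >= 4 ? 2 : 1;
  return { level, durationSec };
}
```

Even in this simplified form, hitting "level 3 for long enough" means juggling two sums at once — which is the part that can't be outsourced to the brain stem.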
Niklaus Wirth passed away recently, and so his "A Plea for Lean Software" has been making the rounds:
Niklaus Wirth, "A Plea for Lean Software" (February 1995)
I actually read it in full. It's not long. A bunch of people have posted excerpts they agree with. He ends with a list of lessons learned from Oberon. These are mostly sensible, although #5 is a bit suspect. My takes:
⁂
However…
Wirth is writing this at the beginning of 1995. Windows 95 is to come out that summer, Windows 3.1 is already out there for normal people, and Windows NT 3.5 has been out for a few months. Oberon, his pride and joy, was written between 1986 and 1989, back when Riker was clean-shaven and Windows hadn't hit 3.0 yet; Windows didn't get popular until 3.0.
Back to Wirth. The speed of development of Oberon is impressive:
Designed and implemented — from scratch — by two people within three years, Oberon has since been ported to several commercially available workstations and has found many enthusiastic users, particularly since it is freely available.
Oberon, to its (minor) credit, appears to have both color and graphics, although it's not obvious from the screenshot that any kind of graphical paint program is possible in it. Presumably the giant squiggly can be generated with text, like SVG or POV-Ray. This will be relevant shortly.
Where Wirth seems to go off the rails is near the beginning of his article. There, he lays out his idea of what are — in 1995 — mere nice-to-haves:
Uncontrolled software growth has also been accepted because customers have trouble distinguishing between essential features and those that are just "nice to have". Examples of the latter class: those arbitrarily overlapping windows suggested by the uncritically but widely adopted desktop metaphor; and fancy icons decorating the screen display, such as antique mailboxes and garbage cans that are further enhanced by the visible movement of selected items toward their ultimate destination. These details are cute but not essential, and they have hidden cost.
In modern terms:
Later, he continues:
increased complexity results in large part from our recent penchant for friendly user interaction. I've already mentioned windows and icons; color, gray-scales, shadows, pop-ups, pictures, and all kinds of gadgets can easily be added.
Modernizing:
⁂
Personally, I'd like to have seen a debate between Niklaus Wirth and, say, Jakob Nielsen of the Nielsen Norman Group. Both men have an anti-frippery bent, but the usability proponent would have a much broader idea of what work needs to be done to make systems usable for normal people who aren't computer experts, and also for people who have one or more computing-relevant body parts, like eyes or arms, that don't work right.
While text-to-speech seems to be a mostly solved problem even on wrist-worn consumer hardware, speech-to-text seems to be a problem that will happily consume whatever computing resources you can throw at it — up to and including machine-learning models that take up something like half the RAM on a 32 GB machine with an M3 Apple Silicon processor.
References:
If you want to read older entries, hereâs the page for the previous year:
If you want to stay abreast of updates, have a look at this capsule's colophon. It links to the capsule's JSON Feed and Atom feed.
Additionally, the following URL will always redirect to the current year, assuming I haven't forgotten to update the redirect after making the first post of the year:
⁂