💾 Archived View for rawtext.club › ~sloum › geminilist › 006774.gmi captured on 2021-11-30 at 19:37:34. Gemini links have been rewritten to link to archived content
-=-=-=-=-=-=-
indieterminacy at libre.brussels
Sun Jun 20 23:02:09 BST 2021
- - - - - - - - - - - - - - - - - - -
Hi Christian,
Thanks for your swansong concerning this thread.
Apologies that my response deviates from the topic of compression, but your points and citations (btw, do you have the links?) do raise some interesting areas.
Given my experience of HTTP content, I have always meant to tailor my workflow across sites. I would like it so that, depending on the site (or subsite), the content I click on is loaded in a different browser or with a different toolset.
I have seen approaches for dealing with this on the eLisp side, but I haven't got around to implementing it myself. In some respects, my mental association between a specific site and a specific browser allows me to fudge it. While I do relish bindings and hotkeys, I have appreciated Emacs Hyperbole's ability to do something appropriate, from a simple click, based on the contents of my text buffer (though similarly I haven't yet tweaked it for all my needs and workflows).
In general, I do feel that a 'workbench' approach has utility, whereby different tools for different functions are available. One should have the ability to orientate towards a tool with minimal cost. A utility should be dependable, and expectations of deviation reasonable.
Going back to HTTP, browser extensions like uMatrix (and things like NoScript) have been informative regarding the contemptuous/irresponsible behaviour of content providers. The combinations of minimal client supplication required in order to receive content are fascinating, as over time they break down (not only from shifts in infrastructure but additionally from measures deterring JS-blocker users). I'm happy to have these impediments in the way, if only to be reminded of the harm. My less tech-savvy partner tolerates this to an extent, but edge cases really throw her, so all switches go on (!).
I recall a recent thread concerning clients and the use of favicons, with a lot of heat regarding whether an attempt at adding visual flair was detrimental. Similar to the issue of compression, there seems to be an awkward dance which may oscillate between different producers and users of tools and content.
Is it possible to avoid an approach which is heavily dependent on (ambiguous?) norms within a site (i.e., non-protocol-hardened expectations), which can be picked up/interpreted by a client? Instead, the client could call upon a repo (or repos, pre-downloaded, like a CA bundle) which gives an indication of a URI's policies.
If such repo(s) existed, it might allow switching between different clients to satisfy different use cases.
For example:
* Desire compression and visiting site foo.fr? Use this client for this use case, with settings Y, R and I built in client-side.
* Desire a favicon and visiting site bar.gr? Use this client for this use case, with settings G, F, and T activated.
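The dispatch could be sketched roughly as below. This is only a minimal illustration of the idea: the bundle format, the hostnames, and the client/setting names are all hypothetical, echoing the examples above rather than any real policy repo.

```python
# Hypothetical sketch: a client consults a pre-downloaded policy
# bundle (analogous to a CA bundle) mapping hosts to per-site
# preferences, then hands the URI to the matching client profile.
from urllib.parse import urlparse

# The pre-downloaded "policy bundle", keyed by host (invented data).
POLICY_BUNDLE = {
    "foo.fr": {"client": "compressing-client", "settings": ["Y", "R", "I"]},
    "bar.gr": {"client": "favicon-client", "settings": ["G", "F", "T"]},
}

# Fallback profile for hosts with no recorded policy.
DEFAULT_PROFILE = {"client": "plain-client", "settings": []}

def pick_client(uri: str) -> dict:
    """Return the client profile recorded for this URI's host."""
    host = urlparse(uri).hostname
    return POLICY_BUNDLE.get(host, DEFAULT_PROFILE)

print(pick_client("gemini://foo.fr/index.gmi"))
```

A real bundle would presumably be fetched and refreshed out of band, with the lookup perhaps keyed on URI prefixes rather than bare hosts to cover subsites.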
Kind regards,
Jonathan
June 19, 2021 12:49 PM, "Christian Seibold" <krixano at mailbox.org> wrote:
This is the final message that I will be sending on this thread.
The Gemini protocol restricts what is added to the core protocol. Compression is not used there for
a reason as outlined by the FAQ, which I have read numerous times.
What the spec does NOT do is outlaw which filetypes can be sent over the protocol. Zip files can be
downloaded over gemini. Same with any other filetype.
The spec says how to interpret both the protocol itself, and gemtext. It has nothing to say about
handling filetypes, aside from gemtext. Why? Because gemini is a file transfer protocol at its
core. You transfer files over it. The spec also purposefully allows for streaming of files, and
this has been talked about many times in both a mailing list thread and on Solderpunk's gemlog.
What the spec does NOT do is try to censor what files can be downloaded over the protocol. Which
means compressed files are allowed to be downloaded over the protocol, just like any other file.
This is NOT against the protocol, as shown in the spec:
Response handling by clients should be informed by the provided MIME type information. Gemini
defines one MIME type of its own (text/gemini) whose handling is discussed below in section 5. In
all other cases, clients should do "something sensible" based on the MIME type. Minimalistic
clients might adopt a strategy of printing all other text/* responses to the screen without
formatting and saving all non-text responses to the disk. Clients for unix systems may consult
/etc/mailcap to find installed programs for handling non-text types.
I will now give various quotes by Solderpunk that show that Solderpunk was a fan of a variety of
different types of clients that do different things - from his article on "A vision for Gemini
applications"
(gemini://gemini.circumlunar.space/users/solderpunk/gemlog/a-vision-for-gemini-applications.gmi):
What this provides is a nice little, dare I say it, "containerised" Gemini application experience.
You are easily and reliably identified to one service and one service only, and no external content
can control what you do with that identity, and your identity can't accidentally leak out to
anywhere external.
Meanwhile, your "everyday" Gemini client, which will let you go anywhere you like and follow
whichever links you like, does not know the paths to any of your client certificates. Maybe it
doesn't even support client certificates at all! If you roam the dangerous wild internet frontiers
with this client, and you accidentally follow a malicious link to one of your apps running on
localhost, as long as that app requires an approved client certificate to do anything of
consequence, no damage can be done.
Later on:
Some people may still be thinking that this looks like an ugly complication, even if you can wrap
these containers up in nice convenient shell scripts (and I'm sure some kind of GUI management
solution could be whipped up for people who want one): "What, now I need *two* kinds of client to
use Gemini? Gimme a break!". But I think it's a small surface complication which yields large
simplifications deeper down. The containerised identities approach using ultra-slim clients creates
two clearly separate ecological niches for clients: reading static textual content like gemlogs,
technical documentation, fiction, news reports, weather forecasts, etc. on the one hand and making
use of individual, certificate-secured dynamic applications on the other. This allows for client
authors to target one niche only and therefore write simpler clients.
In short, different tools for different jobs, but with a common underlying protocol and markup
language.
Or they can partake of both Gemini experiences. I really hope that this approach can ease some of
the tensions that are building between people with different visions for what Geminispace should
be, and make the lives of developers easier at the same time.
In summary, and this is the most important one:
Most Gemini clients will fall into one of two distinct categories
* "Reading-centric" clients will probably have very little or no support for client certificates at
all. Instead they will focus on beautiful and customisable typography, bookmarking, feed
subscription, TOC generation, etc. There will be many clients of this kind, taking different
approaches. Many will be graphical. People will love some and hate others. Plenty of room for
all.
* "App-centric" clients will instead have good support for client certificates and will be designed
to very securely limit the use of those clients to a single app in order to totally avoid the risk
of CSRF. There will be much less variation amongst these clients because they are all just doing
the same basic thing. Most of them will be terminal-based, because that's just the better interface
for interacting with text-based applications, but they needn't all be.
Hybrid clients attempting to address both the document reading and secure app niches will surely
exist, but to avoid the risk of CSRF they will require either slightly clunky interfaces or good
awareness and understanding from users. These may be the preferred clients of "power users".