Distributed SSL seems a tough problem to crack. But if you have a central authority, they could go rogue, and then everyone's in trouble.
Gemini's solution - Trust on First Use - removes any need for central authorities, but it also removes our ability to trust sites the first time we visit them.
But when do we first visit them? When we use a link! So this gave me an idea.
(I have no education in this field, so if the following proposal seems laughable, do e-mail me, but make it an easy e-mail for the uninitiated.)
Currently, my capsule has a Gemini link to youshitsune, which looks like this:
=> youshitsune.tech Youshitsune
And the key looks like this:
"youshitsune/tech" = "93CF147C1EEEFD64A67514BC5F2BE2B27A009460FB38F6D1B5CB91A8D76EB0CB"
(I've just checked, and the certificate changed early; is the new one valid, or is it some trick? I have no way to check!)
Anyway, my proposal is to structure links like this:
=> gemini://youshitsune.tech {869A9DEDFA1318CD80B840F26CA9ACCE36F5BD5FE133DA638538E5721186FBDA} Youshitsune
That's a little ugly, but we'll put that aside for now. The idea is to place the key in the link itself. The browser would then check the key against the certificate the site presents, and warn the user if the two do not match.
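Roughly, the check might look like this (a sketch, assuming the fingerprint is a SHA-256 hash of the DER-encoded certificate, which matches the 64-hex-digit strings above):

```
# Fetch a capsule's certificate and compare its SHA-256 fingerprint
# against the one claimed in the link.
import hashlib, ssl

def cert_fingerprint(host, port=1965):
    # Fetched without chain validation: Gemini certs are usually self-signed.
    pem = ssl.get_server_certificate((host, port))
    der = ssl.PEM_cert_to_DER_cert(pem)
    return hashlib.sha256(der).hexdigest().upper()

def link_key_matches(host, claimed):
    return cert_fingerprint(host) == claimed.upper()
```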
This means any malicious actor would need to control two sites instead of one - a serious improvement over the current model.
If the key were missing, this would not mean any kind of failure. The capsule simply remains 'unknown', or 'verification level 0'.
That last proposal is a change to the Gemini protocol itself, but browsers could do more to help. They could keep a copy of every key they see, for every domain: browse a page with 10 links, and the browser records all 10 keys. Then if a capsule suddenly changes keys, we have a much better chance of having something to verify the new key against, since normal browsing would naturally populate a list of keys for capsules.
Of course, these 'suggested' keys should not all have equal status. If 5 places say the key is X, while 2 say it is Y, then X clearly wins. And anyone who disagrees with that weighting could change a setting to make their browser more careful, or less.
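As a sketch of how a browser might store and weigh those sightings (the majority threshold is made up, and would really be that user setting):

```
# Record every fingerprint that links claim for a host, then judge a
# capsule's current key by simple majority of sightings.
from collections import Counter, defaultdict

seen = defaultdict(Counter)  # host -> fingerprint -> times seen

def record(host, fingerprint):
    seen[host][fingerprint.upper()] += 1

def verdict(host, current, threshold=0.5):
    votes = seen[host]
    if not votes:
        return "unknown"  # 'verification level 0'
    share = votes[current.upper()] / sum(votes.values())
    return "trusted" if share > threshold else "suspicious"
```

So with 5 sightings of X and 2 of Y, X ends up trusted; raising the threshold makes the browser more careful.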
If this were implemented, I wouldn't want to handle those keys myself. I only have links to about 4 other capsules, and when one disappeared, I didn't notice for a month. Clearly, it should be automated.
Of course, that shouldn't present much of a challenge for sysadmins. Anyone putting out their own capsule could keep a shared git repository which only lists trusted keys, and each new commit could receive a GPG signature from the capsule's administrator.
If we had a repo containing that line from youshitsune (above), Gemini pages could be updated with a simple `sed` script. Someone would only need to grab a GPG public key once, and could then confidently link to a capsule by pulling that person's signed version of the git repository.
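As a sketch of that automation (in Python rather than sed, and assuming keys are stored against the bare domain, one per line, in the format shown above):

```
# Stamp each bare Gemini link in a page with the trusted key for its host.
import re

def load_keys(path):
    keys = {}
    for line in open(path):
        m = re.match(r'\s*"([^"]+)"\s*=\s*"([0-9A-Fa-f]{64})"', line)
        if m:
            keys[m.group(1)] = m.group(2)
    return keys

def stamp(page, keys):
    def repl(m):
        host, rest = m.group(1), m.group(2)
        key = keys.get(host)
        return f"=> gemini://{host} {{{key}}}{rest}" if key else m.group(0)
    return re.sub(r"^=> gemini://([^/\s]+)(.*)$", repl, page, flags=re.M)
```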
Alternatively, the robots.txt file could hold a copy of the capsule's certificate, signed by the owner's GPG key; then I could have a script check that youshitsune's GPG key has indeed signed this new certificate.
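A sketch of that check, calling the real gpg tool (the file names, and the idea of serving these alongside robots.txt, are just my assumptions):

```
# Verify a detached GPG signature over a copy of the capsule's certificate.
# Assumes the owner's public key was imported once, out of band.
import subprocess

def cert_is_signed(cert_path, sig_path):
    result = subprocess.run(["gpg", "--verify", sig_path, cert_path],
                            capture_output=True)
    return result.returncode == 0
```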
Gemini began with the ambitious goal of writing a client in 300 lines. The community clearly won't like any attempt to add yet-another-step to the protocol.
...but then again security trumps all.
Changing keys guarantees broken capsules, since everyone visiting would still have the old key, while the new one has not yet propagated. To fix this, links might provide a list of valid keys.
I don't know what the limits might be here, but two keys sound like a minimum.
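For example (the syntax is still a guess), a link could carry the old key alongside the new one:

=> gemini://youshitsune.tech {93CF147C1EEEFD64A67514BC5F2BE2B27A009460FB38F6D1B5CB91A8D76EB0CB 869A9DEDFA1318CD80B840F26CA9ACCE36F5BD5FE133DA638538E5721186FBDA} Youshitsune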
At a minimum, every existing browser would have to be updated to ignore that second part, in the curly brackets. That's a lot of browsers!
Anyone setting up a capsule with a link could leave it running for years without updating, which would leave a lot of stale keys, and so a lot of false alarms. But of course, browsers should not require any key to load sites, so nobody should feel forced into hunting down a server's key.
SSL certificate sharing is a general problem, so why start with the smol internet?
Well, because of all the nerds! And energy! And gumption!
The web will not change. Nobody who makes decisions about how browsers run will look at any proposal and think about the benefits of decentralization - they'll think about their corporation.
Gemini users know what's what (technologically speaking, at least). And maybe this is a dumb idea, but if it's a good one, Gemini could realistically adopt it.
I don't think this could work across the web, because you can't connect to just one site. You connect to a plethora, which connect to Google.
If javascriptfonts.com came under the control of bad actors, they might change a boat-load of sites, which would mean a lot of false reassurances that posteo.de has changed its certificate, when in fact it hasn't.
Well... that's not exactly plausible, since we only use javascriptfonts.com for downloading fonts; but the web's general level of cross-site traffic seems to preclude trusting any site, since a site never just shows you that site. And even if it does, it can't guarantee to show you just that site.
If this works, then nobody would require a trusted DNS, just an accurate one. As long as the browser (or user) can check the cert, any number of IP addresses might match up to the domain. If we don't need to trust the DNS, then the browser can grab lists from any source and sort out the veracity afterwards. And of course, any DNS provider can do the same with any number of sources.
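A sketch of that last point: given a known fingerprint, the browser can try every IP that any resolver suggests, and keep whichever one presents the right certificate (the helper below, and its parameters, are my own invention):

```
# With a trusted fingerprint in hand, DNS answers only need to be accurate,
# not trusted: any IP that presents the right certificate will do.
import hashlib, socket, ssl

def find_good_ip(host, fingerprint, candidate_ips, port=1965):
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE  # we verify the key ourselves, TOFU-style
    for ip in candidate_ips:
        try:
            with socket.create_connection((ip, port), timeout=5) as sock:
                with ctx.wrap_socket(sock, server_hostname=host) as tls:
                    der = tls.getpeercert(binary_form=True)
                    if hashlib.sha256(der).hexdigest().upper() == fingerprint.upper():
                        return ip
        except OSError:
            continue
    return None
```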