Mansfield mansfield at ondollo.com
Sun Feb 21 18:53:36 GMT 2021
- - - - - - - - - - - - - - - - - - -
I like the idea of supporting the creating-user's freedom to choose to have their content only accessible through the gemini protocol. (Of course this support only goes so far - once content is available, it's available. Stopping someone from taking what you've made available and doing what they will with it... short of a legal license and litigation... ugh.)
So... I have an HTTP Proxy and, while I provide a robots.txt, I'd like to explore going a step further. I'd like to provide a URL path prefix like: /internal/blacklist
The process I'm thinking of is explained in more detail below, but the result would be that the HTTP Proxy refuses to forward requests to self-blacklisted Gemini hosts.
Before the process, going to https://gem.ondollo.com/external/ondollo.com would return the text/gemini content available at gemini://ondollo.com to the requesting web browser.
After the process, going to https://gem.ondollo.com/external/ondollo.com would *not* return the text/gemini content available at gemini://ondollo.com
Maybe the proxy could instead return a page that says... "The owner of this content has requested that it only be accessible through the gemini protocol and we respect that. Please use a Gemini client to access content at gemini://ondollo.com. Here are some recommended clients: <list of one or more clients>"
... or... the HTTP Proxy could just 404? 400? 204 No Content? 301 Moved Permanently with a gemini URI? 403 Forbidden (not a *user* authz issue... but a *proxy* authz issue...)? 410 Gone? 451 Legal? (As an aside: there is a part of me that loves discussions/debates around what the right HTTP status code is for a given situation... there's another part of me that dies every time...)
I think I'll go for a 200 HTTP status and content that explains what happened and encourages the user to access the content through gemini.
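Here's a rough sketch of what I mean in Python - purely illustrative, not my real proxy code; the BLACKLISTED_HOSTS set, the proxy_response helper, and the message wording are all made up for the example:

```python
# Hypothetical sketch: answer /external/<gemhost> with a 200 and an
# explanatory body instead of proxying a host that has opted out.
BLACKLISTED_HOSTS = {"example-capsule.org"}   # hosts that asked not to be proxied

def proxy_response(gemhost: str) -> tuple[int, str]:
    """Return (HTTP status, body) for a request to /external/<gemhost>."""
    if gemhost in BLACKLISTED_HOSTS:
        body = ("The owner of this content has requested that it only be "
                "accessible through the gemini protocol and we respect that. "
                f"Please use a Gemini client to access gemini://{gemhost}.")
        return 200, body
    # Otherwise the proxy would fetch gemini://<gemhost>/... and relay it.
    return 200, f"(proxied content for gemini://{gemhost} goes here)"
```

Whatever the exact wording, the point is that the proxy answers for itself and never contacts the blacklisted gemini host.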
Here's a sequence I'm thinking of (given a creating-user and their gemini server at gemhost, and a consuming-user and their web browser... and the proxy at proxyhost... and 'somepath' - a sufficiently unique and random URL path):
1. The creating-user makes a new document on their server available at <gemhost>/<somepath> with the content <proxyhost>
2. The creating-user makes an HTTP GET request to <proxyhost>/internal/blacklist/<gemhost>/<somepath>
3. The proxy makes a Gemini request to <gemhost>/<somepath> and gets back content that matches its own proxyhost value
4. The proxyhost adds the gemhost to its blacklist
5. The proxyhost refuses to proxy requests to the gemhost
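To make steps 3 and 4 concrete, here's a bare-bones sketch in Python - again just an illustration; the gemini_fetch helper, the PROXYHOST value, and the in-memory BLACKLIST are invented for the example:

```python
import socket, ssl

PROXYHOST = "gem.ondollo.com"   # this proxy's own hostname (assumed)
BLACKLIST = set()               # gemini hosts that have opted out

def gemini_fetch(host: str, path: str, timeout: float = 10.0) -> str:
    """Fetch gemini://<host><path> and return the body of a 20 response."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False       # Gemini servers commonly use
    ctx.verify_mode = ssl.CERT_NONE  # self-signed certificates
    with socket.create_connection((host, 1965), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            tls.sendall(f"gemini://{host}{path}\r\n".encode())
            data = b""
            while chunk := tls.recv(4096):
                data += chunk
    header, _, body = data.partition(b"\r\n")
    if not header.startswith(b"20"):
        raise RuntimeError(f"unexpected Gemini status: {header.decode()}")
    return body.decode("utf-8", errors="replace")

def handle_blacklist_request(gemhost: str, somepath: str) -> bool:
    """Handle GET /internal/blacklist/<gemhost>/<somepath> on the proxy.

    Blacklist gemhost only if <gemhost>/<somepath> serves this proxy's own
    hostname, which shows the creating-user controls that capsule."""
    body = gemini_fetch(gemhost, "/" + somepath)
    if body.strip() == PROXYHOST:
        BLACKLIST.add(gemhost)
        return True
    return False
```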
Thoughts? How would you tweak this to get the desired outcome?
An additional thought I had... the above feels like it might be too process heavy (but... it's also super simple...). What if proxy server implementations were encouraged to check gemini://<gemhost>/noproxy.txt before a request? (Every request? Feels too much like the favicon. The first request? What if the creating-user changes their mind?) If my proxy checks that URL and gets a 20 status back, then it refuses to proxy the connection. If it gets a 51 back, then it continues with proxying the request.
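A sketch of that check, under the same caveats (illustrative only; gemini_status and may_proxy are names I just made up):

```python
import socket, ssl

def gemini_status(host: str, path: str, timeout: float = 10.0) -> str:
    """Return the two-digit Gemini status code for gemini://<host><path>."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE   # self-signed certs are the norm
    with socket.create_connection((host, 1965), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            tls.sendall(f"gemini://{host}{path}\r\n".encode())
            header = b""
            while b"\r\n" not in header and (chunk := tls.recv(1024)):
                header += chunk   # read until the status line is complete
    return header.split(b" ", 1)[0].decode()

def may_proxy(gemhost: str) -> bool:
    """True if the proxy should forward requests for this gemini host."""
    return gemini_status(gemhost, "/noproxy.txt") != "20"
```

Caching the answer per host (with some expiry) would keep this from becoming another favicon-style request storm while still letting a creating-user change their mind later.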
So... who does the work? Proxy implementers would be encouraged to make modifications with either approach... but the first approach has no work by gemini server implementers, just creating-users. The second approach could have some work by gemini server implementers... but little by creating-users... maybe they toggle a setting in the server to respond with a 20 to /noproxy.txt requests?
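On the server side, that toggle could be as small as this (a hypothetical handler, not any existing server's configuration):

```python
NOPROXY_ENABLED = True   # the setting a creating-user would toggle

def respond_noproxy(path: str) -> bytes:
    """Build the raw Gemini response for a request to <path>."""
    if path == "/noproxy.txt" and NOPROXY_ENABLED:
        return b"20 text/plain\r\nThis capsule opts out of web proxying.\r\n"
    return b"51 Not found\r\n"   # a real server would fall through to normal routing
```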
Thoughts?