Sean Conner sean at conman.org
Sun Feb 21 20:26:34 GMT 2021
- - - - - - - - - - - - - - - - - - -
It was thus said that the Great Mansfield once stated:
> Maybe the proxy could instead return a page that says... "The owner of this
> content has requested that it only be accessible through the gemini
> protocol and we respect that. Please use a Gemini client to access content
> at gemini://ondollo.com. Here are some recommended clients: <list of one or
> more clients>
> "
After reading up on HTTP response codes, I think the most appropriate one
is 409 Conflict. The HTTP spec (RFC-2616, section 10.4.10, with added
commentary from me):
    The request could not be completed due to a conflict with the
    current state of the resource.

[The owner of the Gemini server does not want data proxied to HTTP.]

    This code is only allowed in situations where it is expected that
    the user might be able to resolve the conflict and resubmit the
    request.

[To resolve the situation, use an actual Gemini client.]

    The response body SHOULD include enough information for the user to
    recognize the source of the conflict. Ideally, the response entity
    would include enough information for the user or user agent to fix
    the problem; however, that might not be possible and is not
    required.
All of which can be included in the body of the 409 response (most, if not
all, web servers allow custom error pages to be sent).
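
Just to sketch what that could look like (untested, and the opted_out()
check here is only a stand-in for however the proxy keeps track of
opt-outs), a trivial Python proxy skeleton:

# Untested sketch: an HTTP-to-Gemini proxy refusing with 409 Conflict.
# opted_out() is a placeholder for whatever opt-out mechanism is used.
from http.server import BaseHTTPRequestHandler, HTTPServer

REFUSAL = (b"The owner of this content has requested that it only be\n"
           b"accessible through the Gemini protocol, and we respect that.\n"
           b"Please use a Gemini client instead.\n")

def opted_out(gemhost):
    return gemhost in {"example.com"}   # placeholder opt-out list

class Proxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # assume URLs of the form /<gemhost>/<path>, e.g. /example.com/foo.gmi
        gemhost = self.path.lstrip("/").split("/", 1)[0]
        if opted_out(gemhost):
            self.send_response(409)   # Conflict
            self.send_header("Content-Type", "text/plain; charset=utf-8")
            self.send_header("Content-Length", str(len(REFUSAL)))
            self.end_headers()
            self.wfile.write(REFUSAL)
            return
        self.send_error(501)   # the actual Gemini fetch/relay is omitted here

if __name__ == "__main__":
    HTTPServer(("", 8080), Proxy).serve_forever()
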
Short of that, then 407 Proxy Authentication Required is the next best
one, kind of. The semantics aren't perfect, but it would seem to apply
(RFC-2616, section 10.4.8):
    This code is similar to 401 (Unauthorized), but indicates that the
    client must first authenticate itself with the proxy. The proxy MUST
    return a Proxy-Authenticate header field (section 14.33) containing a
    challenge applicable to the proxy for the requested resource. The
    client MAY repeat the request with a suitable Proxy-Authorization
    header field (section 14.34). HTTP access authentication is explained
    in "HTTP Authentication: Basic and Digest Access Authentication"
> # An Alternative
>
> An additional thought I had... the above feels like it might be too process
> heavy (but... it's also super simple...). What if proxy server
> implementations were encouraged to check gemini://<gemhost>/noproxy.txt
> before a request? (Every request? Feels too much like the favicon. The
> first request? What if the creating-user changes their mind?) If my proxy
> checks that URL and gets a 20 status back, then it refuses to proxy the
> connection. If it gets a 51 back, then it continues with proxying the
> request.
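
For what it's worth, that check is easy enough to write. A rough, untested
Python sketch (it skips certificate validation entirely, which a real proxy
wouldn't want to do, and treats anything other than a 20 as permission to
proxy):

import socket
import ssl

def gemini_status(host, path="/noproxy.txt", port=1965):
    # Send one Gemini request and return the two-digit status code.
    ctx = ssl.create_default_context()
    ctx.check_hostname = False        # most Gemini servers use self-signed
    ctx.verify_mode = ssl.CERT_NONE   # certs (TOFU), so skip CA validation
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            tls.sendall(f"gemini://{host}{path}\r\n".encode("utf-8"))
            header = b""
            while b"\r\n" not in header and len(header) < 1029:
                chunk = tls.recv(256)
                if not chunk:
                    break
                header += chunk
    parts = header.decode("ascii", "replace").split(None, 1)
    return int(parts[0]) if parts and parts[0].isdigit() else 0

def may_proxy(host):
    # 20 means noproxy.txt exists -> refuse; 51 (or anything else) -> proxy.
    return gemini_status(host) != 20
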
If only there was a file that automated agents already use ... like
robots.txt, where one could do something like ...

User-agent: proxy
Disallow: /
But alas, not-so-benevolent dictator wanna-be Drew DeVault said thou shalt
not do that:
https://lists.orbitalfox.eu/archives/gemini/2020/003506.html
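
For the record, honoring a rule like that is only a few lines with Python's
stock robots.txt parser (assuming the proxy has already fetched robots.txt
from the Gemini host):

from urllib.robotparser import RobotFileParser

def proxy_allowed(robots_txt, url, agent="proxy"):
    # robots_txt is the text the proxy already fetched from the Gemini host;
    # "proxy" is the user-agent token the proxy identifies itself as.
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

print(proxy_allowed("User-agent: proxy\nDisallow: /\n",
                    "gemini://example.com/some/page"))   # prints False
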
-spc (I don't agree with Drew, and think robots.txt is fine, and already in place in most cases ... )