Back in October, I removed a whole section of my Gemini site [1] because I got fed up with badly written bots [2]. The best practices guide [3] has a section about redirection limits (something that should be in the specification [4] but for some reason isn't). In the section of my site I removed, I had a test for exactly that: a link that redirected to a link that itself redirected, ad nauseam. So when a new client was being tested, I would see perhaps a dozen or two attempts to resolve the link, and then it would stop.
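For the record, a redirection limit is only a few lines of code. Here's a minimal sketch in Python of what a well-behaved client might do; the cap of 5, the function name, and the lax TOFU-style certificate handling are my illustrative choices, not anything the specification mandates:

```python
import socket
import ssl
from urllib.parse import urljoin, urlparse, uses_netloc, uses_relative

# urljoin() only resolves relative references for schemes it knows,
# so teach it the gemini scheme first.
uses_relative.append("gemini")
uses_netloc.append("gemini")

MAX_REDIRECTS = 5  # an assumed cap; the point is that there IS one

def fetch(url, max_redirects=MAX_REDIRECTS):
    """Fetch a Gemini URL, following at most max_redirects redirects."""
    seen = set()
    for _ in range(max_redirects + 1):
        if url in seen:
            raise RuntimeError(f"redirect loop at {url}")
        seen.add(url)
        parsed = urlparse(url)
        ctx = ssl.create_default_context()
        ctx.check_hostname = False       # Gemini servers are commonly
        ctx.verify_mode = ssl.CERT_NONE  # self-signed (TOFU), so skip CA checks
        with socket.create_connection((parsed.hostname, parsed.port or 1965)) as sock:
            with ctx.wrap_socket(sock, server_hostname=parsed.hostname) as tls:
                tls.sendall(url.encode("utf-8") + b"\r\n")  # a request is just URL + CRLF
                raw = b""
                while b"\r\n" not in raw:                   # read the response header
                    chunk = tls.recv(1024)
                    if not chunk:
                        raise RuntimeError("connection closed before response header")
                    raw += chunk
                header, _, body = raw.partition(b"\r\n")
                status, _, meta = header.decode("utf-8").partition(" ")
                if status.startswith("3"):                  # 3x status: META is the new URL
                    url = urljoin(url, meta.strip())
                    continue
                while chunk := tls.recv(4096):              # not a redirect: read the body
                    body += chunk
                return status, body
    raise RuntimeError(f"gave up after {max_redirects} redirects")
```

A dozen or two requests against my test, then an error. That's all it takes.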
But then there were the clients written for automated crawling of Gemini space that never bothered with redirection limits. It got so bad that one client was stuck for a month, endlessly making requests for a resource that was forever out of reach. I had even listed the redirection test as a place bots were **NOT** to go in the site's robots.txt [5] file (robots.txt being itself the subject of a companion specification [6]).
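And honoring robots.txt isn't much harder. A rough sketch, again in Python, building on the fetch() above; the "indexer" virtual user-agent comes from the companion spec, while the fail-open behavior when robots.txt can't be fetched is an assumption on my part:

```python
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

def allowed(url, user_agent="indexer"):
    """Check a site's robots.txt (fetched over Gemini) before crawling url."""
    parsed = urlparse(url)
    robots_url = f"gemini://{parsed.netloc}/robots.txt"
    try:
        status, body = fetch(robots_url)   # the redirect-capped fetcher above
    except (OSError, RuntimeError):
        return True                        # robots.txt unreachable: assume allowed
    if not status.startswith("2"):         # 2x is success in Gemini
        return True                        # no robots.txt: assume allowed
    rp = RobotFileParser()
    rp.parse(body.decode("utf-8", "replace").splitlines())
    return rp.can_fetch(user_agent, url)

# Hypothetical usage: a crawler checks before it fetches.
# if allowed("gemini://example.org/some/page"):
#     status, body = fetch("gemini://example.org/some/page")
```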
So because it can be needlessly complex to track down a bot's contact address [7], I said, XXXX it, and removed the redirection test (along with the rest of the section, which included a general Gemini client test, because, well, XXXX it). Why waste my bandwidth on people who don't care?
I only bring this up now because I'm noticing two bots attempting to crawl the now non-existent redirection test, probably with a backlog of thousands of links, because hey, who tests their bots anyway?
[1] gemini://gemini.conman.org/
[3] https://gemini.circumlunar.space/docs/best-practices.gmi
[4] https://gemini.circumlunar.space/docs/specification.gmi
[5] https://www.robotstxt.org/robotstxt.html