Getting slammed by a client

On Sat, 25 Jul 2020, Hannu Hartikainen wrote:

> Should we develop low-resource honeypots to exhaust crawler resources? 
> Or start maintaining a community blacklist?

I'd start by hardening Gemini servers ...

The server I've written has a bunch of hard limits (in particular, on the
duration of requests and on the number of concurrent CGI processes). It
might be worth adding more back-pressure for user-agents that generate
obviously redundant requests at an excessive rate. I'd put that before
trying honeypots and blacklists.

Mk

-- 
Martin Keegan, +44 7779 296469, @mk270, https://mk.ucant.org/
