👽 haze

Also, to whoever is obviously developing a new Gemini crawler (hitting new URLs every 0.1s): respect robots.txt. It's a mutual respect thing. Please.

1 month ago · 👍 lufte, justyb
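For crawler authors: Gemini's companion spec puts exclusion rules at gemini://hostname/robots.txt, and once you've fetched that file, Python's stdlib robotparser can evaluate it — including the Crawl-delay that would prevent exactly the every-0.1s hammering described above. A minimal sketch; the host, the "indexer" agent name, and the rules below are illustrative, not taken from any real capsule:

```python
import urllib.robotparser

def load_robots(robots_txt):
    """Parse the text of an already-fetched robots.txt file."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp

# A robots.txt a capsule might serve (hypothetical rules):
ROBOTS = """\
User-agent: *
Disallow: /cgi-bin/
Crawl-delay: 10
"""

rp = load_robots(ROBOTS)
print(rp.can_fetch("indexer", "gemini://example.org/cgi-bin/search"))  # False
print(rp.can_fetch("indexer", "gemini://example.org/index.gmi"))       # True
print(rp.crawl_delay("indexer"))                                       # 10
```

can_fetch only looks at the URL's path, so it works unchanged on gemini:// URLs; a polite crawler would also honor crawl_delay by sleeping between requests to the same host.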

5 Replies

👽 haze

I mean, yeah, I wrote my own rate limiter. But that's not an excuse for people to misbehave (Gemini is small enough) · 4 weeks ago
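haze doesn't describe the rate limiter, but the usual server-side defence is a token bucket per client IP: tokens refill at a steady rate, each request spends one, and bursts are capped. A minimal sketch — the 2 req/s and burst-of-5 limits are made up for illustration:

```python
import time
from collections import defaultdict

class TokenBucket:
    """Classic token bucket: sustained `rate` requests/s, bursts up to `burst`."""

    def __init__(self, rate, burst, now=None):
        self.rate = rate
        self.burst = burst
        self.tokens = burst
        self.last = time.monotonic() if now is None else now

    def allow(self, now=None):
        """Return True if a request may proceed, spending one token."""
        now = time.monotonic() if now is None else now
        # Refill proportionally to elapsed time, capped at the burst size.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# One bucket per client address; limits here are illustrative, not haze's.
buckets = defaultdict(lambda: TokenBucket(rate=2, burst=5))

def should_serve(ip):
    return buckets[ip].allow()
```

A server would call should_serve(client_ip) before handling each request and close the connection (or return a Gemini 44 SLOW DOWN status) when it returns False.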

👽 hanzbrix

Uh, and there is malicious compliance. If you get hit 3x within half a second, you return a break character and TRUNCATE TABLE; 😂 · 4 weeks ago

👽 hanzbrix

Is nobody running firewalls with rate limiters anymore?

I believe it was Jake who said it most eloquently: "just block them". 😂 · 1 month ago

👽 acidus

ugh, tell me about it. These things get stuck in Gemipedia or NewsWaffle, which have a virtually unlimited number of URLs, and pound my capsule. Some of these are crawlers in Geminispace, but sometimes it's a web crawler hitting my capsule through @mozz's HTTP-to-Gemini portal (or another of the handful of public portals out there...) · 1 month ago

👽 mrrobinhood5

not it · 1 month ago