💾 Archived View for gemi.dev › gemini-mailing-list › 000283.gmi captured on 2023-12-28 at 15:44:37. Gemini links have been rewritten to link to archived content


-=-=-=-=-=-=-

About limiting the non human internet bandwidth pollution

1. defdefred (defdefred (a) protonmail.com)

Hello Geminaut,

I wonder if trying to limit non-human and individual human internet
bandwidth usage could be good for the planet?

Ways to do this could be:
- no more than 1 req/s from one IP
- explicit authorisation for crawlers, with a maximum number of requests per day (sketched below)
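One way the crawler rule could look in practice, assuming the capsule publishes a web-style robots.txt as Geminispace informally does, is something like the following. The URL, paths and agent name are only examples, Allow and Crawl-delay are extensions that not every crawler honours, and a hard per-day cap cannot be expressed here at all, so the server would still have to count requests itself:

```
# gemini://example.org/robots.txt -- illustrative only
# Turn away all automated clients by default
User-agent: *
Disallow: /

# Explicitly authorise one class of crawler, politely throttled
User-agent: indexer
Allow: /
Crawl-delay: 10
```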

How to manage a multi-user http2gemini gateway, where many users share one IP?

freD.


2. Petite Abeille (petite.abeille (a) gmail.com)



> On Jul 9, 2020, at 08:42, defdefred <defdefred at protonmail.com> wrote:
> 
> I wonder if trying to limit non-human and individual human internet
> bandwidth usage could be good for the planet?

À propos:

Webwaste
https://alistapart.com/article/webwaste/


3. defdefred (defdefred (a) protonmail.com)

On Thursday 9 July 2020 17:01, Petite Abeille <petite.abeille at gmail.com> wrote:
> https://alistapart.com/article/webwaste/

True, and it's not only images and video, but all the dynamic JavaScript around them.

Software is more and more bloated too.

I read a news item this morning about using Python scripts in web pages,
executed client-side, to replace JavaScript...
Reading the article, I saw that it is JavaScript code that interprets the Python code!!


4. Solderpunk (solderpunk (a) posteo.net)

On Thu Jul 9, 2020 at 8:42 AM CEST, defdefred wrote:
> Hello Geminaut,
>
> I wonder if trying to limit non-human and individual human internet
> bandwidth usage could be good for the planet?
>
> Ways to do this could be:
> - no more than 1 req/s from one IP
> - explicit authorisation for crawlers, with a maximum number of requests per day

There is a temporary error status code (44) with the meaning of "slow
down", intended for use in rate limiting schemes like this.  To my
knowledge, no server yet supports configuring this, but I do plan to add
it to Molly Brown and have been thinking of simple ways to implement it.
Anyway, that would be the obvious way to do this.
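To make that concrete, here is a rough sketch in Go (not Molly Brown's actual code; the limiter, allow and handle names are invented for illustration) of the simplest possible scheme: remember the last request time per IP, and if the next request arrives too soon, reply with 44 and the number of seconds the client should wait:

```go
// A minimal sketch of per-IP "44 SLOW DOWN" rate limiting for a Gemini server.
package main

import (
	"fmt"
	"net"
	"sync"
	"time"
)

// limiter remembers when each client IP last made a request.
// Entries are never evicted in this sketch.
type limiter struct {
	mu   sync.Mutex
	last map[string]time.Time
	min  time.Duration // minimum allowed interval between requests per IP
}

func newLimiter(min time.Duration) *limiter {
	return &limiter{last: make(map[string]time.Time), min: min}
}

// allow reports whether this IP may be served now. If not, it returns the
// number of whole seconds to wait, which becomes the meta field of the 44.
func (l *limiter) allow(ip string) (bool, int) {
	l.mu.Lock()
	defer l.mu.Unlock()
	now := time.Now()
	if prev, ok := l.last[ip]; ok {
		if wait := l.min - now.Sub(prev); wait > 0 {
			return false, int(wait.Seconds()) + 1
		}
	}
	l.last[ip] = now
	return true, 0
}

func handle(conn net.Conn, l *limiter) {
	defer conn.Close()
	ip, _, _ := net.SplitHostPort(conn.RemoteAddr().String())
	if ok, wait := l.allow(ip); !ok {
		// Status 44 SLOW DOWN: meta is the number of seconds to wait.
		fmt.Fprintf(conn, "44 %d\r\n", wait)
		return
	}
	// A real server would now read the request URL and serve the resource;
	// this sketch just returns a fixed success response.
	fmt.Fprint(conn, "20 text/gemini\r\n# hello\r\n")
}

func main() {
	l := newLimiter(time.Second) // at most 1 req/s per IP, as proposed above
	// A real Gemini server listens over TLS; plain TCP keeps the sketch short.
	ln, err := net.Listen("tcp", ":1965")
	if err != nil {
		panic(err)
	}
	for {
		conn, err := ln.Accept()
		if err != nil {
			continue
		}
		go handle(conn, l)
	}
}
```

Note that a multi-user gateway would trip a per-IP scheme like this immediately, since all of its users arrive from one address, which is exactly the problem raised in the original message.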

I fully understand and appreciate the issue and the motivation, and want to
be mindful of it, but we shouldn't lose sight of the fact that
things like Gemini and Gopher are, by default and without any special
effort, orders of magnitude less resource intensive than most of the
modern internet.

Cheers,
Solderpunk

> How to manage a multi-user http2gemini gateway, where many users share one IP?
>
> freD.

