💾 Archived View for station.martinrue.com › edanosborne › 323727c0249e48d38c6b16046ce27a17 captured on 2024-06-16 at 14:03:53. Gemini links have been rewritten to link to archived content


-=-=-=-=-=-=-

👽 edanosborne

It looks like not even the small web is safe from the tech giants: gopher://gopher.linkerror.com:70/0/phlog/2023/20230705

11 months ago · 👍 ruby_witch, akrabu, at_work, justyb

Links

gopher://gopher.linkerror.com:70/0/phlog/2023/20230705

7 Replies

👽 resetreboot

I gotta love how, when you pick something from the Internet, big tech and lawyers all go "Copyright! It's mine, pay for it!" the moment you say "Hey, it was here for anyone to take...". But when it's time for them to pick things up off the Internet, it's all fair game, even if they're making a pretty penny from it. Because, "Hey, it was here for anyone to take...". · 11 months ago

👽 drh3xx

Big Tech sucks. · 11 months ago

👽 mozz

@edanosborne for sure, i didn’t mean to direct that at you specifically. · 11 months ago

👽 edanosborne

@mozz That's not my gopher hole. · 11 months ago

👽 mozz

I love the art, but you’re fooling yourself if you think that LLMs won’t or somehow can’t learn ANSI escape codes. · 11 months ago

👽 ruby_witch

It's true, it's a bunch of unencrypted text files just waiting for an AI to ingest them. It's definitely rude to use it all without permission, or expressly against permission in this case, where he had his robots.txt set.

On the other hand, I think that even if I had a choice I would personally let them eat my content. I don't mind being part of their language models. Chow down! · 11 months ago
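(For context, the robots.txt mentioned above usually looks something like the sketch below when it's set to keep AI crawlers out. The user-agent names are examples of well-known AI scrapers, not taken from the linked phlog, and compliance is entirely voluntary on the crawler's part.)

```
# Hypothetical robots.txt asking AI crawlers to stay away.
# Crawler names are illustrative; nothing forces them to obey.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Everyone else may index as usual.
User-agent: *
Allow: /
```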

👽 astroseneca

We are all in danger. Save yourself as much as you can. · 11 months ago