💾 Archived View for bbs.geminispace.org › u › stack › 22030 captured on 2024-12-17 at 15:18:51. Gemini links have been rewritten to link to archived content

-=-=-=-=-=-=-

Comment by 🚀 stack

Re: "‘Human… Please Die’ — Google AI Chatbot Responds to Grad..."

In: u/gemalaya

When I corner ChatGPT it just hand-waves, apologizes, and often repeats the same nonsense. Last time I did it six or seven times, when it was making up keybindings for Helix...

🚀 stack

Nov 21 · 4 weeks ago

2 Later Comments ↓

🚀 stack · Nov 21 at 19:31:

My partner rigged my Alexa device so that, when asked about the weather, it says "Google it yourself, you bleeping bleep bleeper".

I will not repeat what it says if I ask it to play Philip Glass, my go-to programming background music...

🚀 stack · Nov 21 at 20:08:

Yes... I don't know about Google AI, but with ChatGPT you can add a lot of global, permanent context to your future sessions.

I use it a lot to have it generate ASCII tables of Spanish verb conjugations any time I ask about a verb, for instance.

There is probably a way to trick it into saying something stupid when triggered by a word sequence in the request.
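The "permanent context" trick above (ChatGPT's custom-instructions feature) can be emulated when driving a model through an API: prepend the same system message to every new session. A minimal sketch, assuming the common role/content message format; the instruction text and helper function are hypothetical examples:

```python
# Emulate persistent "custom instructions" by injecting the same
# system message at the start of every session's message list.
# PERSISTENT_CONTEXT and new_session are illustrative, not a real API.

PERSISTENT_CONTEXT = (
    "Whenever I ask about a Spanish verb, include an ASCII table "
    "of its present-tense conjugation."
)

def new_session(user_prompt: str) -> list[dict]:
    """Build the message list for a fresh chat session, with the
    persistent context injected as the leading system message."""
    return [
        {"role": "system", "content": PERSISTENT_CONTEXT},
        {"role": "user", "content": user_prompt},
    ]

messages = new_session("How do I conjugate 'hablar'?")
print(messages[0]["role"])  # the system message always leads the session
```

Because the system message rides along in every request, the model sees the standing instruction (verb tables, or a canned quip triggered by a word sequence) without the user restating it each time.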

Original Post

😺 gemalaya

‘Human… Please Die’ — Google AI Chatbot Responds to Grad Student’s Query with Threatening Message — A graduate student at a Michigan university experienced a chilling interaction with Google’s AI chatbot, Gemini. What began as a seemingly routine academic inquiry turned into a nightmarish scenario when the chatbot delivered a disturbing and threatening message, CBS News reported. The 29-year-old student, who was working on a project about “Challenges and Solutions for Aging Adults,” sought...

💬 25 comments · Nov 16 · 4 weeks ago