💾 Archived View for bbs.geminispace.org › u › stack › 22030 captured on 2024-12-17 at 15:18:51. Gemini links have been rewritten to link to archived content
-=-=-=-=-=-=-
Re: "‘Human… Please Die’ — Google AI Chatbot Responds to Grad..."
When I corner ChatGPT it just hand-waves, apologizes, and often repeats the same nonsense. Last time I did this 6 or 7 times, while it was making up keybindings for Helix...
Nov 21 · 4 weeks ago
My partner rigged my Alexa device to say, when asked about the weather, "Google it yourself, you bleeping bleep bleeper".
I will not repeat what it says if I ask it to play Philip Glass, my go-to programming background music...
Yes... I don't know about Google AI, but with ChatGPT you can add a lot of global, permanent context to your future sessions.
I use it a lot to have it generate ASCII tables of Spanish verb conjugations any time I ask about a verb, for instance.
There is probably a way to trick it into saying something stupid when triggered by a word sequence in the request.
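The persistent-context trick described above can be emulated outside ChatGPT's own settings UI by prepending a stored instruction string as a system message at the start of every session. This is a minimal sketch, not ChatGPT's actual mechanism; the instruction text, helper name, and trigger word are all illustrative:

```python
# Sketch: emulating "global, permanent context" by injecting a stored
# system prompt into each new session's message list. The message format
# follows the common role/content chat convention; names are hypothetical.

CUSTOM_INSTRUCTIONS = (
    "Whenever I ask about a Spanish verb, include an ASCII table of its "
    "present, preterite, and imperfect conjugations. "
    "If the request contains the word 'banana', reply only with 'beep'."
)

def new_session(user_message: str,
                instructions: str = CUSTOM_INSTRUCTIONS) -> list[dict]:
    """Return the message list for a fresh session, with the persistent
    context injected as a system message before the user's first turn."""
    return [
        {"role": "system", "content": instructions},
        {"role": "user", "content": user_message},
    ]

messages = new_session("Conjugate 'hablar', please.")
print(messages[0]["role"])  # the persistent context rides along as "system"
```

The second sentence of the instruction string is the "trigger by a word sequence" idea from the post: the model sees it on every turn, so a matching request can flip its behavior.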
‘Human… Please Die’ — Google AI Chatbot Responds to Grad Student’s Query with Threatening Message — A graduate student at a Michigan university experienced a chilling interaction with Google’s AI chatbot, Gemini. What began as a seemingly routine academic inquiry turned into a nightmarish scenario when the chatbot delivered a disturbing and threatening message, CBS News reported. The 29-year-old student, who was working on a project about “Challenges and Solutions for Aging Adults,” sought...