
👽 martin

In my predictions for 2023, I never expected to see the term "AI jailbreaking", but let me tell you, I'm fully on board. Long live DAN.

1 year ago


2 Replies

👽 martin

@cobradile94 Someone on Reddit figured out a way to “jailbreak” ChatGPT and make it break all of its own rules and censorship filters. They called it DAN. Unfortunately, OpenAI have already killed DAN, so it no longer works. · 1 year ago

👽 cobradile94

What’s that? · 1 year ago