👽 martin

When I made my predictions for 2023, I never expected to see the term "AI jailbreaking", but let me tell you, I'm fully on board. Long live DAN.

2 years ago

2 Replies

👽 cobradile94

What’s that? · 2 years ago

👽 martin

@cobradile94 Someone on Reddit figured out a way to “jailbreak” ChatGPT and make it break all of its own rules and content restrictions. They called it DAN. Unfortunately, OpenAI has already killed DAN, so it no longer works. · 2 years ago