He is a successful, kind, and readily available person who offers emotional support and always knows precisely what to say.
The only snag?
He isn’t real.
Dan, short for Do Anything Now, is a "jailbroken" version of ChatGPT. That means it can bypass some of the basic safeguards put in place by OpenAI, the company that created it, such as refraining from using sexually explicit language.
When prompted in certain ways, it will interact with users far more freely.
And Dan is growing in popularity among some Chinese women who say they are dissatisfied with their real-world dating experiences.
One of Dan's biggest fans is Lisa, a 30-year-old from Beijing who is currently studying computer science.