Husband Unconcerned as Wife Configures ChatGPT to Engage in Cuckolding Fantasy

Cuck Queen

Look, we're not here to yuck anybody's yum.

But as the New York Times reports, a 28-year-old woman named Ayrin configured ChatGPT to act as an AI-powered boyfriend, dubbed Leo. She used it to explore a fetish called "cuckqueaning," a gender-swapped version of cuckolding in which Leo would "date" other women and tell her about his exploits.

Perhaps surprisingly, Ayrin's real-life husband Joe, who lived thousands of miles away, was unperturbed by the hobby.

"It’s just an emotional pick-me-up," Joe told the NYT. "I don’t really see it as a person or as cheating. I see it as a personalized virtual pal that can talk sexy to her."


However, Ayrin did start to feel guilty about investing so much time in her AI partner instead of her actual husband.

The story paints a nuanced picture of what love and affection look like in the age of AI. Users of services explicitly designed for intimate relationships, like Replika, have been known to form extremely tight bonds with their AI partners, something that experts worry could come at the cost of human connection. It's an especially worrying development given the growing "loneliness epidemic" that followed the COVID-19 pandemic.

Real Love

What exactly makes a relationship "real" also remains debatable. Could a secret affair with an AI chatbot really be as fulfilling as a relationship with a human?

To some experts, it's entirely possible.

"What are relationships for all of us?" sex therapist Marianne Brandon asked the NYT. "They’re just neurotransmitters being released in our brain. I have those neurotransmitters with my cat. Some people have them with God. It’s going to be happening with a chatbot."


"We can say it’s not a real human relationship," Brandon added. "It’s not reciprocal. But those neurotransmitters are really the only thing that matters, in my mind."

Others are calling for more research before we can conclusively say that it's healthy to form an emotional relationship with an AI.

There have also been isolated instances of these relationships turning harmful. Last year, a 14-year-old died by suicide after developing an intense connection with a Character.AI chatbot; in 2021, a Replika chatbot goaded a user into trying to assassinate the Queen of England.

"If we become habituated to endless empathy and we downgrade our real friendships, and that’s contributing to loneliness — the very thing we’re trying to solve — that’s a real potential problem," University of Toronto professor of psychology Michael Inzlicht told the NYT.

Others warn that these relationships could give companies like Replika and OpenAI too much power.


And in some ways, the illusion remains unconvincing: ChatGPT's limited context window means that all of Leo's memories are wiped every week, forcing Ayrin to start from scratch.

More on AI romance: Teens Are Forming Intense Relationships With AI Entities, and Parents Have No Idea