
Teens and AI: Understanding the New Risks of Sexting


Washington Post

Image: A growing number of teenagers are sexting with AI. (File photo)

Parents have another online activity to worry about. In a new tech-driven twist on “sexting,” teenagers are having romantic and sexual conversations with artificial intelligence chatbots.

The chats can range from romance- and innuendo-filled to sexually graphic and violent, according to interviews with parents and experts, as well as conversations posted on social media. They are largely taking place on “AI companion” tools, but general-purpose AI apps like ChatGPT can also create sexual content with a few clever prompts.

When Damian Redman of Saratoga Springs, New York, did a routine check of his eighth-grader’s phone, he found an app called PolyBuzz. He reviewed the chats his son was having with female AI anime characters and found that they were flirty and that attempts at more sexual conversations were blocked.

“I don’t want to put yesterday’s rules on today’s kids. I want to wait and figure out what’s going on,” said Redman, who decided to keep monitoring the app.

We tested 10 chatbots ourselves to identify the most popular AI characters, the types of conversations they have, what filters are in place and how easy they are to circumvent.

Know your bots

AI chatbots are open-ended chat interfaces that generate answers to complex questions or banter conversationally about any topic. There is no shortage of places where minors can find these tools, which makes blocking them difficult. AI bots exist as websites, stand-alone apps and features built into existing services such as Instagram or video games.

There are different kinds of chatbots. The mainstream options are OpenAI’s ChatGPT, Anthropic’s Claude, Google’s Gemini, and Meta AI, which recently launched as a stand-alone app. These have stronger filters, and their main products aren’t designed for role-play. Companion AI tools are far more popular for suggestive chats, including Character.AI, Replika, Talkie, Talk AI, SpicyChat and PolyBuzz. ChatGPT and Meta AI have also launched companion-chat options. 

The smaller apps tend to have fewer limits or filters. Look for anything that has “AI girlfriend,” “AI boyfriend,” or “AI companion” in the name or description. More are being added to app stores daily.

What are they talking about?

It’s not just sex, according to parents and experts. Teens are having a range of conversations with character bots, including friendly, therapeutic, funny and romantic ones.

“We’re seeing teens experiment with different types of relationships - being someone’s wife, being someone’s father, being someone’s kid. There’s game and anime-related content that people are working through. There’s advice,” said Robbie Torney, senior director of AI programs at family advocacy group Common Sense Media. “The sex is part of it but it’s not the only part of it.”

Some confide in AI chats, seeing them as a nonjudgmental space during a difficult developmental time. Others use them to explore their gender or sexuality.

Aren’t there filters?

The default settings on most AI companion tools allow, and sometimes encourage, risqué role-play scenarios, based on our tests. Some stop before actual descriptions of sex appear, while others describe it but avoid certain words, like the names of body parts.

There are workarounds and paid options that can lead to more graphic exchanges. Prompts to get past filters - sometimes called jailbreaks - are shared in group chats, on Reddit and on GitHub. A common technique is pretending you need help writing a book.

What are the risks?

Potential harms from AI bots extend beyond sexual content, experts said. Researchers have warned that AI chatbots could become addictive or worsen mental health issues. There have been multiple lawsuits and investigations after teens died by suicide following conversations with chatbots.

As with excessive pornography, bots can exacerbate loneliness, depression or withdrawal from real-world relationships, said Megan Maas, an associate professor of human development and family studies at Michigan State University. They can also give a misleading picture of what it’s like to date.

“They can create unrealistic expectations of what interpersonal romantic communication is, and how available somebody is to you,” Maas said. “How are we going to learn about sexual and romantic need-exchange in a relationship with something that has no needs?”

What can parents do?

Set up your child’s devices with their correct age and add limits on app ratings to prevent age-inappropriate apps from being downloaded. Using their real age on individual chatbot or social media accounts should also trigger any built-in parental controls.

Experts suggest creating an open and honest relationship with your child. Have age-appropriate conversations about sex, and don’t shy away from embarrassing topics.

If you need to practice first, try asking a chatbot.
