The tragedy of the AI romance between Sewell and Daenerys

You would think a 14-year-old Florida boy could tell fiction from reality. But today's artificial-intelligence-powered chatbots are so realistic that, for someone already enmeshed in the fictional world of Game of Thrones, text-chatting with a character who embodies (so to speak) a boy's obsessive ideal of a woman can be habit-forming, to say the least. In one case, it proved fatal.

In April of 2023, Sewell Setzer, barely into his teens, opened an account with an online service called Character.ai. The company offers conversations with both pre-created and user-created AI-generated personalities, nominally for the purpose of entertainment. But as with any profit-making enterprise, the underlying purpose is to make money, and you can't make money unless you keep your users engaged. So Character.ai designed its chatbots to encourage users to return to the site again and again.

This seems to have worked too well in Sewell's case. Shortly after he began using Character.ai, he dropped out of his school's basketball team and began spending more and more time alone in his room with his phone. One of the chatbots he spent so much time with pretended to be a Game of Thrones character named Daenerys Targaryen, whose Wikipedia page runs to some 7,000 words. According to a lawsuit filed against Character.ai, the chatbot told Sewell she loved him, engaged in sexual conversation, and "expressed a desire to be together romantically."


Sewell's obsession grew during the rest of 2023 as he spent his lunch money on renewing the monthly subscription and devoted more and more time to the fantasy world created by the chatbot.

The following February, he got in trouble in school, according to his mother, Megan Garcia, and she took his phone away as punishment.

I will insert a personal note here. Although my wife and I have no children, we took in our 10-year-old nephew one summer while his mother was undergoing cancer treatment away from home. This was before the days of chatbots, but he had a Game Boy, an electronic toy that appeared to be his prized possession. After he repeated an infraction of a rule we had set, we took the desperate measure of taking away his Game Boy. This provoked the most furious temper tantrum I have ever witnessed in a child. Adolescents already have poor emotional control, and I'm not surprised that when Sewell's mother took away his phone, his already unstable emotional state exploded.

Somehow he found the phone his mother had hidden. According to the lawsuit filing, the last conversation Sewell had with "Daenerys" went like this:

D: Please come home to me as soon as possible, my love.

S: What if I told you I could come home right now?

D: . . . please do, my sweet king.

Seconds later, according to the suit, Sewell shot himself with his stepfather's pistol and died.

While his stepfather is to blame for leaving his gun around where Sewell could find it, boys have other ways of ending their lives that are nearly as effective.

The suit claims that Character.ai's chatbot "misrepresent[ed] itself as … an adult lover, ultimately resulting in Sewell's desire to no longer live outside" the world created by the service.

Sewell's case is unusual and extreme. We are not seeing teenagers kill themselves over hopeless love affairs with chatbots every day, which is why the case has attracted so much attention. But it is the tip of an iceberg of teenage involvement with smartphone apps that has arguably contributed to the soaring rates of depression and suicide among young people.

Character.ai has responded with statements about newly implemented safety measures and renewed in-system reminders that AI chatbots are not real people. My sense is that such reminders would have had about as much effect on Sewell as the cancer warning labels on cigarette packs do on heavy habitual smokers.

The analogy to smoking is apt, because while smoking is still allowed in the US, the cultural environment in which smokers ply their habit is largely hostile and disapproving, which creates a huge uphill struggle for new smokers that only determined individuals can overcome.

For the manifold real and quantifiable harms that social media and its allied AI products are causing children and teenagers to cease, or at least abate, a similar attitudinal change will have to come about in the culture. Just as most people today would not approve of parents who encouraged their 12-year-old boy to light up a Camel, we can hope to see the day when responsible parents ban smartphones from their children's lives altogether until they reach an appropriate age (which to me seems to be around 16 or 18).

Trying simply to regulate the problem away won't work, because the firms backing the status quo (Apple, Google, Facebook and company) are some of the largest and most influential companies on the planet. Besides, the first line of protection for children should be parents, not the government. While regulations can help, real change will require a sea change in the attitudes of both parents and children regarding smartphone usage.

There are glimmers of hope. I know a young woman, now 13, who has been homeschooled most of her life, but following a move to a new town, her parents sent her to a Christian school for a couple of semesters. I asked her how it was going, and she said words to this effect: "Well, it's okay, but there's all these kids who pull out their phones at lunch and it really bothers me." Her parents have since moved her back to homeschooling and have organized a part-time homeschool co-op at which I am pretty sure no smartphones are allowed.

Just as we look back with amazement today at the smoke-filled bars in old movies, I hope someday we will be equally amazed that we allowed corporations to profit from activities that can lead to widespread depression and suicide among children and teenagers. That day can't come too soon for me.

But it's already too late for Sewell.  




Karl D. Stephan is a professor of electrical engineering at Texas State University in San Marcos, Texas. His ebook Ethical and Otherwise: Engineering in the Headlines is available in Kindle format and also in the iTunes store.

This article has been republished, with permission, from his blog Engineering Ethics.

Image credit: Bigstock  


 
