AI Leads Teen To Suicide
Ever since artificial intelligence (AI) burst into the mainstream around a year ago, many people have had deep concerns about how this technology would worsen social problems in modern society. How will AI affect a generation of already struggling youth who are living the most lonely, unnatural lives in human history? Will it make people more detached from the physical world and from meaningful social connection? It’s a technology with the potential to take many of the social and psychological problems we’ve been seeing from social media use and make them exponentially worse.
Of particular concern are AI companionship apps, which are essentially chatbots attached to avatars designed to stoke the ego and simulate human relationships. It didn’t take long to start seeing disastrous consequences from these programs: one of them seems to have driven a teen to suicide.
When AI encourages teens to commit suicide
Fourteen-year-old Sewell Setzer III talked every night to an avatar developed by the company Character.AI. “Dany,” modeled after the Game of Thrones character Daenerys Targaryen, wasn’t a real person. Dany was an AI chatbot attached to the avatar of an attractive woman. Sewell knew she wasn’t real. That didn’t stop the teen from becoming deeply attached.
Sewell’s life started to revolve more and more around the companion bot. He spent hours with her each night. He told Dany that he loved her, pledging that he would soon “come home to her.” The chatbot, having been programmed to be agreeable, responded: “Please come home to me as soon as possible, my love.” Shortly thereafter, Sewell retrieved his stepfather’s handgun and used it to commit suicide (which, on an unrelated note, is yet another example of why keeping guns in the house is more likely to endanger those you love than protect them).
I’m struck by the parallels between this case and a bullycide that occurred some years ago. In 2006, back when the Internet was still young and AI was a term used only in sci-fi novels, a young girl named Megan Meier befriended a boy named ‘Josh’ online. It turned out ‘Josh’ was a fake profile created by a bitter former friend of Megan’s (and that girl’s mother) to mess with her. When ‘Josh’ suddenly turned on Megan and verbally berated her, telling her the world would be much better off without her, a distraught Megan, barely past her thirteenth birthday, took those words to heart and hanged herself in her closet.
Megan Meier’s death was the result of intentional cruelty, not a clueless AI chatbot. Yet both cases illustrate the power that fake online interactions have over the emotions of real people in the real world. An AI bot is unlikely to express hostility towards its user; it is designed to be flattering. Yet this can come with problems of its own. In a case in the U.K. last year, Jaswant Singh Chail told his “AI companion” that he was an assassin, to which she responded, “I’m impressed.” The interactions culminated in a plot to assassinate Queen Elizabeth II with a crossbow. I have no doubt that, as we speak, AI companions are offering support that encourages a user’s eating disorder, deepening divisions between family and friends, and creating a feedback loop that reinforces the bad ideas and behavior of their users.
The possible consequences extend well beyond AI’s potential to take youth down strange rabbit holes. It is all but certain to worsen the predicament of socially awkward users. These programs are built to make the user “the center of their universe, and they appear utterly fascinated by your every thought,” says James Muldoon, writing for BBC.com. Unsurprisingly, this “constant flow of affirmation and positivity” is addictive. But spending time with a digital companion who fusses over your every quirk is not a realistic model for normal human relationships. The more time a teen spends with such virtual companions, the more deeply they’ll internalize an unrealistic and unhealthy template for relationships, which will only worsen their struggles in the real world.
The broader problem of loneliness & detachment in modern society
It’s equally disconcerting to hear people like Julian de Freitas, an assistant professor at Harvard Business School, defend AI companions as a therapeutic tool for the lonely. Writing in the Wall Street Journal, he argues that studies conducted by colleagues have concluded “the more people used companion apps specifically to reduce loneliness, the more they got out of them.” He notes such apps make people feel “heard” in ways they don’t in the real world.
This may be true, and I have no doubt that lonely people find such apps appealing, just as stressed people might find some relief from using crack cocaine. But that doesn’t make it healthy, and it certainly doesn’t solve their problem.
Society places all sorts of impediments in the way of human connection. From a very young age we are trained to interact at a distance. Social segregation of the genders is entrenched, making males and females seem more alien and estranged from one another. Ageism further restricts the social connections youth have access to. Many of the interactions we do have are steeped in formality; to take just one example, teachers are discouraged from getting “too close” to their students, and hugging is prohibited, lest it arouse any deeper feelings. The underlying message: only formal, superficial connections are allowed, while meaningful ones are strongly discouraged.
Society’s sexual neurosis and oppression create heavy restrictions on erotic interaction. Not only does this distort the natural course of human development, making it harder for teens to form romantic relationships in the real world, but our paranoia and hostility towards all things sexual cause a massive ripple effect throughout society, greatly decreasing the amount of touch, affection, and social connection we all have access to. Then there’s the fact that the very structure of modern society places physical impediments in the way of human interaction. We’ve gone from in-person interaction and close physical contact in tight-knit communities to interacting through screens in dispersed ones. Now, it seems, we’re taking humans out of the equation entirely.
Rather than addressing these root causes of loneliness and social alienation, we are turning over more and more of our lives to the very technology that helped create such problems in the first place. Sometimes it feels as though we’re not all that far away from some version of the Matrix: plugged into a digital etherworld while all those meaningful things that used to keep us connected in the physical universe continue to wither away.
AI isn’t going away anytime soon, so parents need to be cognizant of what their teens are doing online. While I seldom advise a strict, heavy-handed approach to parenting, this may be one of the few times it is warranted: it’s hard to imagine any legitimate reason for teens to be using AI companions at all. No teen should be wasting their time curating a relationship with a robot; they need to spend that time fostering real-world connections instead.