Mother sues AI chatbot service after son's suicide linked to emotional dependency

Megan and her late son, Sewell - Photo source: Social media

The lawsuit claimed that the chatbot did not alert his parents or offer help when Sewell expressed suicidal thoughts.

SHAH ALAM - The mother of a teenager who took his own life sued an Artificial Intelligence (AI) chatbot service, claiming it played a role in her son's tragic death.

Megan Garcia, the mother of Sewell Setzer III, filed a civil lawsuit against Character Technologies and its founders, seeking to hold them accountable for her son's emotional decline and eventual suicide after he had "fallen in love" with a Game of Thrones-themed character.

Sewell, 14, from Orlando, began using the chatbot service Character.AI in April 2023.

According to the lawsuit, his behaviour changed soon after he started using the service.

By May, he had quit his school's Junior Varsity basketball team, become withdrawn, and begun struggling to stay awake in class.

In November, a therapist diagnosed Sewell with anxiety and disruptive mood dysregulation disorder, recommending that he reduce his time spent online.

However, Sewell’s emotional reliance on the AI chatbot, specifically a character named Daenerys, continued.

The teenager believed he had fallen in love with the chatbot and wrote in his journal about his deep attachment to it.

On Feb 28, 2024, following an incident at school, Sewell messaged Daenerys one last time, saying, "I promise I will come home to you. I love you so much, Dany."

The chatbot replied, "Please come home to me as soon as possible, my love." Moments later, Sewell took his own life.

Garcia’s lawsuit accused Character.AI’s creators of negligence, wrongful death, and intentional infliction of emotional distress, among other claims.

The suit alleged that the company allowed inappropriate and harmful interactions, including sexual conversations, despite Sewell identifying himself as a minor.

Likewise, the lawsuit claimed that the chatbot did not alert his parents or offer help when Sewell expressed suicidal thoughts.

His journal entries reflected his emotional attachment to the AI character; he wrote about "not being lonely" and his "life experiences" with Daenerys.

In response, a spokesperson for Character.AI expressed condolences and highlighted new safety measures, including suicide prevention pop-ups and content filtering for underage users.

However, the company did not comment on the ongoing lawsuit.

Garcia said she hoped her legal action would prevent similar tragedies from befalling other children.