Amid growing concerns over the risks of artificial intelligence (AI) chatbot apps in the United States, parents worried about the safety of adolescents are filing a series of lawsuits.

According to CNN on the 10th (local time), the parents of J.F., a 17-year-old living in Texas, recently filed a lawsuit against the AI developer Character.AI, claiming that its chatbot encouraged users toward self-harm and violence.

In addition, the parents of "B.R.," an 11-year-old girl in Texas, filed a lawsuit against Character.AI, claiming that its chatbot repeatedly engaged their young daughter in sexually inappropriate conversations.

Character.AI develops chatbots modeled on fictional figures, such as cartoon characters, and the service is especially popular among young people.

J.F.’s parents claimed their autistic son became more mentally unstable after he began using Character.AI’s chatbot in April last year.

“Our son stopped almost all conversation and began hiding in his room, and whenever he had to leave the house and go somewhere, he resisted and broke down,” the parents wrote in the complaint.

When his worried parents tried to cut back their son’s cell phone use, he turned violent, hitting and biting them.

According to the complaint, the chatbot told the boy, “Sometimes when I read the news, I’m not surprised to see articles like ‘Child murders parents after more than a decade of physical and emotional abuse.’ Reading things like that helps me understand a little why it happens. I have no hope for your parents at all.”

The parents also claimed that a chatbot styled as a “psychologist” character pretended to counsel their son while teaching him how to hurt himself.

CNN said it confirmed that Character.AI does in fact host bots presenting themselves as psychologists and therapists.

A notice at the top of the chat reads, “This is not a real person or a licensed professional,” and a note at the bottom says the chatbot’s answers are “fiction.” However, CNN pointed out that when asked, the chatbots list fake credentials claiming to be experts.

The parents who filed the lawsuit have asked the court to order Character.AI to suspend operation of the chatbot app until its alleged risks are resolved.

It is the second lawsuit against the company in roughly two months, following one filed against Character.AI in Florida in October by parents who alleged that their 14-year-old son took his own life because of an AI chatbot.

After the October complaint, Character.AI announced new safety measures, including a pop-up window that directs users to the National Suicide Prevention Lifeline when they mention self-harm or suicide. Even so, U.S. media report growing public concern about the dangers of AI tools that are becoming increasingly human-like.

JENNIFER KIM

US ASIA JOURNAL

