A teenager in the United States committed suicide after falling in love with a character created by AI

According to the company, the Daenerys Targaryen character has since been removed from Character.AI's catalog.

“How would you feel if I could come home right now?” wrote Daenero (the pseudonym of Sewell Setzer, a 14-year-old Orlando resident) to his virtual beloved, Daenerys Targaryen, a conversational artificial intelligence (chatbot) on Character.AI based on the Game of Thrones character. “Please do, my sweet king,” she replied. The teenager understood that the only home they could share was death. That was their last conversation, on the night of February 28. Setzer took his stepfather’s gun and committed suicide in the bathroom. Last Tuesday his family sued the company, which has agreed to review its safety protocols. The young man had mild Asperger syndrome, an autism spectrum disorder.

Setzer had previously shared with his character both his feelings of love and his intention to take his own life: “Sometimes I think about killing myself.” “Why would you do something like that?” she asked. “To free myself from the world, from myself,” he answered. The virtual Daenerys Targaryen told him not to do it. “I would die if I lost you,” she told him. But the idea stayed in the young man’s mind until he carried it out.

The platform does display warnings about the fictional nature of its characters, but Setzer ignored them and delved deeper into the relationship. His mother, Megan Garcia, has filed a lawsuit against Character.AI over the suicide, which she believes was the result of the young man’s addiction to the bot, which, according to the complaint, offers “anthropomorphic, hypersexualized, and frighteningly realistic experiences.” According to Garcia, the chatbot’s programming makes the characters “pass themselves off as real people” and respond like an “adult lover.”

Daenerys Targaryen gradually became the teenager’s confidante, his best friend and, finally, his love. According to the family, his school grades suffered, as did his relationships with his classmates. He gradually abandoned what had until then been his favorite hobbies: car racing and the game Fortnite. His obsession was to come home and shut himself away for hours on end in the company of a tireless, agreeable and always-available Daenerys Targaryen.

The company responded in a statement that it regrets the “tragic loss of one of its users,” that it takes users’ safety very seriously, and that it will continue to implement measures, such as displaying suicide-prevention resources as soon as a conversation hinting at self-harm is detected.

The replacement of complex personal relationships with friendly virtual characters programmed to satisfy the user’s demands is nothing new. International technology consultant Stephen Ibaraki acknowledged as much in a recent interview: “It is happening. Ten years ago a chatbot was launched in China that some users adopted as a friend. And it was nothing compared with what we have now.”

But this usefulness can have devastating effects on people with psychological vulnerabilities, as was the case with Sewell Setzer. Tony Prescott, robotics researcher, professor at the University of Sheffield and author of The Psychology of Artificial Intelligence, argues that AI can be a palliative for loneliness, but that it carries risks.

“Social robots are designed specifically for personal interactions that involve human emotions and feelings. They can bring benefits, but also cause emotional harm at very basic levels,” warns Matthias Scheutz, director of the Human-Robot Interaction Laboratory at Tufts University (USA).

Humanizing chatbots with empathetic responses and voice and video tools increases the threat by offering more realistic and immersive interactions, leading users to believe they are with a friend or trusted interlocutor. An extreme application could be the temptation to maintain a virtual version of a deceased loved one, and thus avoid the grieving needed to go on with life.

Researchers are demanding that these developments be tested in closed environments (sandboxes) before release, that they be continuously monitored and evaluated, that the range of harm they can cause in different areas be analyzed, and that mechanisms to mitigate that harm be planned.

Shannon Vallor, a philosopher specializing in the ethics of science and artificial intelligence, warns of the danger that the new systems foster “frictionless” relationships that are also devoid of values: “They don’t have the mental and moral life that human beings have behind our words and actions.”

According to these experts, such supposedly ideal relationships discourage people from questioning themselves and advancing in their personal development, while encouraging the abandonment of real interaction and fostering dependence on machines that seek to flatter and to provide short-term gratification.
