A court revives a lawsuit against TikTok after a girl died attempting the “fainting challenge”

Tawainna Anderson, a Pennsylvania resident, faced one of the worst scenes a mother could ever encounter in December 2021: she found her ten-year-old daughter Nylah hanging from the strap of a bag in her bedroom closet. The girl was admitted to the hospital, but nothing could be done to save her life.

A United States appeals court has revived the lawsuit Anderson filed against TikTok after her daughter died while participating in the so-called “fainting challenge” (widely known in English as the “blackout challenge”), in which users of the platform were dared to choke themselves until they lost consciousness.

Mother of the deceased minor argues that TikTok’s algorithm is designed to create dependency

In her case, the mother asserted that the social network, owned by the Chinese company ByteDance, which many suspect is a tool of Beijing’s espionage, “uses algorithms deliberately designed to maximize user engagement and dependency, with children being strongly incentivized to engage in a repetitive, dopamine-driven feedback loop by watching, sharing and attempting viral challenges and other videos.”

The complaint also states that “TikTok seeks to program children for the sake of corporate profits by promoting addiction.” This allegation, which highlights the platform’s lack of safeguards, has since been echoed by US lawmakers and the government, who have threatened to ban the app unless ByteDance divests it.


The case was dismissed by federal judge Paul S. Diamond in October 2022. In his ruling, he concluded that a statute (the Communications Decency Act of 1996) shields Internet companies from liability for content published by their users.

But a Philadelphia appeals court overturned that decision, ruling this Tuesday that the law does not prevent Nylah Anderson’s mother from arguing that TikTok’s algorithm recommended the challenge to her daughter, and that the lawsuit must therefore be allowed to proceed.

Judge Patty Shwartz, who wrote the opinion for the three-judge panel, noted that Section 230 of the 1996 law protects only information provided by third parties, not the recommendations that TikTok itself makes through the underlying algorithm on its platform.

She acknowledged that the decision departed from previous rulings by her court and others, which had held that Section 230 shields an online platform from liability even when it fails to prevent users from spreading harmful messages to others.

The court accepted Anderson’s argument that TikTok’s algorithm had recommended the challenge to her daughter

The new ruling draws on last July’s unanimous decision of the United States Supreme Court concerning state laws in Texas and Florida that sought to restrict the power of social media platforms to block content they consider objectionable, laws the platforms argued violated their free-speech rights.

The Supreme Court held that a platform’s algorithm in such cases reflects an “editorial judgment” about which third-party speech it compiles and how it presents it. Under that reasoning, Shwartz wrote, curating content by means of an algorithm is the company’s own speech, which is why it is not protected by Section 230.

“TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its own first-party speech,” the judge wrote in the appellate opinion.
