An appeals court will give a Pennsylvania mother another chance to hold TikTok liable for the death of her 10-year-old daughter, who died while participating in a so-called “blackout challenge.”
According to NBC-Philadelphia, federal law and precedent typically hold that digital publishers are not liable for the consequences of third-party content hosted on their platforms. However, the U.S. Court of Appeals for the Third Circuit agreed to reinstate the family’s lawsuit, saying that TikTok could still be found liable for damages if its algorithm promotes dangerous content to children.
“TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its own first-party speech,” U.S. Circuit Judge Patty Shwartz wrote for the panel.
Attorneys for Tawainna Anderson, the mother of 10-year-old Nylah Anderson, had earlier argued that “blackout challenge” videos repeatedly appeared in the girl’s “For You” feed on TikTok—even after the company received reports that similar challenges had killed children in other parts of the country.
Nylah, notes NBC-Philadelphia, was found unresponsive in her bedroom closet in December 2021. She was taken to a local hospital and placed in intensive care, but died five days later.
“I cannot stop replaying that day in my head,” her mother said in a 2022 press conference. “It is time that these dangerous challenges come to an end so that other families don’t experience the heartbreak we live every day.”
The family’s lawsuit, also filed in 2022, was quickly dismissed under Section 230 of the 1996 Communications Decency Act—a controversial federal law that shields publishers and social media platforms from liability for third-party content posted to their websites and applications.
But on Tuesday, a three-judge appeals panel reversed part of the ruling—giving the Anderson family another chance to take their case to trial.
“Nylah, still in her first year of adolescence, likely had no idea what she was doing or that following along with the images on her screen would kill her,” U.S. Circuit Court Judge Paul Matey wrote in his concurring opinion. “But TikTok knew that Nylah would watch because the company’s customized algorithm placed the videos on her ‘For You Page.’”
Jeffrey Goodman, a lawyer representing Nylah’s mother, indicated that the court’s ruling could set a broad precedent.
“Big Tech just lost its ‘get-out-of-jail-free card,’” Goodman said in a statement.
Matey, in his partially concurring opinion, emphasized that TikTok, in its “pursuit of profits above all other values,” may well have a right to serve content catering to “the basest tastes” and “the lowest virtues.”
“But it cannot claim immunity that Congress did not provide,” Matey said.
Sources
After Girl, 10, Dies From Online Challenge, Family Warns Others
TikTok must face lawsuit over 10-year-old girl’s death, US court rules
TikTok must face lawsuit over 10-year-old Pa. girl’s ‘blackout challenge’ death, appeals court rules
TikTok must face lawsuit over Pennsylvania girl’s ‘blackout challenge’ death