Details from a multi-state investigation into TikTok accidentally became public this week, revealing that officials at the wildly popular video app knew they were harming American teens, according to a new report.
The incriminating internal documents became public when reporters for Kentucky Public Radio realized that redacted sections of court documents instantly became unredacted when copied and pasted into a new text file, NPR reported Friday.
The court documents were from Kentucky’s portion of Tuesday’s coordinated 14-state effort to sue TikTok for what officials say is an addictive algorithm that endangers the mental and physical health of the children who use the app.
According to NPR, TikTok’s internal documents show officials at parent company ByteDance discussing internal studies finding that the app can harm children.
Young users can become addicted to the app after viewing 260 videos on the platform, TikTok employees determined, according to the documents. Given that most TikTok videos are about 8 seconds long, Kentucky’s authorities calculated that a child could be addicted after just 35 minutes of using the app.
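Kentucky’s back-of-the-envelope figure follows directly from the two numbers in the filings. A minimal check, using the approximate 8-second average video length cited above:

```python
# Reproduce Kentucky's estimate from the figures in the court filings:
# ~260 videos until habituation, at roughly 8 seconds per video.
videos_to_hook = 260
avg_video_seconds = 8  # approximate average cited in the filings

total_minutes = videos_to_hook * avg_video_seconds / 60
print(round(total_minutes, 1))  # 34.7 -> "just" about 35 minutes
```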
NPR says that in one of the newly revealed internal documents, TikTok employees say that heavy use of the app “correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.”
Adam Wandt, an associate professor and deputy chair for technology at the John Jay College of Criminal Justice, told Business Insider that copy-and-paste redaction errors are very common in the US court system and that he has encountered dozens of similar instances.
“Redacting documents is fairly difficult,” Wandt said. “Very often what you have is people just putting black bars in PDFs, or black text, but the text still remains.”
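The failure Wandt describes stems from how PDFs are built: the visible page is rendered from a stream of drawing operators, and a black rectangle is just another graphics operator layered over the text operators, which remain in the file. A simplified sketch (the content stream and text below are hypothetical, and real extraction uses a full PDF parser rather than a regex):

```python
# In a PDF content stream, text and graphics are separate operators.
# "Redacting" by drawing a filled black rectangle (`re` then `f`)
# paints over the glyphs but does not delete the text-showing
# operator (`Tj`) underneath - so copy-and-paste still finds it.
import re

# Hypothetical content stream: the sensitive text, then a black box on top.
content_stream = """
BT /F1 12 Tf 72 700 Td (sealed finding goes here) Tj ET
0 0 0 rg
70 690 300 20 re f
"""

def extract_text(stream: str) -> list[str]:
    """Pull every literal string passed to the Tj (show-text) operator."""
    return re.findall(r"\((.*?)\)\s*Tj", stream)

print(extract_text(content_stream))  # the "redacted" text is still there
```

Proper redaction tools remove the underlying text objects entirely, rather than drawing over them.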
In a statement to BI on Friday, a TikTok spokesperson said it is “highly irresponsible of NPR to publish information that is under a court seal.”
“Unfortunately, this complaint cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety,” a spokesperson for the company said. “We have robust safeguards, which include proactively removing suspected underage users, and we have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16. We stand by these efforts.”
TikTok has disputed the claims made in the state lawsuits, saying through a spokesperson on Tuesday that the app protects its youngest users through “robust safeguards.”
According to NPR, the unredacted internal documents show that TikTok touted its tools for limiting teens’ screen time despite knowing from its own research that these features had “little impact.”
Jayne Conroy, an attorney at Simmons Hanly Conroy, which represents some 50 plaintiffs in a class-action liability lawsuit accusing social media platforms of harming children, said the internal documents uncovered in the state investigations show that tech companies purposefully design products to “relentlessly engage and exploit the adolescent brain.”
“They have repeatedly worked to maximize both user engagement and profits, all at the expense of our young people’s mental health,” Conroy told BI. “That these companies knew and ignored this harm is exactly consistent with the allegations in our complaint,” she said.
Wandt told BI that the content of the documents is “not surprising at all.”
TikTok’s policies forbid users under 13 from making an account, but the unredacted internal documents show that TikTok tells moderators to use caution when removing accounts, according to NPR. One internal document reveals that TikTok tells its moderators to not take action on reports of underage users unless the account identifies them as under 13, NPR reported.
Wandt said it is “definitely within TikTok’s ability to prevent minors under a certain age from using their app.”
“However, from a business point of view, it’s not necessarily in their business interest,” he said. He called TikTok’s algorithm “one of the most dangerous influences on the planet right now” for children.
“So they have half-baked measures put in place and policies that they know don’t work, and it doesn’t surprise me one bit that their own internal research shows that it doesn’t work because they really have no incentive to fix the issues,” he said.
Matthew Bergman, a founding attorney of the Social Media Victims Law Center, which represents over 3,000 plaintiffs in cases of adolescents harmed by social media, told BI that the unsealed information is “certainly consistent with what we’re seeing” across TikTok and other social media.
“They design these products to be addictive, including through their endless scrolls,” Bergman said. “They make their money by showing kids not what they want to see but what they can’t look away from.”
The post TikTok knew its algorithm harmed kids, accidentally revealed internal documents show appeared first on Business Insider.