The messaging platform Discord recklessly exposes children to graphic violent content, sexual abuse and exploitation, New Jersey’s attorney general said in a lawsuit filed Thursday.
New Jersey is the first state in the country to file suit against Discord, whose 200 million users can post in chat rooms and exchange direct messages with one another. Founded in 2015 as a chat tool for gamers, it has exploded in popularity in recent years among children, a trend that accelerated at the height of the pandemic.
The app’s popularity and limited safety controls have made its users easy targets for predators, prosecutors said in the suit, which was filed in Superior Court in Essex County.
“Discord markets itself as a safe space for children, despite being fully aware that the application’s misleading safety settings and lax oversight has made it a prime hunting ground for online predators seeking easy access to children,” the attorney general, Matthew J. Platkin, said in a statement announcing the suit.
Discord’s users must be 13 or older, according to the platform’s policies. But the suit says that because Discord accounts are so easy to create, and because users can adopt pseudonyms, younger children can evade the age restriction with little difficulty and adults can readily pose as children.
The complaint cites several criminal cases against adults in New Jersey who were accused of using the app to engage in explicit communication with children, solicit and send nude pictures and take part in sexual acts on video chat.
Jillian Susi, a spokeswoman for Discord, disputed the lawsuit’s claims in a statement.
“Discord is proud of our continuous efforts and investments in features and tools that help make Discord safer,” Ms. Susi said. “Given our engagement with the attorney general’s office, we are surprised by the announcement that New Jersey has filed an action against Discord today.”
New Jersey is not the only state where people have been accused of using Discord to target children. On Sunday, a California man was arrested and charged with kidnapping and engaging in unlawful sexual conduct with a minor after a 10-year-old girl was reported missing and the police found that the two had been communicating on Discord and Roblox, a gaming site popular with children.
In February, Discord and Roblox were named in a lawsuit filed in California on behalf of a 13-year-old boy who, according to the suit, was sexually exploited by an adult stranger on the apps.
In one especially gruesome case, a 47-year-old man in Michigan used Discord to contact children and advertise “livestreams of children engaging in self-mutilation” and sexually explicit activity, prosecutors said. He was sentenced to 30 years in prison.
A 2023 investigation by NBC News found 35 cases over a six-year period of grooming, sexual assault or kidnapping that involved communication on Discord.
The day after the report was published, Discord’s chief executive, Jason Citron, said that he and the company “take this stuff very seriously.”
He added: “As a parent, it’s horrifying.”
During a Senate hearing last January, lawmakers grilled Mr. Citron and the executives of other social media companies, including Meta, TikTok and X, about what they were doing to protect children from harmful content on their sites.
Lawmakers told the executives that they had “blood on their hands” and had created “a crisis in America.”
At the hearing, Mr. Citron said Discord was working with a tech company founded by the actor Ashton Kutcher to detect predatory conversations.
Discord was aware that its young users were vulnerable, the New Jersey suit argues. But it marketed its platform to parents as safe anyway, highlighting a feature that it said would automatically identify and delete direct messages that contained explicit images or videos.
Between 2017 and 2023, the app’s default setting applied the feature only to messages between users who were not friends, prosecutors said, meaning that explicit images exchanged between friends went unscanned by default.
Cari Fais, the director of the division of consumer affairs in Mr. Platkin’s office, said in a statement that Discord had deliberately misrepresented the application’s safety features.
“Discord claims that safety is at the core of everything it does, but the truth is the application is not safe for children,” Ms. Fais said in the statement.
Alyce McFadden is a reporter covering New York City and a member of the 2024-25 Times Fellowship class, a program for journalists early in their careers.