One student asked a search engine, “Why does my boyfriend hit me?” Another threatened suicide in an email to an unrequited love. A gay teen opened up in an online diary about struggles with homophobic parents, writing they just wanted to be themselves.
In each case and thousands of others, surveillance software powered by artificial intelligence immediately alerted Vancouver Public Schools staff in Washington state.
Vancouver and many other districts around the country have turned to technology to monitor school-issued devices 24/7 for any signs of danger as they grapple with a student mental health crisis and the threat of shootings.
The goal is to keep children safe, but these tools raise serious questions about privacy and security, as became clear when Seattle Times and Associated Press reporters inadvertently received access to almost 3,500 sensitive, unredacted student documents through a records request about the district’s surveillance technology.
___
The Education Reporting Collaborative, a coalition of eight newsrooms, is investigating the unintended consequences of AI-powered surveillance at schools. Members of the Collaborative are AL.com, The Associated Press, The Christian Science Monitor, The Dallas Morning News, The Hechinger Report, Idaho Education News, The Post and Courier in South Carolina, and The Seattle Times.
___
The released documents show students use these laptops for more than just schoolwork; they are coping with angst in their personal lives.
Students wrote about depression, heartbreak, suicide, addiction, bullying and eating disorders. There are poems, college essays and excerpts from role-play sessions with AI chatbots.
Vancouver school staff and anyone else with links to the files could read everything. The documents were not protected by passwords or firewalls, and student names were not redacted, a lapse cybersecurity experts warned posed a massive security risk.
The monitoring tools often helped counselors reach out to students who might have otherwise struggled in silence. But the Vancouver case is a stark reminder of surveillance technology’s unintended consequences in American schools.
In some cases, the technology has outed LGBTQ+ children and eroded trust between students and school staff, while failing to keep schools completely safe.
Gaggle Safety Management, the company that developed the software tracking Vancouver students’ online activity, believes not monitoring children is like letting them loose on “a digital playground without fences or recess monitors,” CEO and founder Jeff Patterson said.
Roughly 1,500 school districts nationwide use Gaggle’s software to track the online activity of approximately 6 million students. It’s one of many companies, like GoGuardian and Securly, that promise to keep kids safe through AI-assisted web surveillance.
The technology has been in high demand since the pandemic, when nearly every child received a school-issued device. According to a U.S. Senate investigation, millions of students were subject to such monitoring products in 2021.
Vancouver schools apologized for releasing the documents. Still, the district emphasizes Gaggle is necessary to protect students’ well-being.
“I don’t think we could ever put a price on protecting students,” said Andy Meyer, principal of Vancouver’s Skyview High School. “Anytime we learn of something like that and we can intervene, we feel that is very positive.”
Dacia Foster, a parent in the district, commended the efforts to keep students safe but worries about privacy violations.
“That’s not good at all,” Foster said after learning the district inadvertently released the records. “But what are my options? What do I do? Pull my kid out of school?”
Foster says she’d be upset if her daughter’s private information was compromised.
“At the same time,” she said, “I would like to avoid a school shooting or suicide.”
How student surveillance works
Gaggle uses a machine-learning algorithm to scan what students search or write online via a school-issued laptop or tablet 24 hours a day, or whenever they log into their school account on a personal device. The latest contract Vancouver signed, in summer 2024, shows a price of $328,036 for three school years — approximately the cost of employing one extra counselor.
The algorithm detects potential indicators of problems like bullying, self-harm, suicide or school violence and then sends a screenshot to human reviewers. If Gaggle employees confirm the issue might be serious, the company alerts the school. In cases of imminent danger, Gaggle calls school officials directly. In rare instances where no one answers, Gaggle may contact law enforcement for a welfare check.
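Gaggle has not published how its system is built, but the flag-review-escalate workflow the district describes can be illustrated with a short, purely hypothetical Python sketch. Every name and function below is invented for illustration and is not drawn from the company’s actual software.

```python
# Hypothetical sketch of the flag -> human review -> escalation workflow
# described above. All names and logic here are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    NONE = 0       # nothing actionable
    SERIOUS = 1    # human reviewer confirms: alert the school
    IMMINENT = 2   # immediate danger: call school officials directly

@dataclass
class FlaggedItem:
    student_id: str
    screenshot_url: str   # screenshot of the activity that tripped the scan

def route_alert(item: FlaggedItem, model_flagged: bool,
                reviewer_confirms: bool, imminent: bool) -> Severity:
    """Walk one flagged document through the escalation ladder."""
    if not model_flagged:
        return Severity.NONE            # the automated scan found no indicators
    if not reviewer_confirms:
        return Severity.NONE            # human reviewer screens out false alarms
    if imminent:
        print(f"Calling school officials about {item.student_id} ...")
        # per the article, if no one answers, the company may ask
        # law enforcement to conduct a welfare check
        return Severity.IMMINENT
    print(f"Alert sent to school staff for {item.student_id}")
    return Severity.SERIOUS

# Example: a reviewer confirms a flag but sees no immediate danger.
item = FlaggedItem("student-123", "https://example.com/alert-screenshot.png")
print(route_alert(item, model_flagged=True, reviewer_confirms=True, imminent=False))
```

The key design point, reflected in the sketch, is that the machine-learning scan only nominates documents; a human reviewer decides whether anything reaches the school at all.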
A Vancouver school counselor who requested anonymity out of fear of retaliation said they receive three or four student Gaggle alerts per month. In about half the cases, the district contacts parents immediately.
“A lot of times, families don’t know. We open that door for that help,” the counselor said. Gaggle is “good for catching suicide and self-harm, but students find a workaround once they know they are getting flagged.”
Seattle Times and AP reporters saw what kind of writing set off Gaggle’s alerts after requesting information about the type of content flagged. Gaggle saved screenshots of activity that set off each alert, and school officials accidentally provided links to them, not realizing they weren’t protected by a password.
After learning about the records inadvertently released to reporters, Gaggle updated its system. Now, after 72 hours, only those logged into a Gaggle account can view the screenshots. Gaggle said this feature was already in the works but had not yet been rolled out to every customer.
The company says the links must be accessible without a login during those 72 hours so emergency contacts — who often receive these alerts late at night on their phones — can respond quickly.
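Gaggle has not said how these expiring links are generated. One standard technique that produces the behavior it describes, links that open without a login but stop working after a fixed window, is a signed, time-limited URL. The sketch below assumes that approach; the secret, paths and helper names are invented for illustration.

```python
# A hypothetical sketch of a 72-hour expiring link using an HMAC-signed URL.
# This is one common technique, not a description of Gaggle's actual system.
import hashlib, hmac, time

SECRET = b"server-side-secret"   # known only to the server issuing links
TTL_SECONDS = 72 * 3600          # the 72-hour window described above

def sign_link(path: str, now: float | None = None) -> str:
    """Append an expiry timestamp and a signature over (path, expiry)."""
    expires = int((now or time.time()) + TTL_SECONDS)
    msg = f"{path}?expires={expires}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{path}?expires={expires}&sig={sig}"

def link_is_valid(url: str, now: float | None = None) -> bool:
    """Accept the link only if the signature matches and it hasn't expired."""
    path, _, query = url.partition("?")
    params = dict(p.split("=", 1) for p in query.split("&"))
    msg = f"{path}?expires={params['expires']}".encode()
    expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, params["sig"]):
        return False                    # tampered or forged link
    # past the window, the server would instead demand a logged-in account
    return (now or time.time()) < int(params["expires"])

url = sign_link("/screenshots/abc123.png")
print(link_is_valid(url))                                      # True within 72 hours
print(link_is_valid(url, now=time.time() + TTL_SECONDS + 1))   # False afterward
```

Under this scheme the link itself carries its expiry, which is what lets an emergency contact open it from a phone at night with no password, and what makes anyone else who obtains the link, as the reporters did, equally able to open it until it lapses.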
In Vancouver, the monitoring technology flagged more than 1,000 documents for suicide and nearly 800 for threats of violence. While many alerts were serious, many others turned out to be false alarms, like a student essay about the importance of consent or a goofy chat between friends.
Foster’s daughter Bryn, a Vancouver School of Arts and Academics sophomore, was one such false alarm. She was called into the principal’s office after writing a short story featuring a scene with mildly violent imagery.
“I’m glad they’re being safe about it, but I also think it can be a bit much,” Bryn said.
School officials maintain alerts are warranted even in less severe cases or false alarms, ensuring potential issues are addressed promptly.
“It allows me the opportunity to meet with a student I maybe haven’t met before and build that relationship,” said Chele Pierce, a Skyview High School counselor.
Between October 2023 and October 2024, nearly 2,200 students, about 10% of the district’s enrollment, were the subject of a Gaggle alert. At the Vancouver School of Arts and Academics, where Bryn is a student, about 1 in 4 students had communications that triggered a Gaggle alert.
While schools continue to use surveillance technology, its long-term effects on student safety are unclear. There’s no independent research showing it measurably lowers student suicide rates or reduces violence.
A 2023 report found only “scant evidence” of either benefits or risks from AI surveillance, concluding: “No research to date has comprehensively examined how these programs affect youth suicide prevention.”
“If you don’t have the right number of mental health counselors, issuing more alerts is not actually going to improve suicide prevention,” said report co-author Benjamin Boudreaux, an AI ethics researcher.
LGBTQ+ students are most vulnerable
In the screenshots released by Vancouver schools, at least six students were potentially outed to school officials after writing about being gay, transgender or struggling with gender dysphoria.
LGBTQ+ students are more likely than their peers to suffer from depression and suicidal thoughts, and to turn to the internet for support.
“We know that gay youth, especially those in more isolated environments, absolutely use the internet as a life preserver,” said Katy Pearce, a University of Washington professor who researches technology in authoritarian states.
In one screenshot, a Vancouver high schooler wrote in a Google survey form they’d been subject to trans slurs and racist bullying. Who created this survey is unclear, but the person behind it had falsely promised confidentiality: “I am not a mandated reporter, please tell me the whole truth.”
When North Carolina’s Durham Public Schools piloted Gaggle in 2021, surveys showed most staff members found it helpful.
But community members raised concerns. An LGBTQ+ advocate reported to the Board of Education that a Gaggle alert about self-harm had led to a student being outed to their family, who were not supportive.
Glenn Thompson, a Durham School of the Arts graduate, spoke up at a board meeting during his senior year. One of his teachers promised a student confidentiality for an assignment related to mental health. A classmate was then “blindsided” when Gaggle alerted school officials about something private they’d disclosed. Thompson said no one in the class, including the teacher, knew the school was piloting Gaggle.
“You can’t just (surveil) people and not tell them. That’s a horrible breach of security and trust,” said Thompson, now a college student, in an interview.
After hearing about these experiences, the Durham Board of Education voted to stop using Gaggle in 2023. The district ultimately decided it was not worth the risk of outing students or eroding relationships with adults.
Parents don’t really know
The debate over privacy and security is complicated, and parents are often unaware it’s even an issue. Pearce, the University of Washington professor, doesn’t remember reading about Securly, the surveillance software Seattle Public Schools uses, when she signed the district’s responsible use form before her son received a school laptop.
Even when families learn about school surveillance, they may be unable to opt out. Owasso Public Schools in Oklahoma has used Gaggle since 2016 to monitor students outside of class.
For years, Tim Reiland, the parent of two teenagers, had no idea the district was using Gaggle. He found out only after asking if his daughter could bring her personal laptop to school instead of being forced to use a district one because of privacy concerns.
The district refused Reiland’s request.
When Reiland’s daughter, Zoe, found out about Gaggle, she says she felt so “freaked out” that she stopped Googling anything personal on her Chromebook, even questions about her menstrual period. She didn’t want to get called into the office for “searching up lady parts.”
“I was too scared to be curious,” she said.
School officials say they don’t track metrics measuring the technology’s efficacy but believe it has saved lives.
Yet technology alone doesn’t create a safe space for all students. In 2024, a nonbinary teenager at Owasso High School named Nex Benedict died by suicide after relentless bullying from classmates. A subsequent U.S. Department of Education Office for Civil Rights investigation found the district responded with “deliberate indifference” to some families’ reports of sex-based harassment, mainly in the form of homophobic bullying.
During the 2023-24 school year, the Owasso schools received close to 1,000 Gaggle alerts, including 168 alerts for harassment and 281 for suicide.
When asked why bullying remained a problem despite surveillance, Russell Thornton, the district’s executive director of technology, responded: “This is one tool used by administrators. Obviously, one tool is not going to solve the world’s problems and bullying.”
Long-term effects unknown
Despite the risks, surveillance technology can help teachers intervene before a tragedy.
A middle school student in the Seattle-area Highline School District who was potentially being trafficked used Gaggle to communicate with campus staff, said former Superintendent Susan Enfield.
“They knew that the staff member was reading what they were writing,” Enfield said. “It was, in essence, that student’s way of asking for help.”
Still, developmental psychology research shows it is vital for teens to have private spaces online to explore their thoughts and seek support.
“The idea that kids are constantly under surveillance by adults — I think that would make it hard to develop a private life, a space to make mistakes, a space to go through hard feelings without adults jumping in,” said Boudreaux, the AI ethics researcher.
Gaggle’s Patterson says school-issued devices are not the appropriate place for unlimited self-exploration. If that exploration takes a dark turn, such as making a threat, “the school’s going to be held liable,” he said. “If you’re looking for that open free expression, it really can’t happen on the school system’s computers.”
____
The Associated Press’ education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP’s standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.