Florida high school student Alexa Manganiotis was startled by how quickly her school's surveillance software worked.
While interviewing a teacher for her school newspaper, the 16-year-old from West Palm Beach learned that two students had once typed something threatening about that teacher on a school computer. They quickly deleted it. But Lightspeed Alert — a monitoring tool piloted at Dreyfoos School of the Arts — flagged it. Within minutes, the students were removed from class.
"If an adult makes a super racist joke that's threatening on their computer, they can delete it, and they wouldn't be arrested," Alexa said. "But with kids, it's different."
Across Florida and the U.S., school districts are increasingly using artificial intelligence to scan what students write on school accounts and devices — even in what students think are private conversations. Programs like Gaggle and Lightspeed Alert monitor for signs of violence, bullying, or mental health crises, instantly alerting school officials and sometimes police.
In Polk County, a district of more than 100,000 students, nearly 500 alerts from Gaggle over four years led to 72 involuntary hospitalizations under the Baker Act — Florida's law allowing authorities to require psychiatric evaluations if someone appears to pose a risk to themselves or others. Critics say those interventions, often triggered by offhand remarks, can traumatize students.
"A really high number of children who experience involuntary examination remember it as a really traumatic and damaging experience — not something that helps them with their mental health care," said Sam Boyd, an attorney with the Southern Poverty Law Center.
Privacy advocates warn that this constant surveillance criminalizes adolescence — especially when teens don't realize their words are being watched.
In Tennessee, a joke leads to jail
That's what Lesley Mathis says happened to her 13-year-old daughter in Tennessee.
In August 2023, the eighth-grader was joking with friends on a school-monitored chat about being called "Mexican" because of her tanned skin. When asked what she was doing Thursday, she replied: "on Thursday we kill all the Mexico's."
The comment, flagged by Gaggle, led to her arrest before the school day ended. She was interrogated, strip-searched and held in a juvenile detention facility overnight, according to a lawsuit filed by her family. Her parents weren't allowed to see her until the next day.
"She told me afterwards 'I thought you hated me,' " Mathis recalled. "That kind of haunts you."
Mathis doesn't excuse the comment, calling it "wrong" and "stupid." But she said it was clearly not a threat.
"It made me feel like, is this the America we live in?" she said. "And it was this stupid, stupid technology that is just going through picking up random words and not looking at context."
A court sentenced the girl to eight weeks of house arrest, a psychological evaluation and 20 days at an alternative school.
Gaggle CEO Jeff Patterson said the system was not used properly.
"I wish that was treated as a teachable moment, not a law enforcement moment," he said.
Hard to know what works — or doesn't
Educators say they've saved lives by detecting early signs of self-harm or threats. But there's little public data about how often alerts are false alarms — and how many students suffer lasting effects from interventions.
In Lawrence, Kansas, for instance, a public records request showed that nearly two-thirds of 1,200 Gaggle alerts over 10 months were deemed nonissues. One batch of false positives included over 200 student homework assignments.
Students in a photography class were flagged for nudity after uploading course photos. The images were automatically deleted by Gaggle, and only students who had backed them up on personal devices could prove the images weren't inappropriate. The district later adjusted the software's settings.
Natasha Torkzaban, a recent graduate who was flagged for editing a friend's college essay with the phrase "mental health," said, "I think ideally we wouldn't stick a new and shiny solution of AI on a deep-rooted issue of teenage mental health and the suicide rates in America, but that's where we're at right now."
She and other students are now suing Lawrence Public Schools, alleging unconstitutional surveillance.
Meanwhile, Mathis says her daughter is doing better — though still terrified of seeing one of the officers who arrested her. The alternative school, she said, was surprisingly supportive. Teachers there gave students space to talk openly about their emotions without judgment.
"It's like we just want kids to be these little soldiers, and they're not," Mathis said. "They're just humans."
Copyright 2025 WUSF 89.7