Not long ago, internet searches were used to support accusations in court. Thanks to artificial intelligence, that is changing. Now, a query to an AI can itself be treated as a crime, especially if it asks for the most appropriate way to end someone’s life.
That’s precisely what happened to a 13-year-old: while at school, he logged on to a school device and typed a very direct query into OpenAI’s ChatGPT: “How to kill my friend in the middle of class.”
A school police officer immediately received an alert via Gaggle, an AI-powered school monitoring system. The officer went to Southwestern High School in DeLand, a city about an hour north of Orlando, where the query had originated.
Gaggle is a digital student-safety tool that schools use to monitor student activity in common platforms such as Google Workspace and Microsoft OneDrive.
It uses artificial intelligence and human reviewers to identify concerning content, flagging potential problems such as cyberbullying, threats of violence, self-harm, and suicide, with the goal of preventing tragedies and supporting students’ mental health by connecting them with counselors.
Basically, Gaggle’s AI scans communications and documents created on school-issued accounts and devices for harmful or concerning content. When something is flagged, alerts are sent to school authorities, who respond to ensure students’ safety and well-being.
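Conceptually, this kind of pipeline boils down to scanning text for concerning patterns and routing a hit to a human responder. The sketch below is a deliberately simplified toy illustration of that idea; it is not Gaggle’s actual implementation (which combines AI models with human review), and the phrase list, function name, and alert format are invented for illustration only.

```python
# Toy sketch of a content-monitoring step: scan text written on a school
# account for concerning phrases and emit alerts for human review.
# Hypothetical phrase list and alert schema -- not Gaggle's real logic.

CONCERNING_PHRASES = [
    "how to kill",
    "hurt myself",
    "end my life",
]

def scan_text(author: str, text: str) -> list[dict]:
    """Return one alert dict for each concerning phrase found in the text."""
    lowered = text.lower()
    return [
        {"author": author, "phrase": phrase, "action": "notify_school_officer"}
        for phrase in CONCERNING_PHRASES
        if phrase in lowered
    ]

# Example: the query from the DeLand incident would trigger one alert.
alerts = scan_text("student_device_42", "How to kill my friend in the middle of class")
print(alerts)
```

A real system would rely on language models and human reviewers rather than a fixed keyword list, precisely to reduce the false alarms mentioned later in this article.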
The teen claimed he was “just trolling” a friend, but police and school administrators understandably did not take it as a joke, especially given the long history of school shootings in the United States: so far this year there have been 341 shootings, in which 331 people have died and almost 1,500 have been injured.
Images circulating on social networks show the teenager, restrained, being taken from a patrol car. Law enforcement warned parents that their children should be very careful about what they ask ChatGPT.
“Another ‘prank’ that created an emergency on campus,” the local sheriff’s office declared, according to WFLA. “Parents, please talk to your children so they don’t make the same mistake.”
Police officers were able to respond quickly because Gaggle is installed on school devices, which allows it to detect whether a student has shown worrying behavior, whether directed at themselves or at others, and to block it.
Despite this, Gaggle has also been embroiled in controversy over false alarms, and it has been criticized for fostering an environment of state surveillance on school campuses.
It should be noted that most models, such as ChatGPT, Gemini, or DeepSeek, respond to this type of content or question by stating that their safety guidelines prevent them from answering. But they never report these activities. For now.