AI surveillance in American schools: protection or infringement?

Under the sweeping wave of digitalization, American schools are increasingly adopting artificial intelligence (AI) monitoring technology to keep students safe. The technology watches school-issued devices around the clock, along with students' online activity whenever they log into school accounts, with the aim of catching warning signs such as mental health crises or shooting threats before they turn into tragedies. In practice, however, its rollout has set off heated debate over privacy and safety.

Take Vancouver Public Schools in Washington State as an example. The district chose software from Gaggle Safety Management to monitor students. The software uses machine-learning algorithms to scan, 24 hours a day, what students search for and write on school-issued laptops and tablets, as well as on personal devices whenever they are logged into their school accounts. The district's most recent contract, signed in the summer of 2024, cost US$328,036 over three years, roughly the price of hiring one additional counselor.

Between October 2023 and October 2024, nearly 2,200 students in the district, about 10% of enrollment, were the subject of a Gaggle alert. At the Vancouver School of Arts and Academics, roughly a quarter of students had communications that triggered an alert. The algorithm looks for indicators of potential problems such as bullying, self-harm, suicide, or school violence; when it flags something, it sends screenshots to human reviewers. If Gaggle employees judge the issue to be potentially serious, they alert the school. In an emergency, Gaggle calls school officials directly; in rare cases where no one answers, it may contact law enforcement to request a welfare check.

But reporters from The Seattle Times and the Associated Press obtained nearly 3,500 unredacted, sensitive student documents through a public-records request. The files were protected by neither a firewall nor a password, and students' names were plainly visible, an obvious security risk. The documents show that students used these laptops not only for schoolwork but also to confide personal struggles, including depression, heartbreak, suicide, addiction, bullying, and eating disorders.

Monitoring technology does do some good, helping counselors reach students who may be struggling silently and alone. But its harms cannot be ignored, and LGBTQ+ students are among the most vulnerable. In the screenshots released by Vancouver schools, at least six students were potentially outed to school officials after writing about being gay, transgender, or struggling with gender dysphoria. LGBTQ+ students are already more prone to depression and suicidal thoughts than their peers, and they often turn to the internet for support. In one screenshot, for example, a Vancouver high schooler wrote in a Google survey that they had been subjected to transphobic and racist slurs; the survey had falsely promised confidentiality.

Many parents are unaware that surveillance technology is used in their children's schools. When Pierce, a professor at the University of Washington, signed a responsible-use form before his son received a school laptop, he did not notice the language disclosing that Seattle Public Schools uses the surveillance software Securly.

Even families that know a school is monitoring may not be able to opt out. Owasso Public Schools in Oklahoma has used Gaggle to monitor students since 2016, yet Tim Reiland, the parent of two teenagers in the district, knew nothing about it for years. He only found out when he asked whether his daughter could bring her personal laptop to school and the district refused. After learning about Gaggle, Reiland's daughter Zoe said she felt "scared" and never used her school-issued Chromebook to search for anything personal, even about menstruation, for fear of being called to the office.

School officials firmly believe that the monitoring technology saves lives; in the Highline School District near Seattle, for example, a middle schooler who may have been a victim of trafficking used Gaggle to communicate with campus staff. From the perspective of developmental psychology, however, adolescents need a private online space to explore their thoughts and seek support as they grow up. Constant surveillance leaves them little room to make mistakes or sit with painful feelings on their own, and can stunt the development of a private life.

So far, no independent research has shown that AI monitoring significantly reduces student suicide or violence. A 2023 RAND study found only scant evidence of either benefits or risks. Benjamin Boudreaux, an AI ethics researcher and co-author of the study, put it plainly: "If you don't have enough mental health counselors, issuing more alerts will not actually improve suicide prevention."

AI monitoring in schools is a double-edged sword. How to balance keeping students safe with respecting their privacy remains an open question, one that schools, parents, vendors, and policymakers will have to work out together.
