The debate surrounding the effectiveness of safeguarding software

Round-the-clock surveillance of students’ use of technology raises questions of privacy, but it can also enable important safeguarding interventions

This is an edited version of an article that originally appeared in The Guardian

In the midst of a pandemic and a national uprising, Teeth Logsdon-Wallace was kept awake at night last summer by the constant sound of helicopters and sirens.

For the 13-year-old from Minneapolis, who lives close to where George Floyd was murdered in May 2020, the pandemic-induced isolation and social unrest amplified the emotional distress he was experiencing as a result of gender dysphoria. His deepening depression landed him in the hospital after he tried to kill himself. During that dark stretch, he spent his days in an outpatient psychiatric facility, where he listened to a punk song on loop that promised things would soon ‘get better’; eventually, they did.

Logsdon-Wallace, a transgender eighth-grader, has since ‘graduated’ from weekly therapy sessions and is doing better, but that didn’t stop school officials from springing into action after he wrote about his mental health. In a school assignment last month, he reflected on his suicide attempt and how the anthem by the band Ramshackle Glory helped him cope – intimate details that wound up in the hands of district security.

The classroom assignment was one of thousands of Minneapolis student communications flagged by Gaggle, a digital surveillance company that saw rapid growth after the pandemic forced schools into remote learning. In an earlier investigation, the non-profit website The 74 analysed nearly 1,300 public records from Minneapolis Public Schools to expose how Gaggle subjects students to relentless, round-the-clock digital surveillance, raising significant privacy concerns for more than five million young people across the country who are monitored by the company’s algorithm and human content moderators.

But technology experts and families with first-hand experience of Gaggle’s surveillance dragnet have raised another issue: the service is not only invasive, it may also be ineffective.

Betrayed

In mid-September, a school counsellor called Logsdon-Wallace’s mother to let her know the system had flagged him for using the word ‘suicide’. The meaning of the classroom assignment – that his mental health had improved – was seemingly lost in the exchange between Gaggle and the school district. He felt betrayed. “I was trying to be vulnerable with this teacher and be like, ‘Hey, here’s a thing that’s important to me because you asked,’” Logsdon-Wallace said. “Now, when I’ve made it clear that I’m a lot better, the school is contacting my counsellor and is freaking out.”

Jeff Patterson, Gaggle’s founder and CEO, said in a statement his company does not “make a judgement on that level of the context”; ultimately, it’s up to school administrators to “decide the proper response, if any”.

Minneapolis Public Schools first contracted with Gaggle in the spring of 2020, as the pandemic forced students nationwide into remote learning. Through artificial intelligence and a team of human content moderators, Gaggle tracks students’ online behaviour every day by analysing materials on their school-issued Google and Microsoft accounts. The tool scans students’ emails, chat messages and other documents, including class assignments and personal files, in search of keywords, images or videos that could indicate self-harm, violence or sexual behaviour. The remote moderators evaluate flagged materials and notify school officials about content they find troubling.
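To make the general shape of that pipeline concrete, the sketch below shows a deliberately simplified, keyword-only flagger in Python. It is not Gaggle’s code or algorithm: the keyword lists, categories and the flag_document function are illustrative assumptions, and the real service reportedly layers machine-learning models and human review on top of far broader scanning of text, images and video.

# Illustrative sketch only -- NOT Gaggle's implementation.
# A toy keyword-based flagger of the kind described above: scan a
# document for category keywords and queue any hit for human review.

from dataclasses import dataclass

# Hypothetical keyword lists; a real system would use trained models,
# much larger vocabularies and image/video analysis as well.
KEYWORDS = {
    "self_harm": {"suicide", "self-harm", "kill myself"},
    "violence": {"gun", "shoot", "bomb"},
}

@dataclass
class Flag:
    student_id: str
    category: str
    matched_term: str
    excerpt: str

def flag_document(student_id: str, text: str) -> list:
    """Return a Flag for every keyword hit, with a short excerpt for reviewers."""
    flags = []
    lowered = text.lower()
    for category, terms in KEYWORDS.items():
        for term in terms:
            pos = lowered.find(term)
            if pos != -1:
                excerpt = text[max(0, pos - 40): pos + len(term) + 40]
                flags.append(Flag(student_id, category, term, excerpt))
    return flags

# A bare keyword match has no sense of context: an essay about recovering
# from a suicide attempt is flagged exactly like a message written in crisis.
if __name__ == "__main__":
    essay = "Listening to that song helped me cope after my suicide attempt last year."
    for f in flag_document("student-123", essay):
        print(f.category, "->", f.matched_term, "|", f.excerpt)

Even in this toy form, the limitation families describe is visible: the match is on a word, not its meaning, so a reflection on recovery trips the same wire as a message written in crisis.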

In Minneapolis, Gaggle flagged students for keywords related to pornography, suicide and violence, according to six months of incident reports obtained by The 74 through a public records request. The private company also captured their journal entries, fictional stories and classroom assignments.

Dearth of independent research

Gaggle executives maintain that the system saves lives, crediting it with saving more than 1,400 young people during the 2020-21 school year. Those figures have not been independently verified, though Minneapolis school officials make similar assertions. While the pandemic’s effect on suicide rates remains unclear, suicide has been a leading cause of death among teenagers for years. Patterson, who has watched his business grow by more than 20% during COVID-19, said Gaggle could be part of the solution. Schools nationwide have increasingly relied on technological tools that purport to keep kids safe, yet there’s a dearth of independent research to back up claims that these tools are effective.

Logsdon-Wallace’s mother, Alexis Logsdon, didn’t know Gaggle existed until she got the call from his school counsellor. “That was an example of somebody describing really good coping mechanisms, you know, ‘I have music that is one of my soothing activities that helps me through a really hard mental health time,’” she said. “But that doesn’t matter because, obviously, this software is not that smart – it’s just like ‘Woop, we saw the word!’”

Data obtained by The 74 offers a limited window into Gaggle’s potential effects on different student populations. Though the district withheld many details in the nearly 1,300 incident reports, just over 100 identified the campuses that the students involved attended. An analysis of those reports showed Gaggle was about as likely to issue incident reports in schools where children of colour were the majority as it was at campuses where most children were white. It remains possible that students of colour in predominantly white schools were disproportionately flagged by Gaggle, or faced disproportionate punishment once identified. Broadly speaking, Black students are far more likely to be suspended or arrested at school than their white classmates, according to federal education data.

Gaggle and Minneapolis district leaders acknowledge that students’ digital communications are forwarded to police in rare circumstances. Jason Matlock, the Minneapolis district’s director of emergency management, safety and security, said that the district had interacted with law enforcement about student materials flagged by Gaggle on several occasions, often involving students sharing explicit photographs of themselves. Such images could trigger police involvement if officials classify them as child pornography. During a six-month period from March to September 2020, Gaggle flagged Minneapolis students more than 120 times for incidents related to what officials deem child pornography, according to records obtained by The 74.

It is unclear whether any students faced legal consequences as a result.
