Computers, Privacy & the Constitution

She Fell in Love with ChatGPT. Her Students Were Already There: The Realities of School Surveillance and How to Combat It

-- By OrnaMadigan - 11 May 2025

She Fell in Love with ChatGPT. Like Actual Love. With Sex. Despite sounding like clickbait, this is the very real title of a recent episode of The New York Times's popular podcast, The Daily. In the episode, listeners meet Ayrin, a woman in a genuine sexual relationship with ChatGPT, OpenAI's large language model. At first, the story feels easy to dismiss: Ayrin must be some internet-troll recluse. But then The Daily shares a shocking reality. As it turns out, Ayrin is in fact not alone. According to one teacher interviewed for the episode, in her experience "3% to 5%" of her middle school class have AI partners. While many responded with shock, for me, a former teacher with two years of under-funded, over-crowded baggage to show for it, this came as no surprise. Even now, three years on from stepping into the classroom every day, the whir of Chromebooks and the ping of Google Classroom alerts still haunt me. As an educator who taught through the COVID-19 pandemic, I witnessed firsthand the rush to turn the classroom digital. While in the fall of 2020 we had few alternatives, it was palpable even then that our scramble to digitize the classroom would have profound and unknown impacts on our students. Three years later, as The Daily revealed, the consequences are finally materializing.

I. Introduction

However, rather than slowing down or reassessing after these early warning signs, the education system has only accelerated its embrace of digital technology. Districts have poured resources into Chromebooks and iPads while encouraging teachers to introduce platforms like Nearpod, CommonLit, and Google Classroom into their daily instruction. As a consequence of this tech integration, many schools have also adopted large-scale, subscription-based surveillance systems, justified as necessary measures to keep students "safe" in their new hybrid learning environments. While there has undoubtedly been a broader cultural push toward digital existence, schools are playing a uniquely powerful role in ushering in this shift for our youngest generations. By embedding technology into every aspect of the school day and mandating participation in digital systems, educational institutions have effectively made full-time digital engagement not just inevitable but required. Unfortunately, this comes at a serious, and still largely ignored, cost. Face-to-face experiences teach students how to read social cues, navigate conflict, build trust, and foster community, skills that screen time simply cannot replace. If students are conditioned to live primarily through digital platforms, we risk raising a generation less connected to their peers, less capable of collaboration, and ill-equipped for the emotional demands of life. These social-emotional harms are further compounded by the dangers of unnecessarily collecting vast amounts of students' personal data in centralized databases, and by the normalization of constant surveillance as an accepted part of daily existence. While fully addressing these harms will require a multi-pronged approach, this paper focuses on what I argue is the most pressing, and constitutionally vulnerable, issue: the 24/7 surveillance of students by schools through Education Technology (EdTech) platforms.

II. How We Got Here: A Brief History of School Surveillance

Within classrooms, student surveillance, sometimes called online activity monitoring, looks like platforms such as Google Classroom, Nearpod, and Lightspeed. These platforms arm teachers with a number of remote tools. For instance, Google Classroom allows teachers to assign virtual worksheets and view student progress, while Nearpod sends notifications to the instructor if a student opens an unassigned window. Lightspeed, on the other hand, provides teachers live feeds of students' screens during class time. While some of these platforms have been around since the early 2000s, the COVID-19 pandemic significantly boosted their adoption and usage in educational settings. As we gradually moved into a post-COVID world, the efficiencies gained and financial investments made during the pandemic prompted many teachers and school districts to keep using these technologies in their in-person classrooms. In fact, according to a 2023 survey compiled by ZipDo, 89% of K-12 instructors use educational technology in their classrooms, and in another survey 96% of teachers said their students use either laptops or tablets in the classroom at least once a month. Moreover, 89% of teachers reported that during the 2022-2023 school year, their schools used these platforms to monitor student activity on school-issued or personal devices. This in-class surveillance has largely been met with praise from teachers, for its time-saving and classroom-management attributes, and from parents, for its peace of mind. However, behind this sheen of increased student attentiveness, a troubling reality lurks. As teachers shifted to a more digital approach to in-person learning, schools rushed to catch up by providing school-issued Chromebooks and Wi-Fi routers, as well as creating school accounts for students to connect to services like Gmail and Zoom. According to a report by the EdWeek Research Center, prior to the pandemic fewer than half of district leaders reported providing each student a school-issued device. By March of 2021, that number had doubled to 94%. Although instituted in a good-faith effort to increase educational efficiency, this spread of technology has also brought forth a host of new problems that often reach far beyond the schoolhouse gates, from increases in cyber-bullying, to the spread of child pornography, to unfettered access to inappropriate content. As students' online activities became more diverse and their proficiency with these systems increased, traditional safeguards such as site "blocking" stood as only feeble roadblocks. Moreover, recent mass school-shooting incidents, as well as bullying-prompted suicides, have revealed disturbing online footprints, prompting many schools to seek protections that go far beyond simple content blocking.

To address these challenges and "safeguard" student safety in the digital environments now essential for in-person learning, schools are taking action by expanding their contracts with third-party EdTech service providers to enhance their online surveillance capabilities. Most of these systems, such as Gaggle, GoGuardian, and Social Sentinel, use a combination of AI and human reviewers to continuously track and analyze students' online data and flag anything "problematic," such as self-harm, pornography, and vulgarity. The specifics of what information is captured and how it is obtained differ between surveillance platforms. However, the scope of data collection can encompass information transmitted at any time via an internet connection on a school-owned device, on a school-managed internet connection (including school-issued home-internet hotspots), or through certain school-managed or school-connected accounts, regardless of whether the account is accessed from a personal device or home internet connection. Depending on the nature of the content, ranging from Google searches to pictures sent between classmates, the platforms then elevate flagged material to school administrators, parents, counselors, and sometimes even directly to law enforcement.

While this surveillance is indeed contingent upon a student's choice to use a school-issued device, Wi-Fi, or account, it is important to note the complexities of that "choice." For instance, since 2020 Chicago Public Schools have reported purchasing 311,000 laptops and tablets for students who needed one, and they are not alone. According to a survey by the U.S. Department of Education's National Center for Education Statistics, 88% of schools surveyed had a 1-to-1 computing program for the 2024–25 school year, providing every student a school-issued device. Although today technology may seem omnipresent, thousands of families across the US still lack access to technological devices, a reality that has come to be known as the digital divide. According to the Pew Research Center, "about four-in-ten adults with lower incomes [below $30,000 a year] do not have home broadband services (43%) or a desktop or laptop computer (41%)"; by contrast, broadband and computer ownership approach 100% in households earning $100,000 or more a year. This divide leaves many of the most economically vulnerable students with no choice but to access their digital worlds through school-issued devices. Therefore, even when cognizant of the surveillance implications, the absence of viable alternatives compels many students to rely exclusively on school-provided tools.

III. Why Parents Should Care (Before It’s Too Late)

While parental concerns about younger generations' increased use of technology, from excessive screen time, to violent video games, to the mental health impacts of social media, have dominated public conversation for decades, the role of schools in promoting technology use and surveillance has largely escaped scrutiny. The reasons for this oversight are not difficult to spot. During a recent talk at Columbia Law School, Professor Danielle Citron of the University of Virginia School of Law captured the underlying truth: "parents are scared." Faced with the unknowns of rapidly evolving technology, many parents feel ill-equipped to protect their children and turn to surveillance themselves, doing nightly checks of their children's devices or tracking their locations throughout the day, as a way to reclaim a sense of control. This normalization of online surveillance primes them to accept schools' similar surveillance practices. For many parents, more surveillance has become synonymous with protection. Yet this illusion of protection has blinded many to the darker realities of school surveillance, realities that, if fully understood, would likely alarm and outrage them. The sections below summarize some of the most pressing issues.

A. Normalization

As these school-surveillance systems gain ubiquity, their normalization follows suit. Students across the US are very quickly adapting to a new normal, one in which having their emails read, personal texts scrutinized, and photos inspected, even by the government, is just something to be expected. This normalization is already fostering a disturbing acceptance of privacy invasions, leading to widespread apathy toward the handover of personal digital data. In an interview with BuzzFeed News, one student summed up this reality, saying, "I feel like now I'm very desensitized to the threat of my information being looked at by people." Students are being conditioned to see privacy as optional rather than fundamental. This desensitization can have long-term consequences, making young people more vulnerable to data exploitation and less likely to question surveillance in all areas of their lives, including by employers, tech companies, and the government. While it might sound like something out of an Orwellian novel, the manipulation of personal data is already a reality. In the 2018 Facebook–Cambridge Analytica scandal, over 87 million users unknowingly had their data harvested and turned into psychological profiles used to deliver targeted political ads. As whistleblower Chris Wylie explained, "If you're talking to a conscientious person… you talk about the opportunity to succeed… Talk to a neurotic person, and you emphasize the security that it gives to my family." The scandal showed how easily data could be weaponized to influence behavior without consent. And it doesn't stop there. ByteDance, TikTok's parent company, has faced mounting scrutiny over its collection of sensitive user data, including location and biometrics, and the resulting vulnerability that this data could be accessed by the Chinese government for surveillance or propaganda purposes.
In the private sector, Target came under fire after using purchase data to infer a teenage girl's pregnancy, sending baby-related coupons before her own family knew, an unnerving example of how corporations can extract intimate insights from everyday consumer behavior. On the government side, Edward Snowden's 2013 revelations exposed how the NSA secretly collected millions of Americans' phone records and internet activity, often without warrants or oversight. Most recently, law enforcement and universities have monitored student protestors' social media accounts, leading to arrests, suspensions, and other disciplinary actions based on online activity. At the same time, organized doxxing campaigns have published protestors' names, photos, and personal details online in an attempt to intimidate and silence them. The unsettling reality is that data is power, and increasingly that power is being weaponized to serve both corporate and government agendas. From predictive algorithms to political surveillance, personal data is no longer just information; it is a tool for influence, manipulation, and control. No matter where you fall on the political spectrum, the potential for abuse is real, as data can be used to target, silence, or exploit individuals from all sides. So why, in this environment, are we conditioning students to hand over their data without question? By normalizing surveillance in schools, we aren't educating them to safeguard their rights; we are training them to surrender them.

B. Collecting Unnecessary Data Leaves Students Exposed

Moreover, the Cambridge Analytica scandal stands as an example of another significant concern this online school surveillance raises: data security. As Cambridge Analytica demonstrated, even the largest and most sophisticated tech platforms, such as Meta, struggle to safeguard user data. And Meta is not alone, with well-documented data breaches at companies from Cash App, to Equifax, to Target. More specifically, schools across the US and internationally have already found themselves targets of cyber hackers. "Between April 2016 and November 2022, K12Six recorded over 1,600 cyberattacks targeting school districts across the U.S." In March of 2023, Minneapolis Public Schools was hit by a cyberattack, with hackers stealing "district data, including files where children were identifiable." When district officials refused to pay the hackers a ransom, the hackers released the data online, including Social Security numbers, school security details, and information about "sexual assaults and psychiatric holds." These incidents are far from isolated. Recently, public schools in both New York and Arizona were hit by similar cyberattacks. In New York, nearly 45,000 students had their personal data exposed after MOVEit, a third-party file-transfer platform used by schools, was hacked. In Arizona, the Tucson Unified School District was targeted by the self-proclaimed "Royal" ransomware gang. Although the district refused to pay the ransom, the attack forced a two-week school closure. The gravity of these breaches was captured by parent Stacy Gosik in an interview with CBS: "Everything on my children — their doctor's information, bus stop information, medical records, where we live — was in the hands of criminals… It's terrifying." Gaggle, one of the most popular student surveillance platforms used by schools, claims that between July 2018 and June 2023 it collected and analyzed 27.7 billion student items and flagged 162 million of those for human review.
Although Gaggle's Student Data Privacy Notice asserts that the company has "implemented measures designed to secure [Personally Identifiable Information (PII)] from accidental loss and unauthorized access, use, alteration, and disclosure," including encrypting PII and dispersing data storage across two states, the same Notice acknowledges that "unfortunately, the transmission of information via the internet is not completely secure and, although we do our best to protect PII, neither we nor any other hosted service provider can guarantee the security of all personally identifiable information." Even if one agrees that Gaggle provides additional safety, in light of the documented cyberattacks on schools, it is difficult to deny that in doing so we may be making a deal with the devil, as the requisite data collection makes student data ripe for hacking. The more sensitive data we allow these surveillance platforms to access, from messages between friends and personal search histories to phone numbers, addresses, and casual after-school plans, the greater the damage when that data is inevitably breached. In a time of increasing cyber threats, are we really willing to put our students' personal data at increased risk?

C. Use of Untrained Human Reviewers

Moreover, despite Gaggle's promise that it relies on a "highly trained content review team," the truth of this statement has been largely discredited. In an investigation by Fast Company, interviews with former moderators revealed an "impersonal and cursory hiring process that seemed automated," and a "joke" of a training that was little more than a "slideshow and an online quiz." Yet more than 100 of these moderators are given daily access to students' "lengthy chat logs…nude photographs and, in some cases… students' names." This raises serious concerns not only about data security but about the ethics of exposing students' most private digital moments to underprepared, low-paid contractors operating with minimal oversight. Most parents would recoil at the idea of letting people they know and trust, neighbors, family friends, or the like, scroll through their child's personal messages or photo gallery; yet they are comfortable handing that power to minimally trained strangers hired by a tech vendor? The reality is that most parents, and even most schools, are simply unaware of the inner workings of these surveillance platforms. These systems are not just sleek AI tools scanning for danger; they rely on real people, sitting behind personal devices, combing through students' most sensitive and vulnerable data. That is not just invasive; it dramatically increases the risk of exposure, mishandling, or abuse.

D. Not to Mention, Surveillance Does Not Actually Work

Even more striking is how the magnitude of the risks posed by student surveillance platforms contrasts with the limited evidence that these tools actually improve student safety. Gaggle, for example, claims it has "prevented 722 students from committing suicide," yet this number is impossible to verify, as no public data or independent evaluations support the claim. In fact, only a handful of schools have cited Gaggle in relation to averting threats. One school district reported that over 83% of items flagged by Gaggle were minor violations, including a file titled "Odyssey Essay" that contained the word "bastard." Among so-called "major violations" were at least a dozen instances flagged solely for the use of the word "gay." This is no surprise when one learns that moderators are charged with reviewing 300 incidents per hour, an average of only 12 seconds per review. In March 2022, in response to an investigation by Senators Ed Markey and Elizabeth Warren into the efficacy of ed-tech surveillance, Gaggle CEO Jeff Patterson submitted a letter that reads more like public relations than accountability. Rather than offering concrete data on reductions in self-harm, bullying, or violence, Patterson offered vague anecdotes and generalized assurances. He avoided key questions, such as how Gaggle handles false positives or whether its interventions are empirically validated, relying instead on ambiguous phrasing like "we believe" and "we strive." The letter exposes a troubling truth: Gaggle's surveillance is not grounded in robust, transparent evidence. As Danielle Citron recently noted in an article for the Stanford Law Review, this pattern of evasion is not unique to Gaggle. In 2021, she recounts, "Vice asked Bark to support its claim that it had prevented sixteen school shootings. After refusing to provide evidence, the company removed the statistic from the top of its homepage." These evasions are especially disturbing when viewed alongside tragedies like the 2022 Uvalde school shooting, where the district was actively using Social Sentinel, another ed-tech surveillance platform, yet the shooter still managed to kill 19 students and 2 teachers and injure 17 more.

IV. Parent Toolkit: Lessons from Mahanoy and the First Amendment

Given the extensive risks posed by the school surveillance platforms outlined above, parents should be actively advocating for their reduction or removal, not only from their own children's schools but from all schools. While community organizing, joining school boards, and advocating at school meetings are important methods of invoking change, legal precedent shows that constitutional arguments can be a powerful tool in challenging schools' surveillance of students. In particular, strategic litigation centered on First Amendment violations offers a viable path to disrupt the unchecked growth of student surveillance. The following section outlines how parents and advocates can leverage constitutional protections to hold schools and EdTech surveillance platforms accountable and push back against the normalization of these dangerous monitoring practices, before it is too late.

Although the extent of surveillance differs by platform, for purposes of reviewing these issues I will use Gaggle, a widely used student monitoring platform, as a case study. Gaggle is a subscription-based, 24/7 student-activity monitoring platform that integrates with students' Microsoft 365 and Google G Suite accounts. Once a school district contracts with Gaggle, which some report costing $140,000 annually, Gaggle begins using a two-tiered AI and human-review system that utilizes "keywords, algorithms, and machine learning to identify content that indicates students planning self-harm, bullying, abuse, or school violence." These scans encompass students' emails, chat messages, photos, and any other content uploaded to their Google Drive or Microsoft accounts. Furthermore, if a student has linked a social media account to a school-provided email, the scans encompass social media alerts that often contain excerpts of the posted content. Gaggle then employs AI-powered review software that cross-references the scanned content against a Gaggle-created list of banned words and phrases, as well as running images through an "Anti-Pornography Scanner." Most schools opt for the generic, Gaggle-generated Blocked Word List. Gaggle claims this list "contains words and phrases that 'indicate[] students planning self-harm, bullying, abuse, or school violence,'" and is "updated frequently by Gaggle Safety Representatives to remain up-to-date with current trends." If schools wish to tailor the list of words that trigger alerts, Gaggle provides that option, but at a cost: the ability to add regional or localized words and slang is only available through Gaggle's more expensive "Premium" version.
Although Gaggle does not publicize its Blocked Word List, an investigation by BuzzFeed News was able to identify 90 words that caused alerts in an Illinois school district. While most of these words and phrases have been met with agreement, Gaggle found itself in hot water last year for its inclusion of LGBTQ-specific words such as "lesbian," "gay," and "queer," even prompting a public retraction of support from the popular LGBTQ+ organization the Trevor Project. Despite Gaggle's claims that these words were included because LGBTQ+ students are more likely to face mental health challenges, in January 2023 it announced the removal of LGBTQ-specific words and phrases. Once the Gaggle scanning software identifies flagged material, it sends the data to one of 125 "Level 1 Safety Representatives." Gaggle claims these human reviewers are tasked with analyzing the flagged content "through the eyes of human intelligence and reading the surrounding text beyond the sentence in question." Following review, if they determine that the content indicates a mental health issue, a threat, or potential harm, they escalate the material for a second human review. Although Gaggle describes these private contractors as "trained professionals," the veracity of that claim has been hotly debated, both in reviews left by former employees on Glassdoor, a job review website, who noted that the application process is not "particularly rigorous," and in an exposé by Fast Company, which interviewed former employees who "described their training as 'a joke.'" The final step in the Gaggle review process is the alert. Following the second review, the reviewer categorizes the material according to the urgency of the identified threat or problem. Depending on the material's categorization, a variety of actions may be taken, including emailing the content to school administrators, making phone calls to alert school administrators or parents, or even alerting local police departments, a step typically reserved for students who have expressed potentially life-threatening feelings or plans after school hours. All flagged content is also uploaded to a Safety Management Dashboard that schools can track to see school-wide trends.
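The two-tiered "scan, flag, escalate" process described above can be sketched in a few lines of code. This is a hypothetical illustration only: the keyword entries, severity categories, and escalation rules below are invented for demonstration and do not reflect Gaggle's actual (non-public) Blocked Word List or internal logic.

```python
# Hypothetical sketch of a two-tier keyword flag-and-escalate pipeline.
# All terms and rules here are invented examples, not Gaggle's real list.

BLOCKED_TERMS = {
    "hurt myself": "possible_self_harm",    # invented example entry
    "bring a weapon": "possible_violence",  # invented example entry
    "bastard": "profanity",                 # invented example entry
}

def tier1_scan(item: str):
    """Automated pass: return a category if any blocked term appears."""
    text = item.lower()
    for term, category in BLOCKED_TERMS.items():
        if term in text:
            return category
    return None

def tier2_review(category: str) -> str:
    """Stand-in for the human-review tier: decide what to do with a flag.
    In the real system this judgment is made by human reviewers."""
    if category in ("possible_self_harm", "possible_violence"):
        return "escalate_to_administrators"
    return "log_only"  # minor violations are merely logged

def process(items):
    """Run every student item through both tiers; collect resulting alerts."""
    alerts = []
    for item in items:
        category = tier1_scan(item)
        if category is not None:
            alerts.append((item, category, tier2_review(category)))
    return alerts

if __name__ == "__main__":
    sample = [
        "my odyssey essay called him a bastard",
        "i want to hurt myself",
        "see you at practice",
    ]
    for item, category, action in process(sample):
        print(f"{category}: {action}")
```

Note that the naive substring matching in `tier1_scan` is exactly what produces false positives like the "Odyssey Essay" flag discussed elsewhere in this paper: the scanner sees a blocked string, not its context, and everything downstream, including human review, starts from that context-free match.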

Constitutional Violations - First Amendment

While many courts have attempted to delineate students' First Amendment rights, the Supreme Court cases Tinker v. Des Moines and Mahanoy Area School District v. B.L. stand as the two pivotal guideposts for a surveillance-related analysis. Tinker set out the foundational understanding that students are entitled to First Amendment protections so long as the speech in question does not "materially disrupt classwork or involve[] substantial disorder or invasion of the rights of others." However, as the digital age blurred students' home and school lives, the boundaries of Tinker became unclear. It was at this moment that Mahanoy stepped in, setting out three factors courts should analyze to determine whether a student's speech holds characteristics that "diminish the strength of the unique educational characteristics that might call for special First Amendment leeway."

A. In Loco Parentis

The first factor pertinent to the Mahanoy analysis is whether or not the school stood in loco parentis. In Mahanoy, the Court explained that "the doctrine of in loco parentis treats school administrators as standing in the place of students' parents under circumstances where the children's actual parents cannot protect, guide, and discipline them." As the Court clarified, skepticism is warranted regarding a school's ability to continue in this role once a student leaves campus. Gaggle scans a plethora of student communications, "from students' emails, chat messages and other materials uploaded to students' Google or Microsoft accounts." Additionally, although Gaggle claims not to access students' social media accounts, such as Instagram or TikTok, if a student has signed up for a social media account using a school-issued email, Gaggle is privy to any email alerts or snippets sent by those platforms. Upon reviewing Gaggle's oversight of students' speech, while Gaggle may have an argument for employing this surveillance for activity completed during school hours or, via geo-location, on school campus, as it pertains to off-campus speech it appears improbable that the school could be considered in loco parentis. While it is true that the internet offers broader and more challenging-to-regulate search and communication options compared to, for instance, a visit to the library, this fact should not be used to cloud the truth that when a student is surfing the web in their free time after school, the student's actual guardians have the capability to "protect, guide, and discipline them." Now, a major difference between this situation and Mahanoy is that here, on many occasions, the schools are the providers of the device that has made the communication possible: the Chromebook, Wi-Fi, or Google account.
Again, although this may at first appear to blur the reasonable lines, the core issue for purposes of in loco parentis is not who provided the item, but who stands as the guardian in the moment the student is using it. As directly stated in many schools' policies, if a student loses a school-issued instrument or textbook, the financial responsibility for replacing that item falls on the parent or guardian (07-000 Miss. Code R. § 011.1.1; Cal. Educ. Code § 48904(b)(1)). While this scenario is somewhat different from the risks associated with a school-issued laptop, it underscores a core point: once a student brings these items home, regardless of who provided them, it becomes the parent or guardian's responsibility to oversee their maintenance and care.

B. The Full 24-Hour Day

Turning to the next factor, the Mahanoy Court noted that because regulations of off-campus speech may "include all the speech a student utters during the full 24-hour day," courts must be more skeptical of a school's efforts to regulate off-campus speech, as doing so may mean the student cannot engage in that speech at all. Under Gaggle's reporting system, students' online speech, including their emails, chats, and other communications via school-connected accounts, will be elevated to school administrators and/or authorities no matter the time or manner of the communication. This continuous surveillance means students must refrain from using words the school has deemed "banned" or "problematic" at all times. From even an initial review, it appears evident that this aspect of Gaggle's surveillance is constitutionally problematic. However, schools may argue that they are not stopping "all speech," as students remain free to speak as they please verbally or through non-school-affiliated accounts. Even if this is accurate, the extent to which students' lives now occur online complicates things. According to a 2021 report by Common Sense, students between the ages of eight and twelve average five and a half hours of screen time daily. That number jumps to an astounding eight and a half hours for students aged thirteen and above. Furthermore, as discussed above, the digital divide leaves many students with no choice but to use school-issued devices, complicating the claim that they can easily just use another device. In conclusion, although application of this factor may raise interesting arguments, the pervasiveness of students' digital worlds, as well as the disproportionate impact surveillance places on lower-income students, is likely to evidence a chilling effect on student speech.

C. Protecting Students' Unpopular Expression

Finally, the third factor: the school's own "interest in protecting a student's unpopular expression." In explaining this factor, the majority in Mahanoy emphasized the important role schools have played, and continue to play, in developing young minds and ensuring an informed public opinion, noting that "America's public schools are the nurseries of democracy." Here, the deterrent impact of Gaggle's constant surveillance significantly undermines this interest. Over the past decade, there has been a drastic change in the way students participate in political and social dialogue. For instance, 70% of young people stated they received information about the 2020 election from social media, and another 36% reported sharing political content via social media in the week prior to being surveyed. Long gone are the days of neighborhood rallies; here to stay are dedicated Instagram pages, chat rooms, and Reddit threads. As for Gaggle, although it claims only to flag a "limited" list of words and phrases, on its own homepage it boasts that during the 2020-2021 school year it analyzed 10.1 billion student items. Moreover, an investigation by EdWeek exposed the frequency of false positives, noting that students have been falsely flagged for harmless biology projects, poetry portfolios, and even English essays. This evidences the bleak reality that although all speech may not lead to repercussions, all speech may be subject to review. When students become aware that their every online utterance or upload may be reviewed, they will be much less likely to speak and search freely online. As outlined by The 74, this can discourage vulnerable groups from seeking services or support, lower the likelihood of students developing into confident adults, negatively impact students' ability to think critically, and hinder the development of the skills necessary to exercise their rights.

D. Schools’ Interest - Protecting Against Student Violence

However, as in Mahanoy, the court will also likely consider the school’s interest in continuing to use these surveillance platforms, particularly regarding off-campus speech that could affect the educational environment. On its website, Gaggle touts itself as a “pioneer in helping K-12 districts manage student safety on school-provided technology,” claiming that it has “helped hundreds of districts avoid tragedies and save lives.” Most of the schools that use Gaggle parrot this, claiming these surveillance tools help in spotting potentially dangerous individuals and in identifying students who might be facing unnoticed mental health challenges. These interests are understandable given the recent increase in school shootings, the well-documented digital trails left by school shooters such as Parkland’s Nikolas Cruz, and the prevalence of student mental health crises. The reality, however, is that constant surveillance has shown little efficacy in remedying these issues. During the 2018-19 school year, Gaggle claimed it analyzed and viewed more “than 3.9 billion items of which 70 million items were reviewed for suspicious content.” Of that content, it also “claimed that it prevented 722 students from committing suicide.” However, as noted above, with little publicly available data and no area-specific tracking measures, it is hard, if not impossible, to corroborate the veracity of these statements. Moreover, despite being in use for almost a decade, only a few schools have actually credited Gaggle with halting potential threats, and even some of those threats were unverifiable.
According to data from the Grand Rapids School District, between August and February of 2021 the district received nearly 3,000 incident flags from Gaggle, and more than 83% of the flagged items were minor violations, mostly content that simply included profanity, such as a “file named ‘biology project’ with the word ‘shit’ in it” and “a file named ‘Odyssey Essay’ with the word ‘bastard’ in it.” Even more troubling, among the items flagged as “major violations” were at least a dozen students flagged for storing or sending files containing the word “gay.” Additionally, many scholars have voiced concern over school surveillance models, noting that such constant surveillance does little to deter violence and even risks “magnifying existing racial biases,” along with other biases, such as those based on sexual orientation. Although schools understandably want to protect students, the reality is that this surveillance method has shown little evidence of being able to do so.
Another interest schools may cite in support of these systems is the deterrence of bullying, especially cyberbullying, which indeed poses a risk of affecting the in-school educational environment. The Mahanoy court, however, largely left the subject untouched. Justice Breyer briefly mentioned online bullying in his majority opinion, writing that “circumstances that may implicate a school’s regulatory interests include serious or severe bullying or harassment,” and Justice Alito noted in his concurrence that “bullying and severe harassment…are not easy to define with the precision required for a regulation of speech.” Still, in dismissing the Mahanoy School District’s interest in “prevent[ing] disruption,” the court specifically emphasized the lack of evidence of any “substantial disruption” or “harm to the rights of others,” noting that the Snapchat post caused merely 5-10 minutes of in-class distraction over a few days and upset some of the other cheerleaders.
Applying this to Gaggle, it seems unlikely that schools will be able to provide the level of evidence necessary to demonstrate that each instance of surveillance led to the avoidance of a substantial disruption. Furthermore, in one of the few cases that has attempted to apply Mahanoy, J.S. v. Manheim School District, the court held not only that a student’s meme posted to Snapchat targeting another student did not rise to the level of disruption required by Tinker, but moreover that the school’s own investigation caused more disruption than the meme itself.

Conclusion

Listeners of The Daily may wonder how we got here, but the answer is hiding in plain sight. While the factors driving students toward AI relationships are complex, our school systems offer unsettling clues. Many parents feel unprepared to understand AI, but that shouldn’t stop them from advocating for their children’s rights. As Mahanoy suggests, you don’t need to understand Snapchat streaks, TikTok trends, or Instagram finstas to know when a line has been crossed. Even a cursory look behind Gaggle’s curtain makes clear that here it has. The solution is therefore twofold: raise awareness, through articles such as this one, and arm parents with tools, like First Amendment arguments, to fight back. The future of our nation is in our hands; the question is whether those who have the power will fight back.


You are entitled to restrict access to your paper if you want to. But we all derive immense benefit from reading one another's work, and I hope you won't feel the need unless the subject matter is personal and its disclosure would be harmful or undesirable. To restrict access to your paper simply delete the "#" character on the next two lines:

Note: TWiki has strict formatting rules for preference declarations. Make sure you preserve the three spaces, asterisk, and extra space at the beginning of these lines. If you wish to give access to any other users simply add them to the comma separated ALLOWTOPICVIEW list.

Navigation

Webs Webs

r3 - 11 May 2025 - 17:20:52 - OrnaMadigan
This site is powered by the TWiki collaboration platform.
All material on this collaboration platform is the property of the contributing authors.
All material marked as authored by Eben Moglen is available under the license terms CC-BY-SA version 4.
Syndicate this site RSSATOM