Room Scans & Eye Detectors: Robocops are Watching Your Kids Take Online Exams (April 18, 2024)

Remote proctoring tools like Proctorio have faced widespread pushback at colleges. Their use in K-12 schools has drawn far less scrutiny and awareness.

Updated, correction appended April 18

In the middle of the night, students at Utah’s Kings Peak High School are wide awake — taking mandatory exams. 

At this online-only school, which opened during the pandemic and has remained online-only ever since, students take tests from their homes at times that work best with their schedules. Principal Ammon Wiemers says it’s this flexibility that attracts students — including athletes and teens with part-time jobs — from across the state. 

“Students have 24/7 access but that doesn’t mean the teachers are going to be there 24/7,” Wiemers told The 74 with a chuckle. “Sometimes [students] expect that but no, our teachers work a traditional 8 to 4 schedule.” 

Any student who feels compelled to cheat while their teacher is sound asleep, however, should know they’re still being watched. 

For students, the cost of round-the-clock convenience is their privacy. During exams, their every movement is captured on their computer’s webcam and scrutinized by Proctorio, a remote proctoring tool. Proctorio software conducts “desk scans” in a bid to catch test-takers who turn to “unauthorized resources,” uses “face detection” technology to ensure there isn’t anybody else in the room to help and employs “gaze detection” to spot anybody “looking away from the screen for an extended period of time.” 

Proctorio then provides visual and audio records to Kings Peak teachers with the algorithm calling particular attention to pupils whose behaviors during the test flagged them as possibly engaging in academic dishonesty. 

Such remote proctoring tools grew exponentially during the pandemic, particularly at U.S. colleges and universities, where administrators seeking to ensure exam integrity during remote learning met with sharp resistance from students. Online petitions demanded an end to the surveillance regime, and the tools drew accusations of bias, including reports that set off a red flag when the software failed to detect Black students’ faces.  

A video uploaded to TikTok offers advice on how to cheat during exams that are monitored by Proctorio. (Screenshot)

At the same time, social media platforms like TikTok were flooded with videos purportedly highlighting service vulnerabilities that taught others how to cheat without getting caught.

K-12 schools’ use of remote proctoring tools, however, has largely gone under the radar. Nearly a year since the federal public health emergency expired and several years since the vast majority of students returned to in-person learning, an analysis by The 74 has revealed that K-12 schools nationwide — and online-only programs in particular — continue to use tools from digital proctoring companies on students, including those as young as kindergarten. 

Previously unreleased survey results from the nonprofit Center for Democracy and Technology found that remote proctoring in K-12 schools has become widespread. In its August 2023 survey, 36% of teachers reported that their school uses the surveillance software.

Civil rights activists, who contend AI proctoring tools fail to work as intended, harbor biases and run afoul of students’ constitutional protections, said the privacy and security concerns are particularly salient for young children and teens, who may not be fully aware of the monitoring or its implications. 

“It’s the same theme we always come back to with student surveillance: It’s not an effective tool for what it’s being claimed to be effective for,” said Chad Marlow, senior policy counsel at the American Civil Liberties Union. “But it actually produces real harms for students.” 

It’s always strange in a virtual setting — it’s like you’re watching yourself take the test in the mirror.

Ammon Wiemers, Principal Kings Peak High School

Wiemers is aware that the school, where about 280 students are enrolled full time and another 1,500 take courses part time, must make a delicate “compromise between a valid testing environment and students’ privacy.” When students are first subjected to the software, he said, “it’s kind of weird to see that a camera is watching,” but unlike the uproar at colleges, he said the monitoring has become “normalized” among his students and that anybody with privacy concerns is allowed to take their tests in person.

“It’s always strange in a virtual setting — it’s like you’re watching yourself take the test in the mirror,” he said. “But when students use it more, they get used to it.”  

Children ‘don’t take tests’

Late last year, Proctorio founder and CEO Mike Olsen published a blog post in response to research critical of the company’s efficacy. A tech-savvy Ohio college student had conducted an analysis and concluded Proctorio’s face-detection technology relied on an open-source software library with a high failure rate — including a failure to recognize Black faces more than half of the time. 
The student tested the company’s face-detection capabilities against a dataset of nearly 11,000 images, which depicted people of multiple races and ethnicities, with results showing a failure to distinguish Black faces 57% of the time, Middle Eastern faces 41% of the time and white faces 40% of the time. Such a high failure rate was problematic for Proctorio, which relies on its ability to flag cheaters by zeroing in on people’s facial features and movements. 

Olsen’s post sought to discredit the research, arguing that while the FairFace dataset had been used to identify biases in other facial-detection algorithms, the images weren’t representative of “a live test-taker’s remote exam experience.” 

“For example,” he wrote, “children and cartoons don’t take tests so including those images as part of the data set is unrealistic and unrepresentative.” 

Proctorio founder and CEO Mike Olsen published a blog post that countered research claiming the remote proctoring tool had a high fail rate — especially for Black students. (Screenshot)

To Ian Linkletter, a librarian from Canada embroiled in a long-running battle with Proctorio over whether its products were harmful, Olsen’s response was baffling. Sure, cartoon characters don’t take tests. But children, he said, certainly do. What he wasn’t sure about, however, was whether those younger test-takers were being monitored by Proctorio — so he set out to find out. 

He found two instances, both in Texas, where Proctorio was being used in the K-12 setting, including at a remote school tied to the University of Texas at Austin. Linkletter shared his findings with The 74, which used the government procurement tool GovSpend to identify other districts that have contracts with Proctorio and its competitors. 

More than 100 K-12 school districts have relied on Proctorio and its competitors, according to the GovSpend data, with a majority of expenditures made during the height of the pandemic. And while remote learning has become a more integral part of K-12 schooling nationwide, seven districts have paid for remote proctoring services in the last year. While extensive, the GovSpend database doesn’t provide a complete snapshot of U.S. school districts or their expenditures. 

“It was just obvious that Proctorio had K-12 clients and was being misleading about children under 18 using their product,” Linkletter said, adding that young people could be more susceptible to the potential harms of persistent surveillance. “It’s almost like a human rights issue when you’re imposing it on students, especially on K-12 students.” Young children, he argued, are unable to truly consent to being monitored by the software and may not fully understand its potential ramifications. 

Proctorio did not respond to multiple requests for comment by The 74. Founded in 2013, the company claims it provided remote proctoring services to education institutions globally during the height of the pandemic. 

In 2020, Proctorio sued Linkletter over a series of tweets in which the then-University of British Columbia learning technology specialist linked to Proctorio-produced YouTube videos, which the company had made available to instructors. Pointing to the video on the tool’s “Abnormal Eye Movement function,” Linkletter wrote that it showed “the emotional harm you are doing to students by using this technology.”

Proctorio’s lawsuit alleged that Linkletter’s use of the company’s videos, which were unlisted and could only be viewed by those with the link, amounted to copyright infringement and distribution of confidential material. In January, Canada’s Supreme Court declined to hear Linkletter’s claim that the litigation was specifically designed to silence him.

While there is little independent research on the efficacy of remote proctoring tools in preventing cheating, one 2021 study found that the software failed to flag test-takers who had been instructed to cheat. Researchers concluded the software is “best compared to taking a placebo: It has some positive influence, not because it works but because people believe that it works, or that it might work.” 

Remote proctoring costs K-12 schools millions

A rubric posted by UT High School, the online K-12 school operated by the University of Texas, indicates that Proctorio is used for Credit by Exam tests, which award course credit to students who can demonstrate mastery in a particular subject. For students in kindergarten, first and second grade, the school pairs district proctoring with a “Proctorio Secure Browser,” which prohibits test-takers from leaving the online exam to use other websites or programs. Beginning in third grade, according to the rubric uploaded to the school’s website, test-takers are required to use Proctorio’s remote online proctoring.

A UT High School rubric explains how it uses Proctorio software. (Screenshot)

Proctorio isn’t the only remote proctoring tool in use in K-12 schools. GovSpend data indicate the school district in Las Vegas, Nevada, has spent more than $1.4 million since 2018 on contracts with Proctorio competitor Honorlock. Spending on Honorlock by the Clark County School District surged during the pandemic, but as recently as October, it made a $286,000 purchase from the company. GovSpend records indicate the tool is used at the district’s online-only program, which claims more than 4,500 elementary, middle and high school students. Clark County school officials didn’t respond to questions about how Honorlock is being utilized. 

Meanwhile, dozens of K-12 school districts relied on the remote proctoring service ProctorU, now known as Meazure Learning, during the pandemic, records indicate, with several maintaining contracts after school closures subsided. Among them is the rural Watertown School District in South Dakota, which spent $18,000 on the service last fall. 

Aside from Wiemers, representatives for schools mentioned in this story didn’t respond to interview requests or declined to comment. Meazure Learning and Honorlock didn’t respond to media inquiries. 

At TTU K-12, an online education program offered by Texas Tech University, the institution relies on Proctorio for “all online courses and Credit by Examinations,” flagging suspicious activity to teachers for review. In an apparent nod to privacy concerns surrounding Proctorio, TTU instructs students to select private spaces for exams and, if they are testing in a private home, to get the permission of anyone else residing there before the test is recorded. 

Documents indicate that K-12 institutions continue to subject remote learners to room scans even after a federal judge ruled a university’s room scan unconstitutional. In 2022, a federal judge sided with a Cleveland State University student, who alleged that a room scan taken before an online exam at the Ohio institution violated his Fourth Amendment rights against unreasonable searches and seizures. The judge ruled that the scan was “unreasonable,” adding that “room scans go where people otherwise would not, at least not without a warrant or an invitation.” 


Marlow of the ACLU says he finds room scans particularly troubling — especially in the K-12 context. From an equity perspective, he said such scans could have disproportionately negative effects on undocumented students, those living with undocumented family members and students living in poverty. He expressed concerns that information collected during room scans could be used as evidence for immigration enforcement. 

“There are two fairly important groups of vulnerable students, undocumented families and poor students, who may not feel that they can participate in these classes because they either think it’s legally dangerous or they’re embarrassed to use the software,” he said. 

The TTU web page notes that students “may be randomly asked to perform a room scan,” where they’re instructed to offer their webcam a 360-degree view of the exam environment with a warning: Failure to perform proper scans could result in a violation of exam procedures.

“If you’re using a desktop computer with a built-in webcam, it might be difficult to lift and rotate the entire computer,” the web page notes while offering a solution. “You can either rotate a mirror in front of the webcam or ask your instructor for further instruction.”

‘A legitimate concern’ 

Wiemers, the principal in Utah, said that Proctorio serves as a deterrent against cheating — but is far from foolproof. 

“There’s ways to cheat any software,” he said, adding that educators should avoid the urge to respond to Proctorio alerts with swift discipline. In the instances where Proctorio has caught students cheating, he said that instead of being given a failing grade, they’re simply asked to retake the test. 

“There are limitations to the software, we have to admit that, it’s not perfect, not even close,” he said. “But if we expect it to be, and the stakes are high and we’re overly punitive, I would say [students] have a legitimate concern.”

During a TTU K-12 advisory board meeting in July 2021, administrators outlined the extent that Proctorio is used during exams. Justin Louder, who at the time served as the TTU K-12 interim superintendent, noted that teachers and a “handful of administrators within my office” had access to view the recordings. Ensuring that third parties didn’t have access to the video feeds was “a big deal for us,” he said, because they’re “dealing with minors.” 

While college students “really kind of pushed back” on remote proctoring, he noted that they only received a few complaints from K-12 parents, who recognized the service offered scheduling benefits. Like Wiemers, he framed the issue as one of 24-hour convenience. 

“It lets students go at their own pace,” he said. “If they’re ready at 2 o’clock in the morning, they can test at 2 o’clock in the morning.”

Correction: A copyright infringement case brought by Proctorio against longtime company critic Ian Linkletter is still being argued in court. An earlier version of this story mischaracterized the litigation as being ruled in Proctorio’s favor.

Exclusive: Dems Urge Federal Action on Student Surveillance Citing Bias Fears (Oct. 19, 2023)

A coalition of Democratic lawmakers on Thursday called on the U.S. Education Department to investigate school districts that use digital surveillance and other artificial intelligence tools in ways that trample students’ civil rights. 

In a letter to the department, the coalition expressed concerns that AI-enabled student monitoring tools could foster discrimination against marginalized groups, including LGBTQ+ youth and students with disabilities. The Education Department’s Office for Civil Rights should issue guidance on the appropriate uses of emerging classroom technologies, the lawmakers wrote, and crack down on practices that run afoul of existing federal anti-discrimination laws. 

“While the expansion of educational technology helped facilitate remote learning that was critical to students, parents and teachers during the pandemic,” the lawmakers wrote, “these technologies have also amplified student harms.” 




Lawmakers asked the Education Department’s civil rights office whether it has received complaints alleging discrimination facilitated by education technology software and whether it has taken any enforcement action related to potential civil rights violations. 

The letter comes in response to a recent national survey of educators, parents and students, the findings of which suggest that schools’ use of digital tools to monitor children online has harmed students based on their race, disability, sexual orientation and gender identity. The survey, conducted by the nonprofit Center for Democracy and Technology, found that while activity monitoring has become ubiquitous in schools and is intended to keep students safe, it’s used regularly as a discipline tool and routinely brings youth into contact with the police.

Findings from the CDT survey, lawmakers wrote, “raise serious concerns about the application of civil rights laws to schools’ use of these technologies.” Letter signatories include Democratic Reps. Lori Trahan of Massachusetts, Sara Jacobs of California, Hank Johnson of Georgia, Bonnie Watson Coleman of New Jersey and Adam Schiff of California. Trahan, who serves on the House Energy and Commerce Committee’s Innovation, Data and Commerce Subcommittee, has previously called for tighter student data privacy protections in the ed tech sector. 

The monitoring tools, such as those offered by for-profit companies GoGuardian and Gaggle, rely on artificial intelligence to sift through students’ online activities and flag school administrators — and sometimes the police — when they discover materials related to sex, drugs, violence or self-harm. 

Two-thirds of teachers reported that a student at their school was disciplined as a result of activity monitoring and a third said they know a student who was contacted by the police because of an alert generated by the software. 

Children with disabilities were more likely than their peers to report being watched, and special education teachers reported heightened rates of discipline as a result of activity monitoring. The findings, researchers argue, suggest schools may be violating federal laws that entitle children with disabilities to equal access to an education. Even beyond the technologies, students with disabilities are subjected to disproportionate levels of school discipline, including restraint and seclusion, when compared to their general education peers. 

Half of all students said their schools responded fairly to alerts generated by monitoring software, a sentiment shared by just 36% of LGBTQ+ youth. In fact, LGBTQ+ youth were more likely than their straight and cisgender peers to report that they or someone they know was disciplined as a result of monitoring. And nearly a third of LGBTQ+ youth reported that they or someone they know was outed because of the technology. 

More than a third of teachers said their school monitors students’ online behaviors outside of school hours — and sometimes on their personal devices. 

In a similar student survey, released this month by the American Civil Liberties Union, a majority of respondents expressed worries that the monitoring tools — despite being designed to keep them safe — could actually cause harm and a third said they “always feel” like they’re being watched. 

The 74 has reported extensively on schools’ use of digital surveillance tools to monitor students’ online behaviors, and the tools’ implications for youth civil rights. The company Gaggle previously flagged to administrators student communications that referenced LGBTQ+ keywords like “gay” and “lesbian.” The company says it halted the practice last year in the wake of pushback from civil rights activists. 

Given the survey findings, the lawmakers urged the Education Department to clarify “how educators can fulfill their civil rights obligations” as they develop policies related to artificial intelligence, whose rapidly evolving role in education more broadly — including students’ use of tools like ChatGPT — has become a topic of debate. 

“This research is particularly concerning due to linkages between school disciplinary policies and incarceration rates of our nation’s youth,” the coalition wrote, adding concerns that the tools can create hostile learning environments. 

White House Cautions Schools Against ‘Continuous Surveillance’ of Students (Oct. 4, 2022)

Updated, Oct. 5

The Biden administration on Tuesday urged school districts nationwide to refrain from subjecting students to “continuous surveillance” if the use of digital monitoring tools — already accused of targeting at-risk youth — is likely to trample students’ rights. 

The White House recommendation was included in an in-depth but non-binding white paper, dubbed the Blueprint for an AI Bill of Rights, that seeks to rein in the potential harms of rapidly advancing artificial intelligence technologies, from smart speakers featuring voice assistants to campus surveillance cameras with facial recognition capabilities. 

The blueprint, which was released by the White House Office of Science and Technology Policy and extends far beyond the education sector, lays out five principles: Tools that rely on artificial intelligence should be safe and effective, avoid discrimination, ensure reasonable privacy protections, be transparent about their practices and offer the ability to opt out “in favor of a human alternative.”




Though the blueprint lacks enforcement, schools and education technology companies should expect greater federal scrutiny soon. The White House announced that the Education Department would release by early 2023 recommendations on schools’ use of artificial intelligence that “define specifications for the safety, fairness and efficacy of AI models used within education” and introduce “guardrails that build on existing education data privacy regulations.” 

Education Secretary Miguel Cardona said officials at the department “embrace utilizing Ed Tech to enhance learning” but recognize “the need for us to change how we do business.” The future guidance, he said, will focus on student data protections, ensuring that digital tools are free of biases and incorporate transparency so parents know how their children’s information is being used.

“This has to be baked into how we do business in education, starting with the systems that we have in our districts but also teacher preparation and teacher training as well,” he said.

Amelia Vance, president and founder of Public Interest Privacy Consulting, said the document amounts to a “massive step forward for the advocacy community, the scholars who have been working on AI and have been pressuring the government and companies to do better.” 

The blueprint, which offers a harsh critique of systems that predict student success based on factors like poverty, follows in-depth reporting by The 74 on schools’ growing use of digital surveillance and the tech’s impact on student privacy and civil rights.

But local school leaders should ultimately decide whether to use digital student monitoring tools, said Noelle Ellerson Ng, associate executive director of advocacy and governance at AASA, The School Superintendents Association. Ellerson Ng opposes “unilateral federal action to prohibit” the software.

“That’s not the appropriate role of the federal government to come and say this cannot happen,” she said. “But smart guardrails that allow for good practices, that protect students’ safety and privacy, that’s a more appropriate role.”

The nonprofit Center for Democracy and Technology praised the report. The group recently released a survey highlighting the potential harms of student activity monitoring on at-risk youth, who are already disproportionately disciplined and referred to the police as a result. In a statement Tuesday, it said the blueprint makes clear “the ways in which algorithmic systems can deepen inequality.” 

“We commend the White House for considering the diverse ways in which discrimination can occur, for challenging inappropriate and irrelevant data uses and for lifting up examples of practical steps that companies and agencies can take to reduce harm,” CEO Alexandra Reeve Givens said in a media release. 

The document also highlights several areas where artificial intelligence has been beneficial, including improved agricultural efficiency and algorithms that have been used to identify diseases. But the technologies, which have grown rapidly with few regulations, have introduced significant harm, it notes, including hiring algorithms that screen job applicants and facial recognition technology. 

After the pandemic shuttered schools nationwide in early 2020 and pushed students into makeshift remote learning, companies that sell digital activity monitoring software to schools saw an increase in business. But the tools have faced significant backlash for subjecting students to relentless digital surveillance. 

In April, Massachusetts Sens. Elizabeth Warren and Ed Markey warned in a report the technology could carry significant risks — particularly for students of color and LGBTQ youth — and promoted a “need for federal action to protect students’ civil rights, safety and privacy.” Such concerns have become particularly acute as states implement new anti-LGBTQ laws and abortion bans and advocates warn that digital surveillance tools could expose youth to legal peril. 

Vance said that she and others focused on education and privacy “had no idea this was coming,” and that it would focus so heavily on schools. Over the last year, the department sought input from civil rights groups and technology companies, but Vance said that education groups had lacked a meaningful seat at the table. 

The lack of engagement was apparent, she said, by the document’s failure to highlight areas where artificial intelligence has been beneficial to students and schools. For example, the document discusses a tool used by universities to predict which students were likely to drop out. It considered students’ race as a predictive factor, leading to discrimination fears. But she noted that if implemented equitably, such tools can be used to improve student outcomes. 

“Of course there are a lot of privacy and equity and ethical landmines in this area,” Vance said. “But we also have schools who have done this right, who have done a great job in using some of these systems to assist humans in counseling students and helping more students graduate.” 

Ellerson Ng, of the superintendents association, said her group is still analyzing the blueprint’s on-the-ground implications, but that student data privacy efforts present schools with “a balancing act.”

“You want to absolutely secure the privacy rights of the child while understanding that the data that can be generated, or is generated, has a role to play, too, in helping us understand where kids are, what kids are doing, how a program is or isn’t working,” she said. “Sometimes that’s broader than just a pure academic indicator.”

Others have dismissed the blueprint as little more than a compilation of recommendations from civil rights groups and tech companies. Some of the most outspoken privacy proponents and digital surveillance critics, such as Albert Fox Cahn, founder and executive director of the Surveillance Technology Oversight Project, argued it falls short of a critical policy move: outright bans.

As Cahn and other activists mount campaigns against student surveillance tools, they’ve highlighted how student data can wind up in the hands of the police.

“When police and companies are rolling out new and destructive forms of AI every day, we need to push pause across the board on the most invasive technologies,” he said in a media release. “While the White House does take aim at some of the worst offenders, they do far too little to address the everyday threats of AI, particularly in police hands.”

Trevor Project Severs Ties with Surveillance Company Accused of LGBTQ Youth Bias (Sept. 30, 2022)

Updated 3:15 p.m. ET

Hours after the publication of this article Friday, The Trevor Project announced in a tweet it would return a $25,000 donation from the student surveillance company Gaggle, acknowledging widespread concerns about the monitoring tool’s “role in negatively impacting LGBTQ students.”

“Our philosophy is that having a seat at the table enables us to positively influence how companies engage with LGBTQ young people, and we initially agreed to work with Gaggle because we saw an opportunity to have a meaningful impact to better protect LGBTQ students,” the nonprofit said in the statement. “We hear and understand the concerns, and we hope to work alongside schools and institutions to ensure they are appropriately supporting LGBTQ youth and their mental health.” 

The move came after widespread condemnation on social media, with multiple supporters threatening to pull their donations to The Trevor Project moving forward. 

In a Friday statement, Gaggle spokesperson Paget Hetherington said the company wanted The Trevor Project’s “guidance on how to do what we do better.” The company also took down the web page where it previously touted the partnership. 

“We’re disappointed that The Trevor Project has decided to pause our collaboration,” she said. “However, we are grateful for the opportunity we have had to learn and work with them and will continue with our mission of protecting all students regardless of how they identify.” 

Original report below:

Amid warnings from lawmakers and civil rights groups that digital surveillance tools could discriminate against at-risk students, a leading nonprofit devoted to the mental well-being of LGBTQ youth has formed a financial partnership with a tech company that subjects them to persistent online monitoring. 

Recently, The Trevor Project, a high-profile nonprofit focused on suicide prevention among LGBTQ youth, began to list Gaggle as a corporate partner on its website, disclosing that the controversial surveillance company had given it between $25,000 and $50,000 in support. Meanwhile Gaggle, which uses artificial intelligence and human content moderators to sift through billions of student chat messages and homework assignments each year in search of students who may harm themselves or others, announced the partnership, noting the two were collaborating to “improve mental health outcomes for LGBTQ young people.” 

Though the precise contours of the partnership remain unclear, a Trevor Project spokesperson said it aims to have a positive influence on the way Gaggle navigates privacy concerns involving LGBTQ youth while a Gaggle representative said the company sees the relationship as a learning opportunity.

Both groups maintain that the partnership was forged in the interests of LGBTQ students, but student privacy advocates argue the relationship could undermine The Trevor Project’s work while allowing Gaggle to use the donation to counter criticism about its potential harms to LGBTQ students. The collaboration comes at a particularly perilous time for many students as a rash of states implement new anti-LGBTQ laws that could erode their privacy and expose them to legal jeopardy. 

Teeth Logsdon-Wallace, a 14-year-old student from Minneapolis with first-hand experience of Gaggle’s surveillance dragnet, said the deal could eliminate any motivation for Gaggle to change its business practices. 

“It really does feel like a ‘We paid you, now say we’re fine,’ kind of thing,” said Logsdon-Wallace, who is transgender. Without any real incentives to implement reforms, he said that Gaggle’s “seal of approval” from The Trevor Project could offer the privately held company reputational cover amid growing concerns that such surveillance tech is disproportionately harmful to LGBTQ youth. 

“People who want to defend Gaggle can just point to their little Trevor Project thing and say, ‘See, they have the support of “The Gays” so it’s fine actually,’ and all it does is make it easier to deflect and defend actual issues with Gaggle.” 

A screenshot showing that Gaggle is a corporate partner of The Trevor Project
Student surveillance company Gaggle is listed among “Corporate Partners” on The Trevor Project’s website (screenshot)

Gaggle’s monitoring practices were the subject of an earlier investigation by The 74. The company’s algorithm relies on keyword matching to compare students’ online communications against a dictionary of thousands of words the company believes could indicate potential trouble, including references to violence, drugs and sex. Among the keywords are “gay” and “lesbian,” verbiage the company maintains is necessary because LGBTQ youth are more likely than their straight and cisgender peers to consider suicide. 
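The keyword matching described above can be sketched in a few lines. The word list and matching rule below are hypothetical illustrations, not Gaggle's actual code; the point is that a flat dictionary lookup carries no sense of context.

```python
# Toy sketch of the dictionary-based keyword matching the article describes.
# Gaggle's real system is proprietary; this word list and matching logic are
# illustrative assumptions, not the company's actual implementation.

KEYWORDS = {"suicide", "gay", "lesbian"}  # hypothetical subset of a much larger dictionary

def flag(text: str) -> set:
    """Return any dictionary keywords found in a student's text.

    There is no context analysis: a reflection on recovery and a
    message written in crisis are flagged identically.
    """
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    return KEYWORDS & words

# Both of these trigger the same alert:
print(flag("Music therapy helped me cope after my suicide attempt."))  # {'suicide'}
print(flag("An essay about a lesbian poet."))                          # {'lesbian'}
```

As the sketch suggests, a supportive reflection on recovery or an English essay can trigger an alert just as readily as a message written in crisis, which is the failure mode students and families describe.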

But privacy and civil rights advocates have accused the company of discrimination for subjecting LGBTQ youth to heightened surveillance — a concern that has taken on new meaning this year as states like Florida adopt laws that ban classroom discussions about sexuality and could lead schools to out LGBTQ youth to their parents. 

A recent survey by the nonprofit Center for Democracy and Technology found that while Gaggle and similar student monitoring tools are designed to keep students safe, teachers reported that they were more often used to discipline students. LGBTQ youth were disproportionately affected. 

In a statement, a Trevor Project spokesperson said it’s important that digital monitoring tools keep students safe without invading their privacy and that the collaboration was built on Gaggle’s “desire to identify and address privacy and safety concerns that their product could cause for LGBTQ students.” 

“It’s true that LGBTQ youth are among the most vulnerable to the misuse of this kind of safety monitoring — many worry that these tools could out them to teachers or parents against their will,” the statement continued. “It is because of that very real concern that we have worked in a limited capacity with digital safety companies — to play an educational role and have a seat at the table so they can consider these potential risks while they design their products and develop policies.” 

But it remains unclear what policy changes have occurred at Gaggle as a result of the deal. Without offering any specifics, Gaggle spokesperson Paget Hetherington said in a statement the company is “honored to be able to align with The Trevor Project to better serve LGBTQ youth,” and that the company is “always looking for ways to learn and to improve upon what we do to better support students and keep them safe.” 

‘Faceless bureaucracy’ 

At its core, the partnership between Gaggle and The Trevor Project makes sense because both work to prevent youth suicides, said Amelia Vance, the founder and president of Public Interest Privacy Consulting. But their approaches to solving the problem, she said, are fundamentally different. 

By combing through digital materials on students’ school-issued Microsoft and Google accounts, Gaggle seeks to alert educators — and in some cases the police — of students’ online behaviors that suggest they might harm themselves or others.

“It really is about collecting details that kids may not be voluntarily sharing — information that they may be looking up to learn, to explore their identities, to otherwise help them in their day-to-day lives,” Vance said. At The Trevor Project, “you have proactive outreach from youth who know that they need help or they need a community.” 

Katy Perry smiles in front of a Trevor Project background, holding a poster that says "Be proud of who you are."
Katy Perry poses for a photograph during a fundraising event for The Trevor Project in 2012. (Mark Davis/Getty Images for Trevor Project)

The West Hollywood-based Trevor Project, which draws funding from corporate partners including Macy’s and AT&T, was founded in 1998. Gaggle, founded in 1999, does not publicly report its finances. The Dallas-based company says it monitors the digital communications of more than 5 million students across more than 1,500 school districts nationally. 

The Trevor Project trains volunteer crisis counselors and assesses the risk levels of people who reach out to it for help. If counselors with The Trevor Project believe a student is at imminent suicide risk, they may call the police. But it’s ultimately up to youth to decide which information they share with adults. 

It’s important for LGBTQ students to have trusted adults in whom they can confide their experiences, Vance said, rather than a system where “some faceless bureaucracy is finding out and informing your parents” about information they intended to keep private. 

A recent survey by The Trevor Project offers troubling data about the realities of the youth suicide crisis. Nearly half of LGBTQ youth said they seriously considered attempting suicide in the past year and 14% said they made a suicide attempt. 

This isn’t the first time The Trevor Project has faced scrutiny in recent months for its ties to companies that could have detrimental effects on LGBTQ youth. In July, a HuffPost investigation revealed that CEO and Executive Director Amit Paley previously helped create a strategic plan to boost opioid sales amid an addiction epidemic, one that has been implicated in suicide attempts among LGBTQ youth. 

The group knows firsthand how data can be weaponized. Just last month, online trolls who target the transgender community launched a campaign to clog up The Trevor Project’s suicide prevention hotline. 

Persistent student surveillance could exacerbate the challenges that LGBTQ youth face by subjecting them to disproportionate discipline and erroneously flagging their online communications as threats, Democratic Sens. Elizabeth Warren and Ed Markey warned in an April report. 

Nearly a third of LGBTQ students say they or someone they know has experienced the nonconsensual disclosure of their sexual orientation or gender identity — typically called “outing” — due to student activity monitoring, according to a recent survey by the nonprofit Center for Democracy and Technology. They were also more likely than their straight and cisgender peers to report getting into trouble at school and being contacted by the police about having committed a crime. 

A bar chart showing LGBTQ+ students are more likely to get in trouble for visiting a website or saying something inappropriate online; were more likely to be contacted by counselors or other adults at school about their mental health; and were more likely to be contacted by a police officer or other adult due to concerns about them committing a crime.
A recent survey by the nonprofit Center for Democracy and Technology found that student monitoring tools have disproportionate negative effects on LGBTQ youth. (Center for Democracy and Technology) 

In response to the survey results, a coalition of civil rights groups called on the U.S. Education Department to condemn the use of activity monitoring tools that violate students’ civil liberties and to state its intent “to take enforcement action against violations that result in discrimination.” The letter argues that using the tools to out LGBTQ students or to subject them to disproportionate discipline and criminal investigations could violate Title IX, the federal law prohibiting sex-based discrimination in schools. 

Among the letter signatories is the nonprofit LGBT Tech, which has raised concerns about the harms of digital surveillance on LGBTQ people. Christopher Wood, the group’s co-founder and executive director, said The Trevor Project’s partnership with Gaggle could be positive if it’s used to ensure that LGBTQ youth who are struggling have access to help. But once Gaggle gives student information to school administrators, the company can no longer control how those records are used, he said. 

A screenshot from Gaggle's website. Gray box with text that says Gaggle is a Proud Sponsor of The Trevor Project.
Gaggle says on its website that the student surveillance company “is proud to collaborate with The Trevor Project and improve mental health outcomes for LGBTQ young people.” (Screenshot)

“If that information is provided to someone who is not accepting, who has very different views and who willfully brings their political, personal or religious views into the school system, and they are not supportive of LGBTQ youth, then what they’ve done is harm the student,” Wood said. 

Yet as schools increasingly turned to student activity monitoring software during the pandemic, The Trevor Project portrayed the tools’ growth as an inevitable result of districts seeking “to avoid liability issues.”

“It is our stance that since these tools are not going anywhere, we think it’s important to do our part to offer our expertise around LGBTQ experiences,” the spokesperson said. 

A student holds up a peace sign with one hand and has the other wrapped around his dog
Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

The power of trust

In interviews, students flagged by Gaggle said their trust in adults suffered as a result. Among them is Logsdon-Wallace, the 14-year-old transgender student. Before the Minneapolis school district stopped using Gaggle this summer and state lawmakers put strict limits on digital surveillance in schools, the tool alerted district security when he used a classroom assignment to reflect on a previous suicide attempt and how music therapy helped him cope. That same assignment, which included references to his gender identity, was flagged to his parents. 

And while his parents are affirming, he has friends who live in less supportive environments.                                                                                                       

“I have friends who are queer and/or trans who are out at school but not to their parents,” he said. “If they want to be open with teachers, Gaggle can create a bad or even dangerous situation for these kids if their parents were contacted about what they were saying.” 

In The Trevor Project’s recent survey, nearly three-quarters of LGBTQ youth reported that they have endured discrimination based on their sexual orientation or gender identity; just 37% said their homes are affirming and 55% said the same about their schools. 

Given that reality, relatively few LGBTQ youth reported sharing information about their sexual orientation with teachers or guidance counselors. 

While Gaggle has maintained that keywords like “gay” and “lesbian” can also prevent bullying, Logsdon-Wallace said the company’s approach is out of touch with how students generally interact. At school, he said, he’s been called just about every “slur for a queer or a trans person that isn’t from like 80 years ago.” While slurs are common, terms like “lesbian” are not.

“As an actual teenager going to an actual public school, those words are not being used to bully people,” he said. “They’re just not.”


Survey Reveals Extent that Cops Surveil Students Online — in School and at Home
Wed, 03 Aug 2022

When Baltimore students sign into their school-issued laptops, the police log on, too. 

Since the pandemic began, Baltimore City Public Schools officials have contracted with GoGuardian, a digital surveillance tool that promises to identify youth at risk of harming themselves or others. When GoGuardian flags students, their online activities are shared automatically with school police, giving cops a conduit into kids’ private lives — including on nights and weekends.


Such partnerships between schools and police appear startlingly widespread across the country, with significant implications for youth, according to a new survey by the nonprofit Center for Democracy and Technology. Nearly all teachers — 89% — reported that digital student monitoring tools like GoGuardian are used in their schools. And nearly half — 44% — said students have been contacted by the police as a result of student monitoring. 

The pandemic has led to major growth in the number of schools that rely on activity monitoring software to uncover student references to depression and violent impulses. The tools, offered by a handful of tech companies, can sift through students’ social media posts, follow their digital movements in real-time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs. 

Educators say the tools help them identify youth who are struggling and get them the mental health care they need at a time when youth depression and anxiety are spiraling. But the survey suggests an alternate reality: Instead of getting help, many students are being punished for breaking school rules. And in some cases, survey results suggest, students are being subjected to discrimination. 

The report raises serious questions about whether digital surveillance tools are the best way to identify youth in need of mental health care and whether police officers should be on the front lines in responding to such emergencies. 

“If we’re saying this is to keep students safe, but instead we’re using it punitively and we’re using it to invite law enforcement literally into kids’ homes, is this actually achieving its intended goal?” asked Elizabeth Laird, a survey author and the center’s director of equity in civic technology. “Or are we, in the name of keeping students safe, actually endangering them?”

Among teachers who use monitoring tools at their schools, 78% said the software has been used to flag students for discipline and 59% said kids wound up getting punished as a result. Yet just 45% of teachers said the software is used to identify violent threats and 47% said it is used to identify students at risk of harming themselves. 

(Chart: Center for Democracy and Technology)

The findings are a direct contradiction of the stated goal of student activity monitoring, Laird said. School leaders and company executives have long maintained that the tools are not a disciplinary measure but are designed to identify at-risk students before someone gets hurt.

The Supreme Court’s recent reversal of Roe v. Wade, she said, further muddles police officers’ role in student activity monitoring. As states implement anti-abortion laws, she warned that data from student activity monitoring tools could help the police identify youth seeking reproductive health care. 

“We know that law enforcement gets these alerts,” she said. “If you are in a state where they are looking to investigate these kinds of incidents, you’ve invited them into a student’s house to be able to do that.”

A tale of discrimination

In Baltimore, counselors, principals and school-based police officers receive all alerts generated by GoGuardian during school hours, according to reporting by The Real News Network, a nonprofit media outlet. Outside of school hours, including on weekends and holidays, the responsibility to monitor alerts falls on the police, the outlet reported, and on numerous occasions officers have shown up at students’ homes to conduct wellness checks. In some cases, students have been transported to the hospital for emergency mental health care. 

In a statement to The 74, district spokesperson Andre Riley said that GoGuardian helps officials “identify potential risks to the safety of individual students, groups or schools,” and that “proper accountability measures are taken” if students violate the code of conduct or break laws.

“The use of GoGuardian is not simply a prompt for a law enforcement response,” Riley added.

Leading student surveillance companies, including GoGuardian, have maintained that their interactions with police are limited. In April, Democratic Sens. Elizabeth Warren and Ed Markey warned in a report that schools’ reliance on the tools could violate students’ civil rights and exacerbate “the school-to-prison pipeline by increasing law enforcement interactions with students.” Warren and Markey focused their report on four companies: GoGuardian, Gaggle, Securly and Bark. 

In a letter to the lawmakers, Gaggle executives said the company contacts law enforcement for wellness checks if they are unable to reach school-based emergency contacts and a child appears to be “in immediate danger.” In testimonials on the company’s website, school officials in Wichita Falls, Texas, Cincinnati, Ohio, and Miami, Florida, acknowledged contacting police in response to Gaggle alerts.

In some cases, school leaders ask Securly to contact the police directly and request they conduct welfare checks on students, the company told lawmakers. Executives at Bark said “there are limited options” beyond police intervention if they identify a student in crisis but cannot reach a school administrator. 

“While we have witnessed many lives saved by police in these situations, unfortunately many officers have not received training in how to handle such crises,” Bark wrote in its letter. “Irrespective of training there is always a risk that a visit from law enforcement can create other negative outcomes for a student and their family.” 

In its privacy policy, GoGuardian states the company may disclose student information “if we believe in good faith that doing so is necessary or appropriate to comply with any law enforcement, legal or regulatory process.” 

(Chart: Center for Democracy and Technology)

Meanwhile, survey results suggest that student surveillance tools have a negative disparate impact on Black and Hispanic students, LGBTQ youth and those from low-income households. In a letter on Wednesday to coincide with the survey’s release, a coalition of education and civil rights groups called on the U.S. Department of Education to issue guidance warning schools that their digital surveillance practices could violate federal civil rights laws. Signatories include the American Library Association, the Data Quality Campaign and the American Civil Liberties Union.

“This is becoming a conversation not just about privacy, but about discrimination,” Laird said. “Without a doubt, we see certain groups of students having outsized experiences in being directly targeted.”

In a youth survey, researchers found that student discipline as a result of activity monitoring fell disproportionately along racial lines, with 48% of Black students and 55% of Hispanic students reporting that they or someone they knew got into trouble for something that was flagged by an activity monitoring tool. Just 41% of white students reported having similar experiences. 

Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity — often called outing — as a result of activity monitoring. LGBTQ youth were also more likely than straight and cisgender students to report getting into trouble at school and being contacted by the police about having committed a crime. 

Some student surveillance companies, like Gaggle, monitor references to words including “gay” and “lesbian,” a reality company founder and CEO Jeff Patterson has said was created to protect LGBTQ youth, who face a greater risk of dying by suicide. But survey results suggest the heightened surveillance comes with significant harm to youth, and Laird said if monitoring tools are designed with certain students in mind, such as LGBTQ youth, that in itself is a form of discrimination. 

(Chart: Center for Democracy and Technology)

In its letter to the Education Department’s Office for Civil Rights Wednesday, advocates said the disparities outlined in the survey run counter to federal laws prohibiting race-, sex- and disability-based discrimination. 

“Student activity monitoring is subjecting protected classes of students to increased discipline and interactions with law enforcement, invading their privacy, and creating hostile environments for students to express their true thoughts and authentic identities,” the letter states. 

The Education Department’s civil rights division, they said, should condemn surveillance practices that violate students’ civil rights and launch “enforcement action against violations that result in discrimination.”

Lawmakers consider youth privacy

The report comes at a moment of increasing alarm about student privacy online. In May, the Federal Trade Commission announced plans to crack down on tech companies that sell student data for targeted advertising and that “illegally surveil children when they go online to learn.” 

It also comes at a time of intense concern over students’ emotional and physical well-being. While the pandemic has led to a greater focus on youth mental health, the May mass school shooting in Uvalde, Texas, has sparked renewed school safety efforts. In June, President Joe Biden signed a law with modest new gun-control provisions and an influx of federal funding for student mental health care and campus security. The funds could lead to more digital student surveillance.

The results of the online survey, which was conducted in May and June, were likely colored by the Uvalde tragedy, researchers acknowledged. A majority of parents and students have a favorable view of student activity monitoring during school hours to protect kids from harming themselves or others, researchers found. But just 48% of parents and 30% of students support around-the-clock surveillance. 

“Schools are under a lot of pressure to find ways to keep students safe and, like in many aspects of our lives, they are considering the role of technology,” Laird said. 

Last week, the Senate advanced legislation designed to improve children’s safety online, including new restrictions on youth-focused targeted advertising. The effort comes a year after leaked internal research showed that the social media app Instagram had a harmful effect on youth mental well-being, especially among teenage girls. One bill, the Kids Online Safety Act, would require tech companies to identify and mitigate any potential harms their products may pose to children, including exposure to content that promotes self-harm, eating disorders and substance abuse.

Yet the legislation has faced criticism from privacy advocates, who argue it would mandate digital monitoring similar to that offered by student surveillance companies. Among critics is the Electronic Frontier Foundation, a nonprofit focused on digital privacy and free speech. 

“The answer to our lack of privacy isn’t more tracking,” the foundation argued. The legislation “is a heavy-handed plan to force technology companies to spy on young people and stop them from accessing content that is ‘not in their best interest,’ as defined by the government, and interpreted by tech platforms.” 

Attorney Amelia Vance, the founder and president of Public Interest Privacy Consulting, said she worries the provisions will have a negative impact on at-risk kids, including LGBTQ students. Students from marginalized groups, she said, “will now be more heavily surveilled by basically every site on the internet, and that information will be available to parents” who could discipline teens for researching LGBTQ content. She said the legislation could force tech companies to censor content to avoid potential liability, essentially making them arbiters of community standards. 

“When you have conflicting values in the different jurisdictions that the companies operate in, oftentimes you end up with the most conservative interpretations, which right now is anti-LGBT,” she said.

Minneapolis Schools to Halt Controversial Student Surveillance Initiative
Mon, 27 Jun 2022

The Minneapolis school district has announced plans to end its relationship with Gaggle, a controversial digital surveillance tool that monitored students’ online behaviors during pandemic-induced remote learning. 

The announcement, which follows extensive reporting by The 74 about how the tool subjected the city’s youth to pervasive round-the-clock digital surveillance, was outlined last week at the bottom of a newsletter alerting families to changes at the district. Gaggle, which uses artificial intelligence and human content moderators to track students’ online activities and notify district officials of “inappropriate behaviors or potential threats to self or others,” will no longer be used beginning on July 1, the district announced. 

A week after schools went remote in Minneapolis and nationally in March 2020, the district sidestepped typical procurement rules and used federal pandemic relief money to contract with Gaggle, a for-profit company that reported significant business growth when classes went online. The district has spent more than $355,000 on the tool, which monitors student behaviors on school-issued Google and Microsoft accounts, and has a contract with the company through September 2023. 

District officials said the tool saved lives but civil rights advocates and students targeted by the program have questioned its efficacy and accused the company of violating students’ privacy rights. 

In an email, district spokesperson Julie Schultz Brown attributed the change to decisions “made in order to honor the terms of our new contract” with educators. Gaggle founder and CEO Jeff Patterson said the Minneapolis district will stop using the tool at a moment when “students across the United States are suffering.” In June, the company alerted Minneapolis officials to 15 “critical incidents” related to suicide, death threats, violence and drug use, Patterson wrote in a statement. Nationally, the pandemic has led to a surge in youth mental health issues. 

A recent report by Democratic Sens. Elizabeth Warren and Ed Markey warned that Gaggle and similar services could surveil students inappropriately, compound racial disparities in school discipline and waste tax dollars. Gaggle claims it saved lives during the 2020-21 school year, yet independent research on the tool’s effectiveness doesn’t exist. 

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

Teeth Logsdon-Wallace, a rising freshman in Minneapolis, saw the district’s decision to cut ties with Gaggle as a major victory. He became an outspoken Gaggle critic after a homework assignment, which discussed a previous suicide attempt and how he learned important coping skills, got flagged by the tool’s surveillance dragnet. Officials at Gaggle and the district said the tool helps identify students who are struggling emotionally and need adult intervention. But 14-year-old Logsdon-Wallace and other critics argue that digital surveillance is an inappropriate way to pinpoint students who need mental health care. Rather than helping, he said the experience “felt violating and gross.” 

“When you’re spying on kids and their stuff, especially about mental health stuff, they’re just going to be more secretive about it,” he said. “That can just cause more danger.”

While Gaggle relies on technology to ferret out students with issues like depression, Logsdon-Wallace said that he and other students are more likely to share their mental health struggles with adults at school if there’s a culture of trust. Monitoring communications through an algorithm and a team of low-paid remote workers who the students don’t even know, he said, had the opposite effect and left students more apprehensive about district computers, “which could be positive and negative.”

While his peers learned how to better protect their own privacy online “even when it’s inherently being violated,” he said, he worried that some may have been “bottling up mental health issues because of it.”

The district will no longer use Gaggle’s student activity monitoring tool or the company’s anonymous tip line, SpeakUp for Safety, which allows students to report potential safety threats confidentially. Instead of turning to SpeakUp, concerned parents and students should report issues to police officials with the state Bureau of Criminal Apprehension, the district wrote in its newsletter. 

District officials have said the anonymous tip line was central to its decision to contract with Gaggle, yet previous reporting by The 74 found that the service was rarely used. Meanwhile, the digital surveillance tool routinely flagged students who made references to sex, drugs and violence on district technology. An analysis of nearly 1,300 alerts found the service flagged Minneapolis students for discussing violent impulses, eating disorders, abuse at home and suicidal plans. 

But Gaggle regularly flagged benign student chatter and personal files, including classroom assignments, casual conversations between teens and sensitive journal entries. Gaggle flags students who use keywords related to sexual orientation including “gay” and “lesbian,” and on at least one occasion school officials in Minneapolis outed an LGBT student to their parents. The sheer volume of student communications that got flagged by Gaggle was at times overwhelming, the Minneapolis school district’s head of security acknowledged, but he also felt like he was able to save students from dying by suicide. 

In interviews with The 74, former content moderators at Gaggle — hundreds of whom are paid just $10 an hour on month-to-month contracts — raised serious questions about the company’s efficacy, its employment practices and its effect on students’ civil rights. 

Moderators said they received little training before they were given access to students’ sensitive materials and were pressured to prioritize speed over quality. They also reported insufficient safeguards to protect students’ sensitive files, including nude selfies. Patterson acknowledged that moderators, who work remotely with little supervision or oversight, could easily save copies of students’ nude photographs and share them on the dark web. 

As a transgender teenager who believes the school district has done too little to address bullying, Logsdon-Wallace said he already had little trust in district leaders. While Gaggle didn’t address the abuse from peers, having his sensitive experiences caught in the company’s algorithm made the situation worse.

“The very little trust I had in the administration is just destroyed,” he said. “You can’t expect students to trust you if you’ve done nothing to earn that trust.”

Gaggle Surveils Millions of Kids in the Name of Safety. Targeted Families Argue it’s ‘Not That Smart’
Tue, 12 Oct 2021

In the midst of a pandemic and a national uprising, Teeth Logsdon-Wallace was kept awake at night last summer by the constant sounds of helicopters and sirens. 

For the 13-year-old from Minneapolis, who lives close to where George Floyd was murdered in May 2020, the pandemic-induced isolation and social unrest amplified his gender dysphoria, the emotional distress that occurs when someone’s gender identity differs from their sex assigned at birth. His billowing depression landed him in the hospital after an attempt to die by suicide. During that dark stretch, he spent his days in an outpatient psychiatric facility, where therapists embraced music therapy. There, he listened to a punk song on loop that promised the helicopters would eventually stop. 

Eventually they did. 




Logsdon-Wallace, a transgender eighth-grader who chose the name Teeth, has since “graduated” from weekly therapy sessions and has found a better headspace, but that didn’t stop school officials from springing into action after he wrote about his mental health. In a school assignment last month, he reflected on his suicide attempt and how the punk rock anthem by the band Ramshackle Glory helped him cope — intimate details that wound up in the hands of district security. 

In a classroom assignment last month, Minneapolis student Teeth Logsdon-Wallace explained how the Ramshackle Glory song “Your Heart is a Muscle the Size of Your Fist” helped him cope after an attempt to die by suicide. In the assignment, which was flagged by the student surveillance company Gaggle, Logsdon-Wallace wrote that the song was “a reminder to keep on loving, keep on fighting and hold on for your life.” (Photo courtesy Teeth Logsdon-Wallace)

The classroom assignment was one of thousands of Minneapolis student communications that got flagged by Gaggle, a digital surveillance company that saw rapid growth after the pandemic forced schools into remote learning. In an earlier investigation, The 74 analyzed nearly 1,300 public records from Minneapolis Public Schools to expose how Gaggle subjects students to relentless digital surveillance 24 hours a day, seven days a week, raising significant privacy concerns for more than 5 million young people across the country who are monitored by the company’s digital algorithm and human content moderators. 

But technology experts and families with first-hand experience with Gaggle’s surveillance dragnet have raised a separate issue: The service is not only invasive, it may also be ineffective. 

While the system flagged Logsdon-Wallace for referencing the word “suicide,” context was never part of the equation, he said. Two days later, in mid-September, a school counselor called his mom to let her know what officials had learned. The meaning of the classroom assignment — that his mental health had improved — was seemingly lost in the transaction between Gaggle and the school district. He felt betrayed. 

“I was trying to be vulnerable with this teacher and be like, ‘Hey, here’s a thing that’s important to me because you asked,’” Logsdon-Wallace said. “Now, when I’ve made it clear that I’m a lot better, the school is contacting my counselor and is freaking out.”

Jeff Patterson, Gaggle’s founder and CEO, said in a statement his company does not “make a judgement on that level of the context,” and while some districts have requested to be notified about references to previous suicide attempts, it’s ultimately up to administrators to “decide the proper response, if any.”  

‘A crisis on our hands’

Minneapolis Public Schools first contracted with Gaggle in the spring of 2020 as the pandemic forced students nationwide into remote learning. Through AI and the content moderator team, Gaggle tracks students’ online behavior every day by analyzing materials on their school-issued Google and Microsoft accounts. The tool scans students’ emails, chat messages and other documents, including class assignments and personal files, in search of keywords, images or videos that could indicate self-harm, violence or sexual behavior. The remote moderators evaluate flagged materials and notify school officials about content they find troubling. 

In Minneapolis, Gaggle flagged students for keywords related to pornography, suicide and violence, according to six months of incident reports obtained by The 74 through a public records request. The private company also captured their journal entries, fictional stories and classroom assignments. 

Gaggle executives maintain that the system saved students’ lives during the 2020-21 school year. Those figures have not been independently verified. Minneapolis school officials make similar assertions. Though the pandemic’s effect on suicide rates remains unclear, suicide has been a leading cause of death among teenagers for years. Patterson, who has watched his business grow during COVID-19, said Gaggle could be part of the solution. Though not part of its contract with Minneapolis schools, the company recently launched a service that connects students flagged by the monitoring tool with teletherapists. 

“Before the pandemic, we had a crisis on our hands,” he said. “I believe there’s a tsunami of youth suicide headed our way that we are not prepared for.” 

Schools nationwide have increasingly relied on technological tools that purport to keep kids safe, yet there’s little independent evidence to back up their claims.

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

Like many parents, Logsdon-Wallace’s mother Alexis Logsdon didn’t know Gaggle existed until she got the call from his school counselor. Luckily, the counselor recognized that Logsdon-Wallace was discussing events from the past and offered a measured response. His mother was still left baffled. 

“That was an example of somebody describing really good coping mechanisms, you know, ‘I have music that is one of my soothing activities that helps me through a really hard mental health time,’” she said. “But that doesn’t matter because, obviously, this software is not that smart — it’s just like ‘Woop, we saw the word.’” 

‘Random and capricious’

Many students have accepted digital surveillance as an inevitable reality at school, according to a new survey by the Center for Democracy and Technology  in Washington, D.C. But some youth are fighting back, including Lucy Dockter, a 16-year-old junior from Westport, Connecticut. On multiple occasions over the last several years, Gaggle has flagged her communications — an experience she described as “really scary.”

“If it works, it could be extremely beneficial. But if it’s random, it’s completely useless.”
Lucy Dockter, 16, Westport, Connecticut student mistakenly flagged by Gaggle

On one occasion, Gaggle sent her an email notification of “Inappropriate Use” as she was walking to her first high school biology midterm; her heart began to race as she worried about what she had done wrong. Dockter is an editor of her high school’s literary journal and, according to her, Gaggle had ultimately flagged profanity in students’ fictional article submissions. 

“The link at the bottom of this email is for something that was identified as inappropriate,” Gaggle warned in its email while pointing to one of the fictional articles. “Please refrain from storing or sharing inappropriate content in your files.” 

Gaggle emailed a warning to Connecticut student Lucy Dockter for profanity in a literary journal article. (Photo courtesy Lucy Dockter)

But Gaggle doesn’t catch everything. Even as she got flagged when students shared documents with her, the articles’ authors weren’t receiving similar alerts, she said. Nor did Gaggle’s AI pick up on an article in which she wrote about the discrepancy and included a four-letter swear word to make a point. In the article, which Dockter wrote in Google Docs, she argued that Gaggle’s monitoring system is “random and capricious,” and could be dangerous if school officials rely on its findings to protect students. 

Her experiences left the Connecticut teen questioning whether such tracking is even helpful. 

“With such a seemingly random service, that doesn’t seem to — in the end — have an impact on improving student health or actually taking action to prevent suicide and threats,” she said in an interview. “If it works, it could be extremely beneficial. But if it’s random, it’s completely useless.”


Some schools have asked Gaggle to email students about the use of profanity, but Patterson said the system has an error that he blamed on the tech giant Google, which at times “does not properly indicate the author of a document and assigns a random collaborator.”

“We are hoping Google will improve this functionality so we can better protect students,” Patterson said. 

Back in Minneapolis, attorney Cate Long said she became upset when she learned that Gaggle was monitoring her daughter on her personal laptop, which 10-year-old Emmeleia used for remote learning. She grew angrier when she learned the district didn’t notify her that Gaggle had identified a threat. 

This spring, a classmate used the Google Hangouts chat feature to send Emmeleia a death threat, warning she’d shoot her “puny little brain with my grandpa’s rifle.”

Minneapolis mother Cate Long said a student used Google Hangouts to send a death threat to her 10-year-old daughter Emmeleia. Officials never informed her about whether Gaggle had flagged the threat. (Photo courtesy Cate Long)

When Long learned about the chat, she notified her daughter’s teacher but was never informed about whether Gaggle had picked up on the disturbing message as well. Missing warning signs could be detrimental to both students and school leaders; districts could face legal liability if they fail to act on credible threats.

“I didn’t hear a word from Gaggle about it,” she said. “If I hadn’t brought it to the teacher’s attention, I don’t think that anything would have been done.” 

The incident, which occurred in April, fell outside the six-month period for which The 74 obtained records. A Gaggle spokesperson said the company picked up on the threat and notified district officials an hour and a half later but it “does not have any insight into the steps the district took to address this particular matter.” 

Julie Schultz Brown, the Minneapolis district spokeswoman, said that officials “would never discuss with a community member any communication flagged by Gaggle.” 

“That unrelated but concerned parent would not have been provided that information nor should she have been,” she wrote in an email. “That is private.” 

Cate Long poses with her 10-year-old daughter Emmeleia. (Photo courtesy Cate Long)

‘The big scary algorithm’

When identifying potential trouble, Gaggle’s algorithm relies on keyword matching that compares student communications against a dictionary of thousands of words the company believes could indicate potential issues. The company scans student emails before they’re delivered to their intended recipients, said Patterson, the CEO. Files within Google Drive, including Docs and Sheets, are scanned as students write in them, he said. In one instance, the technology led to the arrest of a 35-year-old Michigan man who tried to send pornography to an 11-year-old girl in New York. Gaggle prevented the file from ever reaching its intended recipient.  

Though the company allows school districts to alter the keyword dictionary to reflect local contexts, less than 5 percent of districts customize the filter, Patterson said. 
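The technique described above — a dictionary of flagged terms that districts can optionally customize, matched against student text without regard to context — can be illustrated with a minimal sketch. This is a hypothetical example of a generic keyword filter, not Gaggle’s actual code; the function names, sample terms and logic are assumptions for illustration only.

```python
# Hypothetical sketch of a keyword-matching content filter.
# This is NOT Gaggle's actual algorithm -- only an illustration of the
# general approach the article describes: comparing student text against
# a dictionary of flagged terms, with optional per-district customization.
import re

# Stand-in for a dictionary of thousands of real terms.
BASE_DICTIONARY = {"suicide", "gun"}

def build_filter(custom_terms=None, removed_terms=None):
    """Return the effective keyword set after district customization."""
    terms = set(BASE_DICTIONARY)
    terms |= set(custom_terms or [])
    terms -= set(removed_terms or [])
    return terms

def scan(text, terms):
    """Return the flagged keywords found in `text` (case-insensitive,
    whole-word matches). An empty list means nothing was flagged."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return sorted(words & terms)

# A match is raised regardless of context -- the limitation families
# describe: a past-tense, recovery-focused sentence still gets flagged.
print(scan("This song helped me after my suicide attempt last year.",
           build_filter()))
```

A filter like this has no notion of tense, intent or regional phrasing, which is why a reflection on a past suicide attempt and an active cry for help look identical to it — the gap Jordan’s critique points to.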

That’s where potential problems could begin, said Sara Jordan, an expert on artificial intelligence and senior researcher at the Future of Privacy Forum in Washington. For example, language that students use to express suicidal ideation could vary between Manhattan and rural Appalachia, she said.

“We’re using the big scary algorithm term here when I don’t think it applies. This is not Netflix’s recommendation engine. This is not Spotify.”
Sara Jordan, AI expert and senior researcher, Future of Privacy Forum


On the other hand, she noted that false positives are highly likely, especially when the system flags common swear words and fails to understand context. 

“You’re going to get 25,000 emails saying that a student dropped an F-bomb in a chat,” she said. “What’s the utility of that? That seems pretty low.” 

She said that Gaggle’s utility could be impaired because it doesn’t adjust to students’ behaviors over time, comparing it to Netflix, which recommends television shows based on users’ ever-evolving viewing patterns. “Something that doesn’t learn isn’t going to be accurate,” she said. For example, she said the program could be more useful if it learned to ignore the profane but harmless literary journal entries submitted to Dockter, the Connecticut student. Gaggle’s marketing materials appear to overhype the tool’s sophistication to schools, she said. 

“We’re using the big scary algorithm term here when I don’t think it applies,” she said. “This is not Netflix’s recommendation engine. This is not Spotify. This is not American Airlines serving you specific forms of flights based on your previous searches and your location.” 

“Artificial intelligence without human intelligence ain’t that smart.”
Jeff Patterson, Gaggle founder and CEO

Patterson said Gaggle’s proprietary algorithm is updated regularly “to adjust to student behaviors over time and improve accuracy and speed.” The tool monitors “thousands of keywords, including misspellings, slang words, evolving trends and terminologies, all informed by insights gleaned over two decades of doing this work.” 

Ultimately, the algorithm to identify keywords is used to “narrow down the haystack as much as possible,” Patterson said, and Gaggle content moderators review materials to gauge their risk levels. 

“Artificial intelligence without human intelligence ain’t that smart,” he said. 

In Minneapolis, officials denied that Gaggle infringes on students’ privacy and noted that the tool only operates within school-issued accounts. The district’s internet use policy states that students should “expect only limited privacy,” and that the misuse of school equipment could result in discipline and “civil or criminal liability.” District leaders have also cited compliance with the Clinton-era Children’s Internet Protection Act, which became law in 2000 and requires schools to monitor “the online activities of minors.” 

Patterson suggested that teachers aren’t paying close enough attention to keep students safe on their own and “sometimes they forget that they’re mandated reporters.” On the company’s website, Patterson says he launched the company in 1999 to provide teachers with “an easy way to watch over their gaggle of students.” Legally, teachers are mandated to report suspected abuse and neglect, but Patterson broadens their sphere of responsibility and his company’s role in meeting it. As technology becomes a key facet of American education, Patterson said that schools “have a moral obligation to protect the kids on their digital playground.” 

But Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, argued the federal law was never intended to mandate student “tracking” through artificial intelligence. In fact, the statute includes a disclaimer stating it shouldn’t be “construed to require the tracking of internet use by any identifiable minor or adult user.” Her group has urged the government to clarify the Children’s Internet Protection Act’s requirements and to distinguish general monitoring from tracking individual student behaviors. 

Sen. Elizabeth Warren, a Democrat from Massachusetts, agrees. In recent letters to Gaggle and other education technology companies, Warren and other Democratic lawmakers said they’re concerned the tools “may extend beyond” the law’s intent “to surveil student activity or reinforce biases.” Around-the-clock surveillance, they wrote, demonstrates “a clear invasion of student privacy, particularly when students and families are unable to opt out.” 

“Escalations and mischaracterizations of crises may have long-lasting and harmful effects on students’ mental health due to stigmatization and differential treatment following even a false report,” the senators wrote. “Flagging students as ‘high-risk’ may put them at risk of biased treatment from physicians and educators in the future. In other extreme cases, these tools can become analogous to predictive policing, which are notoriously biased against communities of color.”

A new kind of policing

Shortly after the school district piloted Gaggle for distance learning, education leaders were met with an awkward dilemma. Floyd’s murder at the hands of a Minneapolis police officer prompted Minneapolis Public Schools to sever its ties with the police department for school-based officers and replace them with district security officers who lack the authority to make arrests. Gaggle flags district security when it identifies student communications the company believes could be harmful. 

Some critics have compared the surveillance tool to a new form of policing that, beyond broad efficacy concerns, could have a disparate impact on students of color, much as traditional policing does. AI-driven systems have repeatedly been shown to suffer biases.

Matt Shaver, who taught at a Minneapolis elementary school during the pandemic but no longer works for the district, said he was concerned that bias could be baked into Gaggle’s algorithm. Absent adequate context or nuance, he worried the tool could lead to misunderstandings. 

Data obtained by The 74 offer a limited window into Gaggle’s potential effects on different student populations. Though the district withheld many details in the nearly 1,300 incident reports, just over 100 identified the campuses where the involved students attended school. An analysis of those reports failed to identify racial discrepancies. Specifically, Gaggle was about as likely to issue incident reports in schools where children of color were the majority as it was at campuses where most children were white. It remains possible that students of color in predominantly white schools may have been disproportionately flagged by Gaggle or faced disproportionate punishment once identified. Broadly speaking, Black students are far more likely to be suspended or arrested at school than their white classmates, according to federal education data. 

Gaggle and Minneapolis district leaders acknowledged that students’ digital communications are forwarded to police in rare circumstances. The Minneapolis district’s internet use policy explains that educators could contact the police if students use technology to break the law and a document given to teachers about the district’s Gaggle contract further highlights the possibility of law enforcement involvement. 

Jason Matlock, the Minneapolis district’s director of emergency management, safety and security, said that law enforcement is not a “regular partner” when responding to incidents flagged by Gaggle. It doesn’t deploy Gaggle to get kids into trouble, he said, but to get them help. He said the district has interacted with law enforcement about student materials flagged by Gaggle on several occasions, but only in cases related to child pornography. Such cases, he said, often involve students sharing explicit photographs of themselves. During a six-month period from March to September 2020, Gaggle flagged Minneapolis students more than 120 times for incidents related to child pornography, according to records obtained by The 74.

Jason Matlock, the director of emergency management, safety and security at the Minneapolis school district, discusses the decision to partner with Gaggle as students moved to remote learning during the pandemic. (Screenshot)

“Even if a kid has put out an image of themselves, no one is trying to track them down to charge them or to do anything negative to them,” Matlock said, though it’s unclear if any students have faced legal consequences. “It’s the question as to why they’re doing it,” and to raise the issue with their parents.

Gaggle’s keywords could also have a disproportionate impact on LGBTQ children. In three dozen incident reports, Gaggle flagged keywords related to sexual orientation, including “gay” and “lesbian.” On at least one occasion, school officials outed an LGBTQ student to their parents, according to the records.

Logsdon-Wallace, the 13-year-old student, called the incident “disgusting and horribly messed up.” 

“They have gay flagged to stop people from looking at porn, but one, that is going to be mostly targeting people who are looking for gay porn and two, it’s going to be false-positive because they are acting as if the word gay is inherently sexual,” he said. “When people are just talking about being gay, anything they’re writing would be flagged.” 

The service could also have a heavier presence in the lives of low-income families, he added, who may end up being more surveilled than their affluent peers. Logsdon-Wallace said he knows students who rely on school devices for personal uses because they lack technology of their own. Among the 1,300 Minneapolis incidents contained in The 74’s data, only about a quarter were reported to district officials on school days between 8 a.m. and 4 p.m.

“That’s definitely really messed up, especially when the school is like ‘Oh no, no, no, please keep these Chromebooks over the summer,’” an invitation that gave students “the go-ahead to use them” for personal reasons, he said.

“Especially when it’s during a pandemic when you can’t really go anywhere and the only way to talk to your friends is through the internet.”

Dems Warn School Surveillance Tools Could Compound ‘Risk of Harm for Students’
Mon, 04 Oct 2021

Updated, Oct. 5

A group of Democratic lawmakers has demanded that several education technology companies that monitor children online explain their business practices, arguing that around-the-clock digital surveillance demonstrates “a clear invasion of student privacy, particularly when students and families are unable to opt out.”

In letters sent last week, Democratic Sens. Elizabeth Warren, Ed Markey and Richard Blumenthal asked the companies to explain steps they’re taking to ensure the tools aren’t “unfairly targeting students and perpetuating discriminatory biases,” and comply with federal laws. The letters went to executives at Gaggle, Securly, GoGuardian and Bark Technologies, each of which uses artificial intelligence to analyze students’ online activities and identify behaviors they believe could be harmful.




“Education technology companies have developed software that are advertised to protect student safety, but may instead be surveilling students inappropriately, compounding racial disparities in school discipline and draining resources from more effective student supports,” the lawmakers wrote in the letters. Though the tools are marketed as student safety solutions — and grew rapidly as schools shifted to remote learning during the pandemic — there’s little independent evidence that they work. Some critics, including the lawmakers, argue they may do more harm than good. “The use of these tools may break down trust within schools, prevent students from accessing critical health information and discourage students from reaching out to adults for help, potentially increasing the risk of harm for students,” the senators wrote.

The letters cited a recent investigation by The 74, which outlined how Gaggle’s AI-driven surveillance tool and human content moderators subject children to relentless digital surveillance long after classes end for the day, including on weekends, holidays, late at night and over the summer. In Minneapolis, the company notified school security when it identified students who made references to suicide, self-harm and violence. But it also analyzed students’ classroom assignments, journal entries, chats with friends and fictional stories.

Each of the companies offers differing levels of remote student surveillance. Gaggle, for example, analyzes emails, chat messages and digital files on students’ school-issued Google and Microsoft accounts. Other services monitor students’ social media accounts and web browsing history, among other activities.

The letters were particularly critical of the tools’ capacity to track student behaviors 24/7 — including when students are at home — and their ability to monitor students on their personal devices in some cases.

Schools’ use of digital monitoring tools has become commonplace in recent years. More than 80 percent of teachers reported using the tools, according to a recent survey by the Center for Democracy and Technology. Among those who participated in the survey, nearly a third reported that they monitor student activity at all hours of the day and just a quarter said it was limited to school hours.

“Because of the lack of transparency, many students and families are unaware that nearly all of their children’s online behavior is being tracked,” according to the letters. “When students and families are aware, they are often unable to opt out because school-issued devices are given to students with the software already installed, and many students rely on these devices for remote or at-home learning.”

A Securly spokesperson said in an email the company is “reviewing the correspondence received” by the lawmakers and is in the process of responding to their requests for information. He said the company is “deeply committed to continuously evolving our technology” to help schools protect students online. A Gaggle spokesperson said the company appreciates the lawmakers’ interest in learning how the tool “serves as an early warning system to help school districts prevent tragedies such as suicide, acts of violence, child pornography and other dangerous situations.” A GoGuardian spokesman said the company cares “deeply about keeping students safe and protecting their privacy.”

Bark officials didn’t respond to requests for comment.

The Clinton-era Children’s Internet Protection Act, passed in 2000, requires schools to filter and monitor students’ internet use to ensure they aren’t accessing material that is “harmful to minors,” such as pornography. Student privacy advocates have long argued that a newer generation of AI-driven tools go beyond the law’s scope and have urged federal officials to clarify its requirements. The law includes a disclaimer noting that it does not “require the tracking of internet use by any identifiable minor or adult user.” It “remains an open question” as to whether schools’ use of digital tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures, according to a report by the Future of Privacy Forum.

In their letters, senators highlighted how digital surveillance tools could perpetuate several educational inequities. For example, the tools could have a disproportionate impact on students of color and further uphold longstanding racial disparities in student discipline.

“School disciplinary measures have a long history of disproportionately targeting students of color, who face substantially more punitive discipline than their white peers for equivalent offenses,” according to the letters. “These disciplinary records, even when students are cleared, may have life-long harmful consequences for students.”

Meanwhile, the tools may have a larger impact on low-income students who rely on school technology to access the internet than those who can afford personal computers. Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, said their research “revealed a worrisome lack of transparency” around how these educational technology companies track students online and how schools rely on their tools.

“Responses to this letter will help shine a light on these tools and strategies to mitigate the risks to students, especially those who are most reliant on school-issued devices,” she said in an email.

Report: Most Parents, Teachers Support Student Surveillance Tech
Tue, 21 Sep 2021

Tools that monitor students’ online behavior have become ubiquitous in U.S. schools — and grew rapidly as the pandemic closed campuses nationwide — but a majority of parents and teachers believe the benefits of such digital surveillance outweigh the risks, according to a new report.

Similarly, half of students said they are comfortable with schools’ use of monitoring software while a quarter reported feeling queasy about the idea, according to the new research by the Center for Democracy and Technology, a nonprofit group based in Washington, D.C. Despite their overall comfort with digital software, teachers, parents and students each worried about how the tools could have detrimental side effects. Specifically, many parents and teachers were concerned that digital surveillance could be used to discipline students and young people reported becoming more reserved when they knew they were being watched.




“In response to the pandemic, the focus on technology and its use has never been greater,” said report co-author Elizabeth Laird, the center’s director of equity in civic technology. As tech gains a greater grasp on education, she said it’s important for school leaders and policymakers to remain focused on protecting students’ individual rights. She worried that student surveillance technology could have a damaging impact on students, especially youth of color and those from low-income households.

“I don’t think it’s a slam dunk,” Laird said.

Though the report didn’t highlight specific tools used, schools deploy a range of digital monitoring software to track student activity, including programs that block online material deemed inappropriate, track when students log into school applications, and allow teachers to view students’ screens in real-time and even take control of their computers.

Last week, an investigative report by The 74 exposed how the Minneapolis school district’s use of the digital surveillance tool Gaggle had subjected children to relentless online surveillance long after classes ended for the day — including inside students’ homes. Through artificial intelligence and a team of content moderators, Gaggle tracks the online behaviors of millions of students across the U.S. every day by sifting through data stored on their school-issued Google and Microsoft accounts. In Minneapolis, the company flagged school security when moderators believed students could harm themselves or others, but it also picked up students’ classroom assignments, journal entries, chats with friends and fictional stories.

Among teachers surveyed by the Center for Democracy and Technology, 81 percent said their schools use software that tracks students’ computer activity, including to block obscene material, monitor students’ screens in real time and prohibit students from using websites unrelated to school like YouTube. A majority of both parents and students reported such tools were used in their schools, but they were also more likely than teachers to be unsure about whether youth were being actively monitored by educators. In interviews with administrators, researchers found that many school leaders weren’t sure how best to be transparent with families about their monitoring practices.

“Certainly there is an imbalance in information and transparency around what is happening,” Laird said. “School districts have been clear [that] students shouldn’t have an expectation of privacy, but they haven’t been as clear about what they are tracking, how they are tracking it and how long they keep that information. They really should be doing that.”

Four-fifths of surveyed teachers said their schools used digital tools to track students online. Both parents and students were more likely than teachers to be unsure whether such tools were in use in their schools. (Photo courtesy Center for Democracy and Technology)

Among teachers, 66 percent said the benefits of activity monitoring outweigh student privacy concerns and 62 percent of parents reached a similar conclusion. Meanwhile, 78 percent of teachers reported that digital surveillance helps keep students safe by identifying problematic online behaviors and 72 percent said it helps keep students on task. But their answers also revealed equity concerns: 71 percent of teachers reported that monitoring software is applied to all students equally, 51 percent worried that it could come with unintended consequences like “outing” LGBTQ students and 49 percent said it violates students’ privacy.

Many teachers reported that such monitoring tools are used on students long after classes end for the day. In total, 30 percent of educators said the tools are active “all of the time,” and 16 percent said the software tracks kids on their personal devices.

Nearly a third of teachers who reported their schools use digital services like Gaggle to track students online said the tools monitor youth behaviors 24 hours a day. (Photo by Center for Democracy and Technology)

Among parents, 75 percent said digital surveillance helps keep students safe and 73 percent said it ensures children remain focused on schoolwork. Yet many parents also reported potential downsides: 61 percent worried about long-term harm if the tools were used to discipline students, 51 percent were concerned about unintended consequences and 49 percent said it violates students’ privacy rights.

Perhaps unsurprisingly, students were less at ease with educators watching their online behaviors. Half said they were comfortable with monitoring tools, a quarter said they were uncomfortable with them and another quarter were unsure.

The data also suggest that students alter their behaviors as a result of being watched: 58 percent said they don’t share their true thoughts or ideas online as a result of being monitored at school and 80 percent said they were more careful about what they search online. While just 39 percent of students said it was unfair that educators monitored their school-issued services, 74 percent opposed the surveillance of their own devices, like their cell phones. Some monitoring tools, however, can track students’ behaviors on their personal technology.

The data raise significant equity concerns. For many students, school-issued devices are their only means of getting online.

“The privacy and security of personal devices is a luxury not all can afford,” Alexandra Givens, the center’s president and CEO, said in a press release. “Constant online monitoring — especially of students who cannot afford or don’t have access to personal devices — risks creating disparities in the ways student privacy is protected nationwide.”

To reach these findings, researchers conducted online surveys in June that were completed by 1,001 teachers, 1,663 parents and 420 high school students. Researchers also conducted interviews with school administrators to understand their motives in deploying digital surveillance. Among the justifications is a federal law that requires schools to monitor students online. But the law also includes a disclaimer noting that the statute does not “require the tracking of internet use by any identifiable minor or adult user.”

Understanding context is critical, Laird said, adding that the law’s authors hadn’t fully envisioned a world where students could be surveilled by artificial intelligence long after classes end for the day.

“What was happening at the time was students were in a school computer lab for part of the day and monitoring meant having an adult walking around a computer lab and physically looking at what was on students’ computer monitors,” she said. But today, she said the statute is being interpreted very differently.

In response, the center, along with the American Civil Liberties Union and the Center for Learner Equity, sent a letter Tuesday to clarify the law’s stipulations and inform educators it “does not require broad, invasive and constant surveillance of students’ lives online.”

“Systemic monitoring of online activity can reveal sensitive information about students’ personal lives, such as their sexual orientation, or cause a chilling effect on their free expression, political organizing, or discussion of sensitive issues such as mental health,” the letter continued. “These harms likely fall disproportionately on already vulnerable, over-policed and over-disciplined communities.”
