Student Privacy – The 74

Room Scans & Eye Detectors: Robocops are Watching Your Kids Take Online Exams

Thu, 18 Apr 2024

Remote proctoring tools like Proctorio have faced widespread pushback at colleges. Less scrutiny and awareness exists on their use in K-12 schools.

Updated, correction appended April 18

In the middle of the night, students at Utah’s Kings Peak High School are wide awake — taking mandatory exams. 

At this online-only school, which opened during the pandemic and has operated ever since, students take tests from their homes at times that work best with their schedules. Principal Ammon Wiemers says it’s this flexibility that attracts students — including athletes and teens with part-time jobs — from across the state. 

“Students have 24/7 access but that doesn’t mean the teachers are going to be there 24/7,” Wiemers told The 74 with a chuckle. “Sometimes [students] expect that but no, our teachers work a traditional 8 to 4 schedule.” 

Any student who feels compelled to cheat while their teacher is sound asleep, however, should know they’re still being watched. 

For students, the cost of round-the-clock convenience is their privacy. During exams, their every movement is captured by their computer’s webcam and scrutinized by Proctorio. The software conducts “desk scans” in a bid to catch test-takers who turn to “unauthorized resources,” uses “face detection” technology to ensure there isn’t anybody else in the room to help and applies “gaze detection” to spot anybody “looking away from the screen for an extended period of time.” 

Proctorio then provides visual and audio records to Kings Peak teachers, with the algorithm calling particular attention to pupils whose behaviors during the test flagged them as possibly engaging in academic dishonesty. 

Such remote proctoring tools grew exponentially during the pandemic, particularly at U.S. colleges and universities, where administrators seeking to ensure exam integrity during remote learning met sharp resistance from students. Online petitions demanded an end to the surveillance regime, and the tools drew accusations of invasive monitoring and racial bias after they failed to detect Black students’ faces.  

A video uploaded to TikTok offers advice on how to cheat during exams that are monitored by Proctorio. (Screenshot)

At the same time, social media platforms like TikTok were flooded with videos purportedly highlighting service vulnerabilities that taught others how to evade detection.

K-12 schools’ use of remote proctoring tools, however, has largely gone under the radar. Nearly a year after the federal public health emergency expired and several years after the vast majority of students returned to in-person learning, an analysis by The 74 has revealed that K-12 schools nationwide — and online-only programs in particular — continue to use tools from digital proctoring companies on students, including children as young as kindergartners. 

Previously unreleased survey results from the nonprofit Center for Democracy and Technology found that remote proctoring in K-12 schools has become widespread. In its August 2023 survey, 36% of teachers reported that their school uses the surveillance software.

Civil rights activists, who contend AI proctoring tools fail to work as intended, harbor biases and run afoul of students’ constitutional protections, said the privacy and security concerns are particularly salient for young children and teens, who may not be fully aware of the monitoring or its implications. 

“It’s the same theme we always come back to with student surveillance: It’s not an effective tool for what it’s being claimed to be effective for,” said Chad Marlow, senior policy counsel at the American Civil Liberties Union. “But it actually produces real harms for students.” 

It’s always strange in a virtual setting — it’s like you’re watching yourself take the test in the mirror.

Ammon Wiemers, Principal, Kings Peak High School

Wiemers is aware that the school, where about 280 students are enrolled full time and another 1,500 take courses part time, must make a delicate “compromise between a valid testing environment and students’ privacy.” When students are first subjected to the software, he said, “it’s kind of weird to see that a camera is watching,” but unlike the uproar at colleges, he said the monitoring has become “normalized” among his students and that anybody with privacy concerns is allowed to take their tests in person.

“It’s always strange in a virtual setting — it’s like you’re watching yourself take the test in the mirror,” he said. “But when students use it more, they get used to it.”  

Children ‘don’t take tests’

Late last year, Proctorio founder and CEO Mike Olsen published a blog post in response to research critical of the company’s efficacy. A tech-savvy Ohio college student had conducted an analysis and concluded Proctorio relied on an open-source software library with a high failure rate — including a failure to recognize Black faces more than half of the time. 
The student tested the company’s face-detection capabilities against FairFace, a dataset of nearly 11,000 images depicting people of multiple races and ethnicities, with results showing a failure to distinguish Black faces 57% of the time, Middle Eastern faces 41% of the time and white faces 40% of the time. Such a high failure rate was problematic for Proctorio, which relies on its ability to flag cheaters by zeroing in on people’s facial features and movements. 

Olsen’s post sought to discredit the research, arguing that while the FairFace dataset had been used to identify biases in other facial-detection algorithms, the images weren’t representative of “a live test-taker’s remote exam experience.” 

“For example,” he wrote, “children and cartoons don’t take tests so including those images as part of the data set is unrealistic and unrepresentative.” 

Proctorio founder and CEO Mike Olsen published a blog post that countered research claiming the remote proctoring tool had a high fail rate — especially for Black students. (Screenshot)

To Ian Linkletter, a librarian from Canada embroiled in a long-running battle with Proctorio over whether its products were harmful, Olsen’s response was baffling. Sure, cartoon characters don’t take tests. But children, he said, certainly do. What he wasn’t sure about, however, was whether those younger test-takers were being monitored by Proctorio — so he set out to find out. 

He found two instances, both in Texas, where Proctorio was being used in the K-12 setting, including at a remote school tied to the University of Texas at Austin. Linkletter shared his findings with The 74, which used the government procurement tool GovSpend to identify other districts that have contracts with Proctorio and its competitors. 

More than 100 K-12 school districts have relied on Proctorio and its competitors, according to the GovSpend data, with a majority of expenditures made during the height of the pandemic. And as remote learning has become a more integral part of K-12 schooling nationwide, seven districts have paid for remote proctoring services in the last year. While extensive, the GovSpend database doesn’t provide a complete snapshot of U.S. school districts or their expenditures. 

“It was just obvious that Proctorio had K-12 clients and was being misleading about children under 18 using its product,” Linkletter said, adding that young people could be more susceptible to the potential harms of persistent surveillance. “It’s almost like a human rights issue when you’re imposing it on students, especially on K-12 students.” Young children, he argued, are unable to truly consent to being monitored by the software and may not fully understand its potential ramifications. 

Proctorio did not respond to multiple requests for comment from The 74. Founded in 2013, the company claims it provided remote proctoring services during the height of the pandemic to education institutions globally. 

In 2020, Proctorio sued Linkletter over a series of tweets in which the then-University of British Columbia learning technology specialist linked to Proctorio-produced YouTube videos, which the company had made available to instructors. Referring to the video on the tool’s “Abnormal Eye Movement function,” Linkletter wrote that it showed “the emotional harm you are doing to students by using this technology.”

Proctorio’s lawsuit alleged that Linkletter’s use of the company’s videos, which were unlisted and could be viewed only by those with the link, amounted to copyright infringement and distribution of confidential material. In January, Canada’s Supreme Court declined to hear Linkletter’s claim that the litigation was specifically designed to silence him.

While there is little independent research on the efficacy of remote proctoring tools in preventing cheating, one 2021 study found that the software failed to flag any of the test-takers who had been instructed to cheat. Researchers concluded the software is “best compared to taking a placebo: It has some positive influence, not because it works but because people believe that it works, or that it might work.” 

Remote proctoring costs K-12 schools millions

A rubric posted by UT High School, the online K-12 school operated by the University of Texas, indicates that Proctorio is used for Credit by Exam tests, which award course credit to students who can demonstrate mastery of a particular subject. For students in kindergarten, first and second grade, the school pairs district proctoring with a “Proctorio Secure Browser,” which prohibits test-takers from leaving the online exam to use other websites or programs. Beginning in third grade, according to the rubric uploaded to the school’s website, test-takers are required to use Proctorio’s remote online proctoring.

A UT High School rubric explains how it uses Proctorio software. (Screenshot)

Proctorio isn’t the only remote proctoring tool in use in K-12 schools. GovSpend data indicate the school district in Las Vegas, Nevada, has spent more than $1.4 million since 2018 on contracts with Proctorio competitor Honorlock. Spending on Honorlock by the Clark County School District surged during the pandemic, but as recently as October it made a $286,000 purchase from the company. GovSpend records indicate the tool is used at the district’s online-only program, which claims more than 4,500 elementary, middle and high school students. Clark County school officials didn’t respond to questions about how Honorlock is being utilized. 

Meanwhile, dozens of K-12 school districts relied on the remote proctoring service ProctorU, now known as Meazure Learning, during the pandemic, records indicate, with several maintaining contracts after school closures subsided. Among them is the rural Watertown School District in South Dakota, which spent $18,000 on the service last fall. 

Aside from Wiemers, representatives for schools mentioned in this story didn’t respond to interview requests or declined to comment. Meazure Learning and Honorlock didn’t respond to media inquiries. 

At TTU K-12, an online education program offered by Texas Tech University, the institution relies on Proctorio for “all online courses and Credit by Examinations,” flagging suspicious activity to teachers for review. In an apparent nod to privacy concerns, TTU instructs students to select private spaces for exams and, if they are testing in a private home, to get the permission of anyone else residing there before the test is recorded. 

Documents indicate that K-12 institutions continue to subject remote learners to room scans even after a federal judge ruled a university’s scans unconstitutional. In 2022, a federal judge sided with a Cleveland State University student, who alleged that a room scan taken before an online exam at the Ohio institution violated his Fourth Amendment rights against unreasonable searches and seizures. The judge ruled that the scan was “unreasonable,” adding that “room scans go where people otherwise would not, at least not without a warrant or an invitation.” 


Marlow of the ACLU says he finds room scans particularly troubling — especially in the K-12 context. From an equity perspective, he said such scans could have disproportionately negative effects on undocumented students, those living with undocumented family members and students living in poverty. He expressed concerns that information collected during room scans could be used as evidence for immigration enforcement. 

“There are two fairly important groups of vulnerable students, undocumented families and poor students, who may not feel that they can participate in these classes because they either think it’s legally dangerous or they’re embarrassed to use the software,” he said. 

The TTU web page notes that students “may be randomly asked to perform a room scan,” where they’re instructed to offer their webcam a 360-degree view of the exam environment with a warning: Failure to perform proper scans could result in a violation of exam procedures.

“If you’re using a desktop computer with a built-in webcam, it might be difficult to lift and rotate the entire computer,” the web page notes while offering a solution. “You can either rotate a mirror in front of the webcam or ask your instructor for further instruction.”

‘A legitimate concern’ 

Wiemers, the principal in Utah, said that Proctorio serves as a deterrent against cheating — but is far from foolproof. 

“There’s ways to cheat any software,” he said, adding that educators should avoid the urge to respond to Proctorio alerts with swift discipline. In the instances where Proctorio has caught students cheating, he said that instead of being given a failing grade, they’re simply asked to retake the test. 

“There are limitations to the software, we have to admit that, it’s not perfect, not even close,” he said. “But if we expect it to be, and the stakes are high and we’re overly punitive, I would say [students] have a legitimate concern.”

During a TTU K-12 advisory board meeting in July 2021, administrators outlined the extent to which Proctorio is used during exams. Justin Louder, who at the time served as the TTU K-12 interim superintendent, noted that teachers and a “handful of administrators within my office” had access to view the recordings. Ensuring that third parties didn’t have access to the video feeds was “a big deal for us,” he said, because they’re “dealing with minors.” 

While college students “really kind of pushed back” on remote proctoring, he noted that the program received only a few complaints from K-12 parents, who recognized the service offered scheduling benefits. Like Wiemers, he framed the issue as one of 24-hour convenience. 

“It lets students go at their own pace,” he said. “If they’re ready at 2 o’clock in the morning, they can test at 2 o’clock in the morning.”

Correction: A copyright infringement case brought by Proctorio against longtime company critic Ian Linkletter is still being argued in court. An earlier version of this story mischaracterized the litigation as being ruled in Proctorio’s favor.

Virginia Probe Finds Systemic Privacy Violations after Fairfax Data Release

Mon, 26 Feb 2024

The Fairfax County Public Schools, Virginia’s largest district, has a systemwide problem protecting students’ privacy, the state education agency said Friday, calling for additional training of staff it said were either “not aware of the precautions that should be taken” or weren’t “sensitized” to the issues.

The finding stems from a complaint brought by Fairfax parent and special education advocate Callie Oettinger in December, after she inadvertently received data on roughly 35,000 students, including special education records, confidential legal memos and mental health conditions. The 74 first reported the disclosure Nov. 1. The records included full names of students involved in lawsuits against the district over alleged sexual assault complaints and those seeing counselors for issues such as suicidal thoughts and depression.

The 180,000-student district has until March 25 to appeal the state’s finding or complete a “corrective action plan” that includes some steps the district has already agreed to, such as additional staff training.




That training, however, was supposed to begin Oct. 31, according to the district’s response to an earlier complaint from the same parent. But during a meeting with a parents group, a district official acknowledged the training had yet to start. 

“That is going to be launched fairly shortly,” said Dawn Schaefer, who oversees special education complaints for the district. “I don’t have an exact launch date, but I can certainly check.” 

In its decision, the state noted the district’s failure to address the repeated violations.

“A perfect policy is of no use if people ignore it,” wrote Patricia Haymes, the director of dispute resolution at the Virginia Department of Education. “Perfect procedures are meaningless if no one follows them.”

Haymes ordered the district to provide a list of all students affected by the disclosure and to verify that their parents have been notified. The district must also submit monthly progress reports on its implementation of recommendations from the review Superintendent Michelle Reid launched following The 74’s reporting. The state noted the article in its response to the district.

The state’s finding backs up what some Fairfax parents have been saying for years — that district staff members have a pattern of sharing confidential emails and student records with the wrong parents and educators. Experts praised the state for pushing for additional training, but one questioned whether the requirements go far enough, calling them “fairly lackluster.” 

“I don’t know that the families harmed will feel like this is sufficient oversight of the issue,” said Amelia Vance, president of the Public Interest Privacy Center. “Trust has been breached between the community and the district, and more is necessary to fix this.”

Nonetheless, she gave Fairfax’s superintendent credit for being transparent about the district’s mistake and promptly issuing an apology. The district declined to comment on the outcome of the state complaint.

‘A bigger Band-Aid’

Virginia officials previously accepted the district’s assurances that the disclosures were isolated incidents. In mid-December, a state hearing officer said “a series of mistakes” doesn’t necessarily add up to a “systemic violation.” 

The state has “always said it’s a one-off. They operate as if each incident is a silo,” said Callie Oettinger, the parent who gained access to the unredacted records in mid-October when she went to a high school to examine files on her own two children. She made the request under the federal Family Educational Rights and Privacy Act, or FERPA, which gives parents the right to examine their children’s education records.

Pointing to larger concerns in the district, her complaint noted “overlapping” privacy violations that officials were already investigating between March and mid-November last year, including the large October records release and a November incident in which Robinson Secondary School, a seventh through 12th grade school, mailed students’ report cards to the wrong parents. 

Oettinger called the remedy “a bigger Band-Aid” compared with steps the district already agreed to take, including lawyers signing off on record requests before they are released to parents. 

But Todd Reid, a spokesman for the state education department, called the corrective action plan an “intensive requirement of both federal and state special education law” to ensure districts make improvements within a specific time frame. 

‘Not letting it slide’

Another privacy expert blamed these types of mistakes on the “convergence” of more student data, new technologies and parents who want access to records electronically. Steve Smith, founder of the Student Data Privacy Consortium, a national network, said the district should be using systems that “reduce the likelihood of inadvertent sharing.”

But, he added, the backlash from parents can force a district to take better precautions. 

“These things becoming public and the school community losing confidence probably has more impact than a warning from the FERPA office or the state,” he said. “I applaud parents for not letting it slide.”

Leaked Active School Shooter Plans Revive Scrutiny of Ed Tech Privacy Pledge

Fri, 02 Feb 2024

A security lapse at a leading school safety company that exposed millions of sensitive records online — including districts’ active-shooter response plans, students’ medical records and court documents about child abuse — has revived criticism that an industry student privacy pledge fails to police bad actors.

In response to an inquiry by The 74, the nonprofit Future of Privacy Forum said last week it would review Raptor Technologies’ status as a Student Privacy Pledge signatory after millions of records maintained by the company were found readily available online without any encryption protection, despite Raptor’s claims that it scrambles its data. 

“We are reviewing the details of Raptor Technologies’ leak to determine if the company has violated its Pledge commitments,” David Sallay, the Washington-based group’s director of youth and education privacy, said in a Jan. 24 statement. “A final decision about the company’s status as Pledge signatory, including, if applicable, potential referrals to the [Federal Trade Commission] and relevant State Attorneys General, is expected within 30 days.” 

Should the privacy forum choose to take action, Raptor would become just the second-ever education technology company to be removed from the pledge. 

Texas-based Raptor Technologies, which counts roughly 40% of U.S. school districts as its customers, offers an extensive suite of software designed to improve campus safety, including a tool that screens visitors’ government-issued identification cards against sex offender registries, a management system that helps school leaders prepare for and respond to emergencies, and a threat assessment tool that allows educators to report if they notice “something a bit odd about a student’s behavior” that they believe could become a safety risk. This means, according to a Raptor guide, that the company collects data on kids who appear “unkempt or hungry,” are withdrawn from friends, are believed to engage in self-harm, have poor concentration or struggle academically. 
Rather than keeping students safe, however, cybersecurity researcher Jeremiah Fowler said the widespread data breach threatened to put them in harm’s way. And as cybersecurity experts raise alarms about the sector’s handling of sensitive student data, they’ve criticized the Student Privacy Pledge for lackluster enforcement in the absence of regulations and minimum security standards. 

Fowler, a cybersecurity researcher at vpnMentor and a self-described “data breach hunter,” has been tracking down online vulnerabilities for a decade. The Raptor leak is “probably the most diverse set of documents I’ve ever seen in one database,” he said, including information about campus surveillance cameras that didn’t work, teen drug use and the gathering points where students were instructed to meet in the event of a school shooting. 

vpnMentor notified Raptor of the exposure in December, and Fowler said the company was responsive and worked quickly to fix the problem. The breach wasn’t the result of a hack, and there’s no evidence that the information has fallen into the hands of threat actors. 

The situation could have grown far more dire without Fowler’s audit. 

“The real danger would be having the game plan of what to do when there is a situation,” like an active shooting, Fowler said in an interview with The 74. “It’s like playing in the Super Bowl and giving the other team all of your playbooks and then you’re like, ‘Hey, how did we lose?’”

David Rogers, Raptor’s chief marketing officer, said last week the company is conducting an investigation to determine the scope of the breached data to ensure “that any individuals whose personal information could have been affected are appropriately notified.” 

“Our security protocols are rigorously tested, and in light of recent events, we are committed to further enhancing our systems,” Rogers said in a statement. “We take this matter incredibly seriously and will remain vigilant, including by monitoring the web for any evidence that any data that has been in our possession is being misused.” 

‘Maybe this is a pattern’

Raptor is currently among more than 400 companies that have signed the Student Privacy Pledge, a self-regulatory effort designed to ensure education technology vendors are ethical stewards of the sensitive information they collect about children. 

Raptor and the other companies have vowed against selling students’ personally identifiable information or using it for targeted advertising, among other commitments. They also agreed to “maintain a comprehensive security program that is reasonably designed to protect the security, confidentiality and integrity” of students’ personal information against unauthorized or unintended disclosure. Cyber safeguards, the pledge notes, should be “appropriate to the sensitivity of the information.” 

Raptor touts its pledge commitment on its website, where it notes the company takes “great care and responsibility to both support the effective use of student information and safeguard student privacy and information security.” The company says it ensures “the highest levels of security and privacy of customer data,” including encryption “both at rest and in-transit,” meaning that data is scrambled into an unusable format without a key both while it is stored on servers and while it moves between devices or networks. 
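The "at rest" claim above can be pictured with a toy sketch. This is purely illustrative — a deliberately simplistic XOR scramble, not a real cipher, and it assumes nothing about how Raptor actually encrypts data. The point is only what encryption at rest means: the bytes sitting on a server should be unusable without the key.

```python
# Toy illustration of "encryption at rest" -- NOT a secure cipher and not
# a depiction of any vendor's real system.
import secrets
from itertools import cycle

def toy_scramble(data: bytes, key: bytes) -> bytes:
    # XOR each data byte with a repeating key byte; applying the same
    # operation twice with the same key restores the original bytes.
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

record = b"student: Jane Doe, grade 7, counseling referral"
key = secrets.token_bytes(16)

stored = toy_scramble(record, key)            # what would sit "at rest"
assert stored != record                       # unreadable without the key
assert toy_scramble(stored, key) == record    # key holder can recover it
```

In Fowler's account, by contrast, the exposed files were plain PDFs — the equivalent of `record` being stored directly, readable by anyone who found the URL.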


Its privacy policy, however, offers a more qualified assurance, saying the company takes “reasonable” measures to protect sensitive data but cannot guarantee that such information “will be protected against unauthorized access, loss, misuse or alterations.” 

Districts nationwide have spent tens of millions of dollars on Raptor’s software, according to GovSpend, a government procurement database. Recent customers include the school districts in Dallas, Texas, Broward County, Florida, and Rochester, New York. Under New York law, education technology companies that collect student data are required to maintain a cybersecurity program that includes data encryption and controls to ensure that personally identifiable information doesn’t fall into the hands of unauthorized actors. 

Countering Raptor’s claims that data were encrypted, Fowler told The 74 the documents he accessed “were just straight-up PDFs, they didn’t have any password protections on them,” adding that the files could be found by simply entering their URLs into a web browser. 

Officials at the Rochester school district didn’t respond to requests for comment about whether they had been notified about the breach and its effects on their students or if they were aware that Raptor may not have been in compliance with state encryption requirements. 

Doug Levin, the national director of the nonprofit K12 Security Information eXchange, said the Raptor blunder is reminiscent of a 2022 data breach at the technology vendor Illuminate Education, which exposed the information of at least 3 million students nationwide, including 820,000 current and former New York City students. Levin noted that both companies claimed their data was encrypted at rest and in transit — “except maybe it wasn’t.” 

A decade after the privacy pledge was introduced, he said “it falls far short of offering the regulatory and legal protections students, families and educators deserve.”

“How can educators know if a company is taking security seriously?” Levin asked. Raptor “said all of the right things on their website about what they were doing and, yet again, it looks like a company wasn’t forthright. And so, maybe this is a pattern.” 

State data breach rules have long focused on personal information, like Social Security numbers, that could be used for identity theft and other financial crimes. But the consequences of data breaches like the one at Raptor, Fowler said, could be far more devastating — and could harm children for the rest of their lives. He noted the exposure of health records, which could violate federal privacy law, could be exploited for various forms of fraud. Discipline reports and other sensitive information, including about student sexual abuse victims, could be highly embarrassing or stigmatizing. 

Meanwhile, he said the exposure of confidential records about physical security infrastructure in schools, and district emergency response plans, could put kids in physical danger. 

Details about campus security infrastructure have been exploited by bad actors in the past. After Minneapolis Public Schools fell victim to a ransomware attack last February that led to a large-scale data breach, an investigation by The 74 uncovered reams of campus security records, including campus blueprints that revealed the locations of surveillance cameras, instructions on how to disarm a campus alarm system and maps that documented the routes that children are instructed to take during an emergency evacuation. The data can be tracked down with little more than a Google search. 

“I’ve got a 14-year-old daughter and when I’m seeing these school maps I’m like, ‘Oh my God, I can see where the safe room is, I can see where the keys are, I can see the direction they are going to travel from each classroom, where the meetup points are, where the police are going to be,’” Fowler said of the Raptor breach. “That’s the part where I was like, ‘Oh my God, this literally is the blueprint for what happens in the event of a shooting.’” 

‘Sweep it under the rug’

The Future of Privacy Forum’s initial response to the Raptor breach mirrors the nonprofit’s actions after the 2022 data breach at Illuminate Education, which was previously listed among the privacy pledge signatories and became the first-ever company to get stripped of the designation. 

The forum’s decision to remove Illuminate followed an article in The 74, where student privacy advocates criticized it for years of failures to enforce its pledge commitments — and accused it of being a tech company-funded effort to thwart government regulations. 

The pledge, created in 2014 by the privacy forum in partnership with the Software and Information Industry Association, a technology trade group, places restrictions on the ways ed tech companies can use the data they collect about K-12 students.

Along with stripping Illuminate of its pledge signatory designation, the forum referred it to the Federal Trade Commission, which the nonprofit maintains can hold companies accountable to their commitments via consumer protection rules that prohibit unfair and deceptive business practices. The company was also referred to the state attorneys general in New York and California to “consider further appropriate action.” It’s unclear if regulators took any actions against Illuminate. The FTC and the California attorney general’s office didn’t respond to requests for comment. The New York attorney general’s office is reviewing the Illuminate breach, a spokesperson said. 

“Publicly available information appears to confirm that Illuminate Education did not encrypt all student information” in violation of several Pledge provisions, Forum CEO Jules Polonetsky told The 74 at the time. Among them is a commitment to “maintain a comprehensive security program” that protects students’ sensitive information and to “comply with applicable laws,” including New York’s “explicit data encryption requirement.” 

After the breach and before it was removed from the pledge, the Software and Information Industry Association recognized Illuminate with the sector’s equivalent of an Oscar. 

Raptor isn’t the only pledge signatory to fall victim to a recent data breach. In December, a cybersecurity researcher disclosed a security vulnerability at Education Logistics, commonly known as EduLog, which offers a GPS tracking system that gives parents real-time information about the location of their children’s school buses. A statement the forum provided The 74 didn’t say whether it had opened an inquiry into EduLog’s compliance with the pledge commitments. 

Despite the forum’s actions against Illuminate Education, and its new inquiry into Raptor, the pledge continues to face criticism for having little utility, including from Fowler, who likened it to “virtue signaling” that can be quickly brushed aside. 

“Pledges are just that. They’re like, ‘Hey, that sounds good, we’ll agree to it until it no longer fits our business model,’” he said. “A pledge is just like, ‘whoops, our bad.’ A little bit of bad press and you just sweep it under the rug and move on.” 

Chad Marlow, a senior policy counsel at the American Civil Liberties Union focused on privacy and surveillance issues, offered a similar perspective. Given the persistent threat of data breaches and a growing number of cyberattacks on the K-12 sector, Marlow said that schools should take a hard look at the amount of data that they and their vendors collect about students in the first place. He said Raptor’s early intervention system, which seeks to identify children who pose a potential threat to themselves or others, is an unproven surveillance system that could become a vector for student discrimination in the name of keeping them safe. 

Although he said he has “a great deal of admiration” for the privacy forum and the pledge’s goals, Marlow said the pledge falls short on accountability when compared to regulations that mandate compliance.

“Sometimes pledges like this, which are designed to make a little bit of progress, actually do the opposite because it allows companies to point to these pledges and say, ‘Look, we are committed to doing better,’ when in fact, they’re using the pledge to avoid being told to do better,” he said. “That’s what we need, not people saying, ‘On scout’s honor I’ll do X.’”  

Disclosure: The Bill & Melinda Gates Foundation and the Chan Zuckerberg Initiative provide financial support to the Future of Privacy Forum and The 74.

Alleged Rape Victim Presses Va.’s Fairfax Schools for Answers on Records Leak /article/alleged-rape-victim-presses-virginias-fairfax-schools-for-answers-on-records-disclosure/ Mon, 27 Nov 2023 16:01:00 +0000 /?post_type=article&p=718089 A former Fairfax County Public Schools student who accuses the Virginia district of ignoring allegations that she was repeatedly raped, tortured and threatened when she was in middle school is demanding to know how officials accidentally revealed her identity last month. 

In a federal court motion filed Nov. 14 that cited The 74’s exclusive reporting, attorney Andrew Brenner described the disclosure as “at best, careless,” particularly after the former student won a legal battle against the district for her right to remain anonymous. Brenner asked the U.S. District Court for the Eastern District of Virginia to compel Fairfax to explain how her name ended up in documents released as part of a records request that had nothing to do with her case.

A hearing on the motion is set for Dec. 15.

Known as B.R., the woman is suing the district as well as the former students she alleges sexually assaulted her in 2011, with a trial set to begin in March. The motion asks for the names of all district employees involved in producing the materials that identified her as well as the district’s steps “to collect, review, compile and transmit the documents” prior to their release.




The district’s response to the motion could provide insight into how unredacted records on tens of thousands of students were released to a parent and special education advocate. The documents included sensitive, confidential information such as grades, disability status and mental health conditions.

Following The 74’s report, the district apologized and launched an investigation. A firm with expertise in cybersecurity is handling the probe, but some parents with children named in the disclosure said so far, no one has contacted them. Superintendent Michelle Reid said she will share a summary of the investigation once it’s complete.

Callie Oettinger, the parent who received the records, went to her local high school in mid-October to examine what she thought were records pertaining to her own two children. Her son, who received special education services in the district, has since graduated, and her daughter is still in high school. She copied computer files onto thumb drives as a paralegal observed and helped her identify some of the records. 

While most of the documents set aside for her review included her children’s names, they also revealed information on what she estimates were at least 35,000 other students. B.R.’s full name was listed in a document labeled “attorney work product” and marked “privileged and confidential,” as well as in an email to board members about litigation to discuss in a 2020 closed meeting.

The records also identified another former student with a separate Title IX case against the district. In a settlement reached last year, the district agreed to always redact the student’s real name from any copy of the document and only use a pseudonym when referring to the case. Her attorneys did not respond to a request for comment.

One document the Fairfax County Public Schools turned over to parent Callie Oettinger identifies two students who were involved in Title IX lawsuits as Jane Doe, but then includes their names in parentheses. The 74 has redacted their real names.

The day after issuing its apology, the district sent Oettinger a strongly worded email demanding that she “return all files removed, including any and all physical media used for unauthorized extraction of information from FCPS.” The letter referred to the documents as “wrongfully retained information.”

To her attorney, the language suggested Oettinger was at fault. 

“She’s done nothing illegal, and they have no legal right to compel her to do anything,” said Timothy Sandefur, vice president for legal affairs at the Goldwater Institute, a Phoenix-based libertarian think tank. Oettinger posted redacted documents from the recent trove on the website she runs on special education issues. “If they want assurance that she is not going to publish any kind of confidential information about kids, she absolutely will not publish confidential information about children. She has assured everybody of that already.”

Oettinger sent the thumb drives to Sandefur, who has since communicated with attorneys conducting the district’s investigation. But he declined to provide an update on the district’s progress. The attorneys conducting the investigation also didn’t respond to requests for comment.

A need for ‘robust action’

Oettinger didn’t initially alert the district to the disclosure because, she said, it has failed to make improvements after previous privacy violations. In fact, on Oct. 19 — the third and final day that Oettinger reviewed files in person — the Virginia Department of Education responded to one of her earlier complaints, finding the Fairfax district out of compliance with the federal Family Educational Rights and Privacy Act, or FERPA.

The decision only pertained to her son and was not a statement about the district’s overall privacy record.

Patricia Haymes, who directs the state agency’s Office of Dispute Resolution and Administrative Services, noted that officials have had “ongoing concerns” regarding student confidentiality in Fairfax and “believed that there was a need for the school division to take more robust action to ensure sustainable compliance.” But she also said the district assured her in September that it was taking steps “regarding the confidentiality of and access to student records.”

In that Sept. 27 letter, the district said it was training staff on their obligations under FERPA and the Freedom of Information Act, and was planning a “mandatory training” for principals and other administrators in charge of student records and special education. Training was scheduled to begin Oct. 31 and employees have two months to complete it. 

On Nov. 8, Oettinger appealed the state’s decision, citing The 74’s reporting on the accidental records release. Both the district and the state have “failed to ensure compliance — and now here we are,” she wrote. “You have enough for [the district] to be found at fault for systemic noncompliance.” 

The district disputes that it has violated the law. In a Nov. 21 response to Oettinger’s appeal, it described the disclosure as a “single instance of what appears to be human error” and said that Oettinger’s in-person review of the documents, which FERPA allows, was “outside the typical electronic document production that FCPS employs.”

Oettinger said she has faith in Reid, who became superintendent last year, to push for tighter security.  The two have exchanged emails and met in person multiple times. Oettinger said she’s “choosing to believe Reid’s trying to change the district’s culture and that she knows me enough to know I’d never do anything nefarious.”

Some special education experts in the state are baffled by the district’s mistake. 

“It’s just the norm that when you do a document production, you are careful about what you shouldn’t be disclosing — whether it’s other students’ names or legal advice,” said Jim Wheaton, a William and Mary Law School professor who runs a legal clinic for future attorneys that plan to work on special education issues. “It just blows my mind that they would be so reckless.”

But he said that there’s not much parents can do about such violations. They can file complaints, but there’s no right to sue under FERPA.

“In religious terms,” he said, “it’s, ‘Go forth and sin no more.’”

Exposed Fairfax School Documents Include Names of Alleged Assault Victims /article/exposed-documents-from-virginias-fairfax-schools-include-names-of-alleged-assault-victims/ Fri, 03 Nov 2023 11:01:00 +0000 /?post_type=article&p=717268 Among the tens of thousands of confidential documents accidentally released by the Fairfax County Public Schools last month were the names of two former students whose sexual assault allegations the district bitterly contested, including an appeal to the U.S. Supreme Court.

The students, 12 and 16 years old at the time of the alleged incidents, said district officials failed to respond adequately to their reports, accusations that officials deny. In court, the students’ lawyers fought successfully for their right to stay anonymous.




“It’s completely irresponsible,” said Shiwali Patel, an attorney with the National Women’s Law Center, which supported one of the former Fairfax students’ requests to keep her identity private. She said a lot of victims of sexual violence don’t come forward because they “don’t want to have their name out there in the public.”

The 74 reported Wednesday on the district’s release of records on an estimated 35,000 students to a parent who has been an outspoken critic of Fairfax’s data privacy record. District officials declined to comment on the specifics of the disclosures, but late Wednesday issued an apology and launched an “external legal investigation” to determine how staff released the documents.

Two weeks ago, Callie Oettinger, a special education advocate, went to her local high school to review what she thought were records she had requested on her children. But she ended up with a trove of digital files that included personal information such as addresses and disability diagnoses, and that named students who had engaged in self-harm or been hospitalized. “We are deeply sorry that this happened,” the district said, predicting the probe “could take some time” due to the large number of affected students.

In addition, Superintendent Michelle Reid responded to an email from Oettinger, saying that she had “spoken with staff and requested an immediate and thorough review into this deeply concerning matter.” 

The documents also named students with disabilities involved in a lawsuit over the use of seclusion and restraint. Following a local news investigation, the district acknowledged almost 1,700 instances involving over 200 students during the 2017-18 school year. Some students as young as six were isolated in a room dozens of times during the year. The case ended in 2021 with a settlement in which the district promised to phase out such practices by the end of last school year. Court documents only used students’ initials, but the documents released used their full names. 

“Absolutely, student names should have been protected,” said Denise Marshall, executive director of the Council of Parent Attorneys and Advocates, a nonprofit that joined the parents who sued the district. She called the leak “an egregious breach of privacy.”

One document the Fairfax County Public Schools turned over to parent Callie Oettinger identifies two students who were involved in Title IX lawsuits as Jane Doe, but then includes their names in parentheses. The 74 has redacted their real names.

One of the documents on those students, labeled “attorney work product” and “privileged and confidential,” also contained the names of two former students involved in Title IX cases against the district. It identified them as “Jane Doe,” but then listed their real names in parentheses. Their last names were also included in an email from John Foster, the district’s general counsel, to board members about cases they’d discuss in a 2020 closed meeting.

In the first case, a plaintiff identified as Jane Doe was a 16-year-old Oakton High School student when she alleged that she was sexually assaulted during a three-day band trip in 2017. She sued in 2018, saying that officials violated Title IX because they knew about the allegations but waited until the trip was over to address them. She alleged that the district discouraged her from contacting police and, when officials told her parents, suggested their daughter would face discipline for having sex while on the trip.

Doe won her case in the U.S. Court of Appeals for the Fourth Circuit, and it ended in a settlement last year after the U.S. Supreme Court declined to hear the district’s appeal. She received almost $588,000 in the settlement, but the district made no admission of responsibility. The agreement includes a stipulation that the district will always redact Doe’s real name from any copy of the document and only use a pseudonym when referring to the case.

Lawyers for both students declined to comment on the recent disclosures.

The second case is set for trial in March in a federal district court. B.R., as she’s named in the suit, was a 12-year-old student at Rachel Carson Middle School in 2011 when, she said, an older group of students repeatedly raped, tortured and threatened her with death over a four-month period. She alleged that they were part of a gang tied to sex trafficking in Northern Virginia.

While she later reported the alleged attacks to the police, she said the detective who investigated was a former school resource officer in the district who quickly closed the case. The district argued that staff responded appropriately, but a probe by the U.S. Department of Education’s Office for Civil Rights concluded the district could have acted more quickly. As a result, the district updated its policies.

At 19, she sued the district and her alleged attackers, saying educators ignored her requests for help. The school district argued the case should be dismissed because she missed a deadline for requesting to use a pseudonym. The lower court ruled in B.R.’s favor, but the district appealed to the Fourth Circuit.

The National Women’s Law Center was one of 52 organizations that argued the case should continue, despite what it called a “procedural technicality.” In November 2021, the Fourth Circuit ruled in favor of the plaintiff. 

“In many of these cases, plaintiffs are proceeding with a pseudonym. That is not uncommon,” Patel said. “For the district to push back against that is a bullying tactic. It doesn’t impact their ability to defend the lawsuit.”

Virginia’s Fairfax Schools Expose Thousands of Sensitive Student Records /article/exclusive-virginias-fairfax-schools-expose-thousands-of-sensitive-student-records/ Wed, 01 Nov 2023 10:01:00 +0000 /?post_type=article&p=716852 Virginia’s Fairfax County Public Schools disclosed tens of thousands of sensitive, confidential student records, apparently by accident, to a parent advocate who has been an outspoken critic of its data privacy record.  

The documents identify current and former special education students by name and include letter grades, disability status and mental health data. In one particularly sensitive disclosure, a counselor identified over 60 students who’ve struggled with issues like depression, including those who have engaged in self-harm or been hospitalized. 

A letter from the district to the state provides copious details about the condition and care of a medically fragile fourth grader. And a document containing “attorney work product” marked “privileged and confidential” references a pair of Title IX cases. It identifies two students as “Jane Doe” — a common practice with alleged victims of sexual assault or harassment — but then names the students in parentheses.

One document the Fairfax County Public Schools turned over to parent Callie Oettinger identifies two students who were involved in Title IX lawsuits as Jane Doe, but then includes their names in parentheses. The 74 has redacted their real names.

The disclosure of private student data is likely the largest since 2020, when the hacker group MAZE leaked records, including Social Security numbers and birthdates, on over 170,000 students and employees in the nation’s 13th-largest district. But this time, it looks like human error, rather than ransomware, was to blame. 

“Why worry about people from the outside?” asked Callie Oettinger, who received the recent document collection. “They’ve got the door wide open from the inside.”  

Oettinger, a parent and special education advocate with a long and contentious relationship with Fairfax administrators, went to a school on three consecutive days last month to examine her children’s files — data such as test scores, attendance records and audio recordings of meetings she’s been requesting for years. In addition to boxes of paper files, the district provided her with thumb drives and computer discs that Oettinger estimates include personal data on roughly 35,000 students.

Fairfax parent and special education watchdog Callie Oettinger runs Special Education Action, a website focusing on services for students with disabilities in Fairfax and across the state. (Courtesy of Callie Oettinger)

Parents who have challenged the district over special education services said the leak opens their children to further harm. Among the records released to Oettinger was a 2019 email exchange in which officials questioned the cost of an independent educational evaluation for Julie Melear’s son, who has dyslexia. 

“Is my kid, for the rest of his life, going to have to look over his shoulder to see what Fairfax is putting out there?” asked Melear, who had three children in the district and now lives in Denver.

The latest disclosure is not an isolated incident. Oettinger, who also runs a special education website, said the district has repeatedly released information on her now 19-year-old son to other parents and unauthorized staff and, on at least six occasions between 2016 and 2021, provided her with documents on children who are not her own. One was a 2020 internal report on special education that included students’ names, their attorneys and costs for services.

But those instances seem small compared to the volume of records she received in October, which span the years 2019 to 2021. The release also comes four years after the district’s former superintendent apologized to Oettinger for a similar disclosure and two years after a county judge ruled against Fairfax in a case related to leaked student records. 

Contacted last week, Fairfax officials — who pledged to improve security after the 2020 breach — appeared unaware they had given Oettinger access to students’ personal data. The district’s communications office forwarded an inquiry from The 74 to Molly Shannon, who manages the district’s public records office. In an email, Shannon asked a reporter to identify who accessed the records and where it occurred “so we can investigate and remediate the issue at the school, notify any affected families, and work with the parent to ensure other students’ information is properly secured.” 

The district is required to alert parents “as soon as practicable” if there’s a violation under the Family Educational Rights and Privacy Act, or FERPA.

Included in the files the Fairfax County Public Schools released to parent Callie Oettinger is a tracker a counselor used to note student mental health issues.

The records release is the latest dilemma for Virginia’s largest school system, which has come under intense scrutiny for its handling of special education. Following a federal civil rights probe last year, the district agreed to make up for services it failed to provide to students with disabilities during the pandemic. For years, federal officials have pressed the state to improve its monitoring of districts to ensure they’re complying with all special education laws. As recently as February, they told former state Superintendent Jillian Balow that monitoring remained a sticking point.

Data leaks linked to human error are not unique to Fairfax. In 2017, for example, the Chicago Public Schools posted students’ personal information, including health conditions and birthdates, to unsecured websites. Time-consuming records requests to school districts have also skyrocketed in recent years, fueled in part by controversies over COVID protocols, library books and curriculum. Many districts have struggled to keep up, but one expert said Fairfax shouldn’t be one of them.

“I have a lot more sympathy for the many, many small districts,” said Amelia Vance, founder and president of the Public Interest Privacy Center. But with an annual $3.5 billion budget, Fairfax, she said, “certainly seems to have the resources and they’ve had these requests for years. If they don’t have a system to respond in a protective manner, in an efficient manner, that’s on them.”

With nearly 180,000 students, Fairfax County Public Schools is Virginia’s largest district.

Phyllis Wolfram, executive director of the Council of Administrators of Special Education, a national organization, said she doesn’t think it’s common for districts to release students’ files to the wrong parent. But if record requests are increasing, she said, security should be tighter. 

“Given the shortage of school staff all around, we must be extra vigilant and ensure high-quality training for all staff,” she said. 

‘Process and protocols’ 

FERPA is the federal law that gives parents the right to examine their children’s educational records. Oettinger said she asked to see original documents in person, after the state overruled the district’s initial refusal, because past responses have been incomplete or contained electronic files that didn’t open. 

She said she is unsure who in the district ultimately signed off on the recent release. On Oct. 16, she received an email from Shannon saying the records were ready. From Oct. 17 to 19, she sat in a small room next to the main office of her local high school and viewed the files. A paralegal from the central office supervised as she copied records to thumb drives and scanned paper documents on her phone, Oettinger said. He offered assistance and even called in an IT expert when a media file didn’t open. She recorded everything and shared audio files of her visit with The 74. Ironically, she said, some of her own children’s records are still missing.

At one point, she spotted an unredacted document with a teacher’s notes and suspected there were more. But she said she didn’t realize the full scope of the disclosure until she began reviewing the files at home. 

She filed a complaint with the U.S. Department of Education’s Office for Civil Rights on Oct. 20 and contacted a handful of parents she knows with children named in the documents.

Oettinger said she didn’t report the leak to district officials because she doesn’t trust them — a skepticism that has only intensified over time. When her son had reading difficulties in elementary school, educators responded three times that an evaluation “is not warranted,” according to district records and, she said, told her that boys learn to read slower than girls. 

“You get one chance with your kid, and there’s no handbook,” she said. “In special education especially, nobody knows what to do. All you know is that you’re fighting.”

It took an independent evaluation for her son to be diagnosed with dyslexia, and by seventh grade, he had an Individualized Education Program, a plan that outlines the services a district is obligated to provide students with disabilities. Like thousands of Fairfax parents, she also complained that the district failed to follow that plan during the pandemic. He graduated in 2022, but her daughter remains a Fairfax student.

As she navigated the system for her son, she became a sounding board for other families. She launched her website, Special Education Action, in 2020. She’s filed at least 100 complaints with the state education department over special education services in the district and another dozen with the federal civil rights office, of which at least two have resulted in investigations. Her persistence — sending detailed, sometimes biting, emails and pressing for answers to all her questions — has earned her a reputation for “berating” staff, according to one 2019 email from Dawn Schaefer, director of the district office that handles special education complaints.

“It’s obvious you don’t know what you’re talking about, so let me break it down for you,” Oettinger wrote in a 2020 email to a staff person regarding a diagnosis for her son.

Fairfax district staff gave Callie Oettinger several boxes of documents as well as envelopes full of CDs and flash drives. (Courtesy of Callie Oettinger)

In addition to requests for documents on her own children, she submits Freedom of Information Act requests with the district each year for more general data that she uses in her advocacy role. In one internal 2020 email she obtained, John Cafferky, an attorney who handles special education cases for the district, said she files them because she’s “waiting for someone to slip up.” 

District officials have promised her they would do a better job of safeguarding student privacy. In a 2019 email exchange with former Superintendent Scott Brabrand, Oettinger reported multiple cases of school staff forwarding information about her son to the wrong people. 

“I am sorry to report that the school did make a mistake and unintentionally provided information about your son to another parent,” he responded. “We take student privacy very seriously. Following our process and protocols is paramount to ensuring we protect student information.”

Following the 2020 ransomware incident, the district released a statement saying it was “committed to protecting the information of our students, our staff, and their families.” The state also stepped in to help the district clean up its “internal practices, and ensure it should not happen again,” state Superintendent Lisa Coons told The 74.

But it did. 

In 2021, another Fairfax parent, Debra Tisler, filed a public records request seeking invoices for legal services in an attempt to learn how much Fairfax was spending on attorneys’ fees related to students with disabilities. The district released records that included personal information on about a dozen students. 

Tisler shared the files with Oettinger, who posted some of the documents, with names blacked out, on her website. The district sued to get the records back, but lost the case. 

Judge Richard Gardiner, who heard the lawsuit in a Fairfax County district court, said the records were “obtained quite lawfully.” 

“The [district], for whatever reason — maybe it was ineptness, I don’t know; I have no evidence on that — made the decision to turn over the information, and they’re stuck with that,” he said, according to a transcript of the hearing. 

Following the lawsuit, an internal review from December 2022 showed the district’s in-house attorneys didn’t finish redacting students’ personal information before its records office released the documents. Fairfax instituted new procedures to ensure records go through multiple reviews, including checks by a paralegal and a staff attorney, and took additional steps to keep up with demand.

Another document marked “confidential” that was inadvertently released to a Fairfax County, Virginia, parent includes the names of students who receive special education at one of the district’s high schools. The 74 redacted their names.

‘Basic data protection’

But it appears the system broke down. Some parents whose records ended up in the recently released files said they weren’t surprised because they, too, have previously received documents pertaining to other students.

“Some of the information I found out about other people’s children I don’t want to know,” said Melear, the parent who relocated to Denver. 

In the files released to Oettinger, Torey Vanek’s daughter was included on a spreadsheet of students who receive special education services or accommodations for a disability. A ninth grader at Woodson High School, her daughter has dyslexia. 

 “There is a joint frustration among many parents in Fairfax,” Vanek said. “Part of me is not surprised, but part of me is like this is just basic data protection.” 

How Ed Tech Tools Track Kids Online — And Why Parents Should Care /article/how-ed-tech-tools-track-kids-online-and-why-parents-should-care/ Fri, 22 Sep 2023 11:15:00 +0000 /?post_type=article&p=715160 As technology becomes more and more ingrained in education — and as students become increasingly concerned about how their personal information is being collected and used — startling new research shows how schools have given for-profit tech companies a massive data portal into young people’s everyday lives. 

The study, led by researchers at the University of Chicago and New York University, highlights how the scramble to adopt new technologies in schools has created an $85 billion industry with significant data security risks for teachers, parents and students. The issue has become particularly pervasive since the pandemic forced students nationwide into remote, online learning. 

Students’ sensitive information is increasingly leaked online following high-profile ransomware attacks, and user data monetization is a key business strategy for tech companies, including those that serve the education market, like Google. Yet student privacy is rarely a top consideration when teachers adopt new digital tools, researchers learned in interviews with district technology officials. In fact, schools routinely lack the resources and know-how to assess potential vulnerabilities.




Such a reality could spell trouble: In an analysis of education technologies widely used or endorsed by districts nationwide, researchers discovered privacy risks abound. The analysis relied on Blacklight, a privacy inspector tool created by the nonprofit news website The Markup that scours websites to uncover data-sharing practices. Those include the use of cookies that track user behaviors to deliver personalized advertisements. Analyzed education tools, they found, make “extensive use of tracking technologies” with potential privacy implications. 

Most alarming to the researchers were the 7.4% of sites that used “session recorders,” a type of tracker that documents a user’s every move. 

“Anyone visiting those sites would have their entire session captured which includes information such as which links they clicked on, what images they hovered over and even data entered into fields but not submitted,” the report notes. “This could include data that users might otherwise consider private such as the autofilling of saved user credentials or social network data.” 
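As a rough illustration of why even unsubmitted input can end up in such a log, here is a minimal, hypothetical sketch of a session recorder (not any vendor's actual code) that timestamps each interaction, including text typed into a field that is never submitted:

```python
# Hypothetical sketch of session-recorder logic: every interaction is
# timestamped and appended to a session log, no form submission required.
import json
import time

class SessionRecorder:
    def __init__(self):
        self.events = []

    def record(self, event_type, **details):
        # Each click, hover or keystroke becomes a log entry.
        self.events.append({"t": time.time(), "type": event_type, **details})

    def export(self):
        # In a real deployment this payload would be shipped to a
        # third-party analytics server; here we just serialize it.
        return json.dumps(self.events)

recorder = SessionRecorder()
recorder.record("click", target="assignment-link")
recorder.record("hover", target="grade-chart")
# An abandoned field (e.g. an autofilled username) is captured as it is
# typed, so it lands in the log even though the form is never submitted.
recorder.record("input", field="username", value="student@example.com")

log = json.loads(recorder.export())
print(len(log))  # 3 events recorded, none of which required a submission
```

The point of the sketch is simply that capture happens at the interaction level, not at submission time, which is what makes this class of tracker so much more invasive than an ordinary analytics cookie.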

The 74 caught up with report co-author Jake Chanenson, a University of Chicago Ph.D. student, to gain insight into the report’s findings and to understand why he believes that parents and students should be concerned about how ed tech companies collect, store and use their personal data. 

The conversation has been edited for length and clarity. 

Why did remote learning pique your interest in digital privacy and what are the primary implications that worry you? 

Remote learning can be done well but we all had to get to it very quickly without a plan because we all suddenly got thrown at home because of the global pandemic. Suddenly schools had to scramble and find new solutions to reach their students, to educate their students, without being able to test the field, to think critically about it. They really were, with shoestring and gum, trying to keep their classes together. 

Whether you were in school, whether you were at work, whether you were at neither and still just trying to keep in touch with your friends, you were using anything that came your way because that’s what you had to do. I found that really interesting — and a bit concerning. It’s no one’s fault because we don’t understand the ramifications of these technologies and now that we’ve used them a lot of them are here to stay. 

I don’t want to sound like some sort of demonizing figure saying that all tech is bad — that is certainly not the case. It’s merely the fact that sometimes these promises are oversold, and now we have this added element of data privacy. 

When you interact with any of these platforms, tons and tons of student data are generated — how you interact with it, how well you do on the assignments, when you do them, whether you’re a chronic procrastinator or always getting your work done, whether you seem more interested in your art class than your math class. These are all data points collected by these companies, and I wanted to know: ‘What is it they’re collecting? What are they doing with it?’ And, specifically for this study, ‘What are schools thinking about in this space, if anything at all?’

This study took a two-pronged approach. You conducted surveys with experts in this space and then used technology to identify information that folks might not be aware of. Let’s discuss the surveys first. How did the school administrators and district technology officials you interviewed view privacy issues? 

Lots of them knew that something wasn’t quite up to snuff in their security and privacy practices. 

The best security and privacy practices that I saw in these school districts were entirely because someone, usually in the IT department, had an independent interest in student privacy. They were going above and beyond what their job descriptions required because they cared about the students. 

That’s not to imply that school officials don’t care about the kids — they care about them very much — but they’re so busy making sure the lights are on, making sure there are teachers for the classrooms, dealing with discipline issues, dealing with staffing concerns. They’re not necessarily focused on data privacy and security. 

Your research takes a unique approach to show the real-world impacts of education technology on student privacy. You identify that some of these tools raise significant privacy implications. How did you go about that?

We looked at the websites of these educational tools and tried to understand: What are the privacy risks here? What we found is that 7.4% of all these websites had a session recorder, which records everything you do when you’re interacting with a web page — how long you hovered over a certain element, how often you scrolled, what you clicked on and what you didn’t click on. 

That’s a scary amount of data collection for something that’s normally an education site. On top of that we found a high prevalence of cookies and other types of trackers that were being sent to third-parties, basically advertising networks, that were taking that data to track these students across the web. As a student, even while I’m doing my work, they’re creating an ad profile of me that not only encompasses who I am as a consumer in my spare time, but who I am as a student inside of school for this more comprehensive picture of who I am to sell me ads. 

That could be upsetting to somebody who thinks that what I’m doing in school is only the business of me and the teacher, my parents and the principal. 

Why would an education technology company use a session recorder? 

We were able to identify that these trackers, like session recorders, were running on these websites, but we don’t have any idea what they’re recording, which is a project that we’re currently working on and trying to understand. 

I can’t make any well-grounded assumptions as to what this is being used for, whether it be nefarious or benign. It’s not uncommon for a session recorder to be used for diagnostic information by a technology company that wants to understand how its users use a site so it can be improved. That’s a legitimate use of one of these session recorders, but without knowing what data they collect, it could be that they’re collecting data that isn’t strictly relevant to improving the service, or are over-collecting data under the guise of improving the service and retaining it for future use. 

There are, of course, less benign possibilities, but I won’t speculate on that because I don’t have definitive proof that’s what’s happening. 

Why should people care about districts’ technology procurements? School districts are using a huge swath of digital tools, some from Google and some from tiny tech companies. If school leaders aren’t putting privacy at the forefront of deciding which tools to use, what concerning outcomes can come from that? 

There are several concerning outcomes, the first being that the data these companies collect don’t necessarily stay on their servers. They are sometimes sold to third parties. Some companies describe those third parties only ambiguously, while others list out who they are selling data to and why. 

Just on a normative basis, I think that what you do in the classroom shouldn’t be harvested and sold, especially when many of these companies are raking in somewhere between five- and seven-figure contracts to license this technology. It’s not like they don’t have other sources of income, but the things they can take from students can be incredibly alarming: Information about socioemotional behavior, so if I act out in school, if I am in trouble for something that’s happening at home or I’m bullying another student, that data is collected by a specific service and that data is held somewhere. And of course, when you hold data, it’s a security risk. 

There was a big breach in New York City where hundreds of thousands of students had their personal information leaked because a company was holding onto all of this data. It was leaked to hackers who got that data and can do who knows what with it. That’s a huge privacy violation. Some of the things they stole in that particular breach were names, birthdays and standard things you can use to commit identity fraud, which is a problem. But it can also be more sensitive stuff, such as [special education] accommodation lists or if you qualify for free lunch. There’s stuff about disability or your economic status, stuff that is all collected by these ed tech companies and held somewhere. 

Learning management systems have incredible amounts of metadata. ‘Are you someone who procrastinates and only finishes an assignment one minute before it’s due? Did you do it early? Are you someone who didn’t do the reading but showed up to class anyway? Are you someone who took 10 times to get this quiz right, or did it only take you one time?’ 

These data are recorded and are available for teachers to see, but because teachers can see it, it’s sitting on a server somewhere. 

Because they’re being stored somewhere and they are not being deleted regularly and these companies are not following data minimization principles, it’s a potential privacy risk for these students should another breach happen, which we’ve seen happen again and again and again. 
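Data minimization, the principle Chanenson refers to, can be as simple as routinely deleting records older than a fixed retention window, so a breach exposes less. A minimal sketch, assuming a hypothetical 180-day policy and made-up record fields:

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 180  # hypothetical policy window, not from the report

def purge_expired(records, now=None):
    """Return only records newer than the retention window.

    Run on a schedule, a policy like this limits how much student data
    a future breach could expose.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [r for r in records if r["created_at"] >= cutoff]

now = datetime(2023, 9, 22)
records = [
    {"student_id": 1, "created_at": datetime(2023, 9, 1)},   # within window: kept
    {"student_id": 2, "created_at": datetime(2022, 1, 15)},  # stale: purged
]
kept = purge_expired(records, now=now)
print([r["student_id"] for r in kept])  # [1]
```

The interview's point is that many vendors run no such purge at all, so years of fine-grained records accumulate on servers waiting to be breached.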

Breaches have affected sensitive student information. In her book, legal scholar Danielle Citron argues for federal rules that would protect intimate privacy as a civil right. Why are such rules needed, and how would they work in an educational context? 

There are certain types of information, like nonconsensual disclosures of intimate images, so-called revenge porn, that clearly warrant protection. I think you can make a straight analogy for student data. Just as there should be a zone of intimate privacy around your personal intimate life, your sexuality, whatever else, we should have a similar zone around your educational life. 

Education is a space where students should be able to learn and make mistakes, and if you cannot make those mistakes without being recorded, then that can have repercussions for you later. If you’re not perfect on your first try and someone gets a hold of that, I could see that affecting your college admissions or that could affect an employment record. If I am someone who wants to hire you and I have a list of every student in a school that turns in their assignments early and all of these people were either habitually late or always procrastinating then obviously I’m going to be more interested in hiring the worker that turned stuff in early. But what that list might not tell you is that it was one data point in eighth grade and that one of those students when they were in high school finally got on top of their executive dysfunction and started turning things in on time. 

It’s ultimately nobody’s business how you do in the classroom. You have final grades, but those fine-grained data are nobody else’s business but yours and the teacher’s. You have a safe space to learn and grow and make mistakes in the educational environment and to not be penalized for them outside of that classroom.

ChatGPT Is Landing Kids in the Principal’s Office, Survey Finds /article/chatgpt-is-landing-kids-in-the-principals-office-survey-finds/ Wed, 20 Sep 2023 04:01:00 +0000 /?post_type=article&p=715056 Ever since ChatGPT burst onto the scene last year, a heated debate has centered on its potential benefits and pitfalls for students. As educators worry students could use artificial intelligence tools to cheat, a new survey makes clear its impact on young people: They’re getting into trouble. 

Half of teachers say they know a student at their school who was disciplined or faced negative consequences for using — or being accused of using — generative artificial intelligence like ChatGPT to complete a classroom assignment, according to a new survey from the Center for Democracy and Technology, a nonprofit think tank focused on digital rights and expression. The proportion was even higher, at 58%, for those who teach special education. 

Cheating concerns were clear, with survey results showing that teachers have grown suspicious of their students. Nearly two-thirds of teachers said that generative AI has made them “more distrustful” of students and 90% said they suspect kids are using the tools to complete assignments. Yet students themselves who completed the anonymous survey said they rarely use ChatGPT to cheat, but are turning to it for help with personal problems.




“The difference between the hype cycle of what people are talking about with generative AI and what students are actually doing, there seems to be a pretty big difference,” said Elizabeth Laird, the group’s director of equity in civic technology. “And one that, I think, can create an unnecessarily adversarial relationship between teachers and students.”   

Indeed, 58% of students, and 72% of those in special education, said they’ve used generative AI during the 2022-23 academic year, just not primarily for the reasons that teachers fear most. Among youth who completed the nationally representative survey, just 23% said they used it for academic purposes and 19% said they’ve used the tools to help them write and submit a paper. Instead, 29% reported having used it to deal with anxiety or mental health issues, 22% for issues with friends and 16% for family conflicts.

Part of the disconnect dividing teachers and students, researchers found, may come down to gray areas. Just 40% of parents said they or their child were given guidance on ways they can use generative AI without running afoul of school rules. Only 24% of teachers say they’ve been trained on how to respond if they suspect a student used generative AI to cheat. 

Center for Democracy and Technology

The results on ChatGPT’s educational impacts were included in the Center for Democracy and Technology’s broader annual survey analyzing the privacy and civil rights concerns of teachers, students and parents as tech, including artificial intelligence, becomes increasingly ingrained in classroom instruction. Beyond generative AI, researchers observed a sharp uptick in digital privacy concerns among students and parents compared with last year. 

Among parents, 73% said they’re concerned about the privacy and security of student data collected and stored by schools, a considerable increase from the 61% who expressed those reservations last year. A similar if less dramatic trend was apparent among students: 62% had data privacy concerns tied to their schools, compared with 57% just a year earlier. 

Center for Democracy and Technology

Those rising levels of anxiety, researchers theorized, are likely the result of the growing frequency of cyberattacks on schools, which have become a primary target for ransomware gangs. High-profile breaches, including in Los Angeles and Minneapolis, have compromised a massive trove of highly sensitive student records. Exposed records, investigative reporting by The 74 has found, include student psychological evaluations, reports detailing campus rape cases, student disciplinary records, closely guarded files on campus security, employees’ financial records and copies of government-issued identification cards. 

Survey results found that students in special education, whose records are among the most sensitive that districts maintain, and their parents were significantly more likely than the general education population to report school data privacy and security concerns. As attacks ratchet up, 1 in 5 parents say they’ve been notified that their child’s school experienced a data breach. Such breach notices, Laird said, led to heightened apprehension. 

“There’s not a lot of transparency” about school cybersecurity incidents “because there’s not an affirmative reporting requirement for schools,” Laird said. But in instances where parents are notified of breaches, “they are more concerned than other parents about student privacy.” 

Parents and students have also grown increasingly wary of another set of education tools that rely on artificial intelligence: digital surveillance technology. Among them are student activity monitoring tools, such as those offered by the for-profit companies Gaggle and GoGuardian, which rely on algorithms in an effort to keep students safe. The surveillance software employs artificial intelligence to sift through students’ online activities and flag school administrators — and sometimes the police — when they discover materials related to sex, drugs, violence or self-harm. 

Among parents surveyed this year, 55% said they believe the benefits of activity monitoring outweigh the potential harms, down from 63% last year. Among students, 52% said they’re comfortable with academic activity monitoring, a decline from 63% last year. 

Such digital surveillance, researchers found, frequently has disparate impacts on students based on their race, disability, sexual orientation and gender identity, potentially violating longstanding federal civil rights laws. 

The tools also extend far beyond the school realm, with 40% of teachers reporting their schools monitor students’ personal devices. More than a third of teachers say they know a student who was contacted by the police because of online monitoring, the survey found, and Black parents were significantly more likely than their white counterparts to fear that information gleaned from online monitoring tools and AI-equipped campus surveillance cameras could fall into the hands of law enforcement. 

Center for Democracy and Technology

Meanwhile, as states nationwide pull literature from school library shelves amid a conservative crusade against LGBTQ+ rights, the nonprofit argues that digital tools that filter and block certain online content “can amount to a digital book ban.” Nearly three-quarters of students — and disproportionately LGBTQ+ youth — said that web filtering tools have prevented them from completing school assignments. 

The nonprofit highlights how disproportionalities identified in the survey could run counter to federal laws that prohibit discrimination based on race and sex, and those designed to ensure equal access to education for children with disabilities. In a letter sent Wednesday to the White House and Education Secretary Miguel Cardona, the Center for Democracy and Technology was joined by a coalition of civil rights groups urging federal officials to take a harder tack on ed tech practices that could threaten students’ civil rights. 

“Existing civil rights laws already make schools legally responsible for their own conduct, and that of the companies acting at their direction in preventing discriminatory outcomes on the basis of race, sex and disability,” the coalition wrote. “The department has long been responsible for holding schools accountable to these standards.”


Opinion: Virtual Reality & Other New Technologies Pose Risks for Kids. It’s Time to Act /article/virtual-reality-other-new-technologies-pose-risks-for-kids-its-time-to-act/ Mon, 27 Mar 2023 13:30:00 +0000 /?post_type=article&p=706497 Almost immediately after ChatGPT, a captivating artificial intelligence-powered chatbot, was released late last year, school districts across the country moved to limit or block access to it. As rationale, they cited a combination of potential negative impacts on student learning and concerns about plagiarism, privacy and content accuracy. 

These districts’ reactions to ChatGPT have led to a debate among policymakers and parents, teachers and technologists about the risks and benefits of this new chatbot. This deliberation magnifies a troubling truth: Superintendents, principals and teachers are making decisions about the adoption of emerging technology without the answers to fundamental questions about the benefits and risks. 



Technology has the potential to modernize education and help prepare students for an increasingly complex future. But the risks to children are just beginning to be uncovered. Creating a policy and regulatory framework focused on building a deeper understanding of the benefits and risks of emerging technologies, and protecting children where the evidence is incomplete, is not alarmist, but a responsible course of action. 

Why act now? 

First, recent history has demonstrated that emerging technology can pose real risks to children. Research has found a correlation between time spent on social media and adolescent anxiety, depression, self-harm and suicide. These impacts seem particularly significant for adolescent girls. While there is debate among researchers about the size of these effects, the state of adolescent mental health has deteriorated to the extent that it was declared a national emergency in 2021 by the American Academy of Pediatrics, the American Academy of Child and Adolescent Psychiatry, and the Children’s Hospital Association. Social media seems to be a contributing factor. 

Second, immersive technologies, including virtual reality, augmented reality, mixed reality and brain-computer interfaces, may intensify both the benefits and the risks to children. Immersive technologies have the potential to transform how students learn. But the impact on childhood development of exposure to multisensory experiences replicating the physical world in digital spaces is just beginning to be understood — and there is cause for concern based on limited research. For example, one study concluded that immersive virtual reality can interfere with the development of coordination that allows children to maintain balance. And a 2021 review of research on the impact of virtual reality on children revealed evidence of cognition issues, difficulty navigating real and virtual worlds, and addiction. The most significant risk may be how frequent and prolonged exposure to virtual environments impacts mental health. 

Third, the digital divide has narrowed considerably. Government and the private sector have driven improvements in broadband access, expanded cellular networks and made mobile and computing devices significantly more affordable. Since 2014-15, the percentage of teens who have a smartphone has risen substantially. Paired with money from COVID-19 legislation that allowed schools to invest in hardware, more children will have opportunities to use emerging technologies than ever had access to older innovations — including apps and the internet — at home and in school. 

Based on emerging evidence on these impacts on children, and in the face of significant unknowns, a policy and regulatory framework focused on mitigating risks — while still allowing children to access the benefits of these technologies — is warranted. At the federal level, Congress should consider:

  • Compelling all emerging technology companies, including those producing immersive reality products that are utilized by children, to provide academic researchers access to their data.
  • Compelling all immersive reality companies to assess the privacy and protection of children in the design of any product or service that they offer.
  • Compelling all immersive reality companies to provide child development training to staff working on products intended for use by children.
  • Requiring hardware manufacturers of virtual reality, augmented reality, mixed reality and brain-computer interface devices targeted to children to prominently display on their packaging warning labels about unknown physical and mental health risks.
  • Establishing guidance, via the Department of Education, for district and school leaders to prepare their communities for the adoption of immersive technologies.
  • Requiring all immersive technology companies to inform users of product placement within the platform.
  • Compelling relevant federal regulatory agencies to provide clarification on the ways existing laws, such as the Health Insurance Portability and Accountability Act, the Children’s Online Privacy Protection Act, the Individuals with Disabilities Education Act and the Americans with Disabilities Act, apply to immersive technologies.
  • Compelling all immersive technology companies to acquire parental consent for data sharing, particularly biometric information, including eye scans, fingerprints, handprints, face geometry and voiceprints.
  • Providing guidelines around minimum age for the use of immersive technology platforms and products.

At the state level, every governor should carefully assess the action Utah took last week to regulate children’s use of social media and consider the following actions: 

  • Creating child well-being requirements for state procurement of any immersive technology.
  • Offering research and development grants to in-state immersive technology companies to focus on safety and well-being impacts on children.
  • Establishing protocols for reviewing districts’ use of emerging technologies to determine compliance with federal and state law.

Finally, at the local level, school boards, superintendents and school leaders should consider regulations and guidance for the selection, adoption and use of immersive technologies:

  • Assessing opportunities for integration with current teaching and learning methods and curriculum.
  • Investing in and planning for professional development around these technologies.
  • Ensuring accessibility for students with disabilities and English learners when planning around use of emerging technologies.
  • Ensuring that any planned use of emerging technologies in the classroom is compliant with state and federal special education laws.
  • Evaluating the costs of immersive technology procurement and necessary infrastructure upgrades and making the results transparent to the community.
  • Creating opportunities for educator, parent and student involvement in the purchasing process for technology.

If emerging technology can have detrimental impacts on children — and evidence points to that being the case — responsibly mitigating the risks associated with these technologies is prudent. Why chance it? Acting now is the best opportunity to protect children while still allowing them to reap the benefits.

Hackers Use Stolen Student Data Against Minneapolis Schools in Brazen New Threat /article/hackers-use-stolen-student-data-against-minneapolis-schools-in-brazen-new-threat/ Thu, 09 Mar 2023 14:01:00 +0000 /?post_type=article&p=705596 Minneapolis Public Schools appears to be the latest ransomware target in a $1 million extortion scheme that came to light Tuesday after a shady cyber gang posted to the internet a ream of classified documents it claims it stole from the district. 

While districts nationwide have become ransomware victims in the last several years, cybersecurity experts said the extortion tactics leveraged against the Minneapolis district are particularly aggressive and an escalation of those typically used against school systems to coerce payments.

In a dark web blog post and an online video uploaded Tuesday, the ransomware gang Medusa claimed responsibility for conducting a February cyberattack — or what Minneapolis school leaders euphemistically called an “encryption event” — that disrupted the district’s computer systems. The blog post gives the district until March 17 to hand over $1 million. If the district fails to pay up, the criminal actors appear ready to post a trove of sensitive records about students and educators to their dark web leak site. The gang’s leak site gives the district the option to pay $50,000 to add a day to the ransom deadline and allows anyone to purchase the data for $1 million right now.

On the video-sharing platform Vimeo, the group, calling itself the Medusa Media Team, posted a 51-minute video that appeared to show a limited collection of the stolen records, making clear to district leaders the sensitive nature of the files within the gang’s possession. 

“The video is more unusual and I don’t recall that having been done before,” said Brett Callow, a threat analyst with the cybersecurity company Emsisoft. 

A preliminary review of the gang’s dark web leak site by The 74 suggests the compromised files include a significant volume of sensitive documents, including records related to student sexual violence allegations, district finances, student discipline, special education, civil rights investigations, student maltreatment and sex offender notifications. 

A file purportedly stolen from Minneapolis Public Schools and uploaded to the Medusa ransomware gang’s dark web leak site references a sexual assault incident involving several students. (Screenshot)

The video is no longer available on Vimeo, and a company spokesperson confirmed to The 74 that it was removed for violating the platform’s terms of service, which prohibit users from uploading content that “infringes any third party’s” privacy rights. 

As targeted organizations decline to pay ransom demands in efforts to recover stolen files, Callow said the threat actors are employing new tactics “to improve conversion rates.”

“This is likely just an experiment, and if they find this works they will do it more frequently,” Callow said. “These groups operate like regular businesses, in that they A/B test and adopt the strategies that work and ditch the ones that don’t.” 

Here’s a snippet of the video’s introduction (with all sensitive records omitted):

The Minneapolis school district hasn’t acknowledged being a ransomware victim, and Callow and other cybersecurity experts have been harshly critical of how it has disclosed the attack to the public. In a March 1 statement, the district attributed “technical difficulties” with its computer systems to the referenced “encryption event,” a characterization that experts blasted as creative public relations that left potential victims in the dark about the incident’s severity. 

The district “has not paid a ransom” and an investigation into the incident “has not found any evidence that any data accessed has been used to commit fraud,” school officials said in the March 1 statement.  

In a statement to The 74 Tuesday, the district said it “is aware that the threat actor who has claimed responsibility for our recent encryption event has posted online some of the data they accessed.” 

“This action has been reported to law enforcement, and we are working with IT specialists to review the data in order to contact impacted individuals,” the statement continued.

A file uploaded to the Medusa ransomware gang’s dark web leak site lists personal information of Minneapolis Public Schools administrators who serve as campus emergency contacts. (Screenshot)

Minnesota-based student privacy advocate Marika Pfefferkorn called on the district to be more forthcoming as it confronts the attack. 

“First and foremost, they owe an apology to the community by not being explicit right away about what was happening,” said Pfefferkorn, executive director of the Midwest Center for School Transformation. “Because they haven’t communicated about it, they haven’t shared a plan about, ‘How will you address this? How will you respond?’ Not knowing how they are going to respond makes me really nervous.”

School cybersecurity expert Doug Levin, the national director of the K12 Security Information eXchange, said that district officials appear to have coined the term “encryption event,” but available information suggests the school system was the victim of “classic double extortion,” an exploitation technique that’s become popular among ransomware gangs in the last several years. 

With its video and dark web blog, Medusa may have spent “a little more time and energy” than other ransomware groups in presenting the stolen data in a compelling package, “but the tactics seem to be the same,” Levin said. “Now that we have a group coming forward with compelling evidence that they have exfiltrated data from the system and it’s actively extorting them, that’s all I would need to know to classify this as ransomware.”

In double extortion ransomware attacks, threat actors gain access to a victim’s computer network, download compromising records and lock the files with an encryption key. Criminals then demand their victim pay a ransom to regain control of their files. Then, if a ransom is not paid, criminals sell the data or publish the records to a leak site. 

Such a situation recently played out in the Los Angeles Unified School District, the nation’s second-largest school system. Last year, the ransomware gang Vice Society broke into the district’s computer network and made off with some 500 gigabytes of district files. When the district refused to pay an undisclosed ransom, Vice Society uploaded the records to its dark web leak site. 

District officials have sought to downplay the attack’s effects on students. But an investigation by The 74 found thousands of students’ comprehensive and highly sensitive mental health records had been exposed. The district then acknowledged Feb. 22 that some 2,000 student psychological assessments — including those of 60 current students — had been leaked.

Districts that become ransomware targets could face significant liability issues. Earlier this month, the education technology company Aeries Software was hit with a negligence lawsuit after a data breach exposed records from two California school districts. District families accused the software company of failing to implement reasonable cybersecurity safeguards. 

Federal authorities have made some progress in curtailing cybercriminals. In January, authorities seized control of a prolific ransomware gang’s leak site, and earlier this month officials announced actions against a Russian-based ransomware group that’s known to target schools. 

At least 11 U.S. school districts have been the victims of ransomware attacks so far in 2023, according to Emsisoft research. Last year, 45 school districts and 44 colleges were attacked. 

The Medusa ransomware gang’s leak site suggests the Minneapolis school district has until March 17 to pay a $1 million ransom or have their sensitive files published online. The district can pay $50,000 to add a day to the ransom deadline. (Screenshot)

In Minneapolis, a lack of transparency from the district could put affected students and staff at heightened risk of exploitation, Emsisoft’s Callow said. 

“There absolutely are times when districts have to be cautious about the information they release because it is the source of an ongoing investigation,” he said. “But calling something a ransomware incident as opposed to an encryption event really isn’t problematic. Nor is telling people their personal information may have been compromised.”

Pfefferkorn, the Minneapolis student privacy advocate, said she’s concerned about the amount of data the school district collects about students and worries it lacks sufficient cybersecurity safeguards to keep the information secure. She pointed to Minneapolis schools’ since-terminated contract with the digital student surveillance company Gaggle, which monitors students online and alerts district officials to references about mental health challenges, sexuality, drug use, violence and bullying. 

The district said it adopted the monitoring tool in a pandemic-era effort to keep kids safe online, but the unauthorized disclosure of Gaggle records maintained by the district could make them more vulnerable, she said. 

There’s little recourse, she said, for students and educators whose sensitive records were already leaked by Medusa. 

“It’s already out there and that cannot be repaired,” she said. “There’s information out there that’s going to impact them for the rest of their lives.”

Gaggle Drops LGBTQ Keywords from Student Surveillance Tool Following Bias Concerns
/article/gaggle-drops-lgbtq-keywords-from-student-surveillance-tool-following-bias-concerns/ — Fri, 27 Jan 2023

Digital monitoring company Gaggle says it will no longer flag students who use words like “gay” and “lesbian” in school assignments and chat messages, a significant policy shift that follows accusations its software facilitated discrimination against LGBTQ teens in a quest to keep them safe.

A spokesperson for the company cited a societal shift toward greater acceptance of LGBTQ youth — rather than criticism of its product — as the impetus for the change, which it described as part of a “continuous evaluation and updating process.”

The company, which uses artificial intelligence and human content moderators to sift through billions of student communications each year, has long defended its use of LGBTQ-specific keywords to identify students who might hurt themselves or others. In arguing the targeted monitoring is necessary to save lives, executives have pointed to the prevalence of bullying against LGBTQ youth and data indicating they’re at greater risk of suicide than their straight and cisgender classmates. 




But in practice, Gaggle’s critics argued, the keywords put LGBTQ students at a heightened risk of scrutiny by school officials and, on some occasions, the police. Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity — often called outing — as a result of digital activity monitoring, according to a survey released in August by the nonprofit Center for Democracy and Technology. The survey encompassed the impacts of multiple monitoring companies that contract with school districts, such as GoGuardian, Gaggle, Securly and Bark. 

Gaggle’s decision to remove several LGBTQ-specific keywords, including “queer” and “bisexual,” from its dictionary of words that trigger alerts was first reported elsewhere. It follows extensive reporting by The 74 into the company’s business practices and sometimes negative effects on students who are caught in its surveillance dragnet. 

Though Gaggle’s software is generally limited to monitoring school-issued accounts, including those provided by Google and Microsoft, it can scan photos on students’ personal cell phones if they plug the devices into district laptops.

The keyword shift comes at a particularly perilous moment, as Republican lawmakers in multiple states advance anti-LGBTQ measures. Legislation has sought to curtail classroom instruction about sexual orientation and gender identity, ban books and classroom curricula featuring LGBTQ themes and prohibit transgender students from receiving gender-affirming health care, participating in school athletics and using restroom facilities that match their gender identities. That hostile political climate, along with pandemic-era disruptions, has contributed to an uptick in LGBTQ youth who have seriously considered suicide, a recent youth survey by The Trevor Project revealed. 

The U.S. Education Department received 453 discrimination complaints involving students’ sexual orientation or gender identity last year, according to data provided to The 74 by its civil rights office. That’s a significant increase from previous years, including 2021, when federal officials received 249 such complaints. Complaints dwindled under the Trump administration: In 2018, the Education Department received just 57 complaints related to sexual orientation or gender identity discrimination.

The increase in discrimination allegations involving sexual orientation or gender identity is part of a broader surge in civil rights complaints, according to data obtained by The New York Times. The total number of complaints for 2021-22 grew to 19,000, a historic high and more than double the previous year. 

In September, The 74 revealed that Gaggle had donated $25,000 to The Trevor Project, the nonprofit that released the recent youth survey and whose advocacy is focused on suicide prevention among LGBTQ youth. The arrangement was framed on Gaggle’s website as a collaboration to “improve mental health outcomes for LGBTQ young people.” 

The revelation was met with swift backlash on social media, with multiple Trevor Project supporters threatening to halt future donations. Within hours, the group announced it had returned the donation, acknowledging concerns about Gaggle “having a role in negatively impacting LGBTQ students.” 

The Trevor Project didn’t respond to requests for comment on Gaggle’s decision to pull certain LGBTQ-specific keywords from its systems. 

In a statement to The 74, Gaggle spokesperson Paget Hetherington said the company regularly modifies the keywords its software uses to trigger a human review of students’ digital communications. Certain LGBTQ-specific words, she said, are no longer relevant to the 24-year-old company’s efforts to protect students from abuse and were purged late last year.

“At points in time in the not-too-distant past, those words were weaponized by bullies to harass and target members of the LGBTQ+ community, so as part of an effective methodology to combat that discriminatory harassment and violence, those words were once effective tools to help identify dangerous situations,” Hetherington said. “Thankfully, over the past two decades, our society evolved and began a period of widespread acceptance, especially among the K-12 student population that Gaggle serves. With that evolution and acceptance, it has become increasingly rare to see those words used in the negative, harassing context they once were; hence, our decision to take these off our word/phrases list.”

Hetherington said Gaggle will continue to monitor students’ use of the words “faggot,” “lesbo,” and others that are “commonly used as slurs.” A previous review by The 74 found that Gaggle regularly flagged students for harmless speech, like profanity in fiction submitted to a school’s literary magazine and entries in students’ private journals. 

Privacy advocates warn that in the era of “Don’t Say Gay” laws and abortion bans, information gleaned from Gaggle and similar services could be weaponized against students.

Gaggle executives have minimized privacy concerns and claim the tool saved more than 1,400 lives last school year. That statistic hasn’t been independently verified, and there’s a dearth of research to suggest digital monitoring is an effective school-safety tool. A recent survey found a majority of parents and teachers believe the benefits of student monitoring outweigh privacy concerns. A Vice News documentary included the perspective of a high school student who was flagged by Gaggle for writing a paper titled “Essay on the Reasons Why I Want to Kill Myself but Can’t/Didn’t.” Adults wouldn’t have known she was struggling without Gaggle, she said. 

“I do think that it’s helpful in some ways,” the student said, “but I also kind of think that it’s — I wouldn’t say an invasion of privacy — but if obviously something gets flagged and a person who it wasn’t intended for reads through that, I think that’s kind of uncomfortable.” 

Student surveillance critic Evan Greer, director of the nonprofit digital rights group Fight for the Future, said the tweaks to Gaggle’s keyword dictionary are unlikely to have a significant effect on LGBTQ teens and blasted the company’s stated justification for the move as being “out of touch” with the state of anti-LGBTQ harassment in schools. Greer noted that LGBTQ youth frequently refer to each other using “reclaimed slurs,” reappropriating words that are generally considered derogatory and remain in Gaggle’s dictionary. 

“This is just like lipstick on a pig — no offense to pigs — but I don’t see how this actually in any meaningful way mitigates the potential for this software to nonconsensually out LGBTQ students to administrators,” Greer said. “I don’t see how it prevents the software from being used to invade the privacy of students in a wide range of other circumstances.”

Gaggle and its competitors — including GoGuardian, Securly and Bark — have faced similar scrutiny in Washington. In April, Democratic Sens. Elizabeth Warren and Ed Markey argued in a report that the tools could be misused to discipline students and warned they could be used disproportionately against students of color and LGBTQ youth. 


In response, Gaggle founder and CEO Jeff Patterson said the company cannot test the potential for bias in its system because the software flags student communications anonymously and the company has “no context or background on students,” including their race or sexual orientation. He also said the monitoring services are not meant to be used as a disciplinary tool. 

In the survey released last summer by the Center for Democracy and Technology, however, 78% of teachers reported that digital monitoring tools were used to discipline students. Black and Hispanic students reported being far more likely than white students to get into trouble because of online monitoring. 

In October, the White House cautioned school districts against the “continuous surveillance” of students if monitoring tools are likely to trample students’ rights. It also directed the Education Department to issue guidance to districts on the safe use of artificial intelligence. The guidance is expected to be released early this year.


As an increasing number of districts implement Gaggle for bullying prevention efforts, surveillance critic Greer said the company has failed to consider how adults can cause harm.

“There is now a very visible far-right movement attacking LGBTQ kids, and particularly trans kids and teenagers,” Greer said. “If anything, queer kids are more in the crosshairs today than they were a year ago or two years ago — and that’s why this surveillance is so dangerous.”

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741. For LGBTQ mental health support, contact The Trevor Project’s toll-free support line at 866-488-7386.

Startling 96% of School Tech Exposes Student Data, Research Finds
/article/startling-96-of-school-tech-exposes-student-data-research-finds/ — Tue, 03 Jan 2023

Each school day, students nationwide are required to log into thousands of digital platforms to complete homework, chat with their teachers and check their grades. Then, without their knowledge, an overwhelming majority of those tools turn around and share their data with third parties — often for profit.

A resounding 96% of apps used regularly in schools have data-sharing practices that “are not adequately safe for children,” according to a new report by the nonprofit Internet Safety Labs, which conducts software safety tests. In an analysis of apps commonly required or recommended by schools, the group found that many shared students’ personal data with marketing firms that build extensive profiles of children to sell products through targeted advertising.

“At a minimum, it fuels marketers’ and data brokers’ personal data profiles ultimately used to bombard young minds with highly targeted and persuasive advertising or opinions,” according to the report. “At worst, in the wrong hands it can lead to emotional trauma, aberrant seduction or even physical danger with location information.”




Once their data reaches the “Wild West” digital advertising ecosystem, students lose control of how that information is used and retained, said Irene Knapp, the group’s technology director and a former Google software engineer. Knapp said the widespread exchange of data between the apps and advertisers, including sensitive information such as mental health records, could come back to harm students later on. Such data could make it more difficult for students to get health insurance when they’re older, Knapp said, and could be used to serve them products based on their conditions. For example, the data could be used to “discover kids who have a propensity to gambling addiction and sell them Candy Crush,” a reference to the mobile game known for its obsessive users.

“It gets difficult holding this advertising ecosystem to account because the harms and the original cause are far apart,” Knapp said. “But that’s why we have to identify where these leaks happen and start to close them.”

To reach these conclusions, researchers tested more than 1,300 digital tools that were required or recommended by a random sampling of 13 schools in each state, totaling 663 campuses serving nearly half a million students. Almost a quarter of the services included advertisements and 13% had targeted ads, researchers found. Schools routinely recommended tools that weren’t specifically designed with schools in mind, including Spotify and YouTube. Yet even among the digital tools specifically geared toward students, 18% contained ads and 9% used targeted ads — a finding the group argues is “still too high to be safe for students.”

Meanwhile, the tools routinely shared information with Big Tech data aggregators: 68% of apps shared data with Google, 36% with Apple and 33% with Facebook. More than three-quarters of the apps accessed users’ location information, researchers found, and more than half tapped into students’ calendars and contacts. The analysis found that school utility apps, which allow schools to share with students and parents information like lunch schedules and other announcements, were the most likely to pass student data to the Big Tech giants. Districts often contract with Blackboard and Apptegy for the apps. Neither company responded to requests for comment.

“Those community engagement platform apps, they came out to be pretty dangerous it turned out,” said Lisa LeVasseur, the group’s executive director. “Were we surprised? I’d say no. It’s disappointing.”

Schools in the study sample recommended an average of 125 different apps, highlighting the ubiquity of technology in modern education, especially as the pandemic forced school closures and remote learning. Yet even as school districts’ reliance on tech companies grows, they generally lack the staffing to ensure that recommended technologies sufficiently safeguard student privacy, according to a recent report by the Center for Democracy and Technology, a nonprofit think tank.

The Internet Safety Labs report comes at a moment of heightened scrutiny around how technology companies use the data they collect about children. Earlier this year, the Federal Trade Commission announced that it would toughen its enforcement of education technology companies that sell student data for targeted advertising and that “illegally surveil children when they go online to learn.”

Just this week, the commission reached a record-breaking $520 million settlement with Epic Games, creator of the hugely popular video game Fortnite, and accused it of illegally collecting information about young children in violation of federal privacy rules and tricking players into making unintentional purchases.

Ultimately, Internet Safety Labs found that schools need more resources to thoroughly vet the technologies they use — especially as districts face greater cybersecurity threats. With limited accountability to ensure that software providers behave ethically, it’s up to schools to keep kids safe.

“With technology right now, there’s no norms for product safety, none whatsoever,” LeVasseur said. “It’s as if you bought your car and you had to install seat belts, airbags, windshield wipers, lights, like all of the safety features. That’s what we have with technology right now. It’s all, ‘Do it yourself, try to be safe. Godspeed you’re on your own.’ ”

Best of 2022: The Year’s Top Stories About Education & America’s Schools
/article/best-education-articles-of-2022-our-22-most-shared-stories-about-students-schools/ — Wed, 21 Dec 2022

Every December at The 74, we take a moment to recap and spotlight our most read, shared and debated education articles of the year. Looking back now at our time capsules from December 2020 and December 2021, one can chart the rolling impact of the pandemic on America’s students, families and school communities. Two years ago, we were just beginning to process the true cost of emergency classroom closures across the country and the depth of students’ unfinished learning. Last year, as we looked back in the shadow of Omicron, a growing sense of urgency to get kids caught up was colliding with bureaucratic and logistical challenges in figuring out how to rapidly convert federal relief funds into meaningful, scalable student assistance. 

This year’s list, publishing amid new calls for mask mandates and yet another spike in hospitalizations, powerfully frames our surreal new normal: mounting concerns about historic test score declines; intensifying political divides that would challenge school systems even if there weren’t simultaneous health, staffing and learning crises to manage; broader economic stresses that are making it harder to manage school systems; and a sustained push by many educators and families to embrace innovations and out-of-the-box thinking to help kids accelerate their learning by any means necessary.

Now, 2½ years into one of the most turbulent periods in the history of American education, these were our 22 most discussed articles of 2022: 

The COVID School Years: 700 Days Since Lockdown 

Learning Loss: 700 days. As we reported Feb. 14, that’s how long it had been since more than half the nation’s schools crossed into the pandemic era. On March 16, 2020, districts in 27 states, encompassing almost 80,000 schools, closed their doors for the first long educational lockdown. Since then, schools have reopened, closed and reopened again. The effects have been immediate — students lost parents, teachers mourned fallen colleagues — and hopelessly abstract as educators weighed “pandemic learning loss,” the sometimes crude measure of COVID’s impact on students’ academic performance. 

With spring approaching, there were reasons to be hopeful. More children had been vaccinated. Mask mandates were ending. But even if the pandemic recedes and a “new normal” emerges, there are clear signs that the issues surfaced during this period will linger. COVID heightened inequities that have long been baked into the American educational system. The social contract between parents and schools has frayed. And teachers are burning out. To mark a third spring of educational disruption, Linda Jacobson interviewed educators, parents, students and researchers who spoke movingly, often unsparingly, about what Marguerite Roza, director of Georgetown University’s Edunomics Lab, called “a seismic interruption to education unlike anything we’ve ever seen.” Read her full report

Related:


Threatened & Trolled, School Board Members Quit in Record Numbers

School Leadership: By the time we published this report in May, the chaos and violence at big city school board meetings had dominated headlines for months, as protesters, spurred by ideological interest groups and social media campaigns, railed about race, gender and a host of other hot-button issues. But what does it look like when the boardroom is located in a small community, where the elected officials under fire often have lifelong ties to the people doing the shouting? Over the last 18 months, Minnesota K-12 districts have seen a record number of board members resign before the end of their term. As one said in a tearful explanation to her constituents, “The hate is just too much.” Beth Hawkins takes a look at the possible ramifications.  

Related:

  • Million-Dollar Records Request: From COVID and critical race theory to teachers’ names & schools, districts flooded with freedom of information document demands

Nation’s Report Card Shows Largest Drops Ever Recorded in 4th and 8th Grade Math

Student Achievement: In a moment the education world had anxiously awaited, the latest round of scores from the National Assessment of Educational Progress were released in October — and the news was harsh. Math scores saw the largest drops in the history of the exam, while reading performance also fell in a majority of states. National Center for Education Statistics Commissioner Peggy Carr said the “decline that we’re seeing in the math data is stark. It is troubling. It is significant.” Even as some state-level data has shown evidence of a rebound this year, federal officials warned COVID-19’s lost learning won’t be easily restored. The 74’s Kevin Mahnken breaks down the results.

Related:

  • Lost Decades: ‘Nation’s Report Card’ shows 20 years of growth wiped out by two years of pandemic
  • Economic Toll: Damage from NAEP math losses could total nearly $1 trillion
  • COVID Recovery: Can districts rise to the challenge of new NAEP results? Outlook’s not so good 

Virtual Nightmare: One Student’s Journey Through the Pandemic

Mental Health: As the debate over the lingering effects of school closures continues, the term “pandemic recovery” can often lose its meaning. For Jason Finuliar, a California teen whose Bay Area school district was among those shuttered the longest, the journey has been painful and slow. Once a happy, high-achieving student, he descended into academic failure and a depression so severe that he spent 10 days in a residential mental health facility. “I felt so worthless,” he said. It’s taking compassionate counselors, professional help and parents determined to save their son for Jason to regain hope for the future. Linda Jacobson reports. 


16 Under 16: Meet The 74’s 2022 Class of STEM Achievers

This spring, we asked for the country’s help identifying some of the most impressive students, age 16 or younger, who have shown extraordinary achievement in the fields of science, technology, engineering and mathematics. After an extensive and comprehensive selection process, we’re thrilled to introduce this year’s class of 16 Under 16 in STEM. The honorees range in age from 12 to 16, specialize in fields from medicine to agriculture to invention and represent the country from coast to coast. We hope these incredible youngsters can inspire others — and offer reassurance that our future can be in pretty good hands. Emmeline Zhao offers a closeup of the 2022 class of 16 Under 16 in STEM — click here to read and watch more about them.


A ‘National Teacher Shortage’? New Research Reveals Vastly Different Realities Between States & Regions

School Staffing: Adding to efforts to understand America’s teacher shortages, a new report and website maps the K-12 teaching vacancy data. Nationally, an estimated 36,504 full-time teacher positions are unfilled, with shortages currently localized in nine states. “There are substantial vacant teacher positions in the United States. And for some states, this is much higher than for other states. … It’s just a question of how severe it is,” said author Tuan Nguyen. Marianna McMurdock reports on America’s uneven crisis


Meet the Gatekeepers of Students’ Private Lives

School Surveillance: Megan Waskiewicz used to sit at the top of the bleachers and hide her face behind the glow of a laptop monitor. While watching one of her five children play basketball on the court below, the Pittsburgh mother didn’t want other parents in the crowd to know she was also looking at child porn. Waskiewicz worked on contract as a content moderator for Gaggle, a surveillance company that monitors the online behaviors of some 5 million students across the U.S. on their school-issued Google and Microsoft accounts in an effort to prevent youth violence and self-harm. As a result, kids’ deepest secrets — like nude selfies and suicide notes — regularly flashed onto Waskiewicz’s screen. Waskiewicz and other former moderators at Gaggle believe the company helped protect kids, but they also surfaced significant questions about its efficacy, employment practices and effect on students’ civil rights. Eight former moderators shared their experiences at Gaggle with The 74, describing insufficient safeguards to protect students’ sensitive data, a work culture that prioritized speed over quality, scheduling issues that sent them scrambling to get hours and frequent exposure to explicit content that left some traumatized. Read the latest investigation by The 74’s Mark Keierleber


Students Continue to Flee Urban Districts as Boom Towns, Virtual Schools Thrive

Exclusive Data: A year after the nation’s schools experienced a historic decline in enrollment, data shows many urban districts are still losing students, and those that rebounded this year typically haven’t returned to pre-pandemic levels. Of 40 states and the District of Columbia, few have seen more than a 1% increase compared with 2020-21, when some states experienced declines as high as 5%, according to data from Burbio, a company that tracks COVID-related education trends. Flat enrollment this year “means those kids did not come back,” said Thomas Dee, an education professor at Stanford University. While many urban districts were already losing students before the pandemic, COVID “accelerated” movement into outlying areas and to states with stronger job markets. Experts say that means many districts will have to make some tough decisions in the coming years. Linda Jacobson reports


‘Hybrid’ Homeschooling Making Inroads as Families Seek New Models

School Choice: As public school enrollments dip to historic lows, researchers are beginning to track families’ moves to hybrid homeschooling arrangements that meet in person a few days per week and send students home for the rest of the time. More formal than learning pods or microschools, many still rely on parents for varying levels of instruction and grading. About 60% to 70% are private, according to a new research center on hybrid schools based at Kennesaw State University, northwest of Atlanta. Greg Toppo reports.


Student Safety: Thousands of times every year, New York City school staff report what they fear may be child abuse or neglect to a state hotline. But the vast majority of the resulting investigations yield no evidence of maltreatment while plunging the families, most of them Black, Hispanic and low income, into fear and lasting trauma. Teachers are at the heart of the problem: From August 2019 to January 2022, two-thirds of their allegations were false alarms, data obtained by The 74 show. “Teachers, out of fear that they’re going to get in trouble, will report even if they’re just like, ‘Well, it could be abuse.’ … It also could be 10 million other things,” one Bronx teacher said.


Law enforcement work the scene after a mass shooting at Robb Elementary School May 24, 2022 in Uvalde, Texas. The massacre was one of 16 mass shootings in the U.S. in 10 days. (Jordan Vonderhaar/Getty Images)

The Contagion Effect: From Buffalo to Uvalde, 16 Mass Shootings in Just 10 Days

Gun Violence: May’s mass school shooting in Texas — the deadliest campus attack in about a decade — has refocused attention on the frequency of such devastating violence in America. The tragedy unfolded just 10 days after a mass shooting at a supermarket in Buffalo, New York. It could be more than a coincidence: A growing body of research suggests these assaults have a tendency to spread like a viral disease. In fact, the U.S. experienced 16 mass shootings with at least four victims in just 10 days. Read Mark Keierleber’s report


Teachers Leaving Jobs During Pandemic Find ‘Fertile’ Ground in New School Models

Microschools: Feeling that she could no longer effectively meet children’s needs in a traditional school, former counselor Heather Long is among those who left district jobs this year to teach in an alternative model — a microschool based in her New Hampshire home. “For the first time in their lives, they have options,” Jennifer Carolan of Reach Capital, an investment firm supporting online programs and ed tech ventures, told reporter Linda Jacobson. Some experts wonder if microschools are sustainable, but others say the ground is “fertile.” Read our full report



Facing Pandemic Learning Crisis, Districts Spend Relief Funds at a Snail’s Pace

School Funding: Schools that were closed the longest due to COVID have spent just a fraction of the billions in federal relief funds targeted to students who suffered the most academically, according to an analysis by The 74. The delay is significant, experts say, because research points to a direct correlation between the closures and lost learning. Of the 25 largest districts, the 12 that were in remote learning for at least half the 2020-21 school year have spent on average roughly 15% of their American Rescue Plan funds — and districts are increasing pressure on the Education Department for more time. Linda Jacobson reports.


Slave Money Paved the Streets. Now, This Posh Rhode Island City Strives to Teach Its Past 

Teaching History: Every year, millions of tourists marvel at Newport, Rhode Island’s colonial architecture, savor lobster rolls on the wharf and gaze at waters that — many don’t realize — launched more slave trading voyages than anywhere else in North America. But after years of invisibility, that obscured chapter is becoming better known, partly because the Ocean State passed a law in 2021 requiring schools to teach Rhode Island’s “African Heritage History.” Amid recent headlines that the state’s capital city is now moving forward with a $10 million reparations program, read Asher Lehrer-Small’s examination of how Newport is looking to empower schools to confront the city’s difficult past. 


Harvard Economist Thomas Kane on Learning Loss, and Why Many Schools Aren’t Prepared to Combat It 

74 Interview: This spring, Harvard economist Thomas Kane co-authored one of the biggest — and most pessimistic — studies yet of COVID learning loss, revealing that school closures massively set back achievement for low-income students. The effects appear so large that, by his estimates, many schools will need to spend 100% of their COVID relief to counteract them. Perversely, though, many in the education world don’t realize that yet. “Once that sinks in,” he said, “I think people will realize that more aggressive action is necessary.” Read Kevin Mahnken’s full interview


In White, Wealthy Douglas County, Colorado, a Conservative School Board Majority Fires the Superintendent, and Fierce Backlash Ensues

Politics: The 2021 election of four conservative members to Colorado’s Douglas County school board led to the firing in February of schools Superintendent Corey Wise, who had served the district in various capacities for 26 years. The decision, which came at a meeting where public comment was barred, swiftly mobilized teachers, students and community members in opposition. Wise’s ouster came one day after a 1,500-employee sickout forced the shutdown of the state’s third-largest school district. A few days later, students walked out of school en masse, followed by litigation and talk of a school board recall effort. The battle mirrors those being fought in numerous districts throughout the country, with conservative parents, newly organized during the pandemic, championing one agenda and more moderate and liberal parent groups beginning to rise up to counter those views. Jo Napolitano reports.


Weaving Stronger School Communities: Nebraska’s Teacher of the Year Challenges Her Rural Community to Wrestle With the World 

Inspiring: Residents of tiny Taylor, Nebraska, call Megan Helberg a “returner” — one of the few kids to grow up in the town of 190 residents, leave to attend college in the big city and then return as an adult to rejoin this rural community in the Sandhills. Honored as the state’s 2020 Teacher of the Year, Helberg says she sees her role as going well beyond classroom lessons and academics. She teaches her students to value their deep roots in this close-knit circle. She advocates on behalf of her school — the same school she attended as a child — which is always threatened with closure due to small class sizes. She has also launched travel clubs through her schools, which Helberg says have strengthened her community by breaking students, parents and other community members out of their comfort zones and helping them gain a better view of the world outside Nebraska while also seeing their friends and neighbors in a whole new light. This past winter, as part of a broader two-month series on educators weaving community, a team from The 74 made multiple visits to Taylor to meet Helberg and see her in action with her students. Watch the full documentary by Jim Fields, and read our full story about Helberg’s background and inspiration by Laura Fay

Other profiles from this year’s Weaver series: 

  • Texas’s Alejandro Salazar: The band teacher who kept his school community connected through COVID’s chaos
  • Hawaii’s Heidi Maxie: How an island teacher builds community bridges through her Hawaii school
  • Georgia’s Allie Reeser: Living and learning among refugees in the ‘Ellis Island of the South’
  • Meet 12 educators strengthening school communities amid the pandemic

Research: Babies Born During COVID Talk Less with Caregivers, Slower to Develop Critical Language Skills

Big Picture: Independent studies by Brown University and a national nonprofit focused on early language development found infants born during the pandemic produced significantly fewer vocalizations and had less verbal back-and-forth with their caretakers compared with those born before COVID. Both used the nonprofit LENA’s “talk pedometer” technology, which delivers detailed information on what children hear throughout the day, including the number of words spoken near the child and the child’s own language-related vocalizations. It also counts child-adult interactions, called “conversational turns,” which are critical to language acquisition. The joint finding is the latest troubling evidence of developmental delays discovered when comparing babies born before and after COVID. “I’m worried about how we set things up going forward such that our early childhood teachers and early childhood interventionalists are prepared for what is potentially a set of children who maybe aren’t performing as we expect them to,” Brown’s Sean Deoni tells The 74’s Jo Napolitano. Read our full report


Minneapolis Teacher Strike Lasted 3 Weeks. The Fallout Will Be Felt for Years

Two days after Minneapolis teachers ended their first strike in 50 years this past May, Superintendent Ed Graff walked out of a school board meeting, ostensibly because a student protester had used profanity. The next morning, he resigned. The swearing might have been the last straw, but the kit-bag of problems left unresolved by the district’s agreement with the striking unions is backbreaking indeed. Four-fifths of the district’s federal pandemic aid is now committed to staving off layoffs and giving classroom assistants and teachers bonuses and raises, leaving little for academic recovery at a moment when the percentage of disadvantaged students performing at grade level has dipped into the single digits. From potential school closures and misinformation about how much money the district actually has to layoffs of Black teachers, a lack of diversity in the workforce and how to make up for lost instructional time, Beth Hawkins reports on the aftermath.


Mississippi Superintendent of Schools Carey Wright will retire this month after nearly nine years in office. (Mississippi Department of Education)

After Steering Mississippi’s Unlikely Learning Miracle, Carey Wright Steps Down

Profile: Mississippi, one of America’s poorest and least educated states, emerged in 2019 as a fast-rising exemplar in math and reading growth. The transformation of the state’s long-derided school system came about through intense work — in the classroom and the statehouse — to raise learning standards, overhaul reading instruction and reinvent professional development. And with longtime State Superintendent Carey Wright retiring at the end of June, The 74’s Kevin Mahnken looked at what comes next.


As Schools Push for More Tutoring, New Research Points to Its Effectiveness — and the Challenge of Scaling it to Combat Learning Loss

Learning Acceleration: In the two years that COVID-19 has upended schooling for millions of families, experts and education leaders have increasingly touted one tool as a means for coping with learning loss: personalized tutors. In February, just days after the secretary of education declared that every struggling student should receive 90 minutes of tutoring each week, a newly released study offers more evidence of the strategy’s potential — and perhaps its limitations. An online tutoring pilot launched last spring did yield modest, if positive, learning benefits for the hundreds of middle schoolers who participated. But those gains were considerably smaller than the impressive results from some previous studies, perhaps because of the project’s design: It relied on lightly trained volunteers, rather than professional educators, and held its sessions online instead of in person. “There is a tradeoff in navigating the current climate where what is possible might not be scalable,” the study’s co-author, Matthew Kraft, told The 74’s Kevin Mahnken. “So instead of just saying, ‘Come hell or high water, I’m going to build a huge tutoring program,’ we might be better off starting off with a small program and building it over time.” Read our full report


STEM: Robert Sansone was born to invent. His STEM creations range from springy leg extensions for sprinting to a go-kart that can reach speeds of 70 mph. But his latest project aims to solve a global problem: the unsustainability of electric car motors that use rare earth materials that are nonrenewable, expensive and pollute the environment during the mining and refining process. In Video Director James Field’s video profile, the Florida high schooler talks about his creation, inspiration and what he plans to do with his $75,000 prize from the 2022 Regeneron International Science and Engineering Fair. Watch our full portrait below: 

]]>
LA Parents Sound Off After Cyberattack Leaves Students Vulnerable /article/la-parents-sound-off-after-cyberattack-leaves-students-vulnerable/ Thu, 06 Oct 2022 19:07:40 +0000 /?post_type=article&p=697787 For Christie Pesicka, the Los Angeles Unified School District cyberattack hits home.

In 2014, Pesicka was one of thousands of Sony Pictures employees who had their private information exposed in the midst of aggressive attacks by a North Korean hacker group.

Now, as a mom, Pesicka worries about protecting her son Jackson, a 1st grade Playa Vista Elementary School student, so history doesn’t repeat itself.

“When you’re a kid, you won’t ever see a credit report and find out that there’s something on there until you go off to college,” Pesicka said in an interview. “By that time, somebody has had 15 years to rack up a bunch of different credit cards or properties or whatever else on your kid’s account…so that’s very concerning.”


Get stories like these delivered straight to your inbox. Sign up for The 74 Newsletter


Like Pesicka, LAUSD parents have raised concerns about the district’s response to the cyberattack, ranging from long-term data protection to how well a hotline — created to answer parent and staff questions — is working.

About 500 gigabytes of stolen district data was posted to the dark web Saturday by Vice Society, a Russian-speaking ransomware gang known to target school districts.

After the district and law enforcement analysts reviewed about two-thirds of the data, LAUSD Superintendent Alberto Carvalho assured students, parents and employees that there is no reason for widespread concern.

“The release was actually more limited than what we had originally anticipated,” Carvalho said at a Monday press conference, downplaying the damage done.

Carvalho said any exposed student data – including names, academic information and personal addresses – dated from between 2013 and 2016, insisting most middle and high school students from that period have already graduated.

For now, Carvalho confirmed students who did have their data breached will be contacted and offered credit monitoring services.

But many parents were not convinced the superintendent’s response was enough to ease their concerns about the cyberattack.

When Pesicka’s private information was exposed, Sony offered her one year of credit monitoring. But she found out years later she had a stolen identity and social security number.

“I had three people working under my social security number and I had my identity compromised,” Pesicka said in an interview. “Anybody who’s been through identity theft knows how difficult it is and how there’s not really a streamlined process or way to scrub your information.”

Teresa Gaines, the mom of 2nd and 3rd grade students at Grand View Boulevard Elementary School, was troubled by Carvalho’s response because it didn’t provide the urgency she was hoping for.

“Some people don’t realize how serious this can be because what if five or ten years from now our kids go to college and all of a sudden they get denied entrance because of something that is not their fault…or somebody uses that data to cause issues that prevent them from getting into certain programs or denied work,” Gaines said in an interview.

Gaines also said LAUSD should provide more targeted outreach to families through “town halls” and “informational webinars” so parents could ask questions about the cyberattack.

She is particularly concerned by the release of psychological assessments, which Carvalho insisted did not happen during his press conference. However, the Los Angeles Times reported finding such records among the leaked data.

For Jenna Schwartz, the mom of a 7th grade student in North Hollywood, Carvalho’s response left her cautiously optimistic.

“If I find out I was impacted…but it was just my child’s school photograph from 2013 and his attendance record, I don’t care as much,” Schwartz said in an interview. “If it was my social security number and bank information, those are two very different scenarios.”

Carvalho pointed parents to the district’s hotline, available Monday through Friday and this weekend for additional questions or support on the cyberattack.

But parents reported long wait times, limited hours and scant information when the hotline launched earlier this week.

“Unless you ask a question that fits into their script, they don’t really have a response,” Pesicka said in an interview. “And even if you do, you’re getting a very robotic response.”

In addition, Schwartz noted that she’s “not sure what good the hotline is at this point other than sort of just to make people feel better.”

After a request for comment, a spokesperson from LAUSD referred back to Carvalho’s statement on the cyberattack.

The hotline hours have been updated to weekdays from 8 a.m. to 8 p.m. and this weekend from 6 a.m. to 3:30 p.m.

]]>
LA District Downplays Student Harm After Cyber Gang Posts Sensitive Data Online /article/lausd-data-breach-los-angeles-hack-student-data/ Mon, 03 Oct 2022 21:57:31 +0000 /?post_type=article&p=697514 Updated, Oct. 4

The Vice Society ransomware gang reportedly published over the weekend a trove of sensitive student records from the Los Angeles school district. The data was posted to the gang’s dark-web “leak site,” after education leaders refused to pay — and at first even to acknowledge — a ransom demand. 

Yet in a press conference Monday, Superintendent Alberto Carvalho sought to downplay the damage done, particularly as it relates to records about children. An earlier news report said that student psychiatric evaluation records had been published online, citing a confidential law enforcement source. That reporting, Carvalho said, is “absolutely incorrect.”

“We have seen no evidence that psychiatric evaluation information or health records, based on what we’ve seen thus far, has been made available publicly,” said Carvalho, who acknowledged the hackers had “touched” the district’s massive student information system. The “vast majority” of exposed student data, including names, academic information and personal addresses, was from a period between 2013 and 2016. “That is the extent of the student information data that we have seen.”

Roughly 500 gigabytes of district data was made public on Sunday by the Russian-speaking ransomware gang, which took credit for stealing the district records in a massive data breach last month. The full scope of the information released is unclear, yet after reviewing about two-thirds of the data, Carvalho said that “so far, based on what we’ve seen,” critical health information and Social Security numbers for students are not included.

Carvalho confirmed on Sunday that LAUSD’s data had been published on the dark web, but did not verify the type of data that was leaked. On Monday, he said that information from private-sector contractors, particularly those in construction, appeared most impacted. Breached records include contracts, financial information and personally identifiable data, Carvalho said.

Cybersecurity experts have warned that the release of district data could come with significant risks for current and former students. Children’s Social Security numbers are particularly valuable to identity thieves because they can be used for years without raising alarm.

James Turgal, a former executive assistant director for the FBI Information and Technology Branch, said it’s particularly important for officials to protect the sensitive data of children, who may “find out they own a condo in Bora Bora under their name 15 years from now” because their information was exploited. 

Turgal, now the vice president of cyber risk and strategy at Optiv Security, praised the district’s decision to withhold payment.

“There’s no upside to ever paying a ransom,” said Turgal. “More likely than not, even if LAUSD would have paid the ransom, [Vice Society] still would have disclosed the information” on their leak site. 

Carvalho made it clear in several statements the district had no intentions of paying up, possibly prompting the criminals to publish the stolen data earlier than planned. Vice Society had taken credit for a massive data breach that caused widespread disruptions at America’s second-largest school district. 

“What I can tell you is that the demand — any demand — would be absurd,” Carvalho told the Los Angeles Times. “But this level of demand was, quite frankly, insulting. And we’re not about to enter into negotiations with that type of entity.” 

In a statement, the district acknowledged that paying a ransom wouldn’t ensure the recovery of data and asserted that “public dollars are better spent on our students rather than capitulating to a nefarious and illicit crime syndicate. We continue to make progress toward full operational stability for several core information technology services.” 

The district announced on Sunday a new hotline available to concerned parents and students seeking information about the breach. A district spokesperson declined to comment further. The district has also not revealed details of Vice Society’s demand.

In an email to The 74, Vice Society said they published the district data because “they didn’t pay,” and acknowledged the “ransom demand was big” without providing a specific figure. Asked what makes school districts attractive victims for such attacks, the group offered a brief explanation: “Maybe news? Don’t know … We just attack it =).”

Over the weekend, they said that they had demanded a ransom weeks earlier than district officials have publicly acknowledged. Asked about the size of the ransom, the group replied, “let’s say that it was big =).”

Since the breach was disclosed, district officials have been working with federal authorities at the FBI and the Cybersecurity and Infrastructure Security Agency, which the ransomware group says have “wasted our time.” The group said in an email to a news outlet that federal authorities were “wrong” to advise the district against paying. 

“We always delete documents and help to restore network [sic], we don’t talk about companies that paid us,” the group told the news outlet. “Now LAUSD has lost 500GB of files.”

The 74 has not reviewed the data published to the Vice Society leak site. Doug Levin, the national director of The K12 Security Information eXchange, said Monday he was unable to independently verify that the information posted to the leak site belonged to the district. But once the data was published online, he said, it’s impossible to rein it back in.

“You have to assume that it has been compromised by nefarious actors who have copied it down and the damage, therefore, is done,” Levin said. 

For example, while Vice Society likely posted most of the data it exfiltrated onto its leak site, the group may have held onto the most sensitive records, like Social Security numbers, to sell on a dark web marketplace, where such data is often purchased for identity theft.

Now that sensitive data has been disclosed, the district must formally notify victims that their information was compromised and provide advice on how best to protect themselves, Levin said. The district may find itself on the hook for as much as $100 million in medium-term recovery costs, Levin noted, to improve its cybersecurity infrastructure and work to prevent another attack in the future.

He said it’s important that affected educators, parents and students take steps to protect themselves. The district announced plans to provide credit monitoring services to victims, but Levin said that victims should consider freezing their credit. 

“The school district itself is likely going to be facing a crisis of confidence in its school community about its ability to keep data and their IT systems safe and secure,” Levin said. “Ultimately, they’re going to have to be able to answer the question of why they can be trusted to safeguard that personal information going forward.” 

Sign up for the School (in)Security newsletter.

Get the most critical news and information about students' rights, safety and well-being delivered straight to your inbox.

]]>
As Advocates and Parents Rally, Youth Online Privacy Bills on Life Support /article/as-advocates-and-parents-rally-youth-online-privacy-bills-on-life-support/ Wed, 14 Sep 2022 21:07:24 +0000 /?post_type=article&p=696557 Sen. Ed Markey was getting quizzed on the viability of new online privacy laws for children when he took a brief but awkward pause. 

The Democrat from Massachusetts, who has long championed consumer privacy and become a key adversary of tech companies like Meta for monetizing user data, joined a Zoom call Tuesday evening to rally support for two bills he said would protect kids from being manipulated by social media algorithms. But he also brought some bad news: The legislation had “stalled” in Washington despite bipartisan support. 

Advocates this week are making a push to get the bipartisan bills — the Kids Online Safety Act and the Children’s Online Privacy Protection Act 2.0 — across the finish line. In a letter on Monday, 145 groups including Fairplay and Common Sense Media urged lawmakers to pass the legislation in the interests of protecting youth mental health, now considered at an all-time low in this country. 




But Markey seemed to lay out a path requiring Herculean effort. 

“Only the paranoid survive,” Markey said, adding that the legislation would pass if its supporters — and youth activists in particular — called their lawmakers and demanded they “pull this out of the pile of issues” and give it priority. “We’re going to try to get it over the finish line, but we need you to just have your energy level go higher and higher for these final couple of months and we will get it done.”

The legislative push comes a year after a Facebook whistleblower disclosed research showing that the social media app Instagram had a harmful effect on youth mental well-being, especially for teenage girls. The whistleblower, Frances Haugen, urged lawmakers to regulate social media companies — Meta owns Facebook and Instagram — that she accused of pursuing “astronomical profits” while knowingly putting its users at risk. Her disclosures revealed the company knew Instagram made “body image issues worse for one in three teen girls,” who blamed the social media platform for driving “increases in the rate of anxiety and depression” and, for some, suicidal thoughts. 

The Kids Online Safety Act would make tech companies liable if they expose young people to content deemed harmful, including materials that promote self-harm, eating disorders and substance abuse. It would also require parental controls that could be used to block adult content, and calls for a study of systems to verify users’ age “at the device or operating system level.”

The Children’s Online Privacy Protection Act 2.0, which expands a law that Markey championed in 1998 to cover older teens, would ban targeted advertisements directed at children and require companies to offer an “eraser button” that allows children and teens to remove their personal data. 

Former Facebook employee Frances Haugen (Getty Images)

But deep-pocketed tech companies, Sen. Richard Blumenthal said Tuesday, are standing in the way. 

“Our obstacles here are the big tech lobbyists,” he said. “They have armies of lobbyists. They pay them, they pay them very well. They hire them to block this legislation.”

While the legislation is designed to protect kids, some digital privacy experts say the rules could come with significant unintended consequences — and could lead to an age-verification system where all web users are made to submit documentation like a driver’s license, requiring them to hand over personal information to tech companies. 

On the Zoom call to bolster support for the bills was Vinaya Sivakumar, a high school senior from Ohio, who created her first social media profile when she was 12. What started out as harmless, she said, quickly took a toll on her health. 

“It just snowballed into something that constantly perpetuated actions and thoughts like self-harm and eating disorders and it was really never let out of my sight,” said Sivakumar, referring to a stream of content she found harmful being fed to her by algorithms. “It almost encouraged me to make decisions that I didn’t necessarily feel were mine and my mental health was in the worst state ever.”

Kristin Bride, a mother and digital safety advocate from Oregon, implored lawmakers to pass the legislation for kids like her 16-year-old son Carson, who died by suicide in 2020 after he was “viciously bullied” by other kids on Snapchat who used third-party apps to conceal their identities. Last year, Bride sued Snap, the company that owns the social media app Snapchat, and accused it of lacking safeguards to protect children from harassment. In response, Snap suspended two of the apps, Yolo and LMK. But a similar anonymous messaging app, NGL, has since cropped up. 

“Until social media companies are held accountable for their harmful products, they will always put profit over people,” Bride said, “and kids like Carson and so many others are just collateral damage.” 

Despite the heightened focus in Washington around digital rights and tech companies’ use of user data for targeted advertising, broader digital privacy legislation has also struggled this year. The American Data Privacy and Protection Act, which would create a national digital privacy standard and limit the personal data that tech companies can collect about users, has hit roadblocks, including objections from House Speaker Nancy Pelosi.

Earlier this month, Ireland’s Data Protection Commission fined Instagram’s parent company, Meta, for violating European Union data privacy laws. The commission has been investigating the company for an Instagram setting that automatically sets the profiles of teenagers as public by default. 

Meanwhile, Meta has begun to roll out new teen safety settings, including one that automatically routes new users younger than 16 to a version with limits on content deemed inappropriate.

The children’s safety legislation, which would strengthen rules that haven’t been updated for decades, has received support from a broad range of groups focused on youth well-being, including the American Psychological Association and The Jed Foundation. It has drawn criticism, however, from digital rights advocates including the Electronic Frontier Foundation. In a report, the foundation argued that while lawmakers deserve credit “for attempting to improve online data privacy for young people,” the plan would ultimately “require surveillance and censorship” of children and teens “and would greatly endanger the rights, and safety, of young people online.” 

“Data collection is a scourge for every internet user, regardless of age,” the report notes, but the legislation could ultimately force tech companies to further track their users. Surveillance of young people, even in the healthiest household, “is not a solution to helping young people navigate the internet.”

Disclosure: Campbell Brown oversees global media partnerships at Meta. Brown co-founded The 74 and sits on its board of directors.

]]>
With ‘Don’t Say Gay’ Laws & Abortion Bans, Student Surveillance Raises New Risks /article/with-dont-say-gay-laws-abortion-bans-student-surveillance-raises-new-risks/ Thu, 08 Sep 2022 10:30:00 +0000 /?post_type=article&p=696150 While growing up along the Gulf Coast in Mississippi, Kenyatta Thomas relied on the internet and other teenagers to learn about sex.

Thomas and their peers watched videos during high school gym class that stressed the importance of abstinence — and the horrors that can come from sex before marriage. But for Thomas, who is bisexual and nonbinary, the lessons didn’t explain who they were as a person. 

“It was very confusing trying to navigate understanding who I am and my identity,” said Thomas, now a student at Arizona State University. It was on the internet that Thomas learned about a whole community of young people with similar experiences. Blog posts on Tumblr helped them make sense of their place in the world and what it meant to be bisexual. “I was able to find the words to understand who I am — words that I wouldn’t be able to piece together in a sentence if the internet wasn’t there.” 




But now, as states adopt anti-LGBTQ laws and abortion bans, the digital footprint that Thomas and other students leave may come back to harm them, privacy and civil rights advocates warn, and it could be their school-issued devices that end up exposing them to that legal peril.

For years, schools across the U.S. have used digital surveillance tools that collect a trove of information about youth sexuality — intimate details that are gleaned from students’ conversations with friends, diary entries and search histories. Meanwhile, student information collected by student surveillance companies is regularly shared with police, according to a recent survey conducted by the nonprofit Center for Democracy and Technology. These two realities are concerning to Elizabeth Laird, the center’s director of equity in civic technology. Following the Supreme Court’s overturning of Roe v. Wade in June, she said information about youth sexuality could be weaponized. 

“Right now — without doing anything — schools may be getting alerts about students who are searching the internet for resources related to reproductive health,” Laird said. “If you are in a state that has a law that criminalizes abortion, right now this tool could be used to enforce those laws.”

Teens across the country are already working to fill the void for themselves and their peers in the current climate. Thomas, the ASU student and an outspoken reproductive justice activist, said that while students are generally aware that school devices and accounts are monitored, the overturning of Roe has led some to take extra privacy precautions. 

Kenyatta Thomas, an Arizona State University student and activist, participates in an abortion-rights protest. (Photo courtesy Kenyatta Thomas)

“I have switched to using Signal to talk to friends and colleagues in this space,” they said, referring to the encrypted messaging app. “The fear, even though it’s been common knowledge for basically my generation’s entire life that everything you do is being surveilled, it definitely has been amplified tenfold.”

Police have long used social media and other online platforms to investigate people for breaking abortion rules, including a Nebraska case where police obtained a teen’s private Facebook messages through a search warrant before charging the then-17-year-old and her mother with violating the state’s ban on abortions after 20 weeks of pregnancy. 

LGBTQ students face similar risks as lawmakers in Florida and elsewhere impose rules that prohibit classroom discussions about sexuality and gender. This year alone, lawmakers have proposed 300 anti-LGBTQ bills and about a dozen have passed. They include so-called “Don’t Say Gay” laws in Florida and Alabama that ban classroom discussions about gender and sexuality and require school officials to tell the parents of children who share that they may be gay or transgender. 

In a survey, a fifth of LGBTQ students told the Center for Democracy and Technology that they or another student they knew had their sexual orientation or gender identity disclosed without their consent due to online student monitoring. They were more likely than straight and cisgender students to report getting into trouble for their web browsing activity and to be contacted by the police about having committed a crime. 

LGBTQ youth are nearly twice as likely as their straight and cisgender classmates to search for health information online, according to the nonprofit LGBT Tech. But as anti-LGBTQ laws proliferate, the companies behind student surveillance tools should reconsider collecting data about youth sexuality, Christopher Wood, the group’s co-founder and executive director, told The 74. 

“Right now, we are not in a landscape or an environment where that is safe for a company to be doing,” Wood said. “If there is a remote possibility that the information that they are trying to provide to help a student could potentially lead them into more harm, then they need to be looking at that very carefully and considering whether that is the appropriate direction for a company to be taking.”

Digital student monitoring tools have a negative disparate impact on LGBTQ youth, according to a recent student survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

‘Extraordinarily concerned’

For decades, the federal Children’s Internet Protection Act has required school technology to block access to images that are obscene, child pornography or deemed “harmful to minors,” and schools have used web-filtering software to prevent students from accessing sexually explicit content. But in some cases, the filters have been used to block pro-LGBTQ websites that aren’t explicit, including those that offer crisis counseling. 

Many student monitoring tools, which saw significant growth during the pandemic, go far beyond web filtering and employ artificial intelligence to track students across the web to identify issues like depression and violent impulses. The tools can sift through students’ social media posts, follow their digital movements in real time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs. 

They’ve also come under heightened scrutiny. In a report this year, Democratic Sens. Elizabeth Warren and Ed Markey warned that schools’ widespread adoption of the tools could trample students’ civil rights. By flagging words related to sexual orientation, the report notes, LGBTQ youth could be subjected to disproportionate disciplinary rates and be unintentionally outed to their parents. 

In a letter in July, Warren and Markey cautioned that the tools could pose new risks following the repeal of Roe and asked four leading student surveillance companies — GoGuardian, Gaggle, Securly and Bark — whether they flag students for using keywords related to reproductive health, such as “pregnant” and “abortion.”

“We are extraordinarily concerned that your software could result in punishment or criminalization of students seeking contraception, abortion or other reproductive health care,” Markey and Warren wrote. “With reproductive rights under attack nationwide, it would represent a betrayal of your company’s mission to support students if you fail to provide appropriate protections for students’ privacy related to reproductive health information.”

Student activity monitoring tools are more often used to discipline students than protect them from violence and mental health crises, according to a recent teacher survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

The scrutiny is part of a larger concern over digital privacy in the post-Roe world. In August, the Federal Trade Commission sued a data broker, accusing the company of selling location data from hundreds of millions of cell phones that could be used to track people’s movements. Such precise location data, the agency argued, “may be used to track consumers to sensitive locations, including places of religious worship, places that may be used to infer an LGBTQ+ identification, domestic abuse shelters, medical facilities and welfare and homeless shelters.” 

School surveillance companies have acknowledged their tools track student references to sex but sought to downplay the risks they pose to students. Bark spokesperson Adina Kalish said the company began to immediately purge all data related to reproductive health after a leaked Supreme Court draft opinion suggested Roe’s repeal was imminent – despite maintaining a 30-day retention period for most other data. 

“By immediately and permanently deleting data which contains a student’s reproductive health data or searches for reproductive health information, such data is not in our possession and therefore not produce-able under a court order, subpoena, etc.,” Bark CEO Brian Bason wrote in a letter, which the company shared with The 74. 

GoGuardian spokesperson Jeff Gordon said its tools “cannot be used by educators or schools to flag reproductive health-related search terms” and its web filter cannot “flag reproductive health-related searches.” Securly didn’t respond to requests for comment. Last year its web-filtering tool categorized health resources for LGBTQ teens as pornography. 

Gaggle founder and CEO Jeff Patterson wrote to the senators that his company does not “collect health data of any kind including reproductive health information,” specifying that the monitoring tool does not flag students who use the terms “pregnant, abortion, birth control, contraception or Planned Parenthood.” 

Yet tracking conversations about sex is a primary part of Gaggle’s business — sexual content accounts for more of its flags than references to suicide, violence or drug use, according to nearly 1,300 incident reports generated by the company for Minneapolis Public Schools during a six-month period in 2020. The reports, obtained by The 74, showed that 38% were prompted by content that was pornographic or sexual in nature, including references to “sexual activity involving a student.” Students were regularly flagged for using keywords like “virginity,” “rape” and, simply, “sex.” 
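The context-blind pattern those reports describe, in which a document trips an alert simply because it contains a listed word, can be sketched as a plain word-list match. This is a toy illustration only, not any vendor's actual algorithm, which is proprietary and may rely on machine learning rather than a fixed list:

```python
import re

# Hypothetical watchlist drawn from the keywords cited in the incident reports.
WATCHLIST = {"sex", "rape", "virginity"}

def flag(text: str) -> set[str]:
    # Split the text into lowercase words; no surrounding context is considered.
    words = set(re.findall(r"[a-z']+", text.lower()))
    return words & WATCHLIST

# A benign health assignment still trips the filter:
assert flag("Our health class covered sex education today.") == {"sex"}
assert flag("My math homework is due Friday.") == set()
```

The first assertion shows why false positives are endemic to this approach: a bare word match cannot distinguish a classroom assignment from the content the filter is meant to catch.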

Patterson, the Gaggle CEO, has acknowledged that even a student’s private diary entry about being raped wasn’t off limits. In touting the tool’s capabilities, he told The 74 his company uncovered a girl’s diary entry, in which she discussed how the assault led to self-esteem issues and guilt. Nobody knew she was struggling until Gaggle notified school officials about what it had learned from her diary, Patterson said. 

“They were able to intervene and get this girl help for things that she couldn’t have dealt with on her own,” Patterson said.

Any information that surveillance companies collect about students’ sexual behaviors could be used against them by police during investigations, privacy experts warned. And it’s unclear, Laird said, how long the police can retain any data gleaned from the tools. 

‘Don’t Say Gay’

Internet search engines are “particularly potent” tools to track the behaviors of pregnant people, according to a report by the nonprofit Surveillance Technology Oversight Project. In 2017, for example, a woman was charged with second-degree murder of her stillborn fetus after police scoured her browser history and identified a search for an abortion pill. 

While GoGuardian and other companies offer web filtering to schools, Gaggle has sought to differentiate itself. In his letter to the senators, Patterson said the company — which sifts through files and chat messages on students’ school-issued Microsoft and Google accounts — is not a web filter and therefore “does not track students’ online searches.” Yet Patterson’s assurance to lawmakers appears misleading. The company acknowledges on its website that it partners with several web-filtering companies, including Linewize, to analyze students’ online searches. When the tools work in tandem, flags triggered by Linewize’s web filtering “can be sent straight to the Gaggle Safety Team,” which determines whether the material “should be forwarded to the school or district.” 

In an email, Gaggle spokesperson Paget Hetherington said that in “a very small number of school systems,” the company reviews alerts from web filters before they’re sent to school officials to “alleviate the large number of false positives” and ensure that “only the most critical and imminent issues are being seen by the district.” 

Gaggle has also faced scrutiny for including LGBTQ-specific keywords in its algorithm, including “gay” and “lesbian.” Patterson said the heightened surveillance of LGBTQ youth is necessary because they face a disproportionately high suicide rate, and Hetherington shared examples where the keywords were used to spot cyberbullying incidents. 

But critics have accused the company of discrimination. Wood of the nonprofit LGBT Tech said that anti-LGBT activists have used surveillance to target their opponents for generations. Prior to the seminal 1969 riots that followed a New York City police raid on the Stonewall Inn gay bar, officers routinely surveilled LGBTQ spaces and made arrests for “inferring sexual perversion” and “serving gay people.” From the colonial era and into the 19th century, anti-sodomy laws carried the death penalty, and police used the rules to investigate and incarcerate people suspected of same-sex intimate behaviors. 

Now, in the era of “Don’t Say Gay” laws, digital surveillance tools could be used to out LGBTQ students and put them in danger, Wood said. Student surveillance companies can claim their decision to include LGBTQ terminology is designed to help students, but historically such data have “been used against us in very detrimental ways.” 

Companies, he said, are unable to control how officials use that information in an era “where teachers and administrators and other students are encouraged to out other students or blame them or somehow get them in trouble for their identity.” In Texas, Republican Gov. Greg Abbott has called on child protective services to investigate as child abuse any parents who provide gender-affirming health care to their transgender children. 

“They can’t control what’s going to happen in Florida or Texas and they can’t control what’s going to happen in an individual home,” where students could be subjected to abuse, Wood said. “Any person in their right mind would be horrified to learn that it was their technology that ended up harming a youth or driving a youth to the point of feeling so isolated that they felt the only way out was suicide.” 

When private thoughts become public

Susan, a 14-year-old from Cincinnati, knows firsthand how surveillance companies can target students for discussing their sexuality. In middle school, she was assigned to write a “time capsule” letter to her future self. 

Her teacher said that no one — not even him — would read the letter until Susan retrieved it after high school graduation. So Susan, who is now a freshman and asked to remain anonymous, used the private space to question her gender identity. 

But her teacher’s assurance wasn’t quite true, she learned. Someone had been reading the letter — and would soon hold it against her. 

In an automated May 2021 email, Gaggle notified her that the letter to her future self was “identified as inappropriate” and urged her to “refrain from storing or sharing inappropriate content.” In a “second warning,” sent to her inbox, she was told a school administrator was given “access to this violation.” After a third alert, she said, access to her school email account was restricted. She said the experience left her with “a sense of betrayal from my school.” She said she had no idea words like “gay” or “sex” could get flagged by Gaggle’s algorithm.

Susan, a student from Cincinnati, received an email alert from Gaggle notifying her that her classroom assignment, a “time capsule” letter to her future self, had been “identified as inappropriate.” (Courtesy Susan)

“It’s frustrating to know that this program finds the need to have these as keywords, and quite depressing,” she said. “There’s always going to be oppression against the community somewhere, it seems, and it’s quite disheartening.” 

School administrators reviewed the time capsule letter and determined it didn’t contain anything inappropriate, her mother Margaret said. While Susan lives in an LGBTQ-affirming household, Thomas, who grew up in Mississippi, warned that’s not the case for everyone.

“That’s not just the surveillance of your activities, that’s the surveillance of your thoughts,” Thomas said of Susan’s experience. “I know that wouldn’t have gone very well for me and I know for a lot of young people that would place them in a lot of danger.”

Such harms could be exacerbated, Margaret said, if authorities use student data to enforce Ohio’s strict abortion ban, which has already become the subject of national debate after a 10-year-old girl traveled to Indiana for an abortion. A 27-year-old man was arrested and accused of raping the child. 

Cincinnati Public Schools spokesman Mark Sherwood said in an email that “law enforcement is immediately contacted” if the district receives an alert from Gaggle suggesting that a student poses “an imminent threat of harm to self or others.” 

Given the state of abortion rules in Ohio, Susan said she’s concerned that student conversations and classroom assignments that discuss gender and sexuality could wind up in the hands of the police. She lost faith in school-issued technology after her assignment got flagged by Gaggle. 

“I just flat out don’t trust adults in positions of power or authority,” Susan said. “You don’t really know for sure what their true motives are or what they could be doing with the tools they have at their disposal.”

]]>
Illuminate Ed Pulled from ‘Student Privacy Pledge’ After Massive Data Breach /article/illuminate-ed-pulled-from-student-privacy-pledge-after-massive-data-breach/ Mon, 08 Aug 2022 18:01:00 +0000 /?post_type=article&p=694391 Updated

Embattled education technology vendor Illuminate Education has become the first-ever company to get booted from the Student Privacy Pledge, an unprecedented move that follows a massive data breach affecting millions of students and allegations the company misrepresented its security safeguards. 

The Future of Privacy Forum, which created the self-regulatory effort nearly a decade ago to promote ethical student data practices by education technology companies, announced on Monday it had stripped Illuminate of its pledge signatory designation and referred the company to the Federal Trade Commission and state attorneys general in New York and California, where the biggest breaches occurred, to “consider further appropriate action,” including sanctions. 

“Publicly available information appears to confirm that Illuminate Education did not encrypt all student information while” it was being stored or transferred from one system to another, forum CEO Jules Polonetsky said in a statement. He said the decision to de-list Illuminate came after a review including “direct outreach” to the company, which “would not state” that such privacy practices had been in place.


 “Such a failure to encrypt would violate several pledge provisions,” Polonetsky said, including a commitment to “maintain a comprehensive security program” to protect students’ sensitive information and to “comply with applicable laws,” including an “explicit data encryption requirement” in New York.

Encryption is the cybersecurity practice of scrambling readable data into an unusable format to prevent bad actors from understanding it without a key. The company reportedly used Amazon Web Services to store student data on accounts that were easy to identify. 
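The failure at issue concerns encryption at rest. As a minimal sketch of the underlying idea, the toy below scrambles a record with a secret key so the stored bytes are unreadable without it. This is illustrative only; a repeating-key XOR is not secure, and real systems use vetted ciphers such as AES through an audited library:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the repeating key; applying it twice restores the data.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

record = b"student_id=4821;lunch_status=free"  # hypothetical stored record
key = secrets.token_bytes(32)                  # secret key, held separately

ciphertext = xor_bytes(record, key)            # what should land on disk
assert ciphertext != record                    # unreadable without the key
assert xor_bytes(ciphertext, key) == record    # the key holder recovers it
```

When only the ciphertext is stored, a leaked account exposes scrambled bytes rather than student records, which is the kind of safeguard at stake here.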

Through the voluntary pledge, signatory companies have agreed to a set of commitments to protect students’ online privacy. Though the privacy forum maintains that the pledge is legally binding and can be enforced by federal and state regulators, the move against Illuminate marks a dramatic shift in enforcement. The extent of the Illuminate breach remains unclear, but it encompasses districts in six states and an unknown number of students. 

Illuminate Education CEO Christine Willig (Illuminate Education)

Illuminate Education spokesperson Jane Snyder said the company is disappointed in the privacy forum’s decision, but it “will not detract from our commitment to safeguard the privacy of all student data in our care.” The privately held company, founded in 2009, claims some 5,000 schools serving 17 million students use its tools.

“We will continue to monitor and enhance the security of our systems, and we will continue to work with students and school districts to resolve any concerns related to this matter while prioritizing the privacy and protection of the data we maintain,” Snyder said in a statement.

In a recent article in The 74, student privacy experts criticized the Big Tech-funded privacy forum for failing to sanction companies that break the agreement terms. 

The action taken against Illuminate comes just three months after the Federal Trade Commission announced efforts to ramp up enforcement of federal student privacy protections, including against companies that sell student data for targeted advertising and that lack reasonable systems “to maintain the confidentiality, security and integrity of children’s personal information.” 

The privacy forum maintains that the Federal Trade Commission and state attorneys general can hold companies accountable to their pledge commitments via consumer protection rules that prohibit unfair and deceptive business practices, but such action has never been taken. Education companies have long used the pledge as a marketing tool and the privacy forum has touted it as an assurance to schools as they shop for new technology. 

Signs of a data breach at California-based Illuminate first emerged in January when several of its popular digital tools, including programs used in New York City to track students’ grades and attendance, went dark. City officials announced in March that the personal data of some 820,000 current and former students had been compromised. Outside New York City, home to America’s largest school district, state officials said the breach affected an additional 174,000 students across the state. Student information in Los Angeles, the country’s second-largest school district, was also breached. 

Compromised data includes information about students’ eligibility for special education services and free or reduced-price lunch, their names, demographic information, immigration status and disciplinary records. 

New York City officials have accused Illuminate of misrepresenting its security safeguards and instructed educators to stop using its tools. New York State Education Department officials are investigating whether the company’s security practices run afoul of state law, which requires education vendors to maintain “reasonable” data security safeguards and to notify schools about data breaches “in the most expedient way possible and without unreasonable delay.” 

School districts in California, Colorado, Connecticut, Oklahoma and Washington have since notified students that their personal information was compromised in the breach. Illuminate Education has never said how many people were affected by the lapse, while maintaining that it has “no evidence that any information was subject to actual or attempted misuse.” 

CEO of the Future of Privacy Forum Jules Polonetsky (Future of Privacy Forum)

“FPF believes that the privacy and security of students’ information is essential,” Polonetsky said in the statement, declining to comment further. “To help ed tech companies better protect student data, we will be providing training for Pledge signatories, with a specific focus on data governance and security.”

For years, critics have accused the pledge of providing educators and parents with a false affirmation about the safety of education technology while being a tech-funded effort to thwart meaningful government regulation. 

The privacy forum’s decision to yank Illuminate doesn’t suggest stronger pledge enforcement going forward, said Doug Levin, the national director of The K12 Security Information eXchange. Rather, he accused the privacy forum of acting more in response to media coverage than a desire to hold companies to their promises.

“The only time that the Future of Privacy Forum has considered de-listing an organization is when the practices of a company have come under the attention of national media,” he said, adding that the press is an insufficient tool to hold tech companies accountable. “I think this is a case where [the privacy forum] was looking at collateral reputational damage and damage to the pledge and they had to act to protect their own self-interests and the interests of other pledge members. I do not read it as a signal that enforcement of the pledge will be enhanced going forward.”

Meanwhile, Levin sees Illuminate’s unwillingness to discuss its security practices with the privacy forum as another reason to believe the company acted negligently.

Illuminate is “clearly in legal jeopardy and I think they are concerned about making statements that could be used in a legal context to hold them accountable,” Levin said.

Still, the privacy forum’s decision to remove Illuminate raises the stakes from its previous enforcement efforts, most notably against the College Board, a nonprofit that administers the widely used SAT college admissions exam. In 2018, the privacy forum suspended the nonprofit’s signatory status after an investigation found it was selling student data to third parties. The College Board was reinstated as an active pledge signatory a year later. It remains one today, despite a 2020 investigation by Consumer Reports that uncovered it was sending student data to major digital advertising platforms.

While some have argued that the College Board should have been removed from the pledge, the privacy forum has previously resisted efforts to de-list signatories. When the group learns about complaints against pledge signatories, it typically works with companies to resolve issues and ensure compliance, according to a blog post from the organization. 

Removing companies from the pledge, the post argued, “could result in fewer privacy protections for users, as a former signatory would not be bound by the Pledge’s promises for future activities.”

Disclosure: The Bill & Melinda Gates Foundation and the Chan Zuckerberg Initiative provide financial support to the Future of Privacy Forum and The 74.

]]>
Survey Reveals Extent that Cops Surveil Students Online — in School and at Home /article/survey-reveals-extent-that-cops-surveil-students-online-in-school-and-at-home/ Wed, 03 Aug 2022 04:01:00 +0000 /?post_type=article&p=694119 When Baltimore students sign into their school-issued laptops, the police log on, too. 

Since the pandemic began, Baltimore City Public Schools officials have partnered with GoGuardian, a digital surveillance tool that promises to identify youth at risk of harming themselves or others. When GoGuardian flags students, their online activities are shared automatically with school police, giving cops a conduit into kids’ private lives — including on nights and weekends.


Such partnerships between schools and police appear startlingly widespread across the country, with significant implications for youth, according to a new survey by the nonprofit Center for Democracy and Technology. Nearly all teachers — 89% — reported that digital student monitoring tools like GoGuardian are used in their schools. And nearly half — 44% — said students have been contacted by the police as a result of student monitoring. 

The pandemic has led to major growth in the number of schools that rely on activity monitoring software to uncover student references to depression and violent impulses. The tools, offered by a handful of tech companies, can sift through students’ social media posts, follow their digital movements in real time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs. 

Educators say the tools help them identify youth who are struggling and get them the mental health care they need at a time when youth depression and anxiety are spiraling. But the survey suggests an alternate reality: Instead of getting help, many students are being punished for breaking school rules. And in some cases, survey results suggest, students are being subjected to discrimination. 

The report raises serious questions about whether digital surveillance tools are the best way to identify youth in need of mental health care and whether police officers should be on the front lines in responding to such emergencies. 

“If we’re saying this is to keep students safe, but instead we’re using it punitively and we’re using it to invite law enforcement literally into kids’ homes, is this actually achieving its intended goal?” asked Elizabeth Laird, a survey author and the center’s director of equity in civic technology. “Or are we, in the name of keeping students safe, actually endangering them?”

Among teachers who use monitoring tools at their schools, 78% said the software has been used to flag students for discipline and 59% said kids wound up getting punished as a result. Yet just 45% of teachers said the software is used to identify violent threats and 47% said it is used to identify students at risk of harming themselves. 

Center for Democracy and Technology

The findings are a direct contradiction of the stated goal of student activity monitoring, Laird said. School leaders and company executives have long maintained that the tools are not a disciplinary measure but are designed to identify at-risk students before someone gets hurt.

The Supreme Court’s recent repeal of Roe v. Wade, she said, further muddles police officers’ role in student activity monitoring. As states implement anti-abortion laws, she warned, data from student activity monitoring tools could help the police identify youth seeking reproductive health care. 

“We know that law enforcement gets these alerts,” she said. “If you are in a state where they are looking to investigate these kinds of incidents, you’ve invited them into a student’s house to be able to do that.”

A tale of discrimination

In Baltimore, counselors, principals and school-based police officers receive all alerts generated by GoGuardian during school hours, according to reporting by The Real News Network, a nonprofit media outlet. Outside of school hours, including on weekends and holidays, the responsibility to monitor alerts falls on the police, the outlet reported, and on numerous occasions officers have shown up at students’ homes to conduct wellness checks. In some cases, students have been transported to the hospital for emergency mental health care. 

In a statement to The 74, district spokesperson Andre Riley said that GoGuardian helps officials “identify potential risks to the safety of individual students, groups or schools,” and that “proper accountability measures are taken” if students violate the code of conduct or break laws.

“The use of GoGuardian is not simply a prompt for a law enforcement response,” Riley added.

Leading student surveillance companies, including GoGuardian, have maintained that their interactions with police are limited. In April, Democratic Sens. Elizabeth Warren and Ed Markey warned in a report that schools’ reliance on the tools could violate students’ civil rights and exacerbate “the school-to-prison pipeline by increasing law enforcement interactions with students.” Warren and Markey focused their report on four companies: GoGuardian, Gaggle, Securly and Bark. 

In their response to the senators, Gaggle executives said the company contacts law enforcement for wellness checks if they are unable to reach school-based emergency contacts and a child appears to be “in immediate danger.” In testimonials on the company’s website, school officials in Wichita Falls, Texas, Cincinnati, Ohio, and Miami, Florida, acknowledged contacting police in response to Gaggle alerts.

In some cases, school leaders ask Securly to contact the police directly and request they conduct welfare checks on students, the company told lawmakers. Executives at Bark said “there are limited options” beyond police intervention if they identify a student in crisis but they cannot reach a school administrator. 

“While we have witnessed many lives saved by police in these situations, unfortunately many officers have not received training in how to handle such crises,” the company wrote in its letter. “Irrespective of training there is always a risk that a visit from law enforcement can create other negative outcomes for a student and their family.” 

In its privacy policy, GoGuardian states the company may disclose student information “if we believe in good faith that doing so is necessary or appropriate to comply with any law enforcement, legal or regulatory process.” 

Center for Democracy and Technology

Meanwhile, survey results suggest that student surveillance tools have a negative disparate impact on Black and Hispanic students, LGBTQ youth and those from low-income households. In a letter sent Wednesday to coincide with the survey’s release, a coalition of education and civil rights groups called on the U.S. Department of Education to issue guidance warning schools that their digital surveillance practices could violate federal civil rights laws. Signatories include the American Library Association, the Data Quality Campaign and the American Civil Liberties Union.

“This is becoming a conversation not just about privacy, but about discrimination,” Laird said. “Without a doubt, we see certain groups of students having outsized experiences in being directly targeted.”

In a youth survey, researchers found that student discipline as a result of activity monitoring fell disproportionately along racial lines, with 48% of Black students and 55% of Hispanic students reporting that they or someone they knew got into trouble for something that was flagged by an activity monitoring tool. Just 41% of white students reported having similar experiences. 

Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity — often called outing — as a result of activity monitoring. LGBTQ youth were also more likely than straight and cisgender students to report getting into trouble at school and being contacted by the police about having committed a crime. 

Some student surveillance companies, like Gaggle, monitor references to words including “gay” and “lesbian,” a practice that company founder and CEO Jeff Patterson has said is designed to protect LGBTQ youth, who face a greater risk of dying by suicide. But survey results suggest the heightened surveillance comes with significant harm to youth, and Laird said if monitoring tools are designed with certain students in mind, such as LGBTQ youth, that in itself is a form of discrimination. 

Center for Democracy and Technology

In its letter to the Education Department’s Office for Civil Rights Wednesday, advocates said the disparities outlined in the survey run counter to federal laws prohibiting race-, sex- and disability-based discrimination. 

“Student activity monitoring is subjecting protected classes of students to increased discipline and interactions with law enforcement, invading their privacy, and creating hostile environments for students to express their true thoughts and authentic identities,” the letter states. 

The Education Department’s civil rights division, they said, should condemn surveillance practices that violate students’ civil rights and launch “enforcement action against violations that result in discrimination.”

Lawmakers consider youth privacy

The report comes at a moment of increasing alarm about student privacy online. In May, the Federal Trade Commission announced plans to crack down on tech companies that sell student data for targeted advertising and that “illegally surveil children when they go online to learn.” 

It also comes at a time of intense concern over students’ emotional and physical well-being. While the pandemic has led to a greater focus on youth mental health, the May mass school shooting in Uvalde, Texas, has sparked renewed school safety efforts. In June, President Joe Biden signed a law with modest new gun-control provisions and an influx of federal funding for student mental health care and campus security. The funds could lead to more digital student surveillance.

The results of the online survey, which was conducted in May and June, were likely colored by the Uvalde tragedy, researchers acknowledged. A majority of parents and students have a favorable view of student activity monitoring during school hours to protect kids from harming themselves or others, researchers found. But just 48% of parents and 30% of students support around-the-clock surveillance. 

“Schools are under a lot of pressure to find ways to keep students safe and, like in many aspects of our lives, they are considering the role of technology,” Laird said. 

Last week, the Senate took up legislation designed to improve children’s safety online, including new restrictions on youth-focused targeted advertising. The effort comes a year after leaked internal research showed that the social media app Instagram had a harmful effect on youth mental well-being, especially among teenage girls. One bill, the Kids Online Safety Act, would require tech companies to identify and mitigate any potential harms their products may pose to children, including exposure to content that promotes self-harm, eating disorders and substance abuse.

Yet the legislation has faced criticism from privacy advocates, who argue it would mandate digital monitoring similar to that offered by student surveillance companies. Among critics is the Electronic Frontier Foundation, a nonprofit focused on digital privacy and free speech. 

“The answer to our lack of privacy isn’t more tracking,” the foundation wrote. The legislation “is a heavy-handed plan to force technology companies to spy on young people and stop them from accessing content that is ‘not in their best interest,’ as defined by the government, and interpreted by tech platforms.” 

Attorney Amelia Vance, the founder and president of Public Interest Privacy Consulting, said she worries the provisions will have a negative impact on at-risk kids, including LGBTQ students. Students from marginalized groups, she said, “will now be more heavily surveilled by basically every site on the internet, and that information will be available to parents” who could discipline teens for researching LGBTQ content. She said the legislation could force tech companies to censor content to avoid potential liability, essentially making them arbiters of community standards. 

“When you have conflicting values in the different jurisdictions that the companies operate in, oftentimes you end up with the most conservative interpretations, which right now is anti-LGBT,” she said.

FTC Targets Ed Tech Companies that ‘Illegally Surveil Children’ /article/ftc-announces-plan-to-target-ed-tech-tools-that-illegally-surveil-children/ Fri, 20 May 2022 21:53:00 +0000 /?post_type=article&p=589724 The Federal Trade Commission announced ramped-up enforcement of education technology companies that sell student data for targeted advertising and that “illegally surveil children when they go online to learn,” in violation of federal student privacy rules.

“It is against the law for companies to force parents and schools to surrender their children’s privacy rights in order to do schoolwork online or attend class remotely,” the federal agency said in a media release Thursday. “Under the federal Children’s Online Privacy Protection Act (COPPA), companies cannot deny children access to educational technologies when their parents or school refuse to sign up for commercial surveillance.” 



Through a policy statement, the commission signaled its intent to “scrutinize compliance” with COPPA, the federal law that limits the data that technology companies can collect on children under 13 without parental consent. The statement, approved through a unanimous bipartisan vote by the five commissioners, reminds education technology companies that they are prohibited from using student data for commercial purposes, including for marketing and advertising, should not retain student data for a period longer than what’s deemed “reasonably necessary,” and must have sufficient security to ensure data remain confidential. Additionally, tech companies must not exclude students who decline to disclose more personal information “than is reasonably necessary for the child to participate in that activity.” 

The policy statement comes at a critical moment for education technology companies. When the pandemic shuttered schools nationally and forced children into remote learning, their place in the education landscape grew exponentially as educators relied more heavily on their services. But they’ve also faced scrutiny for their data collection practices, particularly in the wake of high-profile breaches. Schools recently notified students that their personal data was compromised in a breach at the company Illuminate Education. The hack exposed the personal information of students in New York City, the nation’s largest school district.

The FTC statement does not introduce any new rules, yet it makes clear that education technology and student privacy are an enforcement priority. Weak enforcement of student privacy rules has been a longstanding problem, said Cody Venzke, senior counsel at the nonprofit Center for Democracy and Technology.

Suggesting that the federal government had gone too easy on ed tech companies in the past, President Joe Biden criticized student surveillance practices on Thursday and signaled his support for greater student privacy protections. 

“When children and parents access online educational products, they shouldn’t be forced to accept tracking and surveillance to do so,” Biden said in a statement. The FTC, he said, “will be cracking down on companies that persist in exploiting our children to make money.” 

Among the services and applications that saw significant growth during the pandemic are those that monitor students’ online activities on school-issued devices and technology. Company executives say their digital products are critical to identify youth who are at risk of harming themselves or others, but critics argue the surveillance violates students’ privacy rights. 

The 74 has reported extensively on the expanding presence of such student surveillance companies, including Gaggle, which sifts through billions of student communications on school-issued Google and Microsoft accounts each year in search of references to violence and self-harm. Company executives say the tools save lives, but critics argue they could surveil students inappropriately, compound racial disparities in school discipline and waste tax dollars.

In one recent story, former content moderators on the front lines of Gaggle’s student monitoring efforts raised significant questions about the company’s efficacy and its effects on students’ civil rights. The former moderators reported insufficient safeguards to protect students’ sensitive data, a work culture that prioritized speed over quality, limited training and frequent exposure to explicit content that left some traumatized. 

In a statement, FTC Chair Lina Khan said that “commercial surveillance cannot be a condition of doing schoolwork.” 

“Though widespread tracking, surveillance and expansive use of data across contexts have become increasingly common practices across the broader economy,” Khan said, the policy makes clear that federal law “forbids companies from wholesale extending these practices into the context of schools and learning.” 

The FTC’s comments on surveillance, Venzke said in an email, suggest that the agency will scrutinize the practices of education technology vendors that collect “troves of sensitive information about students’ lives, including student activity monitoring software vendors.” 

“Student activity monitoring companies must ensure they are taking appropriate steps to not only secure the sensitive data they collect on students, but also to ensure that they are collecting only the absolute minimum data that they need to achieve a legitimate educational purpose — and then that they delete the data when it is no longer needed,” Venzke said.

A Gaggle spokesperson didn’t immediately respond to a request for comment. In a blog post on Thursday, the company noted that it takes “data security very seriously,” only uses student information for educational purposes, has a strict data retention policy and has comprehensive security standards. The post said the company does not sell student data or engage in targeted advertising. 

Numerous companies have faced fines in recent years for violating the federal privacy law. In 2019, for example, YouTube paid a fine to settle allegations it collected children’s data without parental consent and used it for targeted advertising. TikTok settled similar allegations that same year. 


Despite the commission’s harsh critique of surveillance, the enforcement of student privacy rules will likely go beyond companies that monitor students online, said attorney Amelia Vance, the co-founder and president of Public Interest Privacy Consulting. She interpreted the FTC announcement to broadly encompass “surveillance capitalism,” where personal data are collected and sold for profit. However, she noted that Gaggle and other monitoring companies could have particular problems. In its announcement, the FTC said it is unreasonable for education technology companies to retain student data “for speculative future potential purposes.”

“So much of the monitoring information collected and kept, especially when it comes to tracking the mental health of students, it could easily, arguably be speculative,” she said. “That could cause confusion from companies about what obligations they have to either collect certain data or not collect certain data or not retain certain data even when the school has asked for it.” 

The FTC announcement follows a recent investigation into student monitoring companies by Democratic Sens. Elizabeth Warren and Ed Markey, which warned of surveillance companies’ potential harms and called on the Federal Communications Commission to clarify the provisions of another federal law, the Children’s Internet Protection Act, which requires schools to monitor students’ online activities.

In response to the FTC statement, a bipartisan group of senators cautioned that threats to online privacy have reached “a crisis point.” 

“We applaud the FTC’s attention to this urgent problem and its acknowledgment that a child’s education should never come at the expense of their privacy,” said a statement released by Markey, fellow Democratic Sen. Richard Blumenthal and Republican Sens. Bill Cassidy and Cynthia Lummis. “The FTC’s policy statement is an important step in the right direction, but it is not a replacement for legislative action.”

Meet the Gatekeepers of Students’ Private Lives /article/meet-the-gatekeepers-of-students-private-lives/ Mon, 02 May 2022 11:15:00 +0000 /?post_type=article&p=588567 If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741.

Megan Waskiewicz used to sit at the top of the bleachers, rest her back against the wall and hide her face behind the glow of a laptop monitor. While watching one of her five children play basketball on the court below, she knew she had to be careful. 

The mother from Pittsburgh didn’t want other parents in the crowd to know she was also looking at child porn.

Waskiewicz worked as a content moderator for Gaggle, a surveillance company that monitors the online behaviors of some 5 million students across the U.S. on their school-issued Google and Microsoft accounts. Through an algorithm designed to flag references to sex, drugs and violence and a team of content moderators like Waskiewicz, the company sifts through billions of students’ emails, chat messages and homework assignments each year. Their work is supposed to ferret out evidence of potential self-harm, threats or bullying, incidents that would prompt Gaggle to notify school leaders and, in some cases, the police.

As a result, kids’ deepest secrets — like nude selfies and suicide notes — regularly flashed onto Waskiewicz’s screen. Though she felt “a little bit like a voyeur,” she believed Gaggle helped protect kids. But mostly, the low pay, the fight for decent hours, inconsistent instructions and stiff performance quotas left her feeling burned out. Gaggle’s moderators face pressure to review 300 incidents per hour and Waskiewicz knew she could get fired on a moment’s notice if she failed to distinguish mundane chatter from potential safety threats in a matter of seconds. She lasted about a year.

“In all honesty I was sort of half-assing it,” Waskiewicz admitted in an interview with The 74. “It wasn’t enough money and you’re really stuck there staring at the computer reading and just click, click, click, click.”

Content moderators like Waskiewicz, hundreds of whom are paid just $10 an hour on month-to-month contracts, are on the front lines of a company that claims it saved the lives of 1,400 students last school year and argues that the growing mental health crisis makes its presence in students’ private affairs essential. Gaggle founder and CEO Jeff Patterson has warned about “a tsunami of youth suicide headed our way” and said that schools have “a moral obligation to protect the kids on their digital playground.” 

Eight former content moderators at Gaggle shared their experiences for this story. While several believed their efforts in some cases did shield kids from serious harm, they also surfaced significant questions about the company’s efficacy, its employment practices and its effect on students’ civil rights.

Among the moderators who worked on a contractual basis, none had prior experience in school safety, security or mental health. Instead, their employment histories included retail work and customer service, but they were drawn to Gaggle while searching for remote jobs that promised flexible hours. 

They described an impersonal and cursory hiring process that appeared automated. Former moderators reported submitting applications online and never having interviews with Gaggle managers — either in-person, on the phone or over Zoom — before landing jobs.

Once hired, moderators reported insufficient safeguards to protect students’ sensitive data, a work culture that prioritized speed over quality, scheduling issues that sent them scrambling to get hours and frequent exposure to explicit content that left some traumatized. Contractors lacked benefits including mental health care and one former moderator said he quit after repeated exposure to explicit material that so disturbed him he couldn’t sleep and without “any money to show for what I was putting up with.”

Gaggle’s content review team encompasses as many as 600 contractors at any given time; just two dozen moderators work as employees who have access to benefits and on-the-job training that lasts several weeks. Gaggle executives have sought to downplay contractors’ role with the company, arguing they use “common sense” to distinguish false flags generated by the algorithm from potential threats and do “not require substantial training.” 

While the experiences reported by Gaggle’s moderator team resemble those of content moderators at platforms like Meta-owned Facebook, Patterson said his company relies on “U.S.-based, U.S.-cultured reviewers as opposed to outsourcing that work to India or Mexico or the Philippines,” as other companies do. He rebuffed former moderators who said they lacked sufficient time to consider the severity of a particular item.

“Some people are not fast decision-makers. They need to take more time to process things and maybe they’re not right for that job,” he told The 74. “For some people, it’s no problem at all. For others, their brains don’t process that quickly.”

Executives also sought to minimize the contractors’ access to students’ personal information; a spokeswoman said they only see “small snippets of text” and lacked access to what’s known as students’ “personally identifiable information.” Yet former contractors described reading lengthy chat logs, seeing nude photographs and, in some cases, coming upon students’ names. Several former moderators said they struggled to determine whether something should be escalated as harmful due to “gray areas,” such as whether a Victoria’s Secret lingerie ad would be considered acceptable or not. 

“Those people are really just the very, very first pass,” Gaggle spokeswoman Paget Hetherington said. “It doesn’t really need training, it’s just like if there’s any possible doubt with that particular word or phrase it gets passed on.” 

Molly McElligott, a former content moderator and customer service representative, said management was laser-focused on performance metrics, appearing more interested in business growth and profit than in protecting kids. 

“I went into the experience extremely excited to help children in need,” McElligott wrote in an email. Unlike the contractors, McElligott was an employee at Gaggle, where she worked for five months in 2021 before taking a position at the Manhattan District Attorney’s Office in New York. “I realized that was not the primary focus of the company.”

Gaggle is part of a burgeoning campus security industry that’s seen significant business growth in the wake of mass school shootings as leaders scramble to prevent future attacks. Patterson, who founded the company in 1999 by offering student email accounts that could be monitored for inappropriate content, said its focus now is mitigating the youth mental health crisis.

Patterson said the team talks about “lives saved” and child safety incidents at every meeting, and they are open about sharing the company’s financial outlook so that employees “can have confidence in the security of their jobs.”

Content moderators work at a Facebook office in Austin, Texas. Unlike the social media giant, Gaggle’s content moderators work remotely. (Ilana Panich-Linsman / Getty Images)

‘We are just expendable’

Facing new federal scrutiny along with three other companies that monitor students online, Gaggle has said it relies on a “highly trained content review team” to analyze student materials and flag safety threats. Yet former contractors, who make up the bulk of Gaggle’s content review team, described their training as “a joke” consisting of a slideshow and an online quiz that left them ill-equipped to complete a job with such serious consequences for students and schools.

As an employee on the company’s safety team, McElligott said she underwent two weeks of training, but the disorganized instruction meant she and other moderators were “more confused than when we started.”

Former content moderators have also flocked to employment websites like Indeed.com to warn job seekers about their experiences with the company, often sharing reviews that resembled the former moderators’ feedback to The 74.

“If you want to be not cared about, not valued and be completely stressed/traumatized on a daily basis this is totally the job for you,” one reviewer wrote on Indeed. “Warning, you will see awful awful things. No they don’t provide therapy or any kind of support either.

“That isn’t even the worst part,” the reviewer continued. “The worst part is that the company does not care that you hold them on your backs. Without safety reps they wouldn’t be able to function, but we are just expendable.” 

As the first layer of Gaggle’s human review team, contractors analyze materials flagged by the algorithm and decide whether to escalate students’ communications for additional consideration. Designated employees on Gaggle’s Safety Team are in charge of calling or emailing school officials to notify them of troubling material identified in students’ files, Patterson said.

Gaggle’s staunchest critics have questioned the tool’s efficacy and describe it as a student privacy nightmare. In March, Democratic Sens. Elizabeth Warren and Ed Markey called on Gaggle and similar companies to protect students’ civil rights and privacy. In a report, the senators said the tools could surveil students inappropriately, compound racial disparities in school discipline and waste tax dollars.

The information shared by the former Gaggle moderators with The 74 “struck me as the worst-case scenario,” said attorney Amelia Vance, the co-founder and president of Public Interest Privacy Consulting. Content moderators’ limited training and vetting, as well as their lack of backgrounds in youth mental health, she said, “is not acceptable.”

In its letter to lawmakers, Gaggle described a two-tiered review procedure but didn’t disclose that low-wage contractors were the first line of defense. CEO Patterson told The 74 the company “didn’t have nearly enough time” to respond to lawmakers’ questions about its business practices and didn’t want to divulge proprietary information. Gaggle uses a third party to conduct criminal background checks on contractors, Patterson said, but he acknowledged they aren’t interviewed before getting placed on the job.

“There’s a lot of contractors. We can’t do a physical interview of everyone and I don’t know if that’s appropriate,” he said. “It might actually introduce another set of biases in terms of who we hire or who we don’t hire.”

‘Other eyes were seeing it’

In a previous investigation, The 74 analyzed a cache of public records to expose how Gaggle’s algorithm and content moderators subject students to relentless digital surveillance long after classes end for the day, extending schools’ authority far beyond their traditional powers to regulate speech and behavior, including at home. Gaggle’s algorithm relies largely on keyword matching and gives content moderators a broad snapshot of students’ online activities including diary entries, classroom assignments and casual conversations between students and their friends. 
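Keyword matching of this kind is trivial to implement, which helps explain both its scale and its flood of false positives. The sketch below is purely illustrative; the watchlist and function name are invented for this example and are not Gaggle’s actual code.

```python
# Purely illustrative: a naive keyword matcher of the kind described in
# keyword-based monitoring tools. The watchlist and function name are
# invented for this sketch, not taken from any real product.
WATCHLIST = {"suicide", "kill", "gun", "drugs"}

def flag_text(text: str) -> set:
    """Return watchlist words appearing in the text, ignoring context."""
    words = {word.strip(".,!?\"'").lower() for word in text.split()}
    return words & WATCHLIST

# A literature essay trips the same wire as a genuine threat:
print(flag_text("In To Kill a Mockingbird, Atticus defends Tom Robinson."))
# {'kill'}
```

Because the match ignores context entirely, a book report gets flagged the same way a genuine threat would, which is consistent with moderators’ accounts elsewhere in this reporting of routine flags on essays about assigned novels.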

After the pandemic shuttered schools and shuffled students into remote learning, Gaggle oversaw a surge in students’ online materials and a wave of school districts interested in its services. Gaggle grew as educators scrambled to keep a watchful eye on students whose chatter with peers moved from school hallways to instant messaging platforms like Google Hangouts. One year into the pandemic, Gaggle reported a spike in references to suicide and self-harm, accounting for more than 40% of all flagged incidents. 

Waskiewicz, who began working for Gaggle in January 2020, said that remote learning spurred an immediate shift in students’ online behaviors. Under lockdown, students without computers at home began using school devices for personal conversations. Sifting through the everyday exchanges between students and their friends, Waskiewicz said, became a time suck and left her questioning her own principles. 

“I felt kind of bad because the kids didn’t have the ability to have stuff of their own and I wondered if they realized that it was public,” she said. “I just wonder if they realized that other eyes were seeing it other than them and their little friends.”

Student activity monitoring software like Gaggle has become ubiquitous in U.S. schools, and 81% of teachers work in schools that use tools to track students’ computer activity, according to a recent survey by the nonprofit Center for Democracy and Technology. A majority of teachers said the benefits of using such tools, which can block obscene material and monitor students’ screens in real time, outweigh potential risks.

Likewise, students generally recognize that their online activities on school-issued devices are being observed, the survey found, and alter their behaviors as a result. More than half of student respondents said they don’t share their true thoughts or ideas online as a result of school surveillance and 80% said they were more careful about what they search online. 

A majority of parents reported that the benefits of keeping tabs on their children’s activity exceeded the risks. Yet they may not have a full grasp on how programs like Gaggle work, including the heavy reliance on untrained contractors and weak privacy controls revealed by The 74’s reporting, said Elizabeth Laird, the group’s director of equity in civic technology. 

“I don’t know that the way this information is being handled actually would meet parents’ expectations,” Laird said. 

Another former contractor, who reached out to The 74 to share his experiences with the company anonymously, became a Gaggle moderator at the height of the pandemic. As COVID-19 cases grew, he said he felt unsafe continuing his previous job as a caregiver for people with disabilities so he applied to Gaggle because it offered remote work. 

About a week after he submitted an application, Gaggle gave him a key to kids’ private lives — including, most alarming to him, their nude selfies. Exposure to such content was traumatizing, the former moderator said, and while the job took a toll on his mental well-being, it didn’t come with health insurance. 

“I went to a mental hospital in high school due to some hereditary mental health issues and seeing some of these kids going through similar things really broke my heart,” said the former contractor, who shared his experiences on the condition of anonymity, saying he feared possible retaliation by the company. “It broke my heart that they had to go through these revelations about themselves in a context where they can’t even go to school and get out of the house a little bit. They have to do everything from home — and they’re being constantly monitored.” 

In this screenshot, Gaggle explains its terms and conditions for contract content moderators. The screenshot, which was provided to The 74 by a former contractor who asked to remain anonymous, has been redacted.

Gaggle employees are offered benefits, including health insurance, and can attend group therapy sessions twice per month, Hetherington said. Patterson acknowledged the job can take a toll on staff moderators, but sought to downplay its effects on contractors and said they’re warned about exposure to disturbing content during the application process. He said using contractors allows Gaggle to offer the service at a price school districts can afford. 

“Quite honestly, we’re dealing with school districts with very limited budgets,” Patterson said. “There have to be some tradeoffs.” 

The anonymous contractor said he wasn’t as concerned about his own well-being as he was about the welfare of the students under the company’s watch. The company lacked adequate safeguards, he said, to protect students’ sensitive information from leaking outside the digital environment that Gaggle built for moderators to review such materials. Contract moderators work remotely with limited supervision or oversight, and he became especially concerned about how the company handled students’ nude images, which are reported to school districts and the authorities. Nudity and sexual content accounted for about 17% of emergency phone calls and email alerts to school officials last school year. 

Contractors, he said, could easily save the images for themselves or share them on the dark web. 

Patterson acknowledged the possibility but said he wasn’t aware of any data breaches. 

“We do things in the interface to try to disable the ability to save those things,” Patterson said, but “you know, human beings who want to get around things can.”

‘Made me feel like the day was worth it’

Vara Heyman was looking for a career change. After working jobs in retail and customer service, she made the pivot to content moderation and a contract position with Gaggle was her first foot in the door. She was left feeling baffled by the impersonal hiring process, especially given the high stakes for students. 

Waskiewicz had a similar experience. In fact, she said the only time she ever interacted with a Gaggle supervisor was when she was instructed to provide her bank account information for direct deposit. The interaction left her questioning whether the company that contracts with more than 1,500 school districts was legitimate or a scam. 

“It was a little weird when they were asking for the banking information, like ‘Wait a minute is this real or what?’” Waskiewicz said. “I Googled them and I think they’re pretty big.”

Heyman said that sense of disconnect continued after being hired, with communications between contractors and their supervisors limited to a Slack channel. 

Despite the challenges, several former moderators believe their efforts kept kids safe from harm. McElligott, the former Gaggle safety team employee, recalled an occasion when she found a student’s suicide note. 

“Knowing I was able to help with that made me feel like the day was worth it,” she said. “Hearing from the school employees that we were able to alert about self-harm or suicidal tendencies from a student they would never expect to be suffering was also very rewarding. It meant that extra attention should or could be given to the student in a time of need.” 

Susan Enfield, the superintendent of Highline Public Schools in suburban Seattle, said her district’s contract with Gaggle has saved lives. Earlier this year, for example, the company detected a student’s suicide note early in the morning, allowing school officials to spring into action. The district uses Gaggle to keep kids safe, she said, but acknowledged it can be a disciplinary tool if students violate the district’s code of conduct. 

“No tool is perfect, every organization has room to improve, I’m sure you could find plenty of my former employees here in Highline that would give you an earful about working here as well,” said Enfield, one of 23 current or former superintendents from across the country who Gaggle cited as references in its letter to Congress. 

“There’s always going to be pros and cons to any organization, any service,” Enfield told The 74, “but our experience has been overwhelmingly positive.”

True safety threats were infrequent, former moderators said, and most of the content was mundane, in part because the company’s artificial intelligence lacked sophistication. They said the algorithm routinely flagged students’ papers on the novels To Kill a Mockingbird and The Catcher in the Rye. They also reported being inundated with spam emailed to students, acting as human spam filters for a task that’s long been automated in other contexts. 

Conor Scott, who worked as a contract moderator while in college, said that “99% of the time” Gaggle’s algorithm flagged pedestrian materials including pictures of sunsets and students’ essays about World War II. Valid safety concerns, including references to violence and self-harm, were rare, Scott said. But he still believed the service had value and felt he was doing “the right thing.”

McElligott said that managers’ personal opinions added another layer of complexity. Though moderators were “held to strict rules of right and wrong decisions,” she said they were ultimately “being judged against our managers’ opinions of what is concerning and what is not.” 

“I was told once that I was being overdramatic when it came to a potential inappropriate relationship between a child and adult,” she said. “There was also an item that made me think of potential trafficking or child sexual abuse, as there were clear sexual plans to meet up — and when I alerted it, I was told it was not as serious as I thought.” 

Patterson acknowledged that gray areas exist and that human discretion is a factor in deciding what materials are ultimately elevated to school leaders. But such materials, he said, are not the most urgent safety issues. He said their algorithm errs on the side of caution and flags harmless content because district leaders are “so concerned about students.” 

The former moderator who spoke anonymously said he grew alarmed by the sheer volume of mundane student materials that were captured by Gaggle’s surveillance dragnet, and pressure to work quickly didn’t offer enough time to evaluate long chat logs between students having “heartfelt and sensitive” conversations. On the other hand, run-of-the-mill chatter offered him a little wiggle room. 

“When I would see stuff like that I was like ‘Oh, thank God, I can just get this out of the way and heighten how many items per hour I’m getting,’” he said. “It’s like ‘I hope I get more of those because then I can maybe spend a little more time actually paying attention to the ones that need it.’” 

Ultimately, he said he was unprepared for such extensive access to students’ private lives. Because Gaggle’s algorithm flags keywords like “gay” and “lesbian,” for example, it alerted him to students exploring their sexuality online. Hetherington, the Gaggle spokeswoman, said such keywords are included in its dictionary to “ensure that these vulnerable students are not being harassed or suffering additional hardships,” but critics have accused the company of subjecting LGBTQ students to disproportionate surveillance. 

“I thought it would just be stopping school shootings or reducing cyberbullying but no, I read the chat logs of kids coming out to their friends,” the former moderator said. “I felt tremendous power was being put in my hands” to distinguish students’ benign conversations from real danger, “and I was given that power immediately for $10 an hour.” 

Minneapolis student Teeth Logsdon-Wallace, who posed for this photo with his dog Gilly, used a classroom assignment to discuss a previous suicide attempt and explained how his mental health had since improved. He became upset after Gaggle flagged his assignment. (Photo courtesy Alexis Logsdon)

A privacy issue

For years, student privacy advocates and civil rights groups have warned about the potential harms of Gaggle and similar surveillance companies. Fourteen-year-old Teeth Logsdon-Wallace, a Minneapolis high school student, fell under Gaggle’s watchful eye during the pandemic. Last September, he used a class assignment to write about a previous suicide attempt and explained how music helped him cope after being hospitalized. Gaggle flagged the assignment to a school counselor, a move the teen called a privacy violation. 

He said it’s “just really freaky” that moderators can review students’ sensitive materials in public places like at basketball games, but ultimately felt bad for the contractors on Gaggle’s content review team. 

“Not only is it violating the privacy rights of students, which is bad for our mental health, it’s traumatizing these moderators, which is bad for their mental health,” he said. Relying on low-wage workers with high turnover, limited training and without backgrounds in mental health, he said, can have consequences for students. 

“Bad labor conditions don’t just affect the workers,” he said. “It affects the people they say they are helping.” 

Gaggle cannot prohibit contractors from reviewing students’ private communications in public settings, Heather Durkac, the senior vice president of operations, said in a statement. 

“However, the contractors know the nature of the content they will be reviewing,” Durkac said. “It is their responsibility and part of their presumed good and reasonable work ethic to not be conducting these content reviews in a public place.” 

Gaggle’s former contractors also weighed students’ privacy rights. Heyman said she “went back and forth” on those implications for several days before applying to the job. She ultimately decided that Gaggle was acceptable since its monitoring is limited to school-issued technology. 

“If you don’t want your stuff looked at, you can use Hotmail, you can use Gmail, you can use Yahoo, you can use whatever else is out there,” she said. “As long as they’re being told and their parents are being told that their stuff is going to be monitored, I feel like that is OK.” 

Logsdon-Wallace and his mother said they didn’t know Gaggle existed until his classroom assignment got flagged to a school counselor. 

Meanwhile, the anonymous contractor said that chat conversations between students that got picked up by Gaggle’s algorithm helped him understand the effects that surveillance can have on young people. 

“Sometimes a kid would use a curse word and another kid would be like, ‘Dude, shut up, you know they’re watching these things,’” he said. “These kids know that they’re being looked in on,” even if they don’t realize their observer is a contractor working from the couch in his living room. “And to be the one that is doing that — that is basically fulfilling what these kids are paranoid about — it just felt awful.” 

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741.

Disclosure: Campbell Brown is the head of news partnerships at Facebook. Brown co-founded The 74 and sits on its board of directors.

Senate Inquiry Warns About Harms of Digital School Surveillance Tools

April 4, 2022; updated April 5

Democratic Sens. Elizabeth Warren and Ed Markey are calling on the Federal Communications Commission to clarify how schools should monitor students’ online activities, warning that educators’ widespread use of digital surveillance tools could trample students’ civil rights.

They also want the U.S. Education Department to start collecting data on the tools that could highlight whether they have disproportionate — and potentially harmful — effects on certain student groups. 

In October, the senators asked four education technology companies that keep tabs on the online activity of millions of students across the country — often 24 hours a day, seven days a week — to provide information on how they use artificial intelligence to collect and analyze students’ information. 

Based on their responses, the senators said:

  • The companies’ software may be misused to identify students who are violating school disciplinary rules. They cited a recent survey where 43% of teachers reported their schools employ the monitoring systems for this purpose, potentially increasing contact between police and students and worsening the school-to-prison pipeline.
  • The companies have not attempted to determine whether their products disproportionately target students of color, who already face harsher and more frequent school discipline, or other vulnerable groups, like LGBTQ youth.
  • Schools, parents and communities are not being appropriately informed of the use — and potential misuse — of the data. Three of the four companies indicated they do not directly alert students and guardians of their surveillance.

Warren and Markey concluded there is a dire “need for federal action to protect students’ civil rights, safety and privacy.”

“While the intent of these products, many of which monitor students’ online activity around the clock, may be to protect student safety, they raise significant privacy and equity concerns,” the lawmakers wrote. “Studies have highlighted unintended but harmful consequences of student activity monitoring software that fall disproportionately on vulnerable populations.”

An FCC spokesperson said they’re reviewing the senators’ report and an Education Department spokesperson said they “look forward to corresponding with the senators” about its findings.

Lawmakers’ inquiry into the business practices of school security companies Gaggle, GoGuardian, Securly and Bark Technologies is the first congressional investigation into student surveillance tools, whose use grew dramatically during the pandemic when learning shifted online.

It follows on the heels of investigative reporting by The 74 into Gaggle, which uses artificial intelligence and a team of human content moderators to track the online behaviors of more than 5 million students. The 74 used public records to expose how Gaggle’s algorithm and its hourly-wage workers sift through billions of student communications each year in search of references to violence and self harm, subjecting youth to constant digital surveillance with steep implications for their privacy. Gaggle, whose tools track students on their school-issued Google and Microsoft accounts, reported rapid growth during the pandemic.

Bark didn’t respond to requests for comment. Securly spokesman Josh Mukai said in a statement that the company is reviewing the senators’ March 30 report and looks forward “to continuing our dialogue with Senators Warren and Markey on the important topics they have raised.”

“Parents expect that schools will keep children safe while in the classroom, on a field trip or while riding on a bus,” GoGuardian spokesman Jeff Gordon said in a statement. “Schools also have a responsibility to keep students safe in digital spaces and on school-issued devices.” 

Gaggle Founder and CEO Jeff Patterson submitted a statement after this article was published. He said the company is reviewing the lawmakers’ recommendations “to assess how we can further strengthen our work to better protect students.”

“We want to ensure our technology is effectively supporting student safety without creating unintended risks or harms,” Patterson continued. “We have taken steps over the years to ensure effective privacy protections and mitigate bias in our platform, but welcome continued dialogue that will help make sure tools like Gaggle can continue to be used to support students and educators.”

Bark Technologies CEO Brian Bason wrote in a letter to lawmakers that AI-driven technology could be used to solve the country’s “terrible history of bias in school discipline” by removing the decisions of individual teachers and administrators.

“While any system, including AI-based solutions, inherently have some bias, if implemented correctly AI-based solutions can substantially reduce the bias that students face,” Bason wrote.

As to the question of whether their surveillance exacerbates the school-to-prison pipeline, the companies’ letters acknowledge that in certain cases they contact police to conduct welfare checks on students. Securly noted in its letter that in some instances, education leaders “prefer that we contact public safety agencies directly in lieu of a district contact.”

Under the Clinton-era Children’s Internet Protection Act, passed in 2000, public schools and libraries are required to filter and monitor students’ internet use to ensure they don’t access material “harmful to minors,” such as pornography. Districts have cited the law to justify the adoption of AI-driven surveillance tools that have proliferated in recent years. Student privacy advocates argue the tools go far beyond the federal mandate and have called on the FCC to clarify the law’s scope. Meanwhile, advocates have questioned whether schools’ use of digital surveillance tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures.

In a recent survey by the nonprofit Center for Democracy and Technology, 81 percent of teachers said they used software to track students’ computer activity, including to block obscene material or monitor their screens in real time. A majority of parents said they worried about student data getting shared with the police and more than half of students said they decline to share their “true thoughts or ideas because I know what I do online is being monitored.”  

Elizabeth Laird, the group’s director of equity in civic technology, said it has been calling on student surveillance companies to be more transparent about their business practices but it’s “disappointing that it took a letter from Congress to get this information.” She said she hopes the FCC and Education Department adopt lawmakers’ recommendations.

“None of these companies have researched whether their products are biased against certain groups of students,” she said in an email while questioning their justification for holding off on such an inquiry. “They cite privacy as the reason for not doing so while simultaneously monitoring students’ messages, documents and sites visited 24 hours a day, seven days a week.” 

The 74’s investigation, which used data on Gaggle’s foothold in Minneapolis Public Schools, could not determine whether the tool’s algorithm disproportionately targeted Black students, who are more often subjected to student discipline than their white classmates. However, it highlighted instances in which keywords like “gay” and “lesbian” were flagged, potentially subjecting LGBTQ youth to heightened surveillance for discussing their sexual orientation. 

Amelia Vance, an attorney and student privacy expert, said she was intrigued that the companies pushed back on the idea that their tools are used to discipline students, since the federal monitoring requirement was meant to keep kids from consuming inappropriate content online, and students caught viewing violent or sexually explicit materials would likely face consequences. She agreed the companies should research their algorithms for potential biases and would benefit from additional transparency. 

However, Vance said in an email that FCC clarification “would do little at best and may provide counterproductive guidance at worst.” Many schools, she said, are likely to use the tools regardless of the federal rules. 

“Schools aren’t required to monitor social media, and many have chosen to do so anyway,” said Vance, the co-founder and president of Public Interest Privacy Consulting. Some school safety advocates are actively lobbying lawmakers to expand student monitoring requirements, she said. 

Asking the FCC to issue guidance “could actually be counterproductive to the goal of limiting monitoring and ensuring more privacy protections for students since it is possible that the FCC could require a higher level of monitoring.”

Read the letters from Gaggle, GoGuardian, Securly and Bark Technologies: 

Gaggle Surveils Millions of Kids in the Name of Safety. Targeted Families Argue it’s ‘Not That Smart’

October 12, 2021

In the midst of a pandemic and a national uprising, Teeth Logsdon-Wallace was kept awake at night last summer by the constant sounds of helicopters and sirens. 

For the 13-year-old from Minneapolis who lives close to where George Floyd was murdered in May 2020, the pandemic-induced isolation and social unrest amplified his transgender dysphoria, emotional distress that occurs when someone’s gender identity differs from their sex assigned at birth. His billowing depression landed him in the hospital after an attempt to die by suicide. During that dark stretch, he spent his days in an outpatient psychiatric facility, where therapists embraced music therapy. There, he listened to a punk song on loop that promised things would get better. 

Eventually they did. 




Logsdon-Wallace, a transgender eighth-grader who chose the name Teeth, has since “graduated” from weekly therapy sessions and has found a better headspace, but that didn’t stop school officials from springing into action after he wrote about his mental health. In a school assignment last month, he reflected on his suicide attempt and how the punk rock anthem by the band Ramshackle Glory helped him cope — intimate details that wound up in the hands of district security. 

In a classroom assignment last month, Minneapolis student Teeth Logsdon-Wallace explained how the Ramshackle Glory song “Your Heart is a Muscle the Size of Your Fist” helped him cope after an attempt to die by suicide. In the assignment, which was flagged by the student surveillance company Gaggle, Logsdon-Wallace wrote that the song was “a reminder to keep on loving, keep on fighting and hold on for your life.” (Photo courtesy Teeth Logsdon-Wallace)

The classroom assignment was one of thousands of Minneapolis student communications that got flagged by Gaggle, a digital surveillance company that saw rapid growth after the pandemic forced schools into remote learning. In an earlier investigation, The 74 analyzed nearly 1,300 public records from Minneapolis Public Schools to expose how Gaggle subjects students to relentless digital surveillance 24 hours a day, seven days a week, raising significant privacy concerns for more than 5 million young people across the country who are monitored by the company’s digital algorithm and human content moderators. 

But technology experts and families with first-hand experience with Gaggle’s surveillance dragnet have raised a separate issue: The service is not only invasive, it may also be ineffective. 

While the system flagged Logsdon-Wallace for referencing the word “suicide,” context was never part of the equation, he said. Two days later, in mid-September, a school counselor called his mom to let her know what officials had learned. The meaning of the classroom assignment — that his mental health had improved — was seemingly lost in the transaction between Gaggle and the school district. He felt betrayed. 

“I was trying to be vulnerable with this teacher and be like, ‘Hey, here’s a thing that’s important to me because you asked,’” Logsdon-Wallace said. “Now, when I’ve made it clear that I’m a lot better, the school is contacting my counselor and is freaking out.”

Jeff Patterson, Gaggle’s founder and CEO, said in a statement his company does not “make a judgement on that level of the context,” and while some districts have requested to be notified about references to previous suicide attempts, it’s ultimately up to administrators to “decide the proper response, if any.”  

‘A crisis on our hands’

Minneapolis Public Schools first contracted with Gaggle in the spring of 2020 as the pandemic forced students nationwide into remote learning. Through AI and the content moderator team, Gaggle tracks students’ online behavior every day by analyzing materials on their school-issued Google and Microsoft accounts. The tool scans students’ emails, chat messages and other documents, including class assignments and personal files, in search of keywords, images or videos that could indicate self-harm, violence or sexual behavior. The remote moderators evaluate flagged materials and notify school officials about content they find troubling. 

In Minneapolis, Gaggle flagged students for keywords related to pornography, suicide and violence, according to six months of incident reports obtained by The 74 through a public records request. The private company also captured their journal entries, fictional stories and classroom assignments. 

Gaggle executives maintain that the system saves lives, pointing to students it flagged during the 2020-21 school year. Those figures have not been independently verified. Minneapolis school officials make similar assertions. Though the pandemic’s effects on suicide rates remain fuzzy, suicide has been a leading cause of death among teenagers for years. Patterson, who has watched his business grow during COVID-19, said Gaggle could be part of the solution. Though not part of its contract with Minneapolis schools, the company recently launched a service that connects students flagged by the monitoring tool with teletherapists. 

“Before the pandemic, we had a crisis on our hands,” he said. “I believe there’s a tsunami of youth suicide headed our way that we are not prepared for.” 

Schools nationwide have increasingly relied on technological tools that purport to keep kids safe, yet there’s little independent evidence to back up their claims.

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

Like many parents, Logsdon-Wallace’s mother Alexis Logsdon didn’t know Gaggle existed until she got the call from his school counselor. Luckily, the counselor recognized that Logsdon-Wallace was discussing events from the past and offered a measured response. His mother was still left baffled. 

“That was an example of somebody describing really good coping mechanisms, you know, ‘I have music that is one of my soothing activities that helps me through a really hard mental health time,’” she said. “But that doesn’t matter because, obviously, this software is not that smart — it’s just like ‘Woop, we saw the word.’” 

‘Random and capricious’

Many students have accepted digital surveillance as an inevitable reality at school, according to a new survey by the Center for Democracy and Technology in Washington, D.C. But some youth are fighting back, including Lucy Dockter, a 16-year-old junior from Westport, Connecticut. On multiple occasions over the last several years, Gaggle has flagged her communications — an experience she described as “really scary.”

“If it works, it could be extremely beneficial. But if it’s random, it’s completely useless.”
Lucy Dockter, 16, Westport, Connecticut student mistakenly flagged by Gaggle

On one occasion, Gaggle sent her an email notification of “Inappropriate Use” while she was walking to her first high school biology midterm and her heart began to race as she worried what she had done wrong. Dockter is an editor of her high school’s literary journal and, according to her, Gaggle had ultimately flagged profanity in students’ fictional article submissions. 

“The link at the bottom of this email is for something that was identified as inappropriate,” Gaggle warned in its email while pointing to one of the fictional articles. “Please refrain from storing or sharing inappropriate content in your files.” 

Gaggle emailed a warning to Connecticut student Lucy Dockter for profanity in a literary journal article. (Photo courtesy Lucy Dockter)

But Gaggle doesn’t catch everything. Even as she got flagged when students shared documents with her, the articles’ authors weren’t receiving similar alerts, she said. And neither did Gaggle’s AI pick up when she wrote about the discrepancy in an article in which she included a four-letter swear word to make a point. In the article, which Dockter wrote with Google Docs, she argued that Gaggle’s monitoring system is “random and capricious,” and could be dangerous if school officials rely on its findings to protect students. 

Her experiences left the Connecticut teen questioning whether such tracking is even helpful. 

“With such a seemingly random service, that doesn’t seem to — in the end — have an impact on improving student health or actually taking action to prevent suicide and threats,” she said in an interview. “If it works, it could be extremely beneficial. But if it’s random, it’s completely useless.”


Some schools have asked Gaggle to email students about the use of profanity, but Patterson said the system has an error that he blamed on the tech giant Google, which at times “does not properly indicate the author of a document and assigns a random collaborator.”

“We are hoping Google will improve this functionality so we can better protect students,” Patterson said. 

Back in Minneapolis, attorney Cate Long said she became upset when she learned that Gaggle was monitoring her daughter on her personal laptop, which 10-year-old Emmeleia used for remote learning. She grew angrier when she learned the district didn’t notify her that Gaggle had identified a threat. 

This spring, a classmate used Google Hangouts, the chat feature, to send Emmeleia a death threat, warning she’d shoot her “puny little brain with my grandpa’s rifle.”

Minneapolis mother Cate Long said a student used Google Hangouts to send a death threat to her 10-year-old daughter Emmeleia. Officials never informed her about whether Gaggle had flagged the threat. (Photo courtesy Cate Long)

When Long learned about the chat, she notified her daughter’s teacher but was never informed about whether Gaggle had picked up on the disturbing message as well. Missing warning signs could be detrimental to both students and school leaders; districts could face legal liability if they fail to act on credible threats.

“I didn’t hear a word from Gaggle about it,” she said. “If I hadn’t brought it to the teacher’s attention, I don’t think that anything would have been done.” 

The incident, which occurred in April, fell outside the six-month period for which The 74 obtained records. A Gaggle spokesperson said the company picked up on the threat and notified district officials an hour and a half later but it “does not have any insight into the steps the district took to address this particular matter.” 

Julie Schultz Brown, the Minneapolis district spokeswoman, said that officials “would never discuss with a community member any communication flagged by Gaggle.” 

“That unrelated but concerned parent would not have been provided that information nor should she have been,” she wrote in an email. “That is private.” 

Cate Long poses with her 10-year-old daughter Emmeleia. (Photo courtesy Cate Long)

‘The big scary algorithm’

When identifying potential trouble, Gaggle’s algorithm relies on keyword matching that compares student communications against a dictionary of thousands of words the company believes could indicate potential issues. The company scans student emails before they’re delivered to their intended recipients, said Patterson, the CEO. Files within Google Drive, including Docs and Sheets, are scanned as students write in them, he said. In one instance, the technology led to the arrest of a 35-year-old Michigan man who tried to send pornography to an 11-year-old girl in New York. Gaggle prevented the file from ever reaching its intended recipient.  

Though the company allows school districts to alter the keyword dictionary to reflect local contexts, less than 5 percent of districts customize the filter, Patterson said. 
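The filter described above is essentially a context-free lookup: a document gets flagged if any dictionary term appears in it, regardless of meaning. The following minimal sketch illustrates that approach and why it produces the false positives moderators describe. The dictionary contents, function name and district-override mechanism are invented for illustration; this is not Gaggle's actual code.

```python
# Illustrative sketch of context-free keyword matching, as described in
# the article. All names and dictionary terms here are hypothetical.
import re

# A tiny stand-in for the company's dictionary of thousands of terms.
DEFAULT_DICTIONARY = {"suicide", "kill", "gun", "gay", "lesbian"}

def flag_document(text, extra_terms=None):
    """Return every dictionary term found in `text`, ignoring context.

    `extra_terms` stands in for a district-customized filter (per the
    article, fewer than 5 percent of districts customize it).
    """
    terms = DEFAULT_DICTIONARY | set(extra_terms or ())
    words = set(re.findall(r"[a-z']+", text.lower()))
    return sorted(terms & words)

# Context-free matching cannot tell a book report from a threat:
essay = "My essay on To Kill a Mockingbird argues the trial scene..."
print(flag_document(essay))  # ['kill'], a false positive
```

Because the match ignores context, an essay on To Kill a Mockingbird trips the same wire as a genuine threat, which is consistent with the pattern of benign flags that former moderators and families report.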

That’s where potential problems could begin, said Sara Jordan, an expert on artificial intelligence and senior researcher at the Future of Privacy Forum in Washington. For example, language that students use to express suicidal ideation could vary between Manhattan and rural Appalachia, she said.

“We’re using the big scary algorithm term here when I don’t think it applies. This is not Netflix’s recommendation engine. This is not Spotify.”
Sara Jordan, AI expert and senior researcher, Future of Privacy Forum


On the other hand, she noted that false positives are highly likely, especially when the system flags common swear words and fails to understand context. 

“You’re going to get 25,000 emails saying that a student dropped an F-bomb in a chat,” she said. “What’s the utility of that? That seems pretty low.” 

She said that Gaggle’s utility could be impaired because it doesn’t adjust to students’ behaviors over time, comparing it to Netflix, which recommends television shows based on users’ ever-evolving viewing patterns. “Something that doesn’t learn isn’t going to be accurate,” she said. For example, she said the program could be more useful if it learned to ignore the profane but harmless literary journal entries submitted to Dockter, the Connecticut student. Gaggle’s marketing materials appear to overhype the tool’s sophistication to schools, she said. 

“We’re using the big scary algorithm term here when I don’t think it applies,” she said. “This is not Netflix’s recommendation engine. This is not Spotify. This is not American Airlines serving you specific forms of flights based on your previous searches and your location.” 

“Artificial intelligence without human intelligence ain’t that smart.”
Jeff Patterson, Gaggle founder and CEO

Patterson said Gaggle’s proprietary algorithm is updated regularly “to adjust to student behaviors over time and improve accuracy and speed.” The tool monitors “thousands of keywords, including misspellings, slang words, evolving trends and terminologies, all informed by insights gleaned over two decades of doing this work.” 

Ultimately, the algorithm to identify keywords is used to “narrow down the haystack as much as possible,” Patterson said, and Gaggle content moderators review materials to gauge their risk levels. 

“Artificial intelligence without human intelligence ain’t that smart,” he said. 

In Minneapolis, officials denied that Gaggle infringes on students’ privacy and noted that the tool only operates within school-issued accounts. The district’s internet use policy states that students should “expect only limited privacy,” and that the misuse of school equipment could result in discipline and “civil or criminal liability.” District leaders have also cited compliance with the Clinton-era Children’s Internet Protection Act, which became law in 2000 and requires schools to monitor “the online activities of minors.” 

Patterson suggested that teachers aren’t paying close enough attention to keep students safe on their own and “sometimes they forget that they’re mandated reporters.” On the company’s website, Patterson says he launched the company in 1999 to provide teachers with “an easy way to watch over their gaggle of students.” Legally, teachers are mandated to report suspected abuse and neglect, but Patterson broadens their sphere of responsibility and his company’s role in meeting it. As technology becomes a key facet of American education, Patterson said that schools “have a moral obligation to protect the kids on their digital playground.” 

But Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, argued the federal law was never intended to mandate student “tracking” through artificial intelligence. In fact, the statute includes a disclaimer stating it shouldn’t be “construed to require the tracking of internet use by any identifiable minor or adult user.” Her group has urged the government to clarify the Children’s Internet Protection Act’s requirements and distinguish monitoring from tracking individual student behaviors. 

Sen. Elizabeth Warren, a Democrat from Massachusetts, agrees. In recent letters to Gaggle and other education technology companies, Warren and other Democratic lawmakers said they’re concerned the tools “may extend beyond” the law’s intent “to surveil student activity or reinforce biases.” Around-the-clock surveillance, they wrote, demonstrates “a clear invasion of student privacy, particularly when students and families are unable to opt out.” 

“Escalations and mischaracterizations of crises may have long-lasting and harmful effects on students’ mental health due to stigmatization and differential treatment following even a false report,” the senators wrote. “Flagging students as ‘high-risk’ may put them at risk of biased treatment from physicians and educators in the future. In other extreme cases, these tools can become analogous to predictive policing, which are notoriously biased against communities of color.”

A new kind of policing

Shortly after the school district piloted Gaggle for distance learning, education leaders were met with an awkward dilemma. Floyd’s murder at the hands of a Minneapolis police officer prompted Minneapolis Public Schools to sever its ties with the police department for school-based officers and replace them with district security officers who lack the authority to make arrests. Gaggle flags district security when it identifies student communications the company believes could be harmful. 

Some critics have compared the surveillance tool to a new form of policing that, beyond broad efficacy concerns, could have a disparate impact on students of color, similar to traditional policing. Artificial intelligence systems have repeatedly been shown to suffer biases. 

Matt Shaver, who taught at a Minneapolis elementary school during the pandemic but no longer works for the district, said he was concerned that racial bias could be baked into Gaggle’s algorithm. Absent adequate context or nuance, he worried the tool could lead to misunderstandings. 

Data obtained by The 74 offer a limited window into Gaggle’s potential effects on different student populations. Though the district withheld many details in the nearly 1,300 incident reports, just over 100 identified the campuses where the involved students attended school. An analysis of those reports failed to identify racial discrepancies. Specifically, Gaggle was about as likely to issue incident reports in schools where children of color were the majority as it was at campuses where most children were white. It remains possible that students of color in predominantly white schools may have been disproportionately flagged by Gaggle or faced disproportionate punishment once identified. Broadly speaking, Black students are far more likely to be suspended or arrested at school than their white classmates, according to federal education data. 

Gaggle and Minneapolis district leaders acknowledged that students’ digital communications are forwarded to police in rare circumstances. The Minneapolis district’s internet use policy explains that educators could contact the police if students use technology to break the law and a document given to teachers about the district’s Gaggle contract further highlights the possibility of law enforcement involvement. 

Jason Matlock, the Minneapolis district’s director of emergency management, safety and security, said that law enforcement is not a “regular partner,” when responding to incidents flagged by Gaggle. It doesn’t deploy Gaggle to get kids into trouble, he said, but to get them help. He said the district has interacted with law enforcement about student materials flagged by Gaggle on several occasions, but only in cases related to child pornography. Such cases, he said, often involve students sharing explicit photographs of themselves. During a six-month period from March to September 2020, Gaggle flagged Minneapolis students more than 120 times for incidents related to child pornography, according to records obtained by The 74.

Jason Matlock, the director of emergency management, safety and security at the Minneapolis school district, discusses the decision to partner with Gaggle as students moved to remote learning during the pandemic. (Screenshot)

“Even if a kid has put out an image of themselves, no one is trying to track them down to charge them or to do anything negative to them,” Matlock said, though it’s unclear if any students have faced legal consequences. “It’s the question as to why they’re doing it,” and to raise the issue with their parents.

Gaggle’s keywords could also have a disproportionate impact on LGBTQ children. In three-dozen incident reports, Gaggle flagged keywords related to sexual orientation, including “gay” and “lesbian.” On at least one occasion, school officials outed an LGBTQ student to their parents.

Logsdon-Wallace, the 13-year-old student, called the incident “disgusting and horribly messed up.” 

“They have gay flagged to stop people from looking at porn, but one, that is going to be mostly targeting people who are looking for gay porn and two, it’s going to be false-positive because they are acting as if the word gay is inherently sexual,” he said. “When people are just talking about being gay, anything they’re writing would be flagged.” 

The service could also have a heavier presence in the lives of low-income families, he added, who may end up being more surveilled than their affluent peers. Logsdon-Wallace said he knows students who rely on school devices for personal uses because they lack technology of their own. Among the 1,300 Minneapolis incidents contained in The 74’s data, only about a quarter were reported to district officials on school days between 8 a.m. and 4 p.m.

“That’s definitely really messed up, especially when the school is like ‘Oh no, no, no, please keep these Chromebooks over the summer,’” an invitation that gave students “the go-ahead to use them” for personal reasons, he said.

“Especially when it’s during a pandemic when you can’t really go anywhere and the only way to talk to your friends is through the internet.”

]]>
Dems Warn School Surveillance Tools Could Compound ‘Risk of Harm for Students’ /article/democratic-lawmakers-demand-student-surveillance-companies-outline-business-practices-warn-the-security-tools-may-compound-risk-of-harm-for-students/ Mon, 04 Oct 2021 20:41:00 +0000 /?post_type=article&p=578691 Updated, Oct. 5

A group of Democratic lawmakers has demanded that several education technology companies that monitor children online explain their business practices, arguing that around-the-clock digital surveillance demonstrates “a clear invasion of student privacy, particularly when students and families are unable to opt out.”

In letters sent to the companies last week, Democratic Sens. Elizabeth Warren, Ed Markey and Richard Blumenthal asked them to explain steps they’re taking to ensure the tools aren’t “unfairly targeting students and perpetuating discriminatory biases,” and comply with federal laws. The letters went to executives at Gaggle, Securly, GoGuardian and Bark Technologies, each of which uses artificial intelligence to analyze students’ online activities and identify behaviors they believe could be harmful.




“Education technology companies have developed software that are advertised to protect student safety, but may instead be surveilling students inappropriately, compounding racial disparities in school discipline and draining resources from more effective student supports,” the lawmakers wrote in the letters. Though the tools are marketed as student safety solutions — and grew rapidly as schools shifted to remote learning during the pandemic — there’s little independent evidence that they work as advertised. Some critics, including the lawmakers, argue they may do more harm than good. “The use of these tools may break down trust within schools, prevent students from accessing critical health information and discourage students from reaching out to adults for help, potentially increasing the risk of harm for students,” the senators wrote.

The letters cited a recent investigation by The 74, which outlined how Gaggle’s AI-driven surveillance tool and human content moderators subject children to relentless digital surveillance long after classes end for the day, including on weekends, holidays, late at night and over the summer. In Minneapolis, the company notified school security when it identified students who made references to suicide, self-harm and violence. But it also analyzed students’ classroom assignments, journal entries, chats with friends and fictional stories.

Each of the companies offers a different level of remote student surveillance. Gaggle, for example, analyzes emails, chat messages and digital files on students’ school-issued Google and Microsoft accounts. Others monitor students’ social media accounts and web browsing histories, among other activities.

The letters were particularly critical of the tools’ capacity to track student behaviors 24/7 — including when students are at home — and their ability to monitor students on their personal devices in some cases.

Schools’ use of digital monitoring tools has become commonplace in recent years. More than 80 percent of teachers reported using the tools, according to a recent survey by the Center for Democracy and Technology. Among those who participated in the survey, nearly a third reported that they monitor student activity at all hours of the day and just a quarter said it was limited to school hours.

“Because of the lack of transparency, many students and families are unaware that nearly all of their children’s online behavior is being tracked,” according to the letters. “When students and families are aware, they are often unable to opt out because school-issued devices are given to students with the software already installed, and many students rely on these devices for remote or at-home learning.”

A Securly spokesperson said in an email the company is “reviewing the correspondence received” from the lawmakers and is in the process of responding to their requests for information. He said the company is “deeply committed to continuously evolving our technology” to help schools protect students online. A Gaggle spokesperson said the company appreciates the lawmakers’ interest in learning how the tool “serves as an early warning system to help school districts prevent tragedies such as suicide, acts of violence, child pornography and other dangerous situations.” A GoGuardian spokesman said the company cares “deeply about keeping students safe and protecting their privacy.”

Bark officials didn’t respond to requests for comment.

The Clinton-era Children’s Internet Protection Act, passed in 2000, requires schools to filter and monitor students’ internet use to ensure they aren’t accessing material that is “harmful to minors,” such as pornography. Student privacy advocates have long argued that a newer generation of AI-driven tools goes beyond the law’s scope and have urged federal officials to clarify its requirements. The law includes a disclaimer noting that it does not “require the tracking of internet use by any identifiable minor or adult user.” It “remains an open question” as to whether schools’ use of digital tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures, according to a report by the Future of Privacy Forum.

In their letters, senators highlighted how digital surveillance tools could perpetuate several educational inequities. For example, the tools could have a disproportionate impact on students of color and further uphold longstanding racial disparities in student discipline.

“School disciplinary measures have a long history of disproportionately targeting students of color, who face substantially more punitive discipline than their white peers for equivalent offenses,” according to the letters. “These disciplinary records, even when students are cleared, may have life-long harmful consequences for students.”

Meanwhile, the tools may have a larger impact on low-income students who rely on school technology to access the internet than those who can afford personal computers. Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, said their research “revealed a worrisome lack of transparency” around how these educational technology companies track students online and how schools rely on their tools.

“Responses to this letter will help shine a light on these tools and strategies to mitigate the risks to students, especially those who are most reliant on school-issued devices,” she said in an email.

]]>