Big Tech’s self-regulatory effort has long been accused of being toothless. Is that about to change?
A few months after education leaders at America’s largest school district announced that a technology vendor had exposed sensitive student information in a massive data breach, the company at fault — Illuminate Education — was recognized with the software industry’s equivalent of the Oscars.
Since that disclosure in New York City schools, the scope of the breach has only grown, with districts in six states announcing that some 3 million current and former students had become victims. Illuminate has never disclosed the full extent of the blunder, even as critics decry significant harm to kids and security experts question why the company is being handed awards instead of getting slapped with sanctions.
Amid demands that Illuminate be held accountable for the breach — and for allegations that it misrepresented its security safeguards — the company could soon face unprecedented discipline for violating the Student Privacy Pledge, a self-regulatory effort by Big Tech to police shady business practices. In response to inquiries by The 74, the Future of Privacy Forum, a think tank and co-creator of the pledge, disclosed Tuesday that Illuminate could soon get the boot.
Forum CEO Jules Polonetsky said his group would decide within a month whether to revoke Illuminate’s status as a pledge signatory and refer the matter to state and federal regulators, including the Federal Trade Commission, for possible sanctions.
“We have been reviewing the deeply concerning circumstances of the breach and apparent violations of Illuminate Education’s pledge commitments,” Polonetsky said in a statement to The 74.
Illuminate did not respond to interview requests.
In a twist, the pledge was co-created by the Software and Information Industry Association, the trade group that recognized Illuminate last month as being among “the best of the best” in education technology. The pledge, created nearly a decade ago, is meant to ensure that education technology vendors are ethical stewards of kids’ most sensitive data. Its staunchest critics have assailed the pledge as being toothless — if not an outright effort to thwart meaningful government regulation. Now, they are questioning whether its response to the massive Illuminate breach will be any different.
“I have never seen anybody get anything more than a slap on the wrist from the actual people controlling the pledge,” said Bill Fitzgerald, an independent privacy researcher. Taking action against Illuminate, he said, “would break the pledge’s pretty perfect record for not actually enforcing any kind of sanctions against bad actors.”
Through the voluntary pledge, launched in 2014, hundreds of education technology companies have agreed to a slate of safety measures to protect students’ online privacy. Pledge signatories, including Illuminate, have promised they will not sell student data to third parties or use the information for targeted advertising. Companies that sign the commitment also agree to “maintain a comprehensive security program” to protect students’ personal information from data breaches.
The privacy forum, which is funded by tech companies, has long maintained that the pledge is legally binding and offers assurances to school districts as they shop for new technology. In the absence of a federal consumer privacy law, the forum argues the pledge grants “an important and unique means for privacy enforcement,” giving the Federal Trade Commission and state attorneys general an outlet to hold education technology companies accountable via consumer protection rules that prohibit unfair and deceptive business practices.
For years, critics have accused the pledge of providing educators and parents false assurances that a given product is safe, rendering it less useful than a pinky promise. Meanwhile, schools and technology companies have become increasingly entangled — particularly during the pandemic. As districts across the globe rushed to create digital classrooms, few governments checked to make sure the tech products officials endorsed were safe for children, according to a recent report by Human Rights Watch. The group found that shoddy student data practices by leading tech vendors were rampant. Of the 164 tools analyzed, 89 percent “engaged in data practices that put children’s rights at risk,” with a majority giving student records to advertisers.
As companies suck up a mind-boggling amount of student information, a lack of meaningful enforcement has let tech companies off the hook for violating students’ privacy rights, said Hye Jung Han, a Human Rights Watch researcher focused on children. As a result, she said, students whose schools require them to use certain digital tools are being forced to “give up their privacy in order to learn.” Paired with large-scale data breaches, like the one at Illuminate, she said, students’ sensitive records could be misused for years.
“Children, as we know, are more susceptible to manipulation based on what they see online,” she said. “So suddenly, the information collected about them in the classroom is being used to determine the kinds of content and the kinds of advertising they see elsewhere on the internet. It can absolutely start influencing their worldviews.”
But the regulatory environment under the Biden administration may be entering a new, more aggressive era. In May, the Federal Trade Commission announced that it would scale up enforcement on education technology companies that sell student data for targeted advertising and that “illegally surveil children when they go online to learn.” Even absent a data breach like the one at Illuminate, education technology providers violate the federal Children’s Online Privacy Protection Act if they lack reasonable systems “to maintain the confidentiality, security, and integrity of children’s personal information,” the commission wrote in a policy statement.
The FTC declined to comment for this article. Jeff Joseph, president of the Software and Information Industry Association, said its recent awards were based on narrow criteria, and judges “would not be expected to be aware of the breach unless the company disclosed it during the demos.” News of the breach was widely covered in the weeks before the June awards ceremony.
The trade group “takes the privacy and security of student data seriously,” Joseph said in a statement, noting that the Future of Privacy Forum “maintains the day-to-day management of the pledge.”
Concerns of a data breach at California-based Illuminate began to emerge in January when several of the privately held company’s popular digital tools, including programs used in New York City to track students’ grades and attendance, went dark.
Yet it wasn’t until March that city leaders announced that the personal data of some 820,000 current and former students — including their eligibility for special education services and for free or reduced-price lunches — had been compromised in a data breach. In disclosing the breach, city education officials accused the company of misrepresenting its security safeguards. The Department of Education, which reportedly paid Illuminate $16 million over the last three years, told schools to stop using the company’s tools in May.
A month later, officials at the New York State Education Department launched an investigation into whether the company’s data security practices ran afoul of state law. Under the law, education vendors must maintain “reasonable” data security safeguards and must notify schools about data breaches “in the most expedient way possible and without unreasonable delay.”
State officials said the breach affected about 174,000 additional students across the state outside New York City.
Doug Levin, the national director of The K12 Security Information eXchange, said the state should issue “a significant fine” to Illuminate for misrepresenting its security protocols to educators. Sanctions, he said, would “send a strong and very important signal that not only must you ensure that you have reasonable security in place, but if you say you do and you don’t, you will be penalized.”
Meanwhile, Illuminate has become the subject of two federal class-action lawsuits in New York and California, including one that alleges that students’ sensitive information “is now an open book in the hands of unknown crooks” and is likely being sold on the dark web “for nefarious and mischievous ends.”
Plaintiff attorney Gary Graifman said that litigation is crucial for consumers because state attorneys general are often too busy to hold companies accountable.
“There’s got to be some avenue of interdiction that occurs so that companies adhere to policies that guarantee people their private information will be secured,” he said. “Obviously, if there is strong federal legislation that occurs in the future, maybe that would be helpful, but right now, that is not the case.”
School districts in California, Colorado, Connecticut, Oklahoma, and Washington have since disclosed to current and former students that their personal information had been compromised in the breach. But the full extent remains unknown because “Illuminate has been the opposite of forthcoming about what has occurred,” Levin said.
Most states do not require companies to disclose data breaches to the public. Some 5,000 schools serving 17 million students use Illuminate tools, according to the company, which was founded in 2009.
“We now know that millions of students have been affected by this incident, from coast to coast in some of the largest school districts in the nation,” including in New York City and Los Angeles, Levin said. “That is absolutely concerning, and I think it shines a light on the role of school vendors,” who are a significant source of education data breaches.
No organization, not even the National Security Agency, can guarantee that its cybersecurity infrastructure will hold up against motivated hackers, Levin said, but Illuminate’s failure to disclose the extent of the breach raises a major red flag.
“The longer that Illuminate does not come clean with what’s happened, the worse it looks,” he said. “It suggests that this was maybe leaning on the side of negligence versus them being an unfortunate victim.”
‘A public relations tool’
When Illuminate signed the privacy pledge six years ago, it acknowledged the importance of protecting students’ data and said it offered a “secure online environment with data privacy securely in place.” On its website, Illuminate touts an “unwavering commitment to student data privacy” and offers a link to the pledge.
“By signing this pledge,” the company wrote in a 2016 blog post, “we are making a commitment to continue doing what we have already been doing from the beginning — promoting that student data be safeguarded and used for encouraging student and educator success.”
Some pledge critics have accused tech companies of using it as a marketing tool. In 2018, a Duke Law and Technology Review report argued that pledge noncompliance was rampant and accused it of being “a mirage” that offered comfort to consumers “while providing little actual benefit.”
“The pledge may be more valuable as a public relations tool than as a means of actually effecting — or reflecting — industry improvements,” according to the report. Gaps between the pledge’s public declarations and companies’ business practices, it concluded, “are likely to mislead consumers.”
In 2015, a software researcher found that a large share of pledge signatories lacked basic encryption infrastructure to protect student data from hackers. Three years later, The New York Times published a damning report on the College Board, a nonprofit that administers the widely used SAT college admissions exam. College Board, the report exposed, was selling student data to third parties in violation of the privacy pledge. In response, the privacy forum announced the College Board’s status as a pledge signatory had been placed “under review” but restored its status as an active signatory a year later. The College Board, it said in a press release, had committed to changing its business practices.
Still, in 2020 a Consumer Reports investigation found the College Board was sending student data to major digital advertising platforms, including those operated by Microsoft and Google. The College Board remains a pledge signatory.
The nonprofit is “resolute in protecting student data privacy,” a spokesperson said in a statement. “Organizations that receive data from College Board, such as high schools, districts, colleges, universities, and scholarship organizations, must adhere to strict guidelines when using that data.”
Some critics have argued the College Board should have been removed from the pledge, but the Future of Privacy Forum has held that taking such action against signatories could do more harm than good. According to a recent blog post, when the forum becomes aware of a complaint against a pledge signatory, it typically works with the company to resolve issues and ensure compliance. The think tank argued it’s best to work with non-compliant companies to improve their business practices rather than exile them from the pledge outright. Removing companies, the post argued, “could result in fewer privacy protections for users, as a former signatory would not be bound by the Pledge’s promises for future activities.”
Attorney Amelia Vance, a former privacy forum employee and the founder and president of Public Interest Privacy Consulting, said the pledge had nudged education technology companies to change their business practices to ensure they follow its provisions.
“I almost always thought of it as a way to make companies better and more aware of student privacy than something to be enforced with specific teeth,” said Vance, who declined to comment on whether Illuminate should be removed. “After all, the Federal Trade Commission and state [attorneys general] are the ones who really have the enforcement powers here.”
But self-policing efforts, like the pledge, are “only as effective as the enforcement,” said Levin, the school security expert. Otherwise, they serve only as “a nice window dressing” for Big Tech’s efforts to fend off stricter state and federal regulations — provisions he said must be strengthened.
At a minimum, he said, the privacy forum should disclose which companies have been credibly accused of violating the pledge and conduct investigations. If it finds a company out of compliance, he said, “it’s not clear to me that they should be allowed to re-sign the pledge.”
“If I were another signatory of the pledge, I would be quite concerned about whether or not the value of that pledge is being diminished” by including companies that violate its provisions, he said. “If it’s going to serve its purpose, there needs to be some policing.”
But to Fitzgerald, the privacy researcher, the forum’s failure to take action against bad actors has long rendered the pledge useless.
“It’s not like the pledge finally doing what it should have been doing five years ago would make a difference,” he said. “It’s never too late to start” removing companies that violate its provisions, he said, but “the fact that it hasn’t happened yet seems to indicate that it’s not going to happen.”