EFF legal intern Rob Ferrari was the lead author of this post.
A new school year has started, the second one since the pandemic began. With our education system becoming increasingly reliant on the use of technology (“edtech”), especially for remote learning during the pandemic, protecting student privacy is more important than ever. Unfortunately, the Future of Privacy Forum’s 2020 Student Privacy Pledge, like the legacy version, continues to provide schools, parents, and students with false assurance due to numerous loopholes for the edtech company signatories that collect and use student data.
The Future of Privacy Forum (FPF) originally launched the Student Privacy Pledge in 2014 to encourage edtech companies to take voluntary steps to protect the privacy of K-12 students. In 2016, after the Legacy Pledge had reached 300 signatories, we criticized it—much to FPF’s dismay.
The 2020 Pledge once again falls short: it fails to adequately define material terms, such as “Student PII” and “School Service Providers”; many of its commitments are conditioned on school or parent/student consent, which may inadequately protect student privacy; and its new commitments are insufficiently precise.
Additionally, while the Student Privacy Pledge is a self-regulatory program, FPF emphasizes that companies that choose to sign the Pledge are making public promises enforceable by the Federal Trade Commission (FTC) and state attorneys general under consumer protection laws. But this is cold comfort: enforcement actions against edtech companies for violating students’ privacy have been few and far between.
Loopholes in Definitions
Similar to our prior criticisms of FPF’s Legacy Student Privacy Pledge, the 2020 Pledge is filled with inconsistent terminology and fails to define material terms. This creates a disconnect between what schools, parents, and students might reasonably expect when reading the 2020 Pledge and what companies actually must do to comply with it. In short, inconsistent and vague terms undermine the Pledge’s ability to hold companies accountable.
Will the 2020 Pledge Protect Sensitive Student Data?
It’s unclear.
First, the 2020 Pledge commitments primarily apply to “student personally identifiable information” (“Student PII”), a new term that is said to have the same definition as “covered information” as defined in California’s Student Online Personal Information Protection Act (SOPIPA). But “covered information” in SOPIPA includes the term “personally identifiable information,” making the 2020 Pledge definition, in part, circular. Furthermore, SOPIPA does not define “personally identifiable information,” and leaving the term’s meaning up to the companies is not sufficient. This creates compliance challenges for signatories, which must assess the data provided about a student to determine whether it could be construed as “personally identifiable information.” Ironically, FPF itself criticized SOPIPA (pp. 18-19) for being difficult to implement because the statute fails to define “personally identifiable information.” So why does the 2020 Pledge reference SOPIPA when FPF thinks the statute is challenging to implement?
Second, the 2020 Pledge’s definition of “Student PII” includes an exception for “de-identified information.” Signatories are, therefore, free to collect and use student data contrary to the Pledge’s commitments so long as the data is “de-identified.” While the U.S. Department of Education has drafted guidance on data de-identification, the 2020 Pledge fails to define that term and thus fails to provide a standard for de-identification that provides some baseline privacy protection and that can be used to determine signatories’ compliance with the Pledge.
Not all de-identification processes provide adequate protection. For example, an edtech provider might build a student profile that contains sensitive student data and then simply replace that student’s name with an ID number. This practice would weakly protect student privacy, but would it fall within the 2020 Pledge’s “de-identified information” exception? More stringent de-identification processes, such as aggregation of student data, could still compromise student privacy because certain types of data are more sensitive than others. For example, location data is extremely sensitive, and even in isolation, it could reveal patterns of a student’s daily habits; track the student’s precise whereabouts at any given moment; and compromise the student’s identity through extrapolation.
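To make this concrete, here is a minimal sketch, in Python, of the name-for-ID substitution described above. The record fields and values are hypothetical, invented purely for illustration, and do not describe any actual signatory’s data practices.

```python
# A minimal sketch of the weak "de-identification" practice described above.
# All field names and values are hypothetical, for illustration only.
import uuid

student_record = {
    "name": "Jane Doe",
    "school": "Lincoln Middle School",
    "grade": 7,
    "reading_level": "below grade level",
    "home_zip": "94110",
    "device_location_history": ["37.7599,-122.4148", "37.7489,-122.4184"],
}

def pseudonymize(record):
    """Replace the direct identifier with a random ID, leaving everything else intact."""
    deidentified = dict(record)
    deidentified.pop("name")
    deidentified["student_id"] = str(uuid.uuid4())
    return deidentified

profile = pseudonymize(student_record)
# The profile no longer carries a name, but the school, grade, ZIP code, and
# location history it still contains can often re-identify a single student.
print(profile)
```

Even though the output contains no name, the remaining quasi-identifiers would let a motivated party link the profile back to one child, which is why an undefined “de-identified information” exception offers so little assurance.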
Admittedly, standards may be difficult to draft, and even best practices for de-identification carry some risk, as re-identification processes will become more sophisticated over time. But a minimum requirement for de-identification would help close this otherwise fairly large loophole. By contrast, leaving the term undefined and open to individual company interpretation creates a broad exception that undermines the Pledge’s ability to hold companies accountable and meaningfully protect student privacy.
Which Companies Are Subject to the 2020 Pledge?
The 2020 Pledge commitments apply to “School Service Providers,” which is a Legacy Pledge term that has been revised. Despite the revisions, the term continues to create confusion regarding when a signatory company is subject to the 2020 Pledge obligations.
First, the 2020 Pledge's definition of a “School Service Provider” is inconsistent as to whether a company must both design and market a product/service for schools in order to be bound by the Pledge, or whether simply marketing its product/service for schools is sufficient.
Under the first line of the definition, a company qualifies as a “School Service Provider” simply by marketing its product/service for use in schools. This broader definition is better for students because it binds more companies. But in the second line, the Pledge creates an exception when a product/service is not both designed and marketed for schools. FPF’s FAQ further explains that a product/service must be designed for education, not just marketed (see “Is my company eligible to take the Pledge?”).
Similar to our prior criticism of the Legacy Pledge, this is problematic because a company could be a signatory to the Pledge and qualify as a “School Service Provider” by marketing its product/service (or one of its products/services) for schools, but that same company could then argue that data collection via a particular product/service is not subject to the Pledge commitments because the product/service was only marketed—and not also designed—for schools.
For example, Beanstack by Zoobean, a 2020 Pledge signatory, is a product that encourages reading through various challenges. While Beanstack is marketed to schools as well as libraries, companies, and consumers, could Zoobean argue that its product was simply marketed to, but not designed for, schools? What if Beanstack were actually designed for use by libraries or consumers, and schools only incidentally showed demand for the product?
Zoobean isn’t alone. It’s unclear what market signatory AvePoint, a Microsoft 365 “data management solutions provider,” designed its products for, though they appear to be marketed to corporations, governments, and higher education (which is ironic, given that the Pledge applies to providers serving K-12 schools, not colleges and universities). Botdoc, a secure data transfer service, does not appear to be designed for schools at all, yet presumably is marketed to them (seeing as it’s a Pledge signatory). These are just a few examples of why the confusing “School Service Provider” definition is a problem.
Second, a company can qualify as a signatory to the Pledge if it offers even a single product/service that matches FPF’s definition of “School Service Provider” (notwithstanding the confusion surrounding the definition). But that company would not be bound by the Pledge for any of its other products/services that fall outside of FPF’s definition. As FPF explained in a blog post (again adding to the marketing/design confusion): “One of the most common misunderstandings about the Pledge is the assumption that the Pledge applies to all products offered by a signatory or used by a student. However, the Student Privacy Pledge applies to ‘school service providers’—companies that design and market their services and devices for use in schools.” This is concerning because schools, parents, and students might mistakenly trust a brand’s entire suite of products/services based on its status as a signatory to the Student Privacy Pledge. It also doesn’t help that this caveat is not readily discernible from even a close read of the 2020 Pledge and FAQ.
Similarly, it’s unfortunate that the Pledge applies to only K-12 education and doesn’t apply to colleges and universities. Higher education schools and students might mistakenly believe that a Pledge signatory is obligated to protect the privacy of data collected from post-secondary students, when this is not the case.
Notification Required for Changes to Which Privacy Policy?
The 2020 Pledge requires signatories to provide prominent notice to schools, parents, and students when making changes to “educational privacy policies.” This is a narrowing change from the Legacy Pledge, which required companies to provide notice when changing their “consumer privacy policies.” The definitional section of the 2020 Pledge fails to define “educational privacy policies,” instead defining only “consumer privacy policies” (which is likely a mistake made during the revision process).
Without providing a definition of “educational privacy policies,” this change is problematic. For example, Google Workspace for Education has a privacy notice that cross-references the company’s general privacy policies. Could Google argue that it’s not obligated under the 2020 Pledge to notify schools, parents, and students when it makes changes to its general privacy policies, because those are not solely for its educational products? By failing to define “educational privacy policies,” the 2020 Pledge creates uncertainty that could allow companies to be in technical compliance while avoiding transparency for their users.
Loopholes in Consent-Based Privacy
Many of the Pledge’s commitments provide exceptions when an edtech company is performing “authorized educational/school purposes” or when the company acquires parent/student consent. Consent from either the school or the parent/student thus controls an edtech company’s obligations with respect to collecting, maintaining, or sharing Student PII; building personal profiles of a student; and how long Student PII can be retained. Structuring key commitments this way may not adequately protect student privacy.
First, schools can determine whether an activity is an “authorized educational/school purpose,” which effectively provides consent on behalf of the parent/student. This bypassing of parent/student consent is particularly concerning for schools or school districts that overlook privacy concerns of parents/students, implicitly trust privacy policies, or lack the resources to properly train administrators and teachers on best practices for student privacy.
Second, student privacy that is contingent on parent/student consent has its own inherent shortcomings. Parents/students might consent to company conduct that compromises student privacy because of deceptive practices such as opt-out by default (as opposed to opting in to data collection and use) and other “dark patterns.” The FAQ itself states that “a parent or student may authorize a signatory to use student PII for non-educational purposes,” which is concerning given the risk of deceptive settings (see “What does the Pledge say about the limits on signatories using student PII?”). Furthermore, parents/students might lack a meaningful choice because there are barriers to opting out, such as when a school, district, or individual teachers heavily rely on an edtech company’s products/services and no real alternative exists—that is, parents/students are effectively pressured into consenting at the risk of a subpar education.
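As a purely hypothetical illustration of the opt-out-by-default pattern, consider the following sketch; the setting names and behavior are invented for this example and do not describe any actual signatory’s product.

```python
# Hypothetical sketch of an "opt-out by default" consent design.
# A privacy-protective design would make these defaults False and ask the
# parent/student to opt in before any non-essential collection begins.
DEFAULT_ACCOUNT_SETTINGS = {
    "share_usage_data_with_partners": True,   # on unless the user finds and disables it
    "personalized_recommendations": True,     # profile-building enabled by default
    "retain_activity_history": True,          # indefinite retention unless changed
}

def create_student_account(name, overrides=None):
    """Create an account; any setting not explicitly changed keeps the permissive default."""
    settings = dict(DEFAULT_ACCOUNT_SETTINGS)
    settings.update(overrides or {})
    return {"name": name, "settings": settings}

# Most users never visit the settings page, so "consent" is simply whatever the default is.
account = create_student_account("Jane Doe")
```

When the permissive choice is the default, the company can point to “consent” even though neither the parent nor the student ever made an affirmative decision.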
New Commitments Don’t Go Far Enough
FPF’s 2020 Pledge includes additional commitments that could actually enhance student privacy when compared to the Legacy Pledge, but the binding language does not go far enough.
First, the 2020 Pledge now requires School Service Providers to provide “resources” to educate schools, parents, and students on how to use their products/services in a way that promotes privacy and security. EFF strongly believes that proper training is critical to ensuring student privacy. But the language in this commitment is vague and weak because it fails to set a minimum standard of what “resources” must be provided. The FAQ does provide guidance through a non-exhaustive list of what “resources” FPF has in mind, many of which we believe should be part of a comprehensive approach to student privacy (see “What other information is available about providing resources to support users and/or account holders?”). But the FAQ is not the Pledge. And student privacy might not be adequately protected even if the 2020 Pledge is read alongside the FAQ—for example, if a product/service itself doesn’t have robust privacy settings, or if a company simply provides “manuals” to non-tech savvy users.
Second, the 2020 Pledge also now requires that companies “incorporate privacy and security when developing or improving” their products/services. This obligation is elaborated upon in the FAQ (see “What additional information is available about incorporating privacy and security into the design process?”), which says, for example, a company could comply by “applying privacy and security by design principles.” While EFF fully supports privacy-by-design, FPF’s approach here runs into the same problems as the new “resources” commitment. The FAQ is not the Pledge, and there is no minimum standard required for this obligation. A company could do the bare minimum and claim that it has satisfied its obligation—for example, by following the transparency or openness principles of privacy-by-design by having a privacy policy, but doing nothing else. This is a far cry from the spirit of privacy-by-design, which focuses on privacy at every level of the design process.
Protecting student privacy requires a robust, comprehensive program—generic commitments working in isolation are not sufficient.
The Student Privacy Pledge is Nothing Without Enforcement
Even if FPF’s 2020 Pledge were an airtight document that could be precisely applied to evaluate whether companies have kept their legally binding public promises, edtech companies would not be held accountable without enforcement. FPF itself apparently has not created an enforcement mechanism to regularly assess signatories’ compliance with the Pledge, and it remains to be seen whether the FTC and state attorneys general are willing to enforce it.
Despite hosting a workshop on student privacy in 2017, the FTC rarely brings enforcement actions focused on student privacy. In fact, since the start of 2018, the FTC has reviewed 66 consumer privacy cases, none of which were primarily aimed at addressing student privacy issues. Student privacy in relation to edtech companies should be a central focus, particularly in light of a year filled with remote learning and the resulting spike in student privacy concerns. With the FTC chairwoman expressing interest earlier this year in tackling student privacy issues, it’s time to put words into action.
EFF is not opposed to voluntary mechanisms like the Student Privacy Pledge to protect users—if they work. Schools, parents, and students must have confidence that a company whose products they use in classrooms—and that’s a signatory to the Pledge—is not only complying with the Pledge, but is in fact meaningfully protecting the privacy of students.