Protecting the Privacy of Student Data in the Digital Age


Experts recommend policies to improve the privacy of student data in educational technology platforms.

The rise of online learning sparked by the COVID-19 pandemic created a seller’s market for education technology companies. Long before that boom, however, a cultural shift in education had already encouraged the industry’s digitization. Today, many education officials are scrutinizing the collection and use of student data as they seek to interpret, understand, and comply with privacy regulations designed to protect sensitive student information.

Under the Family Educational Rights and Privacy Act (FERPA) of 1974, educational institutions are prohibited from sharing students’ “personally identifiable information” without parental consent. FERPA applies to all education providers receiving federal funds and is designed to protect students’ paper and digital academic records. In 1978, federal lawmakers extended the protections afforded under FERPA by enacting the Protection of Pupil Rights Amendment (PPRA), which grants parents and students the right to opt out of federally funded surveys or assessments that concern certain protected subjects.

The regimes established by FERPA and PPRA are administered by the U.S. Department of Education and principally govern the obligations of schools; their rules do not apply to education technology, or “ed-tech,” companies. The Children’s Online Privacy Protection Act (COPPA), enacted in 1998 and enforced by the Federal Trade Commission (FTC), however, prohibits operators of children’s online services, commercial websites, and apps from collecting and disclosing data on children under the age of 13 without parental consent.

As the use of educational technology has continued to skyrocket in recent years, industry leaders have looked beyond the three major federal student privacy laws and turned to self-regulation as a means of protecting student data. In 2014, the Software and Information Industry Association and the Future of Privacy Forum developed the Student Privacy Pledge, an industry pledge through which ed-tech companies make public commitments detailing their student data privacy practices for the sake of accountability.

Although signing the Pledge is voluntary, the FTC may rely on companies’ public commitments to bring civil enforcement actions against any of the Pledge’s more than 400 signatories that fail to protect student data. Many commentators note, however, that no such actions have yet taken place. Some advocates call for strengthening traditional student privacy protections to meet the growing digital education landscape.

In this week’s Saturday Seminar, scholars explain gaps in student data privacy regulations and offer methods to better protect student information and data.

  • Current debates over student privacy overlook the increasingly popular online learning platforms that deliver learning experiences directly to users, suggest Elana Zeide of the University of Nebraska Law School and Helen Nissenbaum of Cornell Tech. In an article published in Theory and Research in Education, Zeide and Nissenbaum highlight how two types of platforms – massive open online courses (MOOCs) and virtual learning environments (VLEs) – fall outside the scope of student privacy regulations because they collect personally identifiable information directly from learners without institutional mediation. The authors argue that MOOC and VLE operators should go beyond complying with commercial data-use regulations and uphold education-specific student privacy standards.
  • In an article published in the Duke Law & Technology Review, Alexi Pfeffer-Gillett of the University of Maryland Carey School of Law maintains that educational software companies do not comply with the Student Privacy Pledge. After analyzing the privacy policies of eight companies that signed the Pledge, Pfeffer-Gillett explains that seven violate at least one of the Pledge’s core promises. Apple, for example, collects personally identifiable information and engages in behavioral targeting of advertisements. Pfeffer-Gillett also notes that companies that have not signed the Pledge are not necessarily less compliant with its standards. Instead, he suggests that “the Pledge may be more useful as a public relations tool than as a means of actually making… industry improvements.”
  • De-identification of student data alone cannot adequately protect student privacy, Elad Yacobson of the Weizmann Institute of Science and several co-authors argue in an article published in the Journal of Learning Analytics. Using machine learning algorithms to analyze and cluster unlabeled data sets, Yacobson’s team was able to re-identify personal information from anonymized student interaction data. The team could even identify when a select group of gifted students went on a school trip. Noting that there is no “silver bullet” for educational data privacy, Yacobson and his co-authors contend that privacy-protecting technology must be accompanied by clear regulation and increased awareness among educators.
  • In a research paper in Research and Practice in Technology Enhanced Learning, Tore Hoel of Oslo Metropolitan University and Weiqin Chen of Augusta University examine data sharing through an educational lens. Hoel and Chen suggest three principles that educational data privacy policies should take into account. First, privacy and data protection should be achieved by negotiating data sharing with individual students. Second, educational institutions should be transparent about their decisions to access data, and such access should meet a standard of necessity. Finally, schools and universities should use data-sharing negotiations as an opportunity to increase data literacy.
  • In a paper in the Virginia Journal of Law and Technology, N. Cameron Russell of the Fordham Center on Law and Information Policy and several co-authors identify a legal and regulatory gap in the sale of student information: existing privacy laws do not cover the sale of student information by data brokers. Russell’s team advocates for transparency in the commercial market for student data. They argue that brokers should be required to follow procedures that promote data accuracy, such as an obligation to notify downstream data users of inaccuracies. They also favor providing opt-out rights for parents and emancipated students and recommend that, before administering surveys, schools educate students and families on how the results will be used commercially.
  • In a chapter of the Cambridge Handbook of Consumer Privacy, Elana Zeide of the University of Nebraska Law School argues that traditional student privacy regulations are insufficient in “an age of big data.” Zeide recommends best practices for education technology companies to cultivate trust among stakeholders, such as providing sufficient transparency and accountability. She also suggests maintaining the traditional expectation that students’ personally identifiable information will remain within schools rather than being sold to for-profit companies.
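The re-identification risk that Yacobson and his co-authors describe can be illustrated with a toy linkage attack. The sketch below uses entirely invented data and is not the team’s actual method: it shows how a pseudonymized activity log, cross-referenced with a single piece of public side information (a known trip date), can single out the students who were on the trip even though no names appear in the data.

```python
from datetime import date

# Invented "anonymized" logs: pseudonymous IDs mapped to the days each
# student was active on a learning platform. No names are stored.
logs = {
    "u01": {date(2024, 5, 6), date(2024, 5, 7), date(2024, 5, 8)},
    "u02": {date(2024, 5, 6), date(2024, 5, 7), date(2024, 5, 8)},
    "u03": {date(2024, 5, 6), date(2024, 5, 8)},  # inactive on May 7
    "u04": {date(2024, 5, 6), date(2024, 5, 8)},  # inactive on May 7
}

# Public side information: a particular class was on a school trip May 7.
trip_day = date(2024, 5, 7)

# Linkage attack: pseudonyms active around the trip day but silent on it
# are likely members of that class -- the pseudonym no longer hides them.
suspected = sorted(
    uid for uid, days in logs.items()
    if trip_day not in days and days
)
print(suspected)  # → ['u03', 'u04']
```

The point of the sketch is that removing identifiers is not enough: behavioral patterns in the data, combined with outside knowledge, can restore identity, which is why the authors argue de-identification must be paired with regulation and educator awareness.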

The Saturday Seminar is a weekly feature that aims to put into written form the kind of content that would be conveyed in a live seminar involving regulatory experts. Each week, The Regulatory Review publishes a brief overview of a selected regulatory topic and then distills recent research and academic writing on that topic.

