My notes on the Global Education Forum's roundtable about Ethical Implications of AI (2): Privacy
Justo Hidalgo
Chief AI Officer at Adigital. Highly interested in Responsible AI and Behavioral Psychology. PhD in Computer Science. Book author, working on my fourth one!
This is a continuation of the previous post on the reflections on AI and Education we shared during the Global Education Forum's roundtable on Ethics and AI.
In the first post I wrote about bias and algorithmic fairness. In this one, the main topic is privacy.
When thinking about education, the concept of privacy takes on a heightened level of significance. It relates both to traditional academic data and to the corners of students' personal lives, beliefs, and preferences. An anecdote I always use: when my colleagues and I founded Quantified Reading, we needed to explain to customers how careful they (we) needed to be with the information we collected from readers, as what we read tells A LOT about who we are.
Hence, the responsibility to safeguard such sensitive information becomes a pivotal duty of every educational institution.
Addressing privacy in education requires a comprehensive and proactive approach, and key to this is the principle of data minimization. This concept relates to respecting the trust placed in educational institutions by students. By collecting only the data essential for educational purposes, we dramatically reduce the risk of data breaches and misuse. Excessive data collection not only jeopardizes student privacy but also increases the liability for institutions due to regulations such as GDPR in the EU.
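Data minimization is easy to state and easy to skip in practice. A minimal sketch of what it can look like in code, assuming a hypothetical ingestion step where each declared purpose has an explicit allow-list of fields (the field and purpose names below are illustrative, not from any real system):

```python
# Data minimization sketch: keep only the fields a declared educational
# purpose actually requires, dropping everything else at ingestion time.
# Purposes and field names are hypothetical examples.

ALLOWED_FIELDS = {
    "grading": {"student_id", "course_id", "grade"},
    "attendance": {"student_id", "course_id", "date_attended"},
}

def minimize(record: dict, purpose: str) -> dict:
    """Return a copy of the record containing only the fields
    the declared purpose is allowed to use."""
    allowed = ALLOWED_FIELDS[purpose]
    return {k: v for k, v in record.items() if k in allowed}

raw = {
    "student_id": "s-102",
    "course_id": "ai-ethics",
    "grade": "A",
    "home_address": "...",  # never needed for grading
    "religion": "...",      # sensitive; never collected for grading
}

minimal = minimize(raw, "grading")
# minimal keeps only student_id, course_id and grade
```

The point of the allow-list (rather than a block-list) is that any new field is excluded by default: collecting it requires a deliberate decision tied to a purpose.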
Moreover, securing the data we must hold is critical. Utilizing state-of-the-art encryption technologies and ensuring that data storage—whether on local servers or cloud-based systems—meets the highest standards of security is non-negotiable. This secure approach guards against unauthorized access and potential data breaches, maintaining the integrity of student information throughout its lifecycle. This may seem obvious but... it isn't.
And it's related to access controls, another cornerstone of a robust privacy strategy. These controls ensure that only authorized personnel can access sensitive data, safeguarding against both external breaches and internal misuse. Implementing role-based access controls, coupled with multi-factor authentication and regular audits of access logs, provides a layered security approach that is both dynamic and resilient.
The concept of 'Privacy by Design' is also critical. Yes, I know we are not in 2018 anymore! But we cannot forget that this approach, which mandates that privacy considerations should be woven into the fabric of all AI systems used in education from the outset, is MANDATORY. Integrating privacy into the design process ensures that it remains a core focus throughout the development and deployment of educational technologies, rather than an afterthought or a box-ticking exercise in compliance.
Compliance with privacy-related regulations such as the previously mentioned General Data Protection Regulation (GDPR) in Europe or the Family Educational Rights and Privacy Act (FERPA) in the United States provides a structured framework for managing student data. However, compliance should be seen as the minimum standard rather than the ultimate goal. The spirit of these regulations is to foster a culture of privacy that respects and protects the individual rights of students.
Compliance must be seen as the minimum standard on which we build privacy, not the ultimate goal.
When it comes to handling data that can be used for analytics or machine learning, techniques like anonymization and pseudonymization can play significant roles. These techniques help obscure the identities of individuals, allowing for the beneficial uses of data in educational tools and AI applications while protecting students' identities. For instance, this is one of the topics we analyze at Adigital's Certification for Transparency and Explainability of AI systems.

Transparency about data practices is fundamental to maintaining trust. And while every company should think deeply about this, it is clear that educational institutions should be the first to clearly communicate how they collect, use, store, and share student data. This transparency, coupled with obtaining informed consent from students or their guardians, ensures that all parties are aware of and agree to the ways in which their data is handled.
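One common pseudonymization technique (a sketch, not the specific method any certification mandates) is a keyed hash: the same student always maps to the same pseudonym, so analytics and joins across datasets still work, but the mapping cannot be reversed without the secret key. The key below is an illustrative placeholder; in practice it would live in a secrets manager, never alongside the data.

```python
import hashlib
import hmac

# Pseudonymization sketch using HMAC-SHA256. Deterministic (same input,
# same pseudonym) but not reversible without the secret key.
SECRET_KEY = b"replace-with-a-managed-secret"  # placeholder, not a real key

def pseudonymize(student_id: str) -> str:
    """Map a real identifier to a stable, opaque pseudonym."""
    digest = hmac.new(SECRET_KEY, student_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

p1 = pseudonymize("student-42")
p2 = pseudonymize("student-42")
assert p1 == p2              # stable: records can still be linked
assert p1 != "student-42"    # the real identifier never leaves the system
```

Worth remembering the regulatory distinction: under GDPR, pseudonymized data is still personal data (the key holder can re-identify it), whereas truly anonymized data is not. Techniques like this reduce risk; they do not remove legal obligations.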
Which takes me to the need for continuous privacy impact assessments, essential to identifying and mitigating risks in a timely manner. These assessments help institutions understand the implications of new technologies and practices, ensuring that privacy measures evolve as needed to address emerging threats and vulnerabilities.
Much of what I just mentioned is mandatory. But technology evolves faster than regulation can react. That's why educating all stakeholders—students, educators, and administrators—about the importance of privacy is crucial. Training programs that highlight responsible data handling and the recognition of potential threats empower individuals to act as both beneficiaries and guardians of their own data.
Just to finish, I believe having a robust incident response and breach notification plan in place is indispensable. Should a data breach occur, having predefined procedures for containment, investigation, and communication ensures that the institution can respond effectively, minimizing harm and restoring trust. Again, this is mandatory under GDPR and other laws, but institutions should implement them by design.
Not an easy call, right? The good news is that privacy is probably the ethical principle that has evolved the most in the last 10 years. The bad news is that the concept keeps evolving quite rapidly from social, technological, and cultural standpoints, so educational institutions, educators, and students alike should always remain vigilant.