AI in education: What are the risks and challenges?


Whilst there are many opportunities to be realised from using AI in education, its use also has the potential to create new, and exacerbate existing, risks and challenges for schools, including risks to ethics, child safety and protection, and the education system as a whole.

This article is part of a series of insights from 9ine exploring the impact of Artificial Intelligence (AI) in education. From the opportunities it can bring, to the potential risks and challenges it creates where not implemented appropriately, this series will bring you up to speed and provide practical tips to remain compliant and ethical when using AI in education. 

In the previous article, we explored what AI is, what opportunities it creates for education and how it is currently being used. In this article, we take a deeper look at the risks and challenges that schools will face when implementing AI.

What are the risks and challenges of using AI in Education? 

Although AI can create a number of opportunities for educational institutions, its use in education also creates a range of risks and challenges, from risks to individuals to the integrity of the education system as a whole. To realise the opportunities of AI in education, schools will need to be mindful of these risks when considering whether it is appropriate to use AI, and how.

Privacy and Data Protection 

From the lack of transparency about how personal data is processed using AI, known as the ‘black box’ problem, to the limited rights that individuals can consequently exercise over their data, the use of AI in education creates and increases a number of risks to the privacy of students and teachers. Because AI relies on vast amounts of data to create effective outputs, data minimisation (a key requirement of data protection and privacy laws globally) is often overlooked. This can result in unauthorised processing of personal data, where data collected for one purpose is repurposed for another in the context of AI. These issues expose schools to non-compliance and threaten the privacy of educators and students.

Ethics 

Privacy is not the only area of ethics that the use of AI in education threatens. The lack of transparency in AI can also result in unjustified actions taken on the basis of an AI system’s output, the rationale for which may not be understood or explainable. Decisions made by AI in education can also lead to, or perpetuate, societal discrimination, because an AI system’s design and functionality often reflect the values of its designer and the data chosen as input. The use of AI can also impact the autonomy of individuals, particularly in the case of personalisation, where information may be filtered before it is presented to the individual, thereby reducing their exposure to, and experience of, the world.

Cybersecurity 

The rapid adoption of AI introduces complex cybersecurity risks that traditional practices cannot always sufficiently address, and its use in schools, where funding, resources and expertise may already be stretched, is no exception. Increasing use of AI expands a school’s attack surface, increasing the potential entry points and vulnerabilities that an attacker can use to compromise a system or network, and lowers the barrier for attackers to create and inject malware into a school’s systems. AI can also enable more sophisticated phishing attacks, by profiling individuals and automating impersonation, and can be used to carry out Distributed Denial of Service (DDoS) attacks.

The Education System 

In addition to the risks to individuals, the use of AI also creates risks to the functioning and integrity of the education system as a whole. The increasing use and availability of AI systems in schools can lead to an overreliance on AI and a loss of critical thinking skills. This overreliance may stem from well-intentioned aims, such as taking advantage of the efficiencies that AI can create, but it can lead to more harmful outcomes, such as plagiarism and academic dishonesty where students pass off AI-generated content as their own. Overreliance reduces students’ ability to truly learn, undermining the education system as a whole.

Child Safety and Protection 

AI can also create and increase a number of risks to child safety, a key responsibility schools have in protecting children from abuse, neglect and harm. These risks range from age-inappropriate conversations that may take place between a child and an AI, to decreased interaction with humans and an increasing addiction to, and dependency on, AI. The Council of International Schools has also raised concerns about the risk of AI being used to create ‘deep fakes’: images, videos, audio files or GIFs manipulated by a computer to use someone’s face, voice or body without their consent. Deep fakes have already been shown to cause distress and harm to teachers and students.

So what can we do?

These examples highlight some of the risks that using AI in education can create and exacerbate, all of which are challenges that schools will need to overcome and implement safeguards against. Other challenges include knowing when a task is appropriate for AI and how to remain compliant with regulations on AI. Over the next ten articles, 9ine will take you through more of these topics and questions, and explain how you can practically put safeguards in place. At 9ine we offer a number of products and services that can help schools with the challenges that AI presents. Specific solutions to support schools with their leadership and oversight of AI include:

  • 9ine Academy LMS: Our AI Pathway is your school's learning partner for AI ethics and governance. With differentiated course levels, you can enrol staff in an introductory course on AI, then enrol those with greater responsibility in Intermediate and Advanced courses. There are also specialist courses for AI in safeguarding, child protection and technology.
  • Application Library: A solution that enables all staff to access a central, searchable library of all EdTech in the school. The library contains all the information staff need to know about the AI in use (if any), privacy risks, safeguarding risks and cyber risks. With easy-to-add ‘How to’ and ‘Help’ guides, Application Library becomes a single, central digital resource. By implementing Application Library, your school will identify duplication in EdTech, reduce contract subscription costs and have a workflow for staff to follow when requesting new EdTech.
  • Vendor Management: Removes the pain, and the time, of evaluating and vetting third-party vendor contracts, privacy notices, information security policies and other compliance documents. Vendor Management provides a thorough, ‘traffic light’ based approach to inform you of vendor privacy, cyber, AI and safeguarding risks, and supports you to demonstrate to parents, staff and regulators how you effectively evaluate and manage the technology you choose to deploy.
  • Privacy Academy: A six-month programme of monthly training and professional development, delivered virtually and in person, to manage privacy law, AI, cybersecurity and safeguarding risks of harm at your school. Following a project-based methodology, we train and coach you on implementing a Privacy Management Programme that considers AI, cyber and safeguarding risks of harm.

In our next article, in celebration of Cybersecurity Awareness Month, we go into further detail about the cybersecurity challenges and risks that AI can create, and how to safeguard against them.

9ine company overview

9ine equips schools to stay safe, secure and compliant. We give schools access to all the expertise they need to meet their technology, cyber, data privacy, governance, risk and compliance needs, all in one simple-to-use platform. For additional information, please visit www.9ine.com or follow us on LinkedIn @9ine.
