Skip to the main content.

5 min read

AI in education: AI and cyber security

AI in education: AI and cyber security
AI in education: AI and cyber security
10:09

Cybersecurity is a key consideration for schools, given the large amounts of children’s data they hold (which makes them a key target for cyber attackers). But how does AI affect cybersecurity for schools?

This article is part of a series of insights from 9ine exploring the impact of Artificial Intelligence (AI) in education. From the opportunities it can bring, to the potential risks and challenges it creates where not implemented appropriately, this series will bring you up to speed and provide practical tips to remain compliant and ethical when using AI in education. 

In our previous articles, we explored the opportunities that AI creates in education but also the risks and challenges it presents. To celebrate cybersecurity Awareness Month, this article focuses on the impact on cybersecurity of AI in education. Cybersecurity is not a new concept for schools, but with the increasing use of IT and because they are a prime target for cybercrime, it is important that schools educate themselves on the threats which AI poses to cybersecurity.

So what makes schools a target? 

Schools are lucrative and vulnerable targets for cyber attackers (who seek to gain access to personal data and/or sensitive information) for a number of reasons:  

  • Their IT infrastructures: Due to limited resources and their familiarity to users, many schools rely on older IT infrastructures which are often outdated and lack modern security features, making them easier to exploit than modern applications.
  • Large amounts of sensitive data: Schools process contact details, health and medical records, financial information, attendance, behaviour, grades etc. These types and volumes of data would make any organisation a prime target, but schools are even more attractive to attackers, who often value children’s data more highly than that of adults
  • More likely to pay a ransom: Due to the nature of the data they hold, their reputation and their responsibilities, schools have been found more likely to pay a ransom to an attacker than other sectors. In 2022 SOPHOS found that education was the sector that had the highest rates of ransom payment, with nearly half (47%) of lower educational organisations surveyed paying the ransom when attacked.
  • Large attack surface: Due to the vast number of systems and web-facing assets that schools use, schools have a large ‘attack surface’. This means that there are multiple points where an unauthorised person could gain access to personal or sensitive data.
  • Lack of resources and expertise: Many schools have a lack of resources and expertise for cybersecurity due to budget constraints. Even schools with dedicated staff and systems can find themselves overextended or not skilled enough to address the vulnerabilities they have.
  • Unaware staff and students: A general lack of training and awareness amongst educators and students about cybersecurity makes them more vulnerable to cyber attacks. In England, a 2024 survey found that 1 in 3 teachers had not had cybersecurity training that year and only 66% said it was useful, making them less likely to be able to spot a potential threat and handle it accordingly. 

Can AI help schools with cybersecurity? 

The good news is that AI brings a lot of benefits to the field and can help schools with cybersecurity by:

  • Improving cybersecurity skills: AI can help assess and improve a security professional’s skills, knowledge and competencies by creating highly personalised learning paths including realistic and engaging security training. This makes it much easier and cheaper for schools to upskill individuals to help protect against potential cybersecurity threats.  
  • Process large amounts of data: AI can constantly monitor and quickly analyse vast amounts of data to detect suspicious behaviour patterns indicative of a cyber threat (for example, analysing a school’s email traffic to identify phishing attempts). This could take hours, if not weeks using traditional security processes which are subject to human error and rely heavily on manual analysis. 
  • Protect against ‘zero-day’ attacks: Whilst traditional methods of cyber attack prevention are reasonably effective against ‘known’ cybersecurity threats, AI has the potential to protect against ‘zero-day’ threats where cyberattacks exploit a previously unknown vulnerability. Biometric recognition, facial recognition, and fingerprint scanners are already being used to detect attempted fraudulent logins. 
  • Automate processes: AI can automate repetitive or monotonous security-related tasks, creating efficiencies and also reducing human error. AI can be especially helpful to schools where information and data is stored and moved across clouds, databases, apps and internal systems, which can be hard for internal teams to monitor and assess risks on. 
  • Behavioural analytics: Where schools do use older IT infrastructures and legacy systems using rule-based approaches to cybersecurity, which are easier to exploit, AI can still help by using behavioural analytics to determine how a school’s systems and users should behave to detect deviations from the norm and unusual behaviour far more quickly. 

How are attackers using AI to challenge cybersecurity in education?

Whilst there are a number of ways in which AI can support schools with their defence against cybersecurity threats, as with most technologies, for every good purpose AI can also be used for a nefarious one and cyber criminal organisations have already invested in AI to launch large-scale, targeted cyber attacks against schools. In 2023, 347 cyber events were reported in the education and childcare sector to the UK Information Commissioner’s Office - an increase of 55% in 2022. With attacks on the rise here are some of the ways that cyber attackers are using AI:  

  • More sophisticated attacks: Generative AI (GenAI) (a type of AI which can create text, code and other types of content) can be used by attackers to generate more convincing phishing messages which aim to trick victims into sharing personal information through a fake email, text message or phone call. GenAI can also be used by attackers to generate new types of malware and ransomware, and in the same way schools can automate the scanning for vulnerabilities so can attackers, who can then deploy this malware automatically. The increase in sophistication of the attacks makes it harder for schools to protect themselves against them.
  • Vendor risks: Schools already rely heavily on third party EdTech vendors, and many AI systems also rely on third-party software or hardware components. This makes schools susceptible to supply-chain attacks if these components are compromised, making schools dependent on the security measures of their providers. 
  • New types of attacks: Prompt-injection and data-poisoning are both new types of attacks utilising AI. Prompt-injection attacks are a key weakness in AI models, where an attacker creates an input designed to make the AI behave in an unintended way e,g, generate offensive content, reveal confidential information etc. Data-poisoning attacks occur when an attacker tampers with the data that an AI model is trained on to produce undesirable outcomes (both in terms of security and bias). Both of these types of attacks could result in real harm for students and teachers where students may be exposed to something they shouldn’t be or a decision is made by an AI model using inaccurate data.

So what can schools do about it?

Schools need to be focused on cybersecurity, and the impact of AI on this, both the positives and the negatives to protect their network from cybersecurity threats. Schools need to make sure they have the correct expertise and resources in place to protect themselves and build funding narratives and cybersecurity budgets for these if not. 

In addition to Cyber Security Testing, Cloud Security Assessments and Penetration Testing at 9ine we offer a number of products and services that can help schools with the challenges that AI presents for cybersecurity, specific solutions include: 

  • Security & Systems Assessment: A service to equip schools with everything they need to proactively manage and mitigate security risks    
  • Security & Systems Essentials: A service that empowers technology teams to proactively identify, manage, and mitigate risks related to security, configuration and system performance
  • Vendor Management: Removes the pain, and time, from evaluating and vetting third party vendor contracts, privacy notices, information security policies and other compliance documents. Vendor Management provides a thorough, ‘traffic light’ based approach to inform you of vendor privacy, cyber, AI, and safeguarding risks. Vendor Management supports you to demonstrate to parents, staff and regulators how you effectively evaluate and manage technology you choose to deploy.
  • Privacy Academy:  A virtual, in-person 6 month monthly training and professional development to manage privacy law, AI, cybersecurity and safeguarding risks of harm at your school. Following a project based methodology we train and coach you on implementing a Privacy Management Programme, considering AI, cyber and safeguarding risks of harm.
  • Tech Academy: Training and professional development to enhance school IT teams' skills in security and operations.

In our next article, we will take a look at the approaches regulators are taking to AI globally and recap our recent Webinar: ‘The Impact of AI Regulations on Schools – Exploring the EU AI Act and Global Trends for Responsible Use of AI in Education’. You can watch a recording here

9ine company overview

9ine equips schools to stay safe, secure and compliant. We give schools access to all the expertise they need to meet their technology, cyber, data privacy, governance, risk & compliance needs - in one simple to use platform. For additional information, please visit www.9ine.com or follow us on LinkedIn @9ine.

Cyber crime in schools: Key threats and how to mitigate risk

Cyber crime in schools: Key threats and how to mitigate risk

In this blog, we outline the most common cyber threats facing the education sector and explore key questions like who commits these crimes, what is...

Read More
Improving Cybersecurity In Independent UK Schools

Improving Cybersecurity In Independent UK Schools

Cybersecurity can be daunting at the best of times, but it doesn’t have to be. When we speak of the term ‘cybersecurity’, we are actually speaking of...

Read More
AI in education: Finding your AI systems and categorising them for risk

AI in education: Finding your AI systems and categorising them for risk

The use of AI is being increasingly regulated (particularly in the education sector), to counter the risks and challenges that the use of AI in...

Read More