AI in Education: AI Literacy. What do schools need to know?

Written by 9ine | Feb 25, 2025 1:05:07 PM

The EU AI Act’s first provisions came into effect this month, including the requirement for schools to ensure an appropriate level of AI literacy among their staff. Schools outside the EU should also be paying attention to AI literacy, so in this article we look at what AI literacy is and what schools globally need to do to ensure it.

February 2025 marks an important month in the EU AI Act calendar, with the first measures of this risk-based, overarching regulation on the development and use of Artificial Intelligence (AI) in the European Union taking effect. These include the obligation for providers and deployers of AI Systems to ensure a sufficient level of AI literacy among their staff and other persons dealing with the operation and use of AI Systems on their behalf. But what does this mean in practice? And why should schools outside the European Union pay attention?

What is AI Literacy?

AI Literacy means ensuring that your staff have the appropriate skills, understanding, technical knowledge, experience, education and training on Artificial Intelligence, and on the AI systems which they develop and operate. This is required so that the greatest benefits from AI Systems can be obtained, whilst protecting the fundamental rights and health and safety of individuals. It also supports these individuals in making informed decisions about the deployment of AI Systems, providing the necessary human oversight required for AI. 

Under the EU AI Act, ‘providers’ are those who develop an AI System and ‘place it on the market’ or ‘put it into service’ in the European Union, whether for payment or free of charge. ‘Deployers’ are those who use an AI System, other than in a personal, non-professional capacity. Schools will generally be considered deployers, as they predominantly obtain their AI Systems from third-party EdTech vendors rather than developing them in-house. Some actions can, however, make a school a provider, for example modifying an AI System in a way that means it becomes (or remains) a ‘high-risk’ AI System.

Regardless of whether they are considered providers or deployers, schools have a responsibility to make sure that all of their staff have the appropriate level of AI literacy. As part of the due diligence in their Vendor Management Process, schools will also need to check that the providers of the AI Systems they use have ensured an appropriate level of AI literacy among their own staff.

Why is AI Literacy important?

AI literacy equips individuals with the expertise to make informed decisions about AI Systems. The level of expertise required will vary depending on an individual’s role in relation to the use of AI at your school, but all staff will be required to have some level of AI literacy. Individuals with greater responsibility for the use of AI Systems at your school will require deeper knowledge and a higher level of AI literacy. All staff will need to know how to ensure that AI Systems are used in compliance with legal and ethical requirements, be able to answer basic questions from students, pupils, parents and guardians, and have enough understanding to report any ethical or legal concerns they have about the use of AI at your school. Other individuals may need to know how an AI System works technically, how it should be used, how to interpret its output appropriately, and how decisions taken with the assistance of AI will affect individuals. Those responsible for implementing an AI System’s instructions for use and for its human oversight will need a level of AI literacy adequate to perform that role.

AI literacy helps individuals understand the opportunities that AI can create for the school, as well as its limitations, risks and challenges, and the safeguards, rights and obligations that apply to the use of AI Systems. This knowledge empowers individuals to support the school in identifying opportunities for AI in education and to balance these against the risks and challenges, ensuring that the use of AI supports the school’s strategy and educational goals. It also helps the school build trust in its use of AI, reassuring students, parents and guardians of the school’s proficiency and safety in its use of Artificial Intelligence. If a staff member is asked how the school is using AI, it reflects poorly on the school if they cannot explain what is in use, how it is being used, and who to contact within the school with further questions. AI literacy also equips individuals with the skills they need to question the school’s use of AI, empowering all staff to be the school’s eyes and ears on the ethical and compliant use of AI and supporting the school in ensuring this.

Whilst no direct fines or other sanctions apply for violating the AI literacy requirements under the EU AI Act, from 2 August 2025 providers and deployers of AI Systems may face civil liability for non-compliance with the Act. This means that if the use of AI Systems by staff who have not been adequately trained causes harm, the school could be liable to pay damages to those impacted. A high school in the United States has already been sued by parents claiming that their child was unfairly punished for using generative Artificial Intelligence on an assignment. The student used an AI tool to prepare an outline and conduct research for a project; when the teacher found out, the student was given detention, received a lower grade, and was excluded from the National Honor Society. The school did not have an AI policy in place, and the parents also claimed that the school needed to provide AI training to its staff because they did not have the appropriate level of AI literacy. This case shows the importance of AI literacy and what schools could be liable for if they don’t provide it.

In addition to the risk of having to pay damages, regulators are likely to criticise non-compliance with the AI literacy requirements in any inquiries and investigations, particularly because the risk of non-compliance can be greatly reduced by having informed and capable staff. Where a school is found to be non-compliant with the EU AI Act in any way, evidence of the level of AI literacy at the school, as well as of any training it has provided, will be taken into account when a regulator decides what action to take against the school.

This all shows that whether it is used to realise opportunities or to avoid risks, harms, costs and regulatory action, AI literacy is important.

Our school isn’t in the EU, so why should we care?

Even where the EU AI Act doesn’t currently apply to a school, the EU’s legislation tends to have a global ripple effect (known as the ‘Brussels Effect’), as demonstrated by the General Data Protection Regulation’s impact on privacy regulation worldwide. It is therefore likely that other countries will replicate the Act’s requirements, at least in part if not in full, over time. This ripple effect can already be seen in South Korea’s AI Basic Act, introduced in December 2024, which shows some influence from the EU AI Act and includes the importance of providing training on Artificial Intelligence to professionals.

Beyond legal requirements, given the importance of AI literacy in helping individuals realise the opportunities of AI whilst avoiding or minimising the risks, schools should treat AI literacy as a strategic priority that will empower their staff and support the school in achieving its educational goals. It can also help the school avoid non-compliance with other legal requirements it is subject to, such as those for safeguarding, data protection and privacy.

How do we ensure AI Literacy?

Schools have significant flexibility in devising the content and format of their AI training for staff, but they must provide at least some. It will be difficult to counter allegations of non-compliance if a school has implemented no AI training or learning resources at all; it is far easier to defend against regulators or civil claimants who argue that the training provided was inadequate. Providing base-level training is therefore better than doing nothing.

A useful first step in approaching AI literacy is to check whether the school has previously provided any relevant training or produced any other relevant resources. If so, the school should document these and retain evidence of which staff have received them. If this analysis is not feasible, or if it identifies gaps, the school should move quickly to implement AI literacy measures. A layered approach to training is advised, as not all staff or departments will require the same level of AI literacy. Schools should offer basic training to all employees and roll out more sophisticated or role-specific training as required, in a phased approach.

In deciding what training will be required, a useful starting point is UNESCO’s AI Competency Framework for Teachers, which defines the knowledge, skills and values teachers must master in the age of AI. It outlines fifteen competencies across five dimensions: Human-centered mindset; Ethics of AI; AI foundations and applications; AI pedagogy; and AI for professional learning. These competencies are categorised into three competency levels: acquire, deepen, and create. Topics include human agency, ethical principles, and safe and responsible use. Schools can use this framework to decide who in the school needs to know what in order to ensure the appropriate level of AI literacy across the school, and then develop the relevant training for each of these competencies.

The European Commission has also created a repository of AI literacy practices, which aims to share examples of ongoing practices and encourage learning and exchange among providers and deployers of AI Systems. The repository shows that organisations are using e-learning platforms, layered and role-based approaches, personalised learning journeys, practical examples and case studies, knowledge hubs, and both technical and non-technical training to improve the AI literacy of their staff. They use these to train individuals on the risks of Artificial Intelligence relating to confidentiality, security, data privacy, intellectual property, fundamental rights, ethics and bias, and to provide examples of good practice in these areas.

Given the time, cost and expertise it takes to develop these courses and to maintain and update them as the technology and regulatory landscape evolves, schools should consider the efficiencies and quality to be gained from using a third party’s AI literacy resources. Outsourcing the work of researching the relevant AI requirements, designing courses and modules, and reviewing and updating them can bring significant advantages to a school, particularly as the deadline for compliance with this requirement in the EU has already passed. Schools can then complement these resources with materials specific to their own context, to ensure the appropriate level of AI literacy across the school.

How can 9ine help? 

At 9ine, we have developed 9ine Academy LMS. This is an on-demand training and certification platform which enables schools to enrol individual staff members or entire groups in comprehensive training courses, modules, and assessments, featuring in-built quizzes for knowledge checks. 

Our AI Pathway is your school's learning partner for AI ethics and governance. With over 20 differentiated course levels, you can enrol all staff in an Introductory course to AI and then enrol those staff with greater responsibility in Intermediate and Advanced courses. There are also specialist courses for AI in Safeguarding, Child Protection and Technology.

Schools can also subscribe to learning pathways in AI, Privacy, Cyber, Tech Operations and Risk Management. Alternatively, schools can purchase courses on a per person and a per course basis. We are currently offering free trials for up to three members of a school’s leadership team, so contact us if you would like to take advantage of this, or have any questions on Academy LMS.  

Other products and services that 9ine offer, which can help schools in meeting their Artificial Intelligence compliance requirements include: 

  • Application Library: A solution that gives all staff access to a central, searchable library of all EdTech in the school. The library contains everything staff need to know about any AI in use, along with privacy, safeguarding and cyber risks. With easy-to-add ‘How to’ and ‘Help’ guides, Application Library becomes a single, central digital resource. By implementing Application Library your school will identify duplication in EdTech, reduce contract subscription costs and have a workflow for staff to follow when requesting new EdTech.
  • Vendor Management: Removes the pain, and the time, involved in evaluating and vetting third-party vendor contracts, privacy notices, information security policies and other compliance documents. Vendor Management provides a thorough, ‘traffic light’ based approach to inform you of vendor privacy, cyber, AI and safeguarding risks, and supports you in demonstrating to parents, staff and regulators how you effectively evaluate and manage the technology you choose to deploy.

9ine company overview

9ine equips schools to stay safe, secure and compliant. We give schools access to all the expertise they need to meet their technology, cyber, data privacy, governance, risk & compliance needs - in one simple-to-use platform. For additional information, please visit www.9ine.com or follow us on LinkedIn @9ine.