AI in education: What are the risks and challenges?
7 min read
9ine : Nov 7, 2024 1:37:37 PM
The increasing use of AI in schools requires leadership and oversight to ensure that the benefits and opportunities of AI are realised, whilst avoiding the risks and harms. But who should do this in schools? At what level, and in what department? Should it be an AI Officer? This article takes a closer look at the requirements for leadership and oversight in schools and how this is being approached.
This article is part of a series of insights from 9ine exploring the impact of Artificial Intelligence (AI) in education. From the opportunities it can bring, to the potential risks and challenges it creates where not implemented appropriately, this series will bring you up to speed and provide practical tips to remain compliant and ethical when using AI in education.
In our previous article, we looked at the impact of AI on privacy, data protection and ethics in education, looking at how the use of AI in education exacerbates existing ethical issues as well as creating new ones. In this article, we look at the leadership and oversight that schools need when it comes to governing AI, including the question of whether an AI Officer is required.
The purpose of leadership within a school is to monitor and guide the work of team members and teachers, ensuring that it meets the school’s established goals and standards. It includes setting objectives, providing clear instructions and regularly reviewing the school’s progress against them.
The use of AI in schools is relatively new, which means the need to provide leadership and oversight of it is also new to schools. Schools already have experience of integrating, leading on and providing oversight of traditional Information and Communication Technologies (ICT), such as computers and the world wide web. But with technology leaders like Bill Gates describing AI as the most important technical advance in decades, the leadership and oversight approaches used for traditional ICT are not necessarily fit for purpose.
Everyone has a role to play in the responsible use of AI: all stakeholders within a school are part of the AI ecosystem and lifecycle, whether because their data is used by an AI system, because they are affected by its decisions, or because they have responsibility for how AI is used within the school. When it comes to leadership and oversight, the key questions for schools are: Who should have responsibility, accountability and oversight of AI use in our school? Where should they sit in terms of seniority, hierarchy and department? What responsibilities do they have? Should this be one person or several?
Despite AI’s broad range of applications, responsibility for its successful integration has largely been placed on IT departments. This is mainly because AI is seen as an ICT tool, because the AI tools a school uses are often provided by EdTech vendors (so IT can manage the engagement with those vendors), and because IT departments are equipped to manage the technical aspects of AI systems, including security.
But leaving the leadership and oversight of AI solely to the IT department overlooks much of the value that good AI leadership and oversight can deliver, and exposes the school to risk in areas where the responsible use of AI needs input from other departments.
Having IT lead on AI leadership and governance will not necessarily address the strategic deployment of AI systems to benefit the school as a whole, and the offering it provides to current and prospective students. Used responsibly and appropriately, AI has the potential to transform how schools operate and to make them pioneers. A school may strategically want to be seen as a place where AI is a key part of the school’s day-to-day, or to gain a reputation for specialist teachers in AI, with AI as a core part of the curriculum. Keeping the leadership and oversight of AI within a single department may limit this strategic oversight and forward thinking.
Some schools have instead placed responsibility for AI leadership and oversight with whoever is responsible for data protection and privacy. Because AI raises ethical issues and often involves the processing of personal data, it makes some sense to give the individual(s) responsible for this area the responsibility for leadership and oversight of AI too. However, depending on the level at which that person (or those people) operates, this can still miss the strategic view. Additionally, the issues and risks which AI raises are broader than privacy concerns, and the policies and procedures required to govern AI go beyond its use of personal data, involving technical, ethical, legal and pedagogical elements. This would substantially expand the role of the privacy and data protection lead in your school.
To resolve this question of who, what and how when it comes to AI leadership and oversight, a new role has emerged: the ‘AI Officer’. The role of an AI Officer is to provide accountability, leadership and oversight of AI, ensuring that its use is safe, ethical, unbiased and non-discriminatory.
The introduction of the AI Officer concept has been likened to the mandatory Data Protection Officer role that some countries require to provide oversight of privacy and data protection. Some countries simply require certain governance controls to be in place for AI, being less prescriptive about which roles are responsible for them, whereas others are introducing mandatory requirements in certain areas; for example, earlier in 2024 the White House required US federal agencies to designate AI Officers. Various pieces of proposed legislation also refer to the need to designate an AI Officer.
Designating an AI Officer can overcome this fragmented leadership and oversight by giving someone overall accountability for, and oversight of, the school’s use of AI. Even where it is not yet legally required, it is something schools may want to consider. An AI Officer can provide:
Our view is that the role of an AI Officer is not restricted to a single person. The decisions that the ‘role’ needs to make span different skill sets: the responsibilities cover pedagogy, safeguarding, privacy and cyber security expertise. The role also requires subjective analysis of each of these considerations when determining how AI should be deployed and then how it will be managed. Creating a team that has oversight of all the responsibilities of the AI Officer is likely the best first step. The second step is creating a policy for the evaluation and use of AI in school, and the third is identifying the training that all staff require to support their understanding of AI and the implications of the school’s AI policy.
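To make this team-based approach more concrete, the sketch below shows one way a school could keep a simple register of the AI tools it uses and the staff member accountable for each of the areas the AI Officer role spans. This is a minimal, hypothetical example in Python: the tool names, field names and role titles are illustrative assumptions, not a prescribed 9ine format or any specific school’s setup.

```python
# Illustrative sketch only: a simple register of AI tools and the named owner
# for each oversight area the AI Officer role spans. All names are hypothetical.
from dataclasses import dataclass, field


@dataclass
class AIToolRecord:
    """One AI system in use at the school and who oversees each concern."""
    name: str                      # e.g. an adaptive learning platform
    vendor: str
    processes_student_data: bool
    owners: dict = field(default_factory=dict)  # oversight area -> staff member


def unassigned_areas(record: AIToolRecord, required_areas: list[str]) -> list[str]:
    """Return the oversight areas that still have no named owner."""
    return [area for area in required_areas if not record.owners.get(area)]


if __name__ == "__main__":
    REQUIRED = ["pedagogy", "safeguarding", "privacy", "cyber security"]

    tool = AIToolRecord(
        name="Example adaptive learning platform",  # hypothetical tool
        vendor="Example EdTech vendor",
        processes_student_data=True,
        owners={"privacy": "DPO", "cyber security": "IT Director"},
    )

    gaps = unassigned_areas(tool, REQUIRED)
    if gaps:
        print(f"'{tool.name}' has no named owner for: {', '.join(gaps)}")
```

Run as a whole-school check, a register like this would quickly show where responsibility for an AI tool has defaulted to IT alone and where pedagogical or safeguarding input is still missing.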
Whether schools decide to designate an AI Officer or to allocate roles and responsibilities for AI across the school, they need to ensure that leadership and oversight of AI is in place. They need to:
At 9ine we offer a number of products and services that can help schools with the challenges that AI presents. Specific solutions to support schools with their leadership and oversight of AI include:
In our next article, we will take a practical look at helping schools find and categorise their AI systems.
9ine equips schools to stay safe, secure and compliant. We give schools access to all the expertise they need to meet their technology, cyber, data privacy, governance, risk and compliance needs, all in one simple-to-use platform. For additional information, please visit www.9ine.com or follow us on LinkedIn @9ine.