Zoe Amar joined us at our latest Higher Education Chair of Audit Committee Roundtable where the topic of discussion was Artificial Intelligence. The topics covered were:
AI v Machine Learning
AI has been around since the 1950s, but its development has accelerated dramatically over the last 18 months, particularly with the advent of ChatGPT and the rapid rise of generative AI.
Essentially, AI is technology that allows computers and machines to imitate human intelligence and problem-solving capabilities. Many of us will know that this technology is all around us every single day. For example, if you lock your phone and then use your face to unlock it, that is a form of AI. If you have applied to remortgage or for a loan, it is likely that some form of AI will have been involved in the algorithm that calculates whether you are eligible. So it’s out there and we’re using it, which is why it’s so important to think about how we can use it more strategically within our organisations.
Machine learning is a process where mathematical models of data help a computer learn and recognise patterns. For example, Amazon saves your product choices, learns from your previous purchasing behaviour, and offers suggestions based on your buying habits and those of shoppers like you.
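To make the idea concrete, the "shoppers like you" pattern above can be sketched in a few lines of Python. This is a deliberately simplified illustration with made-up shoppers and products, not how Amazon's systems actually work – real recommendation engines use vast datasets and far more sophisticated statistical models.

```python
from collections import Counter

# Hypothetical purchase histories (illustrative only).
purchases = {
    "alice": {"kettle", "toaster", "teapot"},
    "bob": {"kettle", "toaster", "blender"},
    "carol": {"kettle", "teapot", "mugs"},
}

def suggest(user, purchases):
    """Suggest items bought by shoppers whose history overlaps with this user's."""
    mine = purchases[user]
    scores = Counter()
    for other, items in purchases.items():
        if other == user or not (mine & items):
            continue  # skip the user themselves and shoppers with nothing in common
        for item in items - mine:
            scores[item] += 1  # count how many similar shoppers bought it
    # Rank candidate items by how many similar shoppers bought them
    return [item for item, _ in scores.most_common()]

print(suggest("alice", purchases))
```

The "learning" here is nothing more than counting patterns in past behaviour: the more shoppers with similar baskets who bought an item, the higher it ranks as a suggestion.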
Recent research has shown the impact that AI could have on the job market; anything that is more commoditised within our roles could, in fact, be automated. That doesn’t mean people’s jobs are being taken away; it may just mean that our jobs look different. One of the trends is that it may free up staff time to enable them to focus on the things that really matter. In the university space, if AI were more involved in assessments, staff might then be able to spend more time with students explaining what their results mean, where they can improve, and how they can progress even further. So there are massive implications for productivity, and real potential to save working time as well.
In terms of AI regulation, the approach that has been suggested is to devolve it to individual sector regulators. The regulation of each sector will therefore depend on that regulator’s AI skills and its capacity to resource the work. Some regulators are skilled and well-resourced around digital, but others are less so, meaning we could see inconsistent approaches to regulation.
Where AI is starting to have an impact on higher education
Technologist Dr Stephen Thaler[i] wanted his AI, DABUS, to be recognised as the inventor of a food container, but that appeal was rejected by the Supreme Court. The case raised questions about who is responsible for what. How does the human in the loop best work with the technology? What roles do they have, and who gets credit for what? If we think about the way AI is altering how some students put work together, create content or write presentations, who gets the credit? And what does that then mean for IP, and for the income at stake from ownership of that IP? Whilst the appeal was rejected at the Supreme Court, the questions it raised have not gone away.
The Russell Group’s five principles of generative AI guidelines[ii] are helpful to consider. What can universities use themselves and what might we adapt? The practical application of this is so important. How can universities support students and staff to become AI literate? It’s important to have specific examples of how AI is being used and the skills that people are developing as a result, even if they don’t initially realise what those skills are. You must get those examples of using AI out in the open – students will be using it in some way, and academics might be using it as well. So it is about understanding specifically what they are doing, what they have learned from it, and how their skills might be changing as a result.
One of the recommendations in the guidelines is that universities need to talk to students, schools and other universities to obtain a holistic understanding of how everyone is beginning to use these tools and also what the emerging best practice might be.
As these tools are progressively adopted, one of the fears is the loss of community. For many, university brings back fond memories, and there’s potential for AI to make things a lot more transactional, losing the human side of things. One area for university audit committees to think about is how to adopt these tools in line with our values, ensuring we’re still giving people a really great experience when they come to university. A new student, for example, who wants to find out very transactional pieces of information, such as what time the student union opens or whether there is a shop on campus, can be supported by AI. But if a student is struggling with being away from home, or finding their course difficult and needs help, that’s the kind of area where AI is currently not a substitute for a human. Hopefully, where AI can free up people’s time, staff can then spend that additional time with that young person and give them the support they need.
There is also a worry that we are going to see a divide emerging between tech-literate schools and students and those who are digitally excluded. During lockdown there was a lot of discussion about the impact of digital exclusion: children who were at home with Wi-Fi and devices could get on with home schooling, but those without such access couldn’t participate in home learning. That inequality of access to technology persists, so many will start university with very different levels of experience with technology. How do you overcome that?
What Audit Chairs should be considering
Audit committees need to be proactive and ask questions about what is currently happening within their institution. What’s your understanding of the risks involved? Don’t be afraid to put the brakes on until you get adequate answers. It feels like there’s a lot of pressure to be the first to use these tools for fear of getting left behind, but if you don’t feel you’re getting the answers at your audit committee, and you don’t feel comfortable with the way the university is using these tools, you should feel free to ask for more information, or to bring in an expert to support.
Students’ expectations have changed, including around speed of response and more personalised recommendations. Many sectors, such as financial services and retail, have invested heavily in AI, so students may have very sophisticated expectations of where AI may be involved at university. In terms of scenario planning, how is it going to change the role of your university, and the shape of its workforce as well?
How are you going to give staff and students time to try out new ideas with AI? How can you create an inclusive approach for those who, because of their circumstances, might not have had the same access to technology as other staff and students? How do we bring them up to speed? Do we need to give them more support and guidance as part of the onboarding process?
How is this impacting governance? Do you need to adapt your policies? What’s the knock-on impact on things such as data policies and IT policies – do you need to consider all those things together? What processes are in place if a new AI tool arrives on the scene tomorrow and you need to get up to speed with it quickly? You might not be able to wait three months until the next board meeting, so have a process in place to deal with any new risks or opportunities quickly.
How do we upskill boards on AI? Do they have the skills and experience to provide the right strategy, scrutiny and support to make really informed decisions about how this technology gets adopted? Skilling up your board is key. This might be the right time to carry out a skills audit, looking at the AI, tech and data skills on your audit committee as well as the board and executive.
[i] https://www.bbc.co.uk/news/technology-67772177
[ii] https://russellgroup.ac.uk/news/new-principles-on-use-of-ai-in-education/
Our next Higher Education Chair of Audit Committee roundtable will be taking place at 9am on Wednesday 19 June 2024, where we will be joined by Henry Hughes, Chief Technology Officer at JISC. Contact Louise Hughes: lhughes@hwfisher.co.uk to reserve your place.
We’d love to hear from you. To book an appointment or to find out more about our services: