University challenge
6th December 2024
Dr Matt Bawn explores how to ensure AI adds value to STEM higher education, equipping students with the skills needed in the AI-powered workforce of the future
The arrival of artificial intelligence (AI) technologies, particularly generative AI (GenAI), is already changing higher education (HE), with both students and educators using AI platforms and tools for day-to-day tasks. But as AI becomes an ever more powerful and pervasive part of education, research and the workplace, how will higher education need to change in response?
When GenAI platforms such as ChatGPT first became publicly available a few years ago, many universities responded with measures focused on protecting the integrity of assessment or avoiding academic malpractice. The prevalence of essay-based assessment in life science education was seen as particularly vulnerable to AI's ability to generate plausible text effortlessly.
More recently, universities have started to create policies and best practices for staff and students so that these transformative technologies can be used to enhance the student experience, and to ensure assessment prepares learners for employment in an AI-driven world.
Visions of the future
What this future world will look like remains to be seen, as industry and employers are only just starting to forge their own visions of how they will use GenAI in their day-to-day business. But experts have started to explore general trends.
A report in 2023 by think tank Demos on how universities can prepare ‘the AI generation’ suggests that broad skills, rather than deep knowledge, will be key in the workplaces of the future. While AI and automation may take over many entry-level and administrative positions, potentially narrowing opportunities for graduates, there will be a demand for people skilled in analytics and communication who show adaptability and resilience in the face of change. The authors stress the need for education to continually evolve in light of technological advancements, with employability strategies kept under review, and more GRASP (general relational, analytic, social and personal) skills taught alongside academic learning.
Microsoft’s ‘Future of Work’ report suggests that as AI is applied to more tasks, human work is shifting to the “critical integration” of AI output, requiring specialist expertise and judgement. The report paints a picture of AI systems and humans working together as collaborative team members, with each party challenging the assumptions and arguments of the other.
At the RSB’s annual Accreditation Conference in April, an afternoon was set aside to discuss AI and try to gain a better understanding of the main concerns and opportunities. Many highlighted the need for a unified approach and sector-wide guidance, especially with regard to safeguarding the integrity of degree awards and classifications. It was clear that both individuals and their departments are already developing innovative ways to bring GenAI into seminars and lecture theatres, finding it can quickly and easily add value to teaching – for example, by generating lecture summaries, making lecture transcripts more accessible, and suggesting further reading and formative questions.
GenAI has also been used to encourage discussion and critical thinking – for example, by asking students to analyse and assess the output generated by AI models following a series of prompts related to how the biological sciences can help address global challenges.
The use of such tools in the assessment or evaluation of students’ work remains controversial, however. Guidance and oversight are needed to ensure that the same standards of ethical and equitable usage expected of students are also applied to educators.
Revolution or evolution?
This is not the first time that new technologies have been seen as portents of a re-evaluation of the relevance and value of basic subject knowledge – the advent of the internet and Google search was heralded by many as such. What is crucially different about GenAI, however, is how quickly and easily it can be used to generate large amounts of plausible content without any experience or understanding.
Another consideration is that the GenAI era is coming at a time when HE is already facing other changes. The funding landscape, online learning and changing student expectations are challenging the perceived value of a university education.
As AI becomes increasingly incorporated into HE, we run the risk that higher education will become a dehumanised and devalued learning experience – for example, where educators use GenAI to assess students’ work, also generated by GenAI, to evaluate knowledge of content created by GenAI. It is therefore important to acknowledge these challenges and design curricula and assessments that add value for students and employers.
Developing competency with GenAI and using it effectively requires critically assessing and evaluating its outputs, and this can only be done with parallel competencies in the subject content being taught – meaning there is still value in subject-specific knowledge. Future curricula must have the power to consolidate broad skills into deep translational knowledge, build in key transferable skills such as critical thinking and technological literacy, and ensure assessment measures the true capabilities of students.
AI and active learning
A recent conference funded by the Heads of University Bioscience titled Intelligent Assessment in the Age of AI looked at how building GenAI into learning and assessment was being used to increase ‘active learning’ around areas such as conservation biology and nutrition, and to enhance students’ understanding of academic writing and communication. We also saw how GenAI tools enabled students and educators to work together to design questions that test their knowledge and the criteria used to grade their work. This collaborative process helped students better understand what is expected of them, engage more deeply with the content and feel more involved in their assessments.
Our recent special issue of The Biologist, and the recent Nobel Prize in Chemistry for AlphaFold, have highlighted the immense potential of AI in the biosciences, particularly in areas such as drug discovery, genomics and personalised medicine. HE institutions must, therefore, equip graduates with the skills needed to understand and ethically contribute to these advancements. This means fostering a deep understanding not only of AI’s capabilities, but also of its limitations and the ethical considerations around its use. We must ensure that future bioscience graduates not only utilise AI effectively, but also uphold standards of transparency, fairness and accountability in this rapidly evolving technological world.
Dr Matt Bawn is a senior lecturer in bacterial genomics at Newcastle University and a member of the RSB’s Early Career Lecturers in Biosciences (ECLBio) group.