Monday, December 15, 2025

AI adoption in education is a must

By Thapelo Molefe

Artificial Intelligence (AI) can revolutionise higher education by improving teaching, learning, assessments and student inclusion, but only if harnessed responsibly, according to education experts and academics.

The Eduvos conference on transforming education through AI, held in Midrand, Gauteng, brought together local and international academics to debate how universities could adapt to rapid technological change.

Discussions centred on innovation, ethics and impact. 

One of the concerns was the impact of AI on teachers’ jobs.

“AI will never take anybody’s job. But people who know how to use AI will take the jobs of people who don’t,” said Professor Pius Owolawi, head of the department of computer systems engineering and assistant dean for industry liaison, special projects and work integrated learning at the Tshwane University of Technology.

He warned that curricula were lagging behind the pace of AI-driven change, creating a mismatch between graduate skills and industry needs.

Delegates highlighted several ways AI could strengthen teaching and learning. These included automating time-consuming tasks like grading, generating personalised feedback and enabling lecturers to tailor lesson plans for different learning styles. 

Tools such as Gradescope can already assess handwritten scripts and provide detailed analysis, while platforms like NotebookLM and Notion allow students to turn class notes into podcasts or mind maps, reinforcing knowledge in multiple formats.

Eduvos lecturer and full-time data science researcher Francis Macombe shared findings from his study on how AI could predict student performance in South Africa’s private higher education sector.

His research focused on identifying which students were at risk of failing and finding ways to support them early.

Macombe explained that private higher education in South Africa was largely funded by parents, with only limited bursaries and grants available. This created a need for institutions to actively support students so they could succeed academically, which in turn helped institutions maintain stability and throughput rates.

To address this, Macombe tested several AI algorithms, including random forests, support vector machines, extreme gradient boosting, linear regression, naive Bayes classifiers and artificial neural networks. 

He found that neural networks were the most effective in predicting whether a student would pass or fail. The study used data from 3,000 students, including assignment marks, test and exam scores, and even information about their parents’ education background.

The predictive model produced a score for each student. A score above 0.5 indicated that the student was likely to pass, while a score below 0.5 signalled a student at risk.

This allows institutions to flag students who need extra support, such as booster classes, additional tutoring or bursaries.
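The thresholding idea described above can be sketched in a few lines of code. This is a minimal illustration only: it trains a single-neuron (logistic) classifier on synthetic data rather than the neural network Macombe's study used, and the feature names and pass rule are invented stand-ins for the real assignment, test and exam inputs.

```python
import math
import random

random.seed(0)

# Synthetic data only: three features loosely mirroring the study's inputs
# (assignment mark, test mark, exam mark), each scaled to [0, 1].
def synth_student():
    a, t, e = (random.random() for _ in range(3))
    passed = 1 if (0.4 * a + 0.3 * t + 0.3 * e) > 0.5 else 0
    return [a, t, e], passed

data = [synth_student() for _ in range(300)]

# Single-neuron logistic model, trained with stochastic gradient descent.
w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.5
for _ in range(2000):
    for x, y in data:
        z = sum(wi * xi for wi, xi in zip(w, x)) + b
        p = 1 / (1 + math.exp(-z))
        g = p - y  # gradient of the log-loss with respect to z
        w = [wi - lr * g * xi for wi, xi in zip(w, x)]
        b -= lr * g

def risk_score(x):
    """Score in (0, 1); below 0.5 flags a student as at risk, as in the article."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 / (1 + math.exp(-z))

at_risk = [x for x, _ in data if risk_score(x) < 0.5]
print(f"{len(at_risk)} of {len(data)} students flagged for extra support")
```

In practice the score would drive interventions such as the booster classes, tutoring or bursaries mentioned above, rather than any automated decision.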

Macombe also highlighted specific factors that had a positive impact on the success of students.

For example, completing assignments and participating in group projects were strongly linked to passing. Financial support in the form of bursaries was also shown to significantly improve academic performance.

“These findings mean that private institutions can proactively support students, ensuring they submit assignments, participate in group work and receive financial support when needed,” Macombe said. 

“Predictive AI tools give us the opportunity to identify at-risk students early and intervene before failure occurs, ultimately improving both student outcomes and institutional success.”

A student-focused panel discussion broadened the conversation to how learners themselves were experiencing AI in the classroom. 

Owolawi said many students already used AI positively for instant feedback, personalised tutoring and bridging language gaps in technical subjects. 

But he warned of “over-reliance and passive learning” as well as growing inequalities in access to advanced AI models, with African students often disadvantaged.

A researcher at ICT Africa, Leslie Dwolatzky, argued that ethical concerns were not new.

“The students who would have cheated 10 years ago will still be the ones to use AI tools to cheat today. What has changed is the ease and likelihood of not getting caught.” 

He stressed that universities must show students the real benefits to reduce unethical use.

Responding to a question on responsible use, Macombe said students needed better preparation to navigate AI critically. 

“They must know that some content AI generates might not be accurate. They need to verify, research further and cite properly to use it responsibly,” he said.

Another concern was the effect on motivated learners who use AI tools to boost creativity and problem-solving: over-reliance could end up stifling the very original thinking those tools are meant to support.

The Academic Head of IT at Eduvos and one of the organisers, Amos Anele, said the event was designed to foster collaboration and bring together seasoned academics, researchers, industry practitioners and policymakers to explore how AI could drive innovation and be adopted ethically. 

He stressed that AI was not replacing educators but “supporting what we do to ensure students from diverse backgrounds are included, while also personalising learning and optimising institutional operations”.

Anele said Eduvos allowed students and academics to use AI tools, but with strict emphasis on responsible use. 

“We don’t want them to fully rely on it. Their complex problem-solving, decision-making and research skills are at risk if they do. That’s why we introduced modules on AI ethics and privacy to prepare them for the 4IR era,” he said.

He noted that while AI could ease content generation and streamline large projects, students must still apply critical thinking and defend their work.

“If you have a beautiful assignment and cannot explain it in an oral presentation, then something is wrong. No matter the guidelines, there will always be cases of over-reliance, and in such cases, consequences follow – often a zero for that work,” Anele said.

On governance, Stella Bvuma, head of department and director of the School of Consumer Intelligence and Information Systems at the University of Johannesburg, stressed that institutions needed clear frameworks to ensure AI was deployed ethically and transparently.

She warned against relying solely on automated tools for disciplinary decisions, arguing that punishment without human oversight risked unfair treatment. 

“AI must never replace the role of educators in understanding and addressing student behaviour. Support, guidance and empathy are essential in the classroom,” she said.

Bvuma urged universities to adopt policies covering data privacy, equity, accountability and risk management, while ensuring scalability so that AI systems remained sustainable in a rapidly changing tech landscape. 

She also called on students to engage actively with AI policies.

“Policies are not just for the older generation. Students must interrogate them, critique them and contribute because they shape your future directly.”

Taking the discussion further, Ngoma Matanga, Eduvos content writer and senior research associate at UJ, highlighted AI’s potential in guiding career choices.

He proposed a system where students could write narratives of their aspirations on a university website, and AI would analyse the curriculum to recommend tailored study paths.

Using natural language processing and advanced word embedding techniques, these tools could help learners align personal interests with institutional offerings. 

“It would be nice for a student to tell their story and the system points them to the most suitable course,” he said, adding that higher education institutions could adopt this approach to personalise education from the start of a student’s journey.
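The matching idea Matanga describes can be sketched with a toy text-similarity step. This is a hedged illustration, not his proposed system: the course names and blurbs are hypothetical, and a simple word-count cosine similarity stands in for the learned word embeddings a real implementation would use.

```python
import math
from collections import Counter

# Hypothetical course blurbs -- illustrative only, not any institution's catalogue.
courses = {
    "BSc IT (Data Science)": "data statistics machine learning python analysis",
    "BCom (Marketing)": "brands consumers campaigns advertising media",
    "BSc IT (Software Engineering)": "software code systems design programming",
}

def vectorise(text):
    """Bag-of-words vector: a crude stand-in for a word embedding."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(narrative):
    """Return the course whose blurb best matches the student's story."""
    v = vectorise(narrative)
    return max(courses, key=lambda c: cosine(v, vectorise(courses[c])))

story = "I love python and statistics and want to do machine learning"
print(recommend(story))
```

A production system would replace the word counts with proper embeddings and compare the narrative against the full curriculum, but the alignment step is the same: score each offering against the student's story and surface the closest fit.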

However, ethical concerns and access barriers remain.

Delegates emphasised fairness and transparency in the use of AI, particularly in admission processes where reliance on historical data could perpetuate bias. 

They also stressed the need for equitable access to AI tools, noting that many South African students still struggled with digital literacy and infrastructure gaps.

Panellists also warned against assuming that all students have laptops or smartphones.

“Policies must be inclusive, accommodating those who learn visually, those who prefer audio, or those who may only be able to access material offline,” said one participant.

Community engagement was identified as another critical strategy. Several speakers urged universities to help bridge the digital divide by extending training and awareness campaigns to rural schools. 

“It doesn’t have to be a major project with grants. Something as small as opening up your laptop in your community and showing others how AI works can make a difference,” said another panellist.

Despite the challenges and concerns, the adoption of AI was crucial to prepare students for the 21st century world of work, said Owolawi.

INSIDE EDUCATION
