
Ensure AI serves institutions, not the other way around

We’d all prefer that AI tools helped us to do research and grade papers rather than take over campuses. Here’s how to develop AI tools for your institution responsibly

Craig J. Ramlal
24 Mar 2025
A cartoon image of a robot and a human arm wrestling. Image credit: iStock/Denis Novikov.

Created in partnership with the University of the West Indies


Imagine you’re juggling flaming torches (or maybe just grading at 2am). Enter artificial intelligence (AI), promising to carry half those torches – and let you catch up on some sleep. AI can transform teaching and research, but it also has the power to scorch everything in sight if it’s not developed and used responsibly.

Why does it matter? Because:

  • Data privacy breaches can harm students
  • Biased AI systems could unfairly limit access to opportunities
  • Over-reliance on AI can erode critical thinking

With great tech comes great responsibility

You wouldn’t invite a house guest who insults the family, so why bring in AI tools that clash with your institution’s mission? Developing AI tools begins with aligning them with your institution’s values. For example, if an institution values diversity, AI can facilitate holistic admissions processes that consider a wider range of student backgrounds, ensuring fairness in admissions and countering negative biases.

Transparency is also essential. No one likes hidden ingredients in their food, and people certainly don’t like “black box” AI tools that they can’t scrutinise. Educators, students and administrators should understand how an AI system processes data and why it makes certain recommendations. This clarity helps you spot bias and course-correct early. To make sure AI decisions stay above board, you can publish summaries of your AI’s logic, explain how it collects and uses data and involve an ethics committee in ongoing reviews. You might want to look at Unesco’s AI and Education: Guidance for policy-makers for recommendations on implementing transparency and explainability of these systems.

AI often sifts through vast amounts of student data, all of which need to be stored and managed securely, so privacy protection should also be front and centre. That means gathering only what you need, encrypting it and ensuring strict controls on who can access it and why.

Finally, none of this matters if no one knows how to use AI responsibly. Investing in AI literacy for faculty, students and staff can be a game changer. Simple workshops on how AI works, how algorithmic bias creeps in and how to engage with AI ethically can foster a campus culture that benefits from technology without being blindsided by it.

Putting AI to work: teaching, learning and everything in between

One of the most transformative ways that AI can improve education is by personalising the learning journey. As leaders, you can support faculty and departments in adopting adaptive learning platforms that tailor lessons in real time. This real-time responsiveness can boost engagement because students feel the content addresses their needs and offers advanced challenges for high achievers and targeted help for those who are struggling. This frees your staff to focus on the creative and relational sides of teaching, but do ensure the necessary resources, training and infrastructure are in place.

Additionally, AI can handle many administrative and instructional support tasks, such as answering “When is the registration deadline?” or “What time is class for course X?” queries that usually flood your faculty’s inbox. That way, teachers and teaching assistants can devote more energy to lively debate, brainstorming sessions and personalised feedback.

Beyond the classroom, AI can be seen as a strategic investment for your campus. Tools such as ChatGPT with deep research, Google’s Gemini with Deep Research, Perplexity and Consensus can automate much of the grunt work of literature reviews, generate summary reports, manage cross-disciplinary collaborations or map out emerging research trends. However, remind your faculty and research teams that AI-generated insights require human oversight to verify accuracy.

Looking to the future – with our humanity intact

Higher education is being reshaped by the promise of AI but as leaders, you bear the responsibility of guiding your institution towards ethical practices that protect student privacy and preserve human connections. Stay curious about emerging AI tools, demand clarity from commercial developers, maintain high standards for transparency, and build a campus culture that understands AI’s strengths and limitations.

AI’s true power lies in complementing – not overshadowing – the insight, empathy and creativity that only people bring. Let AI handle the mundane tasks. Let your educators focus on shaping minds and nurturing new ideas. By striking this balance, you ensure that higher education remains a place where technology is in service, not the other way around, and where innovation and integrity go hand in hand.

Craig J. Ramlal is head of control systems in the department of electrical and computer engineering at the University of the West Indies, St. Augustine.

If you would like advice and insight from academics and university staff delivered direct to your inbox each week, sign up for the THE Campus newsletter.
