
How to create a higher education AI policy

A successful university AI policy guides internal innovation and usage, directs resources and identifies key contacts for emergent needs. Here are the steps and considerations for writing guidelines

Eric Scott Sembrat
13 Feb 2025

Created in partnership with Georgia Tech's Center for 21st Century Universities


Creating a university artificial intelligence policy is essential for streamlining operations and anticipating future innovations. This policy goes beyond formalising procedures; it educates on AI capabilities and limitations, centres tools around legal compliance and ethical considerations, and adapts to future technologies. 

The primary purposes of an institutional AI policy are to: 

  1. unite data governance, security and leadership teams towards a common goal 
  2. regulate and put in place procedures for data ingestion into AI services 
  3. audit and ensure safe, ethical and effective AI use in higher education. 

Your AI policy will not be a boilerplate reflection of what an AI policy should be – it will reflect your institutional culture of integrating disruptive technology. A successful policy won’t do the heavy lifting itself – most of your processes and procedures (such as software procurement, data sharing and privacy) probably already have owners and well-defined workflows. Successful policy implementation will ensure those processes remain effective against emergent disruptors, such as the recent advances of DeepSeek.

Step-by-step guide to creating a university AI policy

Here, the process of developing an AI policy for a higher education institution is broken down into steps – from conception to implementation – taken by pivotal members of your organisation. 

Identify policy leaders

Confirm which offices, titles, departments or informal working groups will lead writing or implementing AI policy within your university. This process should reflect how change happens in your institution, whether through a department, leadership or collaborative cohort.

Survey the institution’s internal and external landscape 

The aim is to position the AI policy at the intersection of the organisation’s current and aspirational states. Start by surveying institutional stakeholders to understand usage and perspectives on AI. These groups include: 

  • Students: Students can be early adopters of new technology. Engage with libraries, teaching units and academic innovators to understand student usage. Academic innovators – the people who introduce and implement new ideas, methods or technologies to improve teaching, learning and research – are sprinkled across your academic units and can provide valuable insights. 
  • Research units: Source research-angled considerations and use cases from faculty and staff. These units often house subject matter experts who can translate marketing and sales concepts into practical realities. 
  • Data governance: Collaborate with IT for insights on cybersecurity, data governance and data-sharing practices. IT departments can direct you to key figures in these areas and help define data usage practices. 
  • Legal and ethics: Work with legal departments, ethics bodies and compliance officials to align with federal, state and organisational guidelines. These experts can bridge existing policies with broader legal requirements and point out niche institutional practices. 
  • Academic units: Gather insights from diverse disciplines to understand use cases and risks. Academic units may have their own perspectives on leveraging AI, reflecting their specialised ecosystems of trust, exploration and risks. 

Evaluate the external landscape focusing on three pillars: 

  • Authoritative: Examine federal, state and regents’ positions on AI. Identify leading entities and upcoming laws or policies that could impact AI usage. 
  • Applications and services: Connect with enterprise leads to understand AI integration in foundational tools. Many essential tools that institutions use are developed or hosted by external enterprises, making it crucial to understand their AI strategies. 
  • Peers: Engage with academic peers and regional universities for a wide range of insights, whether they are ahead of or behind your institution in AI adoption. 

Build an AI advisory coalition

Form an advisory coalition of AI adopters, primary stakeholders and leaders to drive the AI policy, using your survey findings to select its members. The coalition will be crucial in identifying institutional and stakeholder needs. 

Define what the policy is intended to achieve 

Your AI coalition will help you define success for an AI policy. Success can align with one of three pillars: 

  • Experimentation and pilots: Encourage disruptive growth and exploration. Policies might focus on rapid experimentation to explore the unknown. 
  • Centralisation and equity: Build stable, careful steps in evaluation and usage. Policies might aim for equitable access and centralised control to ensure consistent and fair use of AI. 
  • Retraction and cautious adaptation: Encourage a deliberate, methodical approach to integrating AI. Some policies might focus on thorough evaluation and risk management before implementation to minimise disruptions. 

These perspectives may differ across individual use cases. The approach to a research grant might differ from interactions with a learning management system vendor. The goal is to create a policy that balances innovation with safety and ethical considerations, ensuring that AI is used effectively and responsibly across the institution. Engaging your AI advisory coalition can yield powerful guidance on direction. 

Generating and updating a policy 

Your AI policy should reflect your institution’s culture and voice, integrating disruptive technology into practices already in place. It won’t replace processes such as software procurement or data sharing but will enhance them, ensuring their effectiveness and pointing to existing policies.

A successful AI policy guides internal innovation and usage, directing resources and identifying key contacts for emergent needs. It partners with your institutional culture to define appropriate AI usage, helping your organisation manage today’s AI tools and prepare for future advancements. 

Eric Sembrat is director of digital learning technologies in the Center for 21st Century Universities at Georgia Tech. 

If you would like advice and insight from academics and university staff delivered direct to your inbox each week, sign up for the Campus newsletter.
