In the next decade, artificial intelligence (AI) is poised to have a transformative effect across business and society – and nations are clamouring to be the first to reap the benefits.
The UK’s recently published National AI Strategy proposes to position the country as a global AI superpower by 2030 by growing the AI ecosystem, transitioning to an AI-enabled economy and getting AI governance right to protect citizens while enabling innovation.
Aligned to the strategy, the chancellor, Rishi Sunak, recently announced a new £34 million package to fund 2,000 scholarships for AI and data science master’s conversion courses for disadvantaged students. Also announced was funding to support an additional five fellowships for world-leading researchers through the Alan Turing Institute.
All this is welcome news. However, to ensure that investment in AI research and education delivers on its potential to transform society and benefit economies, industry-academia collaboration is essential. And, having had a career spanning both academia and industry (Siemens, Medicsight and Huawei), I have lived experience of some of the problems with the UK’s approach to such collaboration.
One of the key challenges is the limited understanding industry may have of how to engage universities. Most universities provide a “menu” of collaboration possibilities, spanning placements, graduate hires, joint MSc projects, consultancy and funded projects. An example of the latter is a knowledge transfer partnership, which funds what we might call a knowledge transfer associate to work with a company on a challenge it is facing, supervised on a part-time basis by an academic. Programmes such as these not only enable knowledge transfer from academia to business but also inspire academics to work on practical problems.
But even if this menu of possibilities were better promoted to companies, both industry and academia may also need to adopt novel, more flexible ways of working together if the National AI Strategy is to be successfully implemented.
First, as applications of AI have expanded, a diverse range of stakeholders and disciplines must be brought together to produce practical working deployments that are not just technically sound but also meet ethical and legal requirements. To this end, universities should encourage multidisciplinary working, such as coupling mathematicians and computer scientists with experts in the humanities and social sciences to produce trustworthy, autonomous systems that are fair and responsible and preserve privacy. Several recent big breakthroughs have happened at the interface of AI and a different discipline, such as the success of Google’s AlphaFold protein folding approach, which uses AI to solve a long-standing problem in biology.
Universities may also need to break down barriers for industry. Beyond access to academics and research, some companies could benefit from access to university resources, such as working space, libraries, high-performance computing or public lecture series. Currently, neither universities nor companies are always set up for this.
Another issue is AI brain drain from academia to industry. Large technology companies often have computational and human resources that enable AI research at a scale that is difficult or impossible for university academics to achieve. This may entice academics to leave universities, leaving an AI knowledge gap in academia.
We must make efforts to keep AI experts at academic institutions, because universities are an important component of the innovation ecosystem. A preferred model may be joint industry-academia appointments, allowing academics to work part time at a university and part time at a company. This can enhance the company’s work by leveraging the latest academic research, while benefiting the university by bringing real-world experience into the classroom, better preparing students for AI careers. Several luminaries in the field, such as deep-learning pioneers Geoffrey Hinton and Yann LeCun, already hold such affiliations.
AI is rapidly evolving and is therefore a moving target. Hence, we as universities should continually review our offer to ensure we’re developing the next generation of AI and data scientists with the skills that society and industry need, not just in the areas in which we have existing expertise or interests. For example, at Queen Mary, we’ve recently been awarded funding by the Biotechnology and Biological Sciences Research Council (BBSRC) for a new industry partnership aimed at bridging a known skills gap for researchers who possess both deep computational skills and an understanding of key biological concepts. Through this, we’ll be developing specific modules that will train students in both areas, to produce “AI native” biological scientists.
There’s also more that industry and universities could do in terms of collaborative teaching. The two can co-design lectures, modules and entire courses, not just to better equip current students for life outside education but also, via massive open online courses, to inspire the next generation or, via continuing professional development courses, to help company employees stay abreast of the latest AI developments.
Although the challenges are manifold, there has never been a more important time for industry and academia to work together to address the AI skills gap and fuel world-class research innovation.
Greg Slabaugh is professor of computer vision and AI at Queen Mary University of London, where he directs the Digital Environment Research Institute.