The “triple helix” model of innovation – describing the idealised links between academia, industry and government to encourage economic development – is a great way of thinking about how a world-leading research university can play its full role in society.
It is a model that is recognised in the Times Higher Education World University Rankings methodology. In 2010, we added industry income (defined as research income from industry per member of staff) as a new pillar and metric. The measure is converted from local currency into US dollars using purchasing power parity and is then z-scored.
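To make that concrete, here is a minimal sketch of what such a normalisation could look like, with entirely hypothetical income figures and PPP conversion factors (THE's actual pipeline is more involved and is not reproduced here):

```python
# A minimal sketch of the normalisation described above: hypothetical
# data and PPP factors only, not THE's actual figures or code.
from statistics import mean, stdev

# Industry income per member of staff, in local currency (invented).
income_local = {"uni_a": 1_200_000, "uni_b": 950_000, "uni_c": 40_000_000}

# Purchasing-power-parity conversion factors: local currency units per
# US dollar (illustrative values only).
ppp_factor = {"uni_a": 0.7, "uni_b": 1.0, "uni_c": 110.0}

# Step 1: convert each figure into PPP-adjusted US dollars.
income_usd = {u: v / ppp_factor[u] for u, v in income_local.items()}

# Step 2: z-score across the cohort so institutions are comparable:
# (value - cohort mean) / cohort standard deviation.
mu = mean(income_usd.values())
sigma = stdev(income_usd.values())
z_scores = {u: (v - mu) / sigma for u, v in income_usd.items()}

print(z_scores)
```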
The goal of the metric is to recognise the importance of the relationship between university and industry interests; if a university cannot deliver high-quality research, then industry will not pay for it.
The degree to which universities see this as important varies significantly and is sometimes linked to geography. Universities in East Asia, especially South Korea, China and Japan, frequently express a desire for more emphasis on knowledge transfer and industry links. Sometimes it is an explicit objective of the university.
So why is this metric one of the smallest parts of the ranking, accounting for only 2.5 per cent of an institution’s overall score?
One reason is that this is one of the data points that cause us the most concern about data quality. What counts as industry funding, rather than government funding, might seem obvious, but in practice it frequently isn’t. Different political systems distribute money to universities in very different ways, sometimes through intermediaries, and it can be difficult to distinguish such funding from genuine industry funding.
Second, even when the funding source is clear, the point at which institutions should account for this income can be subjective. Should they record it at the start of the project, when the income is actually received, when the expenditure is made, or evenly across the lifespan of the project? These accounting decisions can make a big difference to the metric score for an individual institution on a year-by-year basis.
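A toy example, using invented figures for a single three-year industry contract, shows how much this timing choice can matter:

```python
# An illustration (hypothetical figures) of how the timing of income
# recognition changes the year-by-year metric for one institution.
grant_total = 3_000_000          # a three-year industry contract, in USD
project_years = [2021, 2022, 2023]

# Option A: recognise the full award in the year the contract is signed.
upfront = {y: grant_total if y == project_years[0] else 0
           for y in project_years}

# Option B: spread the income evenly across the project's lifespan.
amortised = {y: grant_total / len(project_years) for y in project_years}

for year in project_years:
    print(year,
          f"upfront={upfront[year]:>12,.0f}",
          f"amortised={amortised[year]:>12,.0f}")

# Reported industry income for 2021 is three times higher under
# Option A, even though the underlying contract is identical.
```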
We have also been pondering whether there are other metrics that we could include to enhance our measurement of knowledge transfer – either to replace the industry income indicator or to allow us to increase the weighting of the pillar in the ranking.
In 2018, we experimented with gathering additional data on income from consultancy, number of spin-offs, and income from patents or intellectual property.
Unfortunately, the data were not consistent enough for us to include in the World University Rankings; institutions interpreted the definitions in different ways and the figures themselves varied hugely in scale. But an indirect outcome of the experiment was that we started thinking about a much broader measure of university impact, and from that came our University Impact Rankings, based on the United Nations’ Sustainable Development Goals (SDGs), which launched last year.
This, in turn, made us think more deeply about knowledge transfer, as SDG 9 is focused on industry, infrastructure and innovation.
The metrics we included in our table on this SDG are:
- Percentage of research publications about industry, innovation and infrastructure in the highest-cited journals
- Number of patents that cite research conducted by the university
- Number of university spin-offs
- Research income from industry per member of staff
These metrics were devised using lessons from our World University Rankings, and we believe that what we learned from the University Impact Rankings will now inform the next generation of WUR.
Of these four indicators, the one that I think should also be part of the World University Rankings (in addition to industry income) is the number of patents. The idea behind this metric is not to count the patents a university itself produces from its research; rather, it is to count the patents, whoever files them, that cite the university’s research.
This gives us a very different, and complementary, view of the relationship between a university and industry and measures the very thing that the triple helix model is expected to develop.
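As a rough sketch of the counting logic, with invented paper and patent identifiers, the metric looks for any overlap between a patent’s cited references and a university’s publications:

```python
# Hypothetical sketch: counting patents that cite a university's
# research, rather than patents the university itself files.
university_papers = {"doi:10.1000/a1", "doi:10.1000/a2", "doi:10.1000/a3"}

# Patent records: patent id -> set of cited publication identifiers
# (invented data; real sources would be patent-office citation files).
patent_citations = {
    "US1111111": {"doi:10.1000/a1", "doi:10.9999/x9"},
    "US2222222": {"doi:10.9999/y8"},
    "EP3333333": {"doi:10.1000/a2", "doi:10.1000/a3"},
}

# Count patents, filed by anyone, that cite at least one of the
# university's publications.
n_citing_patents = sum(
    1 for cited in patent_citations.values() if cited & university_papers
)
print(n_citing_patents)  # -> 2
```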
Do you have ideas about how we can improve our rankings? Send suggestions and questions to us at profilerankings@timeshighereducation.com.
Duncan Ross is chief data officer at Times Higher Education.