What’s the top university in the world?
Is it the UK’s University of Oxford? Is it Australia’s Western Sydney University? Or South Africa’s University of Johannesburg? Or Singapore’s Nanyang Technological University (NTU)?
Actually, it is all of them.
Each of these institutions is currently listed as the top university in the world in one of Times Higher Education’s various international university rankings.
Oxford comes first in the traditional World University Rankings 2024, which are focused on global research-intensive universities. Western Sydney tops the overall Impact Rankings, which capture universities’ social and economic impact, based on the metrics and measurements behind the United Nations’ Sustainable Development Goals (SDGs).
Johannesburg leads one of the multiple Impact Rankings for individual SDGs, for SDG 1 – no poverty. Meanwhile, NTU is the world’s number one institution in the Young University Rankings, which assess universities aged 50 years or under to identify high-performing research institutions that lack the benefits of great historical wealth and prestige.
All of these institutions, and many more, can be top of the world because of a simple, unequivocal fact: there is no single, definitive ranking of universities. There is no “correct” university ranking.
These league tables are ultimately based on the subjective decision-making of their compilers – their choices of which metrics to employ and, crucially, what weight to give those metrics will have a significant impact on universities’ overall scores and ranking positions.
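The point can be made concrete with a toy calculation. The sketch below is purely illustrative – the university names, metrics, scores and weights are all invented – but it shows how two defensible weighting schemes applied to identical underlying data can crown two different “number one” universities.

```python
# Illustrative only: invented universities, metrics, scores and weights.
def composite(scores, weights):
    """Weighted sum of per-metric scores (weights sum to 1)."""
    return sum(scores[metric] * w for metric, w in weights.items())

universities = {
    "University A": {"research": 95, "teaching": 70, "impact": 60},
    "University B": {"research": 70, "teaching": 85, "impact": 95},
}

research_led = {"research": 0.6, "teaching": 0.3, "impact": 0.1}
impact_led   = {"research": 0.1, "teaching": 0.3, "impact": 0.6}

top_research = max(universities, key=lambda u: composite(universities[u], research_led))
top_impact   = max(universities, key=lambda u: composite(universities[u], impact_led))

print(top_research)  # University A tops the research-weighted table
print(top_impact)    # University B tops the impact-weighted table
```

Same data, different compiler choices, different champion – which is exactly why no single table can be “correct”.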
This may seem obvious, and I have been saying it for years, but responsible rankers need to state this repeatedly and bluntly: no matter how robust, rigorous and comprehensive the underlying data is, no ranking is set down on a tablet of stone, and none should be taken as any kind of definitive list.
So what else do I believe responsible rankers are obliged to do? Here’s my eight-point guide to ranking responsibly.
Make clear that there is no single model of excellence in world higher education; embrace diversity and do not judge all institutions against any single one-size-fits-all ranking system
The original “world university rankings”, based largely on research metrics, have been criticised for encouraging universities around the world into conformity – a homogenisation of the global higher education sector into a primarily Anglo-Saxon ideal, exemplified by Harvard University, which frequently comes first in such league tables. The world rankings do indeed focus on a global, research-led elite, made up of no more than perhaps a few thousand universities with broadly similar global profiles and goals: recruiting from the same pool of international talent, publishing in international academic journals, collaborating with each other and enjoying a global brand, reputation and reach.
The make-up of this global research elite is steadily diversifying – indeed, there are now more Asian universities listed in THE’s World University Rankings than there are institutions from Europe or North America, and Africa’s representation is growing significantly. The traditional world rankings are an entirely appropriate way to monitor and benchmark such like-minded universities. Indeed, the World University Rankings have proven to be a helpful and closely watched barometer of the shifting geopolitics of knowledge.
But while this league table serves these global research universities well, it is vital to recognise that we need different metrics and ranking systems to understand different missions and contexts.
This is why THE is proud to have simultaneously named several universities as the world’s number one – each ranked on different missions, goals and contexts, with different metrics and methodologies to capture them. Indeed, the proliferation of rankings is helpful in democratising this space, giving universities mission-specific choices and sending the message that there is no “correct” or definitive ranking outcome. The enormous global diversity of higher education is a great strength to be protected and celebrated.
Recognise the inherent inequalities in the world
There is no question that traditional rankings are dominated by the Global North – indeed, the US and the UK are the only two countries that make the world’s top 10 in the research-led league table. Although the data shows a slow but steady global levelling-up, the world rankings do reflect the realpolitik of an unequal world. But THE has, as far as possible, taken steps to mitigate any unfairness and bias: we apply purchasing power parity calculations to all financial data; we normalise most world ranking indicators for institutional size to ensure that quality is rewarded over quantity; and we ensure that our academic surveys, distributed in multiple languages, are statistically representative of the real distribution of scholarship worldwide, not dominated by Western voices.
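Two of the mitigation steps above – purchasing power parity adjustment of financial data and size normalisation – can be sketched in a few lines. The figures and PPP factors below are invented for illustration; the real methodology is more involved, but the principle is that a smaller institution with strong per-head resources should not be outranked simply by an institution’s sheer scale or currency strength.

```python
# Illustrative only: invented institutions, income figures and PPP factors.
def ppp_adjust(local_spend, ppp_factor):
    """Convert local-currency spending into PPP-adjusted terms."""
    return local_spend / ppp_factor

def per_staff(value, staff_count):
    """Normalise an absolute indicator by institutional size."""
    return value / staff_count

big   = {"income_local": 5_000_000_000, "ppp": 1.0, "staff": 8_000}
small = {"income_local": 900_000_000,   "ppp": 0.5, "staff": 1_000}

big_score   = per_staff(ppp_adjust(big["income_local"], big["ppp"]), big["staff"])
small_score = per_staff(ppp_adjust(small["income_local"], small["ppp"]), small["staff"])

# The smaller institution's adjusted income per staff member comes out higher,
# so quality per head is rewarded over raw size and currency advantage.
print(big_score, small_score)
```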
Crucially, we do not charge universities to take part in any of our rankings exercises.
In addition, with new rankings, THE is finding ways to uncover excellence in universities that are not dependent on traditional wealth, age or reputational advantages. The Impact Rankings, for example, use the SDGs as the framework for performance measurement – these are a truly international set of targets and measurements developed through international consensus. The Impact Rankings allow for a wide diversity of missions and contexts, recognise local and national as well as international work, and do not over-weight traditional research metrics, giving due weight to teaching, community engagement and outreach. Data collection is also organised flexibly enough not to require the large, well-resourced institutional data teams more typical of richer, Western universities.
The Impact Rankings are therefore shining a spotlight on an exceptionally diverse array of universities across the continents, with many in the Global South taking world-leading positions in arenas that do not require vast wealth – in providing social mobility, for example, or promoting equality, or in upskilling communities for decent work and economic growth, or driving health benefits for communities.
These analyses are giving universities in the Global South due credit and worldwide recognition for their excellent work.
Be honest that it is inherently crude to reduce universities to a single, composite score
Responsible rankers must be clear: while an overall, composite ranking score and position can provide a readily accessible and helpful overview of an institution’s or a nation’s broad strengths, there are many, many great things that universities do that can never be captured and reflected in a ranking. Sometimes the data, for example on teaching quality or graduate outcomes, may not be comparable across multiple countries and systems, or it simply may not exist at all. Moreover, great niche strengths, in, say, teaching innovation or strong industry links, can be buried in a composite score that might weight other aspects more highly.
Responsible rankers, including THE, acknowledge this. We encourage users on our website to break down the overall composite scores and reorder the rankings based on their preferences, across a range of different performance areas – teaching, research, international outlook or industry links in the case of the World University Rankings. Alongside the disaggregated performance pillar scores, there are subject-level rankings; there is supplementary data on each institution, for example on its size, its staff-to-student ratio or gender mix, and of course there are written profiles of each university to put all the data into context. For the Impact Rankings, there’s a separate ranking for each and every one of the 17 UN SDGs.
In addition, THE’s subscription Data Dashboards, available to institutions and governments alike, provide detailed and interactive, data-driven profiles of universities, moving beyond rankings to allow for bespoke, highly flexible benchmarking across a wide range of metrics, covering teaching, research, knowledge transfer, finances, social and economic impact, reputation and sustainability.
Provide methodological clarity
There are many, many university rankings systems out there, each with its own unique characteristics and methodology. So it is important for any user to always read the label. For example, the first of the global university rankings, the Academic Ranking of World Universities developed in Shanghai in 2003, is very widely cited but is based on only six indicators concerned exclusively with research. This is of course helpful for those interested in relative institutional research strengths, but it tells us precisely nothing about the student experience, for example.
THE always includes a prominent link to a detailed methodological explanation against each of its rankings, so that users have a quick and simple mechanism to immediately understand what goes into each ranking and what is – and is not – taken into account for each table.
Invest properly in data collection, validation and quality assurance
THE’s suite of university rankings draws on a range of independent data sources – bibliometric databases to help us understand research outputs and research quality, for example, and a global Academic Reputation Survey administered by our data team. But in all our rankings, at core, we rely on the voluntary submission and sign-off of basic institutional data from all our contributors. Such data can include anything from staff and student numbers and funding through to institutional policy on sustainability.
To ensure fair and meaningful comparisons, THE has developed clear and comparable international data definitions and provides very clear submission guidance. Unlike other rankings providers, we do not scrape data from websites or other public sources, and we work directly in partnership with universities to obtain accurate data – challenging institutions, in some cases, to justify any significant year-on-year variations and engaging in direct consultation over their submissions. This requires a huge amount of data support, validation and quality assurance. For the 2024 edition of the World University Rankings alone, THE’s in-house data team gathered many millions of data points from 2,673 universities across 127 countries. Working individually with each submitting university, the data team managed and resolved a total of 34,455 queries.
Acknowledge statistical limitations
One of the challenges of rankings can be the uneven distribution of data. Is the difference in the overall score between first place and 50th place the same as it is between 501st and 550th, for example? In the case of the World University Rankings, the answer is “no”; the scores become much more tightly packed as you move down the list, reducing the score differentials between each ranking position. To responsibly acknowledge this, THE uses bands. After the top 200 ranked universities, we list institutions in alphabetical order in banded groups: at first into groups of 50, then lower down the rankings into larger groups. This avoids the risk of offering false precision and the undue separation of universities that achieve very similar overall scores.
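The banding rule described above can be sketched as a simple function. This is a simplified illustration, not THE’s actual implementation: it assumes exact positions through 200 and uniform bands of 50 thereafter, whereas the real tables widen the bands further down the list.

```python
# Simplified sketch: exact ranks to 200, then bands of 50 (201-250, 251-300, ...).
# Real published tables use larger bands lower down; within a band, institutions
# would be listed alphabetically rather than by underlying score.
def band_label(rank):
    """Return the published position for a given underlying rank."""
    if rank <= 200:
        return str(rank)                    # exact position
    lower = 200 + ((rank - 201) // 50) * 50 + 1
    return f"{lower}-{lower + 49}"          # e.g. "201-250"

print(band_label(1))    # "1"
print(band_label(200))  # "200"
print(band_label(201))  # "201-250"
print(band_label(275))  # "251-300"
```

Because every university in a band shares one label, near-identical overall scores can no longer produce a spurious 30-place gap.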
Open up to external scrutiny and challenge
THE was founded in 1971 as a source of information and insights for the academics and administrators who work in universities. While our websites and datasets today attract many tens of millions of students and prospective students, our core audience remains as it was in 1971 – and we remain fully accountable to the tens of millions of academics and university professionals who are part of THE’s core academic community. While some university ranking systems are produced by consumer media titles and sit alongside other consumer rankings, of, for example, hospitals, cars and holidays, THE serves the global higher education sector exclusively. This means, of course, that our rankings and insights have perhaps the most engaged and informed – and rightly challenging – community of users of all, who hold us to the highest of standards.
Despite this constant accountability to our community, we also more overtly open ourselves up to critical friends for advice and guidance. We have external advisory boards for our World, Impact, Arab and Sub-Saharan Africa rankings, and we will be setting up additional boards soon. We were also the only global university ranking to bring in a full external audit, by professional services firm PricewaterhouseCoopers, to validate and sign off our data methods and calculations, and we hold roughly 50 open and interactive webinars, or public meetings, on our rankings data and analysis each year.
Continue to innovate
THE has been producing trusted global rankings for 20 years. But we have never stopped listening to our users, adopting new statistical techniques and seeking fresh sources of useful data. As well as the pioneering Impact Rankings, first published in 2019 and soon to include more than 2,000 universities, in 2023 we launched the first ever specialist Sub-Saharan Africa Ranking, supported by the Mastercard Foundation, to better reflect the specific developmental priorities of African universities south of the Sahara. In 2024, THE will publish the world’s first ranking of interdisciplinary science, in partnership with Schmidt Science Fellows, as well as an innovative new ranking of online learning.
Watch this space. We’ll soon be naming an even larger number of the world’s number one universities. It is the responsible thing to do.
Phil Baty is chief global affairs officer at Times Higher Education.