How to serve up your HE data to policymakers

Academics who want their findings to be used but not abused should bear eight things in mind, says Nick Hillman

January 31, 2019
Source: Michelle Thompson

Policymakers use masses of data and, typically, display an instinctive trust in the numbers put before them. But many of them know little about data gathering and processing, or the responsible uses of metrics.

They clearly need help in sorting, understanding and using data. That is true for politicians and civil servants but is equally true for everyone else influencing policy – whether in strategic roles within universities, thinktank land or the media.

At the Higher Education Policy Institute, we publish new data – including, with Advance HE, the best available figures on UK students’ perceptions of value for money, workload and well-being. But we consume more than we produce.

This has taught us eight important things that academics should bear in mind if they hope to clear the fog that clouds the interface between data producers and policymakers.

  1. Accuracy and precision are not the same. Policymakers want something between the number on the back of a cigarette packet and a figure correct to the fifth decimal place. When preparing a piece on where student fees really go, I was often told that the data could not be made available because they had rough edges. That is unpersuasive because when policymakers are presented with precise numbers, their first instinct is to round them up or down to make them more memorable.
  2. The currency of politics is killer facts. If you want the spotlight shone on a specific policy area, you need to find the single most striking statistic about it. One example regarding university access is the shockingly small proportion of care leavers in the UK making it to higher education – currently thought to be 12 per cent. Conversely, if you think that a data point will be misunderstood, put the caveats in flashing lights up front. Otherwise, the number will be milked until it turns to cheese.
  3. Ministers do not always know what data they want. The teaching excellence framework is a good example. When it was launched, ministers had little interest in the details: they just wanted teaching and learning measured. The TEF has become uber-complicated partly because the government accepted criticisms aimed at the early model rather than sticking with its original “not big, bossy or bureaucratic” formulation.
  4. Some data are always wrong. One instance is opinion poll data on how people will react to student funding changes. Before 2012’s tripling of the tuition fee cap in England, a majority of full-time students said they would not have gone to university if fees had been significantly higher, and most part-time students suggested they would still have gone even if their fees had been much higher. Both claims turned out to be wrong. Academic studies that do a better job of assessing the veracity of the sentiments behind such statistics could be very valuable.
  5. The higher education sector has agency in data presentation. UK universities’ own figures typically suggest that only about half of their income from fees goes on teaching and learning. But such numbers tend to exclude IT, library and building costs. If we dislike headlines about how little money goes on teaching and learning, it is in our gift to change them.
  6. Contextualising difficult data is better than blocking publication. Some people in universities come close to saying that the government’s Longitudinal Education Outcomes data, which record the salaries that students on particular courses at particular universities go on to earn, should not be published because of the huge disparities they reveal. But the argument that knowledge should be restricted is especially unpersuasive when it comes from academics, whose own research typically involves contextualising difficult findings.
  7. The data gaps need filling. It may feel as though there is no shortage of data on UK higher education. But, in fact, lots of crucial numbers are lacking. For example, while undergraduate students are regularly surveyed, staff and applicants are less often questioned. It is true that when we start counting things, they can take on undue importance; we can end up knowing the price of everything and the value of nothing. But, equally, we cannot evaluate what counts unless we sometimes count it. On important issues like the mental health of staff, we still know remarkably little. The mistake we make in higher education policy is not counting too many things; it is assuming they all matter for every institution. It is refreshing when a university bravely states that it does not want to be fabulous at everything and instead defines a small number of areas where it does want to excel; that may well be the path to the greatest success.
  8. Data are not always enough. We also need more qualitative evidence in higher education, because big data can obscure powerful human stories. This is perhaps especially true for research policy. We talk about how much money is being spent more often than we tell stories about how particular research grants are curing life-limiting diseases, protecting the planet or promoting peace. Such stories can bring an issue to life for a politician or official in a way that data alone cannot.

Nick Hillman is director of the Higher Education Policy Institute.

POSTSCRIPT:

Print headline: Want policymakers to heed your HE data? Serve facts up nicely
