Journal rankings help define, not distort

January 6, 2011

Dennis Tourish argues that journal rankings are haunting business schools and being used to micromanage researchers ("Publish or be damned", 16 December). He suggests that they distort scholarship by devaluing work published in unranked journals and by encouraging people to publish in highly ranked journals that also have high rejection rates. He wants to see a moratorium on the use of these lists.

Tourish's comments take little account of history, the economic context or the challenges confronting the leaders of UK business schools. Journal lists are a feature of the academic landscape and cannot be wished away. However, they can be improved through debate, and, when combined with other measures, they enable academics to understand the rules of their profession so that they can make a better contribution to society and the economy.

Journal lists and guides have been a feature of the work of academic researchers in many subject areas for at least 40 years. Where once rankings of relative quality were passed on informally within a small number of elite departments, these ratings are now public and subject to scrutiny and debate.

The research selectivity exercise, introduced in the mid-1980s and the precursor to the research assessment exercise, was developed to justify why universities received significantly more funding than polytechnics and other higher education institutions, and also to rationalise the differential funding of different departments.

With the transformation of polytechnics into universities in 1992, larger numbers of academics faced having their research assessed and there was a need to legitimise and understand how judgements were made. In business and management, this expansion also made it clear that it was not possible to read all of the work submitted.

How, then, could the work be rated consistently and fairly? One way was to rate the quality of the editorial processes that had assessed it before submission.

Over the next 18 years journal lists emerged from an ever-wider range of UK business schools. These lists were based on analyses of the judgements of successive research assessment exercise panels and were used internally to hone RAE submissions.

As journal lists became more widely available in post-1992 universities, the number of researchers also increased significantly. Unfortunately, funding to support this work did not expand at the same rate. With the funding cake growing more slowly than the number of researchers, it is not surprising that many went hungry. Nor is it surprising that many managers focused more acutely on the activities and measures that bolster funding, which research ratings influence once they are translated into league table rankings.

The Association of Business Schools' Academic Journal Quality Guide is a publicly available document that uses public data and clearly defined methods. It is not one institution's view of the world, but the result of a systematic review of many institutions' views. It is not perfect; very few things are.

The easiest way to square the circle of despair outlined by Tourish is for three things to happen.

First, the number of issues of the most highly rated journals should be increased, thereby reducing rejection rates. This move would make sense, because these journals are the most highly read and the most highly cited.

Second, the editors of UK journals who feel that their journals are poorly rated should consider the ways in which the rating of these journals can be improved or the journals discontinued.

Third, and perhaps most importantly, academics and their institutions should find other sources of funding for their research work, such as commercial book contracts, applied research, consultancy or conference activity.

In an increasingly market-oriented and privatised sector, it is these activities, more than the wishing away of academic journal lists, that will assure the future employment of academics. As these things become more important to UK universities, the measures and ratings methods will change. We will be damned if they don't.

Charles Harvey, Newcastle University; Aidan Kelly, Goldsmiths, University of London; Huw Morris, University of Salford; Michael Rowlinson, Queen Mary, University of London; Editors, Association of Business Schools Academic Journal Quality Guide.
