Undergraduate teaching is in crisis for one simple reason: we reward superficiality. We give away first-class and upper-second-class degrees like confetti, rewarding undergraduates whose knowledge is so limited that only a few years ago they would have done no better than a lower second. For those who have been lecturers for long enough to have witnessed this devaluation of education, it is a disturbing trend.
As part of a recent research project, I asked 30 senior academics from around the world to identify the most influential figures from the past 50 years in one particular field. Their responses revealed a reassuringly high degree of agreement, and when I looked at why particular figures had been identified, it was because of their ability to synthesise information, and in so doing, to identify the way forward in that area of research. Rather few of the most influential figures were those who made one-off discoveries.
Yet synthesis is something we don't reward. The research assessment exercise has not rewarded researchers for writing books or reviews, nor does the current system encourage us to teach undergraduates to be synthesisers. Synthesis requires a broad view and, of course, part of that broad view requires basic building blocks, but if we exclude synthesis from the way we educate undergraduates, they aren't going to do it or value it. In fact, in general, we do exactly the opposite: we reward superficial, short-term recall.
Political correctness is probably the main cause of superficiality - the obsession with making sure everyone is treated identically. Short-answer and multiple-choice examination questions are considered fairer and more objective; assessing a synthetic essay not only takes longer, it is more difficult and relies more on subjectivity. You can take fairness too far: accountability is killing scholarship.
The higher education system in the UK is set up in such a way that even in the most research-active universities and departments, the teaching of undergraduates is fundamental. If it weren't, we would have a system like the Max Planck Institutes in Germany, where top researchers are hived off into undergraduate-free zones. Even though undergraduate teaching is an integral part of the UK system, most academics hate it, and those who enjoy it acknowledge that - often despite their best efforts - undergraduates receive only a superficial education.
There are several reasons why undergraduate education seems superficial. First, superficiality is what happens at school - and is rewarded. Teaching to the curriculum in school has a powerful effect on what happens at university. At school, and increasingly at university, the emphasis is primarily on facts - the ability to recall information - and, to a lesser extent, on communication - how to write and speak. Hardly any weight at all is given to how to think.
The national curriculum creates a downward spiral: as we allow weaker students to obtain 2:1s and become teachers, they know less and less. They know less about their subject, but also less about how to think and behave in an academic manner. Let me give you an example. A colleague told me about a fourth-year (final-year) science student. The student gave a presentation to the department on his research project. At the end of the talk, someone in the audience asked whether the student had checked his measurement of a key variable to see whether it was repeatable.
"No," he said. "I've still got to do that." My colleague, however, knew that the student had done it. When he questioned him afterwards, the student admitted that he had checked the measurement of the key variable, but that it was not repeatable (ie, the values were meaningless), which is why he'd not mentioned it in the talk. Worse, he told his supervisor that he would also be leaving it out of his write-up for the same reason, imagining that the supervisor would somehow agree to be party to this deception. Despite not knowing the difference between right and wrong, or what constitutes scientific misconduct, this particular student went on to be awarded a good upper-second-class degree.
Learning to think in a scientific manner means knowing what science is and what science isn't, and what constitutes good practice. In fact, scientific misconduct seems to be distressingly common among undergraduates, and almost certainly stems from the national curriculum. When they are under pressure to complete the curriculum, teachers often openly fudge results if an experiment doesn't go quite right. They do this usually because they are short of time, not because they don't know the difference between right and wrong. I have quizzed many teachers about this kind of deception and they, despairingly, say that the system forces them into it. I have also quizzed successive groups of final-year students about scientific conduct and they all gave examples of malpractice at school in terms of the way science was done.
Second, there are too few rewards for teaching in universities. Sadly, this is exacerbated by the postdoctoral system, especially prestigious long-term postdoctoral positions, which can allow researchers to spend years without doing any teaching at all. Getting those people back on board and enthusiastic about teaching can be difficult.
Research and teaching have always been in conflict. Most people go into an academic career because they enjoy research: few do it because they enjoy teaching more than research. As a result, most academics tend to begrudge the time they have to "give up" to teaching. This is made worse by the fact that most rewards - job satisfaction and promotion - come through research success, that is, success in securing funds (and those essential overheads for the institution) from an ever-diminishing pot.
The elusive nature of research funding adds to the problem: academics spend (and often waste) inordinate amounts of time applying for funding. The pressure on them is immense. Is it any wonder that they begrudge giving time to teaching? The endless, vulgar scrabble for research funding cannot, however, account for the current disillusionment with teaching, for it has been like this for as long as I can remember. Much more important, in my opinion, is the fact that teaching is so poorly rewarded - good teaching is rarely recognised, and even more rarely is it recognised appropriately. I'm not talking about those academics who opt out (or are opted out) of research and decide to focus on teaching. What I mean is those individuals who are successful at research and are also effective, inspiring teachers - surely they are the ones who should be rewarded?
This raises another issue. Should academics be trained as teachers? Should they have a teaching degree before being let loose among our undergraduates? This is a recurring topic, and those who value teaching, but probably know little about research, often say it is a good idea. As more people enter academia via the postdoc route, and hence arrive with little experience of and often little interest in teaching, some formal qualification may become useful.
On the other hand, requiring academics to obtain a teaching qualification before being fully integrated into a department could cause resentment and do more harm than good. One solution might be to trade real teaching during the first semester (or year) for some kind of mentoring or training by skilled, respected teachers (who are also researchers) who, in turn, are appropriately rewarded for their efforts.
In a recent Times Higher Education feature ("Blind sides", 30 April 2009), Tara Brabazon identified five main benefits of a teaching qualification: (i) problem-solving facilities; (ii) improving the consistency of delivery (ie, no one should read their PowerPoint slides); (iii) awareness of the portfolio of assessment methods; (iv) management of diversity in the classroom; and (v) the ability to welcome surprises and teach the unexpected. (Hmm - the last depends on your skill as a performer, as it's not what you tell them, it is the way you tell them.) The suggestions are fair enough, but we could probably teach those five points in a few hour-long sessions.
I know of several people who have volunteered to undertake a postgraduate teaching course, and judging from their comments, it wasn't all that useful. It may seem paradoxical, but those whose job is to train teachers are often not the best people to do it. I remember when, several decades ago, one particular university's statistics department had too few undergraduates. To justify their existence, they insisted that they, as the experts, should do all undergraduate statistics teaching, whatever the department. To see how this would pan out, I went along to some of their introductory lectures. They were appalling, for although they might well have been "experts", they had no idea what kind of statistics was required in different departments, nor what motivated the different disciplines. One size does not fit all.
Being an effective teacher is largely about being motivated and caring whether or not you do a good job. Motivation, in turn, may depend on reward. If you are very clever (as most postdocs are), you may have little understanding of the kinds of difficulties the average undergraduate faces. Effective teaching is about empathy. I was impressed recently by a visiting seminar speaker from the US, who told me that he had made a career choice to teach at a liberal arts college rather than at an Ivy League university. This was a quality-of-life decision: his college did not expect him to acquire grants (although he did, and was co-principal investigator on a multimillion-dollar project), and the expectation was that he would publish just one or two papers a year (which he did - easily).
Just as we have no real way of rewarding good teaching, we have no way of doing much about bad teaching. Bad teaching can take many forms, from a dreary performance in a lecture or tutorial, to unethical exam marking. In a previous life, I had an experience that had a profound effect on my own attitude towards teaching. I was asked to second-mark a colleague's examination papers. I dutifully did so and afterwards, when I had access to the first assessor's marks, discovered to my horror that there was no correlation whatsoever between his marks and mine.
This was in the days before scripts had to be annotated; there was simply a mark. I was at an early stage in my career and felt I had done the marking as conscientiously as possible and, despite a niggle of self-doubt, I still trusted my own marking. I thought about it all weekend: what might my colleague (who was less than enthusiastic about teaching) have done instead of marking all these scripts, while still retaining a glimmer of conscience (rather than awarding marks at random)? Finally, I had it. I bet myself that the scripts had been scored on the aesthetic quality of the candidates' handwriting. I went through the papers again, this time scoring them on handwriting alone, and, to my amazement, found a strong correlation between those scores and his marks. I was horrified. I spoke to the exams officer, and was even more horrified to hear him say, "Oh well, we'll just use your marks". There was no question of taking Dr X to task; the "let's not rock the boat" attitude prevailed. And sloppy marking still occurs. Not only is it monumentally unfair to undergraduates, it is also unfair to the colleagues who have to pick up the pieces.
Third, the internet fosters superficiality. As though we needed further proof of that point, a colleague demonstrated it wonderfully recently when he paid an undergraduate to help with a research project by locating relevant papers on a particular subject. The student was bright and conscientious, but restricted his search to the internet. When my colleague checked, huge areas of relevant research were missing, so the overview was incomplete. The undergraduate, of course, had counted the number of PDFs he had downloaded and was blissfully unaware of how incomplete - and hence unscholarly - his efforts had been.
It is precisely because the internet fosters superficiality that we ought to change the way we teach. As libraries throw away kilometres of journals and bury their books in the basement to make way for computer terminals, we foster greater dependence on online resources. Of course, it is not the internet per se that's the problem; it is that we have failed to train undergraduates to use it in a scholarly way.
In the past we used to ask undergraduates to find information in the library, and compile it in a coherent way. We also expected undergraduates to retain information across years and between individual lecture courses. And we used to expect undergraduates to give a coherent verbal account to an external examiner. No more! We studiously avoid synoptic forms of assessment, such as general papers or oral examinations, or indeed anything that requires retaining knowledge for longer than one semester; or anything that requires synthesis. In fact, we avoid anything at all that might expose the superficiality of the system.
How do we rectify this situation, a situation not of academics' making? First, I think, we need to acknowledge the constraints. These are: (i) we are stuck with a national curriculum, which means we are probably stuck with the illusion of success at school - as with proposed tax cuts, it is difficult to go back to a more realistic way of life; (ii) success in universities is always going to be driven largely by research funding; and (iii) undergraduate fees are here to stay. Changing the way we educate undergraduates within these constraints requires a few bold steps. Here are some suggestions.
The first may sound a bit Orwellian - in the first year we need a strong re-education programme. We need to (gently) shake undergraduates out of their school-curriculum complacency and introduce a new set of rules. We need also to spell out that they are not customers (as Frank Furedi wrote in "Now is the age of the discontented" in Times Higher Education, 4 June 2009). Contrary to what students may think, coming to university is not like turning up at the barber's, paying for a haircut and sitting there passively while someone else does all the work. Rather, it is - or should be - more like turning up to an appointment with a personal trainer, where you are told how to get fit but have to do the work yourself.
Second, we should give fewer lectures and provide more interaction in the form of projects and tutorials. Learning to think in a critical and scholarly way requires dialogue between teacher and student. We should abandon or greatly reduce coursework, or not assess it, so that it doesn't count towards a degree, and we should introduce many more synoptic and general papers.
Third, we need fewer forms of assessment. A high diversity of assessment is deemed to be fairer, providing opportunities for students with different abilities to excel. But too many assessments simply make it harder to discriminate. Worse, many of our assessments are meaningless - especially coursework, where we have no control over who has actually done it or how long it has taken. I'd be the first to acknowledge that people learn at different rates, but coursework simply muddies the waters, at least for assessment.
Fourth, we need a better system of rewards for academics who are good researchers and effective teachers. And the best reward would be research opportunities from the institution itself - such as studentships or postdoctoral posts or simply some no-strings research funding. Rewards of this kind would be a tremendous incentive.
We may already be paying the price for an education system that excels in superficiality. Perhaps the Government should reflect on the fact that almost every aspect of its education system has stultified creativity - is it any wonder we are struggling to find creative ways out of the recession?
'FAST-FOOD PEDAGOGY': TEACHING IS NO LONGER SEEN AS THE KEY ROLE OF THE ACADEMY
Despite the public's presumption of teaching excellence at top universities, Christopher Reid says the reality is that most faculty view teaching as little more than a burden that will have only a negligible effect on their ability to acquire tenure.
At the Midwestern university where I was recently a graduate student, two professors in my small department failed their tenure reviews within a span of six years. They were, by all accounts, excellent teachers, and each had a legion of student supporters. Professors in other departments were also regularly let go, despite petitions and pleading editorials in student newspapers.
Last year, I was a visiting assistant professor in the humanities at another institution, engaged in what might be termed "fast-food pedagogy". I had a double course load, which meant little time for preparation and no time for research. To judge from an advertisement in my department directed at graduating seniors, I was paid less than an outlet manager of a regional restaurant chain.
The public think that teaching is a professor's greatest responsibility and constitutes the university's most important function. This opinion predominates because we want it to be true - in fact, we need it to be true. It is no exaggeration to suggest that the future of this country is in the hands of its teachers. Having taught at three American universities, however, I have seen little evidence that teaching is especially important. Indeed, my experience has shown me that teaching in higher education is not valued, either by administrators or faculty, nearly as much as one might suppose or hope. Forget the bromide that the university's principal mission is to act in the public good. A university is a business. Full stop. It views teaching, however inspirational or transformative, through glasses the colour of money.
When the economy soured last autumn, concomitant with talk about the negative impact on university budgets were discussions about reducing the adjunct faculty and limiting new faculty hires. In other words, the first on the chopping block are almost always the teachers, not the administrators. To be sure, administrators are let go too, but they are eliminated in the second or third wave. It is survival of the fittest at a university and those who are around the longest are the ones who control the purse strings.
Promises, promises
Most US faculty nowadays are effectively temps; known as adjuncts, they are contract employees, often without benefits, who can be shown the door for any reason. What about tenure? It is becoming an anachronism. At many places, a quarter or fewer of instructors have tenure. While some schools have removed tenure altogether, it still serves as a kind of bait to unwitting junior faculty, whom the prospect of lifetime employment lures to their first jobs.
It certainly did for me. As an inside candidate for a position in Chicago, I was told that a tenured position, which had been closed when a faculty member retired, might open up again. Some years later, when I was a first-year professor at another school in the Midwest, I was asked back for a second year with a vague promise that a tenured position could become available if I just stuck around long enough. In both instances, I eventually recognised the game being played. I decided not to be duped.
University teaching is a revolving door by necessity, since universities don't want to get stuck with employees they can't fire. From a business standpoint, tenure represents a huge financial liability. While all kinds of reasons are given for why a particular professor was not granted tenure, you don't need an MBA to see that hiring adjuncts to meet the bulk of a department's teaching load makes sounder economic sense than hiring someone forever with regular pay increases.
Where I was teaching last year, I calculated that the 16 classes another adjunct and I taught between us accounted for two thirds of the total curriculum (excluding language classes). We also found out, quite by accident, that we were paid the lowest salaries of any of the instructors in our field in the Big Ten grouping of Midwestern universities. What a great deal we turned out to be for the university.
A further point about my own situation is worth mentioning. I was nominated for a school teaching award and asked to return to teach for a second year. Before signing a contract, I asked for a very modest salary increase, noting how little I was compensated. In a curt email, the chair promptly rescinded the offer of an additional year's employment, citing budgetary reasons. The chair then made the same offer to the other adjunct in the department who, just days before, had been turned down for a post for the following year. He declined, leaving the university with no one to teach the core part of the department's curriculum the following year.
It is hard to imagine that the school was thinking about teaching quality or student welfare in these negotiations. The salary increase would still have left me at the bottom of the salary scale and I still would have been teaching twice as much as a permanent faculty member. The quality of my teaching or popularity with students, however, never came up. The simple truth is that an adjunct is viewed as an interchangeable part that can be swapped in and out as required.
A buyer's market
Contributing to this perception is the glut of unemployed PhDs. It is a buyer's market and universities know it. An adjunct is rarely in a position to negotiate anything. It is not only the university administration that diminishes teaching. In a seemingly cannibalistic move, faculty members do, too. It has become a truism among faculty at the top universities that teaching doesn't really matter. More precisely, if you're an adjunct, quality teaching won't help you to keep your job; if you're junior faculty, teaching won't help you to get tenure; and, finally, if you're tenured faculty, teaching is frequently a repetitive chore and an object of resentment.
"Publish or perish" is accurate, but only for the cream of the crop. "Teach or perish" is a joke. The idea that research at the university level should be all-important is not very convincing, however. While a good teacher contributes to students' intellectual growth and ability to think critically, the educational impact of a scholarly article is minimal, considering that academic journals commonly have only a few hundred subscribers. In the end, the principal function of research is to prove a professor's bona fides. When it comes to higher education, the popular imagination wins out. The sanguine belief that good teaching is universally esteemed trumps the reality that it isn't. The accepted credo in the academy that teaching is secondary should be seen as a threat to anyone interested in getting a degree. Universities, however, are still able to exploit the idealism of students and their parents, and the young adjuncts who, after eight years of work on their dissertations, have decided that teaching is their calling.
To some extent it is due to successful public relations and the tailwind of tradition that people still have confidence in the quality of the education offered at universities. The aura of the university and the student's pragmatic aim just to get his or her degree mask an institution in crisis. Enrolment levels may continue to soar, but so long as the value of teaching is diminished, instructors like me will continue to look for something else to do.
I am optimistic that the public's blind faith in the quality of teaching in the academy will not persist much longer. As the price of tuition continues to outpace the rate of inflation, Americans will start asking questions about the kind of education they're getting. They will awaken from their ivy-wreathed dreams of going to college to wonder about the quality of instruction and make a decision with their wallet - which, as any adjunct will tell you, is something universities themselves have become quite good at.
Christopher Reid recently taught as a visiting assistant professor of German at a Midwestern university. He is currently a university administrator in Boston, Massachusetts.