Access all areas: the push to show outreach efforts pay off

Institutions have little evidence that their widening participation activities deliver, but they are working to change that, finds Chris Havergal

March 3, 2016

The footnotes to a Conservative Party press release issued ahead of last year’s general election probably didn’t register on the radar of most directors of widening participation at UK universities.

One of those footnotes, however, put access departments at the centre of what is now the Tory government’s social mobility agenda: it contained the first mention of a new target to double the proportion of young people from disadvantaged backgrounds entering higher education by the end of the decade.

Universities can point to considerable success in this area. Eighteen-year-olds from the most deprived backgrounds are more likely than ever to enrol in higher education and, in England, were 30 per cent more likely to win a place in 2015 than they were at the start of the decade.

Spending by institutions is at a record high too, with English universities and colleges detailing an outlay totalling £750.8 million in their access agreements for 2016‑17.

But being at the top of the political agenda has not been plain sailing for widening participation departments. At the start of February, Prime Minister David Cameron said that the low numbers of black and minority ethnic students winning places at the UK’s most prestigious universities “should shame our country” and claimed that “ingrained, institutional and insidious” attitudes were holding back progress.

More worryingly, perhaps, some recent evidence raises questions about the effectiveness of universities’ activities in this area. Writing in Times Higher Education in February, Danny Dorling, Halford Mackinder professor of geography at the University of Oxford, observed that although more children from the poorest neighbourhoods are going to university, their numbers are not increasing at the same rate as those from the richest neighbourhoods.

This came on the back of an analysis by Ucas, based on multiple measures of inequality, that suggested that the gap between university entry rates for the UK’s most advantaged and most disadvantaged students is wider than previously thought, and that progress on closing it has halted. And Scotland’s Commission on Widening Access, which published its interim report in November, found that a significant proportion of recent progress had been delivered in a single year, coinciding with the introduction of additional funded places.

All this might suggest that it has been the expansion of the sector, rather than specific outreach activities undertaken by universities, that has driven the diversification of the student population.

In addition, there is evidence to suggest that most of the “heavy lifting” in widening participation is being done by less prestigious universities. Hesa data show that the proportion of students with parents who were in lower-level occupations or were unemployed increased by 4.8 percentage points across UK higher education between 2004-05 and 2014-15, from 28.2 per cent to 33 per cent. But among the Russell Group of large research-intensive universities, the average rise was only 1.4 percentage points, from 19.5 per cent to 20.8 per cent.

The data – which were criticised by universities – show that seven members of the mission group, including the universities of Oxford and Cambridge, have a lower proportion of students from poorer backgrounds now than they did 10 years ago.

With universities under pressure to deliver progress, and budgets being tightened at national and institutional level, the key question must be “what works”: which outreach activities have the greatest impact on students, and which offer the best value?

Carole Torgerson, professor of education at Durham University, believes that UK institutions are not in a good position to answer that question. In a study she conducted for the Sutton Trust, she says, she “didn’t find anything that we would describe as rigorous research evaluating UK-based interventions”.

For Torgerson, “rigorous” generally means a randomised trial including a control group. This approach has been undertaken in a number of US universities, and the results indicate that the most effective ways of getting disadvantaged students into university are assigning academics and undergraduates to mentor school pupils with similar interests and, perhaps surprisingly, paying students a monthly stipend to stay in school.

Given the significant differences between the two countries’ education systems, however, it is hard to say how easily these results would translate to the UK, where, according to Torgerson, evaluations have been less sophisticated.

Simply saying that students who have taken part in a particular activity have had a particular outcome does not establish a causal relationship, especially when student populations and institutions are so diverse, Torgerson says.

Universities offer a range of widening participation activities, such as summer schools, mentoring, advice sessions and bursaries, but it could be that some school pupils who took part in a particular intervention were already predisposed towards university study, and perhaps took part in the activity for that very reason. Even if students believed that the activity was effective, how do we know that something else would not have been even more helpful?

“Where we have funding for interventions, even if it is for the best possible motive, even if the people who participate think they are effective, unless we have the rigorous experiments in place to test these out we cannot be sure that money is being spent to the greatest advantage,” Torgerson says. “You could have a series of interventions, with some of them being very effective, others being less effective, and some being harmful. Unless you have a fair test and actually look at what would have happened to the participants had they not received the intervention, you don’t actually know.”
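
To make the logic of that “fair test” concrete, here is a minimal sketch, in Python, of what a randomised comparison looks like: pupils are randomly assigned either to an outreach intervention or to a control group, and enrolment rates are then compared. Every number, including the assumed effect of the intervention, is invented for illustration; this is not a description of any real trial.

```python
# A minimal illustrative sketch of the "fair test" Torgerson describes: pupils are
# randomly assigned to an outreach intervention or a control group, and their
# enrolment rates are then compared. All figures below are invented for illustration.
import random

random.seed(42)

def enrols(received_intervention: bool) -> bool:
    """Return True if a simulated pupil enters higher education."""
    base_rate = 0.20                                   # assumed rate with no support
    uplift = 0.05 if received_intervention else 0.0    # assumed effect of outreach
    return random.random() < base_rate + uplift

pupils = 2000
# Random assignment is what allows any difference in outcomes to be attributed to
# the intervention rather than to pupils' prior disposition towards university.
treated = [random.random() < 0.5 for _ in range(pupils)]
outcomes = [enrols(t) for t in treated]

treated_rate = sum(o for o, t in zip(outcomes, treated) if t) / sum(treated)
control_rate = sum(o for o, t in zip(outcomes, treated) if not t) / (pupils - sum(treated))

print(f"Enrolment rate, intervention group: {treated_rate:.1%}")
print(f"Enrolment rate, control group:      {control_rate:.1%}")
print(f"Estimated effect of the outreach:   {treated_rate - control_rate:+.1%}")
```

In a real trial, assignment would happen before the intervention was delivered, and questions of sample size, attrition and ethics would need far more care; that evaluative machinery is precisely what Torgerson says is largely missing in the UK.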

Across the sector, there is recognition that more randomised controlled trials are needed, especially when so much money is being spent on outreach.

Graeme Atherton, head of London’s AccessHE project, agrees that it is “absolutely crucial” that universities put increased effort into evaluation.

“There is some evidence that some things appear to have a greater impact than others, but certainly, when it comes to individual institutions, they have to build much more robust evaluation into their work,” he says. “Standardised trials could be part of that, but so could looking at more qualitative ways of trying to capture the impact of what they do.”

Randomised controlled trials are increasingly commonplace in school-level education, and across the social sciences more generally, so it is perhaps surprising that it is higher education – where many of the researchers conducting such trials can be found – that is lagging behind.

The reasons for this are complex, but some in the sector would point to a long-standing disconnect between widening participation teams and educational research departments in universities.

“The connection between most widening participation departments and faculty or the rest of the institution is not often strong or well established,” says Atherton, formerly the executive director of the Aimhigher West, Central and North London partnership, which also sought to boost participation by non-traditional students. “While universities have academics with the knowledge that is required, they will sometimes not want to do what is required. Academics want to publish papers, but publishing papers and evaluating widening participation work aren’t the same thing; you won’t get a paper out of [the latter].”

Another commonly cited problem is a lack of training in research methods among widening participation staff. Les Ebdon, director of fair access to higher education and head of the Office for Fair Access quango, acknowledges that this is a problem.

“In widening participation work, you tend to recruit a lot of really enthusiastic people who want to get out there and do things, and maybe reflection isn’t their strongest suit,” he says. “One of the roles I have played is to go to universities and say: ‘What research are you doing?’ or ‘Why haven’t you engaged your educational researchers?’ Occasionally, the response has been: ‘That’s a very good idea.’”

Some critics would also question whether discovering “what works” in creating a genuinely diverse student cohort is a priority for universities, which may be focused more on recruitment – and on meeting numerical targets negotiated with Ebdon. For example, a study by researchers at the University of the West of England, published at the end of 2015, claims that higher education institutions have focused their outreach efforts on schools that are perceived to offer easy access to bright students from poor backgrounds, meaning that pupils in more isolated areas – arguably those who most need help – are missing out.

A second study conducted by the same university, published last summer, was critical of outreach teams’ use of neighbourhood-level data on higher education participation: 53 per cent of the senior managers surveyed said that recruiting an advantaged student from a disadvantaged area would represent as good a result as (39 per cent), or a better result than (14 per cent), recruiting a disadvantaged student from an advantaged neighbourhood.

Most universities would reject any suggestion that they have deliberately ignored attempts to develop best practice in widening participation or that they are not undertaking evaluative work now.

King’s College London, for example, is beginning to undertake randomised controlled trials. The institution works closely with researchers to assess the success of its programmes, and it is collaborating with widening participation departments at other universities to conduct peer reviews of their respective activities.

But Anne-Marie Canning, director of widening participation at King’s, highlights that the relative youth of many access teams, and the fact that these teams have often been moved around institutions during the course of their existence, mean that their research functions have taken time to mature.

“Widening participation practitioners were in the past seen as people who ran summer schools; now we are seen as a core part of the policy environment and we know that means being evidence-led,” Canning says. “Now I feel I can talk to people like Becky Francis [professor of education and social justice at King’s] and other colleagues at the same level, but I don’t think it would always have been the case in the past. Also, until the £9,000 tuition fee money came through, widening participation departments were funded in a very different way and probably didn’t have as much money. [Now] we probably have more scope to build in evaluation from the beginning.”

Canning adds that while many universities are undertaking evaluative work, much of it currently remains internal. In large part, this is simply down to not having a good place to publish it, so a national research portal for access is high on Canning’s wish list.

The UK is not the only country grappling with the question, and Australia is perhaps one of the closest comparators. Australian higher education institutions can report some success, with students from the poorest neighbourhoods representing 17.9 per cent of all undergraduates in 2014, compared with 16.3 per cent in 2009.

But Steven Schwartz, former vice-chancellor of Macquarie and Murdoch universities in Australia and of Brunel University London in the UK, says that both countries’ sectors face similar problems. He highlights the common lack of national networks for evaluating success and the difficulty of obtaining good data.

“The Sutton Trust is most certainly correct; we do not know what works and what doesn’t in widening participation,” says Schwartz, a psychologist who led a government review of UK university admissions in 2003. “Controlled experiments are essentially non-existent. The only way to really know what works is through controlled trials in which randomly selected groups are compared.”

Looking forward, the Sutton Trust recommends that universities spend 10 per cent of their outreach budgets on “robust research trials”, and the charity will work with Offa to undertake a series of trials and to develop a common evaluation methodology.

Many in the sector welcome the way that the Sutton Trust has pushed the issue of “what works” up the political agenda. There is less consensus, however, over whether randomised controlled trials are the panacea that the sector is looking for. The University of Sheffield has led the way on evaluation with the establishment of its Widening Participation Research and Evaluation Unit, but James Busson, head of outreach and widening participation, questions whether higher education access could really be treated in the same way as a drugs trial.

“Control groups worry me because everyone’s experience is different,” says Busson. “We are very much of the mind that we contribute to students’ decision-making, we can provide them with information and guidance, but we are not the only influencers of them. The problem with a control group is that you take out other things they have done; we can label a school and say ‘that’s a widening participation school’, but if there is an engaging teacher, that will make a massive difference to students.”

Added to these complications are the ethical questions around creating a control group of students who would receive less support to access higher education: would schools agree to take part in such a study? And Busson says that as universities extend their outreach work, it is becoming harder and harder to find a group of students who have not been touched by such an activity in some way.

Sharon Smith, director of the Higher Education Access Tracker (Heat) project at the University of Kent, agrees that it would be hard to isolate the effect of a single activity that leads directly to enrolment. She says that people are still asking what the “magic activity” is. “We shouldn’t be asking that,” she says. “We should be asking what [series of interventions] we should be delivering for student progression. Students don’t just participate in one activity and engage with one university: they might engage with multiple universities.”

Already, even without randomised controlled trials, many in the sector believe that a clear consensus around what works is emerging. The emphasis, they believe, should be on intervening early, at primary school level, and maintaining support over many years. By contrast, there is much less agreement about whether bursaries help to attract students into higher education, and then keep them there.

With a view to confirming this hypothesis and building a clearer view of best practice, many sector leaders argue that the priority should be to open up data about students and the widening participation activities they take part in, rather than pursuing randomised controlled trials.

The Kent-based Heat project, which is receiving £3 million from the Higher Education Funding Council for England, may represent the best chance of success in this area. Fifty universities have signed up to the service, which allows institutions to record students’ participation in outreach activities, cross-referenced with data from sources such as the Higher Education Statistics Agency on factors such as their subsequent progress through higher education. Subscribers are able to compare their performance with that of similar institutions, and one of the next steps, says Smith, will be to create control groups within the dataset. Working with such a large dataset, it is hoped, will allow much more robust conclusions to be drawn about the effectiveness of packages of support.
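
As a rough illustration of the kind of linkage Heat is designed to enable, the sketch below joins hypothetical outreach participation records to equally hypothetical progression outcomes and compares participants with non-participants drawn from the same data. Nothing here reflects Heat’s actual schema, systems or results.

```python
# An illustrative sketch, not Heat's actual schema or API: outreach participation
# records are cross-referenced with later progression data, so that participants
# can be compared with non-participants drawn from the same linked dataset.
# All pupil IDs, activities and outcomes below are hypothetical.

outreach_records = [
    {"pupil_id": "p1", "activity": "summer_school"},
    {"pupil_id": "p2", "activity": "mentoring"},
    {"pupil_id": "p3", "activity": "summer_school"},
]

# Linked outcome data (standing in for HESA-style progression records)
progression = {
    "p1": {"entered_he": True},
    "p2": {"entered_he": False},
    "p3": {"entered_he": True},
    "p4": {"entered_he": False},   # no outreach recorded: a potential comparator
    "p5": {"entered_he": True},
}

participants = {record["pupil_id"] for record in outreach_records}
comparators = set(progression) - participants

def entry_rate(pupil_ids) -> float:
    """Share of the given pupils who entered higher education, per the linked data."""
    linked = [progression[p]["entered_he"] for p in pupil_ids if p in progression]
    return sum(linked) / len(linked) if linked else float("nan")

print(f"Entry rate, outreach participants: {entry_rate(participants):.0%}")
print(f"Entry rate, comparison group:      {entry_rate(comparators):.0%}")
```

At national scale, the same principle applies: the larger and better linked the dataset, the more credible the comparison groups that can be constructed within it.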

“I don’t think it should be individual universities doing it. I think we as a collective and a community need to be doing it, because it won’t be easy,” Smith says. “These are difficult questions, so we should be doing it together.”

Nevertheless, the Heat project does face obstacles. In particular, the current unavailability of data from Ucas means that institutions are unable to undertake analysis of offer-making and, if students do go on to enter higher education, universities have to wait for Hesa to provide more information.

Directors of widening participation complain that if they want to find out where the young people they worked with ended up, they are often reliant on individual students answering the phone and telling them.

Negotiations between Heat and Ucas, which cites students’ concerns about data security, continue. But Canning says that opening up data is what the government’s recently unveiled Social Mobility Working Group, of which she is a member, needs to “crack”.

“We need a golden thread of data from kids at school, to Heat, to Ucas, to DLHE [the Destinations of Leavers from Higher Education survey] and labour market data,” she says. “We need to join it up if we want a picture of what social mobility looks like; at the moment it is too piecemeal.”

Whatever the solution is, be it randomised controlled trials or more and better data – and the answer is probably a mix of both – these are not questions that are going to be solved soon. This might sound dispiriting when directors of widening participation are under pressure to show results quickly, but, if the most effective interventions really are those that start in primary school, it will be some time before the pupils being targeted get around to applying to university.

In the meantime, universities may continue to rely on more qualitative approaches. And, even when more robust data are available, it will be important not to lose the insights that this type of evidence can offer, Canning concludes.

“Hard evidence is one thing, but widening participation is a human endeavour, and we know the kids and see them progress and sometimes it can feel very different to how the research evidence base looks,” she says. “A student who does a programme here but goes to another institution…has got a good outcome for them, but it might not look like a success in a report.”


Fair play: not social engineering, just an effort to find those who will thrive at university

Some years ago, I participated in a debate at the Oxford Union. The motion before the house was: “Admission to university should not be based on marks alone.”

The opposition team likened university applicants to runners in a footrace. Just as the fastest runner wins the race, applicants with the highest entrance examination scores deserve to “win” university selection. Considering factors such as the type of school applicants attended, their family income or the obstacles they may have had to overcome was unfair because it could lead to someone with lower marks getting in ahead of someone with higher marks.

Our team argued that the footrace analogy was flawed. In a race, runners are all required to begin from the same starting line. This is not true in life. Some people are disadvantaged – by illness, poverty or circumstance. Because money can buy the best schooling, university admission is like a race in which some runners are doped.

Despite the cogency of our arguments and our brilliant presentation, my team lost. And poll after poll shows that the general public would also have voted against us. People understand the educational value of diversity and believe in equal opportunity, but they value fairness most of all. And they consider giving special assistance to groups who have been disadvantaged to be unfair. But is that really a coherent viewpoint?

One reason that qualified students from deprived backgrounds sometimes don’t apply to university is that they lack accurate information about their chances of being accepted. An information programme for students, parents and teachers would counteract this, and few would object.

Some low-income students are also prone to thinking that university culture is not for them. But, again, few would object to a programme of outreach and open days to address this.

What about running special training programmes such as summer schools and foundation years to help applicants from deprived backgrounds reach the necessary standard? True, they are available for only some students, but they do not guarantee admission: they just level the playing field. Because diversity in the classroom is valuable in itself, most people would agree this is fair, too.

But then we come to assigning bonus points to certain students based on their background. Research shows that candidates who had the grit to struggle through a poor school and achieve reasonable marks perform better at university than candidates with similar marks who were better supported at private schools. Uncovering such hidden talent is not “social engineering” as some have alleged; it is simply an attempt to find the students who will fare best at university.

Hard quotas, in which places are reserved for disadvantaged students, are particularly difficult to justify. When I was dean of medicine at the University of Western Australia, the country had only a tiny number of indigenous doctors. As they could serve as important role models, my colleagues and I wanted to admit indigenous students whose marks were slightly lower than the entry standard. Although it is now routine for the university to consider factors other than marks in the admissions process, in those days rejecting a student with higher marks in favour of one with lower marks was considered grossly unfair. In the opinion of the public and the press, it was reverse discrimination. Passions ran high until someone suggested that extra places (beyond the approved allocation) be created for indigenous students. Don’t ask me why, but this was somehow perceived to be fairer. By the way, the university has since graduated many indigenous doctors, the majority of whom met the normal standard for admission.

Widening participation remains controversial, and anyone who advocates it must expect to be challenged. Nevertheless, the effort is worthwhile. By widening participation, we can ensure that universities remain engine rooms of social mobility. When it comes to building a fair society, who could ask for more than that?

Steven Schwartz is the former vice-chancellor of Macquarie University and Murdoch University in Australia, and of Brunel University London. This article is based on a speech he made last year at the University of Bedfordshire.


Transformative: helping ex-offenders and others with unruly lives into higher education

Like world peace or the eradication of poverty, widening participation is a term that rarely inspires a negative response. But for the commercially savvy academic production lines of the UK’s university sector, widening participation is often little more than virtue signalling as they pile them high in an effort to reach the holy grail of 50 per cent participation.

Meanwhile, for the most disadvantaged, the very idea of going to university remains as terrifyingly remote as it did in the 1950s. For Joe Baden, director of what is arguably the UK’s most innovative widening participation scheme, a career in armed robbery was a far more realistic option. After a spell on remand in Belmarsh prison, Baden was given a two-year community service order for violent affray and possession of a firearm. “I thought education was for people from another world, but I agreed to join a basic skills programme to placate my probation officer,” he tells me.

He soon discovered that he enjoyed academic work, and he eventually graduated in 1998 with a BA in history from Goldsmiths, University of London. In 2002, he launched the Open Book project at Goldsmiths, with the aim of helping ex-offenders and those with addiction and mental health problems to get into education. “We don’t make excuses for people or try to forgive people,” Baden emphasises. “No one has the right to forgive me or to say they are ‘empowering’ me. People can only empower themselves.”

The project provides both practical and emotional support, offering mentoring, drop-in and study sessions, and an outreach programme that includes visits to prisons, hospitals and residential units. Open Book has helped more than 200 students to find places on undergraduate courses (mostly at Goldsmiths and other London institutions) and has a total of 500 people on its books, studying a range of degree subjects from foundation to postgraduate level.

Azra is a typical Open Book student, whose backstory includes: “drugs, homelessness, prison, lots of crime, lots of drugs. I came here with no background in college or school. I got lots of help in coming to terms with who I am. They made me believe.”

Open Book, explains Baden, is about empathy, expertise and the determination to instil confidence in individuals whose backgrounds render the dreadful cliché “non-traditional student” a crass understatement. For instance, one student telephoned Baden in the middle of the night, immediately after a serious shooting incident at her home. She was worried that this might result in the late submission of an essay.

Through their own life experiences, and their immersion in the often unruly lives of their students, the Open Book team, who are all successful alumni, provide a powerful reminder of the transformative power of education and the true meaning of widening participation.

Dick Hobbs is professor in the Institute for Culture and Society at the University of Western Sydney and emeritus professor of sociology at the University of Essex. This piece is based on an article he wrote for the online magazine Spitalfields Life in December.

POSTSCRIPT:

Print headline: Are we connecting?

Reader's comments (1)

There is a strong case for improving data analysis and evaluation to support fair access. This recent analysis - https://timdracup.wordpress.com/2016/02/12/the-missing-520/ - applies a methodology previously used by the Sutton Trust and the Social Mobility Commission to consider progress by English Russell Group Universities against the POLAR3 UKPIs. It questions the value for money achieved from the current annual investment in fair access by these institutions - and identifies perceived threats to 'institutional autonomy' as an obstacle to further progress. An even greater priority is to establish a mechanism to secure coherence across the demand and supply sides of the market for fair access support. NNCOs apart, negligible progress is being made in that direction. It will not be sufficient simply to reinvent AimHigher. Something more is necessary.
