Is employability data being manipulated?

The Destinations of Leavers from Higher Education survey is being abused, according to one former university employee

March 12, 2015

When I started the post, it was said that the DLHE would make me hard of hearing. In no time at all, they laughed, I’d be hearing ‘barrister’ not ‘barista’

A management position at a university on a meteoric rise through the league tables, reporting impressive levels of graduate employability. A job that sounds too good to be true? In my case, it was.

When I was appointed to a senior position in a university’s employability department a few years ago, I was full of optimism and excitement. The university’s results were improving year on year, and its graduates were obtaining fantastic jobs. I had high hopes that we would continue this upward trajectory.

Part of my role was to help manage the Destinations of Leavers from Higher Education survey, the annual review of alumni’s employment or educational status six months after their graduation.

The survey process is governed at a national level by the Higher Education Statistics Agency and is required by the Higher Education Funding Council for England, but it is conducted in-house, by universities themselves.

Over three months, each university tries to reach its recent graduates via post and email, but predominantly by telephone. Hesa sets high targets, requiring 80 per cent of UK residents who studied full-time to respond.

As anyone working in higher education will know, the outcomes of the survey are vital. Poor results, such as an above-average rate of unemployment, can push a university down the pecking order in national newspaper league tables; the figures are also published at course level on the Unistats website, influencing many would-be students’ decisions about where to apply.

The two pieces of information that are most critical to league table positions are average graduate salaries and the proportion of graduates in “graduate-level” or “professional” jobs. When I started in the post, I was the butt of jokes – it was said that the DLHE would make me hard of hearing. In no time at all, colleagues laughed, I’d be registering “barrister” not “barista”. Their jests turned out to be not far from the truth.

It goes without saying that different courses will have different levels of success when it comes to graduate destinations. Those such as social care and teaching, generally speaking, lead to graduate-level employment. The same cannot be said of programmes such as dance and heritage studies.

Soon after my arrival, I learned that managers had devised a traffic light system designed to reflect this. Courses generally providing positive results were marked green on a spreadsheet; those providing negative results were marked red; amber lay somewhere in between. This provided the framework for targeting the survey and maximising positive results. Graduates of “green” courses were called obsessively, whereas graduates of “red” courses were often not called at all. This system controlled and altered the end results beyond all recognition.

But the dishonest manipulation of the survey results did not end there. With speculation in the pages of Times Higher Education that graduate salary data could be linked to the tuition fees that universities are allowed to charge, graduate earnings became an increasingly hot topic throughout my employment. Senior managers in the employability department soon began to discuss new plans to target graduates in good jobs in order to guarantee the desired flattering results.

One of the most unsettling incidents took place towards the end of my first survey cycle when it became apparent that the university’s average salary figure was looking very low compared with previous years. Managers got worried. To my shock, I suddenly found myself being instructed to make any salary below £10,000 a year “disappear” because these were bringing the average down. When I refused to do this, a colleague was provided with a login to the system and made the alterations.

Completed forms were selectively “lost” if they showed a low salary, a poor job or unemployment. At one stage, when the hoped-for results looked out of reach, I was asked to instruct the call centre team to stop writing the results of their phone calls directly on to the forms and to record the information on Post-it notes instead. The Post-its could then be assessed, and only positive examples would be written up properly. Forms showing unemployment or lowly work destinations were “filed” separately or never completed.

All graduate destinations on forms were coded. Teaching assistants, for example, should be coded as category “6” – classed as a non-graduate destination. A loophole in the guidelines, however, allowed these to be coded as a “2”, considered to be graduate-level. I was asked daily by senior management to see if I could nudge the codes upwards. The coding process is conducted in-house, and is subjective, ambiguous and not policed.

Courses providing positive results were marked green while those providing negative results were red. This provided the framework for targeting the survey

Other examples of fraudulent practices included cutting short telephone calls when the graduate appeared to be unsuccessful and encouraging them to halt the survey. This meant that a potentially damaging form could be neutralised by turning it into a “prefer not to say” result. Similarly, when graduates were found to be in low-paid jobs away from home, telephonists were encouraged to mark them down as “travelling”. Telephonists were also instructed to persuade graduates that their low-level position contained supervisory elements so that the role could be coded as “management level”, even if this involved resorting to blatant flattery.

And while the use of social media is not approved under the official DLHE guidelines, the university flouted this rule by cyberstalking graduates and gleaning information from their Facebook, Twitter and LinkedIn profiles in order to complete forms. This information was not time-bound and much of it was probably inaccurate – yet it was presented to Hesa as if it had been gained from a telephone interview conducted within the survey period.

As a result of the misleading survey findings, the university made some questionable business choices based on incorrect management information. A prime example of this was found in the Faculty of Art. Our telephonists were encouraged to ask art students with no employment or with low-level positions whether they had ever made artwork for family and friends and whether they had carried out any “portfolio-building” since graduating. No matter how tenuous, this allowed the telephonist to enter “self-employed” on the survey form – a result that is clearly preferable to “unemployed”. However, the flip side of this was that academic staff in the Faculty of Art were so satisfied with their apparently high graduate employment figures that they rejected careers adviser support on the basis that their graduates did well on their own. This meant that hundreds of graduates were denied professional careers guidance on the strength of flawed and massaged figures.

It distressed me to see how some colleagues in the employability department suffered because of the university’s desperate need to get good survey results, as the relentless drive resulted in a disturbing level of control and bullying from senior management positions.

Within days of starting the job I recognised the importance of saving all emails, recording conversations and ensuring that I didn’t go into meetings alone. I faced a constant battle to ensure that I remained transparent and on the correct side of the moral line. For this, I was described as militant and obstructive. There were implied threats to my job security if I failed to comply.

Instead of being genuinely interested in the valuable management information being produced by the survey, the university had an attitude of “make it happen”. I was told regularly that if I could get good results from the survey for the university, then my progress through the ranks would be unhindered – as if I had the power to control graduates’ destinations.

For the higher education sector, the key concern should be oversight of this process. In many respects, the regulations and rules set out by Hesa are robust – the telephonists’ scripts, the guidelines, the paperwork. However, Hefce and Hesa are arguably complicit in the deception and misleading of our young, would-be undergraduates. Their audit processes are based on checking completed forms against the data that have been entered. There are no proper checks on the validity of these data, no measures to detect the fraudulent activity that can be conducted between a conversation with a graduate and the recording of data on a form. Why isn’t a sample of the results checked with graduates for accuracy?

Has Hefce heard anything about these fraudulent practices? Do such things go on at other institutions? I’ve heard that they do. Perhaps the sector is turning a blind eye to these goings-on in order to maintain the veneer of success.

Whatever the truth, the lack of policing of the Destinations of Leavers from Higher Education survey leaves it wide open to distortion and makes a mockery of the government’s ambitions – set out in its 2011 White Paper – to “empower prospective students” by providing them with better information.

‘A reasonable amount of latitude’: How Hesa and Hefce ensure data integrity

The Destinations of Leavers from Higher Education survey is overseen by the Higher Education Statistics Agency and conducted by higher education institutions following the agency’s guidelines.

A spokesman for the Higher Education Funding Council for England explains that “these guidelines allow a reasonable amount of latitude so the institutions can tailor the overall process so that it suits local circumstances, for example approaching alumni by email and phone before sending a letter”.

The funding council, he continues, takes “any suggestion of manipulation of data in our surveys very seriously and will investigate the allegations, but we cannot act on an anonymous statement about an unnamed institution.”

The spokesman explains that routine scrutiny of the data is devolved to Hesa, “who do extensive checks to ensure that data are internally consistent and that response rates are met”.

However, Hefce also “undertake[s] audits of DLHE in order to test the systems at HEIs for collecting the data”. The spokesman says that “currently this does not involve direct contact with alumni to verify their responses/status”.

He adds that the funding council hopes that the Small Business, Enterprise and Employment Bill currently passing through Parliament “will significantly enhance the information on graduates’ employment status and salaries”. The bill, he explains, “includes provision to enable linking of student data to longer term earnings data under appropriately controlled circumstances”.

A spokesman for Hesa says that the data collection process for the survey taking place six months after graduation “includes extensive quality assurance mechanisms”.

“First-stage validation applies a large number of ‘hard checks’ to ensure that, for example, recorded dates fall within expected ranges, valid categories have been recorded for particular fields, comparisons between data items lead to consistent results,” he says.

“Failure to meet these validation checks results in the data submission being rejected, forcing the Hep [higher education provider] to correct the data and resubmit.

“Second-stage checks include verification processes whereby data aggregations and summaries are generated from each Hep’s submission – these being returned to that Hep for scrutiny and also being scrutinised by the data quality assurance team at Hesa. During this process a data quality system is used to record, investigate and resolve data quality issues involving Hesa working closely with each Hep.”

Once any issues have been resolved, the head of each higher education institution is asked to sign off each data submission before it is “committed” to the Hesa database.

He adds that “Hesa itself does not have powers of audit.”

Times Higher Education staff

