A guest post by the Tertiary Education Union on the Performance Based Research Fund:

A recent report on the behaviour of New Zealand’s universities around research performance funding confirms a truism – you get what you reward. And when rewards for universities are based on their ability to generate research outputs, we should not be surprised at the lengths they will go to in order to be on top.

The Tertiary Education Commission confirmed last week that universities were gaming its performance-based research funding scheme (PBRF) to make sure they could get a higher ranking in the exercise than other universities. Universities have been changing people’s employment agreements, restructuring departments and people’s jobs, and in some cases making teaching-focused academics redundant, simply so that they can appear higher on a rankings ladder than other universities.

While institutions get no more tax-payer dollars from improving their rankings in the PBRF exercise, they are fighting for prestige – to be “New Zealand’s Leading University” or our “top ranked university for research quality”. This race for rankings was not the intent of the research performance exercise and the commission now believes the system warrants change.

But the commission faces a Herculean task in reforming PBRF while trying to remain true to the core principle of performance based research funding – that you can drive up research performance with the right competitive incentives.

One thing we have learned in recent years is that universities will take the most direct route to the cash and the rankings, no matter what stands in their way (think Pamplona’s running of the bulls for the right mental image of universities pursuing performance funding).

A decade ago, when the incentive was ‘bums on seats’, tertiary institutions restructured courses and sucked students out of schools and other tertiary education providers onto their campuses. Some of the outcomes of this market model were positive – New Zealand has one of the highest tertiary education participation rates in the world. However, there were, as has been widely documented, many perverse outcomes – from massive marketing budgets to the now infamous ‘twilight golf’.

So we identified the perverse outcomes of funding based on student numbers and declared the way forward was not about mass participation but about high-end research.

Student numbers were capped and a new competitive incentive introduced centred on published research ‘outputs’ (primarily peer reviewed articles in international journals written for the academic community). 

The aim was to calculate an aggregate research performance score for each institution in order to allocate them funding commensurate with their overall research performance.

TEU has no problem with rewarding excellence in our institutions, but research outputs are only one part of the work of tertiary education staff – academics are also responsible for teaching, administrative tasks, engaging in public meetings to share their knowledge, and so on.

Yet the lure of being ranked number one has meant universities have punished award-winning teachers; asked academics to shift their focus from community research to peer reviewed journals; and questioned the future of academics who perform vital administrative roles, because this takes time away from writing journal articles.

So how do we stop universities from narrowing the role of an academic down to ‘research’, when that is what the government rewards?

TEC has proposed changing the way the ‘rankings’ are put together, by only calculating the average quality score of tertiary institutions based on those staff who actually receive one of the quality scores (As, Bs, and Cs) and excluding those who (for a whole host of good reasons) don’t even reach the bar in this narrow performance measure.

This move will not help students who have lost their favourite university teacher, departments who have lost great administrative academics, or communities who couldn’t find an academic expert to attend a public meeting because they are all too busy writing journal articles.

It is a warranted action, and it will reinforce what the PBRF exercise was about and show institutions they should ‘play by the rules of the game’ with integrity. After all, the rules are explicit: PBRF scores are not to be used for hiring and firing academics, nor for promotions or performance management. What PBRF aggregate scores are for is to provide an auditing and accountability tool that can help the government dish out the dollars ($1.6 billion over six years) to institutions.

However, changing the rules for calculating the average scores of universities is only a stopgap measure. What the latest report from the commission highlights are fundamental problems with designing performance measures in the tertiary education system. This means a much bigger debate is needed: how do we get the best out of our tertiary education staff?

TEU often argues that the evidence shows performance incentives do not work – you need only look at the mostly incentive-based packages of today’s CEOs to see that any chief executive worth his or her salt is clever enough to rort the system and generate an ever-increasing take-home pay. And the creative industries – such as major companies in Silicon Valley – are again realising that giving workers greater autonomy (rather than strict line-management) is the way to get great work from staff. Our worry is that the government sees the perverse outcomes of auditing measures like PBRF but thinks this can be solved by just adding in more performance incentives.

So instead of rethinking whether performance measures work in the tertiary sector, the government has set up a performance exercise looking at student retention and completion. For tertiary institutions the quickest route to achieving in this exercise is making sure students pass their courses. The simplest way to ensure students pass is to put pressure on academics to elevate grades (and in isolated cases this is already beginning to happen at institutions across New Zealand).

Should we just sit back and hope that gaming is not so accepted by our tertiary providers, and that they will not pervert good credentialing (the awarding of qualifications) or research performance exercises just to be ranked number one? Or should we review whether performance-based funds really help us get the best out of the tertiary education sector?

Only the commission and the government can decide where we head next, but to us it seems clear – the report into PBRF gaming shows it is time to sit down and talk about how we protect the reputation of our tertiary education sector and ensure that we are getting a well-rounded set of outcomes from universities. 

Dr Sandra Grey
TEU National President

Tertiary Education Minister Steven Joyce also said:

But he has now revealed further reforms are planned for universities, including a review of the councils and a separate review of the controversial Performance Based Research Fund (PBRF). …

Joyce, meanwhile, said funding for engineering and the physical sciences would get a boost in the Budget.

Research would also get a boost, through a modest increase in funding to the controversial Performance Based Research Fund (PBRF).

The PBRF, set up under the previous Labour administration, has been the subject of allegations of rorting by universities hiding away their least productive researchers to maximise their share of the funding. There has also been growing concern about the amount of time wasted by academics tinkering with their PBRF portfolios.

While the PBRF had been “broadly successful,” Joyce said there were problems which would be addressed in a review this year.

“Frustrating gamesmanship” needed to be addressed and the Government was also keen to look at introducing incentives for the commercial success of research.

“You make it [PBRF] all about [publishing] papers, so if somebody steps out of the system for a year to go and work on a commercial project somewhere, does that damage their PBRF score so that they can’t contribute to what the university is being asked to contribute to?” Joyce said.

There may be a way to introduce incentives to the PBRF process for commercialising research, he said.

“You’ve got to be careful because obviously, it’s fine in engineering and economics and science and things but it’s a little bit harder in humanities.”

So further change looks likely, but whether there will be agreement on what it should be is another matter.
