Quality or quality rorting?

The Herald reports:

The University of Auckland has been knocked off its perch in a ranking of universities on the research performance of academic staff.

Rival Victoria University of Wellington is now number one, according to the latest Performance-Based Research Fund (PBRF) Quality Evaluation, released yesterday by the Tertiary Education Commission (TEC).

The University of Auckland, which under current measures ranked first in both the 2003 and 2006 evaluations, ranked second. The PBRF is a system used to assess the research performance of tertiary education organisations.

Its results are used in the allocation of about $263 million a year in research funding.

VUW has had an active programme in place to basically rort the PBRF rankings by manipulating employment contracts and the like. I've blogged on these attempts in the past.

Now it may be that their first place is genuinely deserved, but their constant attempts to artificially manipulate the rankings mean we can't be sure.

UPDATE. A reader does some sums:

VUW put 641.54 staff members into the PBRF evaluation exercise, out of a claimed 779.91 eligible staff members (i.e. 82% of its purportedly eligible academic staff participated). On its face, this looks great – most of VUW's academic staff got included in the evaluation, meaning that its score must be a good representation of the quality of its academics as a whole. And so VUW came out top on the PBRF assessments not only for its average research quality score per participating staff member (the figure everyone is fixating on as showing the “best” research institution in the country), but also for its score for research quality of its staff as a whole.
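
That participation figure checks out. A quick sanity check in Python, using only the numbers quoted above (a sketch; the figures are as reported, not independently verified):

    # Sanity check of the 82% participation rate quoted above.
    submitted_2012 = 641.54   # VUW staff entered in the 2012 PBRF round
    eligible_2012 = 779.91    # VUW staff reported as PBRF-eligible in 2012
    print(f"participation: {submitted_2012 / eligible_2012:.0%}")  # -> 82%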

However, when you go back to the last PBRF round (in 2006), things start to look a bit strange. In 2006, VUW put up 598.5 staff for PBRF assessment – 43 fewer than in 2012. However, in 2006 VUW also reported having 988.1 staff eligible for PBRF … some 208 more than in 2012. So, in 6 years, some 21% of VUW's staff appear to have vanished from its books when it comes to PBRF purposes – despite its roll having increased by about 1000 EFTS (equivalent full time students) in that period. (I'd also note that VUW is the only University to show anything like this level of decline in staff numbers, so it cannot solely be a consequence of changing the rules on who counts as a PBRF eligible staff member.)
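
The same sort of check bears out the 2006 vs 2012 sums (again using only the figures quoted above):

    # Checking the "vanishing staff" sums from the paragraph above.
    submitted_2006, eligible_2006 = 598.5, 988.1
    submitted_2012, eligible_2012 = 641.54, 779.91
    print(f"extra staff submitted in 2012: {submitted_2012 - submitted_2006:.0f}")  # -> 43
    missing = eligible_2006 - eligible_2012
    print(f"eligible staff gone from the books: {missing:.0f}")                     # -> 208
    print(f"as a share of the 2006 eligible pool: {missing / eligible_2006:.0%}")   # -> 21%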

Then you compare VUW's reported staff numbers in 2012 to the other Universities in the process, and some other funny things start to jump out at you. VUW had 16,690.43 EFTS on its books in 2012. In comparison, AUT has 15,771.56 EFTS … about 900 fewer. Yet, somehow, AUT has 952.1 eligible staff members … some 170 more than VUW does. So despite having 6% fewer students on its books, AUT apparently has 22% more staff members who are eligible for PBRF purposes than VUW does. Which seems somewhat odd.
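
Running the VUW/AUT comparison through the same kind of check (figures as quoted above):

    # VUW vs AUT: students and PBRF-eligible staff.
    vuw_efts, vuw_eligible = 16690.43, 779.91
    aut_efts, aut_eligible = 15771.56, 952.1
    print(f"AUT has {vuw_efts - aut_efts:.0f} fewer EFTS "
          f"({(vuw_efts - aut_efts) / vuw_efts:.0%} fewer students than VUW)")           # -> 919, 6%
    print(f"but {(aut_eligible - vuw_eligible) / vuw_eligible:.0%} more eligible staff")  # -> 22%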

Equally, Otago has 18,715.9 EFTS – 2000 more than VUW. But for the PBRF exercise, Otago put 1168.24 staff members in for PBRF evaluation, out of its 1567.5 eligible staff. That is to say, despite having only about 12% more students than VUW, Otago put forward almost twice the number of staff for assessment, and apparently has twice as many PBRF eligible staff on its books. Which again seems odd.
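
And the Otago comparison (figures as quoted above):

    # VUW vs Otago: students, staff submitted, and eligible staff.
    otago_efts, otago_submitted, otago_eligible = 18715.9, 1168.24, 1567.5
    vuw_efts, vuw_submitted, vuw_eligible = 16690.43, 641.54, 779.91
    print(f"Otago has {(otago_efts - vuw_efts) / vuw_efts:.0%} more students")  # -> 12%
    print(f"submitted {otago_submitted / vuw_submitted:.2f}x as many staff")    # -> 1.82x
    print(f"and has {otago_eligible / vuw_eligible:.2f}x the eligible staff")   # -> 2.01x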

Now, you'd expect this fact to show up somewhere in the PBRF results, and it sort of does. One of the figures the Tertiary Education Commission reported is a measure of “the extent degree-level and above teaching and learning is underpinned by research at each [institution]”. In other words, how much students are actually getting their knowledge from the people doing the research the PBRF exercise is measuring. And here VUW's result plummets – from being top of the research rankings, it falls to sixth out of the nine Universities. There is one PBRF-assessed staff member for every 26 students at VUW, as compared to one for every 20 at Auckland or one for every 16 at Otago.
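
The VUW and Otago ratios can be recomputed from the figures already quoted above; Auckland's cannot, since its underlying numbers aren't given here:

    # Implied students per PBRF-assessed staff member.
    print(f"VUW:   1 per {16690.43 / 641.54:.0f} students")   # -> 26
    print(f"Otago: 1 per {18715.9 / 1168.24:.0f} students")   # -> 16
    # Auckland's 1-per-20 figure is as reported; its EFTS and
    # submitted-staff counts aren't quoted above.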

Why is this? Well, because VUW has put only a relatively small number of staff members into the PBRF exercise compared to other Universities, there aren't as many of them to stand in front of a classroom and teach the (comparatively large) number of students enrolled at VUW.

So, if we're going to treat the PBRF results as telling us anything useful about Universities, then here's the message they give. VUW apparently has a comparatively small number of researchers who produce higher quality work than their peers at other institutions. But if you are a student wanting to study at a University where your teachers are also actively engaged in research, you should be cautious about attending VUW: there will be comparatively more of you competing for the attention of those higher-quality research staff, meaning that the actual teaching you receive is more likely to come from staff members who aren't active in researching the topics they are telling you about.

Alternatively, we could recognise the PBRF process for what it is, and treat any claims that it accurately measures “quality” with more than a pinch of salt.

It is pretty clear from these numbers that VUW's rise to the top of the table has more to do with its disciplined programme of reclassifying staff than with any actual increase in quality.

The TEC has already made some rule changes in response to VUW's rorting and rigging attempts. I think they have a bit more work to do.
