Thursday, October 17, 2019

Be Careful What You Measure

"Be careful what you measure, because that's what you'll get" is a management platitude dating back at least to V. F. Ridgway's 1956 Dysfunctional Consequences of Performance Measurements:
Quantitative measures of performance are tools, and are undoubtedly useful. But research indicates that indiscriminate use and undue confidence and reliance in them result from insufficient knowledge of the full effects and consequences. ... It seems worth while to review the current scattered knowledge of the dysfunctional consequences resulting from the imposition of a system of performance measurements.
Back in 2013 I wrote Journals Considered Harmful, based on Deep Impact: Unintended consequences of journal rank by Björn Brembs, Katherine Button and Marcus Munafò, which documented that using the Impact Factor to rank journals had caused publishers to game the system, to the detriment of the integrity of scientific research. Below the fold I look at a recent study showing similar damage to research integrity.

Citation gaming induced by bibliometric evaluation: A country-level comparative analysis by Alberto Baccini, Giuseppe De Nicolao and Eugenio Petrovich (summary here) shows that the introduction of a research assessment scheme based on bibliometrics caused Italian researchers to massively game the system. Their study uses:
a new inwardness indicator able to gauge the degree of scientific self-referentiality of a country. Inwardness is defined as the proportion of citations coming from the country over the total number of citations gathered by the country.
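The definition translates directly into a small computation. Below is a minimal sketch (my own illustration, not the authors' code) of how inwardness could be computed from a list of citation links, where each link records the author countries of the citing and the cited paper; the function name and the toy data are hypothetical:

from typing import Iterable, Tuple

def inwardness(citations: Iterable[Tuple[set, set]], country: str) -> float:
    """Share of a country's incoming citations that come from papers
    with at least one author from that same country.

    Each element of `citations` is a (citing_countries, cited_countries)
    pair of author-country sets for one citation link."""
    total = inward = 0
    for citing_countries, cited_countries in citations:
        if country in cited_countries:          # citation received by the country
            total += 1
            if country in citing_countries:     # ...and coming from the country itself
                inward += 1
    return inward / total if total else 0.0

# Hypothetical toy data: (citing paper's author countries, cited paper's author countries)
links = [
    ({"IT"}, {"IT"}),        # Italian paper citing an Italian paper
    ({"DE"}, {"IT"}),        # German paper citing an Italian paper
    ({"IT", "FR"}, {"IT"}),  # Italo-French paper citing an Italian paper
    ({"UK"}, {"DE"}),        # not a citation received by Italy
]
print(f"Inwardness (IT): {inwardness(links, 'IT'):.0%}")  # 2 of 3 -> 67%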
[Figure 1 from Baccini et al.: inwardness of the G10 countries, 2000-2016. Source]
Their Figure 1 shows that inwardness increased gradually across the G10 countries over the period 2000-2016. But starting in 2011 Italy (red) pulled away from the pack, rising over the remaining six years from 6th place (20% inwardness) to 2nd (30%). They identify the cause as a change in the incentives for researchers:
The comparative analysis of the inwardness indicator showed that Italian research grew in insularity in the years after the adoption of the new rules of evaluation. While the level of international collaboration remained stable and comparatively low, the research produced in the country tended to be increasingly cited by papers authored by at least an Italian scholar.

The anomalous trend of the inwardness indicator detected at the macro level can be explained by a generalized change in micro-behaviours of Italian researchers induced by the introduction of bibliometric thresholds in the national regulations for recruitment and career advancement. Indeed, in 2011 research and careers evaluation were revolutionized by the introduction of quantitative criteria in which citations played a central role. In particular, citations started being rewarded in the recruiting and habilitation mechanisms, regardless of their source.
The change in the research and career evaluation:
created an incentive to inflate those citation scores by means of strategic behaviors, such as opportunistic self-citations and the creation of citation clubs
[Figure from Baccini et al.: inwardness plotted against the rate of international collaboration. Source]
The change is even more obvious when they plot inwardness as a function of the rate of international collaboration. Researchers gamed the system so effectively that:
Before 2010, Italy is close to and moves together with a group of three European countries, namely Germany, UK, and France. Starting from 2010, Italy departs from the group along a steep trajectory, to eventually become the European country with the lowest international collaboration and the highest inwardness.
So what did the Italian authorities expect when they made hiring, promotion and research funding dependent upon citation counts? Similarly, what did academics expect when they made journal prestige, and thus tenure, depend upon citation counts via the Impact Factor? Was the goal to increase citation counts? No, but citation counts are what they measured, so that's what they got.

1 comment:

  1. The Italian case may be special, but the analysis is missing the political dimension: since Reagan/Thatcher many governments have tried, and largely succeeded, in bringing under ideological control an academia they saw as infected by "communists", using two principal methods:

    #1 Switch to grants awarded very competitively, directly to individual Principal Investigators, turning them into "dog eat dog" small businessmen with a conservative mentality, and universities into business parks renting out space to the groups of those Principal Investigators, who became in effect tenants. A very much desired side effect of this was the casualization of research employment, away from corporate and government labs, with their expensive permanent jobs with benefits, to Principal Investigator contractors offering short-term subcontracts to PhD students and Research Associates.

    #2 Effectively take away from Universities control of academic promotions, by forcing them to appoint as tenants, I mean Professors, the biggest recipients of research contracts, which in turn depend on grants from government and corporate sponsors, and on publishing in "top journals" whose editors and reviewers are appointed by government and corporate sponsors. In practice this has meant that, in the main, only researchers fully aligned with the ideological preferences of the editors and reviewers of "top journals", and fully endorsed by government and corporate funding sponsors, would make career progress.

    Point #2 applies mostly to the social disciplines, in particular political economy, sociology, political studies, while the humanities have been somewhat less controllable because of their modest funding requirements.
