Written by: Rinze Benedictus, staff advisor, UMC Utrecht & PhD candidate at CWTS, Leiden University


On 31 May 2018 Marta Teperek spoke with Rinze Benedictus about changing the academic reward system at the University Medical Center Utrecht (UMC Utrecht). The blog post below was written by Rinze Benedictus to summarise the main points of the discussion for the readers of the Open Working blog.


University Medical Centers: places where society enters the building

In the early 2000s in the Netherlands, medical faculties merged with university hospitals to become university medical centers (UMCs), with a triple task: research, healthcare and teaching. The aim was to better integrate biomedical research and healthcare, and to improve healthcare through research. This coincided with an international rise of an indicator-based view of scientific quality, as expressed by rankings and by bibliometric indicators such as the H-index and other citation measures.

On the one hand, this meant the creation of organisations where prestige was built on academic values that were increasingly informed by an indicator-based view of scientific quality. UMCs became apt producers of scientific knowledge that ‘counted’ bibliometrically. Papers published by UMCs accounted for around 40 percent of all Dutch scientific output, and they were cited well above the international average. This led to remarkable publishing rates in certain medical subfields: in cardiology, for example, the highest-producing Dutch professor authored more than 100 papers per year.

On the other hand, in terms of staff and funding, UMCs are organisations where delivering healthcare is the primary activity. They are, simply put, large hospitals where many, many patients are treated. So, from an academic perspective, “society entered the building”. This created a need to look at biomedical research conducted at UMCs from the patients’ perspective.

At the same time, the premise that producing “high quality” biomedical knowledge would more or less automatically benefit patients increasingly came under scrutiny. There appeared to be a mismatch between the mission of UMCs and the incentive and reward system for researchers.

On the road to changing academic rewards at UMC Utrecht

At the UMC Utrecht, steps have been taken over the course of many years to address this issue. In 2010, six strategic multidisciplinary research programs were formed that focused on a limited number of disease targets. Societal stakeholders were involved in the first evaluation of these research programs. In addition, support for innovation and valorisation was increased.

In 2015 the UMC Utrecht decided that societal impact of the research programs should be one of its overarching goals. To bring incentives and rewards in line with this goal, the UMC Utrecht used the nation-wide Standard Evaluation Protocol (SEP) in academic evaluation to further emphasize the societal relevance of research (at the group level). Next, changes were also introduced at the individual level: portfolios were introduced for aspiring professors and associate professors. In these documents, researchers described themselves in terms of five different aspects of their work. Portfolios replaced traditional CVs, which were often centred around publications.

The debate about incentives and rewards was significantly shaped by dean prof. dr. Frank Miedema and three other academics, who started the Science in Transition initiative in 2013. This fuelled the debates about academic evaluation systems across the Netherlands, and it triggered a range of meetings and discussions at the UMC Utrecht. All these events created an atmosphere where alternatives to existing incentives and rewards could be discussed.

The debate was also boosted by the reproducibility crisis in academia and the discussion about “research waste”. These made it very difficult to ignore the issues, or to keep claiming that publications are always up to high standards because they are peer reviewed. In the field of health research specifically, the report from the Health Council and the recent response from the minister were a wake-up call. The Health Council urged UMCs not just to try to be “excellent” but to pursue “research questions which are relevant to practice”.

Challenges

Of course, the new approach is not without critique. Researchers are concerned about the “transportability” of their CVs to other institutes or other countries: will they be recognized elsewhere for achievements that lie outside of publications? In particular, researchers engaged in basic research feel they are under pressure to demonstrate societal relevance.

Change is coming

The debate has gained momentum and is increasingly leading to actual change at universities. At Utrecht University, the portfolio is now used in all hiring and promotion procedures for professors. In addition, the Faculty of Geosciences uses the “impact pathways” approach to evaluate its research according to the SEP. The narrative-based Impact Case Studies from the UK Research Excellence Framework were an obvious inspiration.

Signs of change are also visible at the Donders Institute for Brain, Cognition and Behavior in Nijmegen (part of Radboud UMC), which introduced its Sustainable Science programme. And the Free University Amsterdam published a manifesto for “gross academic value”, which has now been linked to Open Science (in Dutch).

From a science policy perspective, the changes at UMC Utrecht resonate very well with the current push for Open Science in Europe and in the Netherlands. The openness of the research agenda, combined with open data and open access, implies a new way of doing research that requires fitting incentives and rewards.
