Predatory Publishing: "Publication Lists Are Overrated"

A long-known phenomenon was recently rediscovered by the media and has since become the subject of widespread heated debate: the issue at stake is so-called predatory publishing, in which dubious publishers offer researchers the opportunity to publish their specialist articles and lectures in open access journals – for a fee and in the absence of peer review or any other form of scientific quality control.

A Setback for the Open Access Movement

Thousands of scientists have apparently responded to such offers. In this way, many research reports have been published that would probably never have stood a chance with established peer-reviewed journals. Even though other commentators have already pointed out that the number of articles published by "pirate publishers" accounts for only a single-digit percentage of all scientific publications, the damage to the reputation of the scientific system is immense. The quality debate that has now flared up is also likely to be a severe setback for the open access movement.

The Pressure to Publish Makes Predatory Publishers Successful 

Retractions and problems with regard to reproducibility have not contained the flood of new publications.

As already pointed out, the phenomenon of predatory journals is not new. The same applies to the criticism of the publication pressure which weighs on scientists worldwide and forms the implicit basis for the business model of predatory publishing. The fact that the Deutsche Forschungsgemeinschaft (DFG) now asks applicants to list only a small selection of their publications has done little to contain the flood of hasty new publications; nor have retractions and problems with the reproducibility of published research results.

Quantitative Metrics are Overrated

Only days before the latest scandal became known, Jörg Hacker, Martin Lohse, Peter Strohschneider and I described in the Frankfurter Allgemeine Zeitung how the often exaggerated fixation on seemingly objective metrics – such as publication lists, for example – threatens the diversity of research and constitutes a barrier to the career paths of early-career researchers in particular.

As a research funding foundation, we view this development with growing concern. In our eyes, quantitative metrics dominate academia to an increasingly risky degree. Excessive transparency imperatives and permanent evaluation tend to institutionalize distrust of researchers. At the same time, researchers are expected to deliver surprising, preferably "disruptive" findings as quickly as possible – findings that will ideally also be suitable for immediate application.

More Conformity than Experimentation

The narrow time frame of funding increases the danger that curiosity as the intrinsic motivation of the individual researcher gives way to extrinsic incentives. Sooner or later, the quality of research must suffer as a result. There is a danger that risk-laden projects and purely knowledge-oriented basic research will be brushed aside entirely, and that radically new ideas will seek financial support in vain. In order to keep pace in the race for resources and jobs, early-career researchers in particular are forced to "self-optimize" their credentials towards abstract metrics and indicators – an orientation that pays too little attention to the individual and his or her academic stature.

Microscopy lab: Risk-laden projects and knowledge-oriented basic research must continue to be tackled. (Photo: Cira Moro for the Volkswagen Foundation)

Appreciation for Personalities Instead of Indicators

With its funding portfolio, the Volkswagen Foundation adopts an approach designed to counteract this trend. We are convinced that a funding organization geared to enabling scholars to take innovative research paths and develop radically new ideas must offer financial planning security over a longer period of time, unconditionally accept the risk of creative failure – and not only rely on "application prose", publication lists, and the usual quantitative metrics when reaching a decision, but also as often as possible gain a personal picture of the shortlisted applicants.

Innovative Research Requires Time, Money, and Reliability

In the case of high-risk projects (e.g. in the funding initiative "Life? - A fresh scientific approach to the basic principles of life", but also in the area of the "Freigeist Fellowships"), applicants are therefore increasingly invited to present their projects in person. Only a personal discussion enables the panel of reviewers to comprehensively assess previous research achievements, current research activities, and scientific performance in the context of an applicant’s core topic.

Quantitative Valuations Cannot Replace Qualitative Ones

Formal assessment procedures kill curiosity and make early-career researchers less willing to take risks.

Of course, the Volkswagen Foundation cannot initiate such a process with every application. Nevertheless, it would be desirable if more institutions were to follow a similar path when assessing projects and individuals. Otherwise, we fear the diversity of research is in danger.

Formal assessment procedures based on the h-index, ratings, and rankings force early-career researchers in particular to resort to forms of "self-optimization". This inevitably leads to a certain risk aversion and smothers the curiosity needed to pursue innovative lines of research.[1] This pressure to conform is already reflected in the applications received by the Volkswagen Foundation. We would like to see a much greater willingness to experiment and to set off for (as yet) unknown shores of scholarly knowledge.

The Time is Ripe for New, Critical Discussion

If the system were to switch to focusing more on the personality of individual researchers in all its complexity and to assessing the significance of quantitative indicators with greater restraint, the diversity of research would certainly benefit.

If the latest reporting on predatory publishing provides an occasion to reopen the discussion about the questionable practice of measuring researchers' "accountability", then the journalists will have achieved a great deal.

[1] Wilhelm Krull: Die vermessene Universität. Ziel, Wunsch und Wirklichkeit. Vienna: Passagen, 2017.


The library of the Humboldt University explains on its homepage what "predatory publishing" means and how academics can arm themselves against dubious publication offers: Predatory Publishing (HU-Berlin, in German)

Blog post (in German) by Carsten Könneker, editor-in-chief of "Spektrum der Wissenschaft", on the current publication scandal: "Das Publish or Perish-Diktat muss enden".