The Tip of an Iceberg

The Dark Sides of Open Science

Andrew Curry

Have calls for transparency in science gone too far? German science expert Stefan Hornbostel argues that some transparency is good for science – but too much can backfire, reducing the efficiency and quality of research and eroding public trust.

Over the last two decades, the open science movement has been gathering strength. At its simplest, it's a call for greater transparency from researchers. Advocates argue that openness about who's doing science, where their funding comes from, what their data looks like and how it's used will increase public trust in the scientific process and its output. The response from policymakers and scientists themselves has been transformative: Research institutions are now ranked and monitored. More and more funding agencies require data sets to be made publicly accessible. Some scientific journals have done away with anonymous peer review.

And yet when pollsters ask Germans whether or not they trust scientists, the answers are split evenly down the middle. Trust in science hasn't dropped in the last 20 years, but it's also been stubbornly hard to win over approximately half the populace despite the increased efforts to be transparent. "Only 50 percent of the population says they trust science," laments Stefan Hornbostel, head of the German Centre for Higher Education Research and Science Studies (DZHW). "For a system that's so costly, you’d expect to have higher rates."1

Perhaps, Hornbostel says, the efforts at transparency are going too far. Even before open science became a popular buzzword, science was characterized by transparency: From peer review to the scientific method itself, criticism, skepticism and scrutiny are baked into the system. Hornbostel argues that the push for transparency in science has had negative consequences that have largely been ignored. By raising expectations and leaving them unmet, efforts to increase transparency can actually have the opposite effect. "Calls for transparency result in calls for more transparency – it's a kind of loop," Hornbostel says. "Transparency is necessary, but at the same time attention to the dark side isn't evolving at the same speed."

As an example, Hornbostel cites reforms to the peer review process. Revealing the names of reviewers was intended to make peer review more transparent. But given the tightly knit nature of many scientific disciplines, this particular form of transparency can have the opposite effect: "Reviewers whose names are known might not be as honest when criticizing colleagues," he says. In a recent survey, for example, more than 40 percent of German scientists with experience as peer reviewers said the quality of reviews has gone down in recent years.2

Prof. Dr. Stefan Hornbostel

Prof. Dr. Stefan Hornbostel said during the Herrenhausen Conference that only 50 percent of the population trusts science.

Calls for transparency result in calls for more transparency.

Transparency comes at a financial cost, too. Requiring researchers to make their raw data accessible means money must be spent creating publicly accessible databases and scrubbing datasets of any information that might violate someone's privacy. For some types of research – sociology studies based on personal interviews, for example, or large epidemiological studies using medical records – adequately protecting personal information might be impossible. "Storing data is time intensive and costly," Hornbostel says. "We will therefore have to think about which data are really relevant for long-term archiving and how we want to organize forgetting – that is, deleting data."

And Hornbostel is concerned that many of the measures put in place to make research institutions more accountable – from excellence rankings based on the number of papers published in high-impact journals to increased pressure to focus on "useful" research areas – make science less efficient and productive. "Do we need less transparency in science? Clearly the answer is no," Hornbostel says. "But we need to find a balance. We need a system of checks and balances, but we also need to avoid overregulation that kills academic freedom."

Even the open access publishing movement, which charges authors to publish their work so that it can be read for free, has had unintended consequences. For every respected PLOS ONE, there are dozens of shady pay-to-publish journals with no peer review procedures. Such predatory journals have increased the uncertainty for young researchers, for whom publications are an important component of career success. "Scientists really have to check now – and research institutions should provide assistance – whether this is a good journal or a predatory one," Hornbostel says.

Prof. Dr. Stefan Hornbostel

Hornbostel is concerned that many of the standards by which research institutions are held accountable make science less efficient and productive.

Too much transparency can pose a security risk to the population.

Science communication, too, plays a role. Researchers are under pressure to communicate their results to the public, which in turn is eager to hear how their tax money is being used in the lab. But headlines and press releases rarely communicate the complexities or uncertainty built into the scientific process. Take the tenacious anti-vaccination movement: The results of one poorly designed study, quickly retracted, were widely reported before the scientific community had an opportunity to properly scrutinize or respond to the findings. Though the research has been thoroughly discredited, the damage to public health has been lasting and seemingly irreversible.

And open access to data might be a security risk, too: In a world where gene editing techniques like CRISPR make it possible to genetically engineer organisms with basic equipment, publishing lab notes on experiments with dangerous viruses might be transparent – but not in the public interest.

Hornbostel sees a lot of value in transparency. But "open science isn't the answer for all the problems we have," he says. "One has to distinguish between forms of open science that are helpful and forms that will cause more problems than they solve."

1 In general, the population's attitude is also positive: According to the Science Barometer 2017 (Wissenschaftsbarometer 2017), half of the population (50 percent) have "more confidence" in science, but only 12 percent say they have little or no confidence. (Source: Wissenschaftsrat; Berlin, October 20, 2017)

2 12 % of respondents with experience as peer reviewers indicate that the quality of reviews has deteriorated significantly; 32 % indicate that the quality of reviews has deteriorated. (Source: Neufeld, J., & Johann, D. (2016): Wissenschaftlerbefragung 2016 – Variablenbericht – Häufigkeitsauszählungen. Hannover/Berlin: DZHW.)


Information about the Event

At the Herrenhausen Conference "Transparency and Society – Between Promise and Peril", international experts discussed whether limitless transparency in politics, business, and society is detrimental or beneficial, and how to find a socially accepted balance between security and freedom, public interest and privacy. Prof. Dr. Stefan Hornbostel was one of the speakers at the event. His talk was titled "The Two Faces of Transparency: How much transparency is beneficial? How much is too much?"
