
Biomedical scientists are struggling to replicate their own discoveries

A large number of biomedical scientists tried but failed to replicate their own studies, with many not publishing their findings, according to the survey.

The study’s authors warn that researchers’ failure to be rigorous in their work creates “serious problems of bias” and hinders innovation in science.

Their survey of nearly 1,600 authors of biomedical sciences papers found that 72 percent agreed there was a reproducibility crisis in their field.

Participants cited many contributing factors, but the one most felt consistently drove research irreproducibility was pressure to publish.

The study found that just over half (54 percent) of participants had previously attempted to replicate their own work. Of these, 43 percent failed to reproduce their original results.

Of those who tried to replicate one of their own studies, just over a third (36 percent) said they published the results, according to data published in the journal PLOS Biology on November 5th.

Lead author Kelly Cobey, assistant professor in the School of Epidemiology and Public Health at the University of Ottawa, said that respondents believed their institutions did not value replication research as much as new research.

“Until we give researchers the time, funding and space to approach their research with rigor, which includes recognizing replication studies and null findings as valuable components of the scientific system, we are likely to see only selective reporting published,” she told Times Higher Education.

“This creates serious bias problems and limits our ability to innovate and discover new things.”

Cobey said publications remain “an important, if problematic, currency of researcher success” because there is a perception that null results are not as interesting as positive ones.

“Researchers may feel that the value of their results is limited… if they are unlikely to be accepted in a peer-reviewed journal, especially a prestigious one.”

Many researchers reported that they had never attempted to replicate someone else’s research. Of the participants who had tried to replicate another team’s results, more than 80 percent were unable to reproduce them.

Cobey called for a much more rigorous, nationally coordinated system to monitor research reproducibility and researchers’ perceptions of the academic ecosystem.

“I think it is clear that problems with academic incentives continue to permeate the scientific system and that we need serious advocacy efforts and reforms if we are going to bring our research into line with best practices for transparency and reproducibility,” she said.