A Manifesto for Reproducible Science – Munafo et al. - 2017 - Article


This paper argues for the adoption of measures to optimize key elements of the scientific process: methods, reporting and dissemination, reproducibility, evaluation, and incentives.

What is the problem?

Scientific creativity is characterized by identifying novel and unexpected patterns in data. A major challenge is to remain open to new and important insights while avoiding being misled by our tendency to see structure in randomness. Apophenia (seeing patterns in random data), confirmation bias (focusing on evidence that fits our expectations or favoured explanation), and hindsight bias (seeing an event as having been predictable after it has occurred) can, in combination, lead us to false conclusions.

When several potential analytic pipelines can be applied to high-dimensional data, false positives become highly likely, because researchers can (often unintentionally) settle on whichever pipeline yields the most striking result, as the simulation below illustrates.
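To make this risk concrete, here is a minimal simulation sketch (my own illustration, not from the paper; the group size and the number of pipelines are arbitrary assumptions). It generates pure-noise data, runs ten alternative 'pipelines' (here simply ten different outcome variables) and reports a finding whenever any of them reaches p < 0.05:

  import numpy as np
  from scipy import stats

  rng = np.random.default_rng(0)
  n_simulations = 5_000
  n_per_group = 30   # assumed sample size per group
  n_pipelines = 10   # e.g. alternative outcomes, covariates, or subgroups

  false_positives = 0
  for _ in range(n_simulations):
      # Both groups are drawn from the same distribution: the true effect is zero.
      group_a = rng.normal(size=(n_pipelines, n_per_group))
      group_b = rng.normal(size=(n_pipelines, n_per_group))
      p_values = stats.ttest_ind(group_a, group_b, axis=1).pvalue
      if p_values.min() < 0.05:   # keep whichever pipeline "worked"
          false_positives += 1

  print(f"False-positive rate: {false_positives / n_simulations:.2f}")
  # With 10 independent pipelines this is roughly 1 - 0.95**10, i.e. about 0.40
  # instead of the nominal 0.05.

Committing to a single pipeline in advance (preregistration, discussed below) removes this flexibility and keeps the error rate at its nominal level.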

What are measures that can be implemented when performing research? (methods)

Protecting against cognitive biases

An effective way to reduce self-deception and unwanted biases is blinding. Preregistration is an effective form of blinding because, at the point when the study and analysis plan are specified, the data do not yet exist and the outcomes are not yet known.

Improving methodological training

Many threats to the robustness of science could be reduced by better methodological and statistical training. Most important is training in research practices that protect against cognitive biases and the influence of distorted incentives. In the absence of formal requirements for continuing methods education, the best solution may be to develop educational resources that are accessible, easy to digest, and immediately and effectively applicable to research (e.g. brief, web-based modules on specific topics).

Implementing independent methodological support

Many clinical trials have multidisciplinary trial steering committees that advise on and oversee the design and conduct of the trial. The need for such committees arose because financial conflicts of interest can exist in clinical trials (e.g. the sponsor may be the company manufacturing the product being tested, and may intentionally or unintentionally influence the design, analysis, and interpretation). Including independent researchers can alleviate these influences.

Encouraging collaboration and team science

This is a solution to the lack of resources individual investigators have for achieving adequate statistical power. Distributed collaboration across study sites enables high-powered designs and offers more potential for testing generalizability than relying on the limited resources of single investigators, as the rough power calculation below illustrates.
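As a back-of-the-envelope illustration (my own sketch, not from the paper; the effect size and sample sizes are assumptions), detecting a small effect at conventional power requires far more participants than a typical single lab can recruit:

  from statsmodels.stats.power import TTestIndPower

  analysis = TTestIndPower()

  # Sample size per group needed to detect a small effect (Cohen's d = 0.2)
  # with 80% power in a two-sided test at alpha = 0.05.
  n_needed = analysis.solve_power(effect_size=0.2, alpha=0.05, power=0.80)
  print(f"Participants needed per group: {n_needed:.0f}")   # roughly 394

  # Power achieved by a single site that can only recruit 30 per group.
  power_single = analysis.solve_power(effect_size=0.2, alpha=0.05, nobs1=30)
  print(f"Power with 30 per group: {power_single:.2f}")     # roughly 0.12

Splitting recruitment across, say, ten collaborating sites makes the well-powered design feasible, and variation between sites additionally allows generalizability to be examined.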

What are measures that can be implemented when communicating research? (reporting and dissemination)

Promoting study preregistration

Preregistration can range from registering a basic study design to a detailed pre-specification of study procedures, outcomes, and the analysis plan. It was introduced to address two problems: publication bias and analytical flexibility (in particular, outcome switching).

  • Publication bias (the 'file drawer' problem): more studies are conducted than are published, and those with null or negative results are the ones most likely to stay unpublished.
  • Outcome switching: changing the outcomes of interest in a study depending on the observed results.

Improving the quality of reporting

Preregistration improves the discoverability of research, but it does not guarantee its usability. Improving the quality and transparency of research reporting is needed to address this. The Transparency and Openness Promotion (TOP) guidelines offer standards that journals and funders can use to incentivize or require more transparency in the planning and reporting of research.

Registered Reports (RRs) – an initiative to eliminate various forms of bias in hypothesis-driven research, specifically the evaluation of a study based on its results. RRs divide peer review into two stages, before and after the results are known. First, reviewers assess a detailed protocol that includes the study rationale, procedure, and analysis plan. Publication of the study outcomes is guaranteed if the authors adhere to the approved protocol, meet pre-specified quality checks, and draw conclusions that are appropriately bound by the evidence. RRs prevent publication bias because articles are accepted before the results are known, and they neutralize p-hacking because the hypotheses and analysis plans are fixed in advance.

  • The main objection against RRs is that the format limits exploration or creativity by requiring authors to follow a pre-specified methodology. However, RRs place no restrictions on creative analysis practices or serendipity.

Authors can report the outcomes of any unregistered exploratory analyses, as long as those tests are clearly labelled as post-hoc.

What are measures that can be implemented to support verification of research? (reproducibility)

Promoting transparency and open science

The credibility of scientific claims rests on the evidence supporting them, which includes the methodology applied, the data acquired, and the process of implementing the methodology, analysing the data, and interpreting the outcomes.

Open science is the process of making the content and process of producing evidence and claims transparent and accessible to others.

  • There are barriers to meeting these ideals, including vested financial interests (e.g. in scholarly publishing) and few incentives for researchers to pursue open practices.
  • Commercial and non-profit organizations are building new infrastructures like the Open Science Framework to make transparency easy and desirable for researchers.

What are measures that can be implemented when evaluating research? (evaluation)

Diversifying peer review

Pre- and post-publication peer review mechanisms accelerate and expand the evaluation process. Sharing preprints enables researchers to get quick feedback on their work from a diverse community, instead of waiting months for a few reviews in the conventional, closed peer review process.

Promoting data sharing

Data sharing means depositing data in public repositories, which offers advantages in terms of accountability, data longevity, efficiency, and quality (reanalysis can catch crucial mistakes or fabrication). Several initiatives encourage it:

  • Badges acknowledging open science practices – the Center for Open Science has suggested that journals assign badges to articles with open data (and other open practices such as preregistration and open materials).
  • The Peer Reviewers’ Openness Initiative – researchers who sign this initiative pledge not to offer comprehensive review for any manuscript that does not make its data publicly available or provide a clear justification for withholding it.
  • Requirements from funding agencies – NIH intends to make public access to digital scientific data the standard for all NIH-funded research. NSF requires submission of a data-management plan outlining how data will be stored and shared.

What role do incentives play?

Publication is the currency of academic science: it increases the likelihood of employment, funding, promotion, and tenure. Positive, novel, clean results are more likely to be published than negative results, replications, and results with loose ends. Consequently, researchers are incentivized to produce the former, even at the cost of accuracy, and these incentives increase the likelihood of false positives in the published literature. Shifting the incentives offers a chance to increase the credibility and reproducibility of published results. There will always be incentives for innovative outcomes, but there should also be incentives and rewards for transparent and reproducible research.

Conclusion

Challenges to reproducible science are systemic and cultural, but that does not mean they cannot be met. The measures described here offer practical and achievable steps to improve rigor and reproducibility.
