About

Promoting Robust and Reliable Research Practice in the Science of Organizations

The research community faces a number of challenges related to promoting robustness and reliability in our work. Reliable Research in Business aims to address these challenges and to discuss best practices for improving reproducibility, replicability, generalizability, and the rigor of peer review, with an explicit focus on robustness and reliability.

This initiative promotes robust and reliable research practice in the Science of Organizations and other social, behavioral, and economic sciences by focusing on the following three aspects of research activities.

Theory Development
This aspect focuses on good practices for achieving rigor, transparency, and exactness; adding replication to the research trajectory; testing existing theories in new contexts; and examining the cause, practice, and consequence of irresponsible research practices.

Methodology
This aspect addresses transparency concerning the data gathered, the design, the analysis methods, and the results; the collection of new data for a fair test of the original study; useful methodologies to uncover the conditions under which a theory may not be applicable; and the merits of registered trials.

Publishing and Reviewing
This aspect addresses debates about requiring data availability as part of the peer review process; what to do with studies and theories that fail to survive attempted refutations; and how to evaluate studies that demonstrate a lack of statistical support. The initiative includes thought leaders and major journal editors from a variety of disciplines, as well as scholars at different career stages, as part of the discussion.

Reproducibility

How to duplicate the results of a prior study using the same materials and procedures used by the original investigator

  • Theory Development: Good Practices for Achieving Rigor, Transparency and Exactness
  • Methodology: Transparency Concerning the Data Gathered, the Design, the Analysis Methods, and the Results
  • Publishing and Reviewing: The Current State of Data Sharing and Debates about Requiring That Raw Data, Whether Quantitative or Qualitative, Be Made Available as Part of the Peer Review Process

Replicability

How to duplicate the results of a prior study if the same procedures are followed but new data are collected

  • Theory Development: Adding Replication to the Research Trajectory
  • Methodology: How the Collection of New Data Would Constitute a Fair Test of the Original Study
  • Publishing and Reviewing: What to Do with Studies and Theories That Failed to Survive Attempted Refutations

Generalizability

How to verify whether the finding of a study applies in other contexts or populations that differ from the originals

  • Theory Development: Good Practices for Testing Existing Theories in New Contexts
  • Methodology: Useful Methodologies to Uncover the Conditions Under Which a Theory May Be Wrong
  • Publishing and Reviewing: Testing U.S.-based Theories outside the U.S.

Training & Developmental Guidance

  • Improving methodological approaches through rigorous training in statistics and research practices.
  • Training our next-generation scholars and publication gatekeepers on fraud detection.

How to encourage more rigorous and useful research, in terms of theorizing, study design, testing, and reporting

  • Theory Development: The Cause, Practice, and Consequence of HARKing
  • Methodology: The Merits of Registered Trials
  • Publishing and Reviewing: How to Evaluate Studies that Demonstrate a Lack of Statistical Support in a Particular Sample for Specific Hypotheses or Research Propositions