Massive Global Meta-Science Collaboration Reveals Significant Challenges to Research Credibility Across Social and Behavioral Sciences
Global SCORE initiative finds only half of social science claims replicate, highlighting the urgent need for data transparency and open science reform.
By: AXL Media
Published: Apr 1, 2026, 11:15 AM EDT
Source: Information for this report was sourced from the Center for Open Science

A Large-Scale Audit of the Scientific Record
The Systematizing Confidence in Open Research and Evidence (SCORE) program represents one of the most ambitious efforts to date to quantify the reliability of published research. By sampling claims from nearly 4,000 papers published between 2009 and 2018 in 62 journals, the initiative sought to move beyond anecdotal concerns about a "replication crisis." The sampled disciplines spanned a wide range of fields, including economics, psychology, criminology, and political science. Funded by the U.S. Defense Advanced Research Projects Agency (DARPA), the project aimed to build a more objective, multi-method system for assessing whether a scientific claim can actually be repeated and verified by independent parties.
The Three Pillars of Research Repeatability
A fundamental contribution of the SCORE findings is the formalization of three distinct dimensions of research credibility: reproducibility, robustness, and replicability. While these terms are often used interchangeably in casual conversation, the researchers established rigorous technical definitions for each. Reproducibility focuses on whether the original data yields the same results when the original analysis is repeated. Robustness tests whether the findings hold up when different, but equally reasonable, analytical methods are applied to the same data. Finally, replicability asks if the same phenomenon can be observed when the experiment is conducted again with entirely new data. The program’s results suggest that a paper might succeed in one of these areas while failing in others, indicating that credibility is a multidimensional trait.
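The distinction among the three dimensions can be made concrete with a toy example. The sketch below is purely illustrative and is not the SCORE program's actual methodology; the datasets, the published result, and the tolerance thresholds are all hypothetical values chosen for demonstration:

```python
import statistics

original_data = [2.1, 2.5, 1.9, 2.8, 2.3]  # hypothetical original sample
new_data = [2.0, 2.6, 2.2, 2.4, 2.7]       # hypothetical independent sample

def original_analysis(data):
    """The 'published' analysis: the sample mean."""
    return statistics.mean(data)

def alternative_analysis(data):
    """An equally reasonable alternative analysis: the median."""
    return statistics.median(data)

published_result = 2.32  # the claim as reported in the hypothetical paper
tolerance = 0.25         # arbitrary threshold for 'same conclusion'

# Reproducibility: same data, same analysis -> same result?
reproduced = abs(original_analysis(original_data) - published_result) < 0.01

# Robustness: same data, a different reasonable analysis -> similar conclusion?
robust = abs(alternative_analysis(original_data) - published_result) < tolerance

# Replicability: new data, same analysis -> same phenomenon observed?
replicated = abs(original_analysis(new_data) - published_result) < tolerance

print(reproduced, robust, replicated)  # → True True True
```

In this toy case the claim passes all three checks, but the checks are independent: a paper could reproduce exactly from its own data yet fail to replicate with a fresh sample, which is precisely the multidimensional picture the SCORE findings describe.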
Transparency Barriers and the Reproducibility Gap
One of the most striking findings from the program concerns the lack of transparency in the original publications. Out of a sample of 600 papers, only 24% provided the raw data necessary for a reproduction attempt. When data and code were both available, the success rate for precise reproduction reached 77%; however, when researchers had to reconstruct datasets from public sources, that rate plummeted to just 11%. This data confirms that "open science" practices—sharing the underlying ingredients of a study—are the single most important factor in allowing other scientists to verify a discovery. Without these shared materials, the "hard work" of verifying published findings becomes difficult or impossible for independent researchers.
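A quick back-of-the-envelope calculation puts the reported figures in concrete terms (the per-condition attempt counts are illustrative assumptions, not numbers from the program):

```python
# Arithmetic on the reported SCORE transparency figures.
sample = 600
share_with_data = 0.24
papers_with_data = round(sample * share_with_data)
print(papers_with_data)  # → 144 of 600 papers provided raw data

# Expected successes for a hypothetical 100 reproduction attempts
# under each condition, using the reported rates:
with_data_and_code = round(100 * 0.77)   # data and code both available
reconstructed_only = round(100 * 0.11)   # data rebuilt from public sources
print(with_data_and_code, reconstructed_only)  # → 77 11
```

The gap between 77 and 11 expected successes per 100 attempts is the "reproducibility gap" the section's heading refers to.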
Related Coverage
- New spatial kidney map identifies high-risk B cell clusters as major driver of rapid diabetic kidney failure
- Nutritional timing breakthrough: Post-meal metabolic state found to durably enhance T cell immune performance
- New AI Breakthrough Uses Single Blue Whale Call to Unlock 25 Years of Hidden Underwater Acoustic Data
- Biostatistics Experts Develop R Package for Automated Flowchart Generation to Enhance Scientific Research Reproducibility