Reproducing experiments is vital to science. Being able to replicate, validate and extend previous work also speeds new research projects. In a recent Nature survey, 90 percent of researchers acknowledged a reproducibility crisis.
But reproducing scientific work remains challenging and time-consuming, partly because of the historically 'closed' nature of scientific data, as well as the lack of recognition for reproducing others' work.
Moore Foundation grantee Casey Greene and his team at the University of Pennsylvania have developed a new tool to help solve this problem by enabling reproducible computational analyses.
Greene is an assistant professor of pharmacology at the University of Pennsylvania and an investigator in the foundation's Data-Driven Discovery initiative. He and his colleague Brett Beaulieu-Jones have developed a process called continuous analysis, which provides inherent reproducibility to computational research with minimal cost to the researcher.
Continuous analysis combines Docker, a container technology akin to virtual machines, with continuous integration, a software development technique, to automatically rerun a computational analysis whenever updates or improvements are made to source code or data.
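The workflow described above can be sketched as a continuous integration configuration. The snippet below is a hypothetical, simplified example written in the style of a GitLab CI file; the job name, image tag, and script (`run_analysis.py`, `results/`) are illustrative assumptions, not the authors' actual setup:

```yaml
# Hypothetical CI configuration (illustrative only). On every push, the CI
# service rebuilds the Docker image and reruns the full analysis inside it,
# so the published results always reflect the current code and data.
rerun-analysis:
  image: docker:latest                 # assumes a CI runner with Docker available
  script:
    - docker build -t analysis:latest .                        # rebuild the pinned environment
    - docker run --rm -v "$PWD/results:/results" analysis:latest python run_analysis.py --out /results
  artifacts:
    paths:
      - results/                       # archive regenerated figures and outputs as an audit trail
```

Because the container captures the full software environment and the CI service records every run, anyone can inspect exactly which code version produced which outputs.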
This new tool allows researchers to reproduce results without contacting the study authors. Continuous analysis allows reviewers, editors or readers to verify reproducibility without manually downloading and rerunning code and can provide an audit trail for analyses of data that cannot be shared.
"Reproducibility can have wide-reaching benefits for the advancement of science," said Beaulieu-Jones and Greene in their study published recently in Nature Biotechnology. "Continuous analysis lays the groundwork needed to address reproducibility and robustness of findings in the broad sense."