Knowing the score
New application helps authors, publishers address heightened reproducibility and transparency standards
SAN DIEGO—SciCrunch Inc., a scientific research content management system and collaboratively edited knowledge base of scientific resources hosted by the University of California, San Diego (UCSD), has announced the release of SciScore, an application that generates a score and supporting report that an agency, publisher or author can use to determine whether key areas of reproducibility and transparency are addressed in a manuscript. SciScore is reportedly the first working application of its kind, designed to support the preclinical research community's pursuit of reproducibility and transparency.
In January 2016, the U.S. National Institutes of Health (NIH) introduced new grant review guidelines that placed greater emphasis on areas of reproducibility and transparency, and in the process, changed the way in which grants are awarded. In response, many scientific journals—notably PLoS, JBC, eLife, AACR, MBoC and GSA— have revised their author guidelines to encourage researchers to include and emphasize the elements required for reproducibility and transparency.
“The National Institutes of Health changed its grant guidelines in 2016 to refocus the granting mechanism from novelty to rigor,” says Dr. Anita Bandrowski, neuroscience researcher at UCSD and the founder and CEO of SciCrunch. “This is a huge undertaking, and it takes time for investigators to come into contact with these guidelines.”
To use the tool, an individual author, agency or publisher uploads a manuscript to the SciScore application, which returns a score and a corresponding report assessing whether key areas of reproducibility and transparency are addressed in the manuscript. The platform uses AI and deep-learning technology to calculate a score based on evidence of randomization, blinded conduct of experiments, sample size estimation, whether sex is included as a biological variable, and cell line authentication or contamination. The application also flags resource ambiguity, such as a mislabeled or unidentified cell line. An author can improve a manuscript’s score by adding missing information, addressing experimental methods or correcting ambiguous information.
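To make the scoring idea concrete, here is a deliberately simplified sketch of how a methods section could be checked against the rigor criteria the article lists. This is a hypothetical illustration only: the actual SciScore application uses trained deep-learning models, not the naive keyword matching shown here, and all names and patterns below are the editor's assumptions.

```python
import re

# Hypothetical keyword patterns for the five rigor criteria named in the
# article. The real tool's detection logic is far more sophisticated.
CRITERIA = {
    "randomization": re.compile(r"\brandomi[sz]ed?\b", re.I),
    "blinding": re.compile(r"\bblind(ed|ing)?\b", re.I),
    "sample_size": re.compile(r"\b(sample size|power (analysis|calculation))\b", re.I),
    "sex_as_variable": re.compile(r"\b(male|female|both sexes)\b", re.I),
    "cell_line_authentication": re.compile(r"\bauthenticat\w+\b", re.I),
}

def score_methods(text: str):
    """Report which criteria were detected, plus a naive percentage score."""
    detected = {name: bool(rx.search(text)) for name, rx in CRITERIA.items()}
    score = round(100 * sum(detected.values()) / len(CRITERIA))
    return detected, score

# Example: a methods excerpt that addresses three of the five criteria.
methods = ("Animals were randomized to treatment groups and experimenters "
           "were blinded to group assignment. Both male and female mice "
           "were used.")
detected, score = score_methods(methods)
# Criteria not found (here, sample size estimation and cell line
# authentication) would appear as "Not Detected" in a report.
```

In this toy version, a "Not Detected" entry simply means no matching phrase was found, mirroring the kind of flag Bandrowski describes receiving for sex as a biological variable.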
“SciScore is a tool that produces a report; the user can choose to act on the report or not, but in at least one case a colleague acted on the recommendation of the tool,” recounts Bandrowski. “I was asked to look over a paper that a colleague was submitting to a journal. I ran the methods section through our tool, in addition to looking the paper over. The tool came back with a ‘Not Detected’ message for sex as a biological variable. Based on this, I told him that he needed to state which sex was being used and address why this was the case. Instead, he undertook a new set of experiments to determine whether the effect was the same in both male and female animals, and discovered differences that surprised him.”
Bandrowski reports that expanding the experiments in this way led to a major finding.
“This colleague did not take the easy way out; his lab went the extra mile to determine if what was true in males was also true in females,” she notes. “For SciScore, it was a single line, a tiny reminder in the report that led a good scientist to do better science.”
The application’s development was supported by Small Business Innovation Research (SBIR) program grants R43OD024432 and R44MH119094. According to SciCrunch, major publishers—including John Wiley & Sons, Nature Research and eLife—are currently piloting the SciScore application.
Manuscripts submitted for analysis are removed from the company’s cloud server almost immediately after scoring is completed, keeping the authors’ yet-unpublished information secure and private.
“We will continue to improve the tool as we are able,” says Bandrowski. “We have been talking to the Materials Design Analysis Reporting group, which is creating a checklist that should be applicable to many journals, to see which aspects of the checklist we might go after next. Ideally, a checklist like this would be implemented consistently across many publishers, and SciScore would be able to verify the author’s responses, improving the scientific literature and making it more consistent among publishers.”
“I believe that more rigorous science will benefit all of society,” she adds. “I don’t think that it is genuine to say ‘we will cure cancer with the next paper,’ but studies that are more rigorous should enable clinical trials to be based on more sound science.”