The researchers, led by the University of Cambridge, analysed more than 12,000 research papers on breast cancer cell biology. After narrowing the set down to 74 papers of high scientific interest, less than one-third – 22 papers – were found to be reproducible. In two cases, Eve was able to make serendipitous discoveries.
The results, reported in the journal Journal of the Royal Society Interface, demonstrate that it is possible to use robotics and artificial intelligence to help address the reproducibility crisis.
A successful experiment is one where another scientist, in a different laboratory under similar conditions, can achieve the same result. However, more than 70% of researchers have tried and failed to reproduce another scientist's experiments, and more than half have failed to reproduce some of their own experiments: this is the reproducibility crisis.
“Good science relies on results being reproducible: otherwise, the results are essentially meaningless,” said Professor Ross King from Cambridge’s Department of Chemical Engineering and Biotechnology, who led the research. “This is particularly critical in biomedicine: if I’m a patient and I read about a promising new potential treatment, but the results aren’t reproducible, how am I supposed to know what to believe? The result could be people losing trust in science.”
Several years ago, King developed the robot scientist Eve, a computer/robotic system that uses techniques from artificial intelligence (AI) to carry out scientific experiments.
“One of the big advantages of using machines to do science is that they’re more precise and record details more accurately than a human can,” said King. “This makes them well suited to the job of attempting to reproduce scientific results.”
As part of a project funded by DARPA, King and his colleagues from the UK, US and Sweden designed an experiment that uses a combination of AI and robotics to help address the reproducibility crisis, by getting computers to read scientific papers and understand them, and getting Eve to attempt to reproduce the experiments.
For the current paper, the team focused on cancer research. “The cancer literature is huge, but no one ever does the same thing twice, making reproducibility a huge issue,” said King, who also holds a position at Chalmers University of Technology in Sweden. “Given the vast sums of money spent on cancer research, and the sheer number of people affected by cancer worldwide, it’s an area where we urgently need to improve reproducibility.”
From an initial set of more than 12,000 published scientific papers, the researchers used automated text-mining techniques to extract statements related to a change in gene expression in response to drug treatment in breast cancer. From this set, 74 papers were selected.
Two different human teams used Eve and two breast cancer cell lines and attempted to reproduce the 74 results. Statistically significant evidence for repeatability was found for 43 papers, meaning that the results were replicable under identical conditions; and significant evidence for reproducibility or robustness was found in 22 papers, meaning the results were replicable by different scientists under similar conditions. In two cases, the automation made serendipitous discoveries.
While only 22 out of 74 papers were found to be reproducible in this experiment, the researchers say that this does not mean that the remaining papers are not scientifically reproducible or robust. “There are lots of reasons why a particular result may not be reproducible in another lab,” said King. “Cell lines can sometimes change their behaviour in different labs under different conditions, for instance. The most important difference we found was that it matters who does the experiment, because every person is different.”
King says that this work shows that automated and semi-automated techniques could be an important tool to help address the reproducibility crisis, and that reproducibility should become a standard part of the scientific process.
“It’s quite shocking how big of an issue reproducibility is in science, and it’s going to need a complete overhaul in the way that a lot of science is done,” said King. “We think that machines have a key role to play in helping to fix it.”
The research was also funded by the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI), and the Wallenberg AI, Autonomous Systems and Software Program (WASP).
Reference:
Katherine Roper et al. ‘Testing the reproducibility and robustness of the cancer biology literature by robot.’ Journal of the Royal Society Interface (2022). DOI: 10.1098/rsif.2021.0821