A University of Plymouth researcher has contributed to a large-scale international study which has highlighted that scientific conclusions can shift dramatically depending on who conducts the analysis.
The research, part of a series of articles published in the journal Nature, asked a team of 457 independent analysts from around the world to reassess data from 100 previously published studies across the social and behavioural sciences.
They included Dr Nadège Bault, Lecturer in Cognitive Neuroscience within the University’s School of Psychology, who was invited to take part having previously contributed to a similar assessment of statistical analyses in the field of neuroimaging.
All the analysts received the same dataset and the same key research question, but were given freedom in how to conduct the analysis based on their informed judgment.
Although most of the reassessments broadly supported the main claims of the original studies, effect sizes, statistical estimates, and levels of uncertainty often differed meaningfully, with the analysts reaching the same conclusion as the original authors in just one third of cases.
Importantly, the researchers say, these discrepancies were not due to a lack of expertise: experienced researchers with strong statistical backgrounds were just as likely to arrive at divergent results as others.
At the same time, observational studies proved less robust than experimental ones, suggesting that more complex data structures allow greater analytical flexibility – and thus greater uncertainty.
The project set out to strengthen the credibility of studies in the social and behavioural sciences and, its leaders say, delivers a clear message: scientific objectivity does not lie in identifying a single “true” analysis, but in making the space of plausible alternatives transparent – both in research reports and in communication with the broader scientific community.
Dr Bault, who works within the Centre for Therapeutic Ultrasound (CENTUS) and is Head of Operations for the Brain Research & Imaging Centre (BRIC), became involved in the study because she wanted to explore whether findings were reproducible when variability in data collection was set aside. She said:

“There have recently been many improvements in research practices in psychology and the social sciences, and pre-registration and the sharing of datasets and code are becoming standard practice.

“Nevertheless, there are still some blind spots. I believe this series of articles will raise awareness of the need to increase our collective expertise in statistics and to set some minimal standards for what constitutes robust analyses. For my own work, the impact will be that I will ask a second person to re-analyse the data and replicate the results for all findings I plan to publish in the future.”

The international collaboration was led by Balázs Aczél and Barnabás Szászi (Eötvös Loránd University and Corvinus University), and was conducted as part of the Systematizing Confidence in Open Research and Evidence (SCORE) program.
Aczél, Professor at Eötvös Loránd University, said: 
“These findings do not call into question the credibility of prior research. Rather, they draw attention to the fact that presenting a single analysis often fails to reflect the true degree of empirical uncertainty, and that ignoring analytic variability can lead to unwarranted confidence in scientific conclusions.”
Szászi, Assistant Professor at Eötvös Loránd University and Corvinus University, added: 
“We advocate for the broader use of multi-analyst and ‘multiverse’ approaches, especially for questions of high scientific or societal importance. Rather than seeking a single true answer, these approaches make visible how stable—or fragile—scientific conclusions really are.”