The killer quote! Prestigious institutions and prestigious journals drive irreproducibility in the life sciences - well, at least in this particular sample.
And yet another one in the ever-increasing list of analyses showing that top journals are bad for science:
"Thus, our analysis shows major claims published in low-impact journals are significantly more likely to be reproducible than major claims published in trophy journals."
To my knowledge, this is the first time that not only prestigious journals but also prestigious institutions have been implicated as major drivers of irreproducibility:
"Higher representation of challenged claims in trophy journals and from top universities"
Tips for using Jupyter notebooks as part of a reproducible workflow (one that goes from raw data to research article with a single command): https://docs.calkit.org/notebooks
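To give a sense of what such a single-command workflow can look like, here is a minimal, generic sketch (hypothetical file names, and not Calkit's own mechanism): a small build script that re-executes the analysis notebook from the raw data and then renders the manuscript that pulls in its outputs.

```python
# build.py - a minimal sketch of a "single command" reproducible pipeline.
# Hypothetical paths (notebooks/analysis.ipynb, paper/article.tex); this is
# a generic illustration, not Calkit's actual implementation.
import subprocess

def run(cmd):
    print(">>", " ".join(cmd))
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    # 1. Re-execute the analysis notebook so figures and tables are
    #    regenerated from the raw data it reads.
    run(["jupyter", "nbconvert", "--to", "notebook", "--execute",
         "--inplace", "notebooks/analysis.ipynb"])
    # 2. Build the article, which includes the figures the notebook wrote.
    run(["latexmk", "-pdf", "-cd", "paper/article.tex"])
```

Running `python build.py` then takes you from raw data to a compiled article in one step; a Makefile or a dedicated workflow tool can play the same role.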
We invite staff and students at the University of #Groningen to share how they are making #research or #teaching more open, accessible, transparent, or reproducible, for the 6th annual #OpenResearch Award.
Looking for inspiration?
Explore the case studies submitted in previous years: https://www.rug.nl/research/openscience/open-research-award/previous-events
More info: https://www.rug.nl/research/openscience/open-research-award/
#OpenScience #OpenEducation #OpenAccess #Reproducibility
@oscgroningen
7/ Wei Mun Chan, Research Integrity Manager
With 10+ years in publishing and data curation, Wei Mun ensures every paper meets our high standards for ethics and #reproducibility. From image checks to data policies, he’s the quiet force keeping the scientific record trustworthy.
Here is how PCI is elevating the standards for open and reproducible science: four already-existing features and two new ones
#OpenScience #Reproducibility
"#Reproducibility isn’t just about repeating results, it’s about making the #research process transparent, so others can follow the path you took and understand how you got there."
Listen to our new OpenScience podcast with Sarahanne Field @smirandafield
https://www.rug.nl/research/openscience/podcast/#Field
In this 10 min episode, Sarahanne reimagines reproducibility for #qualitative research.
She addresses challenges in ethical #data sharing of transcripts, and the importance of clear methodological reporting.
Reproducibili HIGH Tea with Don van Ravenzwaaij
Tue, May 20 | 2–3 PM CET
H.0431 (Heijmans) & https://osc-international.us4.list-manage.com/track/click?u=fa279e9b279d77a612c19105f&id=c54b2c5758&e=c7c7b4acc0
Learn how to de-identify data for open sharing!
Don’t miss it!
#OpenScience #Reproducibility #DataPrivacy
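For those who cannot attend, here is a rough idea of what tabular de-identification can involve: a minimal Python sketch (hypothetical column names, and no substitute for the session or for ethics/legal review) that drops direct identifiers, replaces IDs with a salted hash, and coarsens a quasi-identifier.

```python
# Minimal de-identification sketch for a tabular dataset before open sharing.
# Column names are hypothetical; real de-identification also has to consider
# indirect identifiers, small cell counts, and consent/legal requirements.
import hashlib
import pandas as pd

def deidentify(df: pd.DataFrame, salt: str) -> pd.DataFrame:
    # Drop direct identifiers outright.
    out = df.drop(columns=["name", "email", "phone"], errors="ignore")
    # Replace participant IDs with a salted one-way hash so records can
    # still be linked across files without exposing the original ID.
    out["participant_id"] = out["participant_id"].astype(str).map(
        lambda x: hashlib.sha256((salt + x).encode()).hexdigest()[:12]
    )
    # Coarsen quasi-identifiers, e.g. report age in 10-year bands.
    if "age" in out.columns:
        out["age_band"] = (out["age"] // 10) * 10
        out = out.drop(columns=["age"])
    return out

# Example (keep the salt out of the shared files):
# deidentify(pd.read_csv("data/raw/participants.csv"), salt="keep-me-private") \
#     .to_csv("data/shared/participants_deid.csv", index=False)
```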
New study: #ChatGPT is not very good at predicting the #reproducibility of a research article from its methods section.
https://link.springer.com/article/10.1007/s10115-025-02428-z
PS: Five years ago, I asked this question on Twitter/X: "If a successful replication boosts the credibility of a research article, then does a prediction of a successful replication, from an honest prediction market, do the same, even to a small degree?"
https://x.com/petersuber/status/1259521012196167681
What if #LLMs eventually make these predictions better than prediction markets? Will research #assessment committees (notoriously inclined to resort to simplistic #metrics) start to rely on LLM replication or reproducibility predictions?
New RFP Open! COS is funding open source developers to build or enhance tools that integrate with the Open Science Framework (OSF).
Got an idea to make open science more powerful? We want to hear it.
Apropos of nothing in particular, a rough estimate of the size of the papermill problem in the scientific literature:
https://bjoern.brembs.net/2024/02/how-reliable-is-the-scholarly-literature/
(from last year)
Reproducibility in #insect studies. Using a 3x3 experimental design, 3 labs, 3 species & 3 experiments, this study reveals cases of both sufficient & poor #reproducibility, highlighting opportunities for improving rigor in insect research @PLOSBiology https://plos.io/3EzuIxj
"#Reproducibility should be a key factor in all your #research, in all your projects. It costs time, but it's also shifting the time, and in the end it can save time again."
Listen to the latest episode of our #OpenScience Bites podcast with Michiel de Boer of the @Dutch_Reproducibility_Network
https://www.rug.nl/research/openscience/podcast/#de-boer
Open Science Bites is a series of short #podcast episodes - each around 10 minutes long - focusing on one specific open science practice.
Recently, I got several opportunities to discuss the reproducibility crisis in science. To help discuss that complex topic, we need to agree on a vocabulary.
My favorite one has been published by Manuel López-Ibáñez, Juergen Branke and Luis Paquete, and is summarized in the attached diagram, which you can also find here: http://nojhan.net/tfd/vocabulary-of-reproducibility.html
It's good that this topic is not fading away, but is gaining traction. "Slowly, but surely", as we say in French.
If you want a high-resolution version suitable for printing, do not hesitate to ask!
@lcheylus That is a monumental achievement! Congrats #debian! #reproducibility #reproducible
It's been known for quite some time that more prestigious journals publish less reliable #science. Now two papers provide compelling empirical evidence as to potential underlying mechanisms:
https://www.journals.uchicago.edu/doi/10.1086/733398
https://academic.oup.com/qje/advance-article/doi/10.1093/qje/qjaf010/7997678
The gist of the story is that scientists are so afraid of being scooped that they cut corners. Corner-cutting is rewarded with publications in high-ranking journals, and the successful authors then teach their students how to get ahead in science.
These two papers provide compelling empirical evidence that competition in #science leads to sloppy work being preferentially published in high-ranking journals:
https://www.journals.uchicago.edu/doi/10.1086/733398
https://academic.oup.com/qje/advance-article/doi/10.1093/qje/qjaf010/7997678
Using the example of structural #biology, the authors report that scientists overestimate the damage of being scooped, leading to corner-cutting and sloppy work in the race to be first. Faster scientists then end up publishing sloppier work in higher-ranking journals.
Is NixOS truly reproducible?
Now this would have been an interesting article if the title had been:
"Eleven strategies for getting research institutions to implement infrastructure supporting reproducible research and open science"
But why would one train researchers to do something their institution does not adequately support?
"Eleven strategies for making reproducible research and open science training the norm at research institutions"