

#reproducibility


Dear HPC users,

We invite you to take part in a research survey that explores how researchers approach computational reproducibility, especially when working with HPC and cloud infrastructure. The goal is to better understand current practices, challenges, and needs around reproducible research workflows.

Survey link:
👉 ec.europa.eu/eusurvey/runner/c

The survey takes approximately 10 minutes. It is anonymous and entirely voluntary. The results will be published in a research paper and also contribute to shaping best practices and training resources that support reproducible science, such as those developed in the de.KCD project.

The survey is open until August 31st, 2025.

Your time and input are greatly appreciated in advancing more reproducible and reliable computational research.

We're now in the middle of the special roundtable session on "Living With Machines: Comparative Literature, AI, and the Ethics of Digital Imagination".

I'll speak briefly about different kinds of AI (generative LLMs, non-generative LLMs, deep learning, machine learning); when to use which and, most importantly, when NOT to use LLMs; as well as some best practices for #transparency, #reproducibility and #sustainability in this context.

Details: conftool.pro/icla2025/index.ph

www.conftool.pro · 2025 ICLA Congress - ConfTool Pro - Browse Sessions

📢 Just published our new work on federated random forests for privacy-preserving machine learning!
📄 “A Federated Random Forest Solution for Secure Distributed Machine Learning”
📌 IEEE: doi.org/10.1109/CBMS65348.2025

📂 Supplementary slides:
🔗 doi.org/10.5281/zenodo.16539345

We're advancing secure AI without sharing data. Feedback & collaborations welcome! 🚀
#FederatedLearning #PrivacyPreservingAI #MachineLearning #OpenScience #IEEE #DataScience #Zenodo #ResearchSoftware #Reproducibility
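The core idea behind a federated ensemble, training locally and sharing only fitted models rather than raw records, can be sketched in a few lines. This is an illustrative toy, not the paper's actual method or API: simple threshold "stumps" stand in for random-forest trees, and all function and variable names below are invented for the sketch.

```python
# Toy sketch of the federated-ensemble idea: each site fits a model on
# its own private data, and only the fitted models (never the raw rows)
# leave the site. The global model is a majority vote over site models.
from collections import Counter

def train_stump(rows):
    """Fit a one-feature threshold classifier on local data only."""
    best_thr, best_errs = None, None
    for thr, _ in rows:  # candidate thresholds: the observed values
        errs = sum((1 if x >= thr else 0) != y for x, y in rows)
        if best_errs is None or errs < best_errs:
            best_thr, best_errs = thr, errs
    return lambda x, t=best_thr: 1 if x >= t else 0

def federated_predict(models, x):
    """Majority vote over models trained at separate sites."""
    return Counter(m(x) for m in models).most_common(1)[0][0]

# Two "sites" with private labelled data; only models cross the boundary.
site_a = [(i, 1 if i >= 5 else 0) for i in range(10)]
site_b = [(i, 1 if i >= 4 else 0) for i in range(10)]
models = [train_stump(site_a), train_stump(site_b)]
print(federated_predict(models, 8))  # prints 1 (both stumps vote 1)
print(federated_predict(models, 1))  # prints 0 (both stumps vote 0)
```

The privacy property comes purely from what is exchanged: the coordinator sees only trained models, never the training rows held at each site.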

Docker Desktop for Statisticians revolutionises R use by creating isolated, reproducible environments. This eliminates version conflicts and simplifies setups. With Docker, you run pre-configured R containers, enabling efficient and clean analysis environments. Explore container management to enhance statistical work and ensure easy collaborative sharing. #Docker #Statistics #Reproducibility #RStats statology.org/docker-desktop-f
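The workflow the article describes, pinning R and its packages inside a container, can be sketched as a short Dockerfile. The rocker/r-ver base image is the Rocker project's version-pinned R image; the tag, package choice, and file names below are illustrative assumptions, not taken from the article.

```dockerfile
# Pin the exact R version via the Rocker project's versioned base image,
# so every collaborator runs an identical interpreter.
FROM rocker/r-ver:4.4.1

# Install analysis packages at a fixed version at build time
# (package and version here are illustrative).
RUN R -e "install.packages('remotes'); remotes::install_version('dplyr', version = '1.1.4')"

# Copy the analysis script into the image and run it by default.
COPY analysis.R /home/analysis.R
CMD ["Rscript", "/home/analysis.R"]
```

Building with `docker build -t my-analysis .` and running with `docker run --rm my-analysis` then reproduces the same environment on any machine with Docker installed.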

Statology · Docker Desktop for Statisticians: Running R in Containers. Docker Desktop transforms how statisticians work with R by providing isolated, reproducible environments that eliminate version conflicts and setup complications.

Retractions and failures to replicate are signs of weak research. But they're also signs of laudable and necessary efforts to identify weak research and improve future research. The #Trump admin is systematically weaponizing these efforts to cast doubt on science as such.

"Research-integrity sleuths say their work is being ‘twisted’ to undermine science."
nature.com/articles/d41586-025

www.nature.com · Research-integrity sleuths say their work is being ‘twisted’ to undermine science. Some sleuths fear that the business of cleaning up flawed studies is being weaponized against science itself.

And yet another one in the ever-increasing list of analyses showing that top journals are bad for science:

"Thus, our analysis show major claims published in low-impact journals are significantly more likely to be reproducible than major claims published in trophy journals."

biorxiv.org/content/10.1101/20

bioRxiv · A retrospective analysis of 400 publications reveals patterns of irreproducibility across an entire life sciences research field

The ReproSci project retrospectively analyzed the reproducibility of 1006 claims from 400 papers published between 1959 and 2011 in the field of Drosophila immunity. The project attempts to provide a comprehensive assessment, 14 years later, of the replicability of nearly all publications across an entire scientific community in experimental life sciences. We found that 61% of claims were verified, while only 7% were directly challenged (not reproducible), a replicability rate higher than previous assessments. Notably, 24% of claims had never been independently tested and remain unchallenged. We performed experimental validations of a selection of 45 unchallenged claims, which revealed that a significant fraction (38/45) of them is in fact non-reproducible. We also found that high-impact journals and top-ranked institutions are more likely to publish challenged claims. In line with the reproducibility-crisis narrative, the rates of both challenged and unchallenged claims increased over time, especially as the field gained popularity. We characterized the uneven distribution of irreproducibility among first and last authors. Surprisingly, irreproducibility rates were similar between PhD students and postdocs, and did not decrease with experience or publication count. However, group leaders who had prior experience as first authors in another Drosophila immunity team had lower irreproducibility rates, underscoring the importance of early-career training. Finally, authors with a more exploratory, short-term engagement with the field exhibited slightly higher rates of challenged claims and a markedly higher proportion of unchallenged ones. This systematic, field-wide retrospective study offers meaningful insights into the ongoing discussion on reproducibility in experimental life sciences.

To my knowledge, this is the first time that not only prestigious journals but also prestigious institutions have been implicated as major drivers of irreproducibility:

"Higher representation of challenged claims in trophy journals and from top universities"

biorxiv.org/content/10.1101/20

bioRxiv · A retrospective analysis of 400 publications reveals patterns of irreproducibility across an entire life sciences research field

We invite staff and students at the University of #Groningen to share how they are making #research or #teaching more open, accessible, transparent, or reproducible, for the 6th annual #OpenResearch Award.

Looking for inspiration?
Explore the case studies submitted in previous years:
🔗 rug.nl/research/openscience/op

More info:
🔗 rug.nl/research/openscience/op

#OpenScience #OpenEducation #OpenAccess #Reproducibility
@oscgroningen


Jack Taylor is now presenting a new #Rstats package: "LexOPS: A Reproducible Solution to Stimuli Selection". Jack bravely did a live demonstration based on a German corpus ("because we're in Germany") that generated matched stimuli that certainly made the audience giggle... let's just say that one match involved the word "Erektion"... 😂

There is a paper about the LexOPS package: link.springer.com/article/10.3 and a detailed tutorial: jackedtaylor.github.io/LexOPSd. There is also a #Shiny app for those who really don't want to use R, which still allows code download for #reproducibility: jackedtaylor.github.io/LexOPSd. Really cool and useful project! #WoReLa1 #linguistics #psycholinguistics


7/ Wei Mun Chan, Research Integrity Manager

With 10+ years in publishing and data curation, Wei Mun ensures every paper meets our high standards for ethics and #reproducibility. From image checks to data policies, he’s the quiet force keeping the scientific record trustworthy.

Are you an educator wondering whether and how to include #preregistration in student assignments? Then join our #webinar and learn from our speakers!

🗓️ 26 June
⌚ 13:30 - 15:00 hrs
Register here:
events.teams.microsoft.com/eve

Ewout Meijer from Maastricht University and Elen Le Foll from the University of Cologne will share their experiences with having students preregister their term papers and thesis work.

We will make a recording of the webinar available.
#openScience #reproducibility