#bibliometrics


On my way home after two intense days at CWTS | Leiden University discussing data sources and indicators for bibliometric analysis. A huge thanks to Clara Calero Medina, Ludo Waltman and Vincent Traag for the inspiring conversations and useful tips that will soon be integrated into the Université de Lorraine #bibliometrics support services!

🚀 The #INNBI project keeps gaining momentum — our third meeting just wrapped up with energy and shared purpose!

Partners from 🇺🇦 🇦🇲 🇬🇪 🇲🇩 🇩🇪 are working together to move open science, high-quality metadata, and responsible bibliometrics forward — grounded in national contexts, but globally connected.

Together, we’re building a future where research infrastructures are open, interoperable, and inclusive.

🔗 Project info: dzhw.eu/en/forschung/projekt?p

#OpenScience #Bibliometrics @DZHW

Unlock the Power of Open Research Data!
Join our hands-on OpenAIRE Graph API Workshop on June 20, 14:00 – 16:00 CEST.

Limited to 30 participants for an interactive experience.

Dive into real-world applications, including:
- Bibliometric analysis
- Scholarly discovery
- Open Science monitoring

Register now: shorturl.at/QZVD2
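
For a taste of what the workshop covers, here is a rough sketch of querying the OpenAIRE Graph API from Python. Treat it as a sketch only: the endpoint URL, parameter names, and response fields below are assumptions to be checked against the official API documentation, not workshop material.

```python
# Minimal sketch (not workshop material): querying a research-products
# endpoint of the OpenAIRE Graph API. The base URL, parameter names, and
# response fields are assumptions -- verify them against the official docs.
import requests

BASE_URL = "https://api.openaire.eu/graph/v1/researchProducts"  # assumed endpoint


def search_products(query: str, page_size: int = 10) -> list[dict]:
    """Fetch one page of research products matching a free-text query."""
    resp = requests.get(
        BASE_URL,
        params={"search": query, "pageSize": page_size},  # assumed parameter names
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("results", [])


if __name__ == "__main__":
    for product in search_products("bibliometrics"):
        # Field names are illustrative; inspect the JSON you actually receive.
        print(product.get("title"), "-", product.get("publicationDate"))
```

The same pattern extends to the filtering and paging options the documentation describes; only the query parameters change.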

I just reviewed a paper. I sent a detailed 3-page critique, mostly of the Bayesian analysis used in the paper. Reviewer 1 wrote 2 sentences: "The current manuscript is a very well written". With reviewers like that, it's no wonder that science has problems. Too many papers are being submitted, so it's impossible to find competent reviewers.
I blame #bibliometrics.

MDPI as a corruption indicator? A new preprint shows a striking trend across Europe 🇪🇺: more MDPI papers → higher perceived corruption → lower innovation.

👉 arxiv.org/abs/2411.06282v1

It’s not that MDPI = bad. But when it dominates, it signals a broken system chasing quantity over quality.

Ukraine? 🇺🇦 Not in the study, but we see the same rise of #MDPI. We could build better. Instead, we copy the worst.

ResearchFish Again

One of the things I definitely don't miss about working in the UK university system is the dreaded Researchfish. If you've never heard of this bit of software, it's intended to collect data relating to the outputs of research grants funded by the various Research Councils. That's not an unreasonable thing to want to do, of course, but the interface is – or at least was when I last used it several years ago – extremely clunky and user-unfriendly. That meant that, once a year, along with other academics with research grants (in my case from STFC), I had to waste hours uploading bibliometric and other data by hand. A sensible system would have harvested this automatically, as it is mostly available online at various locations, or allowed users simply to upload their own publication list as a file; most of us keep an up-to-date list of publications for various reasons (including vanity!) anyway. Institutions also keep track of all this stuff independently. All this duplication seemed utterly pointless.
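
To make the point concrete, here is the sort of harvesting a sensible system could do in a few lines, sketched against the public Crossref REST API. The author name and the choice of fields are illustrative; a real system would match on persistent identifiers such as ORCID iDs rather than free-text names.

```python
# Rough sketch of automatic harvesting via the public Crossref REST API,
# illustrating that publication metadata is already machine-readable online.
# A real system would match on persistent identifiers (e.g. ORCID iDs)
# rather than free-text author names.
import requests


def fetch_publications(author_name: str, rows: int = 20) -> list[dict]:
    """Return basic metadata for works matching an author name on Crossref."""
    resp = requests.get(
        "https://api.crossref.org/works",
        params={"query.author": author_name, "rows": rows},
        timeout=30,
    )
    resp.raise_for_status()
    items = resp.json()["message"]["items"]
    return [
        {
            "doi": item.get("DOI"),
            "title": (item.get("title") or ["(untitled)"])[0],
            "year": (item.get("issued", {}).get("date-parts") or [[None]])[0][0],
        }
        for item in items
    ]


if __name__ == "__main__":
    for pub in fetch_publications("Jane Researcher"):  # hypothetical name
        print(pub["year"], pub["doi"], pub["title"])
```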

I always wondered what happened to the information I uploaded every year, which seemed to disappear without trace into the bowels of RCUK. I assume it was used for something, but mere researchers were never told to what purpose. I guess it was used to assess the performance of researchers in some way.

When I left the UK in 2018 to work full-time in Ireland, I took great pleasure in ignoring the multiple emails demanding that I do yet another Researchfish upload. The automated reminders turned into individual emails threatening that I would never again be eligible for funding if I didn’t do it, to which I eventually replied that I wouldn’t be applying for UK research grants anymore anyway. So there. Eventually the emails stopped.

Then, about three years ago, ResearchFish went from being merely pointless to downright sinister as a scandal erupted about the company that operates it (called Infotech), involving the abuse of data and the bullying of academics. I wrote about this here. It then transpired that UKRI, the umbrella organization governing the UK's research councils, had been actively conniving with Infotech to target critics. An inquiry was promised, but I don't know what became of that.

Anyway, all that was a while ago and I no longer live or work in the UK, so why mention ResearchFish again now?

The reason is something that shocked me when I found out about it a few days ago: Researchfish is now operated by the commercial publishing house Elsevier.

Words fail. I can't be the only person to see a gigantic conflict of interest. How can a government agency allow the assessment of its research outputs to be outsourced to a company that profits hugely from the publication of those outputs? There's a phrase in British English which I think is in fairly common usage: marking your own homework. It relates to individuals or organizations who have been given the responsibility for regulating their own products. It is very apt here.

The acquisition of Researchfish isn't the only example of Elsevier getting its talons stuck into academic life. Elsevier also "runs" the bibliometric service Scopus, which it markets as a sort of quality indicator for academic articles. I put "runs" in inverted commas because Scopus is hopelessly inaccurate and unreliable. I can certainly speak from experience on that. Nevertheless, Elsevier has managed to dupe research managers – clearly not the brightest people in the world – into thinking that Scopus is a quality product. I suppose the more you pay for something, the less inclined you are to doubt its worth, because if you do find you have paid for worthless junk you look like an idiot.

A few days ago I posted a piece that included this excerpt from an article in Wired:

Every industry has certain problems universally acknowledged as broken: insurance in health care, licensing in music, standardized testing in education, tipping in the restaurant business. In academia, it’s publishing. Academic publishing is dominated by for-profit giants like Elsevier and Springer. Calling their practice a form of thuggery isn’t so much an insult as an economic observation. 

With the steady encroachment of the likes of Elsevier into research assessment, it is clear that as well as raking in huge profits, the thugs are now also assuming the role of the police. The academic publishing industry is a monstrous juggernaut that is doing untold damage to research and is set to do more. It has to stop.


An entertaining, informative and overall really well-done video about the h-index and why you shouldn't use it, by @stefhaustein, @carey_mlchen et al.

I really will be sharing this video a lot: "What is the h-index and what are its limitations? Or: Stop using the h-index"

youtube.com/watch?v=HSf79S3XkJw
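
For anyone who hasn't met it: the h-index is the largest number h such that an author has h papers each cited at least h times. A minimal sketch of the computation, with made-up citation counts:

```python
# Minimal sketch: computing the h-index from a list of citation counts.
# h = the largest h such that the author has h papers with >= h citations each.
def h_index(citations: list[int]) -> int:
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h


# Made-up citation counts: five papers cited 10, 8, 5, 4, and 3 times
# give h = 4 (four papers with at least 4 citations each).
assert h_index([10, 8, 5, 4, 3]) == 4
```

Which, of course, says nothing about what those papers contain: that is exactly the limitation the video digs into.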

#hIndex #bibliometrics #researchEvaluation #researchAssessment #publishOrPerish

@academicchatter