Editors can’t spot talent. I’ve heard this joke before. It isn’t funny.
Editors at The BMJ are lousy at predicting the citation performance of research papers. Or are they?
Observations on reproducibility and research integrity from London STM Week
What if even by saying “fake science” you inadvertently participate in a scam? What if this phrase legitimizes fraud, lies, and deceit? Let’s call it what it is: dupery.
Karin Wulf and Rick Anderson reflect on the OSTP’s response to their interview questions, and on some implications of those responses and of the memo itself.
Karin Wulf and Rick Anderson interview Dr. Alondra Nelson, acting director of the White House Office of Science and Technology Policy when the new OSTP memo was published.
A new type of post from us today, offering a smorgasbord of opinions on topics including the ongoing Twitter/Elon Musk saga, just what “equitable access” to the literature means, the ongoing lack of experimental controls in one area of bibliometric analysis, and whether journals are more like a gate or a sewer.
Karin Wulf and Rick Anderson provide a roundup of responses to the new OSTP public access memo — and a preview of their interview with OSTP leadership.
Kicking off Peer Review Week 2022: Does trust in research begin with trust in peer review across the whole ecosystem, and what does that look like for different communities and stakeholders?
One more answer to the question, “Is research integrity possible without peer review?” Today’s response is from journal Editor-in-Chief and surgeon D. Robert Siemens.
Continuing the run-up to this year’s Peer Review Week (September 19-23), today you’ll hear the Chefs’ answers to the question: Is research integrity possible without peer review?
For an early start on Peer Review Week, we reached out to the SSP community to ask “Is research integrity possible without peer review?”
Some initial thoughts on the new OSTP memo on public access to results of federally funded research — and questions about its intent and implications.
If we don’t know what citations mean, what does it mean when we count them? Revisiting a 2015 (!) post in light of recent developments in citation metrics and impact.
Avi Staiman suggests revamping the peer review process to make it less about tearing down the work of others, and more about helping authors improve their papers.
Clarivate Analytics announced today that all journals in the Web of Science Core Collection will get Impact Factors, raising questions about the Emerging Sources Citation Index. Further, Clarivate will only report Impact Factors to one decimal place, devaluing journal rank in subject categories.