The threat posed to science and scholarship by Donald Trump’s election has been the subject of a number of excellent posts, both here on The Scholarly Kitchen and elsewhere. From removing the EPA website’s climate change pages to jeopardizing scientific collaboration through the notorious US travel ban on researchers, students, and others from seven mostly Muslim countries, the politicization of scholarship has, for many of us, gone way too far. Equally alarming, if not more so, is the growing use of ‘alternative facts’ (aka lies) and the explicit rejection of expert opinion by some (mostly right-wing) politicians on both sides of the Atlantic.

As a result, public trust in science and scientists is at risk. Many people don’t understand how science works and, instead, rely on attention-grabbing (and often misleading or downright erroneous) headlines; their distrust is compounded by media coverage of what is wrong with science, rather than what is right – peer review is broken, fraudulent science is on the rise, sham journals scam authors, and predatory publishing abounds. This, in turn, makes it easy for those who seek to discredit genuine academic research to claim that their ‘alternative facts’ are just as valid.

[Image: science protest sign. Image via Martha Soukup.]

Some of this, without doubt, is our own fault. In the drive to stay competitive by reducing costs, corners have been cut. Rigorous training and support for peer review is not as widespread as it could and should be. Fake authors and reviewers, like predatory publishers, really do exist. There is still too much emphasis on publish-or-perish, which can lead to bad science getting published. And more…

So, how can we (re)build trust in scientific and scholarly communications? The good news is that there are already a number of tools, services, and initiatives out there that we can use to ensure that the scholarship being published is as rigorous as possible. Some are commercial, others non-profit; some open, others proprietary. They have been launched by – and are supported by – organizations across all sectors of scholarly communications: associations, funders, publishers, research institutions, service providers, and researchers themselves. Many encourage or enable increased transparency as a way of increasing trust. Not all of them will be appropriate for every organization, but they all represent an opportunity not just to make research more trustworthy, but also to make it work better – for everyone.

  1. Educate your researchers about the dangers of predatory publishers through the Think, Check, Submit initiative (http://thinkchecksubmit.org/), which provides them with valuable advice on why and how to decide where to publish
  2. Improve clarity around who did what on a research paper by implementing CRediT (the Contributor Roles Taxonomy) – and enable better recognition of co-authors’ contributions at the same time
  3. Sign up to the Transparency and Openness Promotion (TOP) Guidelines launched by the Center for Open Science – signatories include large publishers such as Elsevier, Springer Nature, and Wiley, as well as several smaller publishers and associations
  4. Encourage or, if appropriate, require reproducibility. The Reproducibility Initiative – a collaboration between PLOS, Science Exchange, and Figshare – is a good example of organizations working together to address this problem
  5. Consider more open alternatives to traditional double-blind peer review. While this may not be appropriate for all disciplines, in some areas it can represent a big step forward in terms of transparency. eLife, F1000, and ScienceOpen provide three good, but different, examples of this in practice
  6. Open annotation tools are also a great way to increase transparency. One such tool, Hypothes.is, is already being used by scientific fact-checkers, for example by Climate Feedback
  7. And, for open data, check out – and support – FORCE11’s FAIR Data Principles, which call for data to be Findable, Accessible, Interoperable, and Reusable
  8. Make it easy for your authors to disclose their conflicts of interest, for example, by using a system like Convey, developed by the Association of American Medical Colleges
  9. Make the most of persistent identifiers. Collect validated ORCID iDs during the submission process to enable your authors to reliably (and seamlessly, if they sign up for auto-update) connect themselves with the DOIs for their publications* – see the short sketch after this list
  10. Support the Committee on Publication Ethics (COPE). You don’t have to be a member to access many of their great resources, such as flowcharts and sample letters for implementing their Code of Conduct, but members do benefit from access to additional ones, including their ethical audit tool
  11. Help improve peer review education for researchers. Several publishers and other organizations already do this both directly and through their support for Sense About Science, which has some great publications on peer review as well as running workshops for early career researchers
  12. Use a reputable software tool such as Crossref’s Similarity Check (formerly CrossCheck) during the submission process to minimize the risk of publishing plagiarized work
  13. Crossmark, also from Crossref, aids transparency after an article is published, by alerting readers if it has been updated, corrected, or retracted
  14. Abide by industry standards. Organizations such as CASRAI and NISO work with volunteers from right across the scholarly community to identify and develop global standards from which everyone can benefit
  15. Get involved! Join an industry association if you don’t already belong to one. Attend a meeting. Volunteer for a committee. Make sure your organization’s – and your researchers’ – voice is heard as our community debates how best to tackle the challenges facing scholarship today
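
Relating to point 9: as an illustration of what format-level validation of a persistent identifier can look like, here is a minimal sketch in Python. It is not part of any ORCID or Crossref tooling; the function names are my own, the checksum routine follows the ISO 7064 Mod 11-2 algorithm that ORCID documents for the final character of an iD, and the lookup assumes the public Crossref REST API (api.crossref.org) plus the third-party requests package.

```python
"""Minimal sketch: format-level ORCID iD checks and a Crossref metadata lookup.

Illustrative only - these helpers are not part of any ORCID or Crossref client.
"""
import requests  # assumes the third-party 'requests' package is installed


def orcid_checksum_ok(orcid_id: str) -> bool:
    """Verify the ISO 7064 Mod 11-2 check digit on a 16-character ORCID iD.

    This only catches typos; it does not prove the iD belongs to the author.
    Truly 'validated' iDs are collected via ORCID's own sign-in (OAuth) flow.
    """
    chars = orcid_id.replace("-", "").upper()
    if len(chars) != 16 or not chars[:15].isdigit():
        return False
    total = 0
    for digit in chars[:15]:
        total = (total + int(digit)) * 2
    result = (12 - total % 11) % 11
    expected = "X" if result == 10 else str(result)
    return chars[15] == expected


def orcids_deposited_with_doi(doi: str) -> list:
    """Return any ORCID iDs the publisher deposited in a DOI's Crossref record."""
    response = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    response.raise_for_status()
    authors = response.json()["message"].get("author", [])
    return [author["ORCID"] for author in authors if "ORCID" in author]


if __name__ == "__main__":
    # A structurally valid iD versus the same iD with a mistyped final digit.
    print(orcid_checksum_ok("0000-0002-1825-0097"))  # True: check digit matches
    print(orcid_checksum_ok("0000-0002-1825-0098"))  # False: check digit does not
```

A check digit only catches mistyped iDs, of course – the authenticated collection that ORCID recommends (researchers signing in via OAuth at submission) is what actually ties an iD to its owner – but even this level of hygiene keeps bad identifiers out of the metadata that downstream services, including the auto-update mentioned above, rely on.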

This is by no means a comprehensive list. Like Kent Anderson’s original 2012 post, “A Proposed List – 60 Things Journal Publishers Do,” which has since grown to 96 things thanks to suggestions from other Scholarly Kitchen Chefs and the community, I hope this one will keep growing, so please do add to it, as well as comment on it.

*Full disclosure: I am ORCID’s Director of Community Engagement & Support

Alice Meadows

I am a Co-Founder of the MoreBrains Cooperative, a scholarly communications consultancy with a focus on open research and research infrastructure. I have many years’ experience of both scholarly publishing (including at Blackwell Publishing and Wiley) and research infrastructure (at ORCID and, most recently, NISO, where I was Director of Community Engagement). I’m actively involved in the information community, and served as SSP President in 2021-22. I was honored to receive the SSP Distinguished Service Award in 2018, the ALPSP Award for Contribution to Scholarly Publishing in 2016, and the ISMTE Recognition Award in 2013. I’m passionate about improving trust in scholarly communications, and about addressing inequities in our community (and beyond!). Note: The opinions expressed here are my own.

Discussion

16 Thoughts on "15 Things We Can Do To Stand Up For Science!"

One other essential ‘ingredient’ missing from this article describing the diminishing public trust in science is the shady business practices that many traditional publishers have engaged in (e.g. extortionately high prices, bundling of content, poor or no remuneration for submitted content, etc.). Even if the ‘science’ might be right, the business practices are not. Academics have become more active in protesting this (e.g. http://thecostofknowledge.com), and various Editorial Board resignations and the unwillingness of large institutional purchasers (even countries) to continue these contracts under existing terms are testament to this dissatisfaction. If science wants to retain its credibility in terms of content, it will have to regain it in terms of economics and fair practice.

As regards the ’60 Things Journal Publishers Do’, academics are starting to understand that many of these ‘things’ are actually outsourced by publishers to external companies and/or that they are not as expensive as they are purported to be. Academics are increasingly discovering how they can gain access to these tools affordably and efficiently, and that there is not as much ‘behind the curtain’ as they had presumed there to be.

Can you point to any evidence of public trust in science being lessened due to pricing structures of publishers, the Big Deal, or the exchange of publishing services for the right to sell/license content? I recognize that these are all issues for libraries and some researchers, but I’m having a hard time believing that the general lay public is unwilling to believe scientific data because Elsevier charges a lot of money for some of their journals. I think you’re barking up the wrong tree here.

The scholarly publishing community does not exist in a vacuum, David. A sizable sub-set of the general public are students, for example, and many are painfully aware of the negative economic effects traditional publishing has on them. These problems have also been broadly communicated via the public media (for example, see https://www.theguardian.com/higher-education-network/2016/apr/18/why-academic-journals-expensive), and there is an increasing perception of the broad negative effects that, for example, unaffordable paywalls have on the progress of science. It would be unfair to point out only Elsevier as well, as they are not the only company involved in such practices, just the largest.

Again, “impeding the progress of science” is an entirely different subject than “public trust in science,” the subject of this post.

How can there be public trust in science when, for example, one sees that some publishers have a clear agenda to maximize profits rather than to maximize the dissemination of outputs – to support shareholders’ interests above the support of science? When a student sees that his/her university has no access to a particular journal because the university cannot afford it, while the publisher is realising magnificent profits? The public has lost trust in science because it sees it as the captive victim of powerful business interests rather than as a window to objective and accessible knowledge.

Again, I think you’re projecting.

If you can provide evidence that there is widespread belief among the public that scientific research results are unreliable due to the business models/prices charged by publishers, I would love to see it. How does a student not being able to access an article drive the general public to assume that article must be false? I think you’re trying to equate two very different subjects. Go out into the street and ask the first 100 non-academics and non-librarians you meet to name a single scholarly publisher and what they think about their business models and profit margins, and I’m willing to bet you’ll be met with 100 blank stares.

I didn’t see this point on either list, but it was the first thing that came to mind after reading the opening paragraph:

focus on the science, and leave any partisan opinions at home.

The trust in science and scientists is declining because scientists are NOT seen as objective; rather, scientists are seen as pursuing a political agenda. A good example is climate change: people who question or dispute the accepted narrative are labeled “deniers,” thereby shutting down a discussion.

Focus on the science.

I think this points to a really key problem – the mistaken assumption that science can, in any way, remain separate from the real world and from controversy. Is it actually possible to report on reality without upsetting someone (or someone’s business plans)? Once that person/business is upset, it is in their best interests to cast doubt upon the research. No matter how much one “focuses on the science”, controversy may arise. Even the most objective reporting may be painted as “biased” by those whose interests it shows evidence against. Sticking one’s head in the sand and pretending that science is somehow not a part of the real world is not a solution.

Further, science is filled with controversies and disagreements. Climate change is no different than neural development or any other field with more than one school of thought. The difference here is the intrusion of business interests into the scientific debate, not the other way around.

“Is it actually possible to report on reality without upsetting someone?”

If you search for “kenneth zucker uspath censorship”, you’ll find stories about Kenneth Zucker, who was expelled from the USPATH conference in early February for un-PC trans-related research. His problem? He dissented from the party line that trans therapy is beneficial.

It’s hard to believe that all of the other medical researchers at the conference were presenting unbiased research when the only dissenter was expelled. Is it the goal of science to confirm a person or group’s chosen “reality” or to enhance our understanding of the world we live in?

Growing up, I was taught to seek out information both for and against a topic before making a decision. If science only presents the “correct” narrative, then how can people make informed decisions?

This researcher is not someone I’m familiar with, but a quick perusal of the news reports (http://nymag.com/scienceofus/2016/02/fight-over-trans-kids-got-a-researcher-fired.html) seems to support my point. A clinician with highly controversial treatment methods had his clinic shut down due to the efforts of activists from outside of the academic community, and the institution where he worked bowed to public pressure.

Further, there has to be a line drawn somewhere. If you search for “Andrew Wakefield autism” you’ll find that Wakefield has had his medical license revoked and is not asked to speak at major scientific conferences on the subject. Is this a problem, given that his views certainly are un-PC and deviate from the party line? Should he be given a platform to espouse them at every scientific meeting? Must every crackpot be catered to in the name of objectivity, or is it at some point acceptable to exclude some ideas from the conversation? Looking forward to your opinions on chemtrails as well.

I think you may need to add the link to ‘Think, Check, Submit’ (http://thinkchecksubmit.org/) in the first bullet point.

Other than that, a truly interesting article as usual in the Kitchen. Thanks for sharing!

Excellent article, thank you Alice Meadows! I would just like to emphasise a couple of additional areas:
1. Building science literacy among the general public
2. Working with the mass media and governments to address misleading and confusing news on health and science.
Neil Pakenham-Walsh, Coordinator of Healthcare Information For All http://www.hifa.org

Thanks, Neil – both great suggestions. Good organizations to support that are addressing both of these issues include AAAS, the BAAS (and other similar associations), and Sense About Science.
