While the specific goals of open access (OA) publishing aren’t universally agreed upon, the overriding ambition is to make scientific knowledge freely available to everyone, with no barriers.

The barriers that have historically informed OA strategies — subscription paywalls and large commercial publishers — have proven to be malleable. Paywalls have changed a great deal, and most are now semi-porous or fall completely after relatively short embargo periods. There are public access policies in place for more than 98% of research funded by the US government. Hybrid and direct OA publishing have become more commonplace. Yet, paywalls and commercial publishers remain fixations in discussions about OA.


Commercial publishers, after a period of resistance and uncertainty, have embraced the new revenues Gold OA can provide — so much so that there are now concerns that large, commercial publishers have themselves appropriated the OA movement via Gold OA, eliminating some paywalls while solidifying their place in the world:

Jutta Haider, an associate professor in Information Studies at Lund University . . . believes the turning point came in 2012, when the UK’s Finch Report was published. From that moment “open access became gold OA, and gold OA became APC OA.” In other words, pay-to-publish. The implications of this have been significant, she says, not least because it has allowed legacy publishers to appropriate the movement, and by doing so to continue to control and make excessive profits from scholarly communication.

The bottom line is that more scholarly content is free than ever before. Paywalls have been modified or dropped. Commercial publishers have embraced OA. Whether this constitutes success in some measure for the OA movement remains an unsettled question.

Now there’s a new wrinkle: the Internet has changed in the intervening years. These changes have introduced diversions and detours that aren’t based on paywalls but which can prevent OA content from reaching an audience, putting non-neutral intermediaries on the discovery path, intermediaries that divert readers without leaving a trace.

Thinking solely about transactional paywalls as the main barriers to OA content success no longer reflects how the Internet is run and governed, and by whom. New barriers are coming from major commercial intermediaries wielding algorithms that drive search engines and social media. Other barriers come from state-sponsored, industry-sponsored, and hacker-abetted propaganda that uses these intermediaries to spread misinformation faster than scientific research can possibly respond.

Publishers who rely solely on non-commercial digital distribution seem particularly vulnerable to being confounded by these new intermediaries — Google, Facebook, Twitter, and others with click-based advertising models and infrastructure-level presences.

Passive information purveyance faces tough odds when pitted against active information placement by technology-savvy commercial entities working 24/7 to gain any advantage they can. Gold OA, predicated on selling pre-publication services, doesn’t naturally create the commercial incentives to push content placement and delivery after publication. Consequently, OA publisher investments in usage data monitoring, search engine optimization, and sophisticated social media monitoring are sometimes lacking.

The effect of this might show up in data from Altmetric, which presents measures intended to “showcase the attention and influence of research.” Analyzing their Top 100 articles for 2016 (data available here) shows that 70% were published behind paywalls. Given the preponderance of paywalled journals, it may be tempting to interpret the 30% rate of non-paywalled articles as an indication that OA journals are punching above their weight, as they comprise only 18% of papers currently, according to the best available estimate. Drilling into the data, however, tells a different story. Fully one-third of the articles in the Top 100 categorized by Altmetric as OA come from Gold OA journals from for-profit publishers (mainly Elsevier and SpringerNature), which suggests that for-profit publishers’ promotional practices may benefit the prominence of the content they publish. Moreover, 58% of the articles Altmetric classified as OA were published by journals that typically have paywalls, suggesting that articles Altmetric refers to as “OA” in its dataset are often selectively freed paywalled articles. Paywalled articles themselves generated 10% more tweets and 10% more Facebook posts, while coverage in mainstream media differed only slightly.
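As a rough check on the arithmetic above, the stated percentages can be recomputed from assumed raw counts. The counts below are back-solved from the percentages in the text, not taken from Altmetric’s dataset, so treat this as an illustration only.

```python
# Back-of-envelope recomputation of the Altmetric Top 100 breakdown.
# Counts are assumed (back-solved from the percentages in the text),
# not drawn from Altmetric's dataset.

TOP_N = 100
paywalled = 70                   # Top 100 articles behind paywalls
open_access = TOP_N - paywalled  # 30 articles Altmetric classed as OA

gold_oa_for_profit = 10          # ~1/3 of the OA articles, per the text
from_paywalled_journals = 17     # nearest integer to the ~58% cited

def share(part, whole):
    """Express part as a percentage of whole, to one decimal place."""
    return round(100 * part / whole, 1)

print(share(paywalled, TOP_N))                      # 70.0
print(share(gold_oa_for_profit, open_access))       # 33.3
# 17 of 30 gives 56.7; integer counts cannot hit the cited 58% of 30 exactly
print(share(from_paywalled_journals, open_access))  # 56.7
```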

Solid article promotion practices may explain why 89% of the Top 100 Altmetric articles in 2016 came from journals that generally employ paywalls, why those articles tended to perform better in social media, and why Gold OA articles from for-profit publishers tended to perform better.

A quick thought experiment may also indicate how significant the challenge for neutral content purveyance might be. Consider the premise that access to more scientific information more immediately should lead to a society that is more scientifically literate, makes more evidence-based decisions more quickly, and advances generally in a united manner based on the best policy options informed by evidence. Contrast this with what we have today: a confused and fragmented populace divided in opinions about nutrition, vaccinations, evolution, energy policy, global warming, and economics, and seemingly mired in place by endless spats and sparring about what is even true or false. In decades past, decisions about the ozone layer or acid rain were made relatively quickly by policymakers based on input from scientists. Now, all we do is argue about what is or is not true.


The assumption that the Internet is a neutral publishing substrate no longer holds, and organizations that sell content after publication may have an advantage in this environment, because selling content creates commercial incentives to promote it and to counter those seeking to divert users away from it. Activism around and after publication seems to be a new, common requirement.

Facebook, Google, and Twitter are the main information intermediaries today, all with click-driven advertising models that rely on algorithms to maximize revenues. Driving clicks means maximizing emotional response. I recently wrote about the effects these models are having on our online experiences, citing parts of an essay by Maciej Ceglowski, in which he writes:

One problem is that any system trying to maximize engagement will try to push users toward the fringes. . . . the algorithms have learned that users interested in politics respond more if they’re provoked more, so they provoke. . . . In a setting where attention is convertible into money, social media will always reward drama, dissent, conflict, iconoclasm, and strife.
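The dynamic Ceglowski describes can be made concrete with a toy ranking model: when predicted engagement is the only objective, provocative items dominate regardless of accuracy. Everything here (the items, the scores, the scoring function) is invented for illustration; it is not how any real platform’s ranker is implemented.

```python
# Toy illustration (invented data) of an engagement-maximizing ranker:
# accuracy never enters the objective, so provocative items rise.

items = [
    {"title": "Measured, well-sourced study summary", "provocation": 0.2, "accuracy": 0.9},
    {"title": "Outrage-bait hot take",                "provocation": 0.9, "accuracy": 0.3},
    {"title": "Conspiracy clickbait",                 "provocation": 0.8, "accuracy": 0.1},
]

def predicted_engagement(item):
    # A click-driven objective rewards emotional response only;
    # the accuracy field is present but ignored.
    return item["provocation"]

ranked = sorted(items, key=predicted_engagement, reverse=True)
print([i["title"] for i in ranked])
# The two provocative items rank above the accurate one.
```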

On the search front, Google’s algorithms have a mixed record, and clearly are not neutral. They favor content that has certain characteristics, and generally try to keep users on Google and clicking. The anti-vaccination misinformation we’re still facing dates from the era when Google’s predilection for clickable search results essentially drove the “vaccines and autism” frenzy, from the late 1990s onward.

These search biases generate barriers you barely notice or feel. It’s hard to see the search result that isn’t there, difficult to see your machine’s CPU information and browser details modifying your online experience, and hard to realize when your phone and desktop are sharing data behind the scenes.

Publishers are getting little to no help from these new intermediaries. In Germany, Facebook has one manager to handle all of its publishing partners, clearly an inadequate level of staffing. For newspapers, the problem has been bilateral, as a story from the Columbia Journalism Review explains:

Just as many local newsrooms complained they were not getting attention from Facebook’s stretched journalism partnership team, Facebook sometimes found it equally difficult to connect with anyone at a small publisher who could tell them how the platform might help their business.

Many publishers have assigned staff as intermediaries with Google and Facebook. Because these platforms are not neutral, they need to be managed. It’s not just publishers who perceive this. The power of these new intermediaries and their unpredictable nature have led Denmark to appoint an ambassador to deal with them:

. . . just as we engage in a diplomatic dialogue with countries, we also need to establish and prioritize comprehensive relations with tech actors, such as Google, Facebook, Apple, and so on.

Are these entities truly becoming as powerful as nation-states, able to impose information barriers along their borders? Perhaps so. Take Google, for example, as described in another essay by Maciej Ceglowski:

Google in particular has come close to realizing our nightmare scenario from 1998, a vertically integrated Internet controlled by a single monopoly player. Google runs its own physical network, builds phone handsets, develops a laptop and phone operating system, makes the world’s most widely-used browser, runs a private DNS system, PKI certificate authority, has photographed nearly all the public spaces in the world, and stores much of the world’s email.

Recent news that Google may now be able to track when its online advertising drives offline sales via credit card data tied back to user accounts might cross a line for privacy advocates, but it fits with the company’s overall drive to “organize the world’s information” — which is increasingly for their benefit, and not in the open, accessible way they initially aspired to achieve.

What Facebook and Google offer as “help” to publishers seems less than helpful. Facebook’s Instant Articles initiative has seen publishers pulling out of the program after lackluster financial results and a loss of confidence in the approach.

A similar sort of help, but for mobile, arrived recently in the form of Google’s Accelerated Mobile Pages (AMP) initiative, which is being widely panned:

Google AMP is bad news for how the web is built, it’s bad news for publishers of credible online content, and it’s bad news for consumers of that content. Google AMP is only good for one party: Google. Google, and possibly, purveyors of fake news.

What AMP ostensibly does is put a site’s pages in a common framework from Google so that they render faster and more consistently on mobile devices. However, this approach enables capture, as Google gains the content, traffic data, and other factors to further its advertising business model, all while subverting the owner’s branding, trade dress, and analytics. This gets to the issue of trust, as without native branding, Google takes the role of authority:

Anybody can cram an illegitimate idea into a web page and – so long as it’s encoded as AMP content – it’ll look like it’s from a legit news organization endorsed by Google. Because everything in AMP looks the same. Content shown in Google’s AMP view is stripped of all branding as if the content were from a legitimate news agency. There’s a not so subtle message behind this lack of branding: it’s that the source of information doesn’t matter so long as Google got you there.

Without differentiation, content from borderline sources becomes easier to purvey effectively. Allowing questionable content to flood the field blocks access to good OA articles. Branding does matter to readers and publishers — it is a link of trust and a badge of authenticity. Predatory publishers already try to sow confusion around journal brands. If Google’s AMP replaces branding, predatory journals and illegitimate articles may gain the same face validity as well-reviewed OA science and scholarship, which could be both an immediate and a long-term problem for OA publishers.

These intermediaries can interpose complicated, unseen, and very real barriers to access, starting at the point of discovery or awareness, and continuing through a plethora of distractions, counterfactual assertions, and misinformation.

There is also a larger issue within “the culture of clicks,” which we see in measures like Altmetrics and online advertising. A recent set of analyses of local news sources from Sweden and the Netherlands shows that readers clicked on 9% of the stories but spent 20% of their time on the site on those stories. Relying on click counts alone may thus undervalue that content by more than half. This gets to the persistent issue of what a click actually represents. It is transactional, not meaningful, yet the largest players online are relying on clicks to define the Internet’s addressable information space.
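The gap between clicks and time spent can be checked with a little arithmetic, using the 9%/20% figures from the Swedish and Dutch analyses cited above.

```python
# How much a pure click count understates engagement, using the figures
# from the Swedish/Dutch local-news analyses cited above.

click_share = 0.09  # fraction of clicks these stories received
time_share = 0.20   # fraction of total on-site reading time they absorbed

# A ratio above 2 means click counts capture less than half of the
# actual attention these stories command.
understatement = time_share / click_share
print(round(understatement, 2))  # 2.22
```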

This has another effect, which is that misinformation in a click-based advertising model is all upside — that is, there is very little or no cost to get started, and if something works, the rewards can be large. Again, OA publishers find themselves competing against entities that have clear economic incentives to push their information — whether it’s true or not — for long periods of time.

The challenge this poses for OA publishers is that OA’s ambitions depend on articles being found. With non-neutral intermediaries mixing quality OA content with predatory or lackluster OA content as well as anti-science clickbait, these intermediaries could stymie OA goals in unexpected or unseen ways.

Even the paths to overcoming these problems are littered with potholes and felled trees. Content marketing may be one method of drawing users to your articles. With forces looking to affect public opinion using advertising in ever more pernicious ways, it’s natural for users to view advertisements with more skepticism than ever (as well as search results):

. . . intelligence officials have found that Moscow’s agents bought ads on Facebook to target specific populations with propaganda. “They buy the ads, where it says sponsored by–they do that just as much as anybody else does,” says the senior intelligence official. (A Facebook official says the company has no evidence of that occurring.) The ranking Democrat on the Senate Intelligence Committee, Mark Warner of Virginia, has said he is looking into why, for example, four of the top five Google search results the day the U.S. released a report on the 2016 operation were links to Russia’s TV propaganda arm, RT. (Google says it saw no meddling in this case.) Researchers at the University of Southern California, meanwhile, found that nearly 20% of political tweets in 2016 between Sept. 16 and Oct. 21 were generated by bots of unknown origin; investigators are trying to figure out how many were Russian.

Into this workspace, filled with unknown bots, political and anti-science propaganda, and algorithms geared to divert audiences to the most extreme information, article promotion becomes less effective and more expensive.

The practical implications of these non-neutral intermediaries for OA publishers aren’t entirely clear, but to achieve the movement’s goal of making good scientific content not only available but prominent, OA publishers will benefit from thinking differently about the new information environment, in which paywalls are perhaps the most obvious and manageable barrier. It’s no longer safe or rational to assume that “free” cuts through the clutter, overcomes paid campaigns from misinformation vendors, and competes well with motivated commercial operators.

Kent Anderson


Kent Anderson is the CEO of RedLink and RedLink Network, a past-President of SSP, and the founder of the Scholarly Kitchen. He has worked as Publisher at AAAS/Science, CEO/Publisher of JBJS, Inc., a publishing executive at the Massachusetts Medical Society, Publishing Director of the New England Journal of Medicine, and Director of Medical Journals at the American Academy of Pediatrics. Opinions on social media or blogs are his own.



4 Thoughts on "Detours and Diversions — Do Open Access Publishers Face New Barriers?"

Great article. I teach a course on using the web for research. I am constantly amazed at the vast amounts of misinformation found on the web and the appearance of legitimacy with which it is presented. As a society, the challenge is not low-information people but the realization that we have misinformed people who believe that if they saw it on the web, then it must be true!

As a publisher the greatest challenge was publishing an errata sheet because the errata never had the authority of that which was printed in the book!

There may be some compensating advantage in the AMP approach also. Those who distrust anything issued from the “liberal” media will no longer be able to separate out the “liberal” sources from those they prefer.

As for people being rational, we all know that confirmation bias, cognitive dissonance, and other such psychological phenomena have long played a role in how people come to believe what they do. Pure reason may not be a major factor.

The relatively short embargo period, if this translates to 12-18 months, likely covers the entire useful period of the information. Tina Baich (IUPUI) and I studied 10 years of resource sharing requests at our institutions and found that demand for articles, in all subjects, drops off the edge of a cliff after the first year (paper given at IFLA 2016). Part of that is certainly when articles become available through full-text databases, but publishers certainly know the relative age of downloads and would have set the embargo periods to cover, as completely as possible, the useful life.

Resource-sharing requests may not equate to online usage. I run a company that is able to look across the usage data for 1,600+ journals. Demand for content that is more than 5 years old is high. Usage for this content accounted for 35% of the usage across the set, and 60% of the denials/turnaways and 59% of abstract views. At today’s SSP meeting, a bibliometrician noted in conversation that cited half-life continues to rise, as well.
