In recent weeks, I have argued that content leakage is reducing the value of the subscription big deal. The syndication model might enable publishers to recapture much of this leakage; Springer Nature has begun to pilot such a model with ResearchGate, a move indicative of the strategic dilemmas that syndication poses. For libraries, syndication offers the opportunity to provide dramatically improved experiences for their users — with a number of risks as well, including the prospect of substantially reducing their leverage at the negotiating table.

An Effort to Stop Leakage. Hans Brinker statue, image via Uberprutser, reused under CC BY SA license.


The big benefit to users — who currently face substantial stumbling blocks in accessing and using e-resources — would be straightforward. Users on any participating cross-publisher hub, from Mendeley to Dimensions to Web of Science to ResearchGate, would have on-platform access to versions of record based on their institutional license entitlements. Hubs will see even greater investment as they develop a variety of new business models and services that take advantage of seamless on-platform content to serve user needs. The benefits to the user do not seem to be in question.


The benefits to most publishers are also clear. In a syndication model, libraries continue to license entitlement rights directly from publishers, and syndication allows publishers to track additional sources of usage beyond just the publisher platform. Those publishers that embrace syndication will probably seek to syndicate their content through as many third-party platforms as possible, to reduce what they see as “leakage,” that is, usage of the content they publish that is not reported to their library customers. The opportunity for publishers is to increase the amount of usage that they can track and report.

The tracking and reporting of syndicated usage will take place through standard COUNTER usage reporting. The COUNTER 5 standard already has provisions for adding syndicated usage (reported via the new Distributed Usage Logging standard) to library usage reports. While I do not have a clear sense of the input and concerns of librarians that may have entered this process, it is clear that it is now a community standard. Publishers will add this additional usage to the denominator in calculating price per download, thereby making their content licenses appear to be a better value for money than they previously seemed. This ensures a strong (if brief) first-mover advantage for those publishers that move ahead with syndication.
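The denominator effect described above is simple arithmetic, and a brief sketch may make it concrete. The figures below are purely hypothetical (the fee, download count, and 25% uplift are illustrative assumptions, not data from any publisher):

```python
# Illustrative sketch (hypothetical figures): how adding syndicated usage
# to the COUNTER denominator lowers the apparent cost per download.

def cost_per_download(license_fee, downloads):
    """License fee divided by total reported downloads."""
    return license_fee / downloads

fee = 100_000            # annual license fee in dollars (hypothetical)
platform_use = 50_000    # downloads on the publisher's own platform (hypothetical)

# Suppose syndication adds 25% more trackable usage, roughly the upper
# range some observers anticipate for a site like ResearchGate.
syndicated_use = int(platform_use * 0.25)

before = cost_per_download(fee, platform_use)
after = cost_per_download(fee, platform_use + syndicated_use)

print(f"Before syndication: ${before:.2f} per download")
print(f"After syndication:  ${after:.2f} per download")
```

The license fee has not changed at all; only the reported denominator has grown, yet the subscription now looks meaningfully cheaper per use.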

What levels of usage increase can libraries anticipate? Elsevier has calculated [PDF; slide 10] that ScienceDirect usage stats would increase by 4-5% if Mendeley usage were counted and that adding versions of record to SSRN for entitled users would provide at least another 1%. But ResearchGate is by far the biggest prospect, and it would not surprise me to see at least some publisher usage numbers grow by 10%, 25%, or more for major library customers — once versions of record are distributed there to license-entitled users.

As a result, libraries should expect publishers that move forward with content syndication to report substantially higher amounts of usage, especially if major sites like ResearchGate begin to distribute versions of record based on entitlements, as this month’s pilot announcement appears to suggest.


So far, so good. The user gets all versions of record through a single platform, the publisher gets credit for usage, and everything moves seamlessly with fewer barriers along the way. But there is an opportunity, or a problem, in the long run.

Libraries are looking to hold steady or reduce their spend on licensed journal bundles. I’ve made the case that leakage has allowed groups of libraries to walk away from subscription big deal bundles in recent years. The platforms through which content is leaking most extensively — ResearchGate and Academia perhaps more than any others, but also pirate sites and institutional and disciplinary repositories — have afforded libraries the greatest leverage in their big deal negotiations. To the extent that leaks are plugged up, we must examine how this affects publishers’ and libraries’ negotiating positions.

Negotiating positions will in turn depend on the choices that are made as to how exactly syndication is operationalized. If a platform currently provides a preprint version of an article, syndication can be configured to enable entitled users to access the version of record while continuing to enable other users to access the preprint or other alternative version. Elsevier’s efforts to connect institutional repositories with its ScienceDirect platform are instructive in this regard. It is also possible to imagine publishers hoping that a syndication model would enable them to tamp down preprint availability for non-entitled users; although to be clear, no one I have spoken with about these models has suggested such a goal.

The question, then, is what effect the seamless availability of versions of record to entitled users at third-party platforms like ResearchGate will have on subscriptions. Recognizing that reality will be more complicated, let’s take the extreme case: where 100% of usage migrates away from the publisher platform and to ResearchGate. In this case, has the publisher re-established its pricing power, because it has a far greater amount of usage that can be attributed to the library subscription and it can dramatically alter the user experience everywhere (not just on its own platform) if the license lapses? Or is the library better positioned to exert pressure on a publisher by threatening to cancel, because all of that usage now occurs on a platform where, if the license were to lapse, preprints might be readily available in the place of versions of record?

For a library that is pursuing a strategy of trying to maximize usage of licensed e-resources to justify subscriptions and the budgets that support them, content syndication is great news. For those libraries that are trying to right-size the usage of subscription e-resources as a strategy to align themselves internally and improve their negotiating position with publishers, syndication may prove to be a mixed bag at best.

Business Models

At the same time syndication alters the marketplace for licensed scholarly content, it will energize the business of the cross-publisher hubs. The content itself will be provided under existing publisher licenses with libraries — or based on open access. But the hubs will require investment in order to launch, operate, and continuously improve. How they might be monetized should be of substantial interest to all parties in the ecosystem.

Will cross-publisher hubs be built on top of services that are currently licensed to libraries, like Scopus and Web of Science (or through a reconfiguration of ProQuest or EBSCO content platforms and discovery services)? If so, that suggests the library will have some reasonable degree of influence in their development. The library license with these discovery services could presumably extend to the expanded functionality of serving as a content access platform. Of course, library fees may be expected to increase for any one of these platforms that becomes a dominant hub, even if over time they were to be reduced as open models somehow drive down the licensing burdens.

But what if the emergent cross-publisher hubs are not predicated on the existing library channel? The recent Springer Nature/ResearchGate pilot announcement is a good example to consider. ResearchGate does not operate through agreements with universities, corporations, or their libraries. Instead, it has built a platform designed around the researcher and their professional network, which is monetized through analytics and advertising (and which apparently has to date been losing money doing so). On the one hand, if a provider like ResearchGate were to become a dominant player in syndication of scholarly content, libraries might be relieved to be absolved of an additional payment for the cross-publisher hub. I have already explained why Elsevier fears ResearchGate as a syndication hub and Springer Nature would like to embrace it.

It is my view that many libraries will see a cross-publisher hub with which they have no relationship as the more dangerous possibility. There are two reasons that libraries might be concerned about the challenges created by such new entrants.


A platform like ResearchGate could, of course, choose to monetize itself through direct charges, to authors or readers, for basic or more likely freemium services. In such a scenario, libraries might be troubled to find themselves disintermediated. But the bigger challenge is that user fees do not seem to be emerging as the monetization route.

Rather, it appears more likely that this new breed of cross-publisher hubs would be funded by one form or another of user data monetization (“surveillance capitalism,” if you like). Libraries continue to strongly reject such models, as evidenced in a recent statement by most of the Ivy Plus libraries. But if libraries are not somehow the customers for the platforms through which users engage with content, then another party may control the terms of exchange.

Now, the privacy and surveillance dynamics of a new crop of platforms may not be substantially different from the problems with many existing content platforms. There is little evidence thus far that libraries have been prepared to insert strong enough license language to achieve their objectives, let alone audit vendors and hold them accountable for missteps. The emerging challenge with syndication is that there may be a set of platforms through which license-entitled content can be accessed but with which libraries have little direct relationship and over which they have no apparent control.

Thus, libraries may wish to begin adding greater specificity in their content licenses about the platforms through which their entitlements can enable user access. We are already seeing many different publishers and platforms sending certain forms of entitlement data to Google Scholar, to enable more seamless linking. As far as I understand, there are no explicit library permissions in their publisher agreements to enable the sharing of this information.

Third-party content platforms through which publisher licenses can enable entitled users to gain access raise another kind of issue. RA21 will enable users to log in to a platform without sharing very much data about themselves (hopefully none at all, if library interests are preserved in the final recommendations). Which content hubs will libraries be comfortable having turned on for their users through their publisher licenses? Do current content license agreements provide libraries any discretion on the delivery sites through which their license entitlements provide access for their users?

A Checklist

The syndication model is continuing to develop and may take a number of different forms if it ultimately emerges at all. That said, improving the researcher experience is a vital priority. And leakage has emerged as key to libraries’ medium-term content negotiating strategy, as seen in their efforts to capitalize on the declining value of the subscription big deal. Syndication with distributed entitlements is designed to stem the flow. In addition, privacy continues to loom large as a key value for professional librarians. Given all this, there are several clear steps that libraries should begin taking to be prepared:

  1. Determine how syndication would affect your negotiating position for content licenses. Syndicated models will suppress content leakage, and isn’t content leakage good for libraries? That said, if syndication supports your negotiating strategy, by making it possible to more readily supplant versions of record with preprints in the user workflow and thereby over time making it easier to walk away from a negotiation, consider language in publisher license agreements encouraging or requiring it. If syndication interferes with your goals by suppressing leakage, consider language in publisher license agreements that will limit or contain the effects of syndication.
  2. Assess the implications on licensing strategy if substantially more usage were to be added to your COUNTER report, especially for major commercial and scientific journal publishers. Consider adding provisions to license agreements specifying how usage is counted, to the extent that impacts any parts of the license fee. Determine how to remap negotiating strategy as a result.
  3. Consider adding provisions to license agreements specifying the sites, or requirements for the sites, through which the license entitlements can be operationalized. It is not hard to imagine some sites that publishers might like, which libraries would not, and getting ahead of this dispute would seem to be desirable. If license agreements speak directly to this issue, libraries will be better positioned to exercise some control over this emergent ecosystem, rather than complain after the fact at how it has developed.
  4. Identify the library’s existing licenses for sites that might become cross-publisher hubs. Examine the license agreements for those to determine what — if any — protections the library is afforded and how pricing would change should the site’s model evolve. Consider updating those proactively bearing in mind the prospect of content syndication.
  5. Consider exploring new relationships with potential supercontinents, including especially ResearchGate and also Academia, to ensure that the library has some influence over their policies and practices. Even if such a site’s current practices are seen as problematic (as presumably many of ResearchGate’s advertising practices would be seen to be), it may be strategically valuable to find ways to influence them rather than to allow an entirely separate syndication workflow to develop outside the library’s ambit.

There is yet time for libraries to get ahead of a major strategic development among publishers. The challenge is to link licensing, scholarly communications, privacy policy, user needs, and library technology together. Libraries that are able to do so thoughtfully have an opportunity to advance their values and interests and those of their user communities at this moment of rapid strategic change.  


I thank Rick Anderson, David Crotty, Lisa Janicke Hinchliffe, Kimberly Lutz, and Doug Way for comments on an earlier draft of this piece.

Roger C. Schonfeld

Roger C. Schonfeld is the vice president of organizational strategy for ITHAKA and of Ithaka S+R’s libraries, scholarly communication, and museums program. Roger leads a team of subject matter and methodological experts and analysts who conduct research and provide advisory services to drive evidence-based innovation and leadership among libraries, publishers, and museums to foster research, learning, and preservation. He serves as a Board Member for the Center for Research Libraries. Previously, Roger was a research associate at The Andrew W. Mellon Foundation.


8 Thoughts on "Isn’t Leakage Good for Libraries?"

I’m probably not alone in this, but I still don’t quite understand the motivation for a publisher to adopt a syndication model. Pushing your content over to other channels like this means you lose your branding efforts, your advertising revenue, and any of the marketing benefits you gain from your own platform (sending readers to related content, related publications, etc.). Do we expect RG and other platforms to pay publishers for the ability to host content? If so, that would make better sense, but I’ve yet to see any indication that this is the case; as with most internet advertising-based companies, they expect to get content for free.

Here the argument is that the benefit one would see is an increase in COUNTER statistics reported to a library, and that would give the publisher leverage in difficult negotiating situations. I’m not sure how much that’s worth — libraries are canceling big deals because they simply don’t have the funds to pay the prices for an ever-increasing body of literature, because they see the prices as too high, and because they are interested in changing the culture of publication toward open access. Most publishers are seeing year on year increases in traffic and the numbers of articles accessed. Publishers are already showing libraries increased usage numbers, so would bigger increases make much of a difference?

If the argument is that leakage causes a library to drop a big deal because they feel they can get much of what they need through preprints, Green OA, etc., then why does having secondary locations where one can (if purchased) access subscription content make any difference? The leakage will still be there even if one licenses one’s content to syndicators. The advertising and surveillance based business of RG and its ilk is based on bulk, and it’s not like they’re going to eliminate the leakage or turn away non-subscribers.

Earlier this week, Facebook was lamenting that users want local news, but Facebook is having a problem finding local news to share. The reason they can’t find it is because social media sites like Facebook have killed the local news market. They provided a platform and brought people to it and then continually made it more difficult for smaller news outlets to get attention. Then they took all the advertising money. Losing local news is a huge societal problem for which we currently have no solution. I have long posited that if Facebook were an honest broker, they would pay for content instead of expecting news organizations to pay them for distribution. This will not happen and I don’t know why anyone thinks RG would pay for it either.

Interesting — I don’t rely on Facebook for local news at all. I get local news entirely by listening to my local NPR affiliate during my commute and by watching my local TV station in the evening. But then again, I have no idea how typical I am in that regard. Also, I’m over 50…

I get all of my news from Facebook and Twitter feeds (with the exception of the 20 minutes to and from work with NPR). I haven’t watched news on the TV in at least a decade. The hyper local stuff comes from community-based blogs, which I only follow on social media.

Both EBSCO and ProQuest pay publishers huge amounts to use publishers’ content, so why would publishers give everything to RG for free? Not sure the RG business model makes any sense. As to increasing usage, few publishers have any problem now with their usage data. Libraries already have trouble supporting the usage load now. A ScienceDirect package already puts the cost per use at pennies. The growth curve for usage continues to climb at the major publishing houses. Mixing versions of the same article is not helping researchers. They need the final version, not the preprint.

I always felt that the first academic social network that was willing to sign deals and license content from publishers would blow all of the competitors out of the water. Put that VC money toward licensing deals, and become THE one place to get the latest literature for free, become the dominant player, and own the market. Then figure out a business model from there.

But it has never happened. Some I suppose is the internet business model, which is usually based around de-valuing anything other than your own advertisements and refusing to pay for anything, but I suspect that in the end, our market is simply too small to support a surveillance/advertising business model to enough of a level to pay for the content needed to drive it.

There isn’t an 18-25 year old within 500 miles of me who listens to NPR. I’m not sure they even know what a radio station is. That’s the issue – the value of news, local or national, does not mean a thing to the youngest generation.
