Here’s a proposition with which I suspect publishers, editors, authors, librarians, and readers would all agree: over the past couple of decades, the scholarly communication environment in which we all operate has become much more complex.

Here’s another one that I hope will be uncontroversial: with the increasing complexity of the scholarly communication environment has come a greater intensity of feeling about the impacts and implications of those changes and about what we ought to do going forward. Some members of our community feel under threat, some feel exhilarated about possibilities for the future, some feel angry, some feel anxious. Many are confused and apprehensive. None of these feelings, I believe, is intrinsically uninformed or necessarily irrational, though all of us may handle our feelings in ways that are more or less useful and wise.

The increasing complexity of our environment and the heightened emotion around the issues we’re dealing with suggest, I believe, the increasing importance of discriminating between analysis and advocacy. The more complex issues become, the more important it is to do (and listen to) careful analysis of them; at the same time, however, as issues become more emotionally or politically fraught, the louder will become the voices of advocacy on all sides. To be clear, the world needs both analysts and advocates; however, it’s essential that we be able to discriminate between them. If we don’t carefully do so, we run the risk of accepting propaganda as reportage or debatable interpretation as solid fact.

Virtually every segment of the scholarly communication community has advocates, and every segment of the community also produces analysis of various kinds. In the United States, publishers have advocacy groups like the Association of American Publishers (AAP); scholarly authors have the American Association of University Professors (AAUP); libraries have the American Library Association (ALA). Analysts and analysis organizations include the Coalition for Networked Information (CNI), Outsell, Inc., and a variety of individuals who provide research, data, and consultation to all segments of the community. Some organizations have one department or subunit that acts in an advocacy role and another that does analysis — consider, for example, the Association of Research Libraries (ARL), which hosts both a Statistics & Assessment office dedicated to the gathering and analysis of quantitative data from member libraries, and an advocacy organization (the Scholarly Publishing and Academic Resources Coalition, or SPARC), which lobbies Congress as well as encouraging funding agencies and the higher education community towards policy reform.

Furthermore, the dividing line between analysis and advocacy can sometimes be tough to identify, especially when a particular analyst has a hidden agenda. Sometimes, advocacy deliberately masquerades as analysis: just because an organization calls itself a “Research Council” doesn’t mean it’s doing disinterested research.

Anticipating one likely objection to this essay, I want to emphasize my view that advocacy is both good and important. However, it’s essential that those of us charged with making decisions about programs, priorities, and resource allocation be able to recognize the limitations of advocacy — whatever its affiliation or agenda — as a source of complete or reliable information.

Why would I say that? Fundamentally, because it is an analyst’s job to tell the whole story, but it is an advocate’s job to tell only the part of the story that will further the advocate’s agenda. This becomes particularly problematic when advocates are treated in the news media as sources of analysis.

Please note that none of this means it’s possible to do absolutely unbiased and objective analysis; all of us have agendas of one kind or another, not all of them conscious. However, it is in the essential nature of analysis to attempt to provide a whole and unbiased picture, whereas it is in the essential nature of advocacy to promote a particular goal. This difference is fundamental and matters very much. Think about it this way: imagine someone looking at a rock and telling you about it. If he’s an analyst, he’ll be trying to give you as much useful and objectively correct information about the rock as he can; if he’s an advocate, he’ll be trying to sell you the rock. (Or to convince you to buy something else instead of a rock.)

How do these fundamental differences play out in the real world, and what kinds of markers can we look for when trying to discriminate between analysis and advocacy? I would suggest that they include these five:

Complexity vs. Simplicity

When we do analysis we tend to draw attention to complexities, because a recognition of complexities (where they legitimately exist) leads to a fuller and more nuanced understanding of the issue at hand. When we do advocacy, on the other hand, we want to make the issue as simple as possible — partly because simple stories are easier to communicate, and partly because it’s much easier to sell people on a simple proposition than on a complex and qualified one. This is not a hard-and-fast rule, of course — sometimes, voices of advocacy will try to create nuance and complexity where they don’t exist, if the simple reality of a situation is not conducive to the advocates’ agenda. But in my experience this scenario is less common than its opposite.

Data vs. Anecdote

Advocacy argues from data when the data support the agenda, and argues from anecdote when supportive data are lacking or when the data’s implications are too complex to unambiguously support the agenda. Analysis argues only or primarily from data, using anecdote sparingly (if at all) and only for illustrative purposes. To be clear, none of this means that anecdotes can’t be tremendously useful and meaningful, particularly when used honestly and responsibly — only that, as the popular axiom has it, “the plural of anecdote is not data.” If you encounter an anecdote being used to illustrate a point previously demonstrated by data, you’re probably reading analysis; if you encounter an anecdote being used in the absence of data, you’re probably reading advocacy.

Comprehensiveness vs. Selectivity

When we do analysis we draw on relevant information as broadly and inclusively as possible, trying to incorporate as much relevant data as we can to inform conclusions and carefully taking into account the implications of the full range of that information. When we do advocacy, we use data selectively, emphasizing those data points that support our agenda and downplaying or leaving unmentioned those that don’t. Please note, again, that there is not necessarily anything dishonest or wrong about the latter approach; it simply reflects the fact that advocates have a different job from that of analysts. (Of course, the selective approach can easily devolve into dishonesty if we are unscrupulous about the selection criteria we use when presenting data.)

Transparency vs. Opacity

Analysts are more likely to share their data widely, recognizing that there may be information embedded in the data that they themselves have missed and that others may be able to tease out. Good analysts also understand that their interpretations will inevitably be shaded by their own biases and experience and that there is great interpretive value in letting others look at the same data through the lenses of their own biases and prejudices. Advocates will tend to share their data grudgingly, if at all — for exactly the same reasons.

Passion vs. Dispassion

Advocacy needs to be passionate, because communicating the urgency and rightness of the agenda is an important part of its job. An advocate who presents his message coolly and dispassionately is arguably not doing what he should. Analysis, on the other hand, needs to be dispassionate; it needs to show, not only by the analysis itself but also by the manner in which it’s presented, that it does not have a vested interest in any particular conclusion but is simply stating the facts as found. Again: we all understand that there is no such thing as a purely unbiased presentation of the facts — but the ideal toward which analysis aspires is to present reality as dispassionately and with as much objective accuracy as possible.

Are there other markers like the above that we all can usefully watch for when trying to discriminate between advocacy and analysis? Comments welcome.

Rick Anderson

Rick Anderson is Associate Dean for Collections and Scholarly Communication in the J. Willard Marriott Library at the University of Utah. He speaks and writes regularly on issues related to libraries, scholarly communication, and higher education, and has served as president of NASIG and of the Society for Scholarly Publishing.


21 Thoughts on "Advocacy, Analysis, and the Vital Importance of Discriminating Between Them"

Great post Rick! As you’ve pointed out, it’s so important to understand the role/agenda of the entity providing the data.

The analyst and the advocate seem to be wearing different hats, implying that an analyst having completed his/her work then hands over to the advocate. But researchers, having analyzed their data, then need to publish and become advocates for one interpretation over another. They may advocate what they see as the best interpretation. Sometimes, for political reasons, they have to hide their best interpretation in a footnote and more formally offer the politically correct interpretation. Careers are built and lost on the basis of such judgements. Given the ways our peer-review system operates, sometimes it is not possible to dispassionately set out the possibilities and let readers decide.

These are good points, but I do think there’s a meaningful difference between arguing for a particular interpretation of the data where the interpretation is driven by a relatively agenda-free recognition of what the data itself indicates (which would be a more analytical approach) and arguing for a particular interpretation of the data where the interpretation is driven by a preexisting agenda (which would be a more advocacy-type approach). Those two approaches exist on a spectrum, of course, but I think the difference between them is still real and meaningful.

Yes, the distinction is worth making. As you will be aware, there are a few researchers who do not engage in mega-funded blind data-gathering. They design their data collections to support or refute particular hypotheses. In other words, their advocacy (implicit support for a particular hypothesis) precedes data collection, which is then followed (they hope) by further advocacy.

Most research starts with a hypothesis. Perhaps the difference is whether one is willing to let go of that hypothesis and state what the data shows, even if it goes against one’s initial thought.

Exactly: there’s a difference between a hypothesis and an agenda. Ideally, a researcher starts with a hypothesis and proceeds to test it in the spirit of “Let’s see whether this proposition is true or false.” That’s very different from starting with a predetermined conclusion or preference as to the truth/falsity of the hypothesis and looking for data in support of that conclusion. The former is what I would call analysis; the latter is what I would call advocacy.

Right, and I would characterize that not as real analysis, but rather as advocacy masquerading as analysis.

If the data lead a researcher to conclude that “all swans are white,” it only takes one anecdote of an observer seeing a black swan to disprove the thesis.

Wouldn’t it require actual data proving the existence of a black swan to disprove the thesis, rather than an anecdote? This guy my cousin knows heard somebody saying the earth was flat, and that they sailed out to the edge. Does that disprove the “theory” that the earth is round?

A story about a black swan would be anecdote, and wouldn’t prove the theory wrong. The documented existence of a black swan would be data, and would.

Good discussion–but what about “analysis” that proceeds from a fixed perspective (e.g., that OA is all about publisher revenue)–and analysts that charge five-figure prices? Is that analysis or a form of advocacy? (I regard my research work in OA as analysis, but can see that some might regard it as advocacy, as another example.)

Walt, can you explain what you mean by “analysts that charge five-figure prices”? I don’t understand the reference.

If by “‘analysis’ that proceeds from a fixed perspective” you mean “analysis” that is undertaken with a predetermined conclusion already in mind, I would characterize that as advocacy masquerading as analysis. (But if that’s not what you meant by that phrase, then please set me straight.)

As for analysts that charge a lot of money: I don’t see that parameter as relevant to the issue of analysis vs. advocacy. Some kinds of analysis are more expensive to conduct than other kinds, and I would expect the analysts who do that kind of work to charge more. Other analysts may charge a lot for their work simply because they can get away with it, but that doesn’t necessarily mean that the analysis they do is influenced by a preconceived agenda with regard to the data itself.

Responding to both: re the charges, I was thinking of Rick’s comment “Analysts are more likely to share their data widely”–and the reference is to Outsell, for what that’s worth. By “fixed perspective” I really mean only regarding one aspect of a situation as worth looking at–e.g., the view that OA is only about publisher revenue.

Walt, you make a good point about data sharing — I probably should have said “Analysts are more likely to share their data widely, unless they have a proprietary business interest in the data.”

As for looking at only one aspect of a situation: it’s certainly true that one’s agenda may be expressed, at least in part, by what part of the environment one chooses to analyze. But I would hesitate before assuming that any particular study of one specific aspect of the environment is necessarily informed by the belief that the aspect under examination is the only one worth looking at. No one can study every worthwhile question every time. Sometimes the choice to study A rather than B is an expression of a qualitative judgement about B, and sometimes the choice just reflects a personal interest, or a prioritization based on resource limitations, or the flip of a coin. When I buy a book in a bookstore, I’m not necessarily saying that it’s the only book in the store worth buying.

I still am befuddled by the “five figure” remark. What does it mean?

As it happens, I erred: it was four figures–that is, the price of obtaining a copy of the analysis. Rick: a fair point.

Why would wanting to be paid for your work make the work less credible? From your bio (http://waltcrawford.name/) it appears that you take paid jobs to do research, and that you charge for your books that you’ve written about your research. Does this automatically make your conclusions less credible?

Or is there a particular dollar figure where credibility is compromised? If so, please let us know.

And for what it’s worth, many of Outsell’s clients subscribe to their reports and, like most subscription models, pay far less for each report than the standalone price.

I was reacting to Rick’s comment about analysts being more likely to share their data widely. I don’t see that I ever suggested that paid analysis lacks credibility, and certainly don’t believe that to be true.

I see. I guess it was the naming of a dollar figure that confused most of us.

I would, however, suggest that Outsell’s reports are very widely read, likely read more times by more people than the average research paper published in a journal.

terrific post rick, and thank you for the clarity and precision of language. i like the discussion of these issues at the level of our ecosystem. nothing we don’t know, but i’d add that individual advocates posing as analysts run rampant (esp on social media) – and can be powerful lobbyists of their own. many ‘crises’ in scholarly publishing are advanced significantly by more heat than light, and loud (albeit popular) voices uttering few facts. and yes, i have anecdotes…..

Comments are closed.