Reinforced by the exploits of the spinach-eating Popeye the Sailor Man and the brand of canned spinach that once bore his name, spinach has long held a unique place on our plates — a dreaded but muscle-building, anemia-curing super vegetable. But was this place truly deserved?
By now, many of us have heard the story of how a simple error in a decimal’s position created the long-standing misperception that the iron content of spinach was 10-fold greater than it actually is, leading to generations of parents believing they were feeding an exceedingly nutritious — if slightly unpalatable — green vegetable to their children.
Once the decimal error was uncovered, spinach was exposed as just another green leafy vegetable, its superpowers a charade. Case closed.
So, when one of my coworkers forwarded me an article about spinach misconceptions in the literature, I set it aside, thinking it was surely another retelling of the spinach decimal-error story.
Well, I was wrong. More than that, I was shaken by the story the paper told. And I realized that the creator of Popeye has gotten a bum rap all these years, thanks to another myth inside the spinach myth.
But I’m getting ahead of myself.
In the paper, entitled “Academic Urban Legends,” the author, Ole Bjørn Rekdal, writes about some of my favorite topics — citation misdirection, citation expansion, and citation invention. Rekdal shows that the story of the decimal-place error is as rife with mistakes and unsubstantiated statements as the tale it supposedly corrects.
For those of you who have not heard of the initial supposed debunking, here is a summary. Allegedly, in the 1890s, scientists measured the quantity of iron in spinach but, through a misplaced decimal point, recorded it at 10 times the actual value. This error propagated for decades, until the 1930s, when German scientists measured again and discovered the mistake. By then, however, the notion of spinach as an iron-rich vegetable was so cemented into popular culture that it proved nearly impossible to dislodge.
Rekdal takes us through the history of the purported debunking of the spinach myth with peppery prose and many nice asides, along the way debunking a couple of other academic urban legends — most notably, the legend that a lower percentage of papers are being cited today than in years past, suggesting a lower-yield literature. (This is not good news, however, as Rekdal argues persuasively that more citations probably mean more misleading or lazy citations, not more legitimate or accurate ones.)
As it turns out for the spinach mythology, following the actual citations in the literature back from today through to the 1930s and then to the 1890s leads us nowhere. There is no citation to any study in the 1930s debunking a study in the 1890s (or, as one paper asserts, the 1870s). There is no decimal error to be debunked. The names of the German scientists change from version to version. There is only an academic urban legend, and by “debunking” it, we are creating a new shared delusion.
The main detective in this — a Reader in Criminology named Mike Sutton — published his paper in the Internet Journal of Criminology in 2010. In it, he traces the history of dried vs. natural spinach and their varying iron content, and the actual nutritional recommendations of a German scientist who studied spinach in 1902. This German scientist believed that spinach’s iron content could help prevent anemia, but could not correct it. He was rather isolated in his beliefs.
Popeye’s conflation with spinach became a conflation with iron, but his creator, E.C. Segar, was much better informed about the nutritional benefits of spinach and recommended it because it contained a lot of vitamin A. Segar made his choice clear in one panel captured in the Sutton paper, in which Popeye says, “Spinach is full of ‘Vitamin A’ and tha’s what makes hoomans strong an’ helty.”
How error-prone was this citation trail? In a 1971 paper in the Lancet in which the spinach-iron urban legend was reinforced yet again, Segar wasn’t even credited as Popeye’s creator, with the erroneous credit going to Max Fleischer.
Sutton ends his paper with a warning Rekdal would applaud:
Be warned therefore, authors who do not research the sources of supposedly “known facts” risk misleading themselves and others, and ultimately their work may become the subject of much laughter and delight when their ironic hypocrisy is exposed.
Rekdal, for his part, spends a good amount of time noting how easy it is now to locate and follow references, taking moments instead of the days it took just a couple of decades ago. This ease seems to have a downside, however, as he notes in his conclusion:
The digital revolution has certainly made it easier to expose and debunk myths, but it has also created opportunities for new and remarkably efficient academic shortcuts, highly attractive and tempting not just in milieus characterized by increasing publication pressure and more concerned with quantity than quality, but also for groups and individuals strongly involved in rhetorics of demarcation of science, but less concerned with following the scientific principles they claim to defend. Some academic urban legends may perish in the new digital academic environment, but others will thrive and have ideal conditions for explosive growth.
This all buttresses a paper that used citation networks to expose another worrisome belief system that emerged in the neurology literature. The study by Steven Greenberg is entitled, “How citation distortions create unfounded authority: analysis of a citation network,” and it deals with the same issue, concluding in its abstract that:
Through distortions in its social use that include bias, amplification, and invention, citation can be used to generate information cascades resulting in unfounded authority of claims.
That’s a nice way of saying that citations can be manipulated to spawn academic urban legends.
Greenberg found in the citation network he studied that citations drifted away from primary evidence and into secondary works, making these secondary sources hubs of authority in the network. These secondary sources often amplified, distorted, or ignored primary research findings. Worse, positive findings were over-represented in the network, perhaps because secondary sources tended to rely on them and emerged as the hubs of authority, eclipsing the primary evidence. This has always suggested to me that perhaps we don’t have too many positive studies; rather, maybe the problem is that people only cite positive studies, that credulous academics cite secondary sources too often, and that this combination of choices drives positive studies to the forefront of our collective awareness.
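To make the idea of citation hubs concrete, here is a minimal sketch in Python. It is not Greenberg's actual method; the paper names and the little graph are invented purely for illustration. It simply counts how often each node is cited (a crude proxy for authority) and checks how many citing papers ever reach the primary evidence directly rather than stopping at a secondary source.

```python
from collections import defaultdict

# Hypothetical toy citation graph: each key cites the papers in its list.
# "primary_1" stands for an original experiment; "review_*" for secondary sources.
citations = {
    "review_A": ["primary_1"],
    "review_B": ["review_A"],
    "paper_1": ["review_A"],
    "paper_2": ["review_A", "review_B"],
    "paper_3": ["review_B"],
    "paper_4": ["review_B"],
}

# Count how often each node is cited; in-degree is a rough proxy for "authority."
times_cited = defaultdict(int)
for citer, cited_list in citations.items():
    for target in cited_list:
        times_cited[target] += 1

hubs = sorted(times_cited.items(), key=lambda kv: kv[1], reverse=True)
print("Most-cited nodes (authority hubs):", hubs)

# How many citing papers point directly at the primary evidence?
direct_to_primary = sum(
    any(target.startswith("primary") for target in cited_list)
    for cited_list in citations.values()
)
print(f"{direct_to_primary} of {len(citations)} citing papers cite the primary study directly")
```

In this toy graph, the two review articles end up as the most-cited nodes while only one paper cites the primary study directly, which is the pattern Greenberg describes: secondary sources accumulate authority and the primary evidence fades from view.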
Rekdal recently published another paper along the same lines entitled, “Monuments to Academic Carelessness: The Self-fulfilling Prophecy of Katherine Frost Bruner.” In this paper, Rekdal meticulously reveals how the Publication Manual of the American Psychological Association has failed to accurately quote a major source from within the organization itself — a former editorial assistant at the American Psychological Association, Katherine Frost Bruner. As Rekdal writes in the abstract:
The most frequently quoted message in Bruner’s article deals with the importance of making sure that references in academic texts are complete and accurate. Exploring the citation history of this particular message reveals an ironic point: the great majority of those who have quoted Bruner’s words on reference accuracy have not done so accurately.
Other sources citing Bruner’s work mistakenly claim she was a psychologist, abuse the citation even further, or make other mistakes, expansions, and distortions from the primary material. The irony here is remarkable, given Bruner’s own words, which actually read as follows:
Incidentally, a sin one more degree heinous than an incomplete reference is an inaccurate reference; the former will be caught by the editor or the printer, whereas the latter will stand in print as an annoyance to future investigators and a monument to the writer’s carelessness.
Scientific advances come from building on the findings of others. But what does it do to scientific progress when authors cynically and sloppily perpetuate a cascade of misleading information? Must we check everything we read, all the way down to the bedrock, as Rekdal did? (Believe me, in writing this post, I wanted to verify everything I could.) Neil deGrasse Tyson touches on this in his excellent book, “Death by Black Hole: And Other Cosmic Quandaries,” writing:
We all carry some blindly believed knowledge because we cannot realistically test every statement uttered by others. When I tell you that the proton has an antimatter counterpart (the antiproton), you would need $1 billion worth of laboratory apparatus to verify my statement.
There is also the emotional pleasure of “knowing” and using this “knowledge” as intellectual shorthand in discussions, grant applications, and elsewhere. The emotional valence of “knowing” has a potency that is hard to ignore.
Daniel Kahneman covers this in his excellent book, “Thinking, Fast and Slow,” in a chapter entitled, “Cognitive Ease.” In it, he writes:
When you are in a state of cognitive ease, you are probably in a good mood, like what you see, believe what you hear, trust your intuitions, and feel that the current situation is comfortably familiar. . . . When you feel strained, you are more likely to be vigilant and suspicious, invest more effort in what you are doing, feel less comfortable, and make fewer errors, but you are also less intuitive and less creative than usual. . . . Anything that makes it easier for the associative machinery to run smoothly will also bias beliefs. A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguishable from truth.
That last sentence is worth reading twice.
Ideally, cognitive strain is what editors and researchers should exert, but as Rekdal speculates, perhaps the ease with which references can be added has worked against the careful use of scientific citation.
The recent “Did you know” commercials from Geico make fun of the smug satisfaction of cognitive ease; my favorite is the “Words Can Hurt You” ad.
As these studies and others like them continue to show, we are often too credulous to properly pursue facts and verify them, too rushed to do the necessary work, and too eager to pad papers with references we can snap up with little effort and little chance of being caught stating a half-truth. It’s the cumulative effect of all these little shadings and cheats that starts to stain the big picture. As these papers and others show, researchers and academics can spend decades perpetuating — through laziness or indifference or even outright cynicism — myths, exaggerations, and distortions in the scientific literature.
With scientific and scholarly communication increasingly a counting game — number of papers, number of APCs, number of citations, number of journals, number of authors — we seem to have shifted into a mass-production mindset. As these articles show, science may suffer when the mass production of intellectual work necessarily leads to shortcuts and sloppy citations.
So what’s the current word on spinach? It’s a good source of vitamin A, but nothing special when it comes to iron. If you like the taste, eat it. If you want vitamin A, apricots, lettuce, grapefruit, and sweet potatoes are good choices, too. If you want iron, red meat, egg yolks, and iron-enriched cereals are good choices.
The final message of all this? While it’s efficient to grow spinach in sand, we certainly don’t want science built on it.
(Hat tip to EH for pointing out the two Rekdal articles discussed here.)
Discussion
This picture is complicated by the fact that citations are often interpretive. Scientific knowledge is an evolving system, not merely an accumulation of facts. Thus the meaning of prior work may change over time. (This may be especially true in HSS.) But of course there is also error and deception, as with any human activity. In any case most of what we know we learn from others.
The possibility (certainty?) of occasional errors is good reason, why, in citing a paper, one should always check for subsequent errata and discussions. Online publication is great in this regard, but only if you are looking at the official publisher’s site.
ERRATUM: Delete the first comma. See? 😉
I addressed this problem of citation error in my article “The Value Added by Copyediting” (Against the Grain, Sept. 2008), where I observe:
But even at this level there are risks of propagating errors, as in mistakes in quotations that once used incorrectly may be multiplied many times over, as readers do not bother to go back to the original sources to check for accuracy but trust the authority of the scholar using them to have quoted them correctly. (My correspondent who edits articles for science journals confirms the seriousness of this problem: “Huge errors can creep into the literature when authors use preprint [unedited, unreviewed] versions of papers, and the problem snowballs: so few authors return to primary sources that incorrect interpretations are perpetuated and persist in the literature to damage future generations.”) Surely, then, for purposes of formal publication, the additional level of quality control that is provided by good copyediting is a value worth paying for, and libraries would do well to reflect whether their needs as repositories of authoritative knowledge would be well served by relying on anything but the versions of articles that are in their very final form, suitable for long-term archiving. Whether students and scholars who access the unedited versions will bother to go to the archival versions for citations in writings that they produce remains to be seen, but clearly they should be encouraged to do so—students, because they need to be taught responsible scholarly methods, and scholars, because they have a professional obligation to their peers to do so.
How big a problem may this turn out to be? Some sense of it comes from a recently published, and much discussed, paper with the cute subtitle “Fawlty Towers of Knowledge?” by Malcolm Wright and J. Scott Armstrong in the March/April 2008 issue of Interfaces, who write on “The Ombudsman: Verification of Citations” (http://marketing.wharton.upenn.edu/Marketing_Content_Management/Marketing_files/Publication_Files/Citations-Interfaces.pdf). Their first paragraph neatly summarizes the nature and extent of the problem: “The growth of scientific knowledge requires the correct reporting of relevant studies. Unfortunately, current procedures give little assurance that authors of papers published in leading academic journals follow this practice. Instead, the evidence suggests that researchers often do not read the relevant research papers. This manifests itself in two ways: First, researchers overlook relevant papers. Second, they make errors when reporting on the papers, either through incorrect referencing or incorrect quotation of the contents of the cited paper.” They go on to cite previous studies of incorrect references in other disciplines ranging from 31 percent in public health journals to as high as 67 percent in obstetrics and gynecology journals and studies of errors in quotation with similarly disturbing numbers, such as 20 percent for medical journals in a systematic survey conducted in 2003. Remember that these errors occur in published articles. The likelihood is that the rates would be significantly higher without the intervention of copyeditors.
I simply can’t resist the temptation to post a link to a third article/essay published a couple of days ago: tinyurl.com/mzvpgag
A timely article for a week in which we learned that “Hello Kitty” is not, in fact, a cat.
It would be nice to have the page numbers for the quotations from books.
Speaking of academically perpetuated urban legends:
You Don’t Need 8 Glasses Of Water A Day
http://fivethirtyeight.com/features/you-dont-need-8-glasses-of-water-a-day/