Twenty years ago, in his infamous book, Double Fold: Libraries and the Assault on Paper, Nicholson Baker ranted about the inadequacies of microfiche and microfilm as surrogates for original copies of print newspapers. At the time, much was made of his investigation into newspaper preservation and archiving practices. A starred notice in Kirkus Reviews concluded, “If even half of what he alleges is true, some of America’s most honored librarians have a lot of explaining to do.” Baker made clear his belief that librarians and archivists were putting their professional or institutional needs before the needs of those who sought to actively engage primary source materials.
A reviewer in the Canadian Journal of Communication noted with concern,
Double Fold is also a salient text for media historians because it grapples with how mass disbinding/microfilming policies affect media-history scholarship. Baker points out throughout his book that microfilm effectively strips the newspaper from its historical context in many ways. On a purely visual level, the microfilm format destroys the experience of reading a newspaper into an enforced linearity.
Migration of the content from paper to microfilm, the reviewer argued, thus limits its value to the researcher.
The controversy arising from the 2001 publication of Double Fold was such that the Association of Research Libraries (ARL) provided talking points to its members for countering Baker’s arguments, as well as a useful FAQ. In an issue of First Monday, Richard Cox, at the time a professor at the University of Pittsburgh School of Information Sciences, countered Baker’s accusations of library indifference and negligence by noting the costly challenges of preservation and the very limited resources made available for the purpose. The debate was discordant and very public.
I was reminded of the friction created by Baker a week or two ago when I noticed buzz associated with a roundtable discussion at the Modern Language Association (MLA) Annual Meeting. As part of that discussion, Mike Furlough, Executive Director of HathiTrust, questioned whether 1995 recommendations from the MLA about primary records (meaning physical copies of primary source materials) retained their relevance in 2021.
The original 1995 MLA statement was crafted when digitization was gaining popular momentum. Specifically, it recommended collaboration with libraries, “leading toward (1) the maximum retention and preservation of textual artifacts, as well as a refining of the selection criteria necessarily entailed, and (2) the use of responsible procedures in the creation and identification of photographic and electronic reproductions based on those artifacts.”
In support of the statement, the MLA published a special issue of their journal, Profession, that gathered papers emphasizing the knowledge to be gleaned from examination of physical forms. Scholars of the book feared that libraries would be more concerned with removing books than with their preservation and wanted to make clear to their colleagues that a digital surrogate could not serve all research requirements. Not having attended the 2021 MLA session, I can’t say whether the objective of Professor Michael Hancher’s proposed session was simply to reiterate that concern; however, his 145-page bibliography on the topic of weeding (or deselection) is impressively thorough in documenting the pros and cons put forward by researchers and librarians in the humanities.
Via Twitter, Furlough made public a rough script of his roundtable comments on the same day as the event itself (Jan 10). He emphasized that, of course, libraries continued to be deeply concerned with preservation of primary source materials. However, the work of print preservation requires a continuous supply of various resources (funding, space, staffing, etc.), all of which are currently under scrutiny by cash-strapped institutions dealing with the effects of a pandemic. There is no comfortable solution to the problem, given the realities of the “existential financial threats” to an academic library’s parent organization. Furlough reminded his audience that there is not now, nor has there ever been, a way for a library to keep everything. He put three blunt questions to the audience:
- Where exactly do the library’s resources come from?
- How much money does your university allocate to the library?
- How much of that money is earmarked for the continued preservation of items that each have their own artifactual value, versus new books that keep getting published and that publishers hope libraries will buy?
For decades, there has been tremendous growth in the book and journal publishing market, both in terms of number of titles and pricing. Libraries have exerted significant energy in ensuring that collections remain current, while wrangling with publishers over the costs. Scholars and researchers may be somewhat aware of those struggles, due to the work of open access advocates, but they are far less likely to appreciate that weeding the library’s collection is a necessary part of the solution. To researchers in monograph-heavy disciplines, where acquisitions have slowed due to budget concerns, getting rid of older titles may signal deprivation. Whether researcher or librarian, whose expertise finally determines the depth and shape of our body of knowledge? What formats, what versions of established knowledge survive on a long-term basis? There have been any number of case studies in the library literature about how best to manage a deselection process without creating undue alarm or alienation.
Here is where our information community gets caught between the rock and the hard place. The back and forth at MLA between librarians and scholars represents a negotiation over the current support provided to researchers in book-oriented disciplines. To assuage their concerns, stakeholders (scholars, librarians, and, yes, publishers) need to better collaborate on delivering the highest quality of access to and preservation of all of the content formats that the 21st century encompasses.
Roger Schonfeld of Ithaka pointed out recently in this blog a misalignment between the services offered by publishers and librarians and the actual needs of researchers, deploring the lack of a user-centric architecture. The information community may be trying to satisfy user needs, but there are large gaps that need to be bridged. Relative newcomers to the space who try to bridge those gaps in innovative ways may be called out for not adhering to existing practices or rules, regardless of whether those community arrangements still make sense. Sense could be restored through legal as well as technological means, but doing so requires investments of time and money as well as a long-term commitment to innovation.
As a final anecdote, there was a moment of irony for me this month in revisiting Double Fold. Whether because it had been lent out to a friend or placed in storage, I couldn’t put my hands on the physical copy that Amazon assured me I had purchased years back. Annoyed and impatient, I purchased the Kindle edition, only to discover that it was indeed an inferior surrogate for the print. I had wanted to consult Baker’s epilogue, which contains the recommendations he suggested librarians adopt for a more satisfactory resolution moving forward. But a basic ebook feature was missing. Vintage Books, a subsidiary of Knopf and the publisher responsible for the digital version, had neglected to include a linked table of contents, one of the most basic navigational tools on which a reader or researcher might expect to rely. A cost-saving measure adopted by a publisher in transitioning backlist titles into the digital age? Quite possibly. But not a satisfactory solution for the end user.
This is not about questioning the expertise of either library or publishing professionals; stakeholders are doing their best to deliver services in a rough environment. But transitional solutions should not be allowed to become the norm if they fail the requirements of the user. The scholarly community is happy to tell us of their requirements; are we willing to listen?