During a recent webinar, I had the chance to reflect on what various data points can tell us about how students, faculty, and scholars interacted with e-resources during the last year of distance learning and remote research. At a time when academic librarians are knee-deep in usage logs and COUNTER reports from the 2020-2021 school year, we are starting to identify some notable trends.

Demand for digital content has reached record heights and, in many cases, remains very high. During this conference season, we’ve heard several case studies from publishers like JSTOR, Springer Nature, and others who have seen up to three times the usual volume of platform activity. Some trends are familiar, for instance the usage spikes commonly observed during spring and fall terms. Others offer useful indicators of valued content across the disciplines: we have, of course, seen a spike in demand for pandemic-related preprints, but the data also show the continued popularity of ebooks in humanities and social science subjects.

Alongside that precedent-setting e-resource demand has come a high volume of authentication and access challenges, with many libraries reporting strain on tech support and access services. Patrons have been forced to rely on many more routes to full-text content than before the pandemic, when traditional IP-based methods reigned supreme. Publishers lifting access walls to select unlicensed collections eased this strain in many ways, removing a key pandemic pain point for remote users. It also means, however, that library COUNTER reports won’t match actual usage, as usage of opened content is often recorded as unknown (or at least unattributable to an institution).

This is where we need to turn to humanistic insights to better understand the journeys and information experiences behind our usage reports. The data breadcrumb trails left by users of scholarly resources largely give us insight into isolated behavioral and transactional aspects of student, faculty, and researcher experiences with academic resources. What other contextual clues can we draw from information science research to flesh out our understanding of the human practices and realities that lie beyond the sight of our computational analytics?


We know from industry research that the pandemic has had far-reaching impacts on faculty and researchers, in particular women with families and early career researchers. Various reports show that burnout, anxiety, and depression are challenging faculty as well as their students, made worse by the digital divides cleaved by social and economic inequities. Experiences of uncertainty and lack of confidence in the future are often cited in library and information science research as an important aspect of our decision-making when managing information.

When we search for and evaluate scholarly resources, we are often seeking the kinds of emotional and conceptual connections that facilitate learning and meaningful analysis. Experiences of uncertainty disrupt and undermine that search for meaning, and with it our use of information. Research shows that learning and scholarly exploration naturally involve a degree of uncertainty and unknowing. Yet when that uncertainty extends beyond learning and research to include basic physical and psychological needs, such as safety and security, our stress responses can cloud our mental clarity.

Experiences of uncertainty influence the way we interpret, share, and engage with information. Studies have demonstrated that information management is one sort of risk management strategy. Some information scholars contend that we are all living through a global information crisis in the midst of the COVID pandemic. I would argue the same applies to crises wrought by economic instability, social injustice, and environmental disaster. During times of personal and collective crisis, mis- and disinformation proliferate easily, and the harm spreads like wildfire in today’s social media environments.

Not knowing whom to trust or what information to rely on can put us on the defensive and distort our perceptions of risk or benefit, perhaps made even worse when we are dealing with fatal health matters such as a global pandemic. Vulnerable populations are most at risk of information poverty in moments of global crisis; researchers in low-income countries will leverage many more pathways to content than those in more prosperous settings. Attending to ethical considerations, such as ensuring user privacy and avoiding social scapegoating, is critical during emergencies like COVID. And trustworthy scholarly dissemination becomes ever more important as predictive research methods are in high demand to serve rapidly changing public health needs.


So, what does this all mean for how we interpret usage data? Research shows that cell phones are often the go-to information access point during lockdowns and shelter-in-place scenarios. As I understand it, publishers are seeing that trend toward high mobile usage reflected in their logs. Cross-disciplinary research is often a symptom of global crises, and high demand for content across fields of study offers us a wealth of data that might shed light on these trends. With publishers providing free public access to a variety of collections, we have an opportunity to observe possible demographic or regional trends when it comes to fields of study and research innovations.

Experiential studies have shown that information use during crises like natural disasters often involves drawing on past experiences and relying on trusted networks. For academic users, that includes drawing on the information routines and practices used in the ‘before times,’ which is why some familiar user journeys appear in our usage logs. Stressful experiences of risk and uncertainty impact memory and retention, which may explain the rash of duplicate downloads and interlibrary loan requests. However, changing needs inspire changing patterns, and digital experiences vary greatly by degree of connectivity and access. We do not yet know whether the user journeys we have observed in the last 12-18 months are temporary disruptions or lasting patterns of e-resource engagement.

Qualitative research offers our community a more holistic assessment of the human realities behind statistical trends in e-resource usage. As social science research has demonstrated over the last several decades, behavioral elements of information experience are one data point, but we must consider the whole person in a contextualized framework to achieve a wider view. Analysis of the ones and zeros left behind by academic user journeys across our products and services cannot give us the full picture of the impact the last year’s global health crisis had on the use of scholarly materials. Experiential data points that reflect situated interactions must be evaluated alongside traditional metrics; doing so will strengthen our resulting strategies and build greater trust with our user communities.

Lettie Y. Conrad

Lettie Y. Conrad, Ph.D., is an independent researcher and consultant, leveraging a variety of R&D methods to drive human-centric product strategy and evidence-based decisions. Lettie's specialties sit at the intersection of information experience and digital product design. She currently serves as Product Experience Architect for LibLynx, Senior Product Advisor for DeepDyve, and a part-time lecturer for San Jose State's School of Information. Lettie is also an active volunteer with the Society for Scholarly Publishing and the Association for Information Science and Technology, among others.

Discussion

5 Thoughts on "Experiences Behind the Data: Making Human Sense of Pandemic Usage Reports"

Ms Conrad, can you give me examples of what ‘experiential data points’ are?
Thank you.

Thanks for the great question, Judith! In contrast with usage data, which I’d say are metrics of information behavior, experiential data points would be indicators of information experience — or any indicators of emotion, thoughts, cultural dynamics, economic pressures, or other metrics that offer a wider view beyond discrete transactions. These are contextualized metrics that may vary across regions, institutions, fields of study, age, career level, social forces, etc. Often qualitative in nature, experiential data won’t be an exact match for behavioral statistics (e.g., sample sizes will vary), but they shed a human light and broaden our view of usage metrics. Would love to hear your thoughts!

Perhaps it is to obtain more accurate usage data that some professional society publications, as well as some for-profit publishers, tease their readers by offering incomplete tables of contents (TOCs). So the user, already frantically trying to keep up with the latest work, has to engage in further clicking around to visit the publisher’s webpage. Yes, indeed, “various reports show that burnout, anxiety, and depression are challenging faculty as well as their students.” Publishers should be helping, not contributing to, this.

Totally agree! Publishers have an opportunity to invest in human-centric design that enables productive, positive experiences in the scholarly information journey, which organically inspires return traffic and loyal readers, authors, etc. It’s about avoiding the “dark patterns” that cause friction and increase experiences of uncertainty, all of which undermine the trust we aim to build with our user communities.

Interesting insights, Lettie! It also seems the pandemic has sped up the trend toward scholarly publishers “competing for eyeballs,” thus becoming more and more like commercial media. Byproducts of this have been increased usage by non-academic readers and more hastily published articles and retractions, even in the most prestigious journals. We have also noticed scholarly journals receiving more new readers referred through TrendMD-enabled recommendations appearing on non-peer-reviewed and general science news sites.
