I was recently introduced to Andrew Jack, the global education editor of the Financial Times (FT), and learned about the role of the “50 Journals” list in the FT’s business school rankings. I was intrigued by the efforts he is currently leading to explore alternative approaches to developing the list, and by the challenges involved in identifying new metrics that reflect a greater range of considerations for scholarly impact. With his upcoming “slow hackathon” to invite community participation in the process, I was pleased Andrew agreed to share his thoughts through this interview.

[Image: drawing of a laptop with various forms]

Let’s start off with the basics. What is the “50 Journals” list and how is it used?

The FT50 is a list, compiled by the FT, of the journals that business schools consider most important. The FT uses it to help assess high-quality academic research output — one factor in our annual business school rankings. The rankings help prospective students, recruiters, faculty and others assess different qualifications and programs around the world. Alongside research, the rankings take into account comparative data on graduate recruitment, salaries and satisfaction, as well as factors including sustainability in the curriculum, gender balance, and the international mix of students, faculty and advisory boards.

What’s your role with this list?

I am the global education editor, and in that role I help oversee the FT’s editorial coverage of business education, including our different business school rankings. I am currently consulting on the composition of the FT50 as part of a wider reflection on ways to measure academic output and enhance our rankings. The FT50 data contribute up to 10% of the weighting in our rankings and influence faculty practice, so it is important that we hear a range of views and ideas from the community. We particularly want to incorporate measures of public or societal impact as a critical component of business scholarship.

How was the list created originally? 

The list was originally developed in consultation with academics at the leading global business schools that participate in the FT’s rankings, and has been updated and expanded several times over the past two decades.

So, I understand that you are beginning a process to revise this list to better reflect “public impact” of scholarship. What does public impact mean in this context?

We are interested in exploring new ways to measure research that has not only academic rigor but also resonance with readers, including practitioners beyond universities. We are also seeking to enhance attention to what is relevant, notably to social benefit. Identifying data to track research outputs and outcomes that are rigorous (high quality), relevant (socially responsible) and resonant (widely disseminated for practitioners’ uptake) is a challenging task. We are asking how to take account of different forms of output alongside academic articles, such as chapters, textbooks, academic books, patents, conference participation, advisory/consultancy roles, community partnerships, funding, and reports to accreditation bodies and institutions on school activities, as well as popular articles and books. We are also asking how best to measure impact, using media reports, social media uptake, references in policy documents and so on. We hope this more expansive approach will give clearer indications of valuable research being undertaken to increase interactions between practitioners and academics, encourage a focus on societal priorities such as sustainability, and help guide students and decision makers in the public, private and non-profit sectors to the best institutions.
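To make the measurement challenge Andrew describes concrete, here is a minimal, purely hypothetical sketch of how per-article signals for rigor, relevance, and resonance might be normalized and combined into a single score. Every field name, weight, and cap below is my own placeholder assumption, not anything the FT has proposed.

```python
# Illustrative only: combining hypothetical per-article signals into one
# "rigor / relevance / resonance" score. Weights, caps, and field names
# are arbitrary assumptions for demonstration, not the FT's method.
from dataclasses import dataclass

@dataclass
class ArticleSignals:
    citations: int        # proxy for rigor (academic uptake)
    policy_mentions: int  # proxy for relevance (societal uptake)
    media_mentions: int   # proxy for resonance (public dissemination)

def composite_score(a: ArticleSignals,
                    weights=(0.5, 0.3, 0.2),
                    caps=(500, 50, 200)) -> float:
    """Clip each signal at an assumed cap, scale to [0, 1], then
    take a weighted sum. Caps and weights are placeholders."""
    signals = (a.citations, a.policy_mentions, a.media_mentions)
    normalized = [min(s, cap) / cap for s, cap in zip(signals, caps)]
    return sum(w * n for w, n in zip(weights, normalized))

# Example: an article with modest citations but strong policy uptake.
print(composite_score(ArticleSignals(citations=40, policy_mentions=12,
                                     media_mentions=30)))
```

Even this toy version shows why the task is hard: the choice of caps and weights embeds a value judgment about what counts as impact, which is precisely what the hackathon is meant to open up for debate.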

In light of this orientation to public impact, I understand you are organizing a “public challenge,” or “slow hackathon,” to solicit broad engagement with the revision process. What will that entail?

The idea is to bring together academics, bibliometricians, data scientists, publishers and others — ideally working in teams or partnerships — to showcase and propose ways to combine different datasets and to prototype approaches for assessing business school academic output.
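For a flavor of what “combining different datasets” could mean in practice, here is a small hypothetical prototype that joins two invented per-article sources on a shared DOI key. The data and column names are illustrative only, not real datasets or an FT specification.

```python
# Hypothetical prototype of the kind of dataset combination the hackathon
# invites: joining two per-article sources on a shared DOI identifier.
import pandas as pd

citations = pd.DataFrame({
    "doi": ["10.1000/a", "10.1000/b", "10.1000/c"],
    "citations": [120, 8, 45],
})
policy = pd.DataFrame({
    "doi": ["10.1000/a", "10.1000/c"],
    "policy_mentions": [3, 7],
})

# Left join keeps every article; articles absent from the policy
# dataset get zero mentions rather than missing values.
merged = citations.merge(policy, on="doi", how="left")
merged = merged.fillna({"policy_mentions": 0})
print(merged)
```

In real submissions, the interesting work would be in reconciling messy identifiers and disambiguating authors across sources, which is why standardized reporting (mentioned below) matters so much.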

Who can participate in this hackathon? What’s the timeline? 

This will be a “slow” hackathon over several weeks with broad exploratory goals, rather than an intense round-the-clock exercise with a very tightly defined outcome. Anyone with relevant ideas, skills, or data is welcome to participate. To help identify current thinking and to illustrate why this is a hard problem that new big data tools might help address, we have pulled together some background reading. We are seeking expressions of interest by September, and will host an online discussion with participants to exchange initial ideas, followed by more detailed submissions in October. We will reconvene to discuss and share outcomes in November.

Are there prizes? 

There is no money involved! We will showcase the best teams in our writing — we want them to be credited for their ideas and contributions, in keeping with the overall goal of what we are trying to accomplish — and we will explore ways to incorporate their work into our rankings if feasible. We hope those taking part will learn from each other, develop new partnerships, and help identify future common ways of working with wider benefits, such as open source, standardized ways to report authors, faculty, and a broader range of academic outputs.

What comes next after the hackathon? 

Publications, articles, and rankings modifications for us. Hopefully further reflection, debate, and action by us and others to make business school research as socially useful as possible in responding to the big challenges the world faces, including climate change. We also welcome wider ideas and suggestions from experts such as your readers on best existing practices, new approaches, ideas, data, etc. — whether or not they feel able to participate directly in the hackathon.

If you’ve intrigued our readers, where can they learn more?

Drop us a line at respbus@ft.com

Lisa Janicke Hinchliffe

Lisa Janicke Hinchliffe is Professor/Coordinator for Research Professional Development in the University Library and affiliate faculty in the School of Information Sciences, European Union Center, and Center for Global Studies at the University of Illinois at Urbana-Champaign. lisahinchliffe.com

Discussion


Encouraging news! Especially in light of the brutal role of the FT50, AJG 4* and similar lists in tenure/promotion decisions in the UK and elsewhere – which is not mentioned in the interview. I once had a chat with a colleague from a leading London university. She would really like to do some relevant and interesting research at the intersection of bibliometrics and science policy/management and publish it in Research Evaluation or the like. Alas, all they have in the FT50 is Research Policy, which is not fond of bibliometrics. And she _needs_ to get an FT50 paper (as instructed by senior peers and department management), so she dropped this research theme. Heckman famously wrote about the “Tyranny of the top five” in economics. Does such usage of the FT50 amount to tyranny, and what’s Andrew’s view on that issue?
