Jisc, research analytics and metrics

Jisc has provided analytic services related to research for some time, from the JUSP and IRUS services that give insight into the use of journals and articles, to dashboards helping universities understand security threats to their research networks. In the last year or so, however, we have begun to look again at what we can offer, particularly in the context of drivers such as open access policies, the Knowledge Exchange Framework (KEF), and the increasing influence of various rankings and metrics. For example, we have started to experiment with the Jisc Analytics Labs, testing whether data from various sources might provide a dashboard to help universities prepare for the REF, or make the best use of the KEF. The latter visualisations form one part of the KEF consultation currently being conducted by Research England. From February, we will build on this experiment, working with academics, university professionals and research funders to prototype three new dashboards on:

  1. the outcomes of research spend
  2. the factors that correlate with investment in research and innovation, which should be valuable to those looking at the 2.4% of GDP target for investment
  3. what might influence how well empirical research studies are reported, and therefore how reproducible those studies are.

Another opportunity is to mine the huge dataset amassed by CORE, containing over 130m references, 93m abstracts and 11m full-text research papers, plus the global citation data of the Microsoft Academic Graph, to provide both conventional analytics, such as those based on citations, and experimental new ones, such as those based on semantic analysis. CORE already has working examples of these, and PhD research at the OU continues to explore new metrics that use the full content of an academic paper to evaluate it, rather than relying on citation counts alone. We are discussing other research-oriented labs with both funders and universities and, while we would welcome more ideas, we recognise that we are still in a prototyping phase.
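
To give a flavour of the difference between the two families of indicator, the Python sketch below puts a raw citation count alongside a very simple content-based measure: the average similarity of each abstract to the rest of a corpus. It is purely illustrative; the toy records and field names are invented, and CORE's own semantic-analysis work is considerably more sophisticated than a TF-IDF comparison.

```python
# Illustrative only: the records below are invented, and this is a sketch of
# the general idea, not CORE's pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

papers = [
    {"id": "p1", "abstract": "Open access repositories and citation analysis.", "citations": 12},
    {"id": "p2", "abstract": "Semantic analysis of full-text research papers.", "citations": 3},
    {"id": "p3", "abstract": "Citation counts as indicators of research impact.", "citations": 27},
]

# Conventional indicator: the raw citation count.
for p in papers:
    print(p["id"], "citations:", p["citations"])

# Experimental, content-based indicator: how central each abstract is to the
# corpus, measured as its mean cosine similarity to the other abstracts.
tfidf = TfidfVectorizer(stop_words="english")
vectors = tfidf.fit_transform([p["abstract"] for p in papers])
similarities = cosine_similarity(vectors)

for i, p in enumerate(papers):
    centrality = (similarities[i].sum() - 1.0) / (len(papers) - 1)  # drop self-similarity
    print(p["id"], "semantic centrality:", round(float(centrality), 3))
```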

Looking further ahead, our open metrics lab is asking what might be possible in the future. For example, can we develop an automated way to assess whether the research data underlying journal articles are available, and the extent to which they are made available in a FAIR way? To what extent can automatic analysis of the full range of research outputs and related information provide useful assessments of that research, and how can techniques such as machine learning help? As part of the Knowledge Exchange initiative we are also exploring how an ‘openness profile’ could help address the challenge of recognising contributions to open scholarship in research evaluation processes.
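
As a very rough sketch of what "automated" could mean here, the snippet below scans an article's full text for a data availability statement and for links that look like dataset DOIs or data repositories. The phrase lists and repository names are assumptions made for illustration; a genuine FAIR assessment would also need to examine identifiers, licences and metadata, which is exactly where machine learning may help.

```python
# Crude heuristic for illustration only; not a full FAIR assessment.
import re

AVAILABILITY_PHRASES = [
    r"data availability statement",
    r"data (are|is) available",
    r"supporting data",
]
REPOSITORY_PATTERNS = [
    r"doi\.org/10\.\d{4,9}/\S+",          # a DOI, possibly pointing at a dataset
    r"(zenodo|figshare|dryad|osf)\.\w+",   # common data repositories (assumed list)
]

def assess_data_availability(full_text: str) -> dict:
    """Return simple signals about whether an article points to its data."""
    text = full_text.lower()
    has_statement = any(re.search(p, text) for p in AVAILABILITY_PHRASES)
    repo_links = [m.group(0) for p in REPOSITORY_PATTERNS for m in re.finditer(p, text)]
    return {
        "availability_statement": has_statement,
        "repository_links": repo_links,
        # Proxies for 'Findable' and 'Accessible' only; 'Interoperable' and
        # 'Reusable' would need metadata and licence checks beyond text scanning.
        "likely_open_data": has_statement and bool(repo_links),
    }

print(assess_data_availability(
    "Data availability statement: data are available at https://doi.org/10.5281/zenodo.123456"
))
```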

There are more immediate needs for research analytics too. For example, Plan S for open access (OA) and the UKRI OA policy review bring into sharp focus the need for universities and funders to be able to track progress toward OA, along with the associated costs and benefits. Jisc Monitor is relevant here, of course, tracking and reporting compliance and APC payments, and we are looking closely at how it should work more effectively with Jisc Collections systems. On the repository side, both CORE and, at a European level, OpenAIRE offer dashboards to help repository managers, and we are partners in OpenAIRE and European Open Science Cloud projects to build on these. And, of course, the new Jisc open research hub, a multi-tenant cloud repository, preservation and reporting tool, includes analytic capability.
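
By way of illustration, the short sketch below computes the two headline figures that such tracking ultimately rests on: the share of outputs that are open access and the total APC spend. The record structure is invented for the sketch and is not Jisc Monitor's data model.

```python
# Back-of-envelope OA reporting over a handful of invented records.
from collections import Counter

outputs = [
    {"doi": "10.1234/a", "oa_route": "gold",   "apc_gbp": 1850},
    {"doi": "10.1234/b", "oa_route": "green",  "apc_gbp": 0},
    {"doi": "10.1234/c", "oa_route": "closed", "apc_gbp": 0},
]

routes = Counter(o["oa_route"] for o in outputs)
oa_share = (routes["gold"] + routes["green"]) / len(outputs)
apc_total = sum(o["apc_gbp"] for o in outputs)

print(f"OA share: {oa_share:.0%}, APC spend: £{apc_total}, by route: {dict(routes)}")
```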

Jisc is far from alone in exploring the potential of research analytics; hardly a week goes by without a new tool, ranking or development appearing in the market. This rapid evolution makes the recommendations of James Wilsdon’s Metric Tide report for HEFCE (as was) even more critical: for example, that research indicators should be used responsibly and reflexively, and that they should be underpinned by an open data infrastructure. That is one reason why Jisc sits on the UK Forum for Responsible Research Metrics, convened by Research England and chaired by Professor Max Lu, Vice-Chancellor of the University of Surrey. While the Forum has so far concentrated on advising Research England on the REF, it hopes in 2019 to turn to other, longer-term issues. Given the range of challenges facing the sector, from Brexit to research integrity, for which reliable and valid evidence would be valuable, this attention will be timely.

Perhaps the most important factor influencing how responsibly research analytics and indicators are used is the expertise and capacity of staff in universities, funders and elsewhere to assess the available tools critically, use them, and advise others on their use. International documents such as declarations (DORA), manifestos (Leiden) and assessment frameworks (OS-CAM) are vital, but so are opportunities for professionals to learn from each other and build a community of practice. The lis-bibliometrics list and the new INORMS group are very positive developments, as is the UK Reproducibility Network, and we at Jisc are keen to understand how we, with others, can support these community-driven initiatives.

There is a wider context in which research analytics can be pursued; Ellen Hazelkorn notes that rankings based on them can do “great damage” and create perverse incentives. In some cases, it is arguable that research analytics might lead us toward what Shoshana Zuboff has recently characterised as “surveillance capitalism”, if researchers’ every interaction with online systems is tracked. In this world, it is of the highest importance that the research community has significant ownership of the ways in which its work is counted and represented.

By Neil Jacobs

Jisc Programme Director, Digital Infrastructure (Information Environment)
