Parallel Session 4 – Establishing Responsibility on Measurements, Metrics, and Rankings

Moderator: Pedro Principe, Documentation and Libraries Services of University of Minho, Portugal
Location: Tassos Papadopoulos – Room 202 (2nd Floor)

4.1) Understanding and measuring the impact of Open Science on French academic libraries

Presenter: Madeleine Géroudet, Université de Lille – ADBU, France

In the history of academic libraries, Open Science can still be considered a recent phenomenon. Academic libraries are still in the process of building skilled teams and developing services and partnerships. They need to move fast in a changing environment and, at the same time, they need a better understanding of the changes at work. The role of a national association is therefore to help open science stakeholders analyse how academic libraries adapt to this new way of doing research and how, in return, open science transforms organisations.

The French association of academic library directors and managers (ADBU) has launched several initiatives on the impact of open science on academic libraries. Its 2022 annual conference gathered specialists in organisational sociology and open science stakeholders to debate the effects of open science on research structures and libraries.

Since 2019, the association has run a biennial survey on research support in academic libraries. The questions cover the organisation of services, human and financial resources, partnerships, and the impact of open science policies. In 2023, ADBU was able to collect data from 70 libraries. The association can now use these data to make comparisons, identify trends and gain a clear view of human resources. The 2023 survey identifies several library profiles depending on the size of the research structure.

The need for this survey was all the more pressing as activity indicators are lacking at both local and national level. French open science policy includes a major effort on indicators of the openness of science. The open science monitor allows research structures to measure the growth of open research. However, this tool covers only performance and results: it does not collect data about human and financial resources, training, or data processing activities. Nor is this information included in the data on academic libraries collected every year by the Ministry of Higher Education and Research. The contribution of libraries to open science is thus invisible in reports and key figures. It is impossible to analyse the concrete commitment of libraries or to compare the evolution of resources and activities at local, national and European levels. This is why ADBU is now carrying out a study on open science activity indicators. The study aims to define relevant, usable and scalable indicators: it takes into account the specific features of open science support services and the diversity of situations in research structures.

Through these initiatives, ADBU is increasingly able to describe how open science is changing academic libraries and the way they interact with their environment. The presentation will focus on major trends, especially in team building and collaborations. It will show how the specific features of open science services induce new ways of working together inside and outside the library. It will also highlight emerging gaps between libraries depending on their size and resources, as well as the specific impact of open science policies.

4.2) Small streams make big rivers: monitoring Open Science locally

Presenter: Laetitia Bracco, Université de Lorraine, France

In many countries, national open science initiatives have been developed to provide political guidance and to encourage broad action amongst research organizations. In France, the first National Open Science Plan in 2018 gave birth to an Open Science Monitor, whose purpose was to track the progress of open access to publications along multiple dimensions: year, publisher, publication type, and scientific field. This Monitor, produced by the French Ministry of Higher Education and Research, relied on an open methodology, open data and open source code.

These indicators were the first of their kind in France. Indeed, even though some tools, such as the Web of Science or Scopus, already provided figures on open access, they were not accurate enough and, more importantly, the methodology behind these figures was not open.

Openness leads to interest, exchange, and reuse. The University of Lorraine was the first institution to reuse the national Monitor's data in order to create its own local Monitor from scratch. It was developed by the university libraries and made openly available. As a result, dozens of other universities and research organizations decided to reuse this work and create their own local Monitors, using the same methodology.
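As an illustration of this kind of reuse, the minimal sketch below computes a local open-access share per year from an export of the national Monitor's open data. The file name, column names and values (institution, year, oa_status) are hypothetical placeholders, not the actual schema of the published dataset:

import csv
from collections import defaultdict

# Hypothetical export of the national Monitor's open data; the real
# dataset, field names and values may differ.
MONITOR_EXPORT = "bso_publications.csv"
INSTITUTION = "Université de Lorraine"   # local perimeter to filter on

totals = defaultdict(int)        # publications per year
open_counts = defaultdict(int)   # open-access publications per year

with open(MONITOR_EXPORT, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row["institution"] != INSTITUTION:
            continue
        totals[row["year"]] += 1
        if row["oa_status"] != "closed":
            open_counts[row["year"]] += 1

for year in sorted(totals):
    share = 100 * open_counts[year] / totals[year]
    print(f"{year}: {share:.1f}% open access ({open_counts[year]}/{totals[year]})")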

With this community growing by the day, the French Ministry of Higher Education and Research and the University of Lorraine, which is steering the extension of the national Monitor to datasets and software, created a users' club for the Monitor in 2022. The club now gathers 220 individuals from very different structures (libraries, research units, research and evaluation departments…), thus creating an open community around the monitoring of open science in France.

As of today, 46 institutions have publicly released their local Monitor:

https://frenchopensciencemonitor.esr.gouv.fr/declinaisons/bso-locaux

In addition, nearly 200 French institutions have reached out to the Ministry to have their indicators generated from the data they provided, even though their results are not openly available.

In this paper I would like to address the following topics from the perspective of the first French university library that developed a local Monitor:

What is the role of university libraries in the development of open science indicators? As long-time supporters of open access and producers of bibliometrics, libraries have legitimacy on these subjects, but they have to lead the way towards the use of open data rather than proprietary databases.

What effects do local Monitors have on research support activities? As reliable indicators, they allow libraries to better understand open science practices amongst scientific communities and to provide tailor-made support.

How should libraries communicate with research communities about these indicators without appearing to evaluate research units? To support cultural change, libraries should proceed with pedagogy and diplomacy. What is at stake is a new approach to the necessary dialogue between librarians and researchers.

Lastly, how can we build a national network of local Monitors without fostering competition between universities? One answer would be to avoid building an open science ranking as harmful as some current rankings.

4.3) Fostering co-responsibility for open metadata quality to evaluate and monitor Open Science

Presenters: Alicia Fátima Gómez, IE University, Spain and Cristina Huidiu, Wageningen University & Research, The Netherlands

Open Science (OS) has undergone a major evolution in recent years, overlapping in time with the development of responsible metrics (RMs). Open Science has established itself as an essential paradigm for the advancement of knowledge, promoting transparency, collaboration, and accessibility in research. RMs likewise foster principles of openness, transparency, fairness, diversity, and equality. In brief, both seek greater integrity in research, based on transparent and rigorous data. As the Open Science movement continues to reshape scholarly communication and bibliometrics, the accurate measurement and monitoring of metadata quality becomes pivotal.

Following the COARA recommendations, the evaluation and monitoring of OS depends heavily on the ability to measure the impact and relevance of research, considering a wide range of outputs and activities. Similarly, the foundation of any responsible metrics system lies in the quality of the metadata associated with the underlying datasets and, more importantly, in comprehensive data that are not limited to specific journals biased by language, discipline, or novelty. In this context, the breadth of data collection and the quality of its metadata emerge as critical components enabling responsible metrics that drive effective evaluation and monitoring of OS.

This presentation seeks to explore how data harvesting and metadata curation contribute to building a more robust and ethical scientific environment, and aims to dissect the relationship between metadata quality, transparency, and the efficacy of open metrics in influencing research evaluation and monitoring practices (which are not the same) within Open Science.

As more and more organizations look to switch to open metadata sources for research intelligence, we explore ways in which the different actors can work together to ensure the needed growth in the data quality of open metadata sources. Focusing on the nuanced aspects of metadata inconsistencies (completeness and correctness of data), the presentation will scrutinize the impact of these gaps on the reliability of open metrics and their overall effectiveness in reshaping research evaluation. It will highlight challenges and propose strategies for improving metadata quality, promoting transparency, and enhancing the overall robustness of open metadata. By sharing best practices and lessons learned, the session aims to inspire actionable steps for stakeholders to integrate into their research evaluation workflows.
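To make the notion of completeness concrete, the following sketch scores a set of harvested bibliographic records by how many core metadata fields are present and non-empty. The field list and sample records are illustrative assumptions, not the schema of any particular open metadata source:

# Minimal sketch of a metadata completeness check; the required fields
# and sample records are illustrative assumptions.
REQUIRED_FIELDS = ["doi", "title", "publication_year", "authors", "abstract", "funder"]

def completeness(record: dict) -> float:
    """Share of required fields that are present and non-empty."""
    filled = sum(1 for field in REQUIRED_FIELDS if record.get(field))
    return filled / len(REQUIRED_FIELDS)

records = [
    {"doi": "10.1234/example-a", "title": "A study", "publication_year": 2023,
     "authors": ["A. Author"], "abstract": "", "funder": None},
    {"doi": "10.5678/example-b", "title": "Another study", "publication_year": 2022,
     "authors": ["B. Author"], "abstract": "Text", "funder": "ERC"},
]

for record in records:
    print(record["doi"], f"{completeness(record):.0%} complete")

average = sum(completeness(r) for r in records) / len(records)
print(f"Average completeness across the sample: {average:.0%}")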

The session will conclude with a call to action, encouraging participants to actively contribute to the ongoing conversation around research metadata quality and transparency. Embracing a collaborative mindset, attendees will be invited to join efforts in reshaping the future of open metrics and fostering a more reliable and transparent research evaluation landscape based on open metadata. By addressing these issues with a collective focus on open metadata quality, we can pave the way for a robust, trustworthy, and effective open infrastructure.
