Parallel Session 14 – Shifts in Research Assessment
Date: Friday, 3 July 2026, from 09:00 to 10:30
Moderator: TBC
Location: R7
14.1) Reshaping Research Assessment: Monitoring and Rewarding Openness
Presenter: Evgenios Vlachos, University of Southern Denmark, Denmark
Research assessment reform is gaining momentum across Europe, guided by principles such as those promoted by the Coalition for Advancing Research Assessment (CoARA). Universities are seeking models that recognise diverse research contributions, promote transparency, and reduce dependence on journal and publication metrics. Since 2024, the University of Southern Denmark (SDU) has implemented a comprehensive process that links institutional incentives to the systematic monitoring of openness. We will present the SDU Open Science Champion Awards, which are granted annually to the most open research unit within each faculty, as a case study of how universities can begin reshaping their research assessment practices by cultivating cultural change through the monitoring and rewarding of openness.
The SDU Open Science Champion Awards recognise research units demonstrating strong performance in open access publishing, sharing FAIR datasets, and actively communicating research to society. The initiative encourages research units to make their outputs as openly available as possible, and to ensure that all publications, datasets, and outreach activities are registered in SDU’s research information management system (RIMS), where the OADO (Open Access, Data and Outreach) indicator is calculated. This creates a clear incentive to practice open science while improving the completeness and quality of research metadata. The OADO indicator aligns with the CoARA principles by recognising varied research outputs and practices, supporting qualitative assessment with responsible quantitative indicators, and rewarding behaviours that contribute to a transparent and engaged scholarly environment. Importantly, the OADO indicator helps assess research units on metrics they can directly influence, empowering them to improve openness through their own practices. To ensure fair comparisons across disciplines, we applied a weighted version, the weighted-OADO, which benchmarks research units against disciplinary peers. This avoids inappropriate cross-field comparisons, provides actionable insights for improvement, and encourages consistency in open science practices across faculties and departments.
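The benchmarking logic described above can be illustrated with a short sketch. The field names, the equal weighting of the three components, and the peer-mean normalisation are illustrative assumptions, not SDU's actual implementation of the OADO or weighted-OADO indicator:

```python
# Hypothetical sketch of a discipline-weighted openness indicator,
# loosely modelled on the weighted-OADO idea. All field names and
# weights are illustrative assumptions.

def oado_score(unit):
    """Average the unit's open access, FAIR data, and outreach shares (each 0-1)."""
    return (unit["open_access_rate"]
            + unit["fair_data_rate"]
            + unit["outreach_rate"]) / 3

def weighted_oado(unit, peers):
    """Benchmark a unit's score against the mean score of its disciplinary peers."""
    peer_mean = sum(oado_score(p) for p in peers) / len(peers)
    return oado_score(unit) / peer_mean if peer_mean else 0.0

unit = {"open_access_rate": 0.9, "fair_data_rate": 0.6, "outreach_rate": 0.3}
peers = [
    {"open_access_rate": 0.7, "fair_data_rate": 0.4, "outreach_rate": 0.2},
    {"open_access_rate": 0.8, "fair_data_rate": 0.5, "outreach_rate": 0.3},
]
# A value above 1.0 means the unit outperforms its disciplinary peers.
print(round(weighted_oado(unit, peers), 2))
```

Normalising against disciplinary peers, rather than against a single university-wide baseline, is what avoids the inappropriate cross-field comparisons the abstract warns about.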
We hope the combined effect of the OADO indicator and the Open Science Champion Awards will lead to a measurable increase in the awareness and practice of open science, greater registration of outputs in RIMS, and visible recognition of research units. By linking recognition to behaviours researchers can control, universities can establish a transparent connection between open practices and institutional reward.
This presentation will outline the methodology behind the OADO indicator, the award selection workflow, and the observed outcomes. It will also invite discussion on how award-linked openness indicators can support CoARA-aligned reform, foster cultural change, and contribute to broader shifts in research assessment across Europe.
14.2) Let’s not Recreate Rankings: The Principles of Open Science Monitoring
Presenter: Laetitia Bracco, University of Lorraine, France
In May 2023, the G7 Science and Technology Ministers emphasized the need for a shared framework for monitoring Open Science. While Open Science policies and practices have seen growing international support, monitoring efforts remain fragmented across national and institutional levels. Various dashboards and tools – such as the French Open Science Monitor, Germany’s Open Access Monitor, the COKI Dashboard in Australia, and the Open Access Monitor in Korea – showcase both innovative thinking and varied approaches, but lack a unified global framework.
Despite several important guidelines, such as the UNESCO Recommendation on Open Science, the Principles of Open Scholarly Infrastructure (POSI), and the recent PathOS Open Science Indicator Handbook, no shared, global framework for open science monitoring existed until recently. The Open Science Monitoring Initiative (OSMI, https://open-science-monitoring.org/) aims to fill that gap by offering principles, guidance and community engagement that support comparability, interoperability, and responsible reuse of monitoring indicators through common guidelines.
To co-create and refine the first draft, designed by a group of French experts, an international workshop was held at UNESCO in December 2023, gathering over 50 experts from a wide range of institutions, including CERN, NASA and CWTS. Following intensive collaborative drafting, a global consultation process amplified by UNESCO was a success, with an open call for participation that received contributions from more than 150 people across 40 countries. This first collaborative effort was the spark for the creation of OSMI, an initiative that does not aim to create monitoring dashboards itself, but rather to bring together various stakeholders to encourage the sharing of best practices, in particular through its working groups, which have a total of nearly 200 participants.
OSMI’s Principles of Open Science Monitoring (https://open-science-monitoring.org/principles/) were published on 7 July 2025 at a conference hosted by UNESCO and, by November 2025, had already been downloaded more than 6,000 times. An important aim is not to encourage the creation of new university rankings based on open science, but rather to promote best research practices and capture the nuances in how open science is monitored globally.
The Principles are organised around three core pillars:
- Relevance and significance: all open science monitoring initiatives should be well-defined, relevant, and adaptable to diverse research contexts. They should support evidence-based policies and decisions, be developed through inclusive and participatory collaborative processes, and reflect the diversity of disciplines and stakeholders. Ensuring modularity, transparency, and consistency allows for reliable assessment while accommodating different needs and practices.
- Transparency and reproducibility: open science monitoring should, wherever possible, prioritize the use of open, transparent, and reproducible information, including metadata. It should further draw on infrastructures and methodologies that adhere to shared, agreed-upon principles and rely on publicly accessible data sources.
- Self-assessment and responsible use: open science monitoring initiatives should aim for continuous improvement through regular self-assessments and alignment with these Principles of Open Science Monitoring. Importantly, open science monitoring should be used to understand and incentivise open science practices. It should not be used in isolation to evaluate individual researchers but instead as part of a multifaceted approach to assist institutions, stakeholders, academic and non-academic communities in understanding and improving their research practices.
The Principles provide a solid and international framework for developing or refining approaches to monitoring open science, at a time when this activity is becoming increasingly common in university libraries.
This presentation will introduce the Principles, discuss some initial practical applications of their content by libraries around the world, present the working groups’ current outputs and activities, and outline the outlook for the coming years.
14.3) HAL as an Open Infrastructure for Automated Open-Science Monitoring: Insights from Université Paris-Saclay’s BiSO
Presenters: Hélène Bégnis, CCSD/CNRS, France; Delphine Le Piolet and Henri Bretel, University of Paris-Saclay, France
In a context marked by the evolution of research evaluation frameworks—including the integration of artificial intelligence, new bibliometric indicators, technological challenges (such as interoperability and data preservation), and geopolitical tensions (access restrictions, scientific sovereignty)—academic libraries require certified, interoperable, and sustainable infrastructures.
HAL (https://hal.science), the French multidisciplinary open archive, stands out as a unique model in Europe for addressing these challenges. Certified with the Core Trust Seal and compliant with the POSI principles, HAL does more than provide access to over 1.5 million full-text documents: it equips libraries with practical tools to implement Open Science mandates covering publications, datasets, and software.
As a national research infrastructure (HAL+ label), HAL plays a central role in France’s Open Science policy. Its governance, overseen by the CCSD (a consortium including CNRS, INRIA, and INRAE) in collaboration with the Ministry of Higher Education, national funding agencies, and over 150 academic institutions, ensures its longevity and sustainability.
HAL promotes the dissemination and reuse of publication metadata through APIs and the use of FAIR standards and norms, enhancing its interoperability and integration into international research ecosystems.
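As a sketch of the kind of metadata reuse this enables, the snippet below builds a query URL for HAL's public search API (https://api.archives-ouvertes.fr/search/). The Solr-style parameter and field names shown here (such as `structId_i` and `producedDateY_i`) are common in HAL queries but should be checked against the current API documentation; the structure identifier is a placeholder:

```python
# Illustrative sketch: building a query against HAL's public search API.
# Field names follow HAL's Solr-style conventions; verify them against
# the current API documentation before relying on them.
from urllib.parse import urlencode

HAL_SEARCH = "https://api.archives-ouvertes.fr/search/"

def hal_query(struct_id, year, rows=100):
    """Return a URL listing publications for one structure and one year."""
    params = {
        "q": f"structId_i:{struct_id}",        # filter by research structure
        "fq": f"producedDateY_i:{year}",       # filter by production year
        "fl": "halId_s,title_s,doiId_s",       # fields to return
        "rows": rows,
        "wt": "json",                          # JSON response format
    }
    return HAL_SEARCH + "?" + urlencode(params)

url = hal_query(struct_id=123456, year=2025)  # 123456 is a placeholder ID
print(url)
```

Because the API is open and requires no authentication, libraries can script such queries to feed local monitoring tools directly from HAL metadata.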
Libraries play a pivotal role in this ecosystem, mobilizing over 1,000 librarians to ensure the quality of metadata, thereby aligning HAL with national policies and local research needs.
Université Paris-Saclay, one of Europe’s largest research-intensive universities (with 230 research units, over 8,000 researchers, and 13,000 scientific publications annually), has been fully committed to Open Science for several years (DORA, Barcelona Declaration, etc.). Its Open Science strategy is formalized in a framework document and supported by the Open Science Support Service within the Department of Libraries, Information, and Open Science (DiBISO, https://www.universite-paris-saclay.fr/recherche/science-ouverte). This service operates through the Open Science Research Referents Network (3RSO), a group of librarians who promote Open Science.
Within this context, DiBISO developed the BiSO (Bilan Science Ouverte), an annual report that provides each laboratory with an overview of its Open Access scientific output based on data from HAL. Designed as an assessment tool, this report adapts to the scale of laboratories and can be generated for any corpus of publications deposited in HAL. Its added value lies in its automated yet customizable nature.
The first version features 12 metrics and visualizations, including the open access rates of articles and conference proceedings, information on the economic models of journals, and maps and lists of international collaborations. These are supplemented by a customized overview of Open Science initiatives undertaken by each laboratory, along with recommendations for improvement.
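One of these metrics, the open access rate per document type, can be sketched as follows. The record fields (`doc_type`, `open_access`) and document-type codes are illustrative assumptions; the actual implementation is in the open-source repository cited below:

```python
# Minimal sketch of one BiSO-style metric: open access rate per document
# type, computed from HAL-like records. Field names and type codes
# ("ART" for articles, "COMM" for conference papers) are illustrative.
from collections import defaultdict

def open_access_rates(records):
    """Return the share of open-access items for each document type."""
    totals, open_counts = defaultdict(int), defaultdict(int)
    for rec in records:
        doc_type = rec["doc_type"]
        totals[doc_type] += 1
        if rec["open_access"]:
            open_counts[doc_type] += 1
    return {t: open_counts[t] / totals[t] for t in totals}

records = [
    {"doc_type": "ART", "open_access": True},
    {"doc_type": "ART", "open_access": False},
    {"doc_type": "COMM", "open_access": True},
]
print(open_access_rates(records))  # {'ART': 0.5, 'COMM': 1.0}
```

Aggregating per document type, rather than over the whole corpus, lets a laboratory see at a glance where its open access practice is strongest.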
The code (https://github.com/dibiso-upsaclay/dibisoreporting) is fully open-source and uses HAL data enriched with information from open databases such as OpenAlex (https://openalex.org/). An open access preprint (https://universite-paris-saclay.hal.science/hal-05336463) details the technical choices that led to this automated and reproducible tool.
BiSO operates through the collaboration of researchers who deposit their work in HAL and the 3RSO referents who consolidate and enhance the data to produce a comprehensive and qualitative report. The benefits are immediate: laboratories receive rich, personalized reports that help them support Open Science dynamics, develop an open archiving culture, and highlight their Open Science strengths and actions. The 3RSO referents save time and strengthen their ties with their communities.
In this presentation, we will focus on the methodology, the selected metrics, and the data needed to produce them. We will explore how these components enhance the monitoring, valorization, and institutional support of Open Science, thereby advancing global knowledge accessibility.