Session 13: Measuring Impact: Research Assessment
Friday 28th June 2019 – 9:00-10:30
Chair: Martin Moyle, UCL Library Services, London, United Kingdom
13.1 RI2A – Towards a Responsible Institute Impact Assessment
Gustaf Nelhans, University of Borås, Sweden; Evgenios Vlachos and Maéva Vignes, University of Southern Denmark, Denmark
Recently, there has been a shift in attitudes against the improper use of metrics to evaluate research. In our effort to develop alternative approaches for identifying relevant measurements of research impact at the institute level – beyond simple frequency counts and rankings based on single indicators – we matched the publications of a targeted institute to the Web of Science (WoS) database and used machine learning techniques and visualization tools to explore and showcase the results.
We present the Responsible Institute Impact Assessment (RI2A) for evaluating the research impact of a targeted institute through a case study. We used the 448 publications retrieved from the University of Southern Denmark's (SDU) research registration database for the Department of Marketing and Management for 2012–2017. Of these, 170 publications satisfied the criteria of being both peer-reviewed and having a DOI that could be matched in the WoS database. We then used the bibliographic coupling algorithm to cluster the articles that cited the work produced at the targeted institute. This algorithm groups items together based on the number of references shared between the identified units of analysis (article, source journal and institution level). Lastly, a co-word analysis was used to identify pair-wise relationships between keywords found in the citing articles.
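The core of bibliographic coupling – counting shared references between pairs of citing articles – can be sketched as follows. This is an illustrative example, not the authors' actual pipeline, and the article identifiers and reference DOIs are hypothetical:

```python
from itertools import combinations

# Hypothetical citing articles mapped to the sets of references they cite.
citing_refs = {
    "articleA": {"10.1/x", "10.1/y", "10.1/z"},
    "articleB": {"10.1/y", "10.1/z", "10.1/w"},
    "articleC": {"10.1/w"},
}

def coupling_strength(refs):
    """Shared-reference count for every pair of citing articles."""
    strengths = {}
    for (a, ra), (b, rb) in combinations(refs.items(), 2):
        shared = len(ra & rb)  # set intersection = shared references
        if shared:
            strengths[(a, b)] = shared
    return strengths

print(coupling_strength(citing_refs))
```

In practice these pairwise strengths would feed a clustering step that groups strongly coupled articles together.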
A total of 1195 citing articles, excluding self-citations, were identified. The results of RI2A assist the targeted institute by allowing it to discover
- the researchers who cite their publications and the relationships among them,
- the journals that cite their publications and the relationships among them,
- the universities, institutes and organizations that use their publications and the relationships among them, and
- the main groups of keywords used in the citing articles.
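The last of these builds on the co-word analysis, which counts how often pairs of keywords appear together in the same citing article. A minimal sketch, with hypothetical keywords rather than the study's actual data:

```python
from collections import Counter
from itertools import combinations

# Hypothetical author keywords from three citing articles.
keywords_per_article = [
    ["innovation", "strategy", "design"],
    ["strategy", "design"],
    ["innovation", "strategy"],
]

# Co-word analysis: count how often each pair of keywords co-occurs
# within the same citing article.
pair_counts = Counter()
for kws in keywords_per_article:
    for pair in combinations(sorted(set(kws)), 2):
        pair_counts[pair] += 1

print(dict(pair_counts))
```

The resulting pair frequencies can then be visualized as a keyword network, where heavily co-occurring pairs form the main keyword groups.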
Visualizing these results as graphs and making sense of them requires more work than simple publication/citation counts. Although not described in detail here, a collaboration between the SDU library and the university's research support and policy services has begun, in which it has been proposed that evaluation should be based on joint work between evaluators and evaluatees, focusing on strengths and weaknesses as well as comparison with previous assessments over time. A straightforward mode would be to compare the results of the mapping exercise with an already known description of the evaluated department's profile. For instance, by overlaying the targeted department's research groups on the titles of the citing journals, we discovered that the "Strategic Organisation Design" group is cited frequently in the "Organisation Studies", "Journal of Management Studies", "Strategic Management Journal" and "Human Relations" journals, and that these journals form a distinct cluster based on shared citing practices.
Our approach responds to the conditions of keeping the process relatively simple and short for use in the library setting, yet meaningful for a combined quantitative/qualitative evaluation for both management and faculty whose research is affected by the evaluation procedure. Apart from showcasing the academic impact of an institute, RI2A also provides an opportunity to explore remote connections that otherwise might go unnoticed.
Dr. Gustaf Nelhans is Senior Lecturer at the Swedish School of Library and Information Science (SSLIS) at the University of Borås, Sweden, and was previously a Visiting Fellow at the University of Southern Denmark Library in 2018. His research generally focuses on the performativity of scientometric indicators as well as on the theory, methodology and research policy aspects of scholarly publication in scientific practice, using a science and technology studies (STS) perspective. Presently his interest is directed towards the evaluation of societal relevance such as professional impact, i.e., citation performance in clinical guidelines. He is WP leader within the Horizon 2020 project Data for Impact (2017-2019, Grant agreement ID: 770531) and a representative in the Swedish National Libraries Coordination work for Open Access to Research Publication in the expert group "The current merit and resource allocation system versus incentives for open access".
13.2 How does Our Research Influence Policy on Global Societal Changes? A Bibliometric Proof of Concept Targeting the Sustainable Development Goals of the United Nations
Maurice Vanderfeesten, René Otten, Joeri Both, Vrije Universiteit Amsterdam, The Netherlands; Felix Schmidt, Eike Spielberg, Universität Duisburg-Essen, Germany; Lars Kullman, University of Gothenburg, Sweden; Jaqui Farar, University of East Anglia, United Kingdom
University leaders asked the library for new ways to measure societal impact and the university’s connectivity to society.
In this project we created a proof of concept for analysing the research quality and policy impact related to each of the 17 Sustainable Development Goals (SDGs), which the United Nations has set as challenges for the world. We have developed a tool that gives insight into the university's research performance, the excellence of that performance, the extent to which that research is freely accessible to society and, most importantly, the extent to which it is adopted by (non-)governmental policy.
We present all this information in an interactive dashboard, which allows users to arrange the data from different perspectives. It allows university leaders to see the unique societal profile of their research, but also helps to develop new research strategies based on the societal narrative.
With a team of bibliometricians from nine universities in the AURORA network, we created and reviewed 17 queries – one for each SDG – based on the UN policy text and indicators for each global goal. We collected the publications using Scopus, and used SciVal to identify publications in the top 10% of most cited journals. Open Access data was harvested from Unpaywall/Impactstory, and policy mentions from Altmetric. Initially we used a manual workflow to track the entire process, but we have since developed an automated workflow, which allows for rapid evaluation of other societally themed queries.
The dashboard generates unique insights, distributed in particular across two quadrants: "opportunities" and "strong SDGs". The first quadrant shows above-average research excellence (horizontal axis) combined with lagging citations in policy documents (vertical axis). The strong SDGs quadrant represents SDGs where both the research excellence and the policy citations are greater than average. For the "opportunities" we discovered that although 58% of the "Climate Action" research (SDG 13) was published in the top 10% of most cited journals, only 8% of that research was used in policy by (non-)governmental organisations. For "Good Health and Well-Being" 43% of the papers fell in the top 10%, and 19% were cited in policy documents. The "opportunities" quadrant thus represents excellent research that is still left largely unused by societal policy partners. Shifting the data perspective, we can see which universities have the most policy influence and, within a group of universities like AURORA, start a conversation on how to use the network to reach policymakers more effectively.
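The quadrant logic can be sketched as a simple classification on the two axes. This is an illustrative sketch, not the dashboard's actual code: the percentages for the two SDGs come from the abstract, while the network-wide averages used as thresholds are hypothetical values chosen for the example.

```python
# Hypothetical network-wide averages used as the quadrant axis thresholds.
AVG_EXCELLENCE = 30  # assumed average share of top-10% publications (%)
AVG_POLICY = 25      # assumed average share of policy-cited research (%)

# Figures for these two SDGs are taken from the abstract.
sdg_scores = {
    "SDG 13 Climate Action": {"excellence": 58, "policy": 8},
    "SDG 3 Good Health and Well-Being": {"excellence": 43, "policy": 19},
}

def quadrant(score):
    """Place an SDG in a quadrant by comparing both axes to the average."""
    excellent = score["excellence"] > AVG_EXCELLENCE
    policy_strong = score["policy"] > AVG_POLICY
    if excellent and policy_strong:
        return "strong SDG"
    if excellent:
        return "opportunity"  # excellent research, lagging policy uptake
    return "other"

for name, score in sdg_scores.items():
    print(name, "->", quadrant(score))
```

With these assumed thresholds, both SDGs land in the "opportunities" quadrant: strong research excellence, but below-average uptake in policy documents.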
The challenge for us now is to make the tool robust enough to support strategic decision-making, by increasing the recall and precision of the queries underlying the data collection.
See the interactive dashboard at https://aurora-network.global/project/sdg-analysis-bibliometrics-relevance/
Maurice Vanderfeesten believes in a multi-disciplinary approach and is responsible for co-creating library services for Open Science, Scholarly Communication Workflows and Research Intelligence.
Within the Aurora-network, a network of nine universities with a similar societal mission, he developed a dashboard to provide university leaders with insight into questions such as:
– What research output do we produce on global societal topics?
– What is the excellence of that research?
– How much of that research is freely available to the public?
– How much of that research is used in policy documents from NGOs and governmental bodies?
Maurice studied Information Sciences at Utrecht University, worked at SURF, a cooperation of Dutch universities for IT-innovation, on scholarly information infrastructures, Open Access repositories and enhanced publications, and worked at TU Delft on research data management.
13.3 Beyond Authorship, Recognising Contributions: the Value of CRediT (Contributor Roles Taxonomy)
Liz Allen, F1000, United Kingdom
Original research papers with a small number of authors, particularly in the life sciences, are increasingly rare. Research funders and institutions are today seeking ways to easily recognise and value the diverse contributions that researchers and teams make to research outputs – beyond designation as an author. A number of initiatives are endorsing the shift from static concepts of 'authorship' to more dynamic and holistic concepts of contribution (e.g. the UK Academy of Medical Sciences (AMS) Team Science initiative; the San Francisco Declaration on Research Assessment (DORA)).
The Contributor Roles Taxonomy standard (CRediT) was developed by a cross-sector collaboration involving medical journal editors, researchers, research institutions, funding agencies, publishers, libraries and learned societies. The 14-role taxonomy, going well beyond the concept of 'authorship', includes roles such as data curation, development of methodology, software development, and data visualization.
Following its implementation across all PLOS journals in 2016, CRediT has now been implemented across over 100 journals and publishing outlets, and interest continues to increase. Today CRediT is used by scholarly publishers during the article submission process to capture an 'author's' specific contributions in a structured format and include this within an article's metadata.
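The idea of capturing contributions in structured metadata can be illustrated as follows. This is a simplified, hypothetical representation – the field names are invented for the example and do not reflect any specific publisher's schema – but the role labels are drawn from the CRediT taxonomy:

```python
# Hypothetical article metadata record with CRediT roles per contributor.
# The dictionary keys are illustrative, not a real publisher schema.
article_metadata = {
    "title": "Example article",
    "contributors": [
        {
            "name": "A. Researcher",
            "credit_roles": ["Conceptualization", "Methodology",
                             "Writing - original draft"],
        },
        {
            "name": "B. Analyst",
            "credit_roles": ["Data curation", "Formal analysis",
                             "Software", "Visualization"],
        },
    ],
}

# Structured roles make contributions queryable, e.g. "who curated the data?"
data_people = [c["name"] for c in article_metadata["contributors"]
               if "Data curation" in c["credit_roles"]]
print(data_people)
```

Because the roles are machine-readable rather than buried in a free-text acknowledgements section, downstream systems can aggregate and track specific contributions, which underpins several of the benefits listed below.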
Part of a more general move to support and incentivise a more open and collaborative approach to scholarly research, CRediT was designed to be practical and easy to use, while aiming to deliver a range of benefits including:
- Providing visibility and recognition of the different contributions of researchers, particularly in multi-authored works – across all aspects of the research being reported (including data curation, statistical analysis, etc.)
- Helping to reduce the potential for author disputes
- Supporting adherence to authorship/contributorship processes and policies
- Supporting the identification of peer reviewers and specific expertise
- Supporting grant making by enabling funders to more easily identify those responsible for specific research products, developments or breakthroughs
- Improving the ability to track the outputs and contributions of individual research specialists and grant recipients
In this talk, Liz Allen will explain the rationale for and the rapidly evolving value of CRediT across the scholarly ecosystem, while considering some of the challenges, opportunities and the future roadmap.
Liz Allen is Director of Strategic Initiatives at F1000 and involved in shaping new initiatives and partnerships to promote and foster open research. Prior to joining F1000 in 2015, Liz spent over a decade leading the Evaluation Team at the Wellcome Trust. In 2015 Liz became a Visiting Senior Research Fellow in the Policy Institute at King’s College London, with a particular interest in science policy research, scholarly publishing infrastructure, impact assessment and the development of science-related indicators.
In 2017 Liz was elected to serve as a Board Director of Crossref, is co-Chair of the CASRAI CRediT Programme Committee (leading the development of CRediT (Contributor Roles Taxonomy – http://www.casrai.org/CRediT)) and serves on the Advisory Board for the Software Sustainability Institute. Liz served as a Board Director of ORCID from 2010 until 2015. During 2014-15 Liz was an adviser on the UK government commissioned Independent review of the role of research metrics in research assessment https://www.hefce.ac.uk/rsrch/metrics/.
Liz Allen – http://orcid.org/0000-0002-9298-3168
13.4 Research Libraries: an Incubator for Science Communication, Public Engagement and Literacy Skills
Heather Cunningham, University of Toronto Libraries, Canada
Research libraries must play an increasingly important role in society. In an era where questionable and unreliable sources of information abound and quickly proliferate, libraries are more important than ever in helping users navigate the tidal wave of information from social media, news sites and other resources, and in making sense of a complicated and polarized landscape. When science literacy skills, defined as the knowledge and understanding of scientific concepts required for personal decision-making and participation in civic and cultural affairs, are combined with information literacy skills, one is empowered to critically interpret and verify science as presented in the media. Science festivals and science engagement events in conjunction with research libraries provide ideal opportunities to combine these two literacy skill sets.
A case study will be presented of how the University of Toronto Libraries, Canada's largest research library, integrated multiple literacies into science outreach events designed to engage a diverse community. The public, in its various forms, attends science festivals to engage with scientific experts and to participate in a range of scientific activities. The talk will discuss how programming at science outreach events held within the library can serve as a catalyst for deeper learning about information creation, authority and dissemination. For example, students and members of the public took a deep dive into the construction and context of authority by participating in Wikipedia edit-a-thons and fake news workshops. By bringing experts out from behind classroom "paywalls" via public lectures and human library events, students and citizens can informally converse and engage in debate with scholars.
The backdrop for this case study is the annual Science Literacy Week (SLW) as well as the science communication programming which has stemmed from it. SLW began as a grassroots event in 2014 at the University of Toronto and has since grown into an annual federally funded week-long celebration of science, which includes over 800 events put on by over 200 partners in 100 cities across Canada. Many of the 44 libraries at the University of Toronto, including non-science libraries, participate in planning SLW public engagement activities. Since the theme of SLW changes annually, libraries can capitalise on timely trends and issues such as the concept of post-truth and the notion of information literacy as a social practice. The success of SLW spawned a science engagement portfolio of events held throughout the year within the library. This case study will also discuss the challenges as well as the opportunities of situating a research library as a social space for public engagement. Libraries, with their long history of providing democratic access to information, provide a natural setting for contemporary public engagement, debate and science communication.
Heather Cunningham is the Assistant Director for Research & Innovation Services at the Gerstein Science Information Centre, the science and medical library at the University of Toronto. She has a Master of Science in genetics as well as her MLIS from McGill University in Montreal, Canada. She has been a professional librarian for over twenty years at the University of Toronto. She oversees the knowledge synthesis service as well as the entrepreneurship portfolios at her library. She was instrumental in the development of the highly successful science outreach and communication programming. She has presented on a wide range of topics including science and public engagement, research libraries as space, web development, research metrics and green strategies for academic libraries. She has been on several medical research teams in the role of information specialist. Heather also has a long teaching portfolio from her liaison roles with the Faculty of Medicine, Faculty of Arts & Sciences, and the Centre for Environment at the University of Toronto.