Research Impact Metrics
A guide for those wanting to use research impact metrics for evaluation, analytics, and reviews, e.g., promotion & tenure.
- Home
- Publication Counts
- Journal Metrics
- Author Metrics & h-index
- Bibliometrics / Citations
- Altmetrics
- Usage Statistics
- Responsible Research Assessment
- Qualitative Evaluation
- Researcher Profiles
- Individual Impact & Engagement
- Research Impact & Intelligence Department
Responsible Research Assessment Overview
- Principles of Responsible Use
- Responsible Use of Data
- Review, Promotion, & Tenure
- Models & Processes for Evaluating
- Research Assessment Reform
- Continuing Education
- Bibliometrics: The Leiden Manifesto for research metrics. A set of ten principles for best practice in metrics-based research assessment, so that researchers can hold evaluators to account, and evaluators can hold their indicators to account.
- San Francisco Declaration on Research Assessment (DORA). The Declaration on Research Assessment (DORA) recognizes the need to improve the ways in which the outputs of scholarly research are evaluated.
- Harnessing the Metric Tide: indicators, infrastructures & priorities for UK responsible research assessment. This review was commissioned by the joint UK higher education (HE) funding bodies as part of the Future Research Assessment Programme (FRAP). It revisits the findings of the 2015 review The Metric Tide to take a fresh look at the use of indicators in research management and assessment.
- The Hong Kong Principles for assessing researchers: Fostering research integrity. From the abstract: For knowledge to benefit research and society, it must be trustworthy. Trustworthy research is robust, rigorous, and transparent at all stages of design, execution, and reporting. Assessment of researchers still rarely includes considerations related to trustworthiness, rigor, and transparency. We have developed the Hong Kong Principles (HKPs) as part of the 6th World Conference on Research Integrity with a specific focus on the need to drive research improvement through ensuring that researchers are explicitly recognized and rewarded for behaviors that strengthen research integrity. We present five principles: responsible research practices; transparent reporting; open science (open research); valuing a diversity of types of research; and recognizing all contributions to research and scholarly activity. For each principle, we provide a rationale for its inclusion and provide examples where these principles are already being adopted.
Image credit: David Parkins (from the Leiden Manifesto article), used under the fair use provision of U.S. copyright law.
- Using InCites responsibly: a guide to interpretation and good practice. This guide has been created by bibliometric practitioners to support other users of InCites and promote a community of informed and responsible use.
- Using SciVal responsibly: a guide to interpretation and good practice. This guide is designed to help those who use SciVal to source and apply bibliometrics in academic institutions.
- Using Altmetric Data Responsibly: A Guide to Interpretation and Good Practice. This guide focuses specifically on data from the provider and company Altmetric, though other altmetrics sources (such as the Open Syllabus database, used to gauge educational engagement with scholarly outputs) are occasionally used for comparison. It opens with an introduction, followed by an overview of Altmetric and the Altmetric Attention Score, altmetrics and responsible research assessment, the output types Altmetric tracks, and Altmetric's sources of attention: news and mainstream media; social media (X (formerly Twitter), Facebook, Reddit, and historical data from Google+, Pinterest, LinkedIn, and Sina Weibo); patents; peer review; syllabi (historical data only); multimedia; public policy documents; Wikipedia; research highlights; reference managers; and blogs. It closes with a conclusion, a list of related resources and readings, two appendices, and references. The guide is intended for librarians, practitioners, funders, and other users of Altmetric data, and for anyone interested in incorporating altmetrics into their bibliometric practice or research analytics. It can also help researchers preparing for annual evaluations and promotion and tenure reviews apply the data in informed, practical ways, and it can serve as a reference for research managers and university administrators who want to understand broader online engagement with research publications beyond traditional scholarly citations (bibliometrics) while avoiding misusing or misinterpreting Altmetric data when making decisions, creating policies, and evaluating faculty members and researchers at their institutions.
- Responsible metrics: One size doesn't fit all. Blog post by Ludo Waltman that differentiates between micro-level and macro-level research evaluation (i.e., between the evaluation of institutions and the evaluation of individuals and research groups).
- Metrics Toolkit. A resource for researchers and evaluators that provides guidance for demonstrating and evaluating claims of research impact. With the Toolkit you can quickly understand what a metric means, how it is calculated, and whether it is a good match for your impact question. (A brief sketch of one such calculation, the h-index, follows this list.)
- Developing Metrics Literacies: Competencies, dispositions, and knowledge for the critical assessment and ethical use of scholarly metrics [blog post]. The authors discuss their work on the Metrics Literacies Project, which is informed by the concept of 'an integrated set of competencies, dispositions and knowledge that empowers individuals to recognize, interpret, critically assess and effectively and ethically use scholarly metrics,' and guided by the overarching question, 'How can the understanding and use of scholarly metrics in academia be improved?'
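As a minimal illustration of the kind of calculation that resources like the Metrics Toolkit document, here is a short Python sketch of the h-index, one of the author metrics covered elsewhere in this guide. The definition is standard (the largest h such that an author has at least h papers with h or more citations each), but the function itself is an illustrative sketch, not code from any of the resources above.

```python
def h_index(citation_counts):
    """Return the h-index: the largest h such that the author
    has at least h papers with h or more citations each."""
    # Sort citation counts from most-cited to least-cited paper.
    counts = sorted(citation_counts, reverse=True)
    h = 0
    # The i-th most-cited paper (1-indexed) raises h to i
    # only if it has at least i citations.
    for i, citations in enumerate(counts, start=1):
        if citations >= i:
            h = i
        else:
            break
    return h

# Example: five papers cited 10, 8, 5, 4, and 3 times give h = 4,
# since four papers have at least 4 citations each, but there are
# not five papers with at least 5 citations each.
print(h_index([10, 8, 5, 4, 3]))  # prints 4
```

Note that the same author can have different h-index values depending on the citation source (e.g., Web of Science, Scopus, Google Scholar), which is one reason the resources above urge caution about relying on any single number in evaluation.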
Review, Promotion, and Tenure Project
Review, promotion, and tenure (RPT) guidelines play a key role in university workplace advancement, shaping how academic success is defined and viewed. To understand the nature of these powerful documents, our lab conducted a multi-year research project analyzing the contents of more than 850 RPT guidelines from across the US and Canada. Here, we summarize some of the key takeaways from that research, as well as a vision for a more open academic reward system.
- Review, Promotion, & Tenure Project. How can research institutions incentivize openness and accessibility?
- RPT Project Infographics (example below). These infographics break down the results of the studies that analyzed the contents of more than 850 RPT guidelines from across the US and Canada.
Image credit: ScholCommLab, CC BY-NC-SA. The information in the infographic can be found at https://doi.org/10.7554/eLife.42254.
- SCOPE Framework for Research Evaluation. The SCOPE framework for research evaluation is a five-stage model for evaluating responsibly: a practical, step-by-step process designed to help research managers, or anyone involved in conducting research evaluations, plan new evaluations and check existing ones. SCOPE is an acronym: S stands for START with what you value, C for CONTEXT considerations, O for OPTIONS for evaluating, P for PROBE deeply, and E for EVALUATE your evaluation.
- Protocol for Research Assessments in the Netherlands. The Standard Evaluation Protocol (SEP) describes the methods used to assess research conducted at Dutch universities, Netherlands Organisation for Scientific Research (NWO) institutes, and Academy institutes every six years, as well as the aims of such assessments.
- How academia is exploring new approaches for evaluating researchers. An overview of how research assessment reform is being approached across the globe, especially in Europe, and what lessons can be learned from such attempts at reform.
- The Agreement on Reforming Research Assessment. The Agreement, from the Coalition for Advancing Research Assessment (CoARA), sets a shared direction for changes in assessment practices for research, researchers, and research-performing organizations, with the overarching goal of maximizing the quality and impact of research. It includes the principles, commitments, and timeframe for reforms, and lays out the principles for a coalition of organizations willing to work together to implement the changes. Signatories span 52 countries and include dozens of academies, societies, associations, national and regional agencies, not-for-profit organizations, public and private funding organizations, research centers and infrastructures, and universities.
- Rethinking Global University Rankings. To highlight the disconnect between the approaches taken by some global university rankings and community-agreed best practice in responsible research evaluation, the Rankings sub-group of the INORMS Research Evaluation Working Group developed a mechanism for rating global university rankers.
- AGORRA: A global observatory of responsible research assessment. The Research on Research Institute's (RoRI's) AGORRA project aims to generate comparative data, evidence, and analysis to support and accelerate research assessment reform across national assessment systems. Its aims are fourfold:
  - to build and connect flexible research capacity that can be directed towards strengthening the evidence base for research assessment reform;
  - to monitor ongoing developments in national and international frameworks and policies for research assessment, and to support the transfer and scaling of good ideas;
  - to foster and facilitate a culture of analysis and experimentation among research actors engaged in the design, delivery, and evaluation of assessment processes;
  - to build evidence, awareness, and engagement in the possibilities, pitfalls, and choices being made with new methodologies, technologies, and indicators.
- Tools to Advance Research Assessment (TARA). TARA is a project to facilitate the development of new policies and practices for academic career assessment.
- Virginia Tech Faculty Statement on Responsible Use of Research Metrics. This statement was endorsed and approved by the Virginia Tech Faculty Senate on April 21, 2023.
- List of Statements of Responsible Metrics. Lists universities' statements on responsible research metrics, along with case studies from countries and universities that have implemented responsible research assessment.
- 2021 Competency Model for Bibliometric Work. The 2021 bibliometric competencies can help identify skills gaps, support progression through career stages for practitioners in the field of bibliometrics, and prepare job descriptions.
- LIS-Bibliometrics Conferences & Events. A roundup of major conferences and events in bibliometrics.
- Leiden University CWTS course program. The Centre for Science and Technology Studies (CWTS) offers a newly developed course program consisting of three courses targeted at professional users of scientometrics, including librarians, policy officers, and research managers at universities, funding agencies, and other research organizations. For PhD students, CWTS offers the annual CWTS Scientometrics Summer School.
Upcoming Conferences (chronological order)
- Research Analytics Summit 2024. Taking place in Albuquerque, New Mexico, March 11-12, 2024. The summit welcomes research administrators from emerging research institutions, including Historically Black Colleges & Universities, Tribal Colleges & Universities, and Hispanic-Serving Institutions, as well as any other research-active institution, including R1 universities.
- Responsible Research Evaluation Forum. Takes place immediately after the Research Analytics Summit in Albuquerque, March 13-14, 2024. The forum will establish a cohort of about 100 US library-based research evaluators and research management professionals equipped with practical means to design and implement responsible research evaluations in their academic settings. The intended impacts are (1) increased capacity across the country in responsible research evaluation; (2) the establishment of communities of practice; and (3) greater equity and wellbeing for researchers within academic institutions.
- BRIC 2024 (Bibliometrics and Research Impact Community). BRIC is a Canadian community, and BRIC 2024, the 7th BRIC conference, will take place in Western Canada for the first time, June 4-6, 2024. The event starts with a one-day pre-conference followed by a single-track, two-day main event, bringing together librarians, information professionals, expert searchers, and bibliometricians. The in-person event highlights innovative exploration of citations and impact analysis, and also presents research on discovery and expert search.
- 28th International Conference on Science, Technology and Innovation Indicators (STI). The 28th International Conference on Science, Technology and Innovation Indicators will be held September 18-20, 2024, in Berlin. The event seeks to explore the intricate dynamics between concepts of openness and closedness in science, technology, and innovation, emphasizing their impact on research, policy, and practice.
- Nordic Workshop on Bibliometrics (NWB) and Research Policy. (Links to the 2023 website; the 2024 website is not yet live.) NWB2024 is scheduled to take place in Reykjavik, Iceland, in November 2024, jointly hosted by the University of Iceland and the National and University Library of Iceland. Further details will be announced by the organizers at a later date.
- Last Updated: Jan 12, 2024 5:14 PM
- URL: https://guides.lib.vt.edu/research-metrics