Tell Your Story - Impact & Engagement: Responsible Research Assessment
For researchers, scholars, teachers, extension agents, students, and others, this guide highlights options to share your work and get to know others with like interests.
Responsible Research Assessment - A Brief Overview
Resources for Responsible Research Assessment
International Responsible Research Assessment Initiatives
- San Francisco Declaration on Research Assessment (DORA). The first initiative to push for responsible research assessment. It began at the Annual Meeting of the American Society for Cell Biology (ASCB) in San Francisco, CA, on December 16, 2012, in response to growing pressure to rely exclusively on Journal Impact Factors (JIFs) to assess research quality and to inform hiring and funding decisions. DORA focuses mostly on the STEM fields.
- The Metric Tide. The second initiative for responsible research evaluation, published in 2015.
"This report presents the findings and recommendations of the Independent Review of the Role of Metrics in Research Assessment and Management. The review was chaired by Professor James Wilsdon, supported by an independent and multidisciplinary group of experts in scientometrics, research funding, research policy, publishing, university management and administration."
It is highly recommended that you read the Executive Summary and Recommendations on pages vii-xiv.
- Bibliometrics: The Leiden Manifesto for research metrics. The third initiative to urge responsible research evaluation, also published in 2015. It can be applied more broadly to fields outside STEM, and it urges that qualitative evaluation complement quantitative assessment in all cases. Written by leading experts in research evaluation and bibliometrics.
- HuMetricsHSS. An initiative for rethinking humane indicators of excellence in academia, focused particularly on the humanities and social sciences (HSS). Comprising individuals and organizations from the academic, commercial, and non-profit sectors, HuMetricsHSS endeavors to create and support a values-based framework for understanding and evaluating all aspects of the scholarly life well-lived, and for nurturing these values in scholarly practice.
- The Hong Kong Principles for assessing researchers: Fostering research integrity. The authors present five principles: responsible research practices; transparent reporting; open science (open research); valuing a diversity of types of research; and recognizing all contributions to research and scholarly activity. For each principle, they give a rationale for its inclusion and examples of where it is already being adopted.
Readings on the Uses and Abuses of Metrics
- Against metrics: how measuring performance by numbers backfires. A summary and review of the book The Tyranny of Metrics by Jerry Z. Muller.
- The Tyranny of Metrics. Virginia Tech users can read the full eBook by Jerry Z. Muller at this link. Brief summary: how the obsession with quantifying human performance threatens our schools, medical care, businesses, and government.
- The Tyranny of Metrics - Colleges and Universities. Virginia Tech users can read the chapter "Colleges and Universities" from The Tyranny of Metrics at this link.
Excerpt: "Let’s take as our first case study the realm of higher education, the ground zero of my own investigations of metric fixation. Comprising a huge sector of the national economy and a central institution of all advanced societies, colleges and universities exemplify many of the characteristic flaws and unintended consequences of measured performance, as well as some of its advantages. Once we become fixated on measurement, we easily slip into believing that more is better."
- Rethinking impact factors: better ways to judge a journal. "We need a broader, more-transparent suite of metrics to improve science publishing, say Paul Wouters, colleagues and co-signatories."
- Bibliometrics and Research Evaluation: Uses and Abuses. Explains why bibliometrics is useful for understanding the global dynamics of science but generates perverse effects when applied inappropriately in research evaluation and university rankings.
Responsible Research Assessment in Practice
- DORA Resource Library. A collection of materials to facilitate the development of responsible research and researcher assessment policies and practices.
- DORA University Case Studies. Case studies of universities and national consortia that highlight key elements of institutional change to improve academic career assessment.
- Introducing SCOPE – a process for evaluating responsibly. A clear outline of a process for evaluating research responsibly. We have plenty of principles of practice, but actually putting them into practice is a little trickier.
- SCOPE - Short Version (5-minute video). If you prefer a video and haven't got much time, this is for you!
- The evaluative inquiry: a new approach to academic evaluation. Another nice way of applying principles to process and practice, written by some of the leading experts on research evaluation and bibliometrics on the blog of the Centre for Science and Technology Studies (CWTS) at Leiden University (yes, some of those at CWTS are authors of the Leiden Manifesto!).
- The research evaluation food chain and how to disrupt it. A great blog post on The Bibliomagician by Lizzie Gadd, chair of the LIS-Bibliometrics Committee and of the INORMS Research Evaluation Working Group.
- Responsible metrics: One size doesn't fit all. How to evaluate research based on the type and size of your analysis.
Further Readings and Resources
- The Bibliomagician Resource Hub. Includes many readings and responsible research assessment policies from various universities (mostly in the UK).
- When Do Citations Reflect "Impact?" Karin Wulf, editor of the William and Mary Quarterly, offers a humanities perspective on citations and the meaning behind them.
- Over-optimization of academic publishing metrics: observing Goodhart’s Law in action. A peer-reviewed article from GigaScience. Abstract: The academic publishing world is changing significantly, with ever-growing numbers of publications each year and shifting publishing patterns. However, the metrics used to measure academic success, such as the number of publications, citation number, and impact factor, have not changed for decades. Moreover, recent studies indicate that these metrics have become targets and follow Goodhart’s Law, according to which, “when a measure becomes a target, it ceases to be a good measure.”
- Goodhart’s Law: Are Academic Metrics Being Gamed? A blog post summarizing the results of the article above ("Over-optimization...").
Responsible University Rankings
- Ranking universities responsibly - presentation slides. A presentation at the Ranking Best Practices symposium, Orsay, France, June 25, 2019.
- Leiden Ranking. The CWTS Leiden Ranking 2019 offers important insights into the scientific performance of nearly 1,000 major universities worldwide. Select your preferred indicators, generate results, and explore the performance of universities.
- What makes a fair and responsible university ranking? Rating the rankings criteria. The Rankings sub-group of the INORMS Research Evaluation Working Group has published a revised draft list of criteria for fair and responsible university rankings.
- The Order of Things: What college rankings really tell us. The journalist breaks down some of the issues with college rankings and even compares them to car rankings.
- The unsustainable goal of university ranking. Ranking organisations are seeking to diversify the measures used to evaluate universities, but without addressing the fundamental flaws in their methods, they will crush rather than embrace the rich complexity of our institutions of higher learning.