
Research Impact Metrics

A guide for those wanting to use research impact metrics for evaluation, analytics, and reviews, e.g., promotion & tenure.

Download Data for Research Outputs

Can Apply To

Journal articles or other serial publications

Metric Definition

A download is an event triggered by a user clicking on the download button, in contrast to simply viewing a web page.

Metric Calculation

A count of downloads during a period of time.
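
As a minimal sketch of that calculation, the snippet below (Python, with hypothetical timestamped event data) counts the download events whose timestamps fall inside a reporting period:

    from datetime import datetime

    # Hypothetical event log: one ISO 8601 timestamp per download event.
    events = [
        "2023-01-05T09:12:00", "2023-01-17T14:03:00",
        "2023-02-02T08:45:00", "2023-02-20T19:30:00",
    ]

    def count_downloads(events, start, end):
        """Count download events with start <= timestamp < end."""
        return sum(start <= datetime.fromisoformat(e) < end for e in events)

    # Downloads during January 2023 -> 2
    print(count_downloads(events, datetime(2023, 1, 1), datetime(2023, 2, 1)))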

Data Sources

Publishers, subject repositories, institutional repositories, researcher profile systems, and altmetrics aggregators.

Appropriate Use Cases

Article downloads can be used as a leading indicator of, or proxy for, others' intent to use a work, rather than actual usage. That intent may (or may not) be reflected in eventual citations.

Limitations

Despite the common tendency to equate downloads with usage, a download count is not an accurate measure of consumption or of how many people have read the item. Downloaded files may languish unread in personal libraries (resulting in an inflated count of readership) or may be shared with a journal club or other individuals (resulting in an underestimate of readership). Tools that allow automated crawling and downloading of content may also distort counts.

For web analytics tools to provide an accurate count, they need to be configured to monitor and count download events. The COUNTER Code of Practice provides a standard for processing such data, so that counts are more credible and comparable, and includes a public registry of compliance. Standard analytics tools like Google Analytics may not see downloads when people connect directly to the file through Google or Google Scholar; platforms commonly used for institutional repositories can count these from the server side, or use plug-ins to provide an accurate count. Finally, the correlation between citations and downloads may vary by discipline and institution.
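
The log-processing sketch below illustrates the server-side approach under simplified assumptions: a hypothetical parsed access log, a toy bot list, and a COUNTER-style rule that collapses repeat clicks by the same user within 30 seconds into a single count. It is an illustration of the idea, not a COUNTER-compliant implementation.

    from datetime import datetime, timedelta

    # Hypothetical parsed log rows: (user_id, user_agent, timestamp).
    rows = [
        ("u1", "Mozilla/5.0", datetime(2023, 3, 1, 10, 0, 0)),
        ("u1", "Mozilla/5.0", datetime(2023, 3, 1, 10, 0, 10)),  # double-click
        ("u2", "Googlebot/2.1", datetime(2023, 3, 1, 11, 0, 0)),  # crawler
        ("u3", "Mozilla/5.0", datetime(2023, 3, 1, 12, 0, 0)),
    ]

    BOT_MARKERS = ("bot", "crawler", "spider")  # illustrative, not exhaustive
    WINDOW = timedelta(seconds=30)

    def counter_style_count(rows):
        """Count downloads, skipping bots and rapid repeat clicks."""
        last_seen = {}
        count = 0
        for user, agent, ts in sorted(rows, key=lambda r: r[2]):
            if any(m in agent.lower() for m in BOT_MARKERS):
                continue  # discard traffic from known crawlers
            if user in last_seen and ts - last_seen[user] <= WINDOW:
                last_seen[user] = ts  # collapse repeat clicks into one event
                continue
            last_seen[user] = ts
            count += 1
        return count

    print(counter_style_count(rows))  # -> 2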

Inappropriate Use Cases

Article downloads should not be used as direct measures of usage, research quality, or impact.

Available Metric Sources

Downloads from publisher sites, subject repositories, and institutional repositories are likely to be fairly accurate, while downloads from personal websites, blogs, and other platforms may be less reliable, depending on the configuration of the site and analytics tools.

Transparency

Varies by source

Website

n/a

Timeframe

Typically immediate, but there may be a reporting delay of a few hours up to 30 days, depending on the source.

The explanation and interpretation of this metric come directly from the Metrics Toolkit (CC BY).

Can Apply To

Books and book chapters

Metric Definition

A download is an event triggered by a user clicking on the download button, in contrast to simply viewing a web page.

Metric Calculation

A count of downloads over a period of time.

Data Sources

Publishers, subject repositories, and institutional repositories

Appropriate Use Cases

Book and book chapter downloads can be used as a leading indicator of, or proxy for, others' intent to use a work, rather than actual usage. That intent may (or may not) be reflected in eventual citations.

Limitations

The number of file downloads is not an accurate measure of consumption or of how many people have read the item, despite commonly being used that way. Downloaded files may languish unread in personal libraries (resulting in an inflated count of readership) or may be shared with a journal club or other individuals (resulting in an underestimate of readership). Additionally, the relationship between citations and downloads may vary by discipline and institution. This is compounded by tools that allow automated crawling and downloading of content.

For web analytics tools to provide an accurate count, they need to be configured to monitor and count download events. The COUNTER Code of Practice provides a standard for processing such data, so that counts are more credible and comparable, and includes a public registry of compliance. Standard analytics provided by platforms commonly used for professional portfolios (such as Wordpress.com) may not see downloads when people connect directly to the file through Google or Google Scholar; platforms commonly used for institutional repositories can count these from the server side, or use plug-ins to provide an accurate count.

Inappropriate Use Cases

Book and book chapter downloads are not direct measures of usage, research quality, or impact.

Available Metric Sources

Varies by publisher and repository, if used

Transparency

Varies by source

Website

n/a

Timeframe

Typically immediate, but there may be a reporting delay of a few hours up to 30 days, depending on the source.

The explanation and interpretation of this metric come directly from the Metrics Toolkit (CC BY).

Can Apply To

Research software, scripts, code snippets

Metric Definition

File downloads over a period of time.

Metric Calculation

Most sources calculate downloads in a straightforward manner, reporting the raw number of downloads as-is, but others detect and remove downloads initiated by bots.

Data Sources

Though downloads can be tracked on almost any web platform that hosts files (including researcher websites), the most common and reliable sources of research software download statistics are software hosting sites (e.g., Bitbucket) and repositories (e.g., Figshare). Some altmetrics tools (e.g., PlumX) can report downloads for certain software hosting sites.

Appropriate Use Cases

Software downloads can be used as an indicator for the reuse of programming code. In some cases, it can be a proxy for the number of users.

Limitations

Software is rarely formally cited, though it is often mentioned in publications. One study found a weak but statistically significant correlation between Scopus citations and download counts for Google Code programs. As of March 2017, SourceForge.net no longer reports downloads.

Inappropriate Use Cases

Software downloads are not direct measures of usage, quality, or impact.

Available Metric Sources

Google Code (defunct as of January 2016), Codeplex, Bitbucket, Launchpad, GitHub (API), Figshare, Zenodo
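
As one example of a source with a public API, GitHub exposes per-asset download counts on releases. The sketch below sums those counts for a repository; the repository name is a placeholder, unauthenticated calls are rate-limited, and only the first page of releases is fetched.

    import json
    import urllib.request

    def release_downloads(owner, repo):
        """Sum download_count across release assets of a public repository."""
        # Returns the first page of releases only; paginate for long histories.
        url = f"https://api.github.com/repos/{owner}/{repo}/releases"
        req = urllib.request.Request(
            url, headers={"Accept": "application/vnd.github+json"})
        with urllib.request.urlopen(req) as resp:
            releases = json.load(resp)
        return sum(asset["download_count"]
                   for release in releases
                   for asset in release.get("assets", []))

    # Placeholder repository; substitute a real owner/name pair.
    print(release_downloads("octocat", "Hello-World"))

Note that this counts downloads of release assets only; clones of the repository itself are not included in these figures.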

Transparency

Varies by source

Website

n/a

Timeframe

Typically immediate, but there may be a reporting delay of a few hours up to 30 days, depending on the source.

The explanation and interpretation of this metric come directly from the Metrics Toolkit (CC BY).