This section will cover what research metrics aim to achieve and the various measures available.
Research metrics seek to quantify the impact academic research has within academia and in wider society. Impact can be understood both as the perceptible shift, change or influence research creates in a discipline, and as the degree or force of that change. In essence: what has it changed, and by how much?
When finding metrics, it is important to first understand what you are aiming to achieve. If you require metrics to support an academic promotion or a grant application, be aware of the guidelines and application requirements, as these will inform which metrics to use. Some disciplines are quite prescriptive about which metrics are considered measures of high impact (Roemer & Borchardt, 2015).
When seeking research metrics, it is important to first answer the following questions in order to identify what metrics you require.
Who is your target audience?
What are you trying to achieve?
What subject area are you working in?
What stage of your career are you in?
Are there guidelines/processes you need to follow?
To maximise your research impact, consider the following options:
Claim your author profile in the major citation databases. Your work may be listed under multiple variations of your name; if so, merge them into one profile.
Link each of your publications to your unique ORCID iD; this makes your work easy to attribute to the correct author (see the sketch after this list).
When promoting your work, share links to the original publication rather than re-uploading it. This improves the accuracy of any article-level metrics measured.
Publish your work as open access to improve the availability of your work.
Visit the self-promotion page for more information on maximising your reach.
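As a hedged illustration of how linked publications can be checked programmatically, the Python sketch below queries ORCID's public API for the works attached to an iD. The v3.0 /works endpoint and the JSON shape used here are assumptions to verify against the current ORCID API documentation, and the example iD is made up.

import requests

def list_orcid_works(orcid_id: str) -> list[str]:
    # Query ORCID's public API for the works linked to this iD
    # (endpoint and response shape assumed; confirm against the ORCID docs).
    resp = requests.get(
        f"https://pub.orcid.org/v3.0/{orcid_id}/works",
        headers={"Accept": "application/json"},
        timeout=10,
    )
    resp.raise_for_status()
    titles = []
    for group in resp.json().get("group", []):
        for summary in group.get("work-summary", []):
            titles.append(summary["title"]["title"]["value"])
    return titles

print(list_orcid_works("0000-0000-0000-0000"))  # made-up iD for illustration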
To select the appropriate metrics for your needs, first identify the level you are seeking to measure: an individual article, a journal, an author, or an institution.
A citation count is a simple tally of how many times an individual article, researcher, journal, or dataset has been cited.
A publication count is a tally of how many publications an individual researcher has published.
Citation and publication counts are simple measures of academic impact, based on the idea that a researcher has greater impact in academia when they are highly published or cited.
A common metric used to measure author publication impact is the h-index. It is the largest number h such that a researcher has h papers that have each been cited at least h times (Spicer, 2015). For example, an h-index of 20 means the researcher has 20 papers that have each been cited at least 20 times elsewhere. Unlike a simple citation count, which can be skewed by one well-cited paper, this metric recognises researchers with a broad body of highly cited work.
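To make that definition concrete, here is a minimal Python sketch of the h-index calculation; the citation counts in the example are invented.

def h_index(citations: list[int]) -> int:
    # Largest h such that h papers have at least h citations each.
    ranked = sorted(citations, reverse=True)  # highest-cited papers first
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank  # the paper at this rank still has >= rank citations
        else:
            break
    return h

# Five papers cited 10, 8, 5, 4 and 3 times: four papers have at least
# four citations each, so the h-index is 4.
print(h_index([10, 8, 5, 4, 3]))  # -> 4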
Bibliographic databases such as Scopus, Web of Science and Google Scholar prominently display a researcher’s h-index on their profile page based on available data, making it a commonly used metric in academia.
To ensure each site accurately displays a researcher’s h-index it is necessary to ensure the following:
Attribute each work you publish to your unique ORCID iD.
Publish your research, papers and data in one place; if sharing your research elsewhere, link to the original source.
Claim your profile and check there are no duplicates.
(McInerney, 2011)
The m-index is the h-index divided by the number of years a researcher has been active (e.g., counted from their first publication listed in Scopus). It provides a means to compare researcher impact at different career stages.
M-index | Potential for scientific impact
<1.0 | Average
1.0-2.0 | Above Average
2.0-3.0 | Excellent
>3.0 | Stellar
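As a worked example of the definition above (with invented figures), a minimal Python sketch of the m-index calculation:

def m_index(h_index: int, years_active: int) -> float:
    # m-index = h-index / years of research activity.
    return h_index / years_active

# A researcher with an h-index of 24 after 15 active years:
print(m_index(24, 15))  # -> 1.6, 'Above Average' in the table above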
Citation counts and the h-index are limited by their focus on quantitative data; they do not factor in the quality of individual research outputs.
Lack of consistency – citation practices differ between disciplines and career stages.
Well-known authors accumulate citations at an increasing rate compared to newer authors.
The h-index should not be used to rank authors who work in different disciplines or who are at different stages of their careers; variation in citation practices between disciplines makes such comparisons inappropriate.
Open to manipulation
Researchers can artificially inflate their index score by citing their own work or arranging for others to cite their research.
Are these metrics fit for purpose?
Citation-based metrics were originally designed to help librarians select the resources most likely to be used, and they do not reflect other forms of academic impact.
Count metrics favour print and journal publications and do not account for impact outside traditional academia.
Journal impact metrics (or journal impact factors) rank academic journals within a particular discipline. They are typically used to identify influential journals, to find highly ranked journals in which to publish and, in some cases, to determine the allocation of research funding. Visit Federation Library’s journal impact LibGuide for more information.
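As a rough illustration, one widely used journal metric, the classic two-year impact factor, is a simple ratio; the counts below are invented.

def two_year_impact_factor(citations_this_year: int,
                           citable_items_prev_two_years: int) -> float:
    # Citations received this year to items the journal published in the
    # previous two years, divided by the number of those citable items.
    return citations_this_year / citable_items_prev_two_years

# A journal whose 2022-2023 output (200 citable items) attracted 600
# citations during 2024 has a 2024 impact factor of 3.0.
print(two_year_impact_factor(600, 200))  # -> 3.0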
SCImago Journal Rank (Scopus metrics)
SCImago Institutions Rankings is a free resource that compares research institutions based on their achievement in research, innovation, and web visibility.
Citation databases, also known as abstracting or indexing databases, are designed to index citations between research publications and provide both citation searching and tracking functions.
A citation database indexes citations between articles, linking a key article both to the earlier papers it cites and to the later papers that cite it.
Backwards reference searching | Forwards citation searching
Searching earlier papers that a key article has cited | Searching for later papers that cite a key article
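To make the two search directions concrete, this minimal Python sketch models citations as a directed graph; the paper names are invented.

# Each key cites every paper in its list.
cites = {
    "key_article": ["paper_A", "paper_B"],
    "paper_C": ["key_article"],
    "paper_D": ["key_article", "paper_A"],
}

def references_of(paper: str) -> list[str]:
    # Backwards reference searching: earlier papers the article cited.
    return cites.get(paper, [])

def cited_by(paper: str) -> list[str]:
    # Forwards citation searching: later papers that cite the article.
    return [p for p, refs in cites.items() if paper in refs]

print(references_of("key_article"))  # -> ['paper_A', 'paper_B']
print(cited_by("key_article"))       # -> ['paper_C', 'paper_D']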
You can check the citation impact of an individual article by locating it in a citation database and selecting the ‘cited by’ function to view which articles have cited the work. It is best practice to search more than one database, as citation data will vary between databases.
Scopus is a multidisciplinary navigational tool that contains records going back to 1966, offering newly linked citations across the widest body of scientific abstracts available in one place. Coverage includes scientific, technical, medical, and social science literature.
Covering data, books, journals, proceedings and patents, Web of Science provides a single destination to access reliable, integrated, multidisciplinary research. Quality, curated content delivered alongside information on emerging trends, subject-specific content and analysis tools makes it easy for students, faculty, researchers, analysts, and program managers to pinpoint relevant research to inform their work.
Google Scholar Metrics provide an easy way for authors to quickly gauge the visibility and influence of recent articles in scholarly publications. Scholar Metrics summarize recent citations to many publications, to help authors as they consider where to publish their new research.
Altmetrics ('alternative metrics') seek to track and collate the attention an individual piece of research receives across a multitude of online platforms, measuring social media attention, citations, Mendeley readership and other mentions. The Altmetric 'donut' can be seen on research output records in Scopus, in research output repositories and in other sources; the colours in the donut indicate the sources of the mentions (Altmetrics, n.d.).
The strength of Altmetrics lies in its capacity to accumulate mentions quickly across a wide variety of platforms, providing a more detailed indication of research impact beyond traditional journal citations. This encourages academics to share research readily online or through traditional media outlets, as such attention is quickly recognised in Altmetric scores. One great benefit of Altmetrics is the ability to capture who is discussing a piece of research, giving researchers the opportunity to understand whether their work was positively received or made a practical impact on the work of others. However, keep in mind the weaknesses of Altmetrics: scores are not calculated consistently between papers, and a high score is an indicator of public engagement, not a measure of research quality.
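For researchers who want to check attention data programmatically, the hedged Python sketch below queries the public Altmetric API for a single DOI. The v1 endpoint and the "score" field are assumptions to confirm against Altmetric's current documentation, and the DOI is made up.

import requests

def altmetric_score(doi: str) -> float | None:
    # Free v1 endpoint (assumed; confirm against the Altmetric docs).
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:
        return None  # no attention data recorded for this DOI
    resp.raise_for_status()
    return resp.json().get("score")

print(altmetric_score("10.1000/example-doi"))  # made-up DOI for illustration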
InCites is a publication metrics tool provided by Clarivate Analytics and draws on publication data from Web of Science. Researchers can use the tool to identify and compare their publications to a variety of benchmarks, including their peers, a study area, an institution, or even the global average in a specific research field. InCites also enables Federation to benchmark and showcase the areas of research in which we perform best.
https://libguides.federation.edu.au/incites
https://federation.edu.au/research/internal/research-data-and-tools/incites
InCites Benchmarking & Analytics: Getting Started with Incites
Federation University records journal impact metrics relating to the research publications of its academic staff.
Journal metrics are also used in the calculation of research workload allocations. Weightings are applied to different categories of publications. Peer-reviewed journals are weighted according to a specific metric: Scopus CiteScore ranking, with journals in the Top 10 according to CiteScore weighted the highest.
Find out more about Scopus metrics including CiteScore on the following page.
Research Workload Frequently Asked Questions (requires Federation University log-in)
Workload planning homepage (requires Federation University log-in)
If you have any questions about workload calculations, or about how the University records and reports upon publication outputs, please contact the Research Performance team (email: research.reporting@federation.edu.au).
Altmetrics. (n.d.). The donut and Altmetric Attention Score. https://www.altmetric.com/about-our-data/the-donut-and-score/
McInerney, J. (2011). H-Index, M-Index and Google citations. McInerney Lab. http://mcinerneylab.com/research/h-index-m-index-and-google-citations/
Roemer, R. C., & Borchardt, R. (2015). Meaningful metrics: A 21st century librarian's guide to bibliometrics, altmetrics, and research impact. American Library Association.