Internationally, emphasis is often placed on the measurable impact of research. In Australia, factors behind this include:
* ERA 2018 included the Engagement and Impact Assessment. This "examined how universities are translating their research into economic, social and other benefits and encouraged greater collaboration between universities, industries and other end-users of research". In this context, the ARC used the following definitions:
Engagement: the interaction between researchers and research end-users outside of academia (including governments, businesses, non-governmental organisations, communities and community organisations), for the mutually beneficial transfer of knowledge, technologies, methods or resources.
Impact: the contribution that research makes to the economy, society, environment or culture, beyond the contribution to academic research.
See: http://www.arc.gov.au/engagement-and-impact-assessment
Evidence of scholarly impact may be demonstrated in the form of bibliometrics - traditional publication metrics. Are your publications being cited? Are you publishing in peer-reviewed journals with wide, international readership? See associated tabs on Citation Analysis and Journal Impact for more information.
Has your research influenced policy or industry? Your evidence of impact will need to describe WHAT the impact was and WHO the impact affected. Impact evidence should show what has actually happened and not be aspirational.
Learning and Research Librarians can help identify and search for the following types of research activity evidence:
| Type of evidence | Sources include | Examples |
| --- | --- | --- |
| Bibliometrics | Web of Science, Scopus, Google Scholar | Citations, H-index, journal metrics & quality indicators |
| Social media attention to researcher profile & output, eg on Twitter, ResearchGate, UTAS profile | Altmetric Explorer, The Conversation, Google, ResearchGate, Academia, Mendeley; UTAS Google; personal blogs etc | Number/demographics of tweets & retweets, views, downloads etc; Google results ranking; content reuse/repurposing |
| Library holdings | WorldCat | Number of libraries & countries holding publications; type of library; editions; translations |
| Media activity, coverage & response | News/media databases | Internal & external coverage of book launches, presentations & other events; Google results ranking; public feedback |
| Publication data | Publishing & media directories, publisher websites | Circulation/distribution/readership/audience; quality/reach of outlet eg book publisher |
| Open access publications | UTAS/disciplinary repositories, ResearchGate, Academia, Mendeley, OA journals | Views, downloads, reads |
| Data & metadata | Research Data Australia, UTAS/disciplinary repositories, data journals | Views, downloads, reads |
| Grey/unpublished literature usage | Google, registries, organisation/government websites, Hansard, repositories | Clinical trials, citations in government policy reports, ABS, Parliamentary mentions |
| IP/patents | Scival, SciFinder Web | Citations showing contributions to innovation eg in patent applications |
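The H-index listed above under bibliometrics has a simple definition: a researcher has index h if h of their publications have each received at least h citations. As an illustration only (the function name and sample citation counts are invented for this sketch, not drawn from any database), it can be computed like this:

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least
    h papers have h or more citations each."""
    # Sort citation counts from highest to lowest.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        # The paper at position `rank` must have >= rank citations
        # for the h-index to reach `rank`.
        if count >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers with >= 4 citations
print(h_index([25, 8, 5, 3, 3]))  # 3
print(h_index([]))                # 0
```

Note that databases such as Web of Science, Scopus and Google Scholar index different sets of publications, so each will typically report a different H-index for the same researcher.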
Before seeking or interpreting research metrics, become familiar with the ten principles proposed for the measurement of research performance in the Leiden Manifesto for Research Metrics, published as a comment in Nature, v520, 23 April 2015:
1. Quantitative evaluation should support qualitative, expert assessment
2. Measure performance against the research missions of the institution, group or researcher
3. Protect excellence in locally relevant research
4. Keep data collection and analytical processes open, transparent and simple
5. Allow those evaluated to verify data and analysis
6. Account for variation by field in publication and citation practices
7. Base assessment of individual researchers on a qualitative judgement of their portfolio
8. Avoid misplaced concreteness and false precision
9. Recognize the systemic effects of assessment and indicators
10. Scrutinize indicators regularly and update them