Altmetrics: the impacts on impact
Posted on November 11, 2014 by Sally Hawkins
Digital innovation has sparked a massive culture shift in the way researchers choose to communicate their research.
Social media tools and research-specific networks like Mendeley enable scientists to promote their own discoveries, making their research output more accessible to both colleagues and society as a whole. The growth of digital standards, such as the Digital Object Identifier (DOI) attached to papers and the ORCID identifier for authors, means that it has become easier to track output and any derivative work that emerges in social media, blogs or the news.
These innovations are further fuelling the debate around the accuracy and value of the traditional measure of impact, the Journal Impact Factor. In 2010, with the publication of the Altmetrics Manifesto, an alternative measure of impact was proposed. Since then, altmetrics have been hotly debated and have grown in popularity. But what are altmetrics, who uses them, and are they really the alternative to traditional measures of impact that everyone is looking for?
Altmetrics capture a wide range of activity around individual papers in digital media, such as comments on social media, mentions in the news or policy documents, and sometimes data on downloads and usage. There are a number of providers of altmetrics, such as Altmetric and Plum Analytics, and some publishers develop their own tools. This data is displayed alongside the article.
To give an example, the Society for General Microbiology’s current highest-scoring paper is from the Journal of General Virology, entitled “The 2014 Ebola virus disease outbreak in West Africa”, with an Altmetric score of 94. For those interested in exploring the information that Altmetric can provide, download the free Altmetric bookmarklet.
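For readers who would rather query this data programmatically, Altmetric also offers a free public REST API. The sketch below is a minimal example in Python; the v1 DOI endpoint and the response field names shown are assumptions based on Altmetric’s public documentation, and the DOI used is a placeholder rather than the paper mentioned above.

import json
import urllib.error
import urllib.request

# Look up the attention data Altmetric holds for a single paper by DOI.
# Assumptions: the public v1 endpoint and the JSON field names below;
# the DOI itself is a placeholder used purely for illustration.
doi = "10.1000/example.doi"
url = "https://api.altmetric.com/v1/doi/" + doi

try:
    with urllib.request.urlopen(url) as response:
        data = json.loads(response.read().decode("utf-8"))
    print("Title: ", data.get("title"))
    print("Score: ", data.get("score"))                       # headline Altmetric score
    print("Tweets:", data.get("cited_by_tweeters_count", 0))  # unique tweeters
    print("News:  ", data.get("cited_by_msm_count", 0))       # mainstream media mentions
except urllib.error.HTTPError as err:
    # Altmetric answers with a 404 when it has no record of the DOI.
    print("No altmetric data found (HTTP %d)" % err.code)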
As publishers we are interested in altmetrics because they allow us to deepen our understanding of the impact of our publications, with new usage data becoming available sooner after publication than with traditional citation tracking methods. They also allow us to track emerging trends in subject areas and respond to these rapidly, by collecting and highlighting papers that are proving popular according to altmetrics. For researchers the information can be invaluable: not only for authors, who can more easily track the response to their papers outside of formal publication channels, but also for readers, who use this information to help them manage the immense amount of literature available on the web.

Funding institutions also gain from the uptake of altmetrics. At the recent 1:AM conference in London, Adam Dinsmore, Evaluation Officer at the Wellcome Trust, described how altmetrics allow the Trust not only to track the impact of its funding more easily, but also to delve into the detail of the data to gain a deeper understanding of the story behind it. For example, Dinsmore spoke of a particular case where the article title may have influenced the amount of attention it received on social media (11,849 Facebook mentions and 853 tweets to date).
That altmetrics add value is clear; further evidence comes from HEFCE, which is considering how it can use altmetrics and has commissioned an independent review of the role of metrics in research assessment, due to report in 2015. But it is also becoming apparent that altmetrics should not be a substitute for citation counts, and that they do not provide a neat solution to the problems of Impact Factors. Altmetrics scores share many of the Impact Factor’s shortcomings: data are inconsistent between the various metric providers, there are not yet standards to ensure the accuracy of the data, and high scores do not necessarily reflect positive reactions to articles. There is also a worry that researchers who are more active in social media and public engagement than their peers will be unfairly advantaged in altmetrics scores.
Altmetrics may not provide a complete answer to the Impact Factor, but used in tandem with traditional measures they allow us to build a much fuller picture of the impact of research, deepen our understanding of the uptake of science in the wider community, and encourage us to explore and experiment with the information we produce in new ways. It is important for publishers, researchers and funders to continue the discussion, so that we can begin to establish best practice and industry standards, and so that we can fully appreciate the potential of these new metrics. In all these discussions it is also important to consider the limitations of impact measurement; as the saying often attributed to Albert Einstein goes, “Not everything that can be counted counts, and not everything that counts can be counted.”
Videos and blog posts from the recent 1:AM altmetrics conference in London are recommended for anyone interested in learning more.