For over 40 years, publishing papers in journals with a high impact factor (IF) has been the gauge of a researcher's success.
"While IF reflects the number of citations, it does not reveal the overall impact of one's work and has many drawbacks"
In this internet age, the academic environment needs alternative citation impact metrics. That is where Altmetrics come into play.
Impact factor: why is it not enough?
The IF is a scientometric index that reflects the average number of citations that articles published in a journal over the preceding two years received in a given year.
To date, the IF, along with the h-index, is the best-known metric. Journals with high IFs are also the highest-ranked ones. Therefore, every researcher's goal is to publish their results in high-impact journals.
However, this factor has many flaws. It covers only a short window, and it cannot uniformly evaluate different academic disciplines. It is slow, and it encourages self-citation, or even the contemporary practice of forming clusters of scientists who extensively cite each other, also known as “citation farming” (1).
As a result, a few articles that receive a large number of citations can significantly raise a journal's IF. Moreover, the IF is limited to the scientific community and cannot reflect the broader societal impact the research could accomplish.
The strong influence of the IF pushes some journals to inflate it by publishing fewer articles or by selecting articles with a higher chance of attracting citations. This practice leads to a decline (2) in the number of original research articles published per issue in scientific journals (3) and fuels unfair competition between research groups (4).
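The skew described above is easy to see numerically: because the IF is an average over the two-year citation window, one blockbuster paper can lift it far above what a typical paper in the journal earns. The citation counts below are purely illustrative, not data from any real journal.

```python
from statistics import mean, median

# Hypothetical citation counts for ten articles a journal published
# over the two-year IF window; one paper is a runaway hit.
citations = [0, 1, 1, 2, 2, 3, 3, 4, 5, 120]

impact_factor = mean(citations)    # what the IF effectively reports
typical_paper = median(citations)  # what a typical paper actually gets

print(impact_factor)  # → 14.1
print(typical_paper)  # → 2.5
```

A reader seeing an IF of 14.1 might assume the typical article earns that many citations, while the median article here earns fewer than three.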
"However, why are we employing only citations to measure the article’s popularity and impact?"
Do we have another indicator that can accumulate the feedback of the scientific work beyond the academic community, but is fast and freely accessible?
Yes: the World Wide Web, now in its second generation (Web 2.0), is the answer.
It capitalizes on the ability of people to freely share information online via social media, blogging, and web-based communities. The unique character of Web 2.0 gave rise to alternative metrics, aka altmetrics.
What are altmetrics?
Altmetrics (5) (ALTernative article-level metrics) gauge the scholarly impact of a research article based on the attention it gains in online tools and environments (4,6-7).
They measure the frequency of mentions (colloquially, shares and hits) that a scientific work receives. Altmetrics quantify the social media attention gained by a piece of work: not only articles but also datasets, presentations, software, and tools shared among researchers, scientific communities, and the public (8).
The altmetrics manifesto (9) calls altmetrics “Tomorrow’s filter”, helping scholars separate the most relevant and significant sources of information from the rest.
The variety of sources for altmetrics is wide.
They count tweets, Facebook shares, mentions in blog posts, citations on Wikipedia, Mendeley bookmarks, and much more. This improves scientific communication and reflects how far a scientific work reaches the general public; most importantly, altmetrics are easily accessible and quick.
How and why should you try altmetrics?
Several platforms monitor altmetrics, the biggest of which are Altmetric (10) and Plum Analytics (11).
To see the altmetric statistics of your work, you install a small plug-in from altmetric.com, and then you can retrieve all the data with a single click. These platforms use the so-called “altmetrics donut” (12) and the Altmetric Attention Score to present the attention your work has attracted.
While the donut shows you the exact number of mentions and their sources, the Altmetric Attention Score expresses the overall level of online activity the work has seen.
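Beyond the plug-in, attention data of this kind is typically delivered as a JSON record containing an overall score plus per-source mention counts. The sketch below shows how such a record could be summarized in Python; the field names (`score`, `cited_by_tweeters_count`, `cited_by_msm_count`) and the sample values are assumptions for illustration, not guaranteed output of any provider's API.

```python
import json

def attention_summary(payload: str) -> dict:
    """Summarize a hypothetical altmetrics JSON record into a small dict.

    Field names are assumed for illustration; a real provider's schema
    may differ, so missing fields default to zero.
    """
    record = json.loads(payload)
    return {
        "score": record.get("score", 0),
        "tweets": record.get("cited_by_tweeters_count", 0),
        "news": record.get("cited_by_msm_count", 0),
    }

# A mock record illustrating the shape only (values are made up):
sample = '{"score": 42.5, "cited_by_tweeters_count": 30, "cited_by_msm_count": 2}'
print(attention_summary(sample))  # → {'score': 42.5, 'tweets': 30, 'news': 2}
```

In practice you would fetch such a record from a provider's API using the article's DOI, then feed selected counts into the donut-style breakdown described above.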
The altmetrics donut (source: https://www.altmetric.com/about-our-data/the-donut-and-score/)
Plum Analytics, acquired by Elsevier, is an integral part of many existing platforms and journal websites, such as Digital Commons, EBSCOhost, EBSCO Discovery Service, Mendeley, ScienceDirect, Scopus, and SSRN.
Each of these websites is automatically connected to PlumX Metrics, which can be accessed with a single click.
All citations and mentions on both platforms are easily accessible and transparent. Both services also stress their Twitter reach, as these “tweetations” account for up to 90% of altmetric mentions.
They can predict highly cited articles within the first three days after publication (8, 13). Tips are already available for those interested in maximizing the Altmetric Attention Score of their work.
However, what is it all for anyway?
Besides helping authors understand the impact of an article, an attached score from Altmetric or Plum Analytics can strengthen an application for a promotion or a grant.
Altmetrics are a handy complement to traditional impact metrics, allowing researchers to demonstrate the impact and worth of their work in ‘real time’ (4, 14).
The altmetrics status of a recent paper can stimulate interest and debate among readers, directly contributing to its impact. The impact factor, being a slow metric, would miss all of this attention (14).
After all, we want our science to reach as many people as possible, and altmetrics give us an overview of the ‘marketing quotient’ of our work.
Are there any downsides?
Even though the altmetrics donut looks attractive and can be impressive, it is essential to interpret Altmetric scores with caution.
The popularity of an article does not prove that it is of high quality. Likewise, substantial ‘media buzz’ around a paper does not always translate into a large number of traditional citations.
Very often, a publication earns a very high altmetric score because it sparks controversy or public misconception.
A prime example is an article reporting no link between measles-mumps-rubella (MMR) vaccination and autism among children with older siblings with autism. It has been cited only 70 times but has been viewed more than 153,000 times on social media because of the controversy surrounding vaccines (15,16).
Putting too much importance on altmetrics could lead to negative consequences, such as the aggressive marketing of scientific articles on online platforms.
This might be the main drawback of altmetrics, because the barrier to cheating is low. Moreover, it is very easy to create fake interest on social media, which can even be purchased (17).
There is real value in altmetrics, but they cannot offer us absolute answers. Altmetrics are also not yet well researched, and they can be misleading, as every altmetrics provider calculates its attention score differently.
However, knowing how vital Web 2.0 now is in researchers’ lives, it is worth keeping an eye on altmetrics when browsing scientific work.
Who knows, maybe one day they will replace the traditional impact factor!
Written by Agnieszka Szmitkowska
Edited and reviewed by Somsuvro Basu and Markus Dettenhofer