ALTMETRICS: The future of research impact measures



For over 40 years, publishing papers in journals with a high impact factor (IF) has been the gauge of a researcher's success.


"While the IF reflects the number of citations, it does not reveal the overall impact of one's work and has many drawbacks."


In the internet age, the academic environment needs alternative citation impact metrics. That is where altmetrics come into play.



Impact factor: why is it not enough?


The IF is a scientometric index that reflects the yearly average number of citations received by recent articles published in a given journal.


To date, the IF, along with the h-index, is the best-known metric. Journals with high IFs are also the highest-ranked ones, so every researcher's goal is to publish their results in high-impact journals.


However, this factor has many flaws. It covers only a short time window, it cannot uniformly evaluate different academic disciplines, and it is slow. It also encourages self-citation, and even the contemporary practice of forming clusters of scientists who extensively cite each other, known as “citation farming” (1).


As a result, a few articles that receive a large number of citations can significantly raise a journal's IF. Moreover, the IF is limited to the scientific community: it cannot reflect the broader societal impact that research could achieve.


The strong influence of the IF pushes some journals to try to increase it by publishing fewer articles or by selecting articles with a higher chance of attracting citations. This practice leads to a decline (2) in the number of original research articles published per issue in scientific journals (3) and raises unfair competition between research groups (4).



"But why are we using only citations to measure an article's popularity and impact?"



Is there another indicator that can capture feedback on scientific work beyond the academic community, while being fast and freely accessible?


Yes: the answer is the World Wide Web, now in its second generation, known as Web 2.0.


Web 2.0 capitalizes on people's ability to freely share information online via social media, blogging, and web-based communities. This unique character gave rise to alternative metrics, a.k.a. altmetrics.



What are altmetrics?


Altmetrics (5) (ALTernative article-level metrics) gauge the scholarly impact of a research article based on the attention it gains in online tools and environments (4,6-7).


They measure the frequency of mentions (in colloquial terms, shares and hits) that a scientific work receives. Altmetrics quantify the social media attention received by research outputs: not only articles but also datasets, presentations, software, and tools shared among researchers, scientific communities, and the public (8).


The altmetrics manifesto (9) calls altmetrics “tomorrow's filter,” helping scholars separate the most relevant and significant sources of information from the rest.


The variety of sources for altmetrics is wide.


Altmetrics count tweets, Facebook shares, mentions in blog posts, citations on Wikipedia, Mendeley bookmarks, and much more. This improves scientific communication and reflects the reach of scientific work to the general public; most importantly, altmetrics are easily accessible and quick.



How and why should you try altmetrics?


Several platforms monitor altmetrics; the biggest are Altmetric (10) and Plum Analytics (11).


To see the altmetric statistics of your work, you only need to install a small plug-in from altmetric.com; you can then retrieve all the data with a click. These platforms use the so-called “altmetrics donut” (12) and the Altmetric Attention Score to present the attention your work has achieved.


While the donut shows you the exact number of mentions and their sources, the Altmetric Attention Score expresses the overall level of online activity that the work has seen.
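Conceptually, the Attention Score is a weighted count of mentions: a news story "weighs" more than a single tweet. The sketch below is a toy illustration only, using rough approximations of the default source weights Altmetric has published; the real score also applies audience and bias adjustments, so this is not the actual formula.

```python
# Toy illustration of a weighted attention score. The weights below are
# approximations of Altmetric's published defaults, NOT the exact formula:
# the real score also adjusts for audience reach and potential bias.
SOURCE_WEIGHTS = {
    "news": 8.0,
    "blogs": 5.0,
    "wikipedia": 3.0,
    "twitter": 1.0,
    "facebook": 0.25,
}

def toy_attention_score(mentions: dict) -> float:
    """Weighted sum of mention counts across source types."""
    return sum(SOURCE_WEIGHTS.get(src, 0.0) * n for src, n in mentions.items())

mentions = {"news": 2, "blogs": 1, "twitter": 30, "facebook": 4}
print(toy_attention_score(mentions))  # 2*8 + 1*5 + 30*1 + 4*0.25 = 52.0
```

The weighting explains why two very different attention profiles (a handful of news stories versus hundreds of tweets) can end up with similar scores, which is exactly why the donut's source breakdown matters.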



 The altmetrics donut (source: https://www.altmetric.com/about-our-data/the-donut-and-score/)



Plum Analytics, acquired by Elsevier, is an integral part of many existing platforms and journal websites, such as Digital Commons, EBSCOhost, EBSCO Discovery Service, Mendeley, ScienceDirect, Scopus, and SSRN.


Each of these websites is automatically connected to PlumX Metrics, which can be accessed with a single click.


All the citations and mentions on both platforms are easily accessible and transparent. Both services also stress their Twitter influence, as these “tweetations” can make up to 90% of altmetric mentions.
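As a concrete illustration of that accessibility, Altmetric also exposes a free public REST API that returns a JSON summary of mentions for a given DOI. The sketch below is a hedged example: the endpoint and field names follow Altmetric's v1 API as commonly documented, but the sample payload is invented for illustration, and a real script would fetch it over HTTP rather than hardcode it.

```python
import json

# Altmetric's public v1 API serves JSON for a DOI at a URL of the form:
#   https://api.altmetric.com/v1/doi/<DOI>
# The payload below is an invented sample for illustration only; it is not
# real data for any article.
sample_response = json.dumps({
    "title": "An example article",
    "score": 42.5,
    "cited_by_tweeters_count": 120,
    "cited_by_fbwalls_count": 8,
    "cited_by_feeds_count": 3,    # blogs
    "cited_by_msm_count": 1,      # mainstream news
    "cited_by_wikipedia_count": 0,
})

def summarize_attention(raw_json: str) -> dict:
    """Extract the attention score and per-source mention counts."""
    data = json.loads(raw_json)
    sources = {
        "twitter": data.get("cited_by_tweeters_count", 0),
        "facebook": data.get("cited_by_fbwalls_count", 0),
        "blogs": data.get("cited_by_feeds_count", 0),
        "news": data.get("cited_by_msm_count", 0),
        "wikipedia": data.get("cited_by_wikipedia_count", 0),
    }
    return {"score": data.get("score", 0.0), "sources": sources}

summary = summarize_attention(sample_response)
print(summary["score"])               # 42.5
print(summary["sources"]["twitter"])  # 120
```

Note how Twitter dominates the mention counts in this (invented) example, in line with the "tweetations" figure quoted above.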


Tweets can even predict highly cited articles within the first three days of publication (8, 13). For those interested in maximizing the Altmetric score of their work, a few tips are already available:


  • Publish open access
  • Advertise your work via social media profiles and network with relevant communities
  • Co-work with your organization or publisher’s social media and press team to share significant findings
  • Research altmetrics of similar articles within the same field to identify platforms that may be interested in your work (14)



However, what is it all for anyway?


Besides helping authors understand the impact of their articles, an attached score from Altmetric or Plum Analytics can be useful for improving the chances of an application for a promotion or a grant.


Altmetrics are a handy accessory to traditional impact metrics, allowing researchers to demonstrate the impact and worth of their work in ‘real time’ (4, 14).


The altmetrics status of a recent paper can stimulate interest and debate among readers, directly contributing to its impact. The impact factor, being a slow metric, would miss all of this attention (14).

After all, we want our science to reach as many people as possible, and altmetrics give us an overview of the marketing quotient of one's work.



Are there any downsides?


Even though the altmetrics donut looks attractive and can be impressive, it is essential to interpret Altmetric scores with caution. 


The popularity of an article does not prove that it is of high quality. Nor does substantial ‘media buzz’ around a paper always lead to a high number of traditional citations.


Very often, a very high altmetric score is associated with a publication that sparks controversy or public misconception.


A prime example is an article reporting no link between measles-mumps-rubella (MMR) vaccination and autism among US children with older siblings with autism. It has been cited only 70 times but has been viewed more than 153,000 times on social media because of the controversy surrounding vaccines (15,16).


Putting too much weight on altmetrics could also lead to negative consequences, such as the aggressive marketing of scientific articles on online platforms.


This might be the main drawback of altmetrics: the barrier to cheating is minimal. It is very easy to create fake interest on social media, which can sometimes even be purchased (17).


There is real value in altmetrics, but they cannot offer us absolute answers. Altmetrics are also not yet well researched, and they can be misleading, as every altmetrics provider has a different method of calculating its attention score.


However, knowing how vital Web 2.0 now is in the life of researchers, it is worth keeping an eye on altmetrics while browsing scientific work.


Who knows, maybe one day they will replace the traditional impact factor!



References

  1. https://www.nature.com/articles/d41586-019-02479-7
  2. https://scholarlykitchen.sspnet.org/2018/06/13/journal-growth-lowers-impact-factor/
  3. Abaci A. Scientific competition, impact factor, and Altmetrics. Anatol J Cardiol. 2017;18(5):313.
  4. Baheti AD, Bhargava P. Altmetrics: A Measure of Social Attention toward Scientific Research. Curr Probl Diagn Radiol. 2017;46(6):391-2.
  5. https://pitt.libguides.com/altmetrics
  6. Repiso R, Castillo-Esparcia A, Torres-Salinas D. Altmetrics, alternative indicators for Web of Science Communication studies journals. Scientometrics. 2019;119(2):941-58.
  7. Priem J, Groth P, Taraborelli D. The altmetrics collection. PLoS One. 2012;7(11):e48753.
  8. Said A, Bowman TD, Abbasi RA, Aljohani NR, Hassan S-U, Nawaz R. Mining network-level properties of Twitter altmetrics data. Scientometrics. 2019;120(1):217-35.
  9. https://pitt.libguides.com/altmetrics
  10. https://www.altmetric.com
  11. https://plumanalytics.com
  12. https://www.altmetric.com/about-our-data/the-donut-and-score/
  13. Eysenbach G. Can tweets predict citations? Metrics of social impact based on Twitter and correlation with traditional metrics of scientific impact. J Med Internet Res. 2011;13(4):e123.
  14. Griffin SA, Oliver CW, Murray A. 'Altmetrics'! Can you afford to ignore it? Br J Sports Med. 2018;52(18):1160-1.
  15. Warren HR, Raison N, Dasgupta P. The Rise of Altmetrics. JAMA. 2017;317(2):131-2.
  16. Jain A, Marshall J, Buikema A, Bancroft T, Kelly JP, Newschaffer CJ. Autism occurrence by MMR vaccine status among US children with older siblings with and without autism. JAMA. 2015;313(15):1534-40.
  17. Crotty D. Altmetrics. Eur Heart J. 2017;38(35):2647-8.


Written by Agnieszka Szmitkowska 











Edited and reviewed by Somsuvro Basu and Markus Dettenhofer