Invited Commentary
J Skin Sex Transm Dis 2020;2(1):2-4
doi: 10.25259/JSSTD_7_2020

Citation indices

Department of Dermatology, College of Medicine, King Faisal University, Hofuf, Saudi Arabia
Amanza Health Care [Nahas Skin Clinic], Perinthalmanna, Kerala, India
Corresponding author: Dr. Feroze Kaliyadan, Department of Dermatology, College of Medicine, King Faisal University, Hofuf 31982, Saudi Arabia. ferozkal@hotmail.com
Licence
This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-Share Alike 4.0 License, which allows others to remix, tweak, and build upon the work non-commercially, as long as the author is credited and the new creations are licensed under the identical terms.

How to cite this article: Kaliyadan F, Ashique KT. Citation indices. J Skin Sex Transm Dis 2020;2(1):2-4.

Abstract

The impact of research is generally measured through citation counts. For author impact, the most commonly considered indices are the h-index, i10-index, and g-index, of which the h-index is the most widely used. Various resources are available for retrieving researcher h-indices; the databases most commonly used for this purpose are Scopus, Google Scholar, and Web of Science. Ethical issues related to the use of these resources for h-index calculation include gaming/manipulation and fake citations. An issue that we have noticed cropping up of late is researchers claiming erroneous h-indices.

Keywords

h-index
Citations
Ethics

INTRODUCTION

From the old adage of “publish or perish,” academia seems to be moving toward “cite or get out!” The value of one’s research is increasingly judged by the actual impact it has rather than by quantity alone. One obvious measure of impact is the number of citations that one’s research receives. There are different indices to calculate the impact of a journal, such as the journal impact factor (Clarivate), CiteScore (Scopus-Elsevier), and newer ones like Altmetrics (which factor in social media impact). For authors, there are also different measures of impact, the most commonly used being the h-index; others include the i10-index and the g-index.

COMMON INDICES

The h-index (also called the Hirsch index or Hirsch number) is defined as “the maximum value of h such that the given author/journal has published h papers that have each been cited at least h times.”[1] The i10-index refers to the number of papers that the author has with at least 10 citations each.

For example, imagine that a researcher has 20 papers, 14 of which have no citations, while the remaining six have 120, 35, 10, 5, 3, and 2 citations each. The h-index will be 4 (4 papers with at least 4 citations each) and the i10-index will be 3 (3 papers with at least 10 citations each). As another example, imagine that a researcher has 7 papers, with the citation counts for each paper, in descending order, as shown in Table 1.

Table 1: Sample citation table.

Paper number    Citation count
1               8
2               7
3               7
4               3
5               3
6               2
7               1

In this case, the maximum value of h (the number of papers cited at least h times each) is 3: the top 3 papers each have at least 3 citations, but the 4th paper has only 3 citations, fewer than 4. However, if the author gets one more citation for paper number 4 in Table 1, the h-index will go up to 4.
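As a minimal illustrative sketch (in Python, with hypothetical function names that are not any standard API), the h-index and i10-index can be computed from a plain list of citation counts as follows; the printed values reproduce the two worked examples above.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h


def i10_index(citations):
    """Number of papers with at least 10 citations each."""
    return sum(1 for c in citations if c >= 10)


# First example: 20 papers, 14 uncited, six with the citation counts below.
example = [120, 35, 10, 5, 3, 2] + [0] * 14
print(h_index(example), i10_index(example))   # 4 3

# Table 1: the h-index is 3; one extra citation for paper 4 raises it to 4.
table_1 = [8, 7, 7, 3, 3, 2, 1]
print(h_index(table_1))                       # 3
print(h_index([8, 7, 7, 4, 3, 2, 1]))         # 4
```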

For the g-index, which is less commonly used, all articles are ranked in decreasing order of the number of citations they received; the g-index is the unique largest number g such that the top g articles together received at least g² citations.[2] For example, a g-index of 10 means that the academic’s 10 most cited articles have together received at least 100 citations.
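A similar minimal sketch (again in Python, with an illustrative function name) accumulates the citations of the top-ranked papers and keeps the largest rank g whose running total reaches g²; applied to Table 1, the top 5 papers together hold 28 ≥ 25 citations, giving a g-index of 5. Note that, for the same set of papers, the g-index is never smaller than the h-index.

```python
def g_index(citations):
    """Largest g such that the top g papers together have at least g*g citations."""
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, count in enumerate(ranked, start=1):
        total += count
        if total >= rank * rank:
            g = rank
    return g


# Table 1: cumulative counts are 8, 15, 22, 25, 28, 30, 31, so the top 5
# papers hold 28 >= 25 citations, while the top 6 hold only 30 < 36.
print(g_index([8, 7, 7, 3, 3, 2, 1]))  # 5
```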

ONLINE TOOLS FOR CITATION COUNTS

The h-index is now considered one of the standard, universally accepted measures of a researcher’s scientific output and citation impact. There are different resources available for retrieving a researcher’s h-index. The most commonly used are Scopus (https://www.scopus.com), Google Scholar (https://scholar.google.com/), and the Web of Science (https://clarivate.com/webofsciencegroup/solutions/web-of-science/). However, there can be errors and variations in the h-index calculation across these databases.[2] This could be because of various factors, such as the exclusion of valid scientific papers or the inclusion of erroneous ones due to issues like similar author names. Some of the databases have also been plagued by fake and duplicate citations.[4] It is, therefore, important for authors to regularly check the accuracy of the publications and citations listed under their names in these databases.

There are some ethical issues that need to be considered while using these resources for h-index calculation. There have been reports suggesting that h-indices can be manipulated and boosted by increasing self-citations.[3]

A relatively new ethical issue that has been noticed with these databases is authors wrongly claiming, knowingly or unknowingly, other authors’ publications, fake citations, or duplicate publications, and using them in their resumes.

Such errors would be understandable if the author has not officially “claimed” his or her profile. It would also be understandable, to some extent, that authors with a significantly large number of publications might overlook some erroneous publications listed under their name.

However, if a significant number of publications are still wrongly listed after one’s profile has been claimed, we feel that this would amount to scientific misconduct, especially when it substantially boosts one’s h-index and is used for purposes like career advancement or grants.

For all three databases mentioned above, it is relatively simple to delete publications erroneously attached to one’s profile. It is also easy to add publications and to make corrections to the author profile (like combining different profiles that belong to the same author).

It is, therefore, important for researchers to understand the limitations of databases used for h-index calculation. After claiming a profile on any or all of these databases, it would be important to confirm the authenticity of the publications and citations listed under one’s name and to regularly update and verify the same. Administrators at universities and research centers should also be vigilant regarding this issue and should encourage researchers working under them to regularly check and update the accuracy of the information in the databases used to calculate their citation impact.

CONCLUSIONS

It is important for authors, especially those relatively new to scientific publishing, to be aware of the importance of citations, the author indices based on them, and the tools for checking one’s citation impact. The most common index used for author citation impact is the h-index. The most common sources for h-index calculation include Google Scholar, Scopus, and Web of Science (Clarivate Analytics). The h-indices across these sites might vary, and authors also need to be familiar with the ethical issues associated with the use of these indices.

Declaration of patient consent

Not required as there are no patients in this article.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

References

  1. Hirsch JE. An index to quantify an individual's scientific research output. Proc Natl Acad Sci U S A. 2005;102:16569-72.
  2. Bar-Ilan J. Which h-index? A comparison of WoS, Scopus and Google Scholar. Scientometrics. 2008;74:257-71.
  3. Bartneck C, Kokkelmans S. Detecting h-index manipulation through self-citation analysis. Scientometrics. 2011;87:85-98.
  4. Fake Citations Plague Some Google Scholar Profiles. Retraction Watch; 2014. Available from: https://www.retractionwatch.com/2014/11/17/fake-citations-plague-some-google-scholar-profiles [Last accessed on 2019 Dec 16]