
      The H-index is an unreliable research metric for evaluating the publication impact of experimental scientists: Evaluating publication impact from experimental work

      Preprint
      In review
      ScienceOpen Preprints, ScienceOpen
      Keywords: original research, perspectives, reviews, citations, scientific reputation, research community, prizes

            Abstract

            Research metrics are often used to assess the reputation of scientists. One commonly employed research metric is the H-index, which measures the publication impact of scientists. But how is it conceivable for a scientist with no distinguished track record in an experimental field to generate greater publication impact than prize-winning scientists? The answer: by resorting to a publishing strategy that places less focus on experimental innovations. I make the case here that the H-index is an abysmal metric for evaluating experimental researchers and that an alternative, experiment-oriented metric is sorely needed to quantify the work of experimental scientists.
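The H-index discussed in the abstract has a simple definition: the largest number h such that the author has h papers each cited at least h times. A minimal sketch of that computation (the function name `h_index` is illustrative, not from the article):

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    # Sort citation counts from highest to lowest, then walk down the list:
    # the i-th paper (1-based) contributes to h only if it has >= i citations.
    ranked = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(ranked, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Example: five papers cited 10, 8, 5, 4 and 3 times give an H-index of 4,
# because four papers have at least 4 citations each, but not five with 5.
# h_index([10, 8, 5, 4, 3]) -> 4
```

Note that the metric ignores what the citations are for: a large set of moderately cited non-experimental papers can yield the same h as a smaller set of landmark experimental results, which is the weakness the article targets.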


            Author and article information

            Journal: ScienceOpen Preprints (ScienceOpen)
            Published: 4 February 2024
            Affiliations: [1] UAE University (https://ror.org/01km6p862)
            Author information: https://orcid.org/0000-0002-8805-0373
            DOI: 10.14293/PR2199.000690.v1

            This work has been published open access under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Conditions, terms of use and publishing policy can be found at www.scienceopen.com.

            History: 4 February 2024
            Funding: United Arab Emirates University (funder ID: http://dx.doi.org/10.13039/501100006013); Award ID: UPAR 12S011
            Data availability: All data generated or analysed during this study are included in this published article (and its supplementary information files).
            Categories: Assessment, Evaluation & Research Methods; Biotechnology

