Ranking journals by voting with feet: a new method for journal evaluation


Abstract

This paper introduces the Voting with Feet Index (VFI), a new method for evaluating the quality of academic journals. Previous approaches rely on citation analysis or expert surveys, which have well-known limitations: citation counts do not necessarily reflect paper quality, and surveys are time-consuming and expensive. The VFI instead holds that a journal's quality should be determined by the quality of the papers it publishes. Because assessing every paper individually is impractical, paper quality is measured through proxies such as important awards, author reputation, and institutional affiliation; the study also examines the relative importance of these proxies and how they might be combined. To rank journals, the VFI analyzes the distribution across journals of papers that have received important awards, papers authored by notable scholars, and papers affiliated with distinguished institutions, and computes a score for each journal. The VFI was applied to 40 Management Science & Operations Research (MS & OR) journals and compared with impact-factor and expert-survey rankings. The results show a strong correlation between the VFI and these rankings, particularly for high-level journals. In addition, rankings of selected physics and chemistry journals were derived from Nobel Prize data; this analysis both confirmed the VFI's accuracy in identifying top journals and highlighted important considerations for constructing the indicator. Overall, the VFI offers a promising new approach to assessing journal quality that addresses the limitations of existing methods.
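To make the scoring idea concrete, the following is a minimal sketch, in Python, of one plausible reading of the share-based scoring described above: each journal is scored by its share of the papers flagged under each proxy, and the proxy scores are then combined. The journal names are hypothetical and the equal-weight average is an assumption made purely for illustration; the paper's actual definition of the VFI and its weighting of the proxies are given in the article itself.

```python
# Illustrative sketch only (not the paper's formula): score each journal by its
# share of the papers flagged under each proxy, then average the proxy scores.
from collections import Counter

def share_scores(flagged_paper_journals):
    """Fraction of flagged papers that appeared in each journal."""
    counts = Counter(flagged_paper_journals)
    total = sum(counts.values())
    return {journal: n / total for journal, n in counts.items()}

def combined_score(journal, *proxy_paper_lists):
    """Equal-weight average of the journal's shares across proxies (assumed)."""
    proxies = [share_scores(papers) for papers in proxy_paper_lists]
    return sum(p.get(journal, 0.0) for p in proxies) / len(proxies)

# Hypothetical data: the journal of each flagged paper under each proxy.
award_papers = ["Journal A", "Journal A", "Journal B"]
scholar_papers = ["Journal A", "Journal B", "Journal B", "Journal C"]
institution_papers = ["Journal A", "Journal C"]

print(combined_score("Journal A", award_papers, scholar_papers, institution_papers))
```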


Notes

  1. The AJG Journal Directory, published by the Chartered Association of Business Schools, serves as a comprehensive guide to high-quality academic journals. Its primary objective is to provide academics with a clear understanding of the purpose of each journal. AJG journals are currently categorized into five grades, namely 1, 2, 3, 4, and 4*, with 4* being the highest and 1 the lowest.

  2. In statistics, an event whose probability of occurrence falls below a certain threshold is regarded as a small-probability event. This threshold is typically set between 0.01 and 0.05.

  3. The variable n denotes a large positive integer; in this study we use n = 40 to illustrate the relationship between the correlation coefficient and the sample size (see the first sketch following these notes). The appropriate choice of n may vary with the research question and the available data, but this analysis is limited to values of n up to 40.

  4. 242 is the total number of INFORMS Fellows; the data were retrieved on March 1, 2022.

  5. The score assigned to a journal on the basis of important awards is the journal's share of all award-winning papers published. Like a probability, this score can be expressed as a percentage, with a score of 0.05 corresponding to 5%, which makes the two directly comparable. For example, if the exclusion threshold is set to 0.05, journals that publish less than 5% of all award-winning papers are excluded (see the second sketch following these notes).

  6. In the process of calculating the average, an AJG score of 4* is considered equivalent to 5.

  7. The Shanghai Ranking, also known as the Academic Ranking of World Universities (ARWU), is an annual publication of university rankings, first published in 2003 by Shanghai Jiao Tong University in China. It assesses and compares the performance of universities worldwide on various indicators, focusing primarily on research output and academic excellence. The table displays the scores of the corresponding journals in the Shanghai Ranking, which are determined by experts' votes: a score of 75% indicates that 75% of the experts consider the journal to be a top-ranked journal.
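The first sketch below, referenced in Note 3, is a minimal illustration under the assumption that the note concerns the significance threshold of the Pearson correlation coefficient: the critical value of |r| shrinks as the sample size n grows, so at n = 40 even moderate correlations are statistically significant.

```python
# Minimal sketch (assumption: Note 3 refers to the significance threshold of
# the Pearson correlation). The two-sided critical |r| at level alpha follows
# from the t-distribution with n - 2 degrees of freedom.
from scipy.stats import t

def critical_pearson_r(n, alpha=0.05):
    """Two-sided critical |r| for testing H0: rho = 0 with n paired observations."""
    df = n - 2
    t_crit = t.ppf(1 - alpha / 2, df)
    return t_crit / (t_crit**2 + df) ** 0.5

for n in (10, 20, 30, 40):
    print(n, round(critical_pearson_r(n), 3))
# At n = 40 (the value used in this study), |r| above roughly 0.31 is already
# significant at the 0.05 level.
```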

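The second sketch, referenced in Note 5, illustrates the threshold rule with hypothetical scores: since a journal's award-based score is its share of all award-winning papers, a 0.05 cut-off drops journals that publish fewer than 5% of those papers.

```python
# Hypothetical award-based scores (shares of all award-winning papers);
# the journal names and values are made up for illustration.
scores = {"Journal A": 0.50, "Journal B": 0.40, "Journal C": 0.07, "Journal D": 0.03}

threshold = 0.05  # exclude journals with less than 5% of award-winning papers
kept = {journal: share for journal, share in scores.items() if share >= threshold}
print(kept)  # "Journal D" (3% < 5%) is excluded
```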

Acknowledgements

This work was financially supported by the Natural Science Foundation of Xinjiang Uygur Autonomous Region (Grant No. 2023D01C28), the Xinjiang University On-Campus Incubation Program (Grant No. 22CPY036), the Ph.D. Scientific Research Start-up Project of Xinjiang University (Grant No. BS202104), the Tianchi Doctoral Project of Xinjiang (Grant No. TCBS202050), and the Xinjiang High-level Talents Tianchi Program (Grant No. TCBR202104).

Funding

This work was financially supported by the Xinjiang University On-Campus Incubation Program (Grant No. 22CPY036), the Ph.D. Scientific Research Start-up Project of Xinjiang University (Grant No. BS202104), the Tianchi Doctoral Project of Xinjiang (Grant No. TCBS202050), and the Xinjiang High-level Talents Tianchi Program (Grant No. TCBR202104).

Author information


Corresponding author

Correspondence to Guo-liang Yang.

Ethics declarations

Competing interests

The authors declare they have no financial interests.

Informed consent

(1) This article has not been published in whole or in part elsewhere; (2) the manuscript is not currently being considered for publication in another journal; (3) all authors have been personally and actively involved in substantive work leading to the manuscript, and will hold themselves jointly and individually responsible for its content.

Research involving human participants and/or animals

This article does not include any research involving humans or animals.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

See Tables 5 and 6.

Table 5 INFORMS best paper awards by research communities
Table 6 The full name and abbreviations of journals

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Chen, K., Liu, X., Wupur, A. et al. Ranking journals by voting with feet: a new method for journal evaluation. Scientometrics 129, 1567–1588 (2024). https://doi.org/10.1007/s11192-023-04888-y
