The Impact of Data Source on the Ranking of Computer Scientists Based on Citation Indicators: A Comparison of Web of Science and Scopus.
DOI: https://doi.org/10.29173/istl1596
Abstract
Conference proceedings represent a large part of the literature in computer science. Two Conference Proceedings Citation Index databases were merged into Web of Science in 2008, but few studies have evaluated the effect of that merger on citation indicators in computer science in comparison with other databases. This study explores whether the addition of the Conference Proceedings Citation Indexes to Web of Science has changed citation analysis results when compared to Scopus. It compares the citation data of 25 randomly selected computer science faculty at Canadian universities in Web of Science (with the Conference Proceedings Citation Indexes) and Scopus. The results show that Scopus retrieved considerably more publications, including both conference proceedings and journal articles. Scopus also generated higher citation counts and h-indices than Web of Science in this field, though the relative citation rankings from the two databases were similar. Either database could therefore be used if a relative ranking is sought; if the purpose is a more complete citation count or a higher h-index, Scopus is preferable. No matter which source is used, citation analysis as a tool for research performance assessment must be constructed and applied with caution because of its technological and methodological limitations.
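For readers unfamiliar with the h-index (Hirsch 2005, cited below) that underlies the comparison, the following short Python sketch shows how the metric is computed from a list of per-publication citation counts. It is illustrative only and not taken from the study; the sample citation counts are hypothetical.

def h_index(citation_counts):
    """Return the h-index: the largest h such that the author has
    at least h publications each cited at least h times."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical example: one author's citation counts as retrieved from
# two databases; broader coverage tends to raise both totals and h.
wos_counts = [45, 30, 12, 9, 4, 2]
scopus_counts = [52, 34, 15, 11, 7, 5, 3]
print(h_index(wos_counts))     # 4
print(h_index(scopus_counts))  # 5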
References
Bar-Ilan, J. 2006. An ego-centric citation analysis of the works of Michael O. Rabin based on multiple citation indexes. Information Processing & Management 42(6):1553-66. DOI: https://doi.org/10.1016/j.ipm.2006.03.019
Bar-Ilan, J. 2010. Web of Science with the Conference Proceedings Citation Indexes: The case of computer science. Scientometrics 83(3):809-24. DOI: https://doi.org/10.1007/s11192-009-0145-4
Butler, L. 2007. Assessing university research: A plea for a balanced approach. Science and Public Policy 34(8):565-74. DOI: https://doi.org/10.3152/030234207X254404
Case, D.O. and Higgins, G.M. 2000. How can we investigate citation behavior? A study of reasons for citing literature in communication. Journal of the American Society for Information Science and Technology 51(7):635-45. DOI: https://doi.org/10.1002/(SICI)1097-4571(2000)51:7<635::AID-ASI6>3.0.CO;2-H
Cohoon, J.M., Shwalb, R. and Chen, L.-Y. 2003. Faculty turnover in CS departments. SIGCSE Bulletin 35(1):108-12. DOI: https://doi.org/10.1145/792548.611944
Cole, S., Cole, J.R., and Simon, G.A. 1981. Chance and consensus in peer review. Science 214(4523):881-6. DOI: https://doi.org/10.1126/science.7302566
Cronin, B. 2000. Semiotics and evaluative bibliometrics. Journal of Documentation 56(4):440-53. DOI: https://doi.org/10.1108/EUM0000000007123
De Sutter, B. and Van Den Oord, A. 2012. To be or not to be cited in computer science. Communications of the ACM 55(8):69-75. DOI: https://doi.org/10.1145/2240236.2240256
Elsevier. Content Overview: Scopus. [Updated 2013]. [Internet]. [cited 2013 November 25]. Available from: http://www.elsevier.com/online-tools/scopus/content-overview
Feltes, C., Gibson, D.S., Miller, H., Norton, C., and Pollock, L. 2012. Envisioning the future of science libraries at academic research institutions. [Internet]. [cited 2013 September 15]. Available from: https://repository.lib.fit.edu/handle/11141/10. DOI: https://doi.org/10.14224/1.26505
Franceschet, M. 2010. A comparison of bibliometric indicators for computer science scholars and journals on Web of Science and Google Scholar. Scientometrics 83(1):243-58. DOI: https://doi.org/10.1007/s11192-009-0021-2
Garfield, E. 1955. Citation indexes for science - new dimension in documentation through association of ideas. Science 122(3159):108-11. DOI: https://doi.org/10.1126/science.122.3159.108
Goodrum, A., McCain, K., Lawrence, S., and Giles, C. 2001. Scholarly publishing in the Internet age: A citation analysis of computer science literature. Information Processing & Management 37(5):661-75. DOI: https://doi.org/10.1016/S0306-4573(00)00047-9
Hirsch, J. 2005. An index to quantify an individual's scientific research output. Proceedings of the National Academy of Sciences of the United States of America 102(46):16569-72. DOI: https://doi.org/10.1073/pnas.0507655102
Horrobin, D.F. 1990. The philosophical basis of peer review and the suppression of innovation. Journal of the American Medical Association 263(10):1438-41. DOI: https://doi.org/10.1001/jama.263.10.1438
Jacso, P. 2008. The pros and cons of computing the h-index using Scopus. Online Information Review 32(4):524-35. DOI: https://doi.org/10.1108/14684520810897403
Jacso, P. 2009. Database source coverage: Hypes, vital signs and reality checks. Online Information Review 33(5):997-1007. DOI: https://doi.org/10.1108/14684520911001963
Jacso, P. 2010. Pragmatic issues in calculating and comparing the quantity and quality of research through rating and ranking of researchers based on peer reviews and bibliometric indicators from Web of Science, Scopus and Google Scholar. Online Information Review 34(6):972-82. DOI: https://doi.org/10.1108/14684521011099432
Jacso, P. 2011. The h-index, h-core citation rate and the bibliometric profile of the Scopus database. Online Information Review 35(3):492-501. DOI: https://doi.org/10.1108/14684521111151487
Meho, L.I. and Rogers, Y. 2008. Citation counting, citation ranking, and h-index of human-computer interaction researchers: A comparison of Scopus and Web of Science. Journal of the American Society for Information Science and Technology 59(11):1711-26. DOI: https://doi.org/10.1002/asi.20874
Moed, H.F. 2007. The future of research evaluation rests with an intelligent combination of advanced metrics and transparent peer review. Science and Public Policy 34(8):575-83. DOI: https://doi.org/10.3152/030234207X255179
Moed, H.F. and Visser, M.S. 2007. Developing bibliometric indicators of research performance in computer science: An exploratory study. CWTS, Leiden. [Internet]. [cited 2013 August 25]. Available from: http://www.cwts.nl/pdf/NWO_Inf_Final_Report_V_210207.pdf
Smith, R. 1988. Problems with peer review and alternatives. British Medical Journal 296(6624):774-7. DOI: https://doi.org/10.1136/bmj.296.6624.774
University Rankings. [Updated 2013]. [Internet]. Maclean's. [cited 2013 May 02]. Available from: http://oncampus.macleans.ca/education/rankings/
Lisée, C., Larivière, V., and Archambault, É. 2008. Conference proceedings as a source of scientific information: A bibliometric analysis. Journal of the American Society for Information Science and Technology 59(11):1776-84. DOI: https://doi.org/10.1002/asi.20888
Van Raan, A.F. 1996. Advanced bibliometric methods as quantitative core of peer review based evaluation and foresight exercises. Scientometrics 36(3):397-420. DOI: https://doi.org/10.1007/BF02129602
Van Raan, A.F.J. 2005. Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics 62(1):133-43. DOI: https://doi.org/10.1007/s11192-005-0008-6
Wainer, J., Przibisczki de Oliveira, H., and Anido, R. 2010. Patterns of bibliographic references in the ACM published papers. Information Processing & Management 47(1):135-42. DOI: https://doi.org/10.1016/j.ipm.2010.07.002
Wainer, J., Billa, C. and Goldenstein, S. 2011. Invisible work in standard bibliometric evaluation of computer science. Communications of the ACM 54(5):141-6. DOI: https://doi.org/10.1145/1941487.1941517
Wainer, J., Eckmann, M., Goldenstein, S., and Rocha, A. 2013. How productivity and impact differ across computer science subareas: How to understand evaluation criteria for CS researchers. Communications of the ACM 56(8):67-73. DOI: https://doi.org/10.1145/2492007.2492026
Weingart, P. 2005. Impact of bibliometrics upon the science system: Inadvertent consequences? Scientometrics 62(1):117-31. DOI: https://doi.org/10.1007/s11192-005-0007-7
Zhao, D. and Logan, E. 2002. Citation analysis using scientific publications on the web as data source: A case study in the XML research area. Scientometrics 54(3):449-72. DOI: https://doi.org/10.1023/A:1016090601710
License
Copyright (c) 2014 Li Zhang
This work is licensed under a Creative Commons Attribution 4.0 International License.
While ISTL has always been open access and authors have always retained the copyright of their papers without restrictions, articles in issues prior to no.75 were not licensed with Creative Commons licenses. Since issue no. 75 (Winter 2014), ISTL has licensed its work through Creative Commons licenses. Please refer to the Copyright and Licensing Information page for more information.