DOI: 10.1145/3411764.3445584
CHI Conference Proceedings · Research article

Are You Open? A Content Analysis of Transparency and Openness Guidelines in HCI Journals

Published: 07 May 2021

Abstract

    Within the wider open science reform movement, HCI researchers are actively debating how to foster transparency in their own field. Publication venues play a crucial role in instituting open science practices, especially journals, whose procedures arguably lend themselves better to these practices than those of conferences. Yet we know little about the extent to which HCI journals currently support open science practices. We identified the 51 journals most frequently published in by recent CHI first authors and coded them according to the Transparency and Openness Promotion (TOP) guidelines, a high-profile standard for evaluating editorial practices. Results indicate that the journals in our sample currently do not set or specify clear openness and transparency standards. Out of a maximum score of 29, the modal score was 0 (mean = 2.5, SD = 3.6, max = 15). We discuss potential reasons, the aptness for HCI of guidelines rooted in the natural sciences, and next steps for the HCI community in furthering openness and transparency.
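    The abstract summarizes per-journal TOP scores with three descriptive statistics: the mode, the mean, and the standard deviation. As an illustrative sketch (not the authors' analysis code), these can be computed with Python's standard library; the `scores` list here is hypothetical example data, not the study's actual data.

    ```python
    # Sketch of the descriptive statistics reported in the abstract:
    # modal score, mean, and SD over a vector of journal TOP scores.
    # `scores` is HYPOTHETICAL example data, not the study's dataset.
    from statistics import mode, mean, stdev

    scores = [0, 0, 0, 1, 2, 4, 7, 15]  # hypothetical TOP scores (max possible: 29)

    print("mode:", mode(scores))
    print("mean:", round(mean(scores), 1))
    print("SD:", round(stdev(scores), 1))
    ```

    With many journals scoring 0, the mode lands at 0 even when a few journals score far higher, which is why the paper reports the mode alongside the mean and SD.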


    Cited By

    • (2024) Understanding fraudulence in online qualitative studies: From the researcher's perspective. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–17 (11 May 2024). https://doi.org/10.1145/3613904.3642732
    • (2023) Changes in Research Ethics, Openness, and Transparency in Empirical Studies between CHI 2017 and CHI 2022. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–23 (19 Apr 2023). https://doi.org/10.1145/3544548.3580848
    • (2022) The Value of Open Data in HCI: A Case Report from Mobile Text Entry Research. Multimodal Technologies and Interaction 6, 9, 71 (23 Aug 2022). https://doi.org/10.3390/mti6090071
    • (2022) Apéritif: Scaffolding Preregistrations to Automatically Generate Analysis Code and Methods Descriptions. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1–15 (29 Apr 2022). https://doi.org/10.1145/3491102.3517707
    • (2022) Transparent Practices for Quantitative Empirical Research. Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems, 1–5 (27 Apr 2022). https://doi.org/10.1145/3491101.3503760


      Published In

      CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems
      May 2021
      10862 pages
      ISBN:9781450380966
      DOI:10.1145/3411764

      Publisher

      Association for Computing Machinery

      New York, NY, United States



      Author Tags

      1. Transparency and Openness Promotion guidelines
      2. editorial practice
      3. open science

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Conference

      CHI '21

      Acceptance Rates

      Overall Acceptance Rate: 6,199 of 26,314 submissions (24%)


      Article Metrics

      • Downloads (last 12 months): 122
      • Downloads (last 6 weeks): 13
      Reflects downloads up to 11 Aug 2024.
