
Comparison of Moderated and Unmoderated Remote Usability Sessions for Web-Based Simulation Software: A Randomized Controlled Trial

  • Conference paper
Human-Computer Interaction. Theoretical Approaches and Design Methods (HCII 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13302)


Abstract

Usability studies are a crucial part of developing user-centered designs, and they can be conducted using a variety of methods. Unmoderated usability surveys are more efficient and cost-effective than moderated ones and lend themselves better to larger participant pools. However, unmoderated surveys can yield more unreliable data because of participants’ careless responding (CR). In this study, we compared remote moderated and remote unmoderated usability testing sessions for web-based simulation and modeling software. The study was conducted with 72 participants who were randomly assigned to a moderated or an unmoderated group. Our results show that moderated sessions produced more reliable data on most of the tested outcomes, and that data from the unmoderated sessions required additional screening to filter out unreliable responses. We discuss methods for isolating unreliable data and recommend ways of managing it.
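
As a concrete illustration of the kind of screening the abstract alludes to, the sketch below applies two checks commonly used in the careless-responding literature to unmoderated survey data: a completion-time cutoff and a "longstring" (straightlining) check. The `Response` structure, field names, and thresholds are illustrative assumptions for this sketch, not the screening criteria used in the paper itself.

```python
# Minimal sketch of two common careless-responding (CR) screens for
# unmoderated survey data: a completion-time cutoff and a "longstring"
# (straightlining) check. All names and thresholds are illustrative
# assumptions, not the criteria used in this paper.

from dataclasses import dataclass


@dataclass
class Response:
    participant_id: str
    seconds_to_complete: float
    likert_answers: list[int]  # e.g., 1-5 ratings across survey items


def longest_run(values: list[int]) -> int:
    """Length of the longest run of identical consecutive answers."""
    longest = current = 1
    for prev, curr in zip(values, values[1:]):
        current = current + 1 if curr == prev else 1
        longest = max(longest, current)
    return longest


def flag_careless(responses: list[Response],
                  min_seconds: float = 120.0,
                  max_run: int = 8) -> list[Response]:
    """Flag responses completed implausibly fast or heavily straightlined."""
    return [
        r for r in responses
        if r.seconds_to_complete < min_seconds
        or longest_run(r.likert_answers) > max_run
    ]


if __name__ == "__main__":
    sample = [
        Response("p01", 45.0, [3] * 12),             # fast and straightlined
        Response("p02", 610.0, [4, 2, 5, 3, 4, 1]),  # plausible pattern
    ]
    for r in flag_careless(sample):
        print(f"flag for manual review: {r.participant_id}")
```

In practice, flagged records are usually set aside for manual review or sensitivity analysis rather than dropped automatically, since aggressive filtering can itself bias the sample.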



Acknowledgment

This work was supported in part by Insight grant 21820 from the Social Sciences and Humanities Research Council. The authors would also like to thank the members of DaTALab (www.datalab.science) for proofreading the paper and providing input.

Author information

Corresponding author

Correspondence to Andrew Fisher.



Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Khayyatkhoshnevis, P., Tillberg, S., Latimer, E., Aubry, T., Fisher, A., Mago, V. (2022). Comparison of Moderated and Unmoderated Remote Usability Sessions for Web-Based Simulation Software: A Randomized Controlled Trial. In: Kurosu, M. (eds) Human-Computer Interaction. Theoretical Approaches and Design Methods. HCII 2022. Lecture Notes in Computer Science, vol 13302. Springer, Cham. https://doi.org/10.1007/978-3-031-05311-5_16


  • DOI: https://doi.org/10.1007/978-3-031-05311-5_16


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-05310-8

  • Online ISBN: 978-3-031-05311-5

  • eBook Packages: Computer Science, Computer Science (R0)
