DOI: 10.1145/1518701.1518948

Let your users do the testing: a comparison of three remote asynchronous usability testing methods

Published: 04 April 2009

Abstract

Remote asynchronous usability testing is characterized by both a spatial and temporal separation of users and evaluators. This has the potential both to reduce practical problems with securing user attendance and to allow direct involvement of users in usability testing. In this paper, we report on an empirical study where we systematically compared three methods for remote asynchronous usability testing: user-reported critical incidents, forum-based online reporting and discussion, and diary-based longitudinal user reporting. In addition, conventional laboratory-based think-aloud testing was included as a benchmark for the remote methods. The results show that each remote asynchronous method supports identification of a considerable number of usability problems. Although this is only about half of the problems identified with the conventional method, it requires significantly less time. This makes remote asynchronous methods an appealing possibility for usability testing in many software projects.

Supplementary Material

JPG File (1518948.jpg)
Slides from the presentation (index.html)
Audio only (1518948.mp3)
Video (1518948.mp4)





Published In

CHI '09: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2009
2426 pages
ISBN:9781605582467
DOI:10.1145/1518701
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. asynchronous testing
  2. empirical study
  3. remote testing
  4. usability testing

Qualifiers

  • Research-article

Conference

CHI '09

Acceptance Rates

CHI '09 Paper Acceptance Rate: 277 of 1,130 submissions (25%)
Overall Acceptance Rate: 6,199 of 26,314 submissions (24%)


Article Metrics

  • Downloads (Last 12 months): 122
  • Downloads (Last 6 weeks): 14
Reflects downloads up to 20 Jan 2025

Cited By

  • (2024) Challenges of Remote XR Experimentation: Balancing the Trade-Offs for a Successful Remote Study. Proceedings of the 27th International Academic Mindtrek Conference, 294-300. DOI: 10.1145/3681716.3689451. Online publication date: 8-Oct-2024.
  • (2024) Remote HRI: a Methodology for Maintaining COVID-19 Physical Distancing and Human Interaction Requirements in HRI Studies. Information Systems Frontiers 26(1), 91-106. DOI: 10.1007/s10796-021-10162-4. Online publication date: 1-Feb-2024.
  • (2023) An Empirical Comparison of Moderated and Unmoderated Gesture Elicitation Studies on Soft Surfaces and Objects for Smart Home Control. Proceedings of the ACM on Human-Computer Interaction 7(MHCI), 1-24. DOI: 10.1145/3604245. Online publication date: 13-Sep-2023.
  • (2023) What's (Not) Working in Programmer User Studies? ACM Transactions on Software Engineering and Methodology 32(5), 1-32. DOI: 10.1145/3587157. Online publication date: 24-Jul-2023.
  • (2023) Prototypes, Platforms and Protocols: Identifying Common Issues with Remote, Unmoderated Studies and their Impact on Research Participants. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 1-6. DOI: 10.1145/3544549.3585836. Online publication date: 19-Apr-2023.
  • (2023) Making Usability Test Data Actionable! A Quantitative Test-Driven Prototyping Approach. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 1-6. DOI: 10.1145/3544549.3585659. Online publication date: 19-Apr-2023.
  • (2022) Observed differences between lab and online tests using the AttrakDiff semantic differential scale. Journal of Usability Studies 14(2), 65-75. DOI: 10.5555/3532689.3532691. Online publication date: 18-Apr-2022.
  • (2022) Exploring unmanned systems with virtual prototypes in augmented reality video. Proceedings of the 33rd European Conference on Cognitive Ergonomics, 1-10. DOI: 10.1145/3552327.3552338. Online publication date: 4-Oct-2022.
  • (2022) Critical Incident Technique and Gig-Economy Work (Deliveroo): Working with and Challenging Assumptions around Algorithms. Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems, 1-6. DOI: 10.1145/3491101.3519865. Online publication date: 27-Apr-2022.
  • (2022) Git-Truck: Hierarchy-Oriented Visualization of Git Repository Evolution. 2022 Working Conference on Software Visualization (VISSOFT), 131-140. DOI: 10.1109/VISSOFT55257.2022.00021. Online publication date: Oct-2022.
