The Impact of Thinking-Aloud on Usability Inspection

Published: 18 June 2020

Abstract

This study compared the results of a usability inspection conducted under two conditions: an explicit concurrent think-aloud that required explanations, and silent working. Twelve student analysts inspected two travel websites, both thinking aloud and working in silence, to produce a set of problem predictions. Overall, the silent working condition produced more initial predictions, but the think-aloud condition yielded a greater proportion of accurate predictions, as revealed by falsification testing. The analysts used a range of problem discovery methods: system searching was favoured in the silent working condition, and the more active goal-playing discovery method in the think-aloud condition. Thinking aloud was also associated with a broader spread of knowledge resources.


Cited By

  • (2024) airTac: A Contactless Digital Tactile Receptor for Detecting Material and Roughness via Terahertz Sensing. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 8(3), 1-37. DOI: 10.1145/3678586. Online publication date: 9 Sep 2024
  • (2023) MoCaPose. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 7(1), 1-40. DOI: 10.1145/3580883. Online publication date: 28 Mar 2023
  • (2022) Interaction with Touch-Sensitive Knitted Fabrics: User Perceptions and Everyday Use Experiments. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-20. DOI: 10.1145/3491102.3502077. Online publication date: 29 Apr 2022
  • (2022) Comparison of Moderated and Unmoderated Remote Usability Sessions for Web-Based Simulation Software: A Randomized Controlled Trial. Human-Computer Interaction: Theoretical Approaches and Design Methods, 232-251. DOI: 10.1007/978-3-031-05311-5_16. Online publication date: 26 Jun 2022


Published In

Proceedings of the ACM on Human-Computer Interaction, Volume 4, Issue EICS
June 2020, 534 pages
EISSN: 2573-0142
DOI: 10.1145/3407187
Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 18 June 2020
Online AM: 07 May 2020
Published in PACMHCI Volume 4, Issue EICS

Author Tags

  1. evaluation resources
  2. heuristic evaluation
  3. think-aloud
  4. usability inspection

Qualifiers

  • Research-article
