DOI: 10.1145/1358628.1358654
Research article, CHI Conference Proceedings

A comparative evaluation of heuristic-based usability inspection methods

Published: 05 April 2008

Abstract

Given that heuristic evaluation (HE) remains a popular evaluation method among practitioners despite criticisms of its performance and reliability, there is a need to improve the method's performance. Several studies have shown HE-Plus, an emerging variant of HE, to outperform HE in both effectiveness and reliability. HE-Plus uses the same set of heuristics as HE; the only difference between the two methods is the 'usability problems profile' element in HE-Plus. This paper reports our attempt to verify the original profile employed in HE-Plus against the usability problem classification in the User Action Framework, together with an experiment evaluating the outcome by comparing HE with two profile-based HE variants (HE-Plus and HE++) and a control group. Our results confirmed the role of the 'usability problems profile' in improving the performance and reliability of heuristic evaluation: both HE-Plus and HE++ outperformed HE in effectiveness as well as reliability.
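The abstract compares methods on "effectiveness" and "reliability" without defining the measures; the standard UEM performance metrics in the literature (e.g., Hartson, Andre & Williges [8]) are thoroughness, validity, and their product, effectiveness. The sketch below is a hypothetical illustration of those textbook formulas, not the paper's actual computation, and the method names and numbers are invented for demonstration only.

```python
# Hypothetical sketch of standard UEM performance metrics
# (thoroughness, validity, effectiveness) as defined by
# Hartson et al. [8]. All figures below are illustrative,
# NOT data from the paper.

def thoroughness(found_real: int, total_real: int) -> float:
    """Share of the real problem set that the method detected."""
    return found_real / total_real

def validity(found_real: int, total_reported: int) -> float:
    """Share of reported problems that are real (not false alarms)."""
    return found_real / total_reported

def effectiveness(found_real: int, total_real: int, total_reported: int) -> float:
    """Combined measure: thoroughness x validity."""
    return thoroughness(found_real, total_real) * validity(found_real, total_reported)

# Illustrative comparison of three hypothetical evaluation sessions:
# (real problems found, total problems reported) per method.
sessions = {
    "HE": (12, 30),
    "HE-Plus": (18, 32),
    "HE++": (17, 28),
}
TOTAL_REAL = 40  # assumed size of the reference ("real") problem set

for method, (found, reported) in sessions.items():
    print(f"{method}: thoroughness={thoroughness(found, TOTAL_REAL):.2f}, "
          f"validity={validity(found, reported):.2f}, "
          f"effectiveness={effectiveness(found, TOTAL_REAL, reported):.2f}")
```

A higher thoroughness alone can simply reflect over-reporting, which is why validity is multiplied in: a method only scores well on effectiveness if it finds many real problems without flooding the report with false positives.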

References

[1]
AARP Audience-Centered Heuristics: Older Adults. http://www.redish.net/content/handouts/Audience-Centered_Heuristics.pdf.
[2]
Andre, T. S., Hartson, H.R., Belz, S.M. & McCreary, F.A. The user action framework: a reliable foundation for usability engineering support tools. International Journal of Human-Computer Studies, 54 (2001), 107--136.
[3]
Bailey, R. W. Heuristic evaluation vs User testing. UI design update newsletter - January 2001, http://www.humanfactors.com/downloads/jan01.asp
[4]
Chattratichart, J. & Brodie, J. Extending the heuristic evaluation method through contextualisation. In Proceedings of the 46th Annual Meeting of the Human Factors and Ergonomics Society, HFES (2002), 641--645.
[5]
Chattratichart, J. & Brodie, J. HE-Plus -- Towards usage-centered expert review for website design. In Proc. forUse 2003, MA:Ampersand Press (2003), 155--169.
[6]
Chattratichart, J. & Brodie, J. Applying User Testing Data to UEM Performance Metrics, In Proc. CHI 2004, ACM Press (2004), 1119--1122.
[7]
Gray, W. D., & Salzman, M. C. Damaged merchandise? A review of experiments that compare usability evaluation methods. Human-Computer Interaction, 13 (1998), 203--262.
[8]
Hartson, H. R., Andre, T. S., & Williges, R. W. Criteria for evaluating usability evaluation methods. International Journal of Human-Computer Interaction, 15(1) (2003), 145--181.
[9]
Khalayli, N., Nyhus, S., Hamnes, K., Terum, T. Persona based rapid usability kick-off. In Proc. CHI 2007, ACM Press (2007), 1771--1776.
[10]
Levi, M. D. & Conrad, G. F. A Heuristic Evaluation of a World Wide Web Prototype. http://www.bls.gov/ore/htm_papers/st960160.htm.
[11]
Lindgaard, G. & Chattratichart, J. Usability Testing: What Have We Overlooked? In Proc. CHI 2007, ACM Press (2007), 1415--1424.
[12]
Lund, A. M. The need for a standardized set of usability metrics. In Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting (pp. 688--691). Santa Monica, CA: Human Factors and Ergonomics Society (1998).
[13]
Molich, R., & Dumas, J. S. Comparative usability evaluation (CUE-4). Behaviour & Information Technology, Taylor & Francis [electronic version] (2006).
[14]
Nielsen, J. Heuristic Evaluation. http://www.useit.com/papers/heuristic/.
[15]
Nielsen, J. Usability Engineering. San Francisco: Morgan Kaufmann (1993).
[16]
Norman, D. A. Cognitive engineering. In D. A. Norman & S. W. Draper (Eds.) User Centered System Design: New Perspectives on Human-Computer Interaction, pp. 31--61. Hillsdale, NJ: Lawrence Erlbaum Associates (1986).
[17]
Perfetti, C. Usability Testing Best Practices: An Interview with Rolf Molich. http://www.webpronews.com/topnews/2003/07/30/usability-testing-best-practices-an-interview-with-rolf-molich.
[18]
Redish, G., Chisnell, D., & Lee, A. A new take on heuristic evaluation: Bringing personas, tasks, and heuristics together with a new model for understanding older adults as users. http://www.redish.net/content/talks.html.
[19]
Schaffer, E. Why "how many users" is just the wrong question, UI Design Newsletter -- May 2007: Insights from Human Factors International. http://www.humanfactors.com/downloads/may07.asp.
[20]
Sears, A. Heuristic Walkthroughs: Finding the problems without the noise. International Journal of Human-Computer Interaction, 9(3), (1997), 213--234.
[21]
The Webby Awards Judging Criteria. http://www.webbyawards.com/entries/criteria.php.

Published In

CHI EA '08: CHI '08 Extended Abstracts on Human Factors in Computing Systems
April 2008, 2035 pages
ISBN: 9781605580128
DOI: 10.1145/1358628

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. cue 4
    2. heuristic evaluation
    3. performance metrics
    4. uem


Conference

CHI '08
Overall Acceptance Rate: 6,164 of 23,696 submissions, 26%

Cited By

• (2022) Usability inspection: Novice crowd inspectors versus expert. Journal of Systems and Software, 183, 111122. DOI: 10.1016/j.jss.2021.111122. Online publication date: Jan 2022.
• (2018) Digital cross-channel usability heuristics. Journal of Usability Studies, 13(2), 52--72. DOI: 10.5555/3294038.3294040. Online publication date: 1 Feb 2018.
• (2017) Developing Specific Usability Heuristics for Evaluating the Android Applications. Mobile and Wireless Technologies 2017, 139--147. DOI: 10.1007/978-981-10-5281-1_15. Online publication date: 17 Jun 2017.
• (2016) Working beyond technical aspects. Proceedings of the 34th ACM International Conference on the Design of Communication, 1--10. DOI: 10.1145/2987592.2987607. Online publication date: 23 Sep 2016.
• (2016) The Mobile App Usability Inspection (MAUi) Framework as a Guide for Minimal Viable Product (MVP) Testing in Lean Development Cycle. Proceedings of the 2nd International Conference in HCI and UX Indonesia 2016, 1--11. DOI: 10.1145/2898459.2898460. Online publication date: 13 Apr 2016.
• (2015) Heuristics for the evaluation of captchas on smartphones. Proceedings of the 2015 British HCI Conference, 126--135. DOI: 10.1145/2783446.2783583. Online publication date: 13 Jul 2015.
• (2015) A Reference to Usability Inspection Methods. International Colloquium of Art and Design Education Research (i-CADER 2014), 407--419. DOI: 10.1007/978-981-287-332-3_43. Online publication date: 2015.
• (2014) Developing an architectural visualization using 3D for photo tourism. 2014 International Conference on Computer, Communications, and Control Technology (I4CT), 429--433. DOI: 10.1109/I4CT.2014.6914220. Online publication date: Sep 2014.
• (2014) Towards Qualitative and Quantitative Data Integration Approach for Enhancing HCI Quality Evaluation. 16th International Conference on Human-Computer Interaction. Theories, Methods, and Tools, Volume 8510, 469--480. DOI: 10.1007/978-3-319-07233-3_43. Online publication date: 22 Jun 2014.
• (2013) A new proposal for improving heuristic evaluation reports performed by novice evaluators. Proceedings of the 2013 Chilean Conference on Human-Computer Interaction, 72--75. DOI: 10.1145/2535597.2535601. Online publication date: 11 Nov 2013.
