DOI: 10.1145/3121283.3121288

How Could an Intranet be Like a Friend to Me?: Why Standardized UX Scales Don't Always Fit

Published: 19 September 2017

Abstract

"I hope that this survey is a joke because it made me laugh so much". This quote is just one example of many negative respondents' reactions gathered during a large-scale user experience (UX) study. Unfortunately, the survey was no joke, rather a well-constructed and validated standardized UX scale. This paper critically reflects on the use and relevance of standardized UX scales for the evaluation of UX in business contexts. We report on a real-world use case where the meCUE questionnaire has been used to assess employees' experience (N=263) with their organization's intranet. Strong users' reactions to the survey's items and statistical analyses both suggest that the scale is unsuitable for the evaluation of business-oriented systems. Drawing on the description of this inadequacy, we discuss the quality of academic UX tools, calling into question the relevance for practice of academic methods.





    Published In

    ECCE '17: Proceedings of the European Conference on Cognitive Ergonomics
    September 2017
    214 pages
    ISBN: 9781450352567
    DOI: 10.1145/3121283

    In-Cooperation

    • EACE: European Association for Cognitive Ergonomics
    • Umeå University

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 19 September 2017


    Author Tags

    1. Standardized UX scales
    2. intranet
    3. meCUE questionnaire
    4. workplace UX assessment

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    ECCE 2017

    Acceptance Rates

    • ECCE '17 Paper Acceptance Rate: 29 of 54 submissions (54%)
    • Overall Acceptance Rate: 56 of 91 submissions (62%)


    Cited By

    • (2025) User Experience of a Semi-Immersive Musical Serious Game to Stimulate Cognitive Functions in Hospitalized Older Patients: Questionnaire Study. JMIR Serious Games, 13, e57030. https://doi.org/10.2196/57030 (6 Jan 2025)
    • (2024) Improving the user experience of standardized questionnaires by incorporating gamified mechanisms. 2024 IEEE VII Congreso Internacional en Inteligencia Ambiental, Ingeniería de Software y Salud Electrónica y Móvil (AmITIC), 1-8. https://doi.org/10.1109/AmITIC62658.2024.10747589 (25 Sep 2024)
    • (2024) Sentence Completion as a User Experience Research Method: Recommendations From an Experimental Study. Interacting with Computers, 36(1), 48-61. https://doi.org/10.1093/iwc/iwae002 (6 Mar 2024)
    • (2024) User Experience (UX) with Mobile Devices: A Comprehensive Model to Demonstrate the Relative Importance of Instrumental, Non-Instrumental, and Emotional Components on User Satisfaction. International Journal of Human–Computer Interaction, 1-12. https://doi.org/10.1080/10447318.2024.2352210 (3 Jun 2024)
    • (2024) Humanized AI in hiring: an empirical study of a virtual AI job interviewer's social skills on applicants' reactions and experience. The International Journal of Human Resource Management, 1-29. https://doi.org/10.1080/09585192.2024.2440784 (18 Dec 2024)
    • (2024) A model-driven approach for prospective ergonomics: application to ikigai robotics. Ergonomics, 1-11. https://doi.org/10.1080/00140139.2024.2418960 (19 Oct 2024)
    • (2024) Usability of the digit-tracking technique in a geriatric population of inpatients with and without neurocognitive disorders: The DIGICOG-start study. Journal of Neural Transmission. https://doi.org/10.1007/s00702-024-02858-z (11 Nov 2024)
    • (2023) Innovative protocol of an exploratory study evaluating the acceptability of a humanoid robot at home of deaf children with cochlear implants. PLOS ONE, 18(6), e0285927. https://doi.org/10.1371/journal.pone.0285927 (16 Jun 2023)
    • (2023) UXR-kit: An Ideation Method for Collaborative and User-Centered Design about eXtended Reality solutions. Adjunct Proceedings of the 34th Conference on l'Interaction Humain-Machine, 1-7. https://doi.org/10.1145/3577590.3589605 (3 Apr 2023)
    • (2023) 3D Printed Interactive Multi-Storey Model for People with Visual Impairments. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-15. https://doi.org/10.1145/3544548.3581304 (19 Apr 2023)
