Abstract
Inspection-based evaluation methods predict usability problems and can therefore be used to evaluate products without involving users. A new method, named SEEM and inspired by Norman’s theory-of-action model [18] and Malone’s concepts of fun [15], is described for predicting usability and fun problems in children’s computer games. This paper describes a study assessing SEEM’s quality. The experts in the study predicted about 76% of the problems found in a user test, which makes SEEM’s validity quite promising. Furthermore, the participating experts were able to apply the inspection questions in an appropriate manner. Based on this first study, ideas for improving the method are presented.
References
Milo and the magical stones (Max en de toverstenen). MediaMix Benelux (2002)
Robbie Rabbit, Group 3: Fun in the Clouds (Robbie Konijn, Groep 3: Pret in de Wolken). Mindscape (2003)
Barendregt, W., Bekker, M.M.: Towards a Framework for Design Guidelines for Young Children’s Computer Games. In: Rauterberg, M. (ed.) ICEC 2004. LNCS, vol. 3166, pp. 365–376. Springer, Heidelberg (2004)
Barendregt, W., Bekker, M.M., Bouwhuis, D., Baauw, E.: Predicting effectiveness of children participants in user testing based on personality characteristics. Submitted to Behaviour & Information Technology (Unpublished manuscript)
Chattratichart, J., Brodie, J.: Applying User Testing Data to UEM Performance Metrics. In: CHI 2004 Extended Abstracts (Late Breaking Results), Vienna, Austria, pp. 1119–1122. ACM Press, New York (2004)
Cockton, G., Lavery, D., Woolrych, A.: Inspection-based evaluations. In: Jacko, J., Sears, A. (eds.) The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications. Lawrence Erlbaum Associates, Mahwah (2003)
Cockton, G., Woolrych, A., Hall, L., Hindmarch, M.: Changing Analysts’ Tunes: The Surprising Impact of a New Instrument for Usability Inspection Method Assessment. In: Palanque, P., Johnson, P., O’Neill, E. (eds.) People and Computers, Designing for Society (Proceedings of HCI 2003), pp. 145–162. Springer, Heidelberg (2003)
Cockton, G., Woolrych, A.: Understanding Inspection Methods: Lessons from an Assessment of Heuristic Evaluation. In: Blandford, A., Vanderdonckt, J., Gray, P.D. (eds.) People and Computers XV – Interaction without Frontiers, pp. 171–192. Springer, Heidelberg (2001)
Desurvire, H., Caplan, M., Toth, J.A.: Using Heuristics to Evaluate the Playability of Games. In: CHI 2004 Extended Abstracts, Vienna, Austria, pp. 1509–1512. ACM Press, New York (2004)
Federoff, M.A.: Heuristics and Usability Guidelines for the Creation and Evaluation of Fun in Video Games. MSc thesis, Department of Telecommunications, Indiana University (2002)
Gray, W.D., Salzman, M.C.: Damaged merchandise? A review of experiments that compare usability evaluation methods. Human-Computer Interaction 13(3), 203–261 (1998)
Hartson, H.R., Andre, T.S., Williges, R.C.: Criteria for evaluating usability evaluation methods. International Journal of Human-Computer Interaction: Special issue on Empirical Evaluation of Information Visualisations 13(4), 373–410 (2001)
Kanis, H., Arisz, H.J.: How many participants: A simple means for concurrent monitoring. In: Proceedings of the IEA 2000/HFES 2000 Congress, pp. 637–640 (2000)
Lavery, D., Cockton, G., Atkinson, M.P.: Comparison of Evaluation Methods Using Structured Usability Problem Reports. Behaviour and Information Technology 16(4), 246–266 (1997)
Malone, T.W.: What makes things fun to learn? A study of intrinsically motivating computer games. Technical Report CIS-7, Xerox PARC, Palo Alto (1980)
von Nes, F.: On the validity of design guidelines and the role of standardisation. In: Nicolle, C., Abascal, J. (eds.) Inclusive Design Guidelines for HCI, pp. 61–70. Taylor & Francis Group, London (2001)
Nielsen, J., Mack, R.L.: Usability Inspection Methods. John Wiley & Sons, Inc., New York (1994)
Norman, D.A.: The design of everyday things. MIT Press, London (1998)
Pagulayan, R.J., Keeker, K., Wixon, D., Romero, R., Fuller, T.: User-centered design in games. In: Jacko, J., Sears, A. (eds.) Handbook for Human-Computer Interaction in Interactive Systems, pp. 883–906. Lawrence Erlbaum Associates, Mahwah (2003)
Sears, A.: Heuristic Walkthroughs: Finding the problems without the noise. International Journal of Human-Computer Interactions 9(3), 213–234 (1997)
Zapf, D., Maier, G.W., Irmer, C.: Error Detection, Task Characteristics, and Some Consequences for Software Design. Applied Psychology: An International Review 43, 499–520 (1994)
© 2005 IFIP International Federation for Information Processing
Baauw, E., Bekker, M.M., Barendregt, W. (2005). A Structured Expert Evaluation Method for the Evaluation of Children’s Computer Games. In: Costabile, M.F., Paternò, F. (eds) Human-Computer Interaction - INTERACT 2005. INTERACT 2005. Lecture Notes in Computer Science, vol 3585. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11555261_38
DOI: https://doi.org/10.1007/11555261_38
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-28943-2
Online ISBN: 978-3-540-31722-7