Week - 10 Evaluation
CLO 5
Course Learning Outcomes (CLO)
CLO1: Explain the concepts underpinning the interaction between users and computer interfaces, including dialog techniques and accessibility guidelines
CLO2: Understand the cognitive principles that support human-centered design
CLO3: Analyze and evaluate existing interactive user interfaces based on usability and principles of good design
CLO4: Apply the steps in interactive design, including requirements definition, task analysis, prototyping, and usability testing
CLO5: Build lo-fi and hi-fi prototypes for a user interface that meet HCI best practices
Kahoot Quiz
WEEK 10
Chapter Objectives
Why: To check users’ requirements and confirm that users can utilize the product and
that they like it
What: A conceptual model, early and subsequent prototypes of a new system, more
complete prototypes, and a prototype to compare with competitors’ products
“Iterative design, with its repeating cycle of design and testing, is the
only validated methodology in existence that will consistently produce
successful results. If you don’t have user-testing as an integral part of
your design process you are going to throw buckets of money down the
drain.” — Bruce Tognazzini
See AskTog.com for topical discussions about design and evaluation
Types of evaluation
Controlled settings that directly involve users (for example, usability and research labs)
Natural settings that directly involve users (for example, field studies and in-the-wild studies)
Any settings that don’t directly involve users (for example, consultants and researchers critique the prototypes and may predict and model how successful they will be when used by users)
Living labs
An early example was the Aware Home that was embedded with a
complex network of sensors and audio/video recording devices (Abowd
et al., 2000)
Living labs (continued)
More recent examples include whole blocks and cities that house
hundreds of people, for example, Verma et al.’s (2017) research in
Switzerland
Many citizen science projects can also be thought of as living labs,
for instance, iNaturalist.org
Players were more engaged when playing against another person than
when playing against a computer
Figure: the number of prewritten experience responses submitted by participants to the pre-established questions that Ethnobot asked them about their experiences
What did we learn from the case studies?
Different evaluation methods were used across the case studies: asking experts, testing with users, and modeling
The language of evaluation
Key terms: analytical evaluation, analytics, biases, controlled experiment, crowdsourcing, ecological validity, expert review or criticism, field study, formative evaluation, heuristic evaluation, informed consent form, in-the-wild evaluation, living laboratory, predictive evaluation, reliability, scope, summative evaluation, usability laboratory, usability testing, user studies, users or participants
Participants’ rights and getting their consent
Reliability: Does the method produce the same results on separate occasions?
Ecological validity: Does the environment of the evaluation distort the results?
Controlled settings
The data is used to calculate performance times and to identify and explain errors
Emphasis on:
Selecting representative users
Developing representative tasks
Informed consent form explains procedures and deals with ethical issues
How many participants are enough for user testing?
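One well-known quantitative answer, brought in here as an assumption rather than taken from the slides, is Nielsen and Landauer’s problem-discovery model: the expected proportion of usability problems found by n test users is 1 - (1 - L)^n, where L is the probability that a single user uncovers a given problem (about 0.31 in Nielsen’s classic data, which is where the “five users find most problems” rule of thumb comes from).

```python
# Sketch of Nielsen and Landauer's problem-discovery model.
# L is the average probability that one participant uncovers a given
# usability problem; 0.31 is the value from Nielsen's classic data
# (an assumption, not a universal constant).
L = 0.31

def proportion_found(n: int, discovery_rate: float = L) -> float:
    """Expected fraction of usability problems found with n participants."""
    return 1 - (1 - discovery_rate) ** n

for n in (1, 3, 5, 10, 15):
    print(f"{n:2d} participants -> {proportion_found(n):.0%} of problems")
```

With L = 0.31, five participants are expected to uncover roughly 84% of the problems, which is why small iterative tests are often preferred over one large test.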
Some of the same data gathering methods are used in evaluation as for
establishing requirements and identifying users’ needs, for example,
observation, interviews, and questionnaires
Usability testing and experiments enable the evaluator to have a high level of control over what
gets tested, whereas evaluators typically impose little or no control on participants in field studies