
chapter 7

design rules

Designing for maximum usability


– the goal of interaction design

• Principles of usability
– general understanding

• Standards and guidelines


– direction for design

• Design patterns
– capture and reuse design knowledge
types of design rules

• principles
– abstract design rules
– low authority
– high generality

• standards
– specific design rules
– high authority
– limited application

• guidelines
– lower authority
– more general application

(design rules range along two axes: increasing generality and increasing authority)
Principles to support usability

Learnability
the ease with which new users can begin effective
interaction and achieve maximal performance

Flexibility
the multiplicity of ways the user and system exchange
information

Robustness
the level of support provided to the user in determining
successful achievement and assessment of
goal-directed behaviour
Principles of learnability

Predictability
– determining effect of future actions based on
past interaction history
– operation visibility

Synthesizability
– assessing the effect of past actions
– immediate vs. eventual honesty
Principles of learnability (ctd)

Familiarity
– how prior knowledge applies to new system
– guessability; affordance

Generalizability
– extending specific interaction knowledge to new
situations

Consistency
– likeness in input/output behaviour arising from similar
situations or task objectives
Principles of flexibility

Dialogue initiative
– freedom from system imposed constraints on input
dialogue
– system vs. user pre-emptiveness

Multithreading
– ability of system to support user interaction for more
than one task at a time
– concurrent vs. interleaving; multimodality

Task migratability
– passing responsibility for task execution between user
and system
Principles of flexibility (ctd)

Substitutivity
– allowing equivalent values of input and
output to be substituted for each other
– representation multiplicity; equal opportunity

Customizability
– modifiability of the user interface by user
(adaptability) or system (adaptivity)
Principles of robustness

Observability
– ability of user to evaluate the internal state of the
system from its perceivable representation
– browsability; defaults; reachability; persistence;
operation visibility

Recoverability
– ability of user to take corrective action once an error
has been recognized
– reachability; forward/backward recovery;
commensurate effort
Principles of robustness (ctd)

Responsiveness
– how the user perceives the rate of
communication with the system
– Stability

Task conformance
– degree to which system services support all
of the user's tasks
– task completeness; task adequacy
Using design rules

Design rules
• suggest how to increase usability
• differ in generality and authority
Standards

• set by national or international bodies to ensure
compliance by a large community of designers

• standards require sound underlying theory and
slowly changing technology

• hardware standards more common than software:
high authority and low level of detail

• ISO 9241 defines usability as effectiveness,
efficiency and satisfaction with which users
accomplish tasks
Guidelines

• more suggestive and general


• many textbooks and reports full of guidelines
• abstract guidelines (principles) applicable
during early life cycle activities
• detailed guidelines (style guides) applicable
during later life cycle activities
• understanding justification for guidelines aids
in resolving conflicts
Golden rules and heuristics

• “Broad brush” design rules


• Useful check list for good design
• Better design using these than using nothing!
• Different collections e.g.
– Nielsen’s 10 Heuristics (see Chapter 9)
– Shneiderman’s 8 Golden Rules
– Norman’s 7 Principles
Shneiderman’s 8 Golden Rules

1. Strive for consistency


2. Enable frequent users to use shortcuts
3. Offer informative feedback
4. Design dialogs to yield closure
5. Offer error prevention and simple error
handling
6. Permit easy reversal of actions
7. Support internal locus of control
8. Reduce short-term memory load
Norman’s 7 Principles

1. Use both knowledge in the world and


knowledge in the head.
2. Simplify the structure of tasks.
3. Make things visible: bridge the gulfs of
Execution and Evaluation.
4. Get the mappings right.
5. Exploit the power of constraints, both natural
and artificial.
6. Design for error.
7. When all else fails, standardize.
HCI design patterns

• An approach to reusing knowledge about


successful design solutions
• Originated in architecture: Alexander
• A pattern is an invariant solution to a
recurrent problem within a specific context.
• Examples
– Light on Two Sides of Every Room (architecture)
– Go back to a safe place (HCI)
• Patterns do not exist in isolation but are linked
to other patterns in languages which enable
complete designs to be generated
HCI design patterns (cont.)
• Characteristics of patterns
– capture design practice not theory
– capture the essential common properties of good examples
of design
– represent design knowledge at varying levels: social,
organisational, conceptual, detailed
– embody values and can express what is humane in
interface design
– are intuitive and readable and can therefore be used for
communication between all stakeholders
– a pattern language should be generative and assist in the
development of complete designs.
Summary

Principles for usability


– repeatable design for usability relies on maximizing
benefit of one good design by abstracting out the
general properties which can direct purposeful design
– The success of designing for usability requires both
creative insight (new paradigms) and purposeful
principled practice

Using design rules


– standards and guidelines to direct design activity
chapter 9

evaluation techniques
Evaluation Techniques

• Evaluation
– tests usability and functionality of system
– occurs in laboratory, field and/or in collaboration
with users
– evaluates both design and implementation
– should be considered at all stages in the design life
cycle
Goals of Evaluation

• assess extent of system functionality

• assess effect of interface on user

• identify specific problems


Evaluating Designs

Cognitive Walkthrough
Heuristic Evaluation
Review-based evaluation
Cognitive Walkthrough

Proposed by Polson et al.


– evaluates design on how well it supports user
in learning task
– usually performed by expert in cognitive
psychology
– expert ‘walks through’ design to identify
potential problems using psychological
principles
– forms used to guide analysis
Cognitive Walkthrough (ctd)

• For each task walkthrough considers


– what impact will interaction have on user?
– what cognitive processes are required?
– what learning problems may occur?

• Analysis focuses on goals and


knowledge: does the design lead the
user to generate the correct goals?
Heuristic Evaluation

• Proposed by Nielsen and Molich.

• usability criteria (heuristics) are identified


• design examined by experts to see if these are
violated

• Example heuristics
– system behaviour is predictable
– system behaviour is consistent
– feedback is provided

• Heuristic evaluation ‘debugs’ design.


Review-based evaluation

• Results from the literature used to support or
refute parts of design.

• Care needed to ensure results are transferable
to new design.

Model-based evaluation

• Cognitive models used to filter design options
e.g. GOMS prediction of user performance.

• Design rationale can also provide useful
evaluation information.
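The GOMS prediction mentioned above can be made concrete with a Keystroke-Level Model (KLM) estimate. The sketch below uses the classic average operator times from Card, Moran & Newell; the task sequence is invented for illustration, and a real analysis would calibrate times for the actual users and devices.

```python
# Keystroke-Level Model sketch: predict execution time for a routine
# task as a sum of primitive operator times (classic averages).

KLM_TIMES = {
    "K": 0.20,  # press a key or button (skilled typist average)
    "P": 1.10,  # point with a mouse to a target
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_time(operators: str) -> float:
    """Predicted execution time (seconds) for a sequence of KLM operators."""
    return sum(KLM_TIMES[op] for op in operators)

# Hypothetical task: home to mouse, think, point at a menu, click,
# then type a three-letter command.
sequence = "HMPK" + "KKK"
print(f"predicted time: {predict_time(sequence):.2f} s")  # 3.65 s
```

Comparing such predicted times across candidate designs is how a cognitive model can filter design options before anything is built.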
Evaluating through user
Participation
Laboratory studies

• Advantages:
– specialist equipment available
– uninterrupted environment

• Disadvantages:
– lack of context
– difficult to observe several users cooperating

• Appropriate
– if system location is dangerous or impractical
– for constrained single-user systems
– to allow controlled manipulation of use
Field Studies

• Advantages:
– natural environment
– context retained (though observation may alter it)
– longitudinal studies possible

• Disadvantages:
– distractions
– noise

• Appropriate
– where context is crucial
– for longitudinal studies
Evaluating Implementations

Requires an artefact:
simulation, prototype,
full implementation
Experimental evaluation

• controlled evaluation of specific aspects of


interactive behaviour
• evaluator chooses hypothesis to be tested
• a number of experimental conditions are
considered which differ only in the value of
some controlled variable.
• changes in behavioural measure are attributed
to different conditions
Experimental factors

• Subjects
– who – representative, sufficient sample
• Variables
– things to modify and measure
• Hypothesis
– what you’d like to show
• Experimental design
– how you are going to do it
Variables

• independent variable (IV)


characteristic changed to produce different
conditions
e.g. interface style, number of menu items

• dependent variable (DV)


characteristics measured in the experiment
e.g. time taken, number of errors.
Hypothesis

• prediction of outcome
– framed in terms of IV and DV

e.g. “error rate will increase as font size decreases”

• null hypothesis:
– states no difference between conditions
– aim is to disprove this

e.g. null hyp. = “no change with font size”


Experimental design

• within groups design


– each subject performs experiment under each
condition.
– transfer of learning possible
– less costly and less likely to suffer from user
variation.
• between groups design
– each subject performs under only one condition
– no transfer of learning
– more users required
– variation can bias results.
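The transfer-of-learning problem in within-groups designs is commonly mitigated by counterbalancing the order of conditions. A minimal sketch using a simple Latin square (the condition names are invented for illustration):

```python
# Counterbalancing sketch: rotate condition order across subjects so
# that each condition appears once in every position (Latin square).

def latin_square(conditions):
    """One row per subject group; each condition once per position."""
    n = len(conditions)
    return [[conditions[(row + col) % n] for col in range(n)]
            for row in range(n)]

for order in latin_square(["menu", "toolbar", "shortcut"]):
    print(order)
```

With three conditions, subjects are assigned in rotation to the three orders, so any learning effect is spread evenly across conditions rather than biasing one of them.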
Analysis of data

• Before you start to do any statistics:


– look at data
– save original data

• Choice of statistical technique depends on


– type of data
– information required

• Type of data
– discrete - finite number of values
– continuous - any value
Analysis - types of test

• parametric
– assume normal distribution
– robust
– powerful

• non-parametric
– do not assume normal distribution
– less powerful
– more reliable

• contingency table
– classify data by discrete attributes
– count number of data items in each group
Analysis of data (cont.)

• What information is required?


– is there a difference?
– how big is the difference?
– how accurate is the estimate?

• Parametric and non-parametric tests


mainly address first of these
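The parametric/non-parametric choice can be sketched directly: below, the two test statistics are computed by hand for a two-condition experiment (the task-time data are invented; in practice a statistics package supplies the p-values from these statistics).

```python
# Sketch: parametric t statistic vs. non-parametric Mann-Whitney U
# for two conditions (e.g. task times under two interface styles).
from statistics import mean, stdev
from math import sqrt

a = [12.1, 10.4, 11.8, 13.0, 12.5, 11.1]   # condition A task times (s)
b = [14.2, 13.8, 15.1, 12.9, 14.7, 15.5]   # condition B task times (s)

def t_statistic(x, y):
    """Independent-samples t (equal-variance form): parametric,
    assumes roughly normal data."""
    nx, ny = len(x), len(y)
    sp2 = ((nx - 1) * stdev(x) ** 2 + (ny - 1) * stdev(y) ** 2) / (nx + ny - 2)
    return (mean(x) - mean(y)) / sqrt(sp2 * (1 / nx + 1 / ny))

def mann_whitney_u(x, y):
    """U statistic: non-parametric, no normality assumption."""
    u = sum(1 for xi in x for yj in y if xi > yj)
    u += 0.5 * sum(1 for xi in x for yj in y if xi == yj)
    return u

print(f"t = {t_statistic(a, b):.2f}")     # compare against t tables, df = 10
print(f"U = {mann_whitney_u(a, b):.1f}")  # compare against U tables
```

Both tests address the "is there a difference?" question; estimating how big the difference is (and how accurately) needs confidence intervals as well.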
Experimental studies on groups

More difficult than single-user experiments

Problems with:
– subject groups
– choice of task
– data gathering
– analysis
Subject groups

larger number of subjects


⇒ more expensive
longer time to ‘settle down’
… even more variation!
difficult to timetable
so … often only three or four groups
The task

must encourage cooperation


perhaps involve multiple channels
options:
– creative task e.g. ‘write a short report on …’
– decision games e.g. desert survival task
– control task e.g. ARKola bottling plant
Data gathering

several video cameras


+ direct logging of application
problems:
– synchronisation
– sheer volume!

one solution:
– record from each perspective
Analysis

N.B. vast variation between groups

solutions:
– within groups experiments
– micro-analysis (e.g., gaps in speech)
– anecdotal and qualitative analysis

look at interactions between group and media

controlled experiments may ‘waste’ resources!


Field studies

Experiments dominated by group formation

Field studies more realistic:


distributed cognition ⇒ work studied in context
real action is situated action
physical and social environment both crucial

Contrast:
psychology – controlled experiment
sociology and anthropology – open study and rich data
Observational Methods

Think Aloud
Cooperative evaluation
Protocol analysis
Automated analysis
Post-task walkthroughs
Think Aloud

• user observed performing task


• user asked to describe what they are doing and
why, what they think is happening etc.

• Advantages
– simplicity - requires little expertise
– can provide useful insight
– can show how system is actually used
• Disadvantages
– subjective
– selective
– act of describing may alter task performance
Cooperative evaluation

• variation on think aloud


• user collaborates in evaluation
• both user and evaluator can ask each other
questions throughout

• Additional advantages
– less constrained and easier to use
– user is encouraged to criticize system
– clarification possible
Protocol analysis
• paper and pencil – cheap, limited to writing speed
• audio – good for think aloud, difficult to match with other
protocols
• video – accurate and realistic, needs special equipment,
obtrusive
• computer logging – automatic and unobtrusive, large
amounts of data difficult to analyze
• user notebooks – coarse and subjective, useful insights,
good for longitudinal studies

• Mixed use in practice.


• audio/video transcription difficult and requires skill.
• Some automatic support tools available
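Computer logging, as listed above, can be as simple as timestamped, append-only records; the timestamps are what make later synchronisation with audio/video protocols possible. A minimal sketch (the log file name and event vocabulary are invented for illustration):

```python
# Sketch: minimal computer logging for protocol analysis.  Each event
# is one timestamped JSON line, so the log can be replayed and aligned
# with video recordings afterwards.
import json
import time

LOG_PATH = "session.log"  # hypothetical log file

def log_event(action: str, **details) -> dict:
    """Append one timestamped user-interface event to the log."""
    event = {"t": time.time(), "action": action, **details}
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(event) + "\n")
    return event

# Example events from a hypothetical editing session:
log_event("menu_open", menu="File")
log_event("menu_select", menu="File", item="Save")
log_event("error_dialog", message="disc full")
```

Automatic and unobtrusive, but note the slide's caveat: even a short session produces a large volume of such records, and analysis is the hard part.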
automated analysis – EVA

• Workplace project
• Post-task walkthrough
– user reacts to actions after the event
– used to fill in intention
• Advantages
– analyst has time to focus on relevant incidents
– avoid excessive interruption of task
• Disadvantages
– lack of freshness
– may be post-hoc interpretation of events
post-task walkthroughs

• transcript played back to participant for


comment
– immediately → fresh in mind
– delayed → evaluator has time to identify
questions
• useful to identify reasons for actions
and alternatives considered
• necessary in cases where think aloud is
not possible
Query Techniques

Interviews
Questionnaires
Interviews

• analyst questions user on one-to-one basis


usually based on prepared questions
• informal, subjective and relatively cheap

• Advantages
– can be varied to suit context
– issues can be explored more fully
– can elicit user views and identify unanticipated
problems
• Disadvantages
– very subjective
– time consuming
Questionnaires

• Set of fixed questions given to users

• Advantages
– quick and reaches large user group
– can be analyzed more rigorously
• Disadvantages
– less flexible
– less probing
Questionnaires (ctd)

• Need careful design


– what information is required?
– how are answers to be analyzed?

• Styles of question
– general
– open-ended
– scalar
– multi-choice
– ranked
Physiological methods

Eye tracking
Physiological measurement
eye tracking

• head or desk mounted equipment tracks the


position of the eye
• eye movement reflects the amount of
cognitive processing a display requires
• measurements include
– fixations: eye maintains stable position. Number and
duration indicate level of difficulty with display
– saccades: rapid eye movement from one point of
interest to another
– scan paths: moving straight to a target with a short
fixation at the target is optimal
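The fixation/saccade distinction above is typically operationalised with a dispersion-based algorithm (the I-DT idea): a run of gaze samples that stays within a small bounding box is a fixation; samples that jump outside it belong to saccades. The sketch below is illustrative only; thresholds and the gaze samples are invented, and real eye trackers ship their own analysis tools.

```python
# Sketch: dispersion-based fixation detection over (x, y) gaze samples.

def detect_fixations(samples, max_dispersion=30.0, min_samples=4):
    """Return (start_index, end_index) pairs of detected fixations.

    A fixation is a run of at least `min_samples` samples whose
    x-spread plus y-spread stays within `max_dispersion`.
    """
    def dispersion(window):
        xs, ys = zip(*window)
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    fixations = []
    start = 0
    while start + min_samples <= len(samples):
        end = start + min_samples
        if dispersion(samples[start:end]) <= max_dispersion:
            # grow the window while it stays compact
            while end < len(samples) and dispersion(samples[start:end + 1]) <= max_dispersion:
                end += 1
            fixations.append((start, end - 1))
            start = end
        else:
            start += 1   # saccade sample: slide the window on
    return fixations

# A fixation, a saccade jump, then a second fixation:
gaze = [(100, 100), (102, 101), (101, 99), (103, 100),
        (400, 300),
        (402, 298), (401, 301), (399, 300), (400, 299)]
print(detect_fixations(gaze))  # [(0, 3), (4, 8)]
```

The number and duration of the detected fixations are then the measures that indicate the level of difficulty with a display.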
physiological measurements
• emotional response linked to physical changes
• these may help determine a user’s reaction to
an interface
• measurements include:
– heart activity, including blood pressure, volume and pulse.
– activity of sweat glands: Galvanic Skin Response (GSR)
– electrical activity in muscle: electromyogram (EMG)
– electrical activity in brain: electroencephalogram (EEG)
• some difficulty in interpreting these
physiological responses - more research
needed
Choosing an Evaluation Method

when in process: design vs. implementation
style of evaluation: laboratory vs. field
how objective: subjective vs. objective
type of measures: qualitative vs. quantitative
level of information: high level vs. low level
level of interference: obtrusive vs. unobtrusive
resources available: time, subjects, equipment, expertise
DESIGN RULES
Design rules
◻ What is the goal of interaction design?
Designing for the maximum usability
◻ Types of design rules
Standards/Guidelines/Principles
◻ Standards and guidelines
Direction for design
Types of design rules
◻ We can classify these rules along two dimensions based on the
rule’s authority and generality
◻ By authority, we mean
Rule must be followed in the design or
It is only suggested
◻ By generality, we mean
Rule can be applied to many situations/applications or
It is focused on limited situations/applications
Types of design rules (contd.)
◻ Standards
Specific design rules
High authority
Low generality (limited application)
◻ Guidelines
Lower authority
High generality (more general application)
◻ Principles
Abstract design rules
Lower authority
High generality
Using design rules
◻ Suggest how to increase usability
◻ Differ in generality and authority
Standards
◻ Higher level of authority and useful for specific design
◻ Set by national or international bodies to ensure compliance
by a large community of designers
◻ Standards require sound underlying theory and slowly
changing technology
◻ Hardware standards more common than software: high
authority and low level of detail
Guidelines
◻ More technology-oriented than principles, but still general
◻ Abstract guidelines applicable during early life cycle activities
◻ Detailed guidelines (style guides) applicable during later life
cycle activities
◻ ISO 9241 defines usability as effectiveness, efficiency and
satisfaction with which users accomplish tasks
◻ Understanding justification for guidelines aids in resolving
conflicts
Principles
◻ Principles are derived from knowledge of the
psychological, computational and sociological
aspects of the problem domain
◻ Independent of the technology
◻ Therefore can be applied widely, but not so
useful for specific design
Golden rules and heuristics
◻ “Broad brush” design rules
◻ Useful check list for good design
◻ Different collections of design rules, such as
Shneiderman’s 8 Golden Rules
Norman’s 7 Principles
Nielsen’s 10 Heuristics
SHNEIDERMAN’S 8 GOLDEN
RULES
Shneiderman’s 8 Golden Rules
1. Strive for consistency
2. Cater to Universal Usability (Enable frequent users to use
shortcuts)
3. Offer informative feedback
4. Design dialogs to yield closure
5. Offer error prevention and simple error handling
6. Permit easy reversal of actions
7. Support internal locus of control
8. Reduce short-term memory load
4. Graphical Screen Design

CRAP – contrast, repetition, alignment, proximity

Grids are an essential tool for graphical design

Other visual design concepts:
– consistency
– relationships
– organization
– legibility and readability
– navigational cues
– appropriate imagery
– familiar idioms

Major sources: Designing Visual Interfaces, Mullet & Sano, Prentice Hall / Robin Williams Non-Designers Design Book, Peachpit Press

Slide deck by Saul Greenberg. Permission is granted to use this for non-commercial purposes as long as general credit to Saul Greenberg is clearly maintained.
Warning: some material in this deck is used from other sources without permission. Credit to the original source is given if it is known.

CRAP
Contrast
– make different things different
– brings out dominant elements
– mutes lesser elements
– creates dynamism

Repetition
– repeat design throughout the interface
– consistency
– creates unity

Alignment
– visually connects elements
– creates a visual flow

Proximity
– groups related elements
– separates unrelated ones

[Figure sequence: a sample layout redesigned step by step – the original, then versions improved through proximity, alignment, contrast and repetition.]

Grids

Horizontal and vertical lines to locate window components
– aligns related components

Organization
– contrast for dominant elements
– element groupings by proximity
– organizational structure
– alignment

Consistency
– location
– format
– element repetition
– organization

[Figure: an annotated dialog grid – standard icon set, message text in Arial 14 left-adjusted, fixed widget-to-widget and window-to-widget spacing, variable-format contents.]

[Figure: example message boxes built on the grid – a confirmation dialog (“Do you really want to delete the file ‘myfile.doc’ from the folder ‘junk’?”) and an error dialog (“Cannot move the file ‘myfile.doc’ to the folder ‘junk’ because the disc is full.”), both using the standard icon set and consistent button placement.]

[Figure annotations: a two-level hierarchy shown through indentation and contrast; logic of organizational flow; alignment connects visual elements in a sequence; grouping by white space.]
Visual consistency (repetition)

internal consistency
– elements follow same conventions and rules
– set of application-specific grids enforce this

external consistency
– follow platform and interface style conventions
– use platform and widget-specific grids

deviate only when it provides a clear benefit to user

[Figure: a “Warning” dialog and a “Tip of the day” dialog – the externally consistent versions follow platform conventions for icons and button placement.]

Relating screen elements

proximal clusters
alignment
white (negative) space
explicit structure

[Figure: schematic forms showing related fields grouped through proximity, alignment and white space.]
Terrible alignment
– no flow

Poor contrast
– cannot distinguish colored labels from editable fields

Poor repetition
– buttons do not look like buttons

Poor explicit structure
– blocks compete with alignment

[Screenshot: WebForms]

No regard for order and organization

[Screenshot: IBM's Aptiva Communication Center]
[Figures (Mullet & Sano): a haphazard layout, and the same layout repaired.]
[Figures (Mullet & Sano): spatial tension; using explicit structure as a crutch.]
Overuse of 3-D effects makes the window unnecessarily cluttered
[Screenshot: WebForms]

How do you choose when you cannot discriminate screen elements from each other?
[Screenshots: GIF Construction Set; Microsoft Access 2.0]
Navigational cues

provide initial focus

direct attention to important, secondary, or peripheral items as appropriate

order should follow a user’s conceptual model of sequences

[Figures (Mullet & Sano): redesigning a layout using alignment and factoring; the importance of negative space and alignment.]

Economy of visual elements

minimize number of controls

include only those that are necessary


– eliminate, or relegate others to secondary windows

minimize clutter
– so information is not hidden
[Figure: schematic dialogs – relegating rarely used fields to a secondary window reduces clutter in the main one.]
[Figures (Mullet & Sano): repairing excessive display density.]

Tabs
– excellent means for factoring related items
– but can be overdone

Legibility and readability

Characters, symbols, graphical elements should be easily
noticeable and distinguishable

[Figure: sample text set in Helvetica, Times Roman and Courier (legible) versus text set in all capitals and in Braggadocio (hard to read).]

Legibility and readability

Proper use of typography


– 1-2 typefaces (3 max)
– normal, italics, bold
– 1-3 sizes max

[Figure: readable versus unreadable combinations of typeface, style and size.]
Legibility and readability

typesetting
– point size
– word and line spacing
– line length
– Indentation
– color

[Figure: the same text typeset readably and unreadably.]

[Screenshot: Popkin Software’s System Architect]
These choices must be really important – or are they?
[Screenshot: Time & Chaos]

Greyed-out example text is hard to read. Why not make it black?
[Screenshot: regional preferences in Windows 95]

Text orientation difficult to read
[Screenshot: Microsoft Word]

Imagery

Signs, icons, symbols
– right choice within spectrum from concrete to abstract

Icon design very hard
– except for most familiar, always label them

Image position and type should be related
– image “family”

Consistent and relevant image use
– identifies situations, offerings...

[Figure: partial icon family]
[Figures (Mullet & Sano): choosing levels of abstraction; refined versus excessive literal metaphors.]
What do these images mean?
• no tooltips included
• one of the tabs is a glossary explaining these images! which one?

[Screenshot: Novell GroupWise 5.1]

Idioms

Familiar ways of using GUI components
– appropriate for casual to expert users
– builds upon computer literacy
– must be applied carefully in walk-up-and-use systems

[Figure: common idioms in Microsoft PowerPoint – file handling, window manipulation, standard typographic controls, toolbars and tooltips, WYSIWYG display, pulldown and cascading menus, dialog box items.]
How to choose between widgets

What components must be in the display?


– necessary visual affordances
– frequent actions
• direct manipulation for core activities
• buttons/forms/toolbar/special tools for frequent/immediate actions
• menus/property window for less frequent actions
• secondary windows for rare actions

How are components related?


– organize related items as “chunks”

What are familiar and expected idioms?


– cross application look and feel


Displaying core functionality

[Figure: Apple MacPaint & MacWrite, from Mullet & Sano]
Widgets and complexity

how can window navigation be reduced?
– avoid long paths
– avoid deep hierarchies
Exercise

Graphical redesign

Create a grid emphasising:


– visual consistency
– relationships between
screen elements
– navigational cues
– economy
– legibility and readability
– imagery

Constructing a grid
1. Maintain consistency with GUI style
• locate standard components - title bar, window controls, …

2. Decide navigational layout + white space + legibility + typography


• annotated grid shows location of generic components
• these generic components may have their own grids.

Using the grid


3. Determine relationships, navigational structure
• map navigational structure onto the grid

4. Economize
• collapse two windows into one
• trim sound dialog

Using the grid
5. Evaluate by displaying actual examples

6. Economize further
• decide which we prefer


What you now know

CRAP

Grids are an essential tool for graphical design

Other visual concepts include


– visual consistency
• repetition

– visual organization
• contrast, alignment and navigational cues

– visual relationships
• proximity and white space

– familiar idioms

– legibility and readability


• typography

– appropriate imagery

Interface Design and Usability Engineering

[Diagram: the usability engineering process across four stages.
Goals: articulate who users are and their key tasks → brainstorm designs → refined designs → completed designs.
Methods: task-centered system design, participatory design, user-centered design; psychology of everyday things; evaluate tasks; user involvement; representation & metaphors; task scenario walkthrough; graphical screen design; interface guidelines; style guides; usability testing; heuristic evaluation; field testing. Low-fidelity prototyping methods early, high-fidelity prototyping methods later.
Products: user and task descriptions → throw-away paper prototypes → testable prototypes → alpha/beta systems or complete specification.]
chapter 6

HCI in the software process
HCI in the software process

• Software engineering and the design process


for interactive systems
• Usability engineering

• Iterative design and prototyping


• Design rationale
the software lifecycle

• Software engineering is the discipline for


understanding the software design process, or
life cycle

• Designing for usability occurs at all stages of


the life cycle, not as a single isolated activity
The waterfall model
Requirements
specification

Architectural
design

Detailed
design

Coding and
unit testing

Integration
and testing

Operation and
maintenance
Activities in the life cycle

Requirements specification
designer and customer try to capture what the system is
expected to provide; can be expressed in natural language or in
more precise languages, such as a task analysis would provide

Architectural design
high-level description of how the system will provide the
services required; factors the system into major components
and describes how they are interrelated; needs to satisfy both
functional and nonfunctional requirements

Detailed design
refinement of architectural components and interrelations to
identify modules to be implemented separately; the refinement
is governed by the nonfunctional requirements
Verification and validation

Verification
designing the product right

Validation
designing the right product

The formality gap
– between real-world requirements and constraints and the formal design
– validation will always rely to some extent on subjective means of proof

Management and contractual issues
design in commercial and legal contexts
The life cycle for interactive systems

cannot assume a linear sequence of activities as in the
waterfall model – lots of feedback!

[Diagram: the waterfall stages (requirements specification, architectural design, detailed design, coding and unit testing, integration and testing, operation and maintenance) with feedback arrows between all stages.]
Usability engineering
The ultimate test of usability is based on measurement of user
experience
Usability engineering demands that specific usability measures be
made explicit as requirements

Usability specification
– usability attribute/principle
– measuring concept
– measuring method
– now level / worst case / planned level / best case

Problems
– usability specification requires a level of detail that may not be
possible early in design
– satisfying a usability specification does not necessarily satisfy
usability
part of a usability
specification for a VCR

Attribute: Backward recoverability


Measuring concept: Undo an erroneous programming
sequence
Measuring method: Number of explicit user actions
to undo current program
Now level: No current product allows such an undo
Worst case: As many actions as it takes to
program-in mistake
Planned level: A maximum of two explicit user actions
Best case: One explicit cancel action
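The "one explicit cancel action" best case above amounts to backward recoverability via an undo history. A minimal sketch (the programme-editor model and its operations are invented for illustration, not taken from any real VCR):

```python
# Sketch: backward recoverability with a history stack, so an
# erroneous programming step can be undone with one explicit action.

class ProgrammeEditor:
    """A timer-programming session supporting one-step undo."""

    def __init__(self):
        self.settings = {}
        self._history = []          # (key, previous value) pairs

    def set(self, key, value):
        self._history.append((key, self.settings.get(key)))
        self.settings[key] = value

    def undo(self):
        """One explicit user action reverses the latest change."""
        if not self._history:
            return
        key, previous = self._history.pop()
        if previous is None:
            del self.settings[key]
        else:
            self.settings[key] = previous

vcr = ProgrammeEditor()
vcr.set("channel", 4)
vcr.set("start", "19:30")
vcr.set("start", "21:30")   # oops - wrong start time
vcr.undo()                  # single action restores "19:30"
print(vcr.settings)
```

Measured against the specification, this design meets the best case: undoing the erroneous step takes exactly one explicit user action.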
ISO usability standard 9241

adopts traditional usability categories:


• effectiveness
– can you achieve what you want to?
• efficiency
– can you do it without wasting effort?
• satisfaction
– do you enjoy the process?
some metrics from ISO 9241

Usability objective           | Effectiveness measures                      | Efficiency measures                              | Satisfaction measures
Suitability for the task      | Percentage of goals achieved                | Time to complete a task                          | Rating scale for satisfaction
Appropriate for trained users | Number of power features used               | Relative efficiency compared with an expert user | Rating scale for satisfaction with power features
Learnability                  | Percentage of functions learned             | Time to learn criterion                          | Rating scale for ease of learning
Error tolerance               | Percentage of errors corrected successfully | Time spent on correcting errors                  | Rating scale for error handling
Iterative design and
prototyping
• Iterative design overcomes inherent problems of incomplete
requirements

• Prototypes
– simulate or animate some features of intended system
– different types of prototypes
• throw-away
• incremental
• evolutionary

• Management issues
– time
– planning
– non-functional features
– contracts
Techniques for prototyping

Storyboards
– need not be computer-based
– can be animated

Limited functionality simulations
– some part of system functionality provided by designers
– tools like HyperCard are common for these
– Wizard of Oz technique

Warning about iterative design

– design inertia – early bad decisions stay bad
– diagnosing real usability problems in prototypes…
… and not just the symptoms
Design rationale

Design rationale is information that explains why
a computer system is the way it is.

Benefits of design rationale
– communication throughout life cycle
– reuse of design knowledge across products
– enforces design discipline
– presents arguments for design trade-offs
– organizes potentially large design space
– capturing contextual information
Design rationale (cont’d)

Types of DR:
• Process-oriented
– preserves order of deliberation and decision-making
• Structure-oriented
– emphasizes post hoc structuring of considered design
alternatives

• Two examples:
– Issue-based information system (IBIS)
– Design space analysis
Issue-based information
system (IBIS)
• basis for much of design rationale research
• process-oriented
• main elements:
issues
– hierarchical structure with one ‘root’ issue
positions
– potential resolutions of an issue
arguments
– modify the relationship between positions and issues
• gIBIS is a graphical version
structure of gIBIS

(figure: the gIBIS graph – Positions respond to an Issue; Arguments
support or object to Positions; Sub-issues specialize, generalize, or
question the root Issue)
Design space analysis

• structure-oriented
• QOC – hierarchical structure:
questions (and sub-questions)
– represent major issues of a design
options
– provide alternative solutions to the question
criteria
– the means to assess the options in order to make a choice

• DRL – similar to QOC with a larger language
and more formal semantics
the QOC notation

(figure: a Question is linked to alternative Options; each Option is
assessed positively or negatively against Criteria and may raise
consequent Questions)
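The QOC structure can be rendered as a small data model to make the trade-off explicit; the classes and the simple scoring rule below are our illustration, not part of the notation itself:

```python
from dataclasses import dataclass, field

@dataclass
class Option:
    name: str
    # criterion name -> True (assessment supports) or False (detracts)
    assessments: dict = field(default_factory=dict)

@dataclass
class Question:
    text: str
    options: list = field(default_factory=list)
    criteria: list = field(default_factory=list)

def best_option(question: Question) -> Option:
    """Choose the option with the most positive criterion assessments."""
    return max(question.options,
               key=lambda o: sum(1 for v in o.assessments.values() if v))

q = Question(
    "How should commands be invoked?",
    options=[
        Option("pull-down menu", {"learnability": True, "speed for experts": False}),
        Option("keyboard shortcuts", {"learnability": False, "speed for experts": True}),
        Option("menu plus shortcuts", {"learnability": True, "speed for experts": True}),
    ],
    criteria=["learnability", "speed for experts"],
)
assert best_option(q).name == "menu plus shortcuts"
```

In real design space analysis the choice is argued rather than computed, but recording assessments per criterion is exactly what organizes the design space.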
Psychological design rationale

• to support task-artefact cycle in which user tasks are
affected by the systems they use
• aims to make explicit consequences of design for users
• designers identify tasks system will support
• scenarios are suggested to test task
• users are observed on system
• psychological claims of system made explicit
• negative aspects of design can be used to improve next
iteration of design
Summary

The software engineering life cycle
– distinct activities and the consequences for
interactive system design
Usability engineering
– making usability measurements explicit as
requirements
Iterative design and prototyping
– limited functionality simulations and animations
Design rationale
– recording design knowledge
– process vs. structure
Implicit Human Computer Interaction Through Context
Albrecht Schmidt
Telecooperation Office (TecO), University of Karlsruhe
Vincenz-Prießnitz-Str. 1, 76131 Karlsruhe
Germany
albrecht@teco.edu
www.teco.edu

Abstract
In this paper the term implicit human computer interaction is defined. It is discussed how the availability of processing power and advanced sensing technology can enable a shift in HCI from explicit interaction, such as direct manipulation GUIs, towards a more implicit interaction based on situational context. In the paper an algorithm that is based on a number of questions to identify applications that can facilitate implicit interaction is given. An XML-based language to describe implicit HCI is proposed. The language uses contextual variables that can be grouped using different types of semantics as well as actions that are called by triggers. The term perception is discussed and four basic approaches are identified that are useful when building context-aware applications. Providing two examples, a wearable context awareness component and a sensor board, it is shown how sensor-based perception can be implemented. It is also discussed how situational context can be exploited to improve input and output of mobile devices.

Keywords: context awareness, context sensing, implicit human computer interaction, perception, ubiquitous computing.

1 Introduction
The way people interact with devices is vital for their success. Looking at HCI it is apparent that interaction techniques are limited by the technology available. Furthermore the anticipated user groups influence the interaction metaphors to a large extent. Considering the shift from punch cards to interactive text terminals and also the shift from command line interfaces to graphical user interfaces (GUI) this was observable.

Bearing in mind current and upcoming technologies, such as increased processing power (even on mobile devices), availability of sensors (ranging from simple temperature sensors to cameras), and the resulting perceptional capabilities, as well as the fact that the main user group of current computing devices (e.g. mobile phones, PDAs, etc.) are non-experts, we may observe yet another shift in HCI. Devices that have perceptional capabilities (even if they are very limited) will start the shift from explicit HCI towards a more implicit interaction with machines.

A vision of future devices
We will be able to create (mobile) devices that can see, hear and feel. Based on their perception, these devices will be able to act and react according to the situational context in which they are used.

In this paper it will be shown that this vision is not as far ahead as it seems. In our research we start with the perception of simple concepts and with their exploitation. Providing a number of examples and demonstrators it is discussed how basic perception could enable a shift from explicit towards implicit HCI.

2 Implicit Interaction
Observing communication between humans we can see that a lot of information is only exchanged implicitly. The way people interact with each other and also the situation in which they interact carries information that is often implicitly exploited in the exchange of messages. In a conversation, the behavior of participants as well as what happens in the surrounding environment supplies valuable information that is often vital for the understanding of messages. In many cases the robustness of human-to-human communication is based on the implicitly introduced contextual information, such as gestures, body language, and voice. Another example is redundancy between body language (e.g. nodding) and spoken languages (e.g. the word ‘yes’). This implicitly
introduced knowledge is also used to disambiguate information; e.g. in a discussion with a student pointing at a computer the term ‘sun’ has a different meaning than the same term when on the beach together with friends; a more in-depth discussion is given in [11].

2.1 Implicit vs. Explicit Human Computer Interaction
Considering current computer technology, interaction is explicit – the user tells the computer at a certain level of abstraction (e.g. by command-line, direct manipulation using a GUI, gesture, or speech input) what she expects the computer to do. This is considered as explicit interaction.

Definition: Implicit Human Computer Interaction
Implicit human computer interaction is an action, performed by the user, that is not primarily aimed to interact with a computerized system but which such a system understands as input.

The action of a user is always performed in a certain environment. Implicit interaction is based on the assumption that the computer has a certain understanding of our behavior in the given situation. This knowledge is then considered as an additional input to the computer while doing a task.

A simple example is the garbage bin [14] that scans in the bar codes of products and reproduces the information for a suggested shopping list. The action performed by the user (e.g. throwing an empty can into the bin) is the same as with any other garbage bin. The recognition of the system (by scanning the bar code) and the built-in interpretation of the system (all things that go into the bin may be on the next shopping list again) make use of the action performed by the user. The user herself does not explicitly interact with the computer; thus the process describes an implicit interaction. As we see from the example, implicit interaction is based on two main concepts:

• perception
• interpretation.

For most applications implicit interaction will be used additionally to explicit interaction. There are other systems implemented that also facilitate the idea of implicit interaction on a rudimentary level, e.g. automatic light control (switches on the light when it is dark and someone is walking by) and active badge systems (automatically open a door when someone with appropriate permission likes to enter the building). In current computer systems we can observe that agent technology is used to build systems that have a certain ability to act proactively. These approaches are mainly based on user profiles and usage information [13]. In these cases perception is limited to information gathered in the virtual space.

If we look at the concepts that are needed to facilitate implicit interaction, three basic building blocks can be identified:
1. the ability to have perception of the user, the environment, and the circumstances,
2. mechanisms to understand what the sensors see, hear and feel, and
3. applications that can make use of this information.

On a conceptual level, (1) and (2) can be described as situational context, and (3) are applications that are context-enabled. In the next section context is discussed in more detail.

2.2 What is Context
The notion of context is used in many different ways. In our work we propose to regard situational context, such as location, surrounding environment or state of the device, as implicit input to the system. We use the term situational context to describe implicit interaction fragments. This extends the concept of context beyond the informational context into real world environments.

The word context in general use has a multitude of meanings. Even within the field of computer science different disciplines, such as artificial intelligence, natural language processing, image recognition, and more recently mobile computing, have their very own understanding of what context is. In our work we found that very general descriptions of context as given by a dictionary and also synonyms found in a thesaurus come very close to our understanding. To illustrate this we like to provide the following definitions:

Context n 1: discourse that surrounds a language unit and helps to determine its interpretation [syn: linguistic context,
context of use] 2: the set of facts or circumstances that surround a situation or event; "the historical context" (Source: WordNet ® 1.6)

Context: That which surrounds, and gives meaning to, something else. (Source: The Free On-line Dictionary of Computing)

Synonyms Context: Circumstance, situation, phase, position, posture, attitude, place, point; terms; regime; footing, standing, status, occasion, surroundings, environment, location, dependence. (Source: www.thesaurus.com)

To build applications that have knowledge about their situational context it is important to gain an understanding of what context is. Current research in context-awareness in mobile computing shows a strong focus on location [1], [12]. Location is a concept that is well understood. Also the benefit of location-awareness is clearly given: at certain locations particular services are more important than others. An architectural approach, based on a smart environment, is described by Schilit et al. [17]. Other scenarios use RF and GPS to determine the user's location, e.g. [4], [15]. But, as pointed out in [20], context is more than location. We use the term context considering mobile computing in a more general way, as also suggested by [2], to describe the environment, situation, state, surroundings, task, and so on. A wider view of context is also given by [19]. They suggest considering the way a device is used (mobile phone in the user's hand, on the table, in a pocket, etc.) to be treated as context.

2.3 Applications in Context
Analyzing the way people use ultra-mobile devices (e.g. personal digital assistants, smart mobile phones, handheld and wearable computers) it becomes apparent that the periods of interaction are much shorter than in traditional mobile settings. Notebooks – considered as mobile computers – are mainly used in stationary settings, e.g. one takes a notebook to a meeting and takes notes, or a salesman takes a mobile computer to a customer for a presentation. In general in these scenarios the application is used in a stationary setting for between several minutes and hours. Considering the usage of ultra-mobile devices, interaction periods are often much shorter: looking up an address takes only a few seconds, and making a note on a PDA is often in the range of several seconds up to some minutes. Also the fact that the applications are mainly used while doing something else or to carry out a certain task (like tools in the real world) calls for a reduction of the explicit human-machine interaction and creates the need to shift towards implicit HCI.

Knowledge about the situational context is of primary interest to the application, because we consider that the application will adapt to the context. It can be observed that an application (mobile or stationary alike) is:
(a) running on a specific device (e.g. input system, screen size, network access, portability, etc.),
(b) at a certain time (absolute time e.g. 9:34 p.m., class of time e.g. in the morning),
(c) used by one or more users (concurrently or sequentially),
(d) in a certain physical environment (absolute location, type of location, conditions such as light, audio, and temperature, infrastructure, etc.),
(e) in a social setting (people co-located and social role),
(f) to solve a particular task (single task, group of tasks, or a general goal).

We consider the items (a) to (f) as the basic building blocks of context. For mobile applications especially (d) and (e) are of major interest. In mobile settings the physical environment can change while an application is executed, e.g. making a phone call while walking from the office desk to the car park: the telephone application is running while the noise level changes between office and outside.

2.4 Identifying Implicit Human Computer Interaction
To identify applications that can be improved by implicit HCI, the input and output of the application and the real world environment in which it is executed have to be analyzed. Then ways to capture the situational context must be assessed. Furthermore mechanisms for the interpretation of the situational context have to be found. Finally the reaction of the application has to be defined. The following questions help to identify these points:
• What happens around an application while the application is in use? Are there any changes at all?
• Do the surroundings (behavior, environment, circumstances) carry any valuable information for the application? Does it matter for the application?
• Are there any means to capture and extract the information in a way that is acceptable for the application or device (processing cost, sensor cost, weight, etc.)?
• How to understand the information? What interpretation and reasoning is possible and useful? What is an appropriate way for the application to react?

Putting all of these together we can set up the algorithm in figure 1. The algorithm works as follows:

On 1: C is the set of surrounding conditions that carry information that is useful for the application. Each element Ci stands for one condition, e.g. location, temperature, current user, device orientation, etc. The set is created by asking what conditions change in the environment.

On 2: D is initialized – at the beginning no sensing devices are identified.

On 3: For each Ci the accuracy Ai and the update rate Ui that are needed to make the measurements useful are defined. Then a sensing device that matches these requirements is identified. If the cost for the identified sensing device Si is acceptable, then the vector describing the condition, the sensing device, the required accuracy and update rate is added to the set D. For conditions that cannot be sensed the cost is infinite.

On 4: If any conditions that are feasible to measure exist, then for each of these conditions one or more range values that are meaningful (temperature between 15°C and 25°C, location is inside my office, user is moving, etc.) are identified, and for these ranges the reaction of the application (switch to notepad, etc.) is defined.

2.5 Modeling Implicit Human Computer Interaction
To specify applications that facilitate implicit HCI it is inevitable to have a specification language to describe situational context linked to events/changes that occur in the application. In our recent work we found it helpful to use a notation that is human readable as well as easy to process using a computer. We decided to use a markup language that is specified in XML for this
1. create the set C
2. set D = {}
3. for each Ci ∈ C
       define Ai        // accuracy
       define Ui        // update rate
       identify Si      // a sensor device that is appropriate
       if cost(Si, Ai, Ui) is acceptable then
           D = D ∪ {(Ci, Si, Ai, Ui)}
       fi
   next
4. if D ≠ {} then
       for each vector Di in D
           define a set of application reactions Ri = {(Iij, Rij)}
           // Iij is an input range
           // Rij is the application reaction
   else
       // implicit interaction is not used
       // (either no conditions that are useful, or too costly)

Figure 1: Identifying Implicit HCI.
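The algorithm of figure 1 translates into runnable code almost directly; in the sketch below the sensor catalogue, the flat cost model, and the budget parameter are invented placeholders:

```python
def identify_implicit_hci(conditions, find_sensor, cost, budget):
    """Steps 1-3 of figure 1: keep only conditions for which an
    affordable sensor with the required accuracy/update rate exists."""
    D = []
    for c in conditions:
        a, u = c["accuracy"], c["update_rate"]
        s = find_sensor(c["name"], a, u)          # identify Si
        # conditions that cannot be sensed behave as if cost were infinite
        if s is not None and cost(s, a, u) <= budget:
            D.append((c["name"], s, a, u))        # D = D ∪ {(Ci, Si, Ai, Ui)}
    # step 4: if D is non-empty, the caller defines reaction pairs (Iij, Rij)
    return D

# illustrative sensor catalogue and flat cost model
catalogue = {"light": "photodiode", "movement": "accelerometer"}
find = lambda name, a, u: catalogue.get(name)
flat_cost = lambda s, a, u: 1

conditions = [
    {"name": "light", "accuracy": 0.1, "update_rate": 1.0},
    {"name": "mood", "accuracy": 0.5, "update_rate": 0.1},   # no sensor for this
]
D = identify_implicit_hci(conditions, find, flat_cost, budget=5)
assert [d[0] for d in D] == ["light"]
```

Only the feasible condition survives; an unsensable one such as "mood" is silently dropped, mirroring the infinite-cost rule.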
purpose. Extending the SGML based description model introduced by Brown in [2], [3] we added two more concepts – grouping context with matching attributes, and trigger attributes – to make the description more expressive and suitable for our projects. See figure 2 for the XML data type definition (DTD).

<!ELEMENT context_interaction (context , action )>
<!ELEMENT context (group+ )>
<!ELEMENT group (#PCDATA )>
<!ATTLIST group match
    (one | all | none ) #REQUIRED >
<!ELEMENT action (#PCDATA )>
<!ATTLIST action
    time CDATA '0'
    trigger (enter | leave | in ) #REQUIRED >

Figure 2: Data Type Definition

In the <context> section contextual variables are used to describe the conditions. These variables are made of two parts: the first is used to specify the context sensing module – in figure 3 the sensor module (sensor_module) and the palm pilot (pilot) – and the second part the variables provided by this module.

In the <action> section function calls are used to specify the action to be carried out in case the trigger evaluates to true. These calls are also hierarchically structured, specifying the device, the application, and the function to be performed. Depending on the platform (e.g. context sensing module in a microcontroller) we use a different implementation language.

If contexts are composed of a number of components we found it very helpful to have a mechanism to bundle certain contextual variables in groups and select a matching semantic for each group description. For matching in a group we provided the following semantics: one (match one or more of the variables in the following group), all (match all variables in the following group), none (match none of the variables in the following group). All groups within the context description must evaluate to true to cause the trigger.

We discriminate three different triggers: ‘enter a context’, ‘leave a context’, and ‘while in a context’. The ‘enter’ and ‘leave’ triggers take a time value that specifies the time after which1 the action is triggered if the context stays stable over this time. For the ‘while in a context’ trigger the time indicates the interval in which the trigger is fired again.

<context_interaction>
  <context>
    <group match='one'>
      sensor_module.touch
      pilot.on
    </group>
    <group match='none'>
      sensor_module.alone
      pilot.pen_down
    </group>
  </context>
  <action trigger='enter' time='3'>
    pilot.notepad.confidential
  </action>
</context_interaction>

Figure 3: Context description

In figure 3 an example of a description of a context and an action is shown. The context description consists of two groups of contextual variables. In the first group the match semantics is that at least one of the variables must be true: in this case either the device is touched or the state of the device is on. In the second group the match semantics is ‘none’, which means that the contextual variable alone must not be true and that the user must not have touched the screen with the pen.

If the context evaluates to true, an action is triggered. Here the semantics is that if the context is entered and is stable for at least three seconds then the action is performed. The complete description means that if the device is on or in the user's hand, and if the user is not alone and he is not touching the screen with the pen, then after three seconds the display should be hidden by an image as depicted in figure 6(d) in the later section.

1 The parameter indicating the time after which an action is performed is often 0 (immediate context action coupling) or positive. In certain circumstances, when future situations can be predicted (e.g. you drive your car into the parking, the context walking will appear soon), a negative value does make sense, too.
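The group-matching semantics can be checked mechanically; a sketch using Python's standard XML parser against a figure-3-style description, where the truth values of the contextual variables are supplied as a plain dictionary (trigger timing is omitted here):

```python
import xml.etree.ElementTree as ET

DESCRIPTION = """
<context_interaction>
  <context>
    <group match='one'>sensor_module.touch pilot.on</group>
    <group match='none'>sensor_module.alone pilot.pen_down</group>
  </context>
  <action trigger='enter' time='3'>pilot.notepad.confidential</action>
</context_interaction>
"""

def group_matches(group, variables):
    truth = [variables.get(name, False) for name in group.text.split()]
    match = group.get("match")
    if match == "one":            # at least one variable true
        return any(truth)
    if match == "all":            # every variable true
        return all(truth)
    return not any(truth)         # 'none': no variable may be true

def context_active(root, variables):
    # all groups must evaluate to true to cause the trigger
    return all(group_matches(g, variables)
               for g in root.find("context").findall("group"))

root = ET.fromstring(DESCRIPTION)
# device in hand but the user is alone: the 'none' group fails, no trigger
assert not context_active(root, {"sensor_module.touch": True,
                                 "sensor_module.alone": True})
# device on, user neither alone nor using the pen: the context is entered
assert context_active(root, {"pilot.on": True})
```

A full implementation would additionally track how long the context stays stable before firing the 'enter' action, as the time attribute demands.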
3 Perception
There are several ways to equip devices with perceptional capabilities. The range of complexity to consider is very wide, starting from simple sensors that know the way a device is held [18] to complex audio and video analysis. We identified the following four basic approaches:

• device databases (e.g. calendars, todo-lists, address books, profile, etc.)
• input to the application running (notepad – taking notes, calendar – looking up a date, etc.)
• active environments (active badges [10], IR-networks, cameras, audio, etc.)
• sensing context using sensors (TEA [5], [21], Sensor Badges [1], GPS, cameras, audio [16], etc.)

The perceptual capabilities can be located in the device itself, in the environment, or in another device that shares the context over a network (e.g. a body area network). In the remainder of this section we concentrate on sensor based perception, also knowing that in most scenarios a combination of all four cases is the way of choice. First we introduce two sensor devices developed in our group and then provide some information on other sensor based devices that supply contextual information.

3.1 Context Awareness Component
In this part a wearable context-awareness component that integrates low-cost sensors is described. Simple methods are used to derive context information from sensor data. The derived context is application-independent and can be exploited by other wearable or personal technologies in a body network, for instance wearable computers, mobile phones, digital cameras, and personal digital assistants.

Here we chose to address a number of contexts that relate to how interruptible the user is. These contexts describe only a certain aspect of real world situations but they are general in the sense that they can be exploited by a range of applications. Such context is for instance implemented and used in the audio wearable described in [16], mimicking the human ability to recognize situations in which it is rude to interrupt, for instance when a person is engaged in a conversation or giving a talk. Context information of this kind is also useful for other applications: for instance calendars, email notification, pagers and mobile phones can make use of any context that gives an indication of whether or not it is a good time to interrupt a user.

The specific contexts that we chose for our study are based on aural information – user speaking, others speaking, noisy, and quiet – and on movement of the user: walking, running, stationary. Movement context was included as it gives an indication as to whether a user can be interrupted visually.

For recognition of aural and movement contexts, we integrated two microphones and an accelerometer in our design. One of the microphones is placed near the user's throat, the other pointing away from the user. With this configuration the distinction of speaker and environment is feasible with minimal processing cost. The acceleration sensor is used to discriminate whether a user is standing still, walking or running.

The sensor placement considerations led us to build the context-awareness component into a tie – it may be considered to build them into other accessories worn in similar ways (e.g. jewelry, neckerchief, or necklace). We also liked that a tie stresses the component's design as accessory rather than as stand-alone device, see figure 4.

Figure 4: Context-Awareness Tie.

The hardware of our context-awareness component is built around a TINY-Tiger microcontroller, which offers four analog inputs
and two serial lines. The two signals from the microphones are amplified and connected to the analog inputs. To measure the motion we used a two-axis accelerometer (Analog Devices ADXL202). A more detailed description has been published in [22].

The software is realized in Tiger-BASIC, a multitasking BASIC dialect for the TINY-Tiger. It reads and analyzes sensor data in a time window of about four seconds. The methods to analyze the signals are deliberately simple; they work within the time domain and are based on basic statistical measurements. Based on the features calculated from sensor data the contexts are detected.

The communication is based on a serial line connection using 9600 bit/s, in a simple request-reply manner. The client requests the contextual variables from the context-awareness component, which sends back the variables together with the values.

Experimentation with the context-aware tie showed that contexts were recognized in a very reliable way. Both ‘user speaking’ vs. ‘others speaking’ and ‘stationary’ vs. ‘walking’ vs. ‘running’ were discriminated correctly. A key finding is that sensor placement can be used effectively to increase reliability and to reduce required processing.

The device can provide information on the situational context of the user for other personal technologies in a body area network. Using this device the implicit HCI can be facilitated.

Figure 5: Context Sensing Device and PalmPilot

3.2 Sensor Board
Using this board we collect data on the situational context by using a combination of low level sensors. In this project we built a context recognition device equipped with a light sensor, an acceleration sensor, a passive infrared sensor, a touch sensor, and a temperature sensor. All sensors but the touch sensor are standard sensors and produce analog voltage levels. The touch sensor recognizes the human body as a capacitor and supplies a digital value. The heart of the device is a BASIC-Tiger microcontroller that reads from all the physical input channels (it offers four analog digital converters and a number of digital IOs); statistical methods are applied to recognize contexts. The board is depicted in figure 5. The PDA requests contextual variables while the application is idle, e.g. catching the NullEvent on the PalmPilot.

3.3 Related Work on Context Sensing
In robotics this way of perception is widely used but with a different objective – giving machines the ability to operate autonomously.

For the use with handheld devices the project TEA [5] developed a sensor board (equipped with 8 sensors: light, acceleration, pressure, temperature, etc.) that supplies contextual information; communication is done via serial line. The application described is a mobile phone that recognizes its context (in the user's hand, on the table, in a suitcase, outdoors) and adapts ringing modes according to the user's preferences in that situation [19].

Using a similar approach, a system to facilitate indoor location awareness based on low level sensors is described in [8]. The system reads data from different sensors (acceleration, light, magnetic field, etc.) and provides location information.

In [7] a cup is described that has an acceleration and temperature sensor built in together with a microcontroller and infrared communication. The cup is aware of its state (warm, cold, on a table, drinking, moved). The information from a number of cups communicated to a server is then used to supply information about the group of users. All these projects focus on a completely sensor based approach to context awareness.

A jacket that knows if it is on the hanger or with the user is presented in [6]. The sensor jacket has
woven in sensors that give information if the user concern. Implicit HCI does not solve these
is wearing the jacket, what movements the user is problems in general but can help to:
making, etc. As one application correcting •= adapt the input system to the current situation
movements in sports (automated tennis coach) is (e.g. audio filter, recognition algorithms, etc)
suggested in the paper. In this project the
development of robust sensing technology is very •= limit need for input (e.g. information is
central. already provided by the context and can be
captured)
4 How Can HCI benefit from •= reduce selection space (e.g. only offer
appropriate options in current context)
Context?
HCI for mobile devices is concerned with the 4.3 ContextNotePad on a PalmPilot
general trade-off between devices qualities (e.g. To explore ways of implicit communication
small size, light-weight, little energy between users and their environment with mobile
consumption, etc.) and the demand for optimal devices we built a context aware NotePad
input-output capabilities. Here implicit HCI can application. The system uses the perceptional
offer interesting alternatives. capabilities of the sensor board, described in the
previous section and provides an application that
4.1 Output in Context
is very similar in functionality as the built-in
Over recent years the output systems for mobile devices became much better; features such as stereo audio output, high-resolution color screens for PDAs and even on mobile phones, as well as display systems for wearable computers, are commercially available. Also unobtrusive notification mechanisms (e.g. vibration) have become widely used in phones and PDAs. Still, on the lower end, devices with very poor display quality enter the market. Situational context can help to:

• adapt the output to the current situation (font size, volume, brightness, privacy settings, etc.) [19].
• find the most suitable time for interruption [16], [22].
• reduce the need for interruptions (e.g. you don't need to remind someone to go to a meeting if he is already there).

4.2 Input in Context

Considering very small appliances, the space for a keyboard is very limited, which results in bad usability. Other input systems, such as graffiti and handwriting recognition, have been developed further but still lack in speed and accuracy [9]. Advances in voice recognition have been made in recent years, but for non-office settings (e.g. in a car, in a crowded place, sharing rooms with others, and in industry workplaces) the recognition performance is still poor. Also privacy and acceptance issues are a major

notepad application on the PalmPilot. Additionally the application can adapt to the current situational context and can also react in this way to the implicit interaction. The application changes its behavior according to the situation. The following context adaptations have been implemented.

• On/Off. The user has the device in her hand. In this case the application is switched on; if the user puts the device out of her hand, it is switched off after a certain time. It assumes that if the user takes the device in her hand she wants to work with the device.

• Font size. If the device is moved (e.g. while walking or on a bumpy road) the font size is increased to ease reading, whereas with the device in a stable position (e.g. stationary in your hand or on the table) the font is made smaller to display more text on the same screen, see figure 6(a) and (b).

• Backlight. This adaptation is straightforward but still not built into current PDAs. By monitoring the light condition, the application switches on the backlight if the brightness level in the environment is below a certain threshold. Accordingly, if it becomes brighter the light is switched off again, see figure 6(c).

• Privacy settings. If you are not alone and you are not writing (or touching the screen), the content on the display is hidden by an image, see figure 6(d). To sense if someone is walking, the passive infrared sensor is deployed.
Figure 6: Adaptation to Context a) small font, b) large font, c) backlight, d) privacy
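The adaptations shown in Figure 6 boil down to simple threshold rules over sensor readings. As a rough sketch (hypothetical sensor values and names; the paper gives no code), the backlight rule benefits from hysteresis, i.e. two thresholds instead of one, so the light does not flicker when brightness hovers near a single cut-off:

```python
# Sketch of threshold-based context adaptation (illustrative names only).
# Two thresholds (a dead band) give the backlight rule hysteresis.

BRIGHT_ON = 10   # lux: below this, switch the backlight on
BRIGHT_OFF = 20  # lux: above this, switch it off again

class ContextNotepad:
    def __init__(self):
        self.backlight = False
        self.font = "small"

    def on_light(self, lux):
        # Backlight adaptation: inside the dead band the current state is kept.
        if lux < BRIGHT_ON:
            self.backlight = True
        elif lux > BRIGHT_OFF:
            self.backlight = False

    def on_motion(self, moving):
        # Font-size adaptation: large while moving, small when stable.
        self.font = "large" if moving else "small"
```

For example, a reading of 15 lux leaves the backlight in whatever state it was already in, which is exactly the flicker-free behavior a single threshold cannot give.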
Currently we decrease the size of the context-awareness device to make it feasible to plug it into the pilot to allow proper user studies.

5 Conclusion and Further Work

Based on observations of new sensing technology, available sensors and anticipated users, a new interaction metaphor is proposed. Implicit HCI is defined as an action performed by the user that is not primarily aimed at interacting with a computerized system but which such a system understands as input. It is further identified that perception and interpretation of the user, the environment, and the circumstances are key concepts for implicit HCI. Furthermore, applications that exploit this information are required.

Perception and interpretation are considered as situational context. Therefore we motivate a broad view of context, and also suggest that the context is described from the perspective of the application. To identify applications that can make use of situational context and thus can facilitate implicit HCI, a number of questions are raised and an algorithm is suggested. It is based on the central questions: what happens around the application, how can this be sensed or captured, how can this information be interpreted, and how can applications make use of it?

Basic mechanisms of perception to acquire situational context are discussed. In the first example a wearable context-awareness component built into a tie is described. Also a sensor-based context-awareness device is introduced. Both devices supply context to other devices via a simple request-reply protocol over the serial line.

In a further section, benefits of implicit interaction through situational context to HCI are discussed. In an example implementation, the feasibility of the concepts introduced earlier is demonstrated.

From current projects we learned that there is a need for a simple specification language for implicit HCI, based on situational context. We propose an XML-based markup language that supports three different trigger semantics. The language is easily human readable and also easy to process.

References

[1] Beadle, P., Harper, B., Maguire, G.Q. and Judge, J. Location Aware Mobile Computing. Proc. of IEEE Intl. Conference on Telecommunications, Melbourne, Australia, April 1997.
[2] Brown, P.J., Bovey, J.D., Chen, X. Context-Aware Applications: From the Laboratory to the Marketplace. IEEE Personal Communications, October 1997.
[3] Brown, P.J. The stick-e Document: A Framework for Creating Context-Aware Applications. Proc. EP'96, Palo Alto, CA (published in EP-odds, vol. 8, no. 2, pp. 259-272), 1996.
[4] Cheverst, K., Blair, G., Davies, N., and Friday, A. Supporting Collaboration in Mobile-aware Groupware. Personal Technologies, Vol. 3, No. 1, March 1999.
[5] Esprit Project 26900. Technology for Enabling Awareness (TEA). 1998. http://tea.starlab.net/.
[6] Farringdon, J., Moore, A.J., Tilbury, N., Church, J., Biemond, P.D. Wearable Sensor Badge & Sensor Jacket for Context Awareness. In Proceedings of the Third International Symposium on Wearable Computers, San Francisco, 18-19 Oct. 1999.
[7] Gellersen, H-W., Beigl, M., Krull, H. The MediaCup: Awareness Technology Embedded in an Everyday Object. 1st International Symposium on Handheld and Ubiquitous Computing (HUC99), Karlsruhe, Germany, 1999.
[8] Golding, A., Lesh, N. Indoor Navigation Using a Diverse Set of Cheap Wearable Sensors. In Proceedings of the Third International Symposium on Wearable Computers, San Francisco, 18-19 Oct. 1999.
[9] Goldstein, M., Book, R., Alsiö, G., Tessa, S. Non-Keyboard QWERTY Touch Typing: A Portable Input Interface for the Mobile User. Proceedings of CHI 99, Pittsburgh, USA, 1999.
[10] Harter, A. and Hopper, A. A Distributed Location System for the Active Office. IEEE Network, Vol. 8, No. 1, 1994.
[11] Lenat, D. The Dimensions of Context Space. 1999. http://www.cyc.com/publications.html.
[12] Leonhard, U., Magee, J., Dias, P. Location Service in Mobile Computing Environments. Computer & Graphics, Special Issue on Mobile Computing, Volume 20, Number 5, September/October 1996.
[13] Maes, P. P. Maes on Software Agents: Humanizing the Global Computer. IEEE Internet Computing, July-August 1997.
[14] NCR Corp. Mülleimer informiert Supermarkt. http://www.heise.de/newsticker/data/anm-28.10.99-001/.
[15] Pascoe, J., Ryan, N.S., and Morse, D.R. "Human Computer Giraffe Interaction: HCI in the Field". Workshop on Human Computer Interaction with Mobile Devices, University of Glasgow, United Kingdom, 21-23 May 1998, GIST Technical Report G98-1, 1998.
[16] Sawhney, N., and Schmandt, C. "Nomadic Radio: A Spatialized Audio Environment for Wearable Computing." Proceedings of the International Symposium on Wearable Computing, Cambridge, MA, October 13-14, 1997.
[17] Schilit, B.N., Adams, N.L., Want, R. Context-Aware Computing Applications. Proc. of the Workshop on Mobile Computing Systems and Applications, Santa Cruz, CA, December 1994. IEEE Computer Society, 1994.
[18] Schmidt, A., Beigl, M., Gellersen, H-W. Sensor-Based Adaptive Mobile User Interfaces. In Proceedings of the 8th International Conference on Human-Computer Interaction, München, Germany, August 1999.
[19] Schmidt, A., Aidoo, K.A., Takaluoma, A., Tuomela, U., Van Laerhoven, K., Van de Velde, W. Advanced Interaction in Context. 1st International Symposium on Handheld and Ubiquitous Computing (HUC99), Karlsruhe, Germany, 1999, and Lecture Notes in Computer Science, Vol. 1707, ISBN 3-540-66550-1, Springer, 1999.
[20] Schmidt, A., Beigl, M., Gellersen, H.-W. There Is More to Context than Location. Proc. of the Intl. Workshop on Interactive Applications of Mobile Computing (IMC98), Rostock, Germany, November 1998.
[21] Schmidt, A., Forbess, J. What GPS Doesn't Tell You: Determining One's Context with Low-Level Sensors. The 6th IEEE International Conference on Electronics, Circuits and Systems, September 5-8, 1999, Paphos, Cyprus, 1999.
[22] Schmidt, A., Gellersen, H-W., Beigl, M. A Wearable Context-Awareness Component - Finally a Good Reason to Wear a Tie. In Proceedings of the Third International Symposium on Wearable Computers, San Francisco, 18-19 Oct. 1999.
HCI

Design Rules

Presenter

Stephen Kimani
Universita' di Roma "La Sapienza"
DIS
Via Ariosto 25
00185 Rome
Italy

Web: http://www.dis.uniroma1.it/~kimani
E-mail: stephenkimani@gmail.com
Roadmap

• Introduction
• Usability Principles
• Heuristics and Golden Rules
• AOB

Introduction

• Design rules (or usability rules) are rules that a designer can follow in order to
increase the usability of the system/product e.g., principles, standards, guidelines.

• NB: Differences [based on level of abstraction/generality and level of authority]:


Principles
(e.g. interface should be easy to navigate)
Abstract and have high generality & low in authority.
Widely applicable and enduring.

Guidelines
(e.g. use colour to highlight links)
Can guide/advise on how to achieve a principle.
Narrowly focused.
Can be too specific, incomplete, & hard to apply BUT they are more general and
lower in authority than Standards (e.g. use colour RGB #1010D0 on home links)
which are very specific & high in authority.

• Principles:
Example - usability principles by Dix et al (HCI book)

• Standards: They are often set by national (eg British Standards Institution) or
international bodies (ISO).
Example [of standards] - ISO 9241 "Ergonomic Requirements for Office Work with
Visual Display Terminals (VDTs)"

• Guidelines:
Example - Smith and Mosier's "Guidelines for User Interface Software" [MITRE
Corporation 1986].

• Design rules should be used early in the lifecycle [e.g., during the design;
note that they can also be used to evaluate the usability of the system]

• We will:
 First look at abstract principles for supporting usability
 Later on, we will look at the most well used and well known sets of
heuristics or 'golden rules', which tend to provide a succinct summary of
the essential characteristics of good design (Nielsen's heuristics,
Shneiderman's golden rules and Norman's principles [the last set, study
on your own])

Usability Principles

by Dix et al (HCI book)


1. Learnability: the ease with which new users can begin effective interaction and
achieve maximal performance.
2. Flexibility: the multiplicity of ways the user and system exchange information.
3. Robustness: the level of support provided to the user in determining successful
achievement and assessment of goal-directed behavior.

1. Learnability
The ease with which new users can begin effective interaction and achieve
maximal performance.
• Predictability, Synthesizability, Familiarity, Generalizability, Consistency.
Learnability (contd.)
• Predictability: support for the user to determine the effect of future action based on
past interaction history (can I ‘tell’ what will happen based on what I have gone
through in past?).
Learnability (contd.)
• Synthesizability: support for the user to assess the effect of past operations on the
current state (can I ‘tell’ why I am here based on what I have gone through in the
past?).
Learnability (contd.)
• Familiarity: the extent to which a user's knowledge and experience in other real-
world or computer-based domains can be applied when interacting with a new
system.
Learnability (contd.)
• Generalizability: support for the user to extend knowledge of specific
interaction within and across applications to other similar situations.

• Consistency: likeness in input-output behavior arising from similar situations or similar task objectives.

PS: Familiarity can be considered as 'consistency' wrt past real-world experience;
Generalizability as 'consistency' wrt experience with the same system or set of applications on the same platform.
2. Flexibility
The multiplicity of ways the user and system exchange information.
• Dialogue initiative, Multithreading, Task migratability, Substitutivity,
Customizability.

• Dialogue initiative: user freedom from artificial constraints on the input dialog imposed by the system; user vs system - who has the initiative in the dialog?
Flexibility (contd.)
• Multithreading: the ability of the system to support user interaction for more
than one task at a time.
Flexibility (contd.)
• Task migratability: the ability to transfer control for execution of tasks between the system and the user (consider e.g., the spell-checking task).
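In the spell-checking example, migratability means the correction task can sit with either party. A minimal sketch (hypothetical names, not from the slides):

```python
# Task migratability: the fix can be made by the system (auto=True)
# or left to the user via a decision callback.

CORRECTIONS = {"teh": "the", "recieve": "receive"}

def spell_check(text, auto=False, ask_user=None):
    out = []
    for word in text.split():
        suggestion = CORRECTIONS.get(word)
        if suggestion and (auto or (ask_user and ask_user(word, suggestion))):
            out.append(suggestion)  # task executed by system or approved by user
        else:
            out.append(word)        # user keeps control, word left alone
    return " ".join(out)
```

Passing `auto=True` migrates the task fully to the system; passing a callback keeps the user in the loop.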
Flexibility (contd.)
• Substitutivity: the extent to which an application allows equivalent input and
output values to be substituted for each other (values in input eg
fractions/decimals, values in output eg both digital and analog, output/input
eg output can be reused as input).
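A small sketch of substitutivity on the input side (illustrative function, not from the slides): the same value can be entered as a decimal, a fraction, or a percentage.

```python
from fractions import Fraction

def parse_measure(s):
    """Accept equivalent input representations of one value:
    '0.75', '3/4' and '75%' all denote the same quantity."""
    s = s.strip()
    if s.endswith("%"):
        return Fraction(s[:-1]) / 100
    return Fraction(s)  # the Fraction constructor handles both '3/4' and '0.75'
```
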
Flexibility (contd.)
• Customizability: the ability of the user or the system to modify the user interface (adaptability vs adaptivity: user- vs system-initiated modification).
3. Robustness
The level of support provided to the user in determining successful
achievement and assessment of goal-directed behavior.
• Observability, Recoverability, Responsiveness, Task conformance.

• Observability: the extent to which the user can evaluate the internal state of
the system from the representation on the user interface.
Robustness (contd.)
• Recoverability: the extent to which the user can reach the intended goal after
recognizing an error in the previous interaction.
Robustness (contd.)
• Responsiveness: a measure of the rate of communication between the user
and the system.
Robustness (contd.)
• Task conformance: the extent to which the system services support all the tasks the user would wish to perform, and in the way the user would wish to perform them.

Heuristics and Golden Rules

Jakob Nielsen’s 10 Usability Heuristics


1. Visibility of system status: the system should always keep users informed about
what is going on, through appropriate feedback within reasonable time.
2. Match between system and the real world: the system should speak the users'
language, with words, phrases and concepts familiar to the user, rather than
system-oriented terms. Follow real-world conventions, making information
appear in a natural and logical order.
3. User control and freedom: users often choose system functions by mistake and
will need a clearly marked "emergency exit" to leave the unwanted state without
having to go through an extended dialogue. Support undo and redo.
4. Consistency and standards: users should not have to wonder whether different
words, situations, or actions mean the same thing. Follow platform conventions.
5. Error prevention: even better than good error messages is a careful design
which prevents a problem from occurring in the first place.
6. Recognition rather than recall: make objects, actions, and options visible. The
user should not have to remember information from one part of the dialogue to
another. Instructions for use of the system should be visible or easily retrievable
whenever appropriate.
Jakob Nielsen’s 10 Usability Heuristics (contd.)


7. Flexibility and efficiency of use: accelerators -- unseen by the novice user --
may often speed up the interaction for the expert user such that the system can
cater to both inexperienced and experienced users. Allow users to tailor
frequent actions.
8. Aesthetic and minimalist design: dialogues should not contain information which
is irrelevant or rarely needed. Every extra unit of information in a dialogue
competes with the relevant units of information and diminishes their relative
visibility.
9. Help users recognize, diagnose, and recover from errors: error messages
should be expressed in plain language (no codes), precisely indicate the
problem, and constructively suggest a solution.
10. Help and documentation: even though it is better if the system can be used
without documentation, it may be necessary to provide help and documentation.
Any such information should be easy to search, focused on the user's task, list
concrete steps to be carried out, and not be too large.
Ben Shneiderman's 8 Golden Rules


1. Strive for consistency: layout, terminology, command usage, etc.
2. Cater for universal usability: recognize the requirements of diverse users and
technology. For instance add features for novices eg explanations, support
expert users eg shortcuts.
3. Offer informative feedback: for every user action, offer relevant feedback and
information; keep the user appropriately informed throughout the interaction.
4. Design dialogs to yield closure: help the user know when they have completed a
task.
5. Offer error prevention and simple error handling: prevention and (clear and
informative guidance to) recovery; error management.
6. Permit easy reversal of actions: to relieve anxiety and encourage exploration,
because the user knows s/he can always go back to previous states.
7. Support internal locus of control: make the user feel that s/he is in control of the
system, which responds to his/her instructions/commands.
8. Reduce short-term memory load: make menus and UI elements/items visible,
easily available/retrievable, ...
[Donald] Norman's 7 Principles [study on your own]


1. Use both knowledge in the world and knowledge in the head.
2. Simplify the structure of tasks.
3. Make things visible: bridge the gulfs of Execution and Evaluation.
4. Get the mappings right.
5. Exploit the power of constraints, both natural and artificial.
6. Design for error.
7. When all else fails, standardize.

AOB

Any Questions?
Usability Principles

John Stasko
Spring 2007

This material has been developed by Georgia Tech HCI faculty, and continues
to evolve. Contributors include Gregory Abowd, Al Badre, Jim Foley, Elizabeth
Mynatt, Jeff Pierce, Colin Potts, Chris Shaw, John Stasko, and Bruce Walker.
Permission is granted to use with acknowledgement for non-profit purposes.
Last revision: January 2007.

Agenda
• Usability Principles
– Why?
– System of principles
• Learnability
– Support for learning for users of all levels
• Flexibility
– Support for multiple ways of doing tasks
• Robustness
– Support for recovery
– Style guides

• Project preparation

6750-Spr ‘07 2

Good Design (our goal!)

“Every designer wants to build a high-quality interactive system that is admired by colleagues, celebrated by users, circulated widely, and imitated frequently.” (Shneiderman, 1992, p. 7)

…and anything goes!…


Why Principles & Guidelines?


• …Because, well, not everything goes…

• Intended to prevent many bad designs before they begin, or evaluate existing designs on a scientific basis
• Guidelines based on previous designs, experimental findings
• Rules can all be “broken” (but usually in order to satisfy another principle)

Concepts, Principles, Guidelines

• No “cookbooks”
• No simple, universal checklists
• There are many concepts, principles, and
guidelines
• Understand the higher level principles that
apply across situations, display types, etc.
• Implement the standards and guidelines
…a few details…


Many Sets of Design Principles

• Shneiderman, Designing the User Interface
• Dix, Finlay, Abowd, Beale, Human-Computer Interaction
• Foley et al, Computer Graphics: Principles and Practice
• And many more - including in styleguides, discussed later

Levels of Consideration

1. Meta-display level
– Apply to the whole system, across media & across
displays
– Focus on this in Basic Layout Stage

2. Display Layout
– Apply to groups of elements in a display
– Focus on this in Prototyping and Redesign

3. Element level
– Details about specific parts of a display
– Colors, sound attributes, symbols


UI Design Principles (Dix et al)

• Categories
– Learnability
• Support for learning for users of all levels
– Flexibility
• Support for multiple ways of doing tasks
– Robustness
• Support for recovery

• Always think about these in terms of meta-display, display, and element levels

1. Learnability Principles

• Ease with which new users can begin effective interaction and achieve maximal performance
– Predictability
– Synthesizability
– Familiarity
– Generalizability
– Consistency


1.1 Predictability
• I think that this action will do….

• Operation visibility - can see available actions
– e.g. menus vs. command shell
– grayed menu items

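The grayed-menu idea can be stated as a tiny rule: every action stays visible, but its enabled state is computed from the current context. A hedged sketch (illustrative names, not from any toolkit):

```python
# Operation visibility: actions stay visible but are grayed out
# whenever the current state makes them inapplicable, so the user
# can predict what an action will (or won't) do.

def menu_state(has_selection, clipboard_empty):
    return {
        "Copy":  "enabled" if has_selection else "grayed",
        "Paste": "grayed" if clipboard_empty else "enabled",
    }
```
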
1.2 Synthesizability
• Support for user in assessing the effect of past operations on current system state

Can the user figure out what caused this error?

– Moving a file in UNIX shell vs. Mac/Windows
– Is same feedback needed for all users, all apps?


1.3 Familiarity

• Does UI task leverage existing real-world or domain knowledge?
– Really relevant to first impressions
– Use of metaphors
• Potential pitfalls
– Are there limitations on familiarity?

Metaphors at the UI - What

• Metaphor - Application of a name or descriptive term to another object to which it is not literally applicable
– Use: Natural transfer - apply existing knowledge to new, abstract tasks
– Problem: May introduce incorrect mental model


1.4 Generalizability

• Can knowledge of one system/UI be extended to other similar ones?
– Example: cut & paste in different applications
– Does knowledge of one aspect of a UI apply to rest of the UI?
– Aid: UI developers’ guidelines

1.5 Consistency

• Likeness in behavior between similar tasks/operations/situations
– In different things
• interacting
• output
• screen layout

• Is this always desirable for all systems, all users?


(In)Consistency Example - Macintosh


Drag a file icon to:               Result:
Folder on same physical disk       File is moved to folder
Folder on another physical disk    File is copied there
Different disk                     File is copied there
Trash can                          File is discarded

2. Flexibility Principles

• Multiplicity of ways that users and system exchange information
– Dialog Initiative
– Multithreading
– Task migratability
– Substitutivity
– Customizability


2.1 Dialog Initiative

• Not hampering the user by placing constraints on how dialog is done
– User pre-emptive
• User initiates actions
• More flexible, generally more desirable
– System pre-emptive
• System does all prompts, user responds
• Sometimes necessary

2.2 Multithreading

• Allowing user to perform more than one task at a time
• Two types
– Concurrent
• Input to multiple tasks simultaneously
– Interleaved
• Many tasks, but input to one at a time


2.3 Task Migratability

• Ability to move performance of a task to the entity (user or system) who can do it better
– Auto-pilot in planes
– Spell-checking
– Safety controls in plant

– For what kinds of tasks should the user be in control?

2.4 Substitutivity

• Flexibility in details of operations
– Allow user to choose suitable interaction methods
– Allow different ways to
• perform actions, specify data, configure
– Allow different ways of presenting output
• to suit task & user


2.5 Customizability
• Ability of user to modify interface
– By user - adaptability
• Is this a good thing?

– By system - adaptivity
• Is this a good thing?

3. Robustness Principles

• Supporting user in determining successful achievement and assessment of goals
– Observability
– Recoverability
– Responsiveness
– Task Conformance


3.1 Observability
• Can user determine internal state of
system from what she perceives?
– Browsability
• Explore current state
(without changing it)
– Reachability
• Navigate through
observable states
– Persistence
• How long does observable state persist?

Observability - Role of Feedback
• Feedback helps create observability
• Feedback taxonomy (generally don’t need all of
these)
– “I understand what you have asked me to do”
– “I am doing what you have asked me to do”
• “And it will take me this much longer”
• Song and dance routine to distract user (busy interval
as opposed to idle interval)
• “And here are some intermediate results to keep you happy until I am done”
– “All done, what’s next?”

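The feedback stages in the taxonomy above map naturally onto notifications emitted by a long-running task. A hedged sketch (illustrative API, not from any real toolkit):

```python
# Emits the feedback taxonomy: acknowledge, report progress, announce completion.

def run_task(steps, notify):
    notify("acknowledged")                    # "I understand what you asked me to do"
    for i, step in enumerate(steps, 1):
        step()
        notify(f"progress {i}/{len(steps)}")  # "I am doing it, and this much remains"
    notify("done")                            # "All done, what's next?"
```
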

Observability – Acrobat Reader

Acrobat Reader with ToC to give context: forest is the bookmarks, tree is the single page

3.2 Recoverability

• Ability to take corrective action upon recognizing an error
– Difficulty of recovery procedure should relate to difficulty of original task
– Forward recovery
• Ability to fix when we can’t undo
– Backward recovery
• Undo previous error(s)

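Backward recovery is commonly implemented as an undo stack; a minimal sketch (illustrative names, not from the slides; forward recovery would instead apply a new, compensating command):

```python
# Backward recovery via an undo stack: each insert records enough
# state (here, the previous text length) to reverse itself.

class Editor:
    def __init__(self):
        self.text = ""
        self._undo = []

    def insert(self, s):
        self._undo.append(len(self.text))  # checkpoint before the change
        self.text += s

    def undo(self):
        if self._undo:                     # nothing to do on empty history
            self.text = self.text[:self._undo.pop()]
```
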

Do Not Set the User Up

• Make it hard for the user to make errors
– Instead of allowing them to make an error and then saying “tsk, tsk”

• Gray out disabled menu items


• Ask for confirmation of major actions

Do Not Set the User Up

• Don’t let the user do something that will lead to an error message


3.3 Responsiveness

• User’s perception of rate of communication with system
– Response time
• Time for system to respond in some way to user action(s)
– Users’ perceptions not always right
– Response OK if matches user expectations
– Once user enjoys fast response, it is hard to go back to slower one
• Dial-up versus DSL/Cable modem

Responsiveness
• Response to motor actions
– Keyboarding, mouse movement – less than 100 msecs
– Rich human factors literature on this
• Consistency is important – experimental results
– Users preferred longer but more consistent response times
– Times that differed 10%-20% were seen as the same
• Sometimes argued that too fast is not good
– Makes user feel like they need to do something quickly to keep up with the computer


3.4 Task Conformance

• Does system support all tasks user wishes to perform in expected ways?
– Task completeness
• Can system do all tasks of interest?
– Task adequacy
• Can user understand how to do tasks?

– Does it allow user to define new tasks?

Application

• In doing design and implementation of your project, revisit this list
• Assess your design against these usability principles


Styleguides

• Codify many of these principles for a particular look and feel
– Mac OS, Windows, Motif, Palm, Blackberry

• Developed in concert with toolkit, but go beyond toolkit

Typical TOC - MAC OS X
Introduction to the Apple Human Interface Guidelines
  What Are the Mac OS X Human Interface Guidelines?
  Who Should Read This Document?
  Organization of This Document
  Conventions Used in This Document
  See Also

Part I: Fundamentals
  Human Interface Design
  Human Interface Design Principles
  Keep Your Users in Mind
  The Development Process
  Design Decisions
  Managing Complexity
  Extending the Interface
  Involving Users in the Design Process

Part II: The Macintosh Experience
  First Impressions
  Packaging
  Installation
  General Installer Guidelines
  Setup Assistants
  Mac OS X Environment
  The Finder
  The Dock
  The File System
  Multiple Users
  Remote Log In
  Assistive Technologies
  Networking
  Application Services
  Displays
  The Always-On Environment
  Using Existing Technologies
  Providing User Assistance
  Internationalizing Your Application
  Storing Passwords
  Printing
  Choosing Colors
  Setting Fonts and Typography Characteristics
  Selecting Attributes Associated With People
  Speech Technologies

Part III: The Aqua Interface
  User Input
  The Mouse and Other Pointing Devices
  The Keyboard
  Selecting
  Editing Text
  Drag and Drop
  Drag and Drop Overview
  Drag and Drop Semantics
  Selection Feedback
  Drag Feedback
  Destination Feedback
  Drop Feedback
  Clippings
  Text
  Fonts
  Style
  Icons
  Icon Genres and Families
  Icon Perspectives and Materials
  Conveying an Emotional Quality in Icons
  Suggested Process for Creating Aqua Icons
  Tips for Designing Aqua Icons
  Cursors
  Standard Cursors
  Designing Your Own Cursors


More TOC
Menus
  Menu Behavior
  Designing the Elements of Menus
  The Menu Bar and Its Menus
  Contextual Menus
  Dock Menus

Windows
  Types of Windows
  Window Appearance
  Window Behavior
  Utility Windows
  The About Window
  Preferences Windows
  Inspectors and Info Windows
  Find Window
  Fonts Window and Colors Window

Dialogs
  Types of Dialogs and When to Use Them
  Dialog Behavior
  The Open Dialog
  Dialogs for Saving, Closing, and Quitting
  The Choose Dialog
  The Printing Dialogs

Controls
  Buttons
  Selection Controls
  Adjustment Controls
  Indicators
  Text Controls
  View Controls
  Tab View
  Grouping Controls

Layout Examples
  Positioning Controls
  Sample Layouts
  Grouping Controls
  Using Small and Mini Versions of Controls

Keyboard Shortcuts Quick Reference

Differences Between Mac OS X Versions

Document Revision History

Excerpt from OS X Styleguide
Drag and Drop Overview
Ideally, users should be able to drag any content from any window to any other window that accepts the content’s type. If the source
and destination are not visible at the same time, the user can create a clipping by dragging data to a Finder window; the clipping
can then be dragged into another application window at another time.

Drag and drop should be considered an ease-of-use technique. Except in cases where drag and drop is so intrinsic to an
application that no suitable alternative methods exist—dragging icons in the Finder, for example—there should always be another
method for accomplishing a drag-and-drop task.

The basic steps of the drag-and-drop interaction model parallel a copy-and-paste sequence in which you select an item, choose
Copy from the Edit menu, specify a destination, and then choose Paste. However, drag and drop is a distinct technique in itself and
does not use the Clipboard. Users can take advantage of both the Clipboard and drag and drop without side effects from each
other.

A drag-and-drop operation should provide immediate feedback at the significant points: when the data is selected, during the drag,
when an appropriate destination is reached, and when the data is dropped. The data that is pasted should be target-specific. For
example, if a user drags an Address Book entry to the “To” text field in Mail, only the email address is pasted, not all of the person’s
address information.

You should implement Undo for any drag-and-drop operation you enable in your application. If you implement a drag-and-drop
operation that is not undoable, display a confirmation dialog before implementing the drop. A confirmation dialog appears, for
example, when the user attempts to drop an icon into a write-only drop box on a shared volume, because the user does not have
privileges to open the drop box and undo the action.

(Color added for emphasis.)


Styleguides
• General User Interface Design Style Guides
– Apple Human Interface Guidelines (Mac OS X) Design Guidelines
– Microsoft User Interface Guidelines (Click in the left tree on User Interface Design...)
– Windows XP Guidelines
– Yale Web Style Guide (2nd Edition)
– Java Look and Feel Guidelines (version 1)
– Java Look and Feel Guidelines version 2
– Java Look and Feel Guidelines: Advanced Topics
– IBM 3D design Guidelines
– Silicon Graphics Indigo Magic User Interface Guidelines

• Open Source Usability Guidelines


– Motif Style Guide
– KDE User Interface Guidelines
– Gnome Human Interface Guidelines 1.0

• Corporate User Interface Standards and Guidelines (samples)


– Telstra Online Standards
– Taligent Human Interface Guidelines
– Ameritech Graphical User Interface Standards and Design Guidelines

• http://www.experiencedynamics.com/science_of_usability/ui_style_guides/

And More Styleguides ….
• Government funded Usability Guidelines
– MITRE Guidelines for Designing User Interface Software (US Airforce)
– Research based Web Design and Usability Guidelines (Dept. of Health and Human Services)
– Cancer Institute Usability Guidelines
– NASA User Interface Guidelines
– Canadian Command Decision Aiding Technology (COMDAT) Operator-Machine Interface (OMI) Style
Guide: Version 1.0

• Gaming Devices (J2ME games)


– Games Usability Guidelines (from Nokia)

• Wireless and Mobile Usability Guidelines


– Palm OS Design Guidelines
– Openwave GSM Guidelines
– Openwave Top 10 Usability Guidelines for WAP Applications
– Blackberry and RIM wireless handheld UI Developers Guide (PDF)
– Sprint Usability Requirements for XHTML (Application Developers Program)
– NTT DoCoMo imode service guideline (user interfaces)

• Accessibility Guidelines
– Techniques for Web content Accessibility Guidelines 1.0


Project

• Anyone without a team yet?


– You need to find one!!!

• Interesting topics?

Upcoming

• Human Capabilities
– Physical
– Cognitive

• Project team & topic due Thursday

