HCI M2 PDF
design rules
• Principles of usability
– general understanding
• Design patterns
– capture and reuse design knowledge
types of design rules
• principles
– abstract design rules
– low authority
– high generality
• standards
– specific design rules
– high authority
– limited application
• guidelines
– lower authority
– more general application
(design rules vary along two dimensions: increasing generality and increasing authority)
Principles to support usability
Learnability
the ease with which new users can begin effective
interaction and achieve maximal performance
Flexibility
the multiplicity of ways the user and system exchange
information
Robustness
the level of support provided to the user in determining
successful achievement and assessment of
goal-directed behaviour
Principles of learnability
Predictability
– determining effect of future actions based on
past interaction history
– operation visibility
Synthesizability
– assessing the effect of past actions
– immediate vs. eventual honesty
Principles of learnability (ctd)
Familiarity
– how prior knowledge applies to new system
– guessability; affordance
Generalizability
– extending specific interaction knowledge to new
situations
Consistency
– likeness in input/output behaviour arising from similar
situations or task objectives
Principles of flexibility
Dialogue initiative
– freedom from system imposed constraints on input
dialogue
– system vs. user pre-emptiveness
Multithreading
– ability of system to support user interaction for more
than one task at a time
– concurrent vs. interleaving; multimodality
Task migratability
– passing responsibility for task execution between user
and system
Principles of flexibility (ctd)
Substitutivity
– allowing equivalent values of input and
output to be substituted for each other
– representation multiplicity; equal opportunity
Customizability
– modifiability of the user interface by user
(adaptability) or system (adaptivity)
Principles of robustness
Observability
– ability of user to evaluate the internal state of the
system from its perceivable representation
– browsability; defaults; reachability; persistence;
operation visibility
Recoverability
– ability of user to take corrective action once an error
has been recognized
– reachability; forward/backward recovery;
commensurate effort
Principles of robustness (ctd)
Responsiveness
– how the user perceives the rate of
communication with the system
– Stability
Task conformance
– degree to which system services support all
of the user's tasks
– task completeness; task adequacy
Using design rules
Design rules
• suggest how to increase usability
• differ in generality and authority
(diagram: rules plotted along axes of increasing generality and increasing authority, with Standards at the high-authority end)
Evaluation Techniques
• Evaluation
– tests usability and functionality of system
– occurs in laboratory, field and/or in collaboration
with users
– evaluates both design and implementation
– should be considered at all stages in the design life
cycle
Goals of Evaluation
Cognitive Walkthrough
Heuristic Evaluation
Review-based evaluation
Cognitive Walkthrough
• Example heuristics
– system behaviour is predictable
– system behaviour is consistent
– feedback is provided
• Model-based evaluation
Laboratory studies
• Advantages:
– specialist equipment available
– uninterrupted environment
• Disadvantages:
– lack of context
– difficult to observe several users cooperating
• Appropriate
– if system location is dangerous or impractical
– for constrained single user systems
– to allow controlled manipulation of use
Field Studies
• Advantages:
– natural environment
– context retained (though observation may alter it)
– longitudinal studies possible
• Disadvantages:
– distractions
– noise
• Appropriate
– where context is crucial for longitudinal studies
Evaluating Implementations
Requires an artefact:
simulation, prototype,
full implementation
Experimental evaluation
• Subjects
– who – representative, sufficient sample
• Variables
– things to modify and measure
• Hypothesis
– what you’d like to show
• Experimental design
– how you are going to do it
Variables
• prediction of outcome
– framed in terms of the independent variable (IV) and dependent variable (DV)
• null hypothesis:
– states no difference between conditions
– aim is to disprove this
• Type of data
– discrete - finite number of values
– continuous - any value
Analysis - types of test
• parametric
– assume normal distribution
– robust
– powerful
• non-parametric
– do not assume normal distribution
– less powerful
– more reliable
• contingency table
– classify data by discrete attributes
– count number of data items in each group
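In practice these tests are usually run with a statistics package; a minimal sketch using SciPy, where the task-completion times and the completed/failed counts are made-up illustration data, not results from any study:

# Hypothetical task-completion times (seconds) for two interface designs.
from scipy import stats

design_a = [12.1, 10.4, 11.8, 13.0, 9.9, 12.5]
design_b = [14.2, 15.1, 13.8, 16.0, 14.9, 13.5]

# Parametric: independent-samples t-test (assumes roughly normal data).
t_stat, t_p = stats.ttest_ind(design_a, design_b)

# Non-parametric: Mann-Whitney U test (no normality assumption, less powerful).
u_stat, u_p = stats.mannwhitneyu(design_a, design_b)

# Contingency table: classify discrete outcomes (completed / failed) per design
# and apply a chi-squared test.
table = [[18, 2],   # design A: completed, failed (invented counts)
         [12, 8]]   # design B: completed, failed (invented counts)
chi2, chi_p, dof, expected = stats.chi2_contingency(table)

print(t_p, u_p, chi_p)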
Analysis of data (cont.)
Problems with:
– subject groups
– choice of task
– data gathering
– analysis
Subject groups
one solution:
– record from each perspective
Analysis
solutions:
– within groups experiments
– micro-analysis (e.g., gaps in speech)
– anecdotal and qualitative analysis
Contrast:
psychology – controlled experiment
sociology and anthropology – open study and rich data
Observational Methods
Think Aloud
Cooperative evaluation
Protocol analysis
Automated analysis
Post-task walkthroughs
Think Aloud
• Advantages
– simplicity - requires little expertise
– can provide useful insight
– can show how the system is actually used
• Disadvantages
– subjective
– selective
– act of describing may alter task performance
Cooperative evaluation
• Additional advantages
– less constrained and easier to use
– user is encouraged to criticize system
– clarification possible
Protocol analysis
• paper and pencil – cheap, limited to writing speed
• audio – good for think aloud, difficult to match with other
protocols
• video – accurate and realistic, needs special equipment,
obtrusive
• computer logging – automatic and unobtrusive, large
amounts of data difficult to analyze
• user notebooks – coarse and subjective, useful insights,
good for longitudinal studies
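Computer logging in particular can be very lightweight; a hedged sketch of a time-stamped event logger in Python (the event names and file path are invented for illustration, not from any particular toolkit):

# Minimal automatic interaction logger: appends time-stamped UI events
# to a file for later protocol analysis.
import json, time

LOG_PATH = "session.log"  # hypothetical location

def log_event(event, **details):
    record = {"t": time.time(), "event": event, **details}
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(record) + "\n")

# Example calls from an application's event handlers:
log_event("menu_open", menu="File")
log_event("button_click", widget="Ok", dialog="Save")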
• Workplace project
• Post task walkthrough
– user reacts to actions after the event
– used to fill in intention
• Advantages
– analyst has time to focus on relevant incidents
– avoid excessive interruption of task
• Disadvantages
– lack of freshness
– may be post-hoc interpretation of events
Interviews
Questionnaires
Interviews
• Advantages
– can be varied to suit context
– issues can be explored more fully
– can elicit user views and identify unanticipated
problems
• Disadvantages
– very subjective
– time consuming
Questionnaires
• Advantages
– quick and reaches large user group
– can be analyzed more rigorously
• Disadvantages
– less flexible
– less probing
Questionnaires (ctd)
• Styles of question
– general
– open-ended
– scalar
– multi-choice
– ranked
Physiological methods
Eye tracking
Physiological measurement
eye tracking
Types of design rules (contd.)
◻ Standards
Specific design rules
High authority
Low generality (limited application)
◻ Guidelines
Lower authority
High generality (more general application)
◻ Principles
Abstract design rules
Lower authority
High generality
(axes: increasing generality, increasing authority)
Using design rules
Design rules
◻ Suggest how to increase usability
◻ Differ in generality and authority
Standards
◻ Higher level of authority but only useful for specific designs
(axes: increasing generality, increasing authority)
Major sources: Designing Visual Interfaces, Mullet & Sano, Prentice Hall / Robin Williams Non-Designers Design Book, Peachpit Press
Slide deck by Saul Greenberg. Permission is granted to use this for non-commercial purposes as long as general credit to Saul Greenberg is clearly maintained.
Warning: some material in this deck is used from other sources without permission. Credit to the original source is given if it is known.
CRAP
Contrast
– make different things different
– brings out dominant elements
– mutes lesser elements
– creates dynamism
Repetition
– repeat design throughout the interface
– consistency
– creates unity
Alignment
– visually connects elements
– creates a visual flow
Proximity
– groups related elements
– separates unrelated ones
Robin Williams Non-Designers Design Book, Peachpit Press Saul Greenberg
Graphical Design 1
Graphical Design 2
Original
Proximity
Graphical Design 3
Alignment
Contrast
Graphical Design 4
Repetition
Grids
Organization
– contrast for dominant elements
– element groupings by proximity
– organizational structure
– alignment
Consistency
– location
– format
– element repetition
– organization
[Annotated dialog figure: standard icon set; message text in Arial 14, left adjusted; format of contents variable; widget-to-widget spacing; window-to-widget spacing; fixed components; No / Ok buttons]
Saul Greenberg
Graphical Design 5
Standard Message text in
icon set Arial 14, left
adjusted
?
Do you really want
to delete the file
“myfile.doc” from
No Ok the folder “junk”?
No Ok
!
Cannot move the
file “myfile.doc” to
Apply the folder “junk”
The file was because the disc is
destroyed full.
Cancel
Ok
8 9
Two-level Hierarchy
• indentation
• contrast
Logic of organizational flow
Graphical Design 6
Visual consistency (repetition)
internal consistency
– elements follow same conventions and rules
– set of application-specific grids enforce this
external consistency
– follow platform and interface style conventions
– use platform and widget-specific grids
Saul Greenberg
proximal clusters
alignment
white (negative) space
explicit structure
Saul Greenberg
Graphical Design 7
Terrible alignment
– no flow
Poor contrast
– cannot distinguish colored labels from editable fields
Poor repetition
– buttons do not look like buttons
No regard for order and organization
Graphical Design 8
Haphazard layout
Graphical Design 9
Spatial Tension
Graphical Design 10
Overuse of 3-d effects makes the window unnecessarily cluttered
WebForms
Graphical Design 11
Navigational cues
Saul Greenberg
Graphical Design 12
The importance of negative space and alignment
minimize clutter
– so information is not hidden
[Figure: schematic "MMMM" / "NNNN" groups of "xxx: ____" fields, contrasting a cluttered layout with one that uses negative space and alignment to group related fields]
Saul Greenberg
Graphical Design 13
Repairing excessive display density
Tabs
– excellent means for factoring related items
– but can be overdone
Graphical Design 14
Legibility and readability
Saul Greenberg
[Figure: text samples "Large / Medium / Small" and "Design components to be inviting and attractive", set once readably and once unreadably]
Saul Greenberg
Graphical Design 15
Legibility and readability
typesetting
– point size
– word and line spacing
– line length
– Indentation
– color
Saul Greenberg
Graphical Design 16
These choices must be really important,
or are they?
Graphical Design 17
Text orientation
difficult to read
Microsoft Word
Imagery
Graphical Design 18
Choosing levels of abstraction
Mullet & Sano
Graphical Design 19
What do these images mean?
• no tooltips included
• one of the tabs is a glossary explaining these images!
which one?
Idioms
Standard
typographic controls
Toolbars and tooltips
Graphical Design 20
How to choose between widgets
Saul Greenberg
Graphical Design 21
Widgets and complexity
Saul Greenberg
Exercise
Graphical redesign
Saul Greenberg
Graphical Design 22
Constructing a grid
1. Maintain consistency with GUI style
• locate standard components - title bar, window controls, …
4. Economize
• collapse two windows into one
• trim sound dialog
Graphical Design 23
Using the grid
5. Evaluate by displaying actual examples
6. Economize further
• decide which we prefer
CRAP
– visual organization
• contrast, alignment and navigational cues
– visual relationships
• proximity and white space
– familiar idioms
– appropriate imagery
Saul Greenberg
Graphical Design 24
Interface Design and Usability Engineering
[Process diagram spanning the stages: Articulate (who users are, their key tasks) → Brainstorm designs → Refined designs → Completed designs.
Goals and methods named in the diagram: task-centered system design; evaluate tasks; user-centered design; participatory design; psychology of everyday things; user involvement; representation & metaphors; participatory interaction; task scenario walk-through; graphical screen design; interface guidelines; style guides; usability testing; heuristic evaluation; field testing.]
Graphical Design 25
chapter 6
Architectural
design
Detailed
design
Coding and
unit testing
Integration
and testing
Operation and
maintenance
Activities in the life cycle
Requirements specification
designer and customer try to capture what the system is expected to provide; can be expressed in natural language or in more precise languages, such as a task analysis would provide
Architectural design
high-level description of how the system will provide the services required; factors the system into major components and describes how they are interrelated; needs to satisfy both functional and nonfunctional requirements
Detailed design
refinement of architectural components and interrelations to identify modules to be implemented separately; the refinement is governed by the nonfunctional requirements
Verification and validation
Verification
designing the product right
Validation
designing the right product
The formality gap
between real-world requirements and constraints and the formal specification
[Life-cycle diagram: detailed design → coding and unit testing → integration and testing → operation and maintenance – lots of feedback!]
Usability engineering
The ultimate test of usability based on measurement of user
experience
Usability engineering demands that specific usability measures be
made explicit as requirements
Usability specification
– usability attribute/principle
– measuring concept
– measuring method
– now level / worst case / planned level / best case
Problems
– usability specification requires a level of detail that may not be possible early in design
– satisfying a usability specification does not necessarily satisfy usability
[Table: part of a usability specification for a VCR]
• Prototypes
– simulate or animate some features of intended system
– different types of prototypes
• throw-away
• incremental
• evolutionary
• Management issues
– time
– planning
– non-functional features
– contracts
Techniques for prototyping
Storyboards
need not be computer-based
can be animated
Types of design rationale (DR):
• Process-oriented
– preserves order of deliberation and decision-making
• Structure-oriented
– emphasizes post hoc structuring of considered design
alternatives
• Two examples:
– Issue-based information system (IBIS)
– Design space analysis
Issue-based information
system (IBIS)
• basis for much of design rationale research
• process-oriented
• main elements:
issues
– hierarchical structure with one ‘root’ issue
positions
– potential resolutions of an issue
arguments
– modify the relationship between positions and issues
• gIBIS is a graphical version
structure of gIBIS
[Diagram: Positions respond to an Issue; Arguments support or object to Positions; Sub-issues specialize, generalize, or question the Issue]
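One way to make the IBIS vocabulary concrete is as typed nodes and links; a minimal Python sketch (the class and field names are ours, not part of IBIS or gIBIS):

# Minimal data model for gIBIS-style design rationale.
# Node kinds: Issue (with sub-issues), Position, Argument.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Argument:
    text: str
    supports: bool          # True = supports the position, False = objects to it

@dataclass
class Position:
    text: str               # a potential resolution of an issue
    arguments: List[Argument] = field(default_factory=list)

@dataclass
class Issue:
    text: str
    positions: List[Position] = field(default_factory=list)
    sub_issues: List["Issue"] = field(default_factory=list)  # specialize / generalize / question

root = Issue("How should the user select a colour?")
p = Position("Use a colour picker dialog")
p.arguments.append(Argument("Familiar from other applications", supports=True))
root.positions.append(p)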
Design space analysis
• structure-oriented
• QOC – hierarchical structure:
questions (and sub-questions)
– represent major issues of a design
options
– provide alternative solutions to the question
criteria
– the means to assess the options in order to make a choice
[QOC diagram: a Question has Options; each Option is assessed against Criteria; an Option can raise a consequent Question]
Psychological design rationale
context of use] 2: the set of facts or circumstances that surround a situation or event; "the historical context" (Source: WordNet ® 1.6)
Context: That which surrounds, and gives meaning to, something else. (Source: The Free On-line Dictionary of Computing)
Synonyms Context: Circumstance, situation, phase, position, posture, attitude, place, point; terms; regime; footing, standing, status, occasion, surroundings, environment, location, dependence. (Source: www.thesaurus.com)

To build applications that have knowledge about their situational context it is important to gain an understanding of what context is. Current research in context-awareness in mobile computing shows a strong focus on location [1], [12]. Location is a concept that is well understood. Also the benefit of location-awareness is clearly given: at certain locations particular services are more important than others. An architectural approach, based on a smart environment, is described by Schilit et al. [17]. Other scenarios use RF and GPS to determine the user's location, e.g. [4], [15]. But, as pointed out in [20], context is more than location. We use the term context, considering mobile computing in a more general way, as also suggested by [2], to describe the environment, situation, state, surroundings, task, and so on. A wider view of context is also given by [19]. They suggest to consider the way a device is used (mobile phone in the user's hand, on the table, in a pocket, etc.) to be treated as context.

2.3 Applications in Context

Analyzing the way people use ultra-mobile devices (e.g. personal digital assistants, smart mobile phones, handheld and wearable computers) it becomes apparent that the periods of interaction are much shorter than in traditional mobile settings. Notebooks – considered as mobile computers – are mainly used in a stationary setting, e.g. one takes a notebook to a meeting and takes notes, or a salesman takes a mobile computer to a customer for a presentation. In general, in these scenarios the application is used in a stationary setting for between several minutes and hours, whereas considering the usage of ultra-mobile devices, interaction periods are often much shorter: looking up an address takes only a few seconds, and making a note on a PDA is often in the range of several seconds up to some minutes. Also the fact that the applications are mainly used while doing something else or to carry out a certain task (like tools in the real world) calls for a reduction of explicit human-machine interaction and creates the need to shift towards implicit HCI.

Knowledge about the situational context is of primary interest to the application, because we consider that the application will adapt to the context. It can be observed that an application (mobile or stationary alike) is:
(a) running on a specific device (e.g. input system, screen size, network access, portability, etc.),
(b) at a certain time (absolute time e.g. 9:34 p.m., class of time e.g. in the morning),
(c) used by one or more users (concurrently or sequentially),
(d) in a certain physical environment (absolute location, type of location, conditions such as light, audio, and temperature, infrastructure, etc.),
(e) in a social setting (people co-located and social role),
(f) to solve a particular task (single task, group of tasks, or a general goal).

We consider the items (a) to (f) the basic building blocks of context. For mobile applications especially (d) and (e) are of major interest. In mobile settings the physical environment can change while an application is executed, e.g. making a phone call while walking from the office desk to the car park: the telephone application is running while the noise level changes between office and outside.

2.4 Identifying Implicit Human Computer Interaction

To identify applications that can be improved by implicit HCI, the input and output of the application and the real-world environment in which it is executed have to be analyzed. Then ways to capture the situational context must be assessed. Furthermore, mechanisms for the interpretation of the situational context have to be found. Finally the reaction of the application has to be defined. The following questions help to identify these points:
• What happens around an application while the application is in use? Are there any changes at all?
• Do the surroundings (behavior, environment, circumstances) carry any valuable information for the application? Does it matter for the application?
• Are there any means to capture and extract the information in a way that is acceptable for the application or device (processing cost, sensor cost, weight, etc.)?
• How to understand the information? What interpretation and reasoning is possible and useful? What is an appropriate way for the application to react?

Putting all of these together we can set up the algorithm in figure 1. The algorithm works as follows:

On 1: C is the set of surrounding conditions that carry information that is useful for the application. Each element Ci stands for one condition, e.g. location, temperature, current user, device orientation, etc. The set is created by asking what conditions change in the environment.

On 2: D is initialized – at the beginning no sensing devices are identified.

On 3: For each Ci the accuracy Ai and the update rate Ui that are needed to make the measurements useful are defined. Then a sensing device that matches these requirements is identified. If the cost for the identified sensing device Di is acceptable, then the vector describing the condition, the sensing device, the required accuracy and update rate is added to the set D. For conditions that cannot be sensed the cost is infinite.

On 4: If any conditions that are feasible to measure exist, then for each of these conditions one or more range values that are meaningful (temperature between 15°C and 25°C, location is inside my office, user is moving, etc.) are identified, and for these ranges the reaction of the application (switch to notepad, etc.) is defined.
1. create the set C
2. set D = {}
3. for each Ci ∈ C
       define Ai            // accuracy
       define Ui            // update rate
       identify Si          // a sensor device that is appropriate
       if cost(Si, Ai, Ui) is acceptable then
           D = D ∪ {(Ci, Si, Ai, Ui)}
       fi
   next
4. if D ≠ {} then
       for each vector Di in D
           define a set of application reactions Ri = {(Iij, Rij)}
           // Iij is an input range
           // Rij is the application reaction for that range
   else
       // implicit interaction is not used
       // (either no conditions that are useful, or too costly)
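Read as ordinary code, the steps above might look like the following Python sketch; cost(), find_sensor() and the example light ranges and reactions are placeholders for illustration, not from the paper:

# Sketch of the condition/sensor selection algorithm (figure 1).
# The cost model, candidate sensors and reactions are hypothetical.

def select_context_setup(conditions, find_sensor, cost, acceptable_cost):
    # Steps 1-3: keep only conditions that can be sensed at acceptable cost.
    D = []
    for c in conditions:
        accuracy, update_rate = c["accuracy"], c["update_rate"]
        sensor = find_sensor(c["name"], accuracy, update_rate)
        if sensor is not None and cost(sensor, accuracy, update_rate) <= acceptable_cost:
            D.append((c["name"], sensor, accuracy, update_rate))
    return D

def define_reactions(D):
    # Step 4: for each sensed condition, map meaningful value ranges to reactions.
    reactions = {}
    for name, sensor, accuracy, update_rate in D:
        if name == "light":    # invented example condition
            reactions[name] = [((0, 50), "backlight_on"),
                               ((50, 1000), "backlight_off")]
    return reactions  # empty dict => implicit interaction is not used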
2.5 Modeling Implicit Human Computer Interaction

To specify applications that facilitate implicit HCI it is inevitable to have a specification language to describe situational context linked to events/changes that occur in the application. In our recent work we found it helpful to use a notation that is human readable as well as easy to process using a computer. We decided to use a markup language that is specified in XML for this purpose. Extending the SGML-based description model introduced by Brown in [2], [3], we added two more concepts – grouping context with matching attributes, and trigger attributes – to make the description more expressive and suitable for our projects. See figure 2 for the XML data type definition (DTD).

<!ELEMENT context_interaction (context, action)>
<!ELEMENT context (group+)>
<!ELEMENT group (#PCDATA)>
<!ATTLIST group match (one | all | none) #REQUIRED>
<!ELEMENT action (#PCDATA)>
<!ATTLIST action
    time CDATA '0'
    trigger (enter | leave | in) #REQUIRED>

Figure 2: Data Type Definition

In the <context> section contextual variables are used to describe the conditions. These variables are made of two parts: the first is used to specify the context sensing module – in figure 5, the sensor module (sensor_module) and the palm pilot (pilot) – and the second part the variables provided by this module.

In the <action> section function calls are used to specify the action to be carried out in case the trigger evaluates to true. These calls are also hierarchically structured, specifying the device, the application, and the function to be performed. Depending on the platform (e.g. context sensing module in a microcontroller) we use a different implementation language.

If contexts are composed of a number of components we found it very helpful to have a mechanism to bundle certain contextual variables in groups and select a matching semantic for each group description. For matching in a group we provide the following semantics: one (match one or more of the variables in the following group), all (match all variables in the following group), none (match none of the variables in the following group). All groups within the context description must evaluate to true to cause the trigger.

We discriminate three different triggers: 'enter a context', 'leave a context', and 'while in a context'. The 'enter' and 'leave' triggers take a time value that specifies the time after which the action is triggered if the context stays stable over this time. For the 'while in a context' trigger the time indicates the interval in which the trigger is fired again. [Footnote 1: The parameter indicating the time after which an action is performed is often 0 (immediate context-action coupling) or positive. In certain circumstances, when future situations can be predicted (e.g. you drive your car into the parking, the context walking will appear soon), a negative value does make sense, too.]

<context_interaction>
  <context>
    <group match='one'>
      sensor_module.touch
      pilot.on
    </group>
    <group match='none'>
      sensor_module.alone
      pilot.pen_down
    </group>
  </context>
  <action trigger='enter' time='3'>
    pilot.notepad.confidential
  </action>
</context_interaction>

Figure 3: Context description

In figure 3 an example of a description of a context and an action is shown. The context description consists of two groups of contextual variables. In the first group the match semantics is that at least one of the variables must be true; in this case either the device is touched or the state of the device is on. In the second group the match semantics is 'none', which means that the contextual variable alone must not be true and that the user must not have touched the screen with a pen.

If the context evaluates to true, an action is triggered. Here the semantics is that if the context is entered and is stable for at least three seconds then the action is performed. The complete description means that if the device is on or in the user's hand, and if the user is not alone and is not touching the screen with the pen, then after three seconds the display should be hidden by an image as depicted in figure 6 (d) in the later section.
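A minimal sketch of how a description like figure 3 could be evaluated at run time, with the one/all/none group semantics and an 'enter' trigger that waits for the context to stay stable; only the variable names follow figure 3, the evaluation code itself is ours:

# Sketch: evaluating a context description with group match semantics
# (one / all / none) and an 'enter' trigger with a stability time.
import time

def group_matches(values, variables, semantics):
    hits = [values.get(v, False) for v in variables]
    if semantics == "one":
        return any(hits)
    if semantics == "all":
        return all(hits)
    if semantics == "none":
        return not any(hits)
    raise ValueError(semantics)

def context_matches(values, groups):
    # All groups must evaluate to true to cause the trigger.
    return all(group_matches(values, vars_, sem) for vars_, sem in groups)

# Groups from figure 3: one of {touch, on}, none of {alone, pen_down}.
groups = [(["sensor_module.touch", "pilot.on"], "one"),
          (["sensor_module.alone", "pilot.pen_down"], "none")]

entered_at = None
def poll(values, stable_time=3.0):
    """Fire the action once the context has been entered and stayed stable."""
    global entered_at
    if context_matches(values, groups):
        if entered_at is None:
            entered_at = time.time()
        elif time.time() - entered_at >= stable_time:
            print("action: pilot.notepad.confidential")
    else:
        entered_at = None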
3 Perception

There are several ways to equip devices with perceptional capabilities. The range of complexity to consider is very wide, starting from simple sensors that know the way a device is held [18] to complex audio and video analysis. We identified the following four basic approaches:
• device-databases (e.g. calendars, todo-lists, address books, profile, etc.)
• input to the application running (notepad – taking notes, calendar – looking up a date, etc.)
• active environments (active badges [10], IR-networks, cameras, audio, etc.)
• sensing context using sensors (TEA [5], [21], Sensor Badges [1], GPS, cameras, audio [16], etc.)

The perceptual capabilities can be located in the device itself, in the environment, or in another device that shares the context over a network (e.g. body area network). In the remainder of this section we concentrate on sensor-based perception, also knowing that in most scenarios a combination of all four cases is the way of choice. First we introduce two sensor devices developed in our group and then provide some information on other sensor-based devices that supply contextual information.

3.1 Context Awareness Component

In this part a wearable context-awareness component that integrates low-cost sensors is described. Simple methods are used to derive context information from sensor data. The derived context is application-independent and can be exploited by other wearable or personal technologies in a body network, for instance wearable computers, mobile phones, digital cameras, and personal digital assistants.

Here we chose to address a number of contexts that relate to how interruptible the user is. These contexts describe only a certain aspect of real world situations but they are general in the sense that they can be exploited by a range of applications. Such context is for instance implemented and used in the audio wearable described in [16], mimicking the human ability to recognize situations in which it is rude to interrupt, for instance when a person is engaged in a conversation or giving a talk. Context information of this kind is also useful for other applications; for instance calendars, email notification, pagers and mobile phones can make use of any context that gives an indication of whether or not it is a good time to interrupt a user.

The specific contexts that we chose for our study are based on aural information: user speaking, others speaking, noisy, and quiet; and based on movement of the user: walking, running, stationary. Movement context was included as it gives an indication as to whether a user can be interrupted visually.

For recognition of aural and movement contexts, we integrated two microphones and an accelerometer in our design. One of the microphones is placed near the user's throat, the other pointing away from the user. With this configuration the distinction of speaker and environment is feasible with minimal processing cost. The acceleration sensor is used to discriminate whether a user is standing still, walking or running.

The sensor placement considerations led us to build the context-awareness component into a tie – it may be considered to build them into other accessories worn in similar ways (e.g. jewelry, neckerchief, or necklace). We also liked that a tie stresses the component's design as accessory rather than as stand-alone device, see figure 4.

Figure 4: Context-Awareness Tie.
The hardware of our context-awareness component is built around a TINY-Tiger microcontroller, which offers four analog inputs and two serial lines. The two signals from the microphones are amplified and connected to the analog inputs. To measure the motion we used a two-axis accelerometer (Analog Devices ADXL202). A more detailed description has been published in [22].

The software is realized in Tiger-BASIC, a multitasking BASIC dialect for the TINY-Tiger. It reads and analyzes sensor data in a time window of about four seconds. The methods to analyze the signals are deliberately simple; they work within the time domain and are based on basic statistical measurements. Based on the features calculated from sensor data the contexts are detected.

The communication is based on a serial line connection using 9600 bit/s, in a simple request-reply manner. The client requests the contextual variables from the context-awareness component, which sends back the variables together with the values.

Experimentation with the context-aware tie showed that contexts were recognized in a very reliable way. Both 'user speaking' vs. 'others speaking' and 'stationary' vs. 'walking' vs. 'running' were discriminated correctly. A key finding is that sensor placement can be used effectively to increase reliability and to reduce required processing.

The device can provide information on the situational context of the user for other personal technologies in a body area network. Using this device, implicit HCI can be facilitated.

3.2 Sensor Board

Using this board we collect data on the situational context by using a combination of low level sensors. In this project we built a context recognition device equipped with a light sensor, acceleration sensor, a passive infrared sensor, a touch sensor, and a temperature sensor. All sensors but the touch sensor are standard sensors and produce an analog voltage level. The touch sensor recognizes the human body as a capacitor and supplies a digital value. The heart of the device is a BASIC-Tiger microcontroller that reads from all the physical input channels (it offers four analog digital converters and a number of digital IOs); statistical methods are applied to recognize contexts. The board is depicted in figure 5. The PDA requests contextual variables while the application is idle, e.g. catching the NullEvent on the PalmPilot.

Figure 5: Context Sensing Device and PalmPilot

3.3 Related Work on Context Sensing

In robotics this way of perception is widely used but with a different objective – giving machines the ability to operate autonomously.

For the use with handheld devices the project TEA [5] developed a sensor board (equipped with 8 sensors: light, acceleration, pressure, temperature, etc.) that supplies contextual information; communication is done via serial line. The application described is a mobile phone that recognizes its context (in the user's hand, on the table, in a suitcase, outdoors) and adapts ringing modes according to the user's preferences in that situation [19].

Using a similar approach, a system to facilitate indoor location awareness based on low level sensors is described in [8]. The system reads data from different sensors (acceleration, light, magnetic field, etc.) and provides location information.

In [7] a cup is described that has an acceleration and temperature sensor built in together with a microcontroller and infrared communication. The cup is aware of its state (warm, cold, on a table, drinking, moved). The information from a number of cups communicated to a server is then used to supply information about the group of users. All these projects focus on a completely sensor-based approach to context awareness.

A jacket that knows if it is on the hanger or with the user is presented in [6]. The sensor jacket has woven-in sensors that give information on whether the user is wearing the jacket, what movements the user is making, etc. As one application, correcting movements in sports (automated tennis coach) is suggested in the paper. In this project the development of robust sensing technology is very central.
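Section 3.1 only says that the tie's software applies simple time-domain statistics over a roughly four-second window; the following Python sketch illustrates that idea with invented thresholds and labels (the original runs in Tiger-BASIC on the microcontroller):

# Illustrative time-domain context detection over a sensor window.
# Thresholds are invented for illustration only.
from statistics import mean, pstdev

def movement_context(accel_window):
    """Classify a window of accelerometer magnitudes (hypothetical thresholds)."""
    activity = pstdev(accel_window)
    if activity < 0.05:
        return "stationary"
    if activity < 0.4:
        return "walking"
    return "running"

def speaker_context(throat_mic, ambient_mic):
    """Compare the two microphone levels to separate user speech from others."""
    if mean(throat_mic) > 1.5 * mean(ambient_mic):
        return "user speaking"
    if mean(ambient_mic) > 0.2:
        return "others speaking or noisy"
    return "quiet"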
4 How Can HCI Benefit from Context?

HCI for mobile devices is concerned with the general trade-off between device qualities (e.g. small size, light weight, little energy consumption, etc.) and the demand for optimal input-output capabilities. Here implicit HCI can offer interesting alternatives.

4.1 Output in Context

Over recent years the output systems for mobile devices became much better; features such as stereo audio output, high-resolution color screens for PDAs and even mobile phones, as well as display systems for wearable computers, are commercially available. Also unobtrusive notification mechanisms (e.g. vibration) have become widely used in phones and PDAs. Still, at the lower end, devices with very poor display quality enter the market. Situational context can help to:
• adapt the output to the current situation (font size, volume, brightness, privacy settings, etc.) [19]
• find the most suitable time for interruption [16], [22]
• reduce the need for interruptions (e.g. you don't need to remind someone to go to a meeting if he is already there)

4.2 Input in Context

Considering very small appliances, the space for a keyboard is very limited, which results in bad usability. Other input systems, such as graffiti and handwriting recognition, have been developed further but still lack in speed and accuracy [9]. Advances in voice recognition have been made in recent years, but for non-office settings (e.g. in a car, in a crowded place, sharing rooms with others, and in industry workplaces) the recognition performance is still poor. Also privacy and acceptance issues are a major concern. Implicit HCI does not solve these problems in general but can help to:
• adapt the input system to the current situation (e.g. audio filter, recognition algorithms, etc.)
• limit the need for input (e.g. information is already provided by the context and can be captured)
• reduce the selection space (e.g. only offer appropriate options in the current context)

4.3 ContextNotePad on a PalmPilot

To explore ways of implicit communication between users and their environment with mobile devices we built a context-aware NotePad application. The system uses the perceptional capabilities of the sensor board described in the previous section and provides an application that is very similar in functionality to the built-in notepad application on the PalmPilot. Additionally the application can adapt to the current situational context and can also react in this way to the implicit interaction. The application changes its behavior according to the situation. The following context adaptations have been implemented.
• On/Off. The user has the device in her hand. In this case the application is switched on; if the user puts the device out of her hand it is switched off after a certain time. It assumes that if the user takes the device in her hand she wants to work with the device.
• Fontsize. If the device is moved (e.g. while walking or on a bumpy road) the font size is increased to ease reading, whereas while having the device in a stable position (e.g. device stationary in your hand or on the table) the font is made smaller to display more text on the same screen, see figure 6(a) and (b).
• Backlight. This adaptation is straightforward but still not built in in current PDAs. By monitoring the light condition the application switches on the backlight if the brightness level in the environment is below a certain threshold. Accordingly, if it becomes brighter the light is switched off again, see figure 6(c).
• Privacy settings. If you are not alone and you are not writing (or touching the screen), the content on the display is hidden by an image, see figure 6(d). To sense if someone is walking, the passive infrared sensor is deployed.

Figure 6: Adaptation to Context: a) small font, b) large font, c) backlight, d) privacy
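The four adaptations above can be read as simple rules from context flags to user-interface settings; a Python sketch with invented flag names and thresholds, not the actual PalmPilot code:

# Sketch of the ContextNotePad adaptation rules; names and thresholds are ours.
def adapt_notepad(ctx, ui):
    # On/Off: switch on while the device is held, off some time after it is put down.
    ui["power"] = "on" if ctx["in_hand"] else "off_after_timeout"

    # Fontsize: larger font while moving, smaller font when stable.
    ui["font_size"] = "large" if ctx["moving"] else "small"

    # Backlight: driven by the ambient light level.
    ui["backlight"] = ctx["light_level"] < 30   # hypothetical threshold

    # Privacy: hide the display when not alone and the pen is not on the screen.
    ui["hide_content"] = (not ctx["alone"]) and (not ctx["pen_down"])
    return ui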
Currently we decrease the size of the context-awareness device to make it feasible to plug it into the pilot to allow proper user studies.

5 Conclusion and Further Work

Based on observations of new sensing technology, available sensors and anticipated users, a new interaction metaphor is proposed. Implicit HCI is defined as an action, performed by the user, that is not primarily aimed at interacting with a computerized system but which such a system understands as input. It is further identified that perception and interpretation of the user, the environment, and the circumstances are key concepts for implicit HCI. Furthermore, applications that exploit this information are required.

Perception and interpretation are considered as situational context. Therefore we motivate a broad view of context, and also suggest that the context is described from the perspective of the application. To identify applications that can make use of situational context and thus can facilitate implicit HCI, a number of questions are raised and an algorithm is suggested. It is based on the central questions: what happens around the application, how can this be sensed or captured, how to interpret this information, and how can applications make use of it.

From current projects we learned that there is a need for a simple specification language for implicit HCI, based on situational context. We propose an XML-based markup language that supports three different trigger semantics. The language is easily human readable and also easy to process.

Basic mechanisms of perception to acquire situational context are discussed. In the first example a wearable context-awareness component built into a tie is described. Also a sensor-based context-awareness device is introduced. Both devices supply context to other devices over a simple request-reply protocol over the serial line.

In a further section, benefits of implicit interaction through situational context to HCI are discussed. In an example implementation the feasibility of the concepts introduced earlier is demonstrated.

References
[1] Beadle, P., Harper, B., Maguire, G.Q. and Judge, J. Location Aware Mobile Computing. Proc. of IEEE Intl. Conference on Telecommunications, Melbourne, Australia, April 1997.
[2] Brown, P.J., Bovey, J.D., Chen, X. Context-Aware Applications: From the Laboratory to the Marketplace. IEEE Personal Communications, October 1997.
[3] Brown, P.J. The stick-e Document: A Framework for creating context-aware Applications. Proc. EP'96, Palo Alto, CA (published in EP-odds, vol 8, no 2, pp. 259-72), 1996.
[4] Cheverst, K., Blair, G., Davies, N., and Friday, A. Supporting Collaboration in Mobile-aware Groupware. Personal Technologies, Vol 3, No 1, March 1999.
[5] Esprit Project 26900. Technology for enabling Awareness (TEA). 1998. http://tea.starlab.net/.
[6] Farringdon, J., Moore, A.J., Tilbury, N., Church, J., Biemond, P.D. Wearable Sensor Badge & Sensor Jacket for Context Awareness. In Proceedings of the Third International Symposium on Wearable Computers, San Francisco, 18-19 Oct. 1999.
[7] Gellersen, H.-W., Beigl, M., Krull, H. The MediaCup: Awareness Technology embedded in an Everyday Object. 1st International Symposium on Handheld and Ubiquitous Computing (HUC99), Karlsruhe, Germany, 1999.
[8] Golding, A., Lesh, N. Indoor Navigation Using a Diverse Set of Cheap Wearable Sensors. In Proceedings of the Third International Symposium on Wearable Computers, San Francisco, 18-19 Oct. 1999.
[9] Goldstein, M., Book, R., Alsiö, G., Tessa, S. Non-Keyboard QWERTY Touch Typing: A Portable Input Interface For The Mobile User. Proceedings of CHI 99, Pittsburgh, USA, 1999.
[10] Harter, A. and Hopper, A. A Distributed Location System for the Active Office. IEEE Network, Vol. 8, No. 1, 1994.
[11] Lenat, D. The Dimensions of Context Space. 1999. http://www.cyc.com/publications.html.
[12] Leonhard, U., Magee, J., Dias, P. Location Service in Mobile Computing Environments. Computers & Graphics, Special Issue on Mobile Computing, Volume 20, Number 5, September/October 1996.
[13] Maes, P. P. Maes on Software Agents: Humanizing the Global Computer. IEEE Internet Computing, July-August 1997.
[14] NCR Corp. Mülleimer informiert Supermarkt. http://www.heise.de/newsticker/data/anm-28.10.99-001/.
[15] Pascoe, J., Ryan, N.S., and Morse, D.R. "Human Computer Giraffe Interaction: HCI in the Field". Workshop on Human Computer Interaction with Mobile Devices, University of Glasgow, United Kingdom, 21-23 May 1998, GIST Technical Report G98-1, 1998.
[16] Sawhney, N., and Schmandt, C. "Nomadic Radio: A Spatialized Audio Environment for Wearable Computing." Proceedings of the International Symposium on Wearable Computing, Cambridge, MA, October 13-14, 1997.
[17] Schilit, B.N., Adams, N.L., Want, R. Context-Aware Computing Applications. Proc. of the Workshop on Mobile Computing Systems and Applications, Santa Cruz, CA, December 1994. IEEE Computer Society, 1994.
[18] Schmidt, A., Beigl, M., Gellersen, H.-W. Sensor-based adaptive mobile user interfaces. In Proceedings of the 8th International Conference on Human-Computer Interaction, München, Germany, August 1999.
[19] Schmidt, A., Aidoo, K.A., Takaluoma, A., Tuomela, U., Van Laerhoven, K., Van de Velde, W. Advanced Interaction in Context. 1st International Symposium on Handheld and Ubiquitous Computing (HUC99), Karlsruhe, Germany, 1999; Lecture Notes in Computer Science, Vol 1707, ISBN 3-540-66550-1, Springer, 1999.
[20] Schmidt, A., Beigl, M., Gellersen, H.-W. There is more to context than location. Proc. of the Intl. Workshop on Interactive Applications of Mobile Computing (IMC98), Rostock, Germany, November 1998.
[21] Schmidt, A., Forbess, J. What GPS Doesn't Tell You: Determining One's Context with Low-Level Sensors. The 6th IEEE International Conference on Electronics, Circuits and Systems, September 5-8, 1999, Paphos, Cyprus, 1999.
[22] Schmidt, A., Gellersen, H.-W., Beigl, M. A Wearable Context-Awareness Component – Finally a Good Reason to Wear a Tie. In Proceedings of the Third International Symposium on Wearable Computers, San Francisco, 18-19 Oct. 1999.
HCI
Design Rules
Presenter
Stephen Kimani
Universita' di Roma "La Sapienza"
DIS
Via Ariosto 25
00185 Rome
Italy
Web: http://www.dis.uniroma1.it/~kimani
E-mail: stephenkimani@gmail.com
HCI
Design Rules
Roadmap
• Introduction
• Usability Principles
• Heuristics and Golden Rules
• AOB
HCI
Design Rules
Introduction
• Design rules (or usability rules) are rules that a designer can follow in order to
increase the usability of the system/product e.g., principles, standards, guidelines.
Guidelines
(e.g. use colour to highlight links)
Can guide/advise on how to achieve a principle
Narrowly focused.
Can be too specific, incomplete, & hard to apply BUT they are more general and
lower in authority than Standards (e.g. use colour RGB #1010D0 on home links)
which are very specific & high in authority.
HCI
Design Rules
Introduction
• Principles:
Example - usability principles by Dix et al (HCI book)
• Standards: They are often set by national (eg British Standards Institution) or
international bodies (ISO).
Example [of standards] - ISO 9241 "Ergonomic Requirements for Office Work with
Visual Display Terminals (VDT)s"
• Guidelines:
Example - Smith and Mosier's "Guidelines for User Interface Software" [MITRE
Corporation 1986].
HCI
Design Rules
Introduction
• Design rules should be used early in the lifecycle [e.g., during the design;
note that they can also be used to evaluate the usability of the system]
• We will:
First look at abstract principles for supporting usability
Later on, we will look at the most well used and well known sets of
heuristics or 'golden rules‘, which tend to provide a succinct summary of
the essential characteristics of good design (Nielsen's heuristics,
Shneiderman's golden rules and Norman's principles [the last set, study
on your own])
HCI
Design Rules
Usability Principles
1. Learnability
The ease with which new users can begin effective interaction and achieve
maximal performance.
• Predictability, Synthesizability, Familiarity, Generalizability, Consistency.
HCI
Design Rules
Usability Principles
Learnability (contd.)
• Predictability: support for the user to determine the effect of future action based on
past interaction history (can I ‘tell’ what will happen based on what I have gone
through in the past?).
HCI
Design Rules
Usability Principles
Learnability (contd.)
• Synthesizability: support for the user to assess the effect of past operations on the
current state (can I ‘tell’ why I am here based on what I have gone through in the
past?).
HCI
Design Rules
Usability Principles
Learnability (contd.)
• Familiarity: the extent to which a user's knowledge and experience in other real-
world or computer-based domains can be applied when interacting with a new
system.
HCI
Design Rules
Usability Principles
Learnability (contd.)
• Generalizability: support for the user to extend knowledge of specific
interaction within and across applications to other similar situations.
Usability Principles
2. Flexibility
The multiplicity of ways the user and system exchange information.
• Dialogue initiative, Multithreading, Task migratability, Subsitutivity,
Customizability.
Usability Principles
Flexibility (contd.)
• Multithreading: the ability of the system to support user interaction for more
than one task at a time.
HCI
Design Rules
Usability Principles
Flexibility (contd.)
• Task migratability: the ability to transfer
control for execution of tasks between
the system and the user
(consider e.g., spell-checking task).
HCI
Design Rules
Usability Principles
Flexibility (contd.)
• Substitutivity: the extent to which an application allows equivalent input and
output values to be substituted for each other (values in input eg
fractions/decimals, values in output eg both digital and analog, output/input
eg output can be reused as input).
HCI
Design Rules
Usability Principles
Flexibility (contd.)
• Customizability: the ability of the user or the system to modify the user
interface (adaptability: user-initiated modification; adaptivity: system-initiated modification).
HCI
Design Rules
Usability Principles
3. Robustness
The level of support provided to the user in determining successful
achievement and assessment of goal-directed behavior.
• Observability, Recoverability, Responsiveness, Task conformance.
• Observability: the extent to which the user can evaluate the internal state of
the system from the representation on the user interface.
HCI
Design Rules
Usability Principles
Robustness (contd.)
• Recoverability: the extent to which the user can reach the intended goal after
recognizing an error in the previous interaction.
HCI
Design Rules
Usability Principles
Robustness (contd.)
• Responsiveness: a measure of the rate of communication between the user
and the system.
HCI
Design Rules
Usability Principles
Robustness (contd.)
• Task conformance: the extent to which the system services support all the tasks
the user would wish to perform, and in the way the user would wish to perform them.
HCI
Design Rules
AOB
Any Questions?
Usability Principles
John Stasko
Spring 2007
This material has been developed by Georgia Tech HCI faculty, and continues
to evolve. Contributors include Gregory Abowd, Al Badre, Jim Foley, Elizabeth
Mynatt, Jeff Pierce, Colin Potts, Chris Shaw, John Stasko, and Bruce Walker.
Permission is granted to use with acknowledgement for non-profit purposes.
Last revision: January 2007.
Agenda
• Usability Principles
– Why?
– System of principles
• Learnability
– Support for learning for users of all levels
• Flexibility
– Support for multiple ways of doing tasks
• Robustness
– Support for recovery
– Style guides
• Project preparation
Good Design (our goal!)
Concepts, Principles, Guidelines
• No “cookbooks”
• No simple, universal checklists
• There are many concepts, principles, and
guidelines
• Understand the higher level principles that
apply across situations, display types, etc.
• Implement the standards and guidelines
…a few details…
Levels of Consideration
1. Meta-display level
– Apply to the whole system, across media & across
displays
– Focus on this in Basic Layout Stage
2. Display Layout
– Apply to groups of elements in a display
– Focus on this in Prototyping and Redesign
3. Element level
– Details about specific parts of a display
– Colors, sound attributes, symbols
• Categories
– Learnability
• Support for learning for users of all levels
– Flexibility
• Support for multiple ways of doing tasks
– Robustness
• Support for recovery
1. Learnability Principles
1.1 Predictability
• I think that this action will do….
1.2 Synthesizability
• Support for user in assessing the effect of past
operations on current system state
1.3 Familiarity
– Use of metaphors
• Potential pitfalls
Metaphors at the UI - What
1.4 Generalizability
1.5 Consistency
2. Flexibility Principles
2.2 Multithreading
2.4 Substitutivity
2.5 Customizability
• Ability of user to modify interface
– By user - adaptability
• Is this a good thing?
– By system - adaptivity
• Is this a good thing?
3. Robustness Principles
3.1 Observability
• Can user determine internal state of
system from what she perceives?
– Browsability
• Explore current state
(without changing it)
– Reachability
• Navigate through
observable states
– Persistence
• How long does observable state persist?
Observability - Role of Feedback
• Feedback helps create observability
• Feedback taxonomy (generally don’t need all of
these)
– “I understand what you have asked me to do”
– “I am doing what you have asked me to do”
• “And it will take me this much longer”
• Song and dance routine to distract user (busy interval
as opposed to idle interval)
• “And here are some intermediate results to keep you
happy until I am done”
– “All done, what’s next?”
Acrobat Reader with ToC to give context – forest is the bookmarks, tree is the single page
3.2 Recoverability
Do Not Set the User Up
3.3 Responsiveness
Responsiveness
• Response to motor actions
– Keyboarding, mouse movement – less than 100
msecs
– Rich human factors literature on this
• Consistency is important – experimental results
– Users preferred longer but more consistent response
time
– Times that differed 10%-20% were seen as same
• Sometimes argued that too fast is not good
– Makes user feel like they need to do something
quickly to keep up with computer
Application
Styleguides
Typical TOC - MAC OS X
Introduction to the Apple Human Interface Guidelines
What Are the Mac OS X Human Interface Guidelines?
Who Should Read This Document?
Organization of This Document
Conventions Used in This Document
See Also
Part I: Fundamentals
Human Interface Design
Human Interface Design Principles
Keep Your Users in Mind
The Development Process
Design Decisions
Managing Complexity
Extending the Interface
Involving Users in the Design Process
Part II: The Macintosh Experience
First Impressions
Packaging
Installation
General Installer Guidelines
Setup Assistants
Mac OS X Environment
The Finder
The Dock
The File System
Multiple Users
Remote Log In
Assistive Technologies
Networking
Application Services
Displays
The Always-On Environment
Using Existing Technologies
Providing User Assistance
Internationalizing Your Application
Storing Passwords
Printing
Choosing Colors
Setting Fonts and Typography Characteristics
Selecting Attributes Associated With People
Speech Technologies
Part III: The Aqua Interface
User Input
The Mouse and Other Pointing Devices
The Keyboard
Selecting
Editing Text
Drag and Drop
Drag and Drop Overview
Drag and Drop Semantics
Selection Feedback
Drag Feedback
Destination Feedback
Drop Feedback
Clippings
Text
Fonts
Style
Icons
Icon Genres and Families
Icon Perspectives and Materials
Conveying an Emotional Quality in Icons
Suggested Process for Creating Aqua Icons
Tips for Designing Aqua Icons
Cursors
Standard Cursors
Designing Your Own Cursors
More TOC
Menus
Menu Behavior
Designing the Elements of Menus
The Menu Bar and Its Menus
Contextual Menus
Dock Menus
Windows
Types of Windows
Window Appearance
Window Behavior
Utility Windows
The About Window
Preferences Windows
Inspectors and Info Windows
Find Window
Fonts Window and Colors Window
Dialogs
Types of Dialogs and When to Use Them
Dialog Behavior
The Open Dialog
Dialogs for Saving, Closing, and Quitting
The Choose Dialog
The Printing Dialogs
Controls
Buttons
Selection Controls
Adjustment Controls
Indicators
Text Controls
View Controls
Grouping Controls
Layout Examples
Positioning Controls
Sample Layouts
Grouping Controls
Using Small and Mini Versions of Controls
Keyboard Shortcuts Quick Reference
Tab View
Differences Between Mac OS X Versions
Document Revision History
Excerpt from OS X Styleguide
Drag and Drop Overview
Ideally, users should be able to drag any content from any window to any other window that accepts the content’s type. If the source
and destination are not visible at the same time, the user can create a clipping by dragging data to a Finder window; the clipping
can then be dragged into another application window at another time.
Drag and drop should be considered an ease-of-use technique. Except in cases where drag and drop is so intrinsic to an
application that no suitable alternative methods exist—dragging icons in the Finder, for example—there should always be another
method for accomplishing a drag-and-drop task.
The basic steps of the drag-and-drop interaction model parallel a copy-and-paste sequence in which you select an item, choose
Copy from the Edit menu, specify a destination, and then choose Paste. However, drag and drop is a distinct technique in itself and
does not use the Clipboard. Users can take advantage of both the Clipboard and drag and drop without side effects from each
other.
A drag-and-drop operation should provide immediate feedback at the significant points: when the data is selected, during the drag,
when an appropriate destination is reached, and when the data is dropped. The data that is pasted should be target-specific. For
example, if a user drags an Address Book entry to the “To” text field in Mail, only the email address is pasted, not all of the person’s
address information.
You should implement Undo for any drag-and-drop operation you enable in your application. If you implement a drag-and-drop
operation that is not undoable, display a confirmation dialog before implementing the drop. A confirmation dialog appears, for
example, when the user attempts to drop an icon into a write-only drop box on a shared volume, because the user does not have
privileges to open the drop box and undo the action.
Styleguides
• General User Interface Design Style Guides
– Apple Human Interface Guidelines (Mac OS X) Design Guidelines
– Microsoft User Interface Guidelines (Click in the left tree on User Interface Design...)
– Windows XP Guidelines
– Yale Web Style Guide (2nd Edition)
– Java Look and Feel Guidelines (version 1)
– Java Look and Feel Guidelines version 2
– Java Look and Feel Guidelines: Advanced Topics
– IBM 3D design Guidelines
– Silicon Graphics Indigo Magic User Interface Guidelines
• http://www.experiencedynamics.com/science_of_usability/ui_style_guides/
And More Styleguides ….
• Government funded Usability Guidelines
– MITRE Guidelines for Designing User Interface Software (US Airforce)
– Research based Web Design and Usability Guidelines (Dept. of Health and Human Services)
– Cancer Institute Usability Guidelines
– NASA User Interface Guidelines
– Canadian Command Decision Aiding Technology (COMDAT) Operator-Machine Interface (OMI) Style
Guide: Version 1.0
• Accessibility Guidelines
– Techniques for Web content Accessibility Guidelines 1.0
Project
• Interesting topics?
Upcoming
• Human Capabilities
– Physical
– Cognitive