More Ways to Actually Measure Design
Or, why our jobs in 10 years probably include data
JESSA PARETTE | 2021 UX STRATEGY
@jessaparette
A little about me:
HEAD OF DESIGN
AUTO FINANCE | CAPITAL ONE
RESEARCH, STRATEGY & SYSTEMS
What we will learn
1. Data and design have an undeniable future together, and automation of measuring quality needs to be part of it
2. We can measure design quality in objective, quantifiable ways (but it's hard)
3. Design leaders have a responsibility to influence the data being built in the product lifecycle
JESSA PARETTE | UX STRATEGY 2021
Why this feels important
- If we ignore the need to automate measurement of design quality at scale, design becomes robotic
- How we connect systems, including design systems and tools, becomes a vital and complex part of quality
- Measurement is incredibly hard, and will demand new types of talent joining the ranks of design teams
MEASURING QUALITY IS HARD
TECHNOLOGY IS ONLY GOING TO GET MORE COMPLEX
DESIGN DIGITIZES FASTER THAN WE CAN MEASURE
Key takeaways in this talk
1. We often overlook objectively measuring quality as we develop products, but opportunity exists
2. We can apply a formulaic framework to help identify low-hanging fruit
3. Influencing the development process with the right type of experience design data gets easier when design leaders learn new skills
This is not about what you already do
- Code compliance, downtime, sales & marketing data are all great
- But you probably don't track design experience data with the same rigor
- What I'm talking about is design data in the context of a user interacting with a system
- Specifically, crafting a view that will tell us about the design and experience quality as a byproduct of a user's interaction with the system as a whole
If the data we have today were enough:
- Our design budgets would be much bigger
- Our design teams would never be short of headcount
- A standard design team would have engineers and analysts
- 'Chief Design Officer' would be as common as 'Chief Financial Officer'
So what is the problem?
Organizations often chase metrics irrelevant to the user experience, or have a "catch-all" proxy metric. Even if there are a few, there is usually no systematic place to track them.

The cost: device telemetry & instrumentation lack design-data standards as part of a requirements package.
The consequences are bigger than you think

IT'S EXPENSIVE: Reactive decision making is costly. In a reactionary state, you never have enough researchers, and your budget for studies has to be saved for 'truly important' insights.

IT ALIENATES ALLIES: Usability methods should be accessible to our partners. Research is not 'done in the dark' or only part of an elite practice. It is universal.

IT CREATES BURNOUT: Ad-hoc practices frustrate the experts. Researchers and designers find themselves constantly re-proving the same point with different teams (or the same team). And sooner or later the team decides to solve a 'different' problem.
Even objective measures of usability face challenges
1. There are few guidelines or standardized models on the definitions and usage of usability metrics that developers can apply consistently
2. Most frameworks are missing design-centric data requirements, and are not well integrated into current development practices
3. We (probably) didn't become designers because we loved math, data and statistics
Design is evolving
Once upon a time, there was only graphic design and a technical architect.

Be part of the evolution
Evolution is easier than revolution. A designer's role has grown to include the responsibility of knowing more than just design.
"When you can measure what you are speaking about and express it in numbers, you know something about it. But when you cannot measure it, when you cannot express it in numbers, it may be the beginning of knowledge, but you have scarcely advanced to the stage of science."
Lord Kelvin (1891)
What I hope inspires
What if we could express, in numbers, the impact of design?
Design quality is not one thing
There are subjective and objective means of managing the user experience. We can separate types of quality measurement in design by looking at the categorical difference between subjective & objective measures.

OBJECTIVE: USABILITY (today's focus)
SUBJECTIVE: USER EXPERIENCE
Categorize the factors of design quality
Right now, there are generally 10 groups. First, determine the design factors that make up quality:
EFFICIENT, EFFECTIVE, SATISFYING, PRODUCTIVE, LEARNABLE, SAFE, TRUSTWORTHY, ACCESSIBLE, UNIVERSAL, USEFUL
Break down the design factors
Identify the criteria that encompass each design quality factor.
If you want something to be: EFFICIENT
You measure its encompassing criteria: TIME, FEEDBACK, MINIMAL MEMORY, NAVIGATION, MINIMAL ACTION, RESOURCE USE, OPERABLE, UNDERSTANDABLE
If you want something to be: LEARNABLE
You measure its encompassing criteria: USER GUIDANCE, FAMILIAR, MINIMAL MEMORY, SELF-DESCRIBING, MINIMAL ACTION, SIMPLICITY, CONSISTENT, UNDERSTANDABLE
Within a factor like LEARNABLE, a single criterion such as USER GUIDANCE raises concrete questions:
- Do our field forms have proper indication?
- How many actions can be canceled by the user after they have started?
- What is the rate of error for our most popular task flows?

Definition of usability attributes is key:
TIME: Time spent performing some usable task
UNDERSTANDABLE: Capability of the software or product to convey its purpose and give clear user assistance in its operation
FEEDBACK: The system, product or service's capability to respond to user actions
UNDERSTANDABLE is different from the user self-reporting whether they understand. Do our components have the right microcopy? Can we measure how many times a user clicks on a help link? These are back-end indicators or benchmarks that signal a problem.
Define experience criteria & create a systematic grouping

Each criterion maps to one or more design factors (Efficient, Effective, Satisfaction, Accessible, Productive):
Appearance √
Accurate √
Control √
Consistent √ √
Complete √
Flexible √ √ √
Feedback √ √
Flexible √
Likeable √
Load Time √ √
Minimal Action √ √ √
Minimal Memory √ √ √
Navigation √ √ √
Operable √ √ √
Presentation √
Readable √
Resource Use √ √
Self-Describing √
Simple √
Time √ √
Understandable √ √ √
User Guidance √ √
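A grouping like this can live in code as a queryable structure rather than a static matrix. Below is a minimal Python sketch using only the two factor-to-criteria groupings the earlier slides spell out (EFFICIENT and LEARNABLE); the function name `factors_for` is my own, not from the talk:

```python
# Factor -> criteria groupings, taken from the "break down the design
# factors" slides for EFFICIENT and LEARNABLE.
FACTOR_CRITERIA = {
    "efficient": {
        "time", "feedback", "minimal memory", "navigation",
        "minimal action", "resource use", "operable", "understandable",
    },
    "learnable": {
        "user guidance", "familiar", "minimal memory", "self-describing",
        "minimal action", "simplicity", "consistent", "understandable",
    },
}

def factors_for(criterion: str) -> set:
    """Invert the grouping: which design factors does a criterion feed?"""
    return {f for f, criteria in FACTOR_CRITERIA.items() if criterion in criteria}

# 'minimal action' appears under both factors, which is exactly what a
# row with multiple check marks in the matrix expresses.
print(sorted(factors_for("minimal action")))
```

Inverting the mapping this way reproduces the check-mark matrix one row at a time, and adding the remaining eight factors is just more dictionary entries.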
Here is where it gets fun
Things like time, productivity & learning are all made up of something, and that something can be quantified (to an extent).
Find the numeric value for your design criteria
Because it exists somewhere (probably)
If you are measuring MINIMAL ACTION (part of measuring 'efficiency'), it is made up of measurements like:
- Task time
- Number of commands
- Task concordance
- Completion rate
- Layout appropriateness
If you are measuring TIME (part of measuring 'efficiency'), it is made up of measurements like:
- Time on task
- Completion rate
- Throughput
- Response time
- Time-based efficiency
Identify the metrics for criteria
Breaking them into numeric values. If you are measuring TIME, the metrics are defined as:

METRIC: DEFINITION
Throughput: How many tasks can be successfully performed over a given period of time?
Throughput (mean): What is the average number of concurrent tasks the system can handle over a set unit of time?
Throughput (worst-case ratio): What is the absolute limit on the system in terms of the number and handling of concurrent tasks?
Response time: What is the time taken to complete a specified task?
Time-based efficiency: The time taken (in seconds or minutes) for users to complete a task
Overall relative efficiency: The ratio of the time taken by users who successfully completed the task to the total time taken by all users
Time on task: How much time does it take a user to complete a task?
Metrics can be described as formulas or countable data
The output is a numeric value that summarizes the status.

Throughput: X = A / T (A = number of completed tasks; T = observation time period)
Throughput (mean): X = Xmean / Rmean, where Xmean = ÎŁ(Xi) / N, Rmean = required mean throughput, and Xi = Ai / Ti (Ai = number of concurrent tasks observed over a set period of time for the i-th evaluation; Ti = that period; N = number of evaluations)
Throughput (worst-case ratio): X = Xmax / Rmax, where Xmax = MAX(Xi) for i = 1 to N, Rmax = required maximum throughput, and Xi = Ai / Ti as above
Response time: T = (time of gaining the result) - (time command entry finished)
Time-based efficiency: The time taken (in seconds or minutes) for users to complete a task
Overall relative efficiency: The ratio of the time taken by users who successfully completed the task to the total time taken by all users
Time on task: Total time between task start and task completion
Your next question is probably: where do we get this data?
We look for existing product data
(Analysts are your BFF.) Design can compose new views of the user experience by re-arranging existing product interaction data. (If it exists. And that's a big if.)

BASIC LIST OF PRODUCT ANALYTICS
To measure efficiency, we need:
- ServerTime
- SessionID
- UserID
- InteractionID
- PageLoadTime

But here is what we have:
- ServerTime
- SessionID
- ———-
- InteractionID
- PageLoadTime

We need to code for:
- ServerTime
- SessionID
- UserID
- InteractionID
- PageLoadTime
It's kind of like archeology
(Without the epic John Williams soundtrack)
- You go on a journey to see what you need
- Use data to map where your design measures are
- Fill in the missing pieces
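The archeology step, comparing what a metric needs against what the instrumentation already emits, is mechanical enough to script. A minimal sketch, using the field names from the basic product-analytics list; treating UserID as the missing field is an assumption based on the redacted slot in the "what we have" list:

```python
# Gap analysis: fields required for the efficiency metrics vs. fields
# the product analytics already capture.
REQUIRED_FOR_EFFICIENCY = {
    "ServerTime", "SessionID", "UserID", "InteractionID", "PageLoadTime",
}
AVAILABLE = {"ServerTime", "SessionID", "InteractionID", "PageLoadTime"}

# Set difference reveals what still has to be instrumented.
missing = sorted(REQUIRED_FOR_EFFICIENCY - AVAILABLE)
print("We need to code for:", missing)  # ['UserID']
```

Run against a real event schema instead of hand-typed sets, this is the whole "map where your design measures are" exercise in a few lines.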
This is not new
Even in design, this approach is part of our DNA. We do this all the time, usually in interface definition, saying 'atoms' instead of 'metrics'.

IN A DESIGN SYSTEM: Atoms → Molecules → Components → Patterns
IN DESIGN QUALITY MEASURES: Design factors → Design criteria → Metrics for criteria → Data for metrics
What are the benefits?
Why does usability quality matter so much?

#1: It can reduce the cost of usability testing
PROVIDE A CONSISTENT BASIS FOR UNDERSTANDING & COMPARING DIFFERENT USABILITY METRICS
"We went from 2 weeks lead time for research to half a day."
#2: Complement expert-based evaluation of usability with fast data
RESEARCHERS & DESIGNERS BACK OBSERVATIONS WITH UNDENIABLE NUMBERS THAT ARE NOT NPS
NPS lies. A lot.
#3: Lay the groundwork for usability measures between your dev & design
YOU'LL KNOW IT'S WORKING WHEN DEVELOPERS RELY ON USABILITY TO MEASURE THEIR QUALITY
"We kinda judge our development quality now by looking at these numbers." (Paraphrased, but pretty much what the developer said)
#4: Democratize usability measurement practice (and automate)
INFLUENCE THE TELEMETRY & INSTRUMENTATION TO INCLUDE DESIGN QUALITY MEASURES
The sheer amount of data you could extract from device interaction is unending. We can demystify it.
USABILITY WITHOUT CONTEXT OF USE IS MEANINGLESS
DESIGN WITHOUT THE CONTEXT OF HUMANITY IS USELESS

THINGS THAT ARE HARDEST TO MEASURE ARE USUALLY WHAT IS MOST WORTH MEASURING
(And we like to do what is easy)

TO MEASURE ANY OF THIS IS HARD BECAUSE IT WILL MEAN CREATING SOMETHING NEW
(And new things scare us)

WE ARE PRONE TO KEY FALLACIES WHEN IT COMES TO MEASURING WHAT IS HARD TO QUANTIFY
(Not just in UX design)
Learn from history: the fallacies to avoid
Fallacy #1: We first measure whatever is easiest. That's OK.
Fallacy #2: Then, we disregard what isn't easy to measure or give it a random quantitative value. Misleading & fake.
Fallacy #3: Next, we assume that if anything cannot be easily measured, it must not be important. This is blindness.
Fallacy #4: Finally, we say that if it cannot be easily measured, it doesn't exist. This is suicide.

These are not new. (These are called the McNamara fallacies.)
Robert McNamara was the US Secretary of Defense during the Vietnam War. Body count became the chosen method of determining success. This approach was blindsided by chaos, humanity, destruction of life & the general horror of war that is hard to measure.

Measures should reveal what we really need to know about design.
WORTH = VALUE / COST
Thank you
JESSA PARETTE | 2021 UX STRATEGY
@jessaparette

UX STRAT Online 2021 Presentation by Jessa Parette, Capital One