Charting User Experience in Physical Human–Robot Interaction
Abstract
1 Introduction
2 Related Work
2.1 Surveys on pHRI
2.2 Definition and Metrics for UX
3 Methods
3.1 Identification
3.2 Screening
3.3 Coding the Included Articles
Physical Interaction Parameters | Description and Examples | N | % |
---|---|---|---|
1. Role of touch | |||
Support completing a task | Physical contact is needed for completing a task [41, 100]. | 8 | 18 |
Communicate or influence | Physical contact is mainly provided to communicate to or influence users or robots (e.g., emotions, effort, and judgment) or socially support them [12, 54]. | 26 | 59 |
Teach or guide movement | Physical contact is used to teach or guide movement of the robot or user [74, 79]. | 10 | 23 |
Unintended contact | Physical contact is not intended in the interaction, but it happens (or appears to happen) as a result of an error [64]. | 1 | 2 |
2. Who | |||
Human | The human initiates and is active in the physical contact [42, 76]. | 19 | 43 |
Robot | The robot initiates and is active in the physical contact [32, 73]. | 10 | 23 |
Mutual | Both the human and the robot participate in the physical interaction (e.g., handover, handshake, and hug) [18, 109]. | 20 | 45 |
3. Type of touch | |||
Touch (general) | The action is generally reported as “touching” in the article. This category often involves brief or static contact [12, 124]. | 13 | 30 |
Move | Holding onto and moving a body part in space [5, 38]. | 6 | 14 |
Handover | Passing objects without direct physical contact between the actors [8, 98]. | 7 | 16 |
Handshake | Taking hold of and shaking each other’s hand [9, 14]. | 4 | 9 |
Hug | Embracing or being embraced in one’s arms [18, 111]. | 5 | 11 |
Stroke | Actions that were described as stroking or wiping in the studies [32, 110]. | 2 | 5 |
Other | Push, pull, tap, hand clapping, or when users could select any touch actions from a set of available actions [31, 42]. | 11 | 25 |
4. Body location | |||
Hand or end effector | Anywhere on or below the wrist for the user as well as the robot’s end effector [39, 79]. | 19 | 43 |
Arm | Forearm or upper arm of the user, a humanoid robot, or any location on a robotic arm [5, 124]. | 7 | 16 |
Whole body | The physical contact involved multiple body parts (e.g., during a hug) or the contact could be applied to any body part (e.g., touching a robot anywhere on its body) [19, 109]. | 10 | 23 |
Other | Other body locations included shoulder, upper back, waist, buttock, and the robot’s tray [69, 73]. | 11 | 25 |
5. Duration | |||
Brief | ≤ 60 seconds [8, 88]. | 13 | 30 |
Long | > 60 seconds [109, 123]. | 2 | 5 |
Unlimited | No time limit was imposed on the physical interactions and the timing varied across the users [52, 59]. | 8 | 18 |
Not reported | Duration of the contact is unclear from the article [76, 79]. | 21 | 48 |
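The percentage column appears to be each row's count over the 44 included articles, rounded to the nearest integer (the total of 44 is inferred from the counts, not stated in this excerpt; categories can overlap, so percentages need not sum to 100). A minimal sketch under that assumption:

```python
# Sketch: reproduce the percentage column, assuming N_TOTAL = 44 included
# articles (inferred from the row counts; an assumption, not stated above).
N_TOTAL = 44

def pct(n, total=N_TOTAL):
    """Share of included articles, rounded to the nearest integer percent."""
    return round(100 * n / total)

# Spot-check against the "Role of touch" rows above.
assert pct(8) == 18   # Support completing a task
assert pct(26) == 59  # Communicate or influence
assert pct(10) == 23  # Teach or guide movement
```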
Independent Variable | Description and Examples | N | % |
---|---|---|---|
Physical interaction | Touch/no touch (*n* = 10), role of touch (*n* = 2), who (*n* = 4), location (*n* = 1), sensation or motion parameters (*n* = 13), duration or timing of touch (*n* = 5) [69, 110]. | 35 | 80 |
Visual | Facial expressions (*n* = 2), gaze behavior (*n* = 1), or visual appearance of the robot (*n* = 1) [18, 39]. | 4 | 9 |
Sound/utterances | Verbal utterances or noises made by the robot as part of the physical interaction [32]. | 1 | 2 |
Task | The study task to be completed [31, 42]. | 6 | 14 |
Intention | Robot or human’s attitude or social role [63, 126]. | 9 | 20 |
Demographics | Gender or other demographic characteristics of the robot or participants [12, 109]. | 3 | 7 |
Other | Previous interactions, task outcome (success or failure), human vs. robot [52, 10]. | 6 | 14 |
None | This category includes qualitative studies without independent variables and studies where the researchers collected numerical data but no parameters were varied systematically [14, 100]. | 5 | 11 |
Categorization Scheme | Description and Examples | N | % |
---|---|---|---|
1. Time of data collection | |||
Before | Electrodermal arousal before touch [76]; NARS before any interaction with the robot [130] | 11 | 25 |
During | Video recordings of each touch event [126]; time to complete task [31] | 25 | 57 |
After | RoSAS after interaction with the robot [98]; post-study interview [79] | 37 | 84 |
2. Method of data collection | |||
Questionnaires | Self-developed questionnaire on robot friendliness [110], social perception of human-to-robot handovers with RoSAS [98] | 38 | 86 |
Video recordings | Video analysis of emotions exhibited by participants during clap interaction [39], timing and frequency information from videos of handovers [45] | 15 | 34 |
Datalog | Motion tracking to collect average of speed, step-length, and cadence [74], temperature and tactile sensor data [14] | 7 | 16 |
Interviews | Interviews regarding attitudes toward assistance from a robot in homes [100], interviews regarding the whole experience of touch interactions with a humanoid robot [130] | 4 | 9 |
Physiological signals | Skin conductance response [130], respiration rate [121] | 3 | 7 |
Other | Expert rating of task outcomes [5], manual timing [42], and think aloud [38, 45] | 6 | 14 |
3. Questionnaire type | |||
Existing questionnaires (validated) | NASA TLX [48], SAM [22], PANAS [117], PAD [87], and RoSAS [29] | 16 | 36 |
Existing questionnaires (not validated) | GodSpeed [16], RAS [92], and NARS [93] | 12 | 27 |
Self-developed (items known) | Likert scale rating on robot’s social qualities [12], Likert scale rating on user’s convenience while receiving an object during a handover [8] | 16 | 36 |
Self-developed (items unknown) | Open-ended questions regarding kinesthetic teaching methods [5], questionnaire regarding subjective experience of interacting with the robot via touch [10] | 4 | 9 |
3.4 Identifying User Experience Metrics through Affinity Diagramming
UX Metric | Definition | Example Measurements and Rated Statement | N (Rated) | N (Measured) |
---|---|---|---|---|
F1—Overall | ||||
Overall evaluation | Assessment of overall experience as positive or negative, including statements about user preference and liking. | “I think using the robot is a good idea.” [19], “I would have preferred that the robot did not touch my arm” [32]. | 13 (21) | - |
Descriptive measures | Summary statistics describing the task/interaction without a positive or negative connotation. | Number of actions [110], gesture intensity [126] | - | 11 (15) |
F2—Usability | ||||
Time | Time needed to complete a task | Completion time [42], response time [76], “I am satisfied with the time it took to complete the task using the interface.” [31], “Efficient” [42] | 2 (2) | 8 (13) |
Accuracy | The accuracy with which a task is completed, that is some quantification of error | Number of collisions [31], if the robot accidentally dropped the object [45], “accurate” [88] | 1 (1) | 8 (10) |
Ease of use | General satisfaction with using the interface | “I think the robot is easy to use.” [18], “I was worried that I might break the robot using the interface” [31]. | 8 (22) | - |
Understanding the task* | Understanding or learning of information in the interface | “I found the voice of the robot easy to understand.” [41], “The interface was intuitive to use to complete the task.” [31] | 6 (12) | - |
Workload | The physical (e.g., energy) and/or mental resources users spend on the interaction, including NASA Task Load Index (TLX) as an established instrument | “I really had to concentrate to use the robot.” [41], “Was the handling physically exhausting?” [123] | 3 (4) | - |
Feedback | The amount and quality of information given to the user during the interaction | “The instructions from the robot were sufficient.” [41], “Do you think the feedback was helpful?” [123] | 3 (4) | - |
Learnability | User attitude toward how easy it is to learn to use the interface | “It was easy to learn how to use the touching interface.” [31], “How difficult was it to learn how to use the robot?” [74] | 2 (3) | - |
F3—Sensory | ||||
Visual | Qualities judged based on appearance | “Large/small” [124], “laid-back/busy” [124] | 7 (15) | - |
Physical sensation | Qualities judged through touch | “Smooth/rough” [124], “The robot looks very strong.” [32] | 3 (16) | - |
Auditory | Qualities judged through sound | “Quiet/noisy” [124] | 1 (1) | - |
F4—Personal and Interpersonal | ||||
Social traits or behavior | Traits or behavior that relate to interactions with others | Percentage of eye-contact [14], face distance [14], and frequency of prompted/unprompted touches [34], “This hug made the robot seem (unfriendly–friendly).” Block et al. [18], “Likeable” [42] | 14 (32) | 7 (21) |
Personal traits | Qualities that typically belong to a person | “I think the robot went out of its way to help the person.” [12], “Principled” [73] | 9 (37) | - |
Capability | Assessment of the skills or ability of an entity which may or may not refer to a specific task | “I felt that the robot was very capable of performing its job.” [12], “I trust the robot to do the right thing at the right time.” [45] | 8 (19) | - |
Active or passive | Assessment of the overall activity of an entity. This includes statements about the speed or frequency of action or reaction. | “The robot moves its arms too slowly.” [63], “The robot showed a passive behavior” [41]. | 8 (17) | - |
Intelligence | A subset of capability that focuses on mental skills or ability | “I feel understood by the robot.” [19], “The robot understood what I explained to it.” [63] | 8 (11) | - |
Predictability | Qualities of reliability, consistency, and anticipating the next action of an entity | “I always knew what the robot was going to do next.” [42], “The robot worked the way I expected it to.” [41] | 7 (11) | - |
Teamwork | A subset of capability that focuses on joint abilities or skills between two or more entities | “The robot has specialized capabilities that can increase our performance.” [12], “Someday I could work with this robot to build something of interest.” [63] | 5 (14) | - |
F5—Experiential | ||||
Safety | Feelings of fear, being threatened or nervous, and danger | “I felt safe.” [8], “I feel threatened by the robot.” [19] | 16 (26) | - |
Enjoyment | Comfort, enjoyment, or engagement | “It was enjoyable when the robot was touching my arm.” [32], “I feel uncomfortable with the robot.” [45] | 15 (25) | - |
Emotion | Affect instruments (PAD [87], SAM [22], and PANAS [117]), emotion labels in Russell’s circumplex model of affect [105], or reference to user feelings | Arousal level from Galvanic Skin Response [76], facial expressions [39], “I found it exciting to interact with the robot.” [41], “Interesting/Boring” [14] | 8 (18) | 5 (8) |
Symbolic | Referring to the value or meaning of something in the society (among people) [6] | “People would be impressed if I had such a robot.” [19], “I would feel nervous operating a robot in front of other people.” [130] | 2 (2) | - |
Motivation | Internal desire or external pressure to do something | “I was motivated to walk.” [74], “I felt pressure or resistance for walking faster/slower.” [74] | 1 (2) | - |
Autonomy | Sense of control or independence in the interaction | “I felt independent to walk, even being supported by the platform.” [74] | 1 (1) | - |
4 Results
4.1 Physical Interaction
4.2 User Study Variables and Methods
4.3 UX Metrics
5 Discussion
5.1 A Conceptual Model of pHRI Experience
5.2 Implications for Future Research
5.3 Reflecting on the pHRI Experience
5.4 Limitations
6 Conclusion
Acknowledgments
References
Publisher: Association for Computing Machinery, New York, NY, United States
Funding Sources: National Science Foundation