
What Can a Robot’s Skin Be? Designing Texture-changing Skin for Human–Robot Social Interaction

Published: 14 April 2023

Abstract

Biological skin has numerous functions like protection, sensing, expression, and regulation. In contrast, a robot’s skin is usually regarded as a passive and static separation between the body and environment. In this article, we explore the design opportunities of a robot’s skin as a socially expressive medium. Inspired by living organisms, we discuss the roles of interactive robotic skin from four perspectives: expression, perception, regulation, and mechanical action. We focus on the expressive function of skin to sketch design concepts and present a flexible technical method for embodiment. The proposed method integrates pneumatically actuated dynamic textures on soft skin, with forms and kinematic patterns generating a variety of visual and haptic expressions. We demonstrate the proposed design space with six texture-changing skin prototypes and discuss their expressive capacities.

1 Introduction

This article explores the design potential of a robot’s skin as a medium for human–robot interaction. We outline the design space for expressive robotic skins and present a general process of designing and prototyping expressive robotic skin with dynamically changing textures, drawing on biological metaphors.
For humans and animals, the skin is a large and complex organ with numerous functions [59]: The skin protects the body from physical injury, dehydration, toxic substances, and other external factors. Biological skin also has rich sensing capacities, which include sensing pressure, temperature, texture, and pain. Furthermore, in many organisms, the skin also communicates information about the body’s health, age, sex, and a variety of internal and affective states.
Contrary to nature, a robot’s “skin” is a much less capable component of the robot design. For simplicity and consistency, we denote all types of robotic shells, covers, and enclosures as “skin.” These robotic skins are generally conceived as a passive and protective barrier between the internal mechanisms and the external world. There are, of course, some robotic skins that are sensitive, allowing for the perception of users and the environment through tactile channels [4, 74]. In contrast, there are fewer robots that use their skin as an output medium. Among those that do are robots changing their skin color [78], temperature [54], vibration patterns [85], and other physical properties [33, 70] to display affective states. However, the existing skin vocabulary for expression is still limited to macro-scale change throughout the skin and has a limited number of parameters. Consequently, a robot’s skin presents unexplored and promising interaction opportunities.
We start by asking a general question: “What can a robot’s skin be?” We discuss the potential roles of a robot’s skin in human–robot interaction, drawing on analogies to biological skins [21], interactive and tangible surfaces [41, 43, 49], architecture facades [44], and fashion [61]. We identify design and technical opportunities for interactive robot skin from four perspectives: skin as an expressive medium, skin as a perceptual layer, skin as an exchange-regulating filter, and skin as a functional mechanical effector.
We then focus on the first of these perspectives: skin as an expressive medium. We provide an exploration of the design space for actively expressive robot skin through Research-through-Design activities of taxonomy development, sketching, and prototyping. This exploration is grounded in inspiration from skin capabilities of living organisms as design metaphors, which provide meaningful analogies for a robot’s states and processes that are related to skin changes [40]. The metaphors explored in our research include piloerection to express affect, wrinkles to show aging and concentration, blemishes caused by illness, and pores for interaction regulation. Using these metaphors, we identify the following possible coding categories [25] for expressive robotic skin: iconic messaging, affect display, relationship expression, interaction mode indication, health display, and capturing temporal change.
To embody the explored design concepts, we present an engineering process to integrate pneumatically actuated dynamic textures on soft robotic skin, deforming in response to internal air pressure. Using this technique, we can build robotic skins with different texture shapes and kinetic movements to convey a variety of expressions. We demonstrate this through six bio-inspired expressive skin prototypes: an emotion-expressing skin with goosebumps and spikes, skin furrows for displaying concentration, moving tentacles to create a sense of liveliness, adhesive skin for expressing attachment, oriented textures for attention direction, and skin pores to indicate the willingness for interaction exposure.
The main contribution of this work is to enrich the design space of robotic skins from passive boundaries to a rich expressive medium through dynamically changing textures. The skin can be expressive both visually and haptically and augment the expression especially for non-anthropomorphic robots. Our previous work has demonstrated an approach to modulate pneumatic texture arrays with goosebumps and spikes through a framework of mechanical design, fabrication, actuation, and control [38]. A user experiment was presented in Reference [37] to evaluate the emotional expression capacity of skin texture change using the actuated goosebumps and spikes, with programmable frequency and amplitude patterns. This article goes beyond goosebumps and spikes and explores more texture forms and expressions through a biomimetic approach. We present coding opportunities for a variety of expressive skin textures and a broader vocabulary of texture properties to vary. Given that a robot’s skin is an under-explored and potentially rich interaction medium, we hope to set the foundation for researchers and designers to broaden the use of actively expressive robotic skin.

2 What Can A Robot’s “Skin” Be?

To date, relatively few studies have explicitly investigated the roles and functions of a robot’s skin in the context of human–robot interaction. Usually, a robot’s skin is treated as a passive and static barrier between the robot’s internal mechanisms and the external world. The roles of a static skin include interaction affordance, user expectation through visual presentation, and protective covering (Figure 1, left).1 These are achieved through the skin’s unchanging physical properties, including material, texture, color, or temperature.
Fig. 1. Design space of potential roles for a robot’s skin: A skin may serve static (left, most existing works) or dynamic (right, proposed design space) roles in human–robot interaction. The darker shaded boxes include the main focus of this article.
The material of a robot’s skin can affect “what a robot feels like” [81], suggesting an interaction affordance and directing the user’s approach: pliability affords squishing or throwing, while rigidity encourages caution [13]. McGinn et al. [55] studied the textural properties of 27 materials acting as a robot’s skin and their perceived suitability for use on a service robot. The results suggest that soft surfaces are strongly preferred in blind tests.
The visual quality of a robot’s skin, stemming from both form and material, determines “what a robot looks like,” embodies aesthetic qualities, and suggests a robot’s capabilities. The appearance of a robot’s skin can help classify the robot as humanlike, creaturelike, or appliance-like, thus setting an expectation for humans about the robot’s functionalities [6]. In addition, appearance can visually elicit gender-stereotypical judgments about a robot, subsequently affecting the perceived appropriateness for different tasks [9].
Finally, a robot’s skin can afford a protective function, which increases interaction safety for both humans and robots. For robots, a soft covering can protect sensors embedded underneath the substrate and reduce damage to the robot’s internal hardware [4]. For humans, a robot’s skin reduces the risks of contact with a robot’s rigid and sharp mechanisms.
All the above-mentioned roles of robotic skins are based on passive and unchanging skins. Existing research has focused on improving interaction through static skin features such as material, form, and color. In this section, we wish to challenge the conventional design space by proposing active and dynamic skin functions. We draw on the literature in robotics, biology, human–computer interaction (HCI), architecture, and wearable arts to investigate potential roles that a robot’s skin may serve in human–robot interaction.
We identify four roles of active robot’s skin, illustrated in Figure 1, center: skin as an expressive medium, skin as a perceptual layer, skin as an exchange-regulating filter, and skin as a functional mechanical effector. In Section 3, we illustrate the potential of skin as an expressive medium through a series of speculative design sketches. These sketches suggest that the robotic skin’s texture can serve as a highly expressive element. We continue by describing an engineering approach that would allow a more finely controllable robotic skin texture through soft fluidic texture units and control channels and present working prototypes of some of these design proposals.

2.1 Skin as an Expressive Medium

Biological skin is an expressive medium that conveys a variety of information about an organism, e.g., age, sex, race, and health condition. In addition, an organism’s affective states can also be expressed through skin color [78], trembling, sweating, temperature [54], and muscle movements beneath the skin [35]. Some animals express their aroused states by changing skin textures or appendages’ shapes, a phenomenon called piloerection [20]. Such behaviors include humans displaying goosebumps, birds ruffling their feathers, cats raising their back fur, blowfish protruding spikes, and so on.
These biological phenomena have inspired a variety of expressive surfaces in the field of tangible human–computer interfaces. Examples include a fur interface that mimics bristling effects through vibration [30], an interface with luminescent tentacles following a user’s hand waves [60], and a carpetlike texturally rich interface for tactile and visual communication [14]. Similar ideas have been investigated in responsive clothing that expresses or exaggerates its wearer’s physical or emotional states. For example, bristling clothes developed by Ohkubo et al. [63] raise hairs in response to a wearer’s breathing rate. A face mask named “Aposema” [57] augments a person’s facial expressions with inflation and change of color.
Social robots can also express their emotions through haptic channels, with movements on or beneath the skin. Most of the movements simulate animals’ behaviors, such as breathing, heartbeat, purring, and warming. For example, the Haptic Creature robot [85, 86] displays affective states through its breathing rate, ear stiffness, and vibrotactile purr. The Cuddlebits robot [13] expresses emotions with 1-degree-of-freedom (DOF) breathinglike behaviors. Robots also express their emotions through the change of skin color [77], temperature [64], haptic forces [33], and even scents [70].
In each case, robots change physical properties globally across the skin, using vibrations, temperature, and overall color, or deform their skin on a full-body scale. Locally expressive texture change of a robot’s skin, as proposed in this work, remains a largely unexplored expressive channel in social interaction. In addition, most existing work has focused heavily on emotional expression, leaving other expressive contexts under-explored.

2.2 Skin as a Perceptual Layer

Human skin is an active sensory organ with widely distributed tactile receptors that detect touch, pressure, pain, temperature, and so on [45]. Perception of touch affords transmission of social messages, as humans often use touch to share their feelings and enhance other forms of verbal or non-verbal communications [31].
Sensitive robotic skin has been widely explored [4, 74]. Enabling touch perception on a robot’s skin allows it to be physically safe and interactive when operating near humans [16, 76]. Most robots combine multiple sensor devices within the skin substrate, with arrays of sensors forming continuous coverage over the skin [39, 58]. For example, the Haptic Creature robot [86] embeds a network of 56 surface-mounted force-sensing resistors to recognize affective touch. The Huggable robot [75] utilizes over 1,000 Quantum Tunnelling Composite sensors, 400 temperature sensors, and 45 electric field sensing electrodes across the full body to detect the social contents of touch. Alternative sensing methods with lighter hardware but lower resolution include acoustic sensing [1], air-pressure sensing [2], conductive fur sensing [28], EIT-based sensing [73], and so on. Our recent work “ShadowSense” [36] uses computer vision to classify shadows created by touching the robot’s skin. It achieves high resolution and a large detection range with minimal hardware and is especially apt for sensing social touches on translucent and deformable skin.

2.3 Skin as an Exchange-Regulating Filter

Unlike a conventional robot’s skin, which serves as a solid boundary, biological skins are usually selective and permeable. Cellular membranes can actively regulate the exchange of molecules based on the cell’s requirements while maintaining a stable internal order irrespective of environmental changes [26]. Pores on human skin allow sweat and oil to escape while helping the body eliminate toxins [19].
As for artificial skins, dynamic facades of buildings are capable of modulating permeability, for example by allowing in a controlled amount of air and light to regulate internal temperature and ventilation [15, 48]. Clothing materials can selectively let heat and sweat escape, while protecting the wearer from rain or snow. In the area of wearable HCI research, the “Second Skin” project [84] describes a device that opens texture elements to cool the wearer when skin temperature rises during running and sweating.
Like biological and artificial skins, a robot’s skin may benefit from such selective and permeable features, enabling exchange regulation with surrounding environments. For example, a robot may open its “pores” to cool off internal mechanisms or close them to make itself heatproof, based on environmental conditions. A permeable skin may act as a social filter that regulates information exchange. For example, skin may change its translucence level to allow or prevent the robot from capturing the user’s visual data, thus adjusting to privacy requirements posed by interaction contexts. It may also change its transparency to metaphorically indicate an “openness to interact,” placing this function on the boundary between regulation and expression.

2.4 Skin as a Functional Mechanical Effector

Finally, the skins of some animals serve as mechanical and functional effectors that actively change the body’s physical relation to the environment, such as through locomotion, camouflage, and adhesion. To list a few examples, hairlike cilia on the surface of cells can move the surrounding fluid and enable such cells to swim [51]. An octopus can change its skin color and textures to blend in with the seafloor [52]. Geckos’ feet are covered with hundreds of tiny hairs named setae, helping them reversibly cling to a wide range of surfaces [10].
Roboticists have been inspired by such naturally occurring solutions to explore novel materials and mechanisms for functional robot’s skin. Examples include a snake-skin-inspired crawling robot [67], a climbing robot using gecko-inspired skin [56], an octopus-inspired three-dimensional (3D) morphing and camouflaging skin [65], and a musclelike skin for actuating inanimate objects [12]. The above-mentioned bio-inspired robotic skins are primarily aimed at achieving functional purposes. Similar metaphors could be used to design functional skin effectors that convey social meanings.

3 Designing Expressive Robotic Skin

While most of the literature on interactive robotic skin has focused on the perceptual function of recognizing social touch, the expressive capacity of skin to generate touch experiences has been under-explored. In the rest of the article, we focus on the first aspect of interactive robotic skin: expression. We do so through an exploration of the design, technical, and interaction opportunities arising from an expressive robotic skin, inspired by biological functions.
We start by categorizing six possible coding opportunities for robotic skins. When categorizing the repertoire of nonverbal behaviors, Ekman defines “coding” as the “correspondence between the act and its meaning” [25]. Robotic skin may encode different expressive meanings (see Figure 1, right), including explicit messages, affective states, human–robot relation expressions, interaction modes, temporal change, and the robot’s health or other internal states.
We illustrate these design opportunities with speculative sketches in Figure 2, and detail the design potential for these six codings in the following sections. These sketches are not intended to be a comprehensive or exhaustive list of options. Instead, the intention is to illustrate design possibilities to inspire more practice and attention in the space.
Fig. 2. Illustrating bio-inspired skin expression with free-hand sketching: Skin textures can express a robot’s (a) explicit message, (b) affective state, (c) human–robot relation, (d) interaction mode, (e) temporal change, and (f) health and other internal states.

3.1 Explicit Message

A robot’s skin can display an explicit or iconic message through direct visual or haptic projection. Some social robots blend a screen into their skin, which enables the display of verbal messages or visual abstractions. For example, the ENRICHME robot [17] displays written text on its torso as instructions; an evacuation robot [68] sends emergency alerts using on-body dynamic signage. Robots may convey messages through a haptic channel as well, such as projecting braille with reconfigurable dots on the skin [82] (Figure 2(a)). Messages with explicit verbal translations communicate a robot’s intent or state without ambiguity, but they may require high-degree-of-freedom skin transformations, as well as more processing effort and a shared cultural background on the viewer’s part to comprehend abstract information.

3.2 Affective State

Affective experiences can cause physiological changes in the skin. In response to sudden and intense emotions such as fright and shock, animals can experience piloerection, that is, the erection of tiny textures or bristling of hairs due to involuntary muscle contraction, as in human goosebumps [7]. Skin color may also change rapidly with psychological arousal (e.g., human faces “blush” in response to situations of personal embarrassment; the Panther Chameleon turns red and yellow when angered, as a warning to others to back off [27]).
Inspired by these biological equivalents, robots may rapidly alter skin textures, color, or the shape of hair to express their affective states. For example, a robot can respond to an alarming event with spiky, trembling textures; a furry robot that is suddenly awakened may bristle its fur; and an angry robot may show tensed, inflated muscles on its skin (Figure 2(b)).

3.3 Human–Robot Relation

As described in Section 2.4, biological skins actively exert forces on environmental surfaces and have inspired functional applications in robotics, such as a robot’s skin applying adhesive forces for climbing or gripping, or shear forces for locomotion. Beyond these functional purposes, a robot’s skin that manipulates spatial relations can also be a form of expression, reflecting the robot’s relationship to users and environments. For example, a robot’s skin may actively stick to a person’s body to display affection, or let a person feel attractive or repulsive forces with their hand, communicating a willingness (or reluctance) to be touched. Likewise, a robot may direct its attention with shear forces parallel to the skin, for instance referring to a person in the space by moving its skin toward them (Figure 2(c)).

3.4 Interaction Mode

The visual appearance and material properties of a robot’s skin can suggest its roles and modes of interaction, as mentioned in Section 2. Section 2.3 described a permeable robot’s skin as an active filter for functional and social regulation. We also conceptualize the exposure of skin as reflecting a robot’s dynamic modes in interaction. Changes in permeability may let users visualize how much of their personal data is disclosed to a robot. For instance, a robot with sensors embedded underneath the skin may open pores to see and hear people more clearly, suggesting that it is more actively involved in the interaction. Conversely, a robot can selectively expose its own internal states to a user, which may signal the robot’s identity and traits. For example, a robot may make itself more machinelike by revealing its internal gears and motors. Uncovering the hidden thinking process underneath the skin may increase its perceived honesty (Figure 2(d)).

3.5 Temporal Change

Biological skins and their appendages gradually change their physical properties over time (e.g., hairs grow in length; aging human skin develops wrinkles, loss of elasticity, laxity, and rough textures). Long-term growth and evolution across generations may result in skin changes in adaptation to environments, such as through natural selection [24].
Although robots do not suffer from natural growth or decay, growthlike mechanisms are of emerging interest in the field of biomimetic robots. A robot that grows and evolves its body allows for better adaptation to task constraints and complex, changing environments [53]. For example, Hawkes et al. [34] created a soft robot that navigates the environment through growth. Similarly, Del Dottore et al. [23] presented a plant-inspired growing robot that self-builds its structure and adapts its morphology to the environment. In each case, growth is thought of as a functional behavior, such as allowing a robot to search, explore, and adapt to the environment.
We envision growthlike behaviors in a robot’s skin as a potential expressive language. Robots may integrate a short-term or long-term skin change to reflect relationships with environments, users, and time. To illustrate, in Figure 2(e), a vacuum robot grows its hairs to express the amount of dust collected; a companion robot gradually develops wrinkles on the skin, indicating its aging as the child it accompanies grows up; a plantlike robot gradually evolves its appendages to reflect its living conditions.

3.6 Health and Other Internal States

The physical properties of skin can signal the health conditions of an organism and the state of being alive or dead. For instance, patches on the skin can indicate illness or injury; a warm and regularly breathing skin can be a signal of life; some species, such as sea anemones, have subtly moving tentacles communicating liveliness.
A robot’s skin may signal its health state (i.e., being alive or damaged) through similar representations, such as continuously moving skin appendages or changes in color and rigidity. For example, a robot may subtly wave its hairs or tentacles to express a sense of being alive and awake, or highlight damage or errors with abnormal skin color displayed in malfunctioning areas (Figure 2(f)).
To summarize, we argue that there is a potentially rich design space for interactive robotic skin, beyond the state of the art. We note that for many of these expressive functions, we need robotic skin that can be actuated beyond globally controlled skin color and shape changes, and that skin texture would need to be finely controlled. Using texture as an expressive medium, we thus proposed six expressive codes, illustrated by speculative sketches. We now proceed to describe a design and engineering technique to achieve these textures, along with a design vocabulary afforded by these artificial skin textures.

4 Engineering Robotic Skin Textures

To ground the design space presented above, we present an engineering solution to realize an expressive robot skin. We develop a general method to manufacture dynamic skin textures for expression and present a texture design vocabulary to generate different visual and haptic experiences.
To manufacture the skin, we use soft materials due to their flexible nature, in the hope of simulating organic and biological deformation [47]. Several technologies can be used to actuate soft robots, such as pneumatic actuators [50, 65], tension cables [22], SMA actuators [60, 62], chemical reactions [72], magnetic fields [66], vibrators [30], and so on. Our method uses a pneumatically actuated elastomer because of the following considerations: (1) pneumatic actuators are safe and easy to use, and deformation can be easily manipulated by designing the geometry of the elastomer; (2) they can achieve complex and continuous shape changes with low-degree-of-freedom actuation; and (3) the material (e.g., silicone) has a low cost and a long operational life.
Mapping the morphology of a pneumatic elastomer to its pressure–deformation relationship has been researched extensively, but this work has mostly focused on shape changes of the entire actuator, such as bending, extending, and twisting [69]. Only a few works develop local texture changes on surfaces: Pikul et al. [65] presented a mapping from two-dimensional planar surfaces to complex 3D shapes, but the method can only realize a skin with 1-DOF vertical motion whose resting state is always a flat surface.
To generate more diverse texture shapes and deformations, we designed a pneumatically actuated multi-layer elastomer skin, which changes its surface textures in response to air pressure. To generate textures, we model the skin as an array of Texture Units (TUs). Each TU is specifically designed to be both visually and haptically expressive. The TUs are then combined onto a substrate to form a Texture Module (TM). TUs of the same type are pneumatically connected and uniformly controlled, while different types of TUs can independently generate varied surface properties. Finally, to make the skin fully autonomous, we integrate a power system. Below we specify design properties and engineering methods in each step.
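To make this decomposition concrete, the following Python sketch (our own illustration, not code from the article; the class and method names such as TextureUnit, TextureModule, and the pump interface are assumptions) models TUs that share fluidic channels by type, so that one pressure command drives every unit of that type within a Texture Module.

```python
from dataclasses import dataclass, field
from typing import List, Set, Tuple

@dataclass
class TextureUnit:
    """Minimal element of the skin: an elastomer body with an embedded air cavity."""
    kind: str                       # e.g., "goosebump", "spike", "furrow"
    position: Tuple[float, float]   # (x, y) location on the substrate, in mm
    channel: int                    # fluidic network this unit's cavity connects to

@dataclass
class TextureModule:
    """A substrate carrying many TUs; each fluidic channel is actuated as one group."""
    units: List[TextureUnit] = field(default_factory=list)

    def channels_of(self, kind: str) -> Set[int]:
        """Channels that drive all units of the given type."""
        return {u.channel for u in self.units if u.kind == kind}

    def set_channel_pressure(self, pump, channel: int, pressure_kpa: float) -> None:
        """Command one fluidic network; every TU on that channel deforms together."""
        pump.set_pressure(channel, pressure_kpa)   # hypothetical pump interface
```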

4.1 Designing Texture Units

A TU is the minimal element of a texture-changing skin, designed to be both visually and haptically expressive. Each TU has an application-specific shape, consisting of an elastomer body, an internal air cavity, and (optionally) a haptic-expressive element.
Figure 3 presents a design vocabulary for textures. Multiple properties of texture units can be manipulated in design to provide different haptic and visual experiences, including shape, materiality, motion, and force. Shape is defined by the geometry of the external elastomer and the structure of the internal air cavity, and mainly affects a TU’s visual feedback (e.g., a cone versus a bump). The shape of the embedded air cavity affects the deformation of the TU. For example, when deflated, a cylindrical cavity may result in a dent on the surface, while a valleylike cavity will generate a furrow. Materiality, with varied stiffness and haptic elements, more strongly determines the haptic experience during interaction. For example, the soft tip of a spike TU can be replaced with a sharp, rigid element (shaded area) to amplify the spiky and unpleasant feeling. In terms of the TU’s motion, each unit can be inflated and deflated. For each motion primitive, the speed of deformation is controlled by the frequency of internal airflow, while the stiffness and range of deformation are determined by internal air pressure and the elasticity of the elastomer. In addition, a texture unit can be structurally designed to transmit different types of forces with varied haptic feedback (e.g., repulsive force by a goosebump, attractive force by a suction cup, an oriented TU generating force perpendicular to the inclined surface, and a furrow TU applying contraction force along the skin surface).
Fig. 3. Design vocabulary of textures. The single Texture Unit can vary its shape, materiality, cavity, and force to generate diverse visual and haptic experiences. Combining the TUs considers properties like their distribution, configuration, resolution, and connection.
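As one way to read the motion part of this vocabulary, the sketch below (illustrative only; the pump.set_pressure call and the 50 Hz command rate are assumptions, not the article’s implementation) cycles a channel between inflation and deflation, with amplitude setting the range of deformation and frequency setting its speed.

```python
import math
import time

def actuate_texture(pump, channel, amplitude_kpa, frequency_hz, duration_s):
    """Cycle one texture channel between inflation (+) and deflation (-).

    amplitude_kpa bounds the range of deformation (peak pressure);
    frequency_hz sets how quickly the textures rise and retract.
    """
    start = time.time()
    while time.time() - start < duration_s:
        t = time.time() - start
        # Sinusoidal pressure profile around the resting (atmospheric) state.
        pressure = amplitude_kpa * math.sin(2 * math.pi * frequency_hz * t)
        pump.set_pressure(channel, pressure)   # hypothetical pump interface
        time.sleep(0.02)                       # ~50 Hz command rate (assumed)
```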
To illustrate the variations, we present six examples of TU designs and their resulting deformation under inflation (green, solid boundary line) and deflation (yellow, dashed boundary line) of internal air cavities (Figure 4). We take a biomimetic approach to translate patterns of textures in nature to TU mechanisms that can transition between resting and actuated states. Several design iterations were performed on the shape of the cavities, the thickness of the walls, and the placement of haptic materials, finally converging on the six visually and haptically expressive units. Each diagram illustrates the structure of a TU from a front sectional view (section plane perpendicular to the skin surface), except for the stoma TU, which is shown from a top view. A more detailed description of the textures is presented in Section 5.
Fig. 4. Each Texture Unit is composed of an elastomer body, an embedded air cavity and optionally a haptic element. The TU deforms its surface under air inflation (green) and deflation (yellow).

4.2 Combining TUs to Form Texture Modules

Our design allows the combination of a large number of TUs into a substrate to form a TM. A substrate is made up of an elastomer body and embedded fluidic networks that connect to the air cavities of TUs. Each network can be separately pressure controlled. A substrate can be described by its shape and internal network connections. The shape of a substrate determines the skin’s macro-scale geometry, which depends on the shape of the robot. Embedded fluidic networks determine how TUs are connected and controlled.
Figure 3 provides a design vocabulary for combining TUs to form a Texture Module. The distribution of textures can provide distinct visual and haptic representations. Uniformly distributed textures may represent global patterns across the skin, while clustered textures may highlight local groups. The configuration of directional texture units (e.g., scales) is determined by the orientations of textures and their geometric relationship. For example, textures facing the same direction form a parallel pattern and may generate a directional force flow across the skin. The TM’s resolution, determined by the size of the TU and the density of texture placement, affects display quality as well as haptic finger, hand, and body interactions. The connectivity between texture units determines grouping and freedom of actuation. Having higher degrees of freedom may realize more diverse motions but results in more complicated actuation and control systems.
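To illustrate how distribution, resolution, and connectivity might be specified together, the sketch below (our own example, reusing the illustrative TextureUnit and TextureModule classes from the earlier sketch) interleaves two TU types on a regular grid: the grid pitch sets the resolution, and the per-type channel assignment sets the connectivity.

```python
def layout_interleaved_module(rows: int, cols: int, pitch_mm: float) -> TextureModule:
    """Place two TU types on a uniform grid; each type shares one fluidic channel."""
    units = []
    for r in range(rows):
        for c in range(cols):
            kind = "goosebump" if (r + c) % 2 == 0 else "spike"
            units.append(TextureUnit(kind=kind,
                                     position=(c * pitch_mm, r * pitch_mm),
                                     channel=0 if kind == "goosebump" else 1))
    return TextureModule(units=units)

# Example: a 4 x 6 module with 15 mm spacing between texture units (values are illustrative).
module = layout_interleaved_module(rows=4, cols=6, pitch_mm=15.0)
```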
To fabricate a Texture Module, two elastomer layers are molded through a standard casting process with silicone rubber. The mold used for the upper layer models a desired TM structure, with one part casting the outer TM shape, and the other the fluidic cavities and chambers. The mold used for the lower layer may contain an inextensible film, such as a fiber or paper layer, to constrain deformation downward. The two layers are then separately cured and joined together by applying a thin layer of elastomer [38]. A detailed fabrication guide and 3D CAD files of the molds are provided in the supplemental files.

4.3 Powering the Textures

To integrate skins into interactive robots, the size and noise level of the powering system are important considerations, favoring a self-contained, low-noise profile. Thus, we designed a power-screw-actuated linear displacement pump to actuate the textures. The core of this design is a re-purposed syringe, used as a cylindrical pump with a plunger displaced by a linear stepper motor. This system affords low noise, high control accuracy, and high efficiency compared to a commonly used rotary pump. A full description of the mechanism can be found in a supplemental file and in a previous publication [38].
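A minimal sketch of how such a syringe pump could be commanded, assuming a hypothetical stepper-driver interface (stepper.move) and illustrative syringe and lead-screw dimensions that are not taken from the article:

```python
import math

SYRINGE_BORE_MM = 20.0     # assumed inner diameter of the re-purposed syringe
LEAD_MM_PER_REV = 2.0      # assumed plunger travel per motor revolution
STEPS_PER_REV = 200        # common full-step resolution for a stepper motor

def volume_to_steps(delta_volume_ml: float) -> int:
    """Convert a desired change in displaced air volume to plunger motor steps."""
    piston_area_mm2 = math.pi * (SYRINGE_BORE_MM / 2.0) ** 2
    travel_mm = (delta_volume_ml * 1000.0) / piston_area_mm2   # 1 ml = 1000 mm^3
    return round(travel_mm / LEAD_MM_PER_REV * STEPS_PER_REV)

def displace_air(stepper, delta_volume_ml: float, speed_steps_per_s: int = 400) -> None:
    """Push (positive) or pull (negative) the plunger to inflate or deflate textures."""
    stepper.move(volume_to_steps(delta_volume_ml), speed=speed_steps_per_s)
```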

5 Demonstrating Bio-inspired Skin Prototypes for Expression

In this section, we present six bio-inspired skin prototypes to illustrate the expressive potential of texture-changing skins (Figure 5). The prototypes were designed and engineered using the methods in Sections 3 and 4. Figure 6 summarizes the skin prototypes in terms of how they make use of texture unit properties and how they can achieve some of the expression coding categories described above. The x-axis captures the primary texture unit properties considered when designing a desired expression. For example, the furrow texture uses shape and force properties, as it expresses mainly through its wrinkle-mimicking geometry and haptic tension. The y-axis points to the expression coding of the prototype, categorized by the design space in Section 3. For example, the furrow textures may express a robot’s affective state, i.e., being confused, concentrated, or nervous. The graph is intended to visualize the design framework with the proposed prototypes. The empty spaces in the graph point to potential design opportunities for future research. A video supplement is provided to demonstrate the skin prototypes in motion.
Fig. 5. Bio-inspired skin prototypes for expression: (a) dynamic goosebumps and spikes for emotion expression; (b) skin furrows for concentration display; (c) moving tentacles to express liveliness; (d) adhesive suction cups for attachment display; (e) oriented textures for directing attention; (f) skin pores for expressing exposure. Image credits: (a-1) is from [18], (a-2) is photographed by Yuhan Hu, (b-1) is cropped from Reference [83], (c-1) is by Andreas Berget [8], (d-1) is by Ann Antonova [3], (e-1) is by Laura James [42], (f-1) is from Reference [80].
Fig. 6. Skin texture prototypes in terms of their use of texture unit properties and expressive coding categories.

5.1 Dynamic Spikes and Goosebumps for Emotion Expression

Inspired by biological piloerection, such as humans growing goosebumps (Figure 5(a-1)) and porcupine fish protruding spikes (Figure 5(a-2)), we design a texture module with two 2D arrays of goosebumps and spikes, separately connected to two inner fluidic networks. Goosebumps transform from a resting flat surface into smooth bumps under positive air pressure (Figure 5(a-3)). Spike units are structured as cones with rigid haptic elements embedded in the TU tips (Figure 5(a-4)). The spikes deform and retract their sharp tips under negative pressure.
We use the textures’ shapes and the speed of their movements to convey a variety of emotions. We map shape to emotional valence, in that spikes naturally represent angry, defensive states, while goosebumps are related to pleasure and excitement. The frequency of the textures’ movements is mapped to the arousal dimension, with higher texture change frequency communicating higher arousal. This is also inspired by biological analogies, such as how the frequency of breathing and heartbeats falls in a low arousal state and increases when aroused. The mappings were further validated in a controlled human experiment [37]. The results indicated that participants consistently perceive the proposed texture behaviors as expressing specific emotions, with a similar distribution across interaction modes, including video viewing, in-person observation, and touching the texture. The spike–goosebump combination proved effective in conveying both emotional arousal and valence, with valence being more easily differentiated when touching the textures. This design improves upon most current haptic social robotics research using breathinglike or vibrotactile behaviors, which were shown to be somewhat ambiguous and ineffective in conveying emotional valence [85].
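The mapping described above can be read as a small piece of control logic. The sketch below is our own illustration (the channel indices, pressure level, and frequency range are assumptions, and actuate_texture refers to the earlier illustrative helper), not the authors’ implementation.

```python
GOOSEBUMP_CHANNEL, SPIKE_CHANNEL = 0, 1   # channel indices from the layout sketch above

def express_emotion(pump, valence: float, arousal: float, duration_s: float = 5.0):
    """Map (valence, arousal), each in [-1, 1], to texture shape and movement speed.

    Shape encodes valence: goosebumps for positive, spikes for negative.
    Frequency encodes arousal: faster texture change for higher arousal.
    """
    channel = GOOSEBUMP_CHANNEL if valence >= 0 else SPIKE_CHANNEL
    frequency_hz = 0.2 + 1.8 * max(arousal, 0.0)   # assumed range: 0.2-2.0 Hz
    amplitude_kpa = 20.0                           # assumed safe working pressure
    actuate_texture(pump, channel, amplitude_kpa, frequency_hz, duration_s)
```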

5.2 Skin Furrows for Concentration Display

Muscle contraction can cause skin to bunch together, forming dynamic wrinkles between muscle bulks. For example, forehead wrinkles appear when a person frowns or furrows the eyebrows, expressing states like confusion, concentration, worry, or annoyance (Figure 5(b-1)). We simulate dynamic wrinkles on skin to display similar states. Each wrinkle unit is structured with a valleylike air cavity embedded in a flat elastomer body. Under negative air pressure, the skin changes from a resting flat surface, indicating a robot’s “relaxed” state (Figure 5(b-2)), to a furrowed surface with haptic tension, indicating that the robot is “concentrated” or “nervous” (Figure 5(b-3)).

5.3 Moving Tentacles to Display Liveliness

Inspired by sea anemones slowly waving their tentacles (Figure 5(c-1)), we design a skin module with subtly moving tentacles for displaying liveliness in robots. Unlike the general method of designing each TU as an independent actuator, we use passive under-actuated tentacles driven by an actuation layer underneath. In the presented prototype, the module has a 4-DOF actuation layer whose resting flat surface is covered with passive tentacles (Figure 5(c-2)). Pressurizing the actuation layer deforms the surface, driving the tentacles to move accordingly (Figure 5(c-3)).

5.4 Adhesive Suction Cups to Express Attachment

An octopus’s suckers (Figure 5(d-1)) can reversibly adhere to different substrates by forming a seal at the rim and reducing pressure in the acetabular cavity [79]. This has inspired adhesive robotic skins for climbing, manipulation, and grasping tasks [5, 32]. Using a similar principle, we implement a skin module with arrays of suction cups: the skin can deflate to adhere to an environmental surface (Figure 5(d-3)) and pressurize to detach (Figure 5(d-2)). In human–robot interaction, a robot may display affection by attaching to a person’s body, or stick to the ground to express resistance to being moved or picked up.

5.5 Oriented Textures for Attention Direction

Humans use nonverbal behaviors to direct attention, such as pointing to or gazing at a specific object [6]. We use oriented skin textures to communicate a similar cue. The textures are designed by simulating biological folding patterns, like the scales of pinecones [46] (Figure 5(e-1)), and use rhythmic beating movements to generate a directional flow. Each oriented texture unit forms an inclined surface under positive air pressure and is strengthened by an embedded rigid layer (Figure 5(e-3)). The unit can flatten and blend into the skin under air deflation (Figure 5(e-2)). With the beating of the oriented textures, the skin can generate a continuous force toward the oriented direction. The force can be visualized by passing an object along the skin or, with the TUs pointed toward the ground plane, can support locomotion by the robot. In interaction, a robot may refer to a person or object by moving its skin or passing an object toward it.

5.6 Skin Pores Indicating Interaction Exposure

Inspired by plant stomata, which open and close to regulate exchange (Figure 5(f-1)), we make “pores” on a skin module to indicate the level of interaction exposure. Each texture unit has an open central pore in its resting state (Figure 5(f-2)). When pressurized, the walls around a pore deform toward the center, closing the pore (Figure 5(f-3)). In this way, the skin can control the amount of air and light passing through and indicate a willingness for exposure. Opening pores may increase a robot’s exposure to the environment, permitting it to acquire richer data; however, the robot is then less protected because of the reduced physical filtering.

6 Discussion

In this section, we discuss additional design considerations for expressive texture-changing robotic skin, including the skin’s relationship to the robot’s overall morphology, multi-modal aspects of robotic skin expression, and the assumption of involuntary activation of a robotic skin’s expression.

6.1 Relationship to the Robot’s Morphology

The robotic skin modules described in the previous sections are small stand-alone patches of expressive texture behaviors. In practice, robotic skins will always be part of a robot’s larger morphological and functional design. The relationship between the textured skin and the robot’s overall morphology prompts several design considerations.
First, texture expression, as presented here, can be particularly useful for non-anthropomorphic and low-degree-of-freedom robots, as it is less constrained by the robot’s morphology compared to gestures and facial expressions. Many robots have functional shapes (e.g., vacuum cleaners, automatic doors, and cars), making it impossible for them to exploit anthropomorphic elements like faces or arms to express emotions and intentions. Existing non-anthropomorphic modalities, such as movements, sounds, and lights, have limited expressive effectiveness and may cause under-performance in their primary functions [11]. Thus, texture-changing skin could be a productive extension of the expressive vocabulary for such non-anthropomorphic robots.
An important design decision, when integrating an active skin in a robot design, is choosing the form and placement of the texture modules. This choice should be based on the function and structure of the robot, as well as the desired expression the skin should generate. For example, textures on a robot’s back may be less observable and communicate mainly haptically when a user is holding the robot; textures at the bottom of a robot may be neither visible nor reachable but express by moving the robot around or adhering to the ground.
The question of scalability of a texture-changing skin is not trivial and should be further explored with social robots of different shapes, sizes, and functions. The current design constrains the robot’s surface properties, requiring a relatively flat and small-scale profile. Future research will include improving fabrication techniques to allow for designing and manufacturing non-flat texture modules with potentially finer textures.
In addition, the current setup is powered by a system sized proportionally to the number of texture types and the scale of the texture modules. More effort is needed to design a more compact actuation solution for large-scale and high-degree-of-freedom implementations.
While this article focuses on the expressive aspect of robotic skins, future work should explore other roles of interactive skins, such as adding perception channels for bi-directional communication. This could be achieved by embedding conductive sensors [28], measuring air pressure [2], and performing vision-based shadow sensing, as proposed in our previous work [36].

6.2 Multi-modal Skin Expression

Dynamic textures on a robotic skin express mainly via two sensory channels: vision and touch. These two layers generate rich and distinct experiences: shapes and deformations above the skin draw visual impressions, whereas materiality and tangibility of the textures impact users through tactile sensations. Users interacting with a robot through these two differing modalities may have different interpretations of the expression or its intensity, as evidenced in our previous work [37]. This difference may also shape user behavior, for example, encouraging users to get closer to the robot to discover and experience different modes of expression.
The interplay between skin expressions and other communicative modalities of a robot (e.g., a face, gestures, or voice) must also be considered. Keeping expressions consistent across modalities may help avoid confusion and increase believability. Skin expressions may also be used to amplify the existing modalities, for instance, a “happy” face accompanied with “happy” textures may increase the perceived intensity of happiness compared to a face or a texture alone. Having conflicting expressions (e.g., a robot with a “happy” face and a “sad” skin) presents opportunities to design for more complex and layered internal states of the robot (e.g., hiding sadness with a fake smile).

6.3 Involuntary Skin Change

Unlike many other nonverbal behaviors in human–human communication, skin is not normally regarded as a voluntary communication channel, nor does it have an explicit vocabulary for expression. Still, nature provides many examples of communicative skins that can be used as design metaphors, since humans have possibly developed an innate ability to interpret those signals during their co-evolution with other life forms [21]. We hope to make use of these metaphors in our design to evoke intuitive and affective meaning construction when users interact with the skin.
The involuntary nature of skin expressions provides interesting interaction possibilities, by tapping into a thus-far unused layer of meaning in human–robot interaction. Expressions may operate in a less noticeable and subconscious fashion, compared to purposeful facial and gesture movements. Ambiguity and subtleness of an expression may lead to more open interpretations [71] that vary for different people and scenarios. Interpreting such expressions could prove to be reliant on a subject’s prior knowledge and the interaction context. While the ambiguous nature of the skin expression can be challenging, it also provides room for personalized designs and opportunities for users to actively explore hidden meanings through interaction.

7 Conclusion

In this article, we explored the design potential of an interactive robot’s skin inspired by the skins of living organisms. We presented a design space including roles and expressive codes that could enable the design of future robotic skins as rich interaction components. This design space supplements the more commonly used human–robot interaction design elements of shape, material, gesture, and voice interaction. We presented a flexible technical tool for implementing soft dynamic textures on a skin that can generate different shapes and kinematic patterns for expression. We also provided a vocabulary that can be used to define and implement a variety of active texture skin components. Through a range of speculative skin concepts and prototypes, this work hopes to expand the design space of robotic skins from passive boundaries to a rich expressive medium for human–robot interaction, setting the foundation for designers and researchers to uncover the potential of actively expressive robotic skin.

Footnote

1
There is some overlap between these categories and the three roles of robotic clothing proposed in Reference [29]: “adapting to context,” “protection,” and “signaling.”

References

[1]
Fernando Alonso-Martín, Juan José Gamboa-Montero, José Carlos Castillo, Álvaro Castro-González, and Miguel Ángel Salichs. 2017. Detecting and classifying human touches in a social robot through acoustic sensing and machine learning. Sensors 17, 5 (2017), 1138.
[2]
Alexander Alspach, Joohyung Kim, and Katsu Yamane. 2018. Design and fabrication of a soft robotic hand and arm system. In Proceedings of the IEEE International Conference on Soft Robotics (RoboSoft’18). IEEE, 369–375.
[3]
Ann Antonova. 2019. Octopus in the Water Near the Coral Reefs. Pexels. Retrieved from https://www.pexels.com/photo/octopus-in-the-water-near-the-coral-reefs-5986729/.
[4]
Brenna D. Argall and Aude G. Billard. 2010. A survey of tactile human–robot interactions. Robot. Auton. Syst. 58, 10 (2010), 1159–1176.
[5]
Promode R. Bandyopadhyay, J. Dana Hrubes, and Henry A. Leinhos. 2008. Biorobotic adhesion in water using suction cups. Bioinspir. Biomim. 3, 1 (2008), 016003.
[6]
Christoph Bartneck, Tony Belpaeme, Friederike Eyssel, Takayuki Kanda, Merel Keijsers, and Selma Šabanović. 2020. Human-robot Interaction: An Introduction. Cambridge University Press.
[7]
Mathias Benedek and Christian Kaernbach. 2011. Physiological correlates and emotional specificity of human piloerection. Biol. Psychol. 86, 3 (2011), 320–329.
[8]
Andreas Berget. 2015. Pink Urticina Eques. Pexels. Retrieved from https://www.pexels.com/photo/pink-urticina-eques-7766133/.
[9]
Jasmin Bernotat, Friederike Eyssel, and Janik Sachse. 2017. Shape it–the influence of robot body shape on gender perception in robots. In Proceedings of the International Conference on Social Robotics. Springer, 75–84.
[10]
Bharat Bhushan. 2007. Adhesion of multi-level hierarchical attachment systems in gecko feet. J. Adhes. Sci. Technol. 21, 12–13 (2007), 1213–1258.
[11]
Andrea Bonarini. 2016. Can my robotic home cleaner be happy? Issues about emotional expression in non-bio-inspired robots. Adapt. Behav. 24, 5 (2016), 335–349.
[12]
Joran W. Booth, Dylan Shah, Jennifer C. Case, Edward L. White, Michelle C. Yuen, Olivier Cyr-Choiniere, and Rebecca Kramer-Bottiglio. 2018. OmniSkins: Robotic skins that turn inanimate objects into multifunctional robots. Sci. Robot. 3, 22 (2018), 1–9.
[13]
Laura Cang, Paul Bucci, and Karon E. MacLean. 2015. Cuddlebits: Friendly, low-cost furballs that respond to touch. In Proceedings of the ACM on International Conference on Multimodal Interaction. ACM, 365–366.
[14]
Marcelo Coelho and Pattie Maes. 2008. Sprout I/O: A texturally rich interface. In Proceedings of the 2nd International Conference on Tangible and Embedded Interaction. ACM, 221–222.
[15]
Marcelo Coelho and Pattie Maes. 2009. Shutters: A permeable surface for environmental control and communication. In Proceedings of the 3rd International Conference on Tangible and Embedded Interaction. ACM, 13–18.
[16]
Martin D. Cooney, Shuichi Nishio, and Hiroshi Ishiguro. 2012. Recognizing affection for a touch-based interaction with a humanoid robot. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 1420–1427.
[17]
Serhan Coşar, Manuel Fernandez-Carmona, Roxana Agrigoroaie, Jordi Pages, François Ferland, Feng Zhao, Shigang Yue, Nicola Bellotto, and Adriana Tapus. 2020. ENRICHME: Perception and interaction of an assistive robot for the elderly at home. Int. J. Soc. Robot. 12, 3 (2020), 779–805.
[18]
[19]
A. K. Dąbrowska, Fabrizio Spano, Siegfried Derler, Christian Adlhart, Nicholas D. Spencer, and René M. Rossi. 2018. The relationship between skin function, barrier properties, and body-dependent factors. Skin Res. Technol. 24, 2 (2018), 165–174.
[20]
Charles Darwin. 1859. On the Origin of Species. John Murray.
[21]
Charles Darwin. 1872. The Expression of the Emotions in Man and Animals. John Murray.
[22]
Felecia Davis. 2015. The textility of emotion: A study relating computational textile textural expression to emotion. In Proceedings of the ACM SIGCHI Conference on Creativity and Cognition. ACM, 23–32.
[23]
Emanuela Del Dottore, Alessio Mondini, Ali Sadeghi, and Barbara Mazzolai. 2018. A plant-inspired kinematic model for growing robots. In Proceedings of the IEEE International Conference on Soft Robotics (RoboSoft’18). IEEE, 20–24.
[24]
Lian Deng and Shuhua Xu. 2018. Adaptation of human skin color in various populations. Hereditas 155, 1 (2018), 1.
[25]
Paul Ekman. 1969. The repertoire of nonverbal behavior: Categories, origins, usage, & coding. Semiotica 1 (1969), 49–98.
[26]
Claudia Elfgang, Reiner Eckert, Hella Lichtenberg-Fraté, Anette Butterweck, Otto Traub, Roger A. Klein, Dieter F. Hülser, and Klaus Willecke. 1995. Specific permeability and selective formation of gap junction channels in connexin-transfected HeLa cells. J. Cell Biol. 129, 3 (1995), 805–817.
[27]
Gary W. Ferguson, James Bernard Murphy, Jean-Baptiste Ramanamanjato, A. P. Raselimanana, et al. 2004. The Panther Chameleon: Color Variation, Natural History, Conservation, and Captive Management. Krieger Publishing Company.
[28]
Anna Flagg, Diane Tam, Karon MacLean, and Robert Flagg. 2012. Conductive fur sensing for a gesture-aware furry robot. In Proceedings of the IEEE Haptics Symposium (HAPTICS’12). IEEE, 99–104.
[29]
Natalie Friedman, Kari Love, Ray LC, Jenny E. Sabin, Guy Hoffman, and Wendy Ju. 2021. What robots need from clothing. In Proceedings of the Designing Interactive Systems Conference. ACM, 1345–1355.
[30]
Masahiro Furukawa, Yuji Uema, Maki Sugimoto, and Masahiko Inami. 2010. Fur interface with bristling effect induced by vibration. In Proceedings of the 1st Augmented Human International Conference. ACM, 1–6.
[31]
Alberto Gallace and Charles Spence. 2010. The science of interpersonal touch: An overview. Neurosci. Biobehav. Rev. 34, 2 (2010), 246–259.
[32]
Frank W. Grasso and Pradeep Setlur. 2007. Inspiration, simulation and design for smart robot manipulators from the sucker actuation mechanism of cephalopods. Bioinspir. Biomimet. 2, 4 (2007), S170.
[33]
Yuki Hashimoto, Satsuki Nakata, and Hiroyuki Kajimoto. 2009. Novel tactile display for emotional tactile experience. In Proceedings of the International Conference on Advances in Computer Enterntainment Technology. ACM, 124–131.
[34]
Elliot W. Hawkes, Laura H. Blumenschein, Joseph D. Greer, and Allison M. Okamura. 2017. A soft robot that navigates its environment through growth. Sci. Robot. 2, 8 (2017), 1–7.
[35]
Jennifer Healey. 2014. Physiological sensing of emotion. In The Oxford Handbook of Affective Computing. Oxford University Press, 204.
[36]
Yuhan Hu, Sara Maria Bejarano, and Guy Hoffman. 2020. ShadowSense: Detecting human touch in a social robot using shadow image classification. Proc. ACM Interact. Mob. Wear. Ubiq. Technol. 4, 4 (2020), 1–24.
[37]
Yuhan Hu and Guy Hoffman. 2019. Using skin texture change to design emotion expression in social robots. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI’19). IEEE, 2–10.
[38]
Yuhan Hu, Zhengnan Zhao, Abheek Vimal, and Guy Hoffman. 2018. Soft skin texture modulation for social robotics. In Proceedings of the IEEE International Conference on Soft Robotics (RoboSoft’18). IEEE, 182–187.
[39]
Dana Hughes and Nikolaus Correll. 2014. A soft, amorphous skin that can sense and localize textures. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA’14). IEEE, 1844–1851.
[40]
Jörn Hurtienne and Johann Habakuk Israel. 2007. Image schemas and their metaphorical extensions: Intuitive patterns for tangible interaction. In Proceedings of the 1st International Conference on Tangible and Embedded Interaction. ACM, 127–134.
[41]
Hiroshi Ishii and Brygg Ullmer. 1997. Tangible bits: Towards seamless interfaces between people, bits and atoms. In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems. ACM, 234–241.
[42]
Laura James. 2020. Natural Dry Pine Cone on White Background. Pexels. Retrieved from https://www.pexels.com/photo/natural-dry-pine-cone-on-white-background-6101994/.
[43]
Heekyoung Jung, Youngsuk L. Altieri, and Jeffrey Bardzell. 2010. SKIN: Designing aesthetic interactive surfaces. In Proceedings of the 4th International Conference on Tangible, Embedded, and Embodied Interaction. ACM, 85–92.
[44]
Abdulmajid Karanouh and Ethan Kerber. 2015. Innovations in dynamic architecture. J. Facade Des. Eng. 3, 2 (2015), 185–221.
[45]
Roberta L. Klatzky and Susan J. Lederman. 2003. Touch. In Handbook of Psychology. Wiley Online Library, 147–176.
[46]
A. Le Duigou and M. Castro. 2016. Evaluation of force generation mechanisms in natural, passive hydraulic actuators. Sci. Rep. 6 (2016), 18105.
[47]
Chiwon Lee, Myungjoon Kim, Yoon Jae Kim, Nhayoung Hong, Seungwan Ryu, H. Jin Kim, and Sungwan Kim. 2017. Soft robot review. Int. J. Contr. Autom. Syst. 15, 1 (2017), 3–15.
[48]
Marlén López, Ramón Rubio, Santiago Martín, and Ben Croxford. 2017. How plants inspire façades. From plants to architecture: Biomimetic principles for the development of adaptive architectural envelopes. Renew. Sust. Energy Rev. 67 (2017), 692–703.
[49]
Ellen Lupton. 2007. Skin: Surface, Substance, and Design. Princeton Architectural Press.
[50]
Andrew D. Marchese, Robert K. Katzschmann, and Daniela Rus. 2015. A recipe for soft fluidic elastomer robots. Soft Robot. 2, 1 (2015), 7–25.
[51]
Milena Marinković, Jürgen Berger, and Gáspár Jékely. 2020. Neuronal coordination of motile cilia in locomotion and feeding. Philos. Trans. Roy. Soc. B 375, 1792 (2020), 20190165.
[52]
Jennifer A. Mather, Roland C. Anderson, and James B. Wood. 2013. Octopus: The Ocean’s Intelligent Invertebrate. Timber Press.
[53]
Barbara Mazzolai and Cecilia Laschi. 2020. A vision for future bioinspired and biohybrid robots. Sci. Robot. 5, 38 (2020), eaba6893.
[54]
Richard A. McFarland. 1985. Relationship of skin temperature changes to the emotions accompanying music. Biofeedback Self-regul. 10, 3 (1985), 255–267.
[55]
Conor McGinn and Dylan Dooley. 2020. What should robots feel like? In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction. ACM/IEEE, 281–288.
[56]
Carlo Menon, Michael Murphy, and Metin Sitti. 2004. Gecko inspired surface climbing robots. In Proceedings of the IEEE International Conference on Robotics and Biomimetics. IEEE, 431–436.
[57]
Adi Meyer, Sirou Peng, and Silvia Rueda. 2019. APOSEMA: Exploring communication in an apathetic future. In Design and Semantics of Form and Movement. DeSForM, 241.
[58]
Takashi Minato, Yuichiro Yoshikawa, Tomoyuki Noda, Shuhei Ikemoto, Hiroshi Ishiguro, and Minoru Asada. 2007. CB2: A child robot with biomimetic body for cognitive developmental robotics. In Proceedings of the 7th IEEE-RAS International Conference on Humanoid Robots. IEEE, 557–562.
[59]
William Montagna. 2012. The Structure and Function of Skin. Elsevier.
[60]
Akira Nakayasu. 2016. Luminescent tentacles: A scalable SMA motion display. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology. ACM, 33–34.
[61]
Kristin Neidlinger, Lianne Toussaint, Edwin Dertien, Khiet P. Truong, Hermie Hermens, and Vanessa Evers. 2019. Emotional prosthesis for animating awe through performative biofeedback. In Proceedings of the 23rd International Symposium on Wearable Computers. ACM, 312–317.
[62]
Takuya Nojima, Yoshiharu Ooide, and Hiroki Kawaguchi. 2013. Hairlytop interface: An interactive surface display comprised of hair-like soft actuators. In Proceedings of the World Haptics Conference (WHC’13). IEEE, 431–435.
[63]
Masaru Ohkubo, Miki Yamamura, Hiroko Uchiyama, and Takuya Nojima. 2014. Breathing clothes: Artworks using the hairlytop interface. In Proceedings of the 11th Conference on Advances in Computer Entertainment Technology. ACM, 1–4.
[64]
Denis Peña and Fumihide Tanaka. 2018. Touch to feel me: Designing a robot for thermo-emotional communication. In Companion of the ACM/IEEE International Conference on Human-Robot Interaction. ACM/IEEE, 207–208.
[65]
J. H. Pikul, S. Li, H. Bai, R. T. Hanlon, I. Cohen, and R. F. Shepherd. 2017. Stretchable surfaces with programmable 3D texture morphing for synthetic camouflaging skins. Science 358, 6360 (2017), 210–214.
[66]
Hayes Raffle, James Tichenor, and Hiroshi Ishii. 2004. Super Cilia skin: A textural interface. Textile 2, 3 (2004), 328–347.
[67]
Ahmad Rafsanjani, Yuerou Zhang, Bangyuan Liu, Shmuel M. Rubinstein, and Katia Bertoldi. 2018. Kirigami skins make a simple soft actuator crawl. Sci. Robot. 3, 15 (2018), 1–7.
[68]
Paul Robinette, Alan R. Wagner, and Ayanna M. Howard. 2014. Assessment of robot guidance modalities conveying instructions to humans in emergency situations. In Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication. IEEE, 1043–1049.
[69]
Arthur Seibel and Lars Schiller. 2018. Systematic engineering design helps creating new soft machines. Robot. Biomimet. 5, 1 (2018), 5.
[70]
Hikaru Senbonmatsu and Fumihide Tanaka. 2019. Robot with an olfactory display: Decorating its movements by smells. In Proceedings of the 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN’19). IEEE, 1–5.
[71]
Phoebe Sengers and Bill Gaver. 2006. Staying open to interpretation: Engaging multiple meanings in design and evaluation. In Proceedings of the 6th Conference on Designing Interactive Systems. ACM, 99–108.
[72]
Robert F. Shepherd, Adam A. Stokes, Jacob Freake, Jabulani Barber, Phillip W. Snyder, Aaron D. Mazzeo, Ludovico Cademartiri, Stephen A. Morin, and George M. Whitesides. 2013. Using explosions to power a soft robot. Angew. Chem. Int. Ed. 52, 10 (2013), 2892–2896.
[73]
David Silvera-Tawil, David Rye, and Mari Velonaki. 2014. Interpretation of social touch on an artificial arm covered with an EIT-based sensitive skin. Int. J. Soc. Robot. 6, 4 (2014), 489–505.
[74]
David Silvera-Tawil, David Rye, and Mari Velonaki. 2015. Artificial skin and tactile sensing for socially interactive robots: A review. Robot. Auton. Syst. 63 (2015), 230–243.
[75]
Walter Dan Stiehl and Cynthia Breazeal. 2005. Affective touch for robotic companions. In Proceedings of the International Conference on Affective Computing and Intelligent Interaction. Springer, 747–754.
[76]
Walter Dan Stiehl, Jeff Lieberman, Cynthia Breazeal, Louis Basel, Levi Lalla, and Michael Wolf. 2005. Design of a therapeutic robotic companion for relational, affective touch. In Proceedings of the IEEE International Workshop on Robot and Human Interactive Communication (ROMAN’05). IEEE, 408–415.
[77]
Kazunori Terada, Atsushi Yamauchi, and Akira Ito. 2012. Artificial emotion expression for a robot by dynamic color change. In Proceedings of the 21st IEEE International Symposium on Robot and Human Interactive Communication (ROMAN’12). IEEE, 314–321.
[78]
Christopher A. Thorstenson, Andrew J. Elliot, Adam D. Pazda, David I. Perrett, and Dengke Xiao. 2018. Emotion-color associations in the context of the face. Emotion 18, 7 (2018), 1032.
[79]
Francesca Tramacere, Lucia Beccai, Fabio Mattioli, Edoardo Sinibaldi, and Barbara Mazzolai. 2012. Artificial adhesion mechanisms inspired by octopus suckers. In Proceedings of the IEEE International Conference on Robotics and Automation. IEEE, 3846–3851.
[80]
Albert Van Eeckhout, Enrique Garcia-Caurel, Teresa Garnatje, Juan Carlos Escalera, Mercè Durfort, Josep Vidal, José J. Gil, Juan Campos, and Angel Lizana. 2021. Polarimetric imaging microscopy for advanced inspection of vegetal tissues. Sci. Rep. 11, 1 (2021), 1–12.
[81]
Sanne van Waveren, Linnéa Björklund, Elizabeth J. Carter, and Iolanda Leite. 2019. Knock on wood: The effects of material choice on the perception of social robots. In Proceedings of the International Conference on Social Robotics. Springer, 211–221.
[82]
Ramiro Velázquez, Edwige E. Pissaloux, and Michael Wiertlewski. 2006. A compact tactile display for the blind with shape memory alloys. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA’06). IEEE, 3905–3910.
[83]
Tanja S. H. Wingenbach, Mark Brosnan, Monique C. Pfaltz, Peter Peyk, and Chris Ashwin. 2020. Perception of discrete emotions in others: Evidence for distinct facial mimicry patterns. Sci. Rep. 10, 1 (2020), 1–13.
[84]
Lining Yao, Jifei Ou, Chin-Yi Cheng, Helene Steiner, Wen Wang, Guanyun Wang, and Hiroshi Ishii. 2015. BioLogic: Natto cells as nanoactuators for shape changing interfaces. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM, 1–10.
[85]
Steve Yohanan and Karon E. MacLean. 2011. Design and assessment of the haptic creature’s affect display. In Proceedings of the 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI’11). IEEE, 473–480.
[86]
Steve Yohanan and Karon E. MacLean. 2012. The role of affective touch in human-robot interaction: Human intent and expectations in touching the haptic creature. Int. J. Soc. Robot. 4, 2 (2012), 163–180.

Published In

ACM Transactions on Human-Robot Interaction, Volume 12, Issue 2
Special Issue on Designing the Robot Body: Critical Perspectives on Affective Embodied Interaction
June 2023
316 pages
EISSN: 2573-9522
DOI: 10.1145/3586023

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 14 April 2023
Online AM: 09 May 2022
Accepted: 24 February 2022
Revised: 15 November 2021
Received: 27 May 2021
Published in THRI Volume 12, Issue 2

Author Tags

  1. Robot skin
  2. texture change
  3. human–robot interaction
  4. soft robots
  5. bio-inspired design
  6. pneumatics

Funding Sources

  • National Science Foundation
