
Human-Computer Interaction on the Skin

Published: 30 August 2019

Abstract

The skin offers exciting possibilities for human-computer interaction by enabling new types of input and feedback. We survey 42 research papers on interfaces that allow users to give input on their skin. Skin-based interfaces have developed rapidly over the past 8 years, but most work consists of individual prototypes, with limited overview of the possibilities or identification of research directions. The purpose of this article is to synthesize what skin input is, which technologies can sense input on the skin, and how to give feedback to the user. We discuss challenges for research in each of these areas.




Published In

ACM Computing Surveys, Volume 52, Issue 4 (July 2020), 769 pages
ISSN: 0360-0300
EISSN: 1557-7341
DOI: 10.1145/3359984
Editor: Sartaj Sahni
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 30 August 2019
    Accepted: 01 May 2019
    Revised: 01 April 2019
    Received: 01 May 2018
    Published in CSUR Volume 52, Issue 4


    Author Tags

    1. Skin input
    2. on-body interaction
    3. tracking technologies

    Qualifiers

    • Survey
    • Research
    • Refereed

    Funding Sources

    • European Research Council (ERC) under the European Union’s Horizon 2020 Research and Innovation Program


    Cited By

    • (2024) Optimization measures for students' autonomous learning based on deep learning and human-computer interaction technology. Journal of Computational Methods in Sciences and Engineering 24, 4-5 (2024), 3079-3091. DOI: 10.3233/JCM-247554
    • (2024) EgoTouch: On-Body Touch Input Using AR/VR Headset Cameras. In Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-11. DOI: 10.1145/3654777.3676455
    • (2024) BodyTouch. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7, 4 (2024), 1-22. DOI: 10.1145/3631426
    • (2024) RadarHand: A Wrist-Worn Radar for On-Skin Touch-Based Proprioceptive Gestures. ACM Transactions on Computer-Human Interaction 31, 2 (2024), 1-36. DOI: 10.1145/3617365
    • (2024) PhoneInVR: An Evaluation of Spatial Anchoring and Interaction Techniques for Smartphone Usage in Virtual Reality. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3613904.3642582
    • (2024) TriPad: Touch Input in AR on Ordinary Surfaces with Hand Tracking Only. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3613904.3642323
    • (2024) Fingerhinter Takes Center Stage: User Experience Insights from Informal Encounters with a Finger-Augmentation Device. In 2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR), 265-269. DOI: 10.1109/AIxVR59861.2024.00044
    • (2023) SolareSkin: Self-powered Visible Light Sensing Through a Solar Cell E-Skin. In Adjunct Proceedings of the 2023 ACM International Joint Conference on Pervasive and Ubiquitous Computing & the 2023 ACM International Symposium on Wearable Computing, 664-669. DOI: 10.1145/3594739.3612904
    • (2023) Fingerhints: Understanding Users' Perceptions of and Preferences for On-Finger Kinesthetic Notifications. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3581022
    • (2023) Understanding Wheelchair Users' Preferences for On-Body, In-Air, and On-Wheelchair Gestures. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3544548.3580929
