
Multimodal feedback in HCI: haptics, non-speech audio, and their applications

Published: 24 April 2017

Abstract

Computer interfaces have traditionally depended on visual feedback to provide information to users, with large, high-resolution screens the norm. Other sensory modalities, such as haptics and audio, have great potential to enrich the interaction between user and device, enabling new types of interaction for new user groups in new contexts. This chapter provides an overview of research into the use of these non-visual modalities for interaction, showing how new output modalities can be used in the user interfaces of different devices. The modalities discussed include:
  • Haptics: tactons (vibrotactile feedback), thermal (warming and cooling feedback), force feedback, and deformable devices;
  • Non-Speech Audio: auditory icons, Earcons, musicons, sonification, and spatial audio output.
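To give a concrete flavor of these cues, the following sketch (not taken from the chapter; a minimal Python illustration) encodes a tacton as a sequence of vibration pulses and an Earcon as a short structured note motif. The play_tacton and play_earcon functions are hypothetical stand-ins for whatever actuator driver or audio synthesizer a real system would use.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TactonPulse:
    duration_ms: int   # how long the actuator is driven
    amplitude: float   # 0.0-1.0 drive strength
    frequency_hz: int  # carrier frequency of the vibration

@dataclass
class EarconNote:
    pitch_midi: int    # MIDI note number (60 = middle C)
    duration_ms: int
    timbre: str        # e.g. "marimba", "brass"

# A "new message" tacton: two short, strong pulses separated by a gap.
NEW_MESSAGE_TACTON: List[TactonPulse] = [
    TactonPulse(80, 1.0, 250),
    TactonPulse(40, 0.0, 250),  # gap encoded as zero amplitude
    TactonPulse(80, 1.0, 250),
]

# A "file deleted" Earcon: a short descending three-note motif.
FILE_DELETED_EARCON: List[EarconNote] = [
    EarconNote(72, 120, "marimba"),
    EarconNote(67, 120, "marimba"),
    EarconNote(60, 240, "marimba"),
]

def play_tacton(pulses: List[TactonPulse]) -> None:
    """Hypothetical: forward the pulse train to a vibrotactile actuator driver."""
    for p in pulses:
        print(f"vibrate {p.duration_ms} ms @ {p.amplitude:.1f}, {p.frequency_hz} Hz")

def play_earcon(notes: List[EarconNote]) -> None:
    """Hypothetical: forward the motif to an audio synthesis back end."""
    for n in notes:
        print(f"note {n.pitch_midi} for {n.duration_ms} ms ({n.timbre})")
```

Encoding cues as parameter sequences like this reflects how tactons and Earcons carry meaning: through rhythm, intensity, frequency, and timbre rather than through speech or text.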
One motivation for using multiple modalities in a user interface is that interaction can be distributed across the different senses or control capabilities of the person using it. If one modality is fully utilized or unavailable (e.g., due to sensory or situational impairment), then another can be exploited to ensure the interaction succeeds. For example, when walking and using a mobile phone, a user needs to focus their visual attention on the environment to avoid bumping into other people. A complex visual interface on the phone may make this difficult. However, haptic or audio feedback would allow them to use their phone and navigate the world at the same time.
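One minimal way to picture this fallback behavior (an illustrative sketch under assumed conditions, not an interface described in the chapter) is a function that picks an output channel based on which modalities are currently free:

```python
def choose_feedback_channel(visual_busy: bool,
                            audio_available: bool,
                            haptic_available: bool) -> str:
    """Pick an output modality that is free, falling back in a fixed order.

    visual_busy: True when the user's eyes are needed on the environment
    (e.g. while walking) or the screen cannot safely be looked at.
    """
    if not visual_busy:
        return "visual"
    if haptic_available:
        return "haptic"   # e.g. a vibrotactile pattern on the phone
    if audio_available:
        return "audio"    # e.g. an auditory icon or Earcon
    return "visual"       # nothing else is free; defer or simplify the display

# A pedestrian using their phone while walking:
print(choose_feedback_channel(visual_busy=True,
                              audio_available=True,
                              haptic_available=True))  # -> "haptic"
```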
This chapter does not present background on multisensory perception and multimodal action; for insights on those topics, see Chapter 2. Chapter 3 also specifically discusses multisensory haptic interaction and the process of designing for it. As a complement, this chapter presents a range of applications where multimodal feedback involving haptics or non-speech audio can provide usability benefits, motivated by Wickens' Multiple Resources Theory [Wickens 2002]. The premise of this theory is that tasks can be performed better and with fewer cognitive resources when they are distributed across modalities. For example, when driving, which is a largely visual task, route guidance is better presented through sound than through a visual display, because a visual display would compete with driving for visual cognitive resources. Similarly, making calls or texting by hand while driving, both manual tasks, is harder than voice dialing, because speech and manual control draw on different modalities. For user interface design, it is important to distribute tasks across modalities so that the user is not overloaded and the interaction can succeed.
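At design time, this reasoning can be approximated by tagging each task with the perceptual and response resources it loads and flagging collisions with the primary task. The sketch below uses rough, assumed resource labels and is only a loose illustration of the idea behind Multiple Resources Theory, not Wickens' actual model:

```python
# Crude resource tags loosely inspired by Multiple Resources Theory:
# each task lists the perceptual and response channels it loads.
TASK_RESOURCES = {
    "driving":          {"visual", "manual"},
    "visual_route_map": {"visual"},
    "spoken_guidance":  {"auditory"},
    "manual_dialing":   {"visual", "manual"},
    "voice_dialing":    {"auditory", "vocal"},
}

def conflicts_with(primary: str, secondary: str) -> set:
    """Return the resources the two tasks would have to share."""
    return TASK_RESOURCES[primary] & TASK_RESOURCES[secondary]

for task in ("visual_route_map", "spoken_guidance",
             "manual_dialing", "voice_dialing"):
    clash = conflicts_with("driving", task)
    print(f"{task:18s} conflicts with driving on: {clash or 'nothing'}")
```

In this toy check, a visual route map and manual dialing collide with driving on the visual and manual resources, while spoken guidance and voice dialing do not, matching the examples above.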

References

[1]
M. Ablaßmeier, T. Poitschke, F. Wallhoff, K. Bengler, and G. Rigoll. 2007. Eye gaze studies comparing head-up and head-down displays in vehicles. IEEE International Conference on Multimedia and Expo, pp. 2250--2252. 305
[2]
M. Akamatsu and I. MacKenzie. 1996. Movement characteristics using a mouse with tactile and force feedback. International Journal of Human-Computer Interaction, 483--493. 297
[3]
M. Akamatsu and S. Sato. 1994. A multi-modal mouse with tactile and force feedback. International Journal of Human-Computer Studies, 40(3):443--453. 285
[4]
J. Alexander, A. Lucero, and S. Subramanian. 2012. Tilt displays: designing display surfaces with multi-axis tilting and actuation. Proceedings MobileHCI '12, pp. 161--170.
[5]
R. L. Alexander, S. O'Modhrain, D. A. Roberts, J. A. Gilbert, and T. H. Zurbuchen. 2014. The bird's ear view of space physics: Audification as a tool for the spectral analysis of time series data. Journal of Geophysical Research Space Physics 119(7):5259--5271. 292
[6]
H. Alm and L. Nilsson. 1994. Changes in driver behaviour as a function of handsfree mobile phones - A simulator study. Accident Analysis and Prevention 26(4):441--451. 304
[7]
C. L. Baldwin, J. L. Eisert, A. Garcia, B. Lewis, S. Pratt, and C. Gonzalez. 2012. Multimodal urgency coding: auditory, visual, and tactile parameters and their impact on perceived urgency. WORK: A Journal of Prevention, Assessment & Rehabilitation, 41:3586--3591. 296
[8]
L. Barnard, J. S. Yi, J. A. Jacko, and A. Sears. 2005. An empirical comparison of use-in-motion evaluation scenarios for mobile computing devices. International Journal of Human-Computer Studies, 62(4):487--520. 302
[9]
S. Barrass and G. Kramer. 1999. Using sonification. Multimedia Systems, 7(1):23--31. 293
[10]
M. Blattner, D. Sumikawa, and R. Greenberg. 1989. Earcons and icons: their structure and common design principles. Human-Computer Interaction, 4(1):11--44. 279, 290, 613
[11]
J. R. Blum, M. Bouchard, and J. R. Cooperstock. 2012. What's around me? Spatialized audio augmented reality for blind users with a smartphone. Proceedings of the Mobiquitous '12, pp. 49--62. 294, 302
[12]
S. Brewster. 1998. Using nonspeech sounds to provide navigation cues. ACM Transactions on Computer-Human Interaction, 5(3):224--259. 291
[13]
S. Brewster. 2002. Overcoming the Lack of Screen Space on Mobile Computers. Personal and Ubiquitous Computing 6(3):188--205. 303
[14]
S. Brewster and L. M. Brown. 2004. Tactons: structured tactile messages for non-visual information display. Proceedings of the AUIC '04, pp. 15--23. 279, 282, 631
[15]
S. Brewster, F. Chohan, and L. Brown. 2007. Tactile feedback for mobile interactions. Proceedings of the CHI '07, pp. 159--162. 303
[16]
S. Brewster, D. K. McGookin, and C. A. Miller. 2006. Olfoto: designing a smell-based interaction. Proceedings of the CHI '06, pp. 653--662.
[17]
S. Brewster, P. C. Wright, and A. D. N. Edwards. 1992. A detailed investigation into the effectiveness of earcons. Proceedings of the ICAD '92, pp. 471--498. 290
[18]
L. Brown, S. Brewster, and H. C. Purchase. 2005. A first investigation into the effectiveness of tactons. Proceedings of the WHC '05, pp. 167--176. 282, 296
[19]
L. Brown, S. Brewster, and H. Purchase. 2006. Multidimensional tactons for non-visual information presentation in mobile devices. Proceedings of the MobileHCI '06, pp. 231--238. 282, 284, 296
[20]
S. Cardin, D. Thalmann, and F. Vexo. 2007. A wearable system for mobility improvement of visually impaired people. The Visual Computer 23(2):109--118. 301
[21]
T. Carter, S. A. Seah, B. Long, B. Drinkwater, and S. Subramanian. 2013. UltraHaptics: multipoint mid-air haptic feedback for touch surfaces. Proceedings of the UIST '13, pp. 505--514. 286, 307
[22]
S. Choi and K. J. Kuchenbecker. 2013. Vibrotactile display: perception, technology, and applications. Proceedings of the IEEE, 101(9):2093--2104. 283
[23]
R. W. Cholewiak, J. C. Brill, and A. Schwab. 2004. Vibrotactile localization on the abdomen: effects of place and space. Perception & Psychophysics, 66(6):970--987. 283
[24]
R. W. Cholewiak and A. A. Collins. 2003. Vibrotactile localization on the arm: effects of place, space, and age. Perception & Psychophysics, 65(6):1058--1077. 283
[25]
P. R. Cohen. 2017. Multimodal Speech and Pen Interfaces. In S. Oviatt, B. Schuller, P. Cohen, D. Sonntag, G. Potamianos, and A. Krüger, editors, Handbook of Multimodal-Multisensor Interfaces, Volume 1: Foundations, User Modeling, and Common Modality Combinations. Morgan & Claypool, Williston, VT. 288
[26]
A. Crossan and S. Brewster. 2008. Multimodal trajectory playback for teaching shape information and trajectories to visually impaired computer users. ACM Transactions on Accessible Computing 1(2):1--34. 299
[27]
F. Dombois and G. Eckel. 2011. Audification. In T. Hermann, A. Hunt, and J. G. Neuhoff, editors, The Sonification Handbook, pp. 301--324. Logos Publishing House, Berlin. 292
[28]
G. Flores, S. Kurniawan, R. Manduchi, E. Martinson, L. M. Morales, and E. A. Sisbot. 2015. Vibrotactile guidance for wayfinding of blind walkers. IEEE Transactions on Haptics, 8(3):306--317. 301
[29]
S. Follmer, D. Leithinger, A. Olwal, N. Cheng, and H. Ishii. 2012. Jamming user interfaces: programmable particle stiffness and sensing for malleable and shape-changing devices. Proceedings of the UIST '12, pp. 519--528. 287
[30]
S. Follmer, D. Leithinger, A. Olwal, A. Hogge, and H. Ishii. 2013. inFORM: dynamic physical affordances and constraints through shape and object actuation. Proceedings of the UIST '13, pp. 417--426. 307
[31]
E. Freeman, S. Brewster, and V. Lantz. 2014. Tactile feedback for above-device gesture interfaces: adding touch to touchless interactions. Proceedings of the ICMI '14, pp. 419--426. 304
[32]
E. Freeman, S. Brewster, and V. Lantz. 2015. Towards in-air gesture control of household appliances with limited displays. Proceedings INTERACT '15 Posters in LNCS 9299, pp. 611--615. 305
[33]
E. Freeman, S. Brewster, and V. Lantz. 2016. Do that, there: an interaction technique for addressing in-air gesture systems. Proceedings CHI '16. 304, 305
[34]
J. P. Fritz and K. E. Barrier. 1999. Design of a haptic data visualization system for people with visual impairments. IEEE Transactions on Rehabilitation Engineering, 7(3):372--384. 298
[35]
S. Gallo, G. Rognini, L. Santos-Carreras, T. Vouga, O. Blanke, and H. Bleuler. 2015. Encoded and crossmodal thermal stimulation through a fingertip-sized haptic display. Frontiers in Robotics and AI 2(Oct.):1--12. 297
[36]
W. Gaver. 1986. Auditory icons: using sound in computer interfaces. Human-Computer Interaction, 2(2):167--177. 279, 289, 610
[37]
W. Gaver. 1989. The SonicFinder: an interface that uses auditory icons. Human-Computer Interaction, 4(1):67--94. 289, 290
[38]
A. Girouard, J. Lo, M. Riyadh, F. Daliri, A. K. Eady, and J. Pasquero. 2015. One-handed bend interactions with deformable smartphones. Proceedings of the CHI '15, ACM Press, pp. 1509--1518. 287
[39]
R. Gray, C. Spence, C. Ho, and H. Tan. 2013. Efficient Multimodal Cuing of Spatial Attention. Proceedings of the IEEE 101(9):2113--2122. 296
[40]
T. Hermann. 2011. Model-based sonification. In T. Hermann, A. Hunt, and J. G. Neuhoff, editors, The Sonification Handbook, pp. 399--427. Logos Publishing House, Berlin. 293
[41]
T. Hermann and H. Ritter. 1999. Listen to your data: model-based sonification for data analysis. International Symposium on Intelligent Multimedia and Distance Education. pp. 189--194. 292
[42]
T. Hermann and H. Ritter. 2004. Sound and meaning in auditory data display. Proceedings of the IEEE, 92(4):730--741. 293
[43]
W. Heuten, N. Henze, S. Boll, and M. Pielot. 2008. Tactile wayfinder: a non-visual support system for wayfinding. Proceedings of the NordiCHI '08, p. 172. 301
[44]
C. Ho and C. Spence. 2005. Assessing the effectiveness of various auditory cues in capturing a driver's visual attention. Journal of Experimental Psychology: Applied 11(3):157--174. 305
[45]
C. Ho and C. Spence. 2008. The multisensory driver: Implications for ergonomic car interface design. Ashgate Publishing. 305
[46]
C. Ho, H. Z. Tan, and C. Spence. 2005. Using spatial vibrotactile cues to direct visual attention in driving scenes. Transportation Research Part F: Traffic Psychology and Behaviour, 8(6):397--412. 305
[47]
E. Hoggan and S. Brewster. 2006. Crossmodal icons for information display. CHI '06 Extended Abstracts, p. 857. 295
[48]
E. Hoggan and S. Brewster. 2007a. Designing audio and tactile crossmodal icons for mobile devices. Proceedings of the ICMI '07, pp. 162--169. 295, 296
[49]
E. Hoggan and S. Brewster. 2007b. New parameters for tacton design. CHI '07 Extended Abstracts, pp. 2417--2422. 282
[50]
E. Hoggan, S. Brewster, and J. Johnston. 2008. Investigating the effectiveness of tactile feedback for mobile touchscreens. Proceedings of the CHI '08, p. 1573. 303
[51]
E. Hoggan, R. Raisamo, and S. Brewster. 2009. Mapping information to audio and tactile icons. Proceedings of the ICMI '09, pp. 327--334. 296
[52]
S. Holland, D. R. Morse, and H. Gedenryd. 2002. AudioGPS: spatial audio navigation with a minimal attention interface. Personal and Ubiquitous Computing, 6:253--259. 294, 301
[53]
A. Ion, E. J. Wang, and P. Baudisch. 2015. Skin drag displays: dragging a physical tactor across the user's skin produces a stronger tactile stimulus than vibrotactile. Proceedings of CHI '15, ACM Press, pp. 2501--2504. 282
[54]
H. Ishii, D. Lakatos, L. Bonanni, and J.-B. Labrune. 2012. Radical atoms: beyond tangible bits, toward transformable materials. ACM Interactions, 38--51. 307
[55]
T. Iwamoto, M. Tatezono, and H. Shinoda. 2008. Non-contact method for producing tactile sensation using airborne ultrasound. Proceedings of the EuroHaptics '08, Springer, pp. 504--513. 286, 307
[56]
L. A. Johnson and C. M. Higgins. 2006. A navigation aid for the blind using tactile-visual sensory substitution. 2006 International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 6289--6292. 301
[57]
A. Jylhä, Y-T. Hsieh, V. Orso, S. Andolina, L. Gamberini, and G. Jacucci. 2015. A Wearable Multimodal Interface for Exploring Urban Points of Interest. Proceedings of the ICMI '15, pp. 175--182. 302
[58]
R. Kajastila and T. Lokki. 2013. Eyes-free interaction with free-hand gestures and auditory menus. International Journal of Human-Computer Interaction, 71(5):627--640. 294
[59]
S. K. Kane, M. Ringel Morris, and J. O. Wobbrock. 2013. Touchplates: low-cost tactile overlays for visually impaired touch screen users. Proceedings of the ASSETS '13, Article 22. 303
[60]
A. Katsamanis. 2017. Multimodal Gesture Recognition. In S. Oviatt, B. Schuller, P. Cohen, D. Sonntag, G. Potamianos, and A. Krüger, editors, Handbook of Multimodal-Multisensor Interfaces, Volume 1: Foundations, User Modeling, and Common Modality Combinations. Morgan & Claypool, Williston, VT. 288
[61]
J. Kjeldskov and J. Stage. 2004. New techniques for usability evaluation of mobile systems. International Journal Human-Computer Studies, 60(5--6):599--620. 302
[62]
G. Kramer. 1993. Auditory Display: Sonification, Audification, and Auditory Interfaces. Perseus Publishing. 292
[63]
M. Kyriakidis, R. Happee, and J. C. F. de Winter. 2014. Public opinion on automated driving: Results of an international questionnaire among 5,000 respondents. SSRN 2506579. 305, 306
[64]
B. Lahey, A. Girouard, W. Burleson, and R. Vertegaal. 2011. PaperPhone: understanding the use of bend gestures in mobile devices with flexible electronic paper displays. Proceedings of the CHI '11, pp. 1303--1312. 287
[65]
S. J. Lederman and R. L. Klatzky. 1987. Hand movements: A window into haptic object recognition. Cognitive Psychology 19:342--368. 279, 281
[66]
D. Leithinger, D. Lakatos, A. DeVincenzi, M. Blackshaw, and H. Ishii. 2011. Direct and Gestural Interaction with Relief: A 2.5D Shape Display. Proceedings of the UIST '11, pp. 541--548. 307
[67]
K. Li, P. Baudisch, W. G. Griswold, and J. D. Hollan. 2008. Tapping and rubbing: exploring new dimensions of tactile feedback with voice coil motors. Proceedings of the UIST '08, ACM Press, p. 181. 282
[68]
G. Lohse. 1997. Models of graphical perception. In M. Helander, T. Landauer, and P. Prabhu, editors, Handbook of Human-Computer Interaction, pp. 107--135. Elsevier, Amsterdam. 298
[69]
K. MacLean. 2008a. Haptic interaction design for everyday interfaces. In M. Carswell, editor, Reviews of Human Factors and Ergonomics, pp. 149--194. Human Factors and Ergonomics Society. 279
[70]
J. R. Marston, J. M. Loomis, R. L. Klatzky, and R. G. Golledge. 2007. Nonvisual route following with guidance from a simple haptic or auditory display. Journal of Visual Impairment & Blindness, 101(4):203--211. 301
[71]
T. H. Massie and J. K. Salisbury. 1994. The PHANTOM haptic interface: a device for probing virtual objects. Proceedings of the ASME Dynamic Systems and Control Division, pp. 295--299. 285
[72]
M. McGee-Lennon, M. Wolters, R. McLachlan, S. Brewster, and C. Hall. 2011. Name that tune: musicons as reminders in the home. Proceedings of the CHI '11, p. 2803. 279, 291, 292
[73]
D. K. McGookin and S. Brewster. 2004. Understanding concurrent Earcons: applying auditory scene analysis principles to concurrent Earcon recognition. ACM Transactions on Applied Perception, 1(2):130--155. 291
[74]
D. K. McGookin and S. Brewster. 2006. Graph builder: constructing non-visual visualizations. Proceedings of the HCI'06 Conference on People and Computers XX, pp. 263--278. 286, 298, 299
[75]
D. K. McGookin, E. Robertson, and S. Brewster. 2010. Clutching at straws: using tangible interaction to provide non-visual access to graphs. Proceedings of the CHI '10, ACM Press, pp. 1715--1724.
[76]
R. McLachlan, D. Boland, and S. Brewster. 2014. Transient and transitional states: pressure as an auxiliary input modality for bimanual interaction. Proceedings of the CHI '14, pp. 401--410. 287, 288
[77]
R. McLachlan, M. McGee-Lennon, and S. Brewster. 2012. The sound of musicons: investigating the design of musically derived audio cues. Proceedings of the ICAD '12, pp. 148--155. 292
[78]
A. Meschtscherjakov, M. Tscheligi, D. Szostak, R. Ratan, R. McCall, I. Politis, and S. Krome. 2015. Experiencing autonomous vehicles: crossing the boundaries between a drive and a ride. CHI 2015 Extended Abstracts, pp. 2413--2416. 306
[79]
T. Murakami and N. Nakajima. 1994. Direct and intuitive input device for 3-D shape deformation. Proceedings of the CHI '94, pp. 465--470. 287
[80]
F. Naujoks, C. Mai, and A. Neukum. 2014. The effect of urgency of take-over requests during highly automated driving under distraction conditions. Proceedings of the AHFE '14, pp. 431--438. 306
[81]
M. A. Nees and B. N. Walker. 2007. Listener, task and auditory graph: toward a conceptual model of auditory graph comprehension. Proceedings of the ICAD '07, pp. 266--273. 293
[82]
A. Ng, S. Brewster, and J. H. Williamson. 2013. The impact of encumbrance on mobile interactions. Proceedings of the INTERACT '13, pp. 92--109. 302
[83]
A. Ng, S. A. Brewster, and J. H. Williamson. 2014. Investigating the effects of encumbrance on one- and two-handed interactions with mobile devices. Proceedings of the CHI '14, pp. 1981--1990. 302
[84]
J. G. Neuhoff, G. Kramer, N. W. B. Lane, and J. Wayand. 2000. Sonification and the interaction of perceptual dimensions: Can the data get lost in the map? Proceedings of the ICAD '00. 294
[85]
I. Oakley, M. R. McGee, S. Brewster, and P. Gray. 2000. Putting the feel in "look and feel". Proceedings of the CHI '00, pp. 415--422. 286
[86]
S. Pauletto and A. Hunt. 2005. A comparison of audio and visual analysis of complex time-series data sets. Proceedings of the ICAD '05, pp. 175--181. 292
[87]
B. Plimmer, A. Crossan, S. Brewster, and R. Blagojevic. 2008. Multimodal collaborative handwriting training for visually-impaired people. Proceedings of the CHI '08, p. 393. 299, 300
[88]
B. Plimmer, P. Reid, R. Blagojevic, A. Crossan, and S. Brewster. 2011. Signing on the tactile line. ACM Transactions on Computer-Human Interaction, 18(3):1--29. 299, 300
[89]
I. Politis, S. Brewster, and F. Pollick. 2013. Evaluating multimodal driver displays of varying urgency. Proceedings of the Automotive UI '13, pp. 92--99. 296, 297, 305
[90]
I. Politis, S. Brewster, and F. Pollick. 2014. Evaluating multimodal driver displays under varying situational urgency. Proceedings of the CHI '14, pp. 4067--4076. 305
[91]
I. Politis, S. Brewster, and F. Pollick. 2015a. Language-based multimodal displays for the handover of control in autonomous cars. Proceedings of the Automotive UI '15, pp. 3--10. 296, 306
[92]
I. Politis, S. Brewster, and F. Pollick. 2015b. To beep or not to beep? Comparing abstract versus language-based multimodal driver displays. Proceedings of the CHI '15, pp. 3971--3980. 296
[93]
A. Potamianos. 2017. Audio and Visual Modality Combination in Speech Processing Applications. In S. Oviatt, B. Schuller, P. Cohen, D. Sonntag, G. Potamianos, and A. Krüger, editors, Handbook of Multimodal-Multisensor Interfaces, Volume 1: Foundations, User Modeling, and Common Modality Combinations. Morgan & Claypool, Williston, VT. 288
[94]
I. Poupyrev, S. Maruyama, and J. Rekimoto. 2002. Ambient touch: designing tactile interfaces for handheld devices. Proceedings of the UIST '02. 303
[95]
C. Rendl, D. Kim, P. Parzer, et al. 2016. FlexCase: enhancing mobile interaction with a flexible sensing and display cover. Proceedings of the CHI '16, ACM Press, pp. 5138--5150. 288
[96]
S. Robinson, C. Coutrix, J. Pearson, et al. 2016. Emergeables: deformable displays for continuous eyes-free mobile interaction. Proceedings of the CHI '16, ACM Press, pp. 3793--3805. 307
[97]
A. Roudaut, A. Karnik, M. Löchtefeld, and S. Subramanian. 2013. Morphees: toward high "shape resolution" in self-actuated flexible mobile devices. Proceedings of the CHI '13, pp. 593--602. 307
[98]
SAE International Standard J3016. 2014. Taxonomy and definitions for terms related to on-road motor vehicle automated driving systems. 306
[99]
D. D. Salvucci. 2001. Predicting the effects of in-car interface use on driver performance: an integrated model approach. International Journal of Human-Computer Studies, 55(1):85--107. 304
[100]
C. Schwesig, I. Poupyrev, and E. Mori. 2004. Gummi: a bendable computer. Proceedings of the CHI '04, pp. 263--270. 287
[101]
P. Sines and B. Das. 1999. Peltier Haptic Interface (PHI) for improved sensation of touch in virtual environments. Virtual Reality, 4(4):260--264. 284
[102]
S. D. Speeth. 1961. Seismometer sounds. The Journal of the Acoustical Society of America, 33(7):909. 292
[103]
M. A. Srinivasan and C. Basdogan. 1997. Haptics in virtual environments: taxonomy, research status, and challenges. Computers and Graphics 21(4):393--404. 286
[104]
S. Strachan, P. Eslambolchilar, and R. Murray-Smith. 2005. gpsTunes - controlling navigation via audio feedback. Proceedings of the Mobile HCI '05, ACM Press, pp. 275--278. 302
[105]
H. Summala, D. Lamble, and M. Laakso. 1998. Driving experience and perception of the lead car's braking when looking at in-car targets. Accident Analysis and Prevention, 30(4):401--407. 304
[106]
K. Tahiroglu, T. Svedström, V. Wikström, S. Overstall, J. Kildal, and T. Ahmaniemi. 2014. SoundFLEX: designing audio to guide interactions with shape-retaining deformable interfaces. Proceedings of the ICMI '14, pp. 267--274. 287
[107]
J. B. F. van Erp, K. Kyung, S. Kassner, J. Carter, S. Brewster, G. Weber, and I. Andrew. 2010. Setting the standards for haptic and tactile interactions: ISO's work. Proceedings of the EuroHaptics 2010 in Lecture Notes in Computer Science 6192 Part 2, Springer, pp. 353--358. 279, 281
[108]
J. B. F. van Erp, A. Toet, and J. B. Janssen. 2015. Uni-, bi- and tri-modal warning signals: Effects of temporal parameters and sensory modality on perceived urgency. Safety Science, 72:1--8. 296
[109]
J. B. F. van Erp and H. A. H. C. van Veen. 2001. Vibro-tactile information presentation in automobiles. Proceedings of the EuroHaptics '01, pp. 99--104. 305
[110]
J. B. F. van Erp, H. A. H. C. van Veen, C. van Jansen, and T. Dobbins. 2005. Waypoint navigation with a vibrotactile waist belt. ACM Transactions on Applied Perception, 2(2):106--117. 300
[111]
B. N. Walker and G. Kramer. 2005. Mappings and metaphors in auditory displays. ACM Transactions on Applied Perception, 2(4):407--412.
[112]
B. N. Walker, G. Kramer, and D. M. Lane. 2000. Psychophysical scaling of sonification mappings. Proceedings of the ICAD '00, pp. 99--104. 294
[113]
M. Weiss, S. Voelker, J. Borchers, and C. Wacharamanotham. 2011. FingerFlux: near-surface haptic feedback on tabletops. Proceedings of the UIST '11, pp. 615--620. 285
[114]
C. D. Wickens. 2002. Multiple resources and performance prediction. Theoretical Issues in Ergonomics Science, 3(2):159--177. 278
[115]
J. Williamson, R. Murray-Smith, and S. Hughes. 2007. Shoogle: excitatory multimodal interaction on mobile devices. Proceedings of the CHI '07, pp. 121--124. 293
[116]
G. Wilson, S. Brewster, M. Halvey, and S. Hughes. 2012. Thermal icons: evaluating structured thermal feedback for mobile interaction. Proceedings of the MobileHCI '12, pp. 309--312. 279, 284, 285, 297, 631
[117]
G. Wilson, S. Brewster, M. Halvey, and S. Hughes. 2013. Thermal feedback identification in a mobile environment. Proceedings of the HAID 2013, Article 2. 284, 285
[118]
G. Wilson, G. Davidson, and S. Brewster. 2015. In the heat of the moment: subjective interpretations of thermal feedback during interaction. Proceedings of the CHI '15, pp. 2063--2072. 284, 285
[119]
G. Wilson, M. Halvey, S. A. Brewster, and S. A. Hughes. 2011. Some like it hot? Thermal feedback for mobile devices. Proceedings of the CHI '11, pp. 2555--2564. 284
[120]
L. Yao, R. Niiyama, and J. Ou. 2013. PneUI: pneumatically actuated soft composite materials for shape changing interfaces. Proceedings of the UIST '13. 307
[121]
W. Yu and S. Brewster. 2002. Comparing two haptic interfaces for multimodal graph rendering. Proceedings of the HAPTICS '02, pp. 3--9.
[122]
W. Yu, K. Kangas, and S. Brewster. 2003. Web-based haptic applications for blind people to create virtual graphs. Proceedings of the HAPTICS '03, pp. 318--325. 298
[123]
W. Yu, D. Reid, and S. Brewster. 2002. Web-based multimodal graphs for visually impaired people. Universal Access and Assistive Technology, pp. 97--108. 298, 299
[124]
C. Zhao, K.-Y. Chen, M. T. I. Aumi, S. Patel, and M. S. Reynolds. 2014. SideSwipe: detecting in-air gestures around mobile devices using actual GSM signal. Proceedings of the UIST '14, pp. 527--534.

Published In

The Handbook of Multimodal-Multisensor Interfaces: Foundations, User Modeling, and Common Modality Combinations - Volume 1
April 2017, 662 pages
ISBN: 9781970001679
DOI: 10.1145/3015783
Publisher: Association for Computing Machinery and Morgan & Claypool
