DOI: 10.1145/3517428.3544828
Research article · Public Access

Designing Gestures for Digital Musical Instruments: Gesture Elicitation Study with Deaf and Hard of Hearing People

Published: 22 October 2022

Abstract

When playing musical instruments, deaf and hard-of-hearing (DHH) people typically sense the music through the vibrations transmitted by the instruments or through the movements of their own bodies while performing. Sensory substitution devices now exist that convert sounds into light and vibration to support DHH people’s musical activities. However, these devices require specialized hardware and assume that standard musical instruments are available. Hence, a significant gap remains between DHH people and the enjoyment of musical performance. To address this issue, this study identifies end users’ preferred gestures for emulating the musical experience on a smartphone, based on the instrument selected. The gesture elicitation study covers 10 instrument types. We present the results along with a new taxonomy of musical instrument gestures. The findings will support the design of gesture-based instrument interfaces that enable DHH people to enjoy musical performance more directly.

Supplementary Material

Supplementary figures (assets22a-sub5580-cam-i40.zip)


Cited By

  • (2024) Development of embodied listening studies with multimodal and wearable haptic interfaces for hearing accessibility in music. Frontiers in Computer Science, 5. https://doi.org/10.3389/fcomp.2023.1162758. Online publication date: 10-Jan-2024.
  • (2023) Exploring Think-aloud Method with Deaf and Hard of Hearing College Students. Proceedings of the 2023 ACM Designing Interactive Systems Conference, 1757–1772. https://doi.org/10.1145/3563657.3595980. Online publication date: 10-Jul-2023.


    Published In

ASSETS '22: Proceedings of the 24th International ACM SIGACCESS Conference on Computers and Accessibility
October 2022, 902 pages
ISBN: 978-1-4503-9258-7
DOI: 10.1145/3517428

Publisher

    Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. Deaf
    2. gesture elicitation study
    3. hard of hearing
    4. mobile
    5. music

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Acceptance Rates

    ASSETS '22 Paper Acceptance Rate: 35 of 132 submissions, 27%
    Overall Acceptance Rate: 436 of 1,556 submissions, 28%
