AM '21 Conference Proceedings · Short Paper
DOI: 10.1145/3478384.3478403

Resin: a Vocal Tract Resonances and Head Based Accessible Digital Musical Instrument

Published: 15 October 2021

Abstract

Recent developments in sensor technologies have enabled new human-computer interaction channels that are useful for people with severely limiting motor disabilities such as quadriplegia. Some of these sensors are available pre-packaged on the mass market, complete with computer interaction software, while others can be built at low cost through DIY approaches. In this article we present Resin, an Accessible Digital Musical Instrument dedicated to people with quadriplegia. Resin exploits two interaction channels, head movements and the shape of the vocal tract (detected through the corresponding acoustic resonances), to control musical performance parameters. The structure of the instrument is discussed from both the hardware and software points of view. Feature extraction algorithms for both channels are explained, with particular focus on the vocal tract resonances interaction paradigm.
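
The abstract does not specify how the resonances are extracted; as a rough illustration of the interaction paradigm only, the sketch below estimates vocal tract resonance (formant) frequencies from short microphone frames using linear predictive coding (LPC) root-finding, a standard speech-analysis technique, and maps the first resonance to a 0-127 control value. The function names, the LPC order, and the 300-900 Hz mapping range are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch (NOT the paper's algorithm) of resonance-driven control:
# LPC analysis of a mono audio frame, then a hypothetical mapping of the
# first resonance to a MIDI-CC-style value. Assumes numpy and scipy.
import numpy as np
from scipy.signal import lfilter

def estimate_resonances(frame, sr, order=12):
    """Estimate vocal tract resonance frequencies (Hz) from one audio frame."""
    # Pre-emphasis flattens the spectral tilt; a Hamming window reduces edge effects.
    emphasized = lfilter([1.0, -0.97], [1.0], frame)
    windowed = emphasized * np.hamming(len(emphasized))

    # Autocorrelation method: solve the LPC normal equations R a = r.
    r = np.correlate(windowed, windowed, mode="full")[len(windowed) - 1:]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])

    # Resonances correspond to the angles of the complex roots of the
    # prediction polynomial A(z) = 1 - a1*z^-1 - ... - ap*z^-p.
    roots = np.roots(np.concatenate(([1.0], -a)))
    roots = roots[np.imag(roots) > 0]             # one root per conjugate pair
    freqs = np.angle(roots) * sr / (2.0 * np.pi)  # radians -> Hz
    return np.sort(freqs[freqs > 90.0])           # discard near-DC artifacts

def resonance_to_control(f1, lo=300.0, hi=900.0):
    """Map the first resonance to 0-127 (range is a hypothetical choice)."""
    t = np.clip((f1 - lo) / (hi - lo), 0.0, 1.0)
    return int(round(t * 127))
```

In a real-time setting, a loop of this kind would run estimate_resonances on each analysis hop and forward resonance_to_control(freqs[0]) to the synthesis engine, e.g. as a MIDI control change.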

Supplementary Material

MP4 File (p280-video.mp4)
Supplemental video


Cited By

  • (2023) A Case Study on Netychords: Crafting Accessible Digital Musical Instrument Interaction for a Special Needs Scenario. Computer-Human Interaction Research and Applications, 353-372. https://doi.org/10.1007/978-3-031-49425-3_22. Online publication date: 23 December 2023.

Published In

AM '21: Proceedings of the 16th International Audio Mostly Conference
September 2021
283 pages
ISBN: 9781450385695
DOI: 10.1145/3478384

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 15 October 2021


Author Tags

  1. accessibility
  2. head tracking
  3. musical instrument
  4. vocal tract

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Conference

AM '21: Audio Mostly 2021
September 1 - 3, 2021
Virtual / Trento, Italy

Acceptance Rates

Overall Acceptance Rate 177 of 275 submissions, 64%

Article Metrics

  • Downloads (last 12 months): 13
  • Downloads (last 6 weeks): 1
Reflects downloads up to 26 Sep 2024
