DOI: 10.1145/3489849.3489912

Effect of Visual Feedback on Understanding Timbre with Shapes Based on Crossmodal Correspondences

Published: 08 December 2021

Abstract

Timbre is a crucial element in playing musical instruments, and it is difficult for beginners to learn it independently. Therefore, external feedback (FB) is required. However, conventional FB methods lack intuitiveness in visualization. In this study, we propose a novel FB method that adopts crossmodal correspondence to enhance the intuitive visualization of timbre with visual shapes. Based on the experiments, it was inferred that the FB based on crossmodal correspondence prevents dependence on FB and promotes learning.
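
The abstract does not detail the timbre-to-shape mapping itself, so the following is a minimal, purely illustrative sketch rather than the authors' implementation: it assumes spectral centroid as a single brightness feature and maps it to the spikiness of a star-shaped outline, echoing the round-versus-spiky crossmodal correspondence the work builds on. Every constant, threshold, and function name below is hypothetical.

    # Purely illustrative: the feature (spectral centroid) and the shape family
    # (a star outline whose spikiness grows with brightness) are assumptions,
    # not the mapping used in the paper.
    import numpy as np
    import matplotlib.pyplot as plt

    def spectral_centroid(frame, sr):
        """Brightness proxy: magnitude-weighted mean frequency of one audio frame."""
        spectrum = np.abs(np.fft.rfft(frame))
        freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
        return float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))

    def star_outline(spikiness, n_points=8, samples=400):
        """Map a 0..1 spikiness value to a closed outline: 0 = round, 1 = sharp star."""
        theta = np.linspace(0.0, 2.0 * np.pi, samples)
        radius = 1.0 + 0.6 * spikiness * np.cos(n_points * theta)
        return radius * np.cos(theta), radius * np.sin(theta)

    if __name__ == "__main__":
        sr = 16000
        t = np.arange(sr) / sr
        dull = np.sin(2 * np.pi * 220 * t)                                       # fundamental only
        bright = sum(np.sin(2 * np.pi * 220 * k * t) / k for k in range(1, 12))  # strong partials

        for name, tone in [("dull", dull), ("bright", bright)]:
            centroid = spectral_centroid(tone, sr)
            spikiness = float(np.clip(centroid / 2000.0, 0.0, 1.0))  # crude, hypothetical scaling
            x, y = star_outline(spikiness)
            plt.plot(x, y, label=f"{name} tone, centroid ~ {centroid:.0f} Hz")

        plt.axis("equal")
        plt.legend()
        plt.show()

A real-time feedback system in this vein would replace the one-feature heuristic with richer timbral descriptors and a perceptually calibrated mapping; the sketch only shows the general direction of such a shape-based visualization.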

Supplementary Material

MP4 File (VRST_video.mp4)
Supplemental video


Cited By

  • (2022) Coloured hearing, colour music, colour organs, and the search for perceptually meaningful correspondences between colour and sound. i-Perception 13(3). https://doi.org/10.1177/20416695221092802. Online publication date: 9 May 2022.

Information

Published In

VRST '21: Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology
December 2021
563 pages
ISBN:9781450390927
DOI:10.1145/3489849
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 08 December 2021

Author Tags

  1. concurrent feedback
  2. crossmodal correspondence
  3. perceptual learning
  4. timbre visualization

Qualifiers

  • Abstract
  • Research
  • Refereed limited

Conference

VRST '21

Acceptance Rates

Overall Acceptance Rate 66 of 254 submissions, 26%

Bibliometrics & Citations

Article Metrics

  • Downloads (Last 12 months): 18
  • Downloads (Last 6 weeks): 0
Reflects downloads up to 26 Sep 2024
