
“Why are there so many steps?”: Improving Access to Blind and Low Vision Music Learning through Personal Adaptations and Future Design Ideas

Published: 21 September 2023

Abstract

Music can be a catalyst for self-development, creative expression, and community building for blind or low vision (BLV) individuals. However, BLV music learners face complex obstacles in learning music. They are highly reliant on their learning environment and music teachers for accommodations and flexibility. Prior research has identified the challenges faced by BLV musicians, yet limited research has addressed these challenges through the development of technology. Drawing upon the experience and suggestions of 40 BLV professional musicians, amateur musicians and music teachers (including sighted teachers with experience teaching blind students), we identified five themes: (1) Key Challenges of BLV Music Learning, (2) Personal Adaptations to Overcome Music Learning Challenges, (3) Perspectives on Current and Future Assistive Technologies, (4) Contention Between Braille Music and Auditory Learning, and (5) Role of Human Support for Music Learning. Together, these findings outline a path to make music learning more accessible to BLV people. To this end, we describe opportunities for enhanced audio cues for musical communication, recommend integrating vibrotactile feedback to aid music reading, and propose designing technology that supports independence and interdependence in music learning.

1 Introduction

The World Health Organization estimated that there are currently 284 million people in the world who have low vision and 39 million people who are blind [41]. Blind and Low Vision (BLV) people actively use other senses (especially hearing) to compensate for vision and find music to be a source of personal enjoyment [44, 58]. Furthermore, studies indicated that BLV people exhibit higher levels of musical aptitude than sighted individuals [23, 24]. BLV musicians, such as pianists Alec Templeton, Nicholas Constantinidis and Michael Arnowitt; opera singer Andrea Bocelli; violinist Takayoshi Wanami; and singers Jose Feliciano, Ray Charles and Stevie Wonder, have been celebrated for their contributions to music [44].
However, learning to play music as a BLV person remains a complex and challenging endeavour [1, 4, 18, 22]. BLV music learners have limited access to learning resources and new music [18]. Learning resources such as Braille music scores are expensive and time-consuming to acquire [43]. They rely on their music teachers and peers for accommodations and support [36]. However, music teachers are, at times, under-prepared and ill-equipped to meet the individual needs of BLV music learners [13, 14, 25]. Davis described the lack of knowledge and emphasis on rigid teaching practices as a ‘barrier to access’ [15]. Abramo and Pierce suggested that a teacher’s inattention to students with disabilities is influenced by larger systemic issues like teacher education, professional development and professional responsibilities [1].
Widely available commercial tools to assist BLV people in learning music remain limited to improving access to music reading [63] and music composition [64]. However, prior work found other challenges for BLV music learning, such as learning new music (either by ear or from a Braille music score), learning technique on an instrument and understanding non-verbal cues from music teachers and sighted musicians [1, 4].
We built on our previous research [33] and that of others [1, 4, 36] to expand on the current challenges faced by BLV musicians. Specifically, we sought to identify which personal adaptation strategies BLV musicians use and how these can inform the design of technology for non-verbal communication, music reading, and technical and conceptual guidance. In our discussion, we explored the potential of technology to enhance listening and thereby increase comprehension of non-verbal cues and gestures. We proposed the integration of vibrotactile feedback into existing music reading practices. Additionally, we delved into the flexible design of assistive technologies that support independent and interdependent learning. Our study addressed two research questions:
RQ1: What personal adaptations and design ideas can inform the development of assistive technologies for BLV music learning?
RQ2: What other factors must be considered to develop assistive technologies for BLV music learning?
We conducted one-hour-long interviews with 40 BLV professional musicians, music teachers (BLV music teachers and sighted music teachers with experience teaching BLV music learners) and BLV amateur musicians. We discussed challenges and strategies that participants had encountered when learning and teaching music, their use of assistive technologies (ATs) to help with learning, the contentious discourse around braille music and considerations to inform the design of technology for BLV music learning. Our study makes the following contributions:
(1)
We expand on key BLV music learning challenges and identify personal adaptation strategies.
(2)
We report on the current limitations of technology and provide future design ideas.
(3)
We identify the impact of social factors on music learning, including the role of social support and the perception of braille music and auditory learning.

2 Background

To understand the gaps in current research and the importance of our research questions, we review prior work that examined BLV people’s challenges in learning music, the importance of touch for music learning, and current tools for music learning.

2.1 Challenges in Music Learning

2.1.1 Reading Music from Braille or by Ear.

Music education within the Western tradition has focused on staff notation that is primarily visual or graphic in nature. Consequently, BLV music learners have very limited access to this form of musical information and must rely on learning music by ear or from braille music scores. Abramo and Pierce [1] found BLV music learners preferred learning music by ear as it was quicker and more easily accessible. Furthermore, participants found braille to be a difficult medium to learn and would instead prefer learning music scores through cross-referencing with audio recordings and through the use of resources such as YouTube [4]. However, Goldstein [22] found that many students who had played music for years without learning braille music found rhythmic concepts and counting time difficult to comprehend. While learning musical scores through braille can be challenging, BLV musicians stated that braille music is important for music literacy and composition as well as for sharing musical ideas with other musicians [4].
Considering the benefits associated with each learning method, recent work has focused on identifying the conditions under which different methods can be employed based on a BLV student’s type of vision impairment, music level, and braille proficiency [43, 48].

2.1.2 Missing Non-verbal Cues and Gestures.

Musical communication in music schools, ensembles and musical learning groups is predominantly visual [4]. The gestural aspects of music teaching can include pointing, nodding and facial expressions. Sighted students learn music by reading the instructor’s physical gestures, body movements and posture [28]. However, BLV musicians are not privy to non-verbal communication, such as gestures, and cannot mimic the actions of their teachers.
Some music teachers devised alternative ways of communicating with BLV musicians during rehearsals and performances using oral communication through spoken word and musical gestures [1]. Students would give each other spoken word cues during performances by explicitly stating when the music is about to begin or end. Also, teachers taught students to communicate through the music itself by changing a single musical note at the end of a musical phrase to indicate the end or beginning of the next verse. Recent research has devised a system that can communicate a conductor’s gestures to a choir through a haptic mobile phone [20], and this advancement highlights the potential for technology to convey non-verbal cues to BLV people. However, little progress has been made in developing technology to support non-verbal cues in other teaching environments for BLV musicians.

2.2 Leveraging Touch to Learn Music as a BLV Person

The sense of touch plays a fundamental role in learning to play a musical instrument for BLV people. It allows BLV musicians to develop an understanding of their instrument [4], interpret instruction using tactile modelling and physical guidance [40], and, as discussed before, read musical information [4, 48]. Modhrain and Gillespie [42] found that a person’s ability to learn and develop skills with a musical instrument is connected to understanding active and passive vibrotactile interactions. They describe the feedback loop between player and instrument as a dynamic coupling of sound and vibrations that determine the playability of the instrument and develop virtuosity in the music learner. Additionally, studies found that vibrotactile feedback can be used to convey real-time information [19].
One commonly used strategy to provide instructional guidance to BLV learners is the use of tactile modelling [47, 53]. Tactile modelling occurs when a student inspects an action by touching a demonstrator [40]. This is particularly beneficial for BLV learners as it often clarifies the mechanics of the movement more comprehensively than an explanation alone. Also, this allows BLV students to control the learning process by paying closer attention to aspects of movement that might have been unclear [40]. Tactile modelling in music learning takes place through the hand-over-hand method [1]. This involves a teacher playing the instrument while the student follows along by placing their hands on their teacher’s hands. A BLV music learner mentioned that they could not see others playing their instruments and could not mimic techniques based on visual examples alone. Instead, they asked their teacher to show them what to do and how to play by touching [4].
Another strategy to teach BLV learners is the application of physical guidance. This involves physically moving the student’s body part to describe a desired action [30]. When a BLV student is learning a new skill, the proprioceptive feedback from physical guidance gives them the information they need to perform the task correctly. Physical guidance gives the BLV student a kinesthetic cue related to the desired movement, increases their understanding of the action and allows the student to develop muscle memory of the correct form [40]. However, to our knowledge, there is limited work in HCI that offers recommendations for developing ATs that build on these existing strategies to support BLV musicians.

2.3 Functionalities of Current Technology

A technology or system that is designed to aid music learning can be considered an assistive technology (AT) for musicians. The development of ATs for BLV musicians has reduced barriers to access while promoting music literacy and independence. ATs for BLV music learning include screen reader-friendly software, refreshable braille displays and the use of vibrotactile feedback for music learning.

2.3.1 Refreshable Braille Displays.

A recent innovation in braille displays [26] allows users to listen to what they are reading, which could be used to connect braille music with sound more effectively. Furthermore, new multi-line braille displays [17] may be better suited for braille music, which uses positional information of notes to convey musical information. Park [43] reported advancements in braille reading technology with the development of refreshable braille displays and speech synthesizers [26] that enabled BLV musicians to read and listen to musical scores simultaneously.

2.3.2 Vibrotactile Feedback for Music Learning.

Holland et al. [6] designed Haptic Bracelets that guided users to play rhythmic patterns that required multi-limb coordination. They found localized haptic feedback on the wrists and ankles performed better than oral or visual cues to teach multi-limb coordination. Other studies found haptics are beneficial for teaching breathing control for vocal guidance [29], body movement and posture for violin bowing [56] and fingering technique for learning the flute [65].
Brewster and Brown [9] proposed the use of Tactons or tactile icons to convey information non-visually by altering the frequency, intensity, duration and position of tactile pulses. Other studies further explored the application of vibrotactile feedback to convey musical information and musical scores through tactile icons [21]. Baker et al. [3] created the Haptic Baton that allowed BLV musicians in an orchestra to respond to a conductor without having to look at the physical gestures being made.
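To make this parameterization concrete, the sketch below (in Python) shows one way a small vocabulary of Tactons for musical events could be defined and rendered. The event names, parameter values and the `driver.pulse` actuator call are illustrative assumptions for this sketch, not part of Brewster and Brown’s system or any existing haptics API.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Tacton:
    """A single tactile icon: one pulse pattern on one actuator.

    The parameters follow the dimensions named by Brewster and Brown:
    frequency (Hz), intensity (0.0-1.0), duration (ms) and body position.
    """
    frequency_hz: float
    intensity: float
    duration_ms: int
    position: str  # e.g., "left_wrist", "right_ankle"

# Hypothetical mapping from musical events to Tactons: a short, sharp pulse
# for a downbeat and a longer, softer pulse for a phrase ending.
TACTON_VOCABULARY = {
    "downbeat":   Tacton(frequency_hz=250, intensity=0.9, duration_ms=80,  position="left_wrist"),
    "phrase_end": Tacton(frequency_hz=150, intensity=0.5, duration_ms=400, position="left_wrist"),
    "crescendo":  Tacton(frequency_hz=200, intensity=0.7, duration_ms=250, position="right_wrist"),
}

def render(events: List[str], driver) -> None:
    """Send the Tacton for each musical event to a vibration driver.

    `driver.pulse(...)` stands in for whatever actuator API is available.
    """
    for event in events:
        t = TACTON_VOCABULARY[event]
        driver.pulse(t.position, t.frequency_hz, t.intensity, t.duration_ms)

class _PrintDriver:
    """Stand-in driver that prints instead of driving a real actuator."""
    def pulse(self, position, freq, intensity, duration):
        print(f"[{position}] {freq} Hz, intensity {intensity}, {duration} ms")

render(["downbeat", "phrase_end"], _PrintDriver())
```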

2.4 The Present Study

Prior work has understandably developed ATs that directly address these challenges, to a certain degree, in order to support a BLV student’s independence. For instance, refreshable braille displays empower a BLV student to learn through both modalities (i.e., audio and tactile sensation) so that the student need not rely on a single method. Research on prominent educational theories suggests that other, non-independence factors could also inform the design of ATs. Self-determination theory proposes that teaching must satisfy a student’s relatedness needs (i.e., feeling affiliated with others) [11, 16]. Universal Design for Learning (UDL) indicates that variability in learning amongst students arises from numerous factors, including learning preference and psycho-emotional background [49]. Thus, we examined BLV individuals’ personal adaptations and learning strategies to provide a nuanced understanding of existing challenges and identify non-independence factors that can inform the design of ATs.

3 Methodology

To address RQ1 and RQ2, we conducted individual one-hour-long virtual interviews with 40 participants (Table 1) after receiving ethics approval from our institution. Next, we analyzed the data using thematic analysis and organized the data into meaningful categories to form five themes. All participants were fluent in English and were located in Canada (n = 18), the United States of America (n = 14), the United Kingdom (n = 2), Scotland (n = 2), Ireland (n = 1), Norway (n = 1), Hungary (n = 1) and Australia (n = 1).
Table 1.
ID* | Age | Gender | Visual Impairment# | Learning Exp. (in yrs.) | Teaching Exp. (in yrs.)
P1 | 64 | Man | B (since birth) | ** | n/a
P2 | 16 | Man | B (since birth) | Over 10 | Less than 1
P3 | 45+ | Man | LV | Over 15 | **
P4 | 76 | Man | B (since childhood) | Over 30 | n/a
T1 | 25 | Woman | S | Over 10 | Over 4
T2 | 47 | Woman | S | ** | Over 21
T3 | 71 | Woman | B (since birth) | Over 15 | **
T4 | 53 | Woman | S | ** | Over 25
T5 | 27 | Woman | LV | Over 10 | Over 5
T6 | 71 | Woman | B (since birth) | ** | Over 25
T7 | 67 | Woman | LV | ** | Over 25
T8 | 26 | Woman | B (since birth) | Over 20 | **
T9 | 24 | Man | B | Over 20 | **
T10 | 74 | Woman | B (since birth) | Over 30 | **
T11 | 56 | Man | B | Over 20 | Over 10
A1 | 55 | Woman | B (since birth) | Over 15 | n/a
A2 | 66 | Man | B (since 30yrs+) | Sporadic** | n/a
A3 | 41 | Man | B (since 30yrs+) | Sporadic** | n/a
A4 | 48 | Man | LV | Less than 1 | n/a
A5 | 28 | Man | B (since birth) | Sporadic** | n/a
A6 | 70+ | Woman | B (since birth) | Sporadic** | n/a
A7 | 43 | Woman | B (since birth) | Less than 1 | n/a
A8 | 29 | Man | B (since birth) | Less than 1 | n/a
A9 | 71 | Man | B | Sporadic** | n/a
A10 | 76 | Man | LV | Sporadic** | n/a
A11 | 32 | Woman | LV | ** | n/a
A12 | 69 | Woman | B | ** | n/a
A13 | 51 | Man | B (since birth) | Over 20 | n/a
A14 | 33 | Woman | B (since birth) | ** | n/a
A15 | 34 | Woman | B (since birth) | Sporadic** | n/a
A16 | 40 | Man | B (since 30yrs+) | Sporadic** | n/a
A17 | 60 | Woman | B (since birth) | Sporadic** | n/a
A18 | 54 | Woman | LV | ** | n/a
TP1 | 87 | Man | B (since childhood) | ** | Over 40
TP2 | 68 | Woman | B (since birth) | ** | Over 30
TP3 | 48 | Woman | B | Over 20 | Over 20
TP4 | 35 | Woman | B (since birth) | Over 15 | Over 10
TP5 | 29 | Woman | B (since birth) | Over 16 | **
TP6 | 33 | Woman | LV | Over 25 | **
TA1 | 55 | Man | B | Over 10 | Over 10
Table 1. Participant Information and Music Experience
Note: * T = Music Teacher, A = Amateur Musician, P = Professional Musician; # B = Blind, LV = Low Vision, S = Sighted; ** Did not specify.
The authors acknowledge that, though they have knowledge of and a background in disability, none of the authors are part of the BLV community. Because of this, the authors reflected on their own privileges during the interviews and analysis process to try to limit biases in the interpretation of the results. We frame the work through a disability interpretive lens [35] and ground our findings in the differences brought up by the BLV musicians and teachers.

3.1 Recruitment

We contacted institutions, including the Canadian Council for the Blind (Canada), the Canadian National Institute for the Blind (Canada), the Royal National Institute of Blind People (UK), the American Foundation for the Blind (US) and the World Access for the Blind (US). We asked them to share our call for participation with their members through internal mailing lists. Simultaneously, we recruited participants through our institution’s associated social media accounts on Twitter and Facebook. We also posted a call for participation on Reddit in the popular subreddit r/Blind [51]. Potential participants contacted the first author over email with initial information about themselves and their experiences with music. After learning about the study procedure details, including mandatory audio recording and optional video recording, potential participants provided oral or written consent and proceeded with an interview.

3.2 Participant Consent and Approval

Next, we provided potential participants with more details about the study and the interview process. We described the topics that would be covered during the one-hour-long virtual interview and shared consent forms over email that our institution had approved. We asked each potential participant to read through the consent form and give oral or written consent for their participation in the study. We also informed potential participants that audio recordings of the interviews would be mandatory and video recordings of the interview would be optional.

3.3 Interview Process

We began the interview by introducing the goal of the study. Then, we asked participants to share their experiences with music learning as a BLV person through a series of semi-structured questions based on seven subtopics that addressed RQ1 and RQ2. Each subtopic was explored through an initial question and follow-up questions based on the flow of the conversation. The first author conducted all interviews over Zoom video conferencing software [67] or over the phone. Below are the subtopics and examples of the questions asked:
(1)
Getting demographic information: “What best describes your visual impairment? How old were you when you became visually impaired?”
(2)
Understanding the motivation behind learning music: “What motivated you to start learning music? What do you find fulfilling about learning/playing music?”
(3)
Understanding learning strategies: “Talk us through your process of learning a new musical piece?”
(4)
Understanding communication challenges: “How would a music teacher communicate with you while playing music? What would be your preferred method?”
(5)
Understanding the use of touch, voice, and other modalities for learning: “Other than voice, what other modes/feedback would you prefer when learning/teaching music?”
(6)
Understanding the use of notation and braille music: “Do you think reading braille music is important for playing music? If so, why?”
(7)
Ideating technological solutions supporting BLV music learning: “Imagine a black box that can support you in learning music. What do you think it would do? How would it do what you want it to do? Where on your body or on the instrument would it be?”
We intended the interview questions to flow chronologically, beginning with demographic questions and ending with future technologies for music learning. We sometimes altered this order based on the natural progression of the conversation. Also, we encouraged participants to explore topics that were important to them and skip questions that did not interest them. We designed the questions to be open-ended to encourage participants to share lived experiences and insights into the challenges, personal adaptations, and use of technology for music learning.

3.4 Participant Profile

Based on an initial assessment of the interview data, participants were grouped into three categories: professional musicians, amateur musicians, and music teachers (Table 2). Furthermore, we determined that some participants belonged to two categories, such as being both a professional musician and a music teacher. Participants were also grouped based on their braille music literacy (Table 3).
Table 2.
User Group (No. of participants) | Music Skill and Experience | Personal Goal
Amateur Musicians (19) | Can acceptably play and perform simple music. Learning is sporadic and unstructured. | Play music for pleasure and enjoy performing music for friends and family.
Professional Musicians (10) | Years of formal music training through private lessons and at a university/college. | Gain mastery of musical instrument, perform music professionally.
Music Teachers (18: 15 BLV and 3 sighted) | Years of experience teaching and learning music, have studied music at university/college, may have degree in education. | Teach music to earn a living, through private lessons or at a university/college.
Table 2. User Groups, Music Skill, and Goals
Table 3.
User Group (No. of participants) | No Experience | Limited Knowledge | Formally Learnt
Amateur Musicians (19) | 11 | 3 | 5
Professional Musicians (10) | 2 | 1 | 7
Music Teachers (18: 15 BLV and 3 sighted) | 3 (with 2)* | 2 (with 1)* | 13
Table 3. Braille Music Literacy of Participants
* Sighted teachers.

3.5 Data Analysis

The audio recordings of the interviews were transcribed using Trint [54]. Next, we analyzed the data through the six phases of Braun and Clarke’s [7] thematic analysis using the qualitative analysis software MAXQDA [34]. We describe each phase of the thematic analysis in Table 4, based on Byrne’s reflexive thematic analysis example [10]. We adopted an interpretative view of reliability in coding, where we approached the coding of the data as an evolving and organic process. As such, we defined reliability in terms of a rich description of the analytic procedure and abundant descriptions of raw data from participants to meet trustworthiness criteria [39].
Table 4.
Phases of Thematic Analysis | Steps Taken to Establish Trustworthiness
Phase 1: Familiarizing yourself with the data | The first and second author had prolonged engagement with the data and documented thoughts about potential codes and themes. Both authors engaged in ‘active listening’ to develop an understanding of the primary areas discussed in each interview. The audio recordings, the transcriptions and preliminary notes were stored on MAXQDA [34].
Phase 2: Generating initial codes | The first and second author individually generated a preliminary set of codes for the first participant in the study using MAXQDA [34]. Next, both authors shared their codes with one another and developed an initial codebook. The remaining datasets were divided between both authors and each author individually created iterations of the preliminary codebook as they analyzed more datasets. Both authors met weekly, discussed the evolution of their codebook and applied agreed-upon codes to subsequent datasets until all 40 datasets had been coded.
Phase 3: Searching for themes | Next, the first author reviewed all 40 datasets along with the final codebook to identify aggregated meaning and meaningfulness across the dataset. Furthermore, the first author identified codes that were conducive to interpreting themes which answered RQ1 and RQ2 and discarded codes that were not relevant to the study. Next, the first author created an initial thematic map of candidate themes and related sub-themes.
Phase 4: Reviewing themes | Next, the first author conducted a recursive review of the candidate themes in relation to the coded data. They reflected upon the candidate themes based on key questions identified by Braun and Clarke [8]. A finalized thematic framework resulted from the review of candidate themes.
Phase 5: Defining and naming themes | Next, the first author related the thematic framework to quotes and findings from the dataset. The themes and sub-themes were named and organized to present a lucid narrative that is consistent with the dataset and answers the research questions of the study.
Phase 6: Producing a report | Finally, the first author produced a report in the form of this manuscript to describe the coding and analysis process as well as report on the themes from the data analysis.
Table 4. Description of Six Phases of Thematic Analysis

4 Findings

In this section, we report on the key challenges for BLV music learning, the personal adaptation strategies that meet these challenges, perspectives on current technology and design ideas for future technologies, perspectives on braille music and auditory learning for music reading and the role of people in supporting music learning.

4.1 Key Challenges of Learning Music

Our study and others [1, 4, 36, 43] point to three key challenges for BLV music learning:

4.1.1 Understanding Non-verbal Cues and Gestures.

We found that missing nonverbal cues, such as nods, gestures, and facial expressions, present a significant challenge for BLV learners and musicians. These cues, which are commonly used in musical communication, facilitate understanding and coordination amongst musicians, teachers and learners, and conductors and performers. A8(B) said, “When [the conductor] says louder, you go louder. When they say softer, you go softer. But that is a challenging process because I didn’t really know [what they were saying]”.

4.1.2 Music Reading.

We found that understanding new music by relying on braille music or audio recordings, described further in (Section 4.4), presents another hurdle. Participants reported that the process of accessing new music through braille or through audio recordings is complex and requires many steps, which poses a barrier to accessing new music, particularly when compared to sight reading.1 T9(B) said, “Whenever [my friends] see a piano, they’ll just open the book and play for fun, not play it well, but just try things. I don’t have that ability, and I don’t know how to fix that”. T11(B) added, “It would be great to be able to pick up any piece of music and be able to just play it. Why are there so many steps? In an ideal world, we’ll remove all those barriers and have instant access to music”. Additionally, A11(LV) added that low-vision music learners often experience significant fatigue and eye strain when they are required to closely zoom in on music notation to read music.

4.1.3 Understanding Technical and Conceptual Instruction.

Lastly, seven participants expressed challenges in understanding technical and conceptual instruction, citing that the reliance on visual metaphors and demonstrations hindered their comprehension. A9(B) said, “My teacher talked to me about holding my hands in a certain way. [They said] that a lot of emotion is supposed to be in my forearm. It’s difficult [to understand]. If you don’t have the visual reference, how would you know?”. TP3(B) added, “One of the biggest [challenges] I notice is posture. Sighted students see their teachers who are modelling proper technique, and they also see pictures in books that show people sitting at the piano in a proper way. Visually impaired students don’t have that”.

4.2 Personal Adaptations to Overcome Music Learning Challenges

4.2.1 Listening for Non-verbal Cues and Gestures.

Five participants described the importance of listening for clues that could aid musicians and learners in following non-verbal cues and gestures. This was particularly beneficial for musicians and learners participating in choirs or ensembles, as they listened to the movement and breathing patterns of their conductor. T10(B) said, “In my singing, I have to pay close attention [to the conductor]. I can get clues, like when a conductor lifts his arms, sometimes you can hear the clothing rustle”. Relatedly, TA1(B) added, “I had a conductor who was very expressive. I could sort of hear his body language. He used to breathe into the music and I could [follow] his breath”. Furthermore, participants mentioned that paying attention to the conductor’s breathing was particularly helpful in comprehending cues for starting or stopping. P1(B) said, “So often conductors will breathe in conjunction with their upbeat or downbeat, essentially mirroring their conducting pattern. This is incredibly helpful when the piece is starting”. We also found that ten participants pointed to their sense of perfect pitch2 as a significantly helpful tool to identify and memorize music.

4.2.2 Avoiding Music Reading.

Participants described the importance of memorizing entire music scores first to avoid music reading. TP6(LV) said, “I’m not a big advocate of staying with the music score for a long time. Memorize as soon as possible”. TP3(B) pointed to the practicality of memorizing music scores first and said, “Once you have the music memorized, you can keep both hands on the piano without having to use your hands to also read your music”. Furthermore, TP3(B) reported memorizing particular “landmarks” in the music to account for any lapses in memory while performing music. They said, “Let’s say you have a memory flip [and forget what to play next]. You can always go to one of those landmarks and restart from there”.

4.2.3 Receiving Technical and Conceptual Guidance Through Touch.

Participants emphasized the significance of physical touch for understanding technical and conceptual instruction, especially when it involved touching their teachers’ hands to understand the placement of fingers or receiving physical guidance to correct musical techniques such as body posture. T9(B) reflected on their initial music lessons and said, “I would sit next to [my music teacher], and she would put my hand on top of her hand while she played so that I could feel the curve of her fingers or how high her wrist was. I think I felt it so many times that I started to mimic the motions”. Furthermore, participants described how the tactile experience of touching their teachers’ body provided them with valuable information about how to copy precise movements. A1(B) said, “I would actually feel the face of my teacher to know how to make a particular face [while singing]. They would describe certain things like sucking on a straw [to explain how to shape my face while I touched their face].”
We also found that participants used tactile objects to understand technical and conceptual instruction. T9(B) reported that when learning clarinet, they attached textured stickers onto their instrument, which enabled their teachers to indicate which keys needed to be pressed by referring to particular textures or shapes of the sticker. We also found that participants used tactile objects to facilitate music reading for both sighted and blind music learners simultaneously. T9(B) said, “I have magnets that represent different rhythmic values and a treble clef with sharps, flats and all those things. I have a staff that has lines made of really thin tape. I [as a blind person] can feel the lines and the notes while it is visual for the [sighted] students”.
However, we also discovered that touching their music teachers, or being touched, to understand musical information was not always a viable strategy. TA1(B) described this best and said, “I remember this one case where my student, who is a world-class pianist, was being taught by a world-renowned music teacher, who refused physical contact with the student. You had a superduper teacher who did not know how to work with a blind student”.

4.3 Perspectives on Current and Future Assistive Technologies

4.3.1 Limitations of Existing Assistive Technologies.

We found that currently available assistive technologies only partially address the challenge of music reading (Section 4.1.2). The responses of the 22 participants who reported using technology for music learning or performance revealed a lack of technological support for understanding non-verbal cues and gestures (Section 4.1.1), as well as for comprehending technical or conceptual instruction (Section 4.1.3).
Five participants reported using Sibelius [64] and LIME [62], which made music notation accessible through screen readers. Two participants reported using open-source platforms to convert music notation to braille [37, 50, 60]. Seven other participants used online resources that played segmented parts of music scores to make listening and memorization easier [38, 59, 61]. Furthermore, TP6(LV) added that they relied on printing enlarged music notation to memorize new music. Four participants expressed concerns about the impracticality of single-line braille displays for reading music. TP3(B) said, “One challenge with a single line braille display is that if you’re reading piano music, you can only read music for one hand at a time, and it is very difficult”.3 In addition, we found that some participants preferred physical copies of braille music over digital braille music on braille displays. This preference stemmed from the ability to swiftly navigate their fingers to specific parts of the music without the need to read through each individual line. T9(B) and TP4(B) noted that physical copies of braille music allowed them to quickly skim back and forth between music notes and other musical information such as time signatures and music articulation.4 TP6(LV) added, “I prefer holding paper because I can move it as I need to. It’s easier than scrolling around on a screen. I find [searching and navigating on a screen] kind of an inconvenience”. Furthermore, two participants encountered challenges with converting digital braille music for braille displays due to the absence of standardized formatting, resulting in potential errors and inconsistencies.
However, as we further describe in (Section 4.4), music reading through braille music or auditory learning remains slow, cumbersome and daunting for new music learners.

4.3.2 Design Insights for Future Technologies.

Creating Vibration-based Interfaces: Six participants described using vibration to aid in music reading and to understand technical and conceptual instruction. A4(LV) said, “Maybe the keys [of a piano] have a vibration on them, and they vibrate to indicate that they need to be played. The first key vibrates so you can feel which key to press then the next key vibrates and so on”. T6(B) expanded on this idea and recommended interactions that use a combination of vibration and textured materials to convey information. Three other participants expressed their interest in combining braille music and vibration in future technologies. Six other participants imagined a multi-modal system that combined audio description, sound and vibration. TP3(B) said, “It can have multiple modes. So, if you know how best you learn, it can give you feedback in an audio format, or if you are a tactile learner, it can give you feedback to feel something on your body. Depending on what you needed, you could pull out whatever because sometimes you need different things”.
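One way to read A4(LV)’s sequential key-vibration idea together with TP3(B)’s multi-modal suggestion is as a dispatcher that cues the next note in whichever modality the learner selects. The following sketch only illustrates that idea; the `vibrate_key` and `speak` calls are hypothetical placeholders rather than an existing instrument or screen-reader API.

```python
# Hypothetical output channels; in a real prototype these would wrap an
# actuator on each piano key and a text-to-speech engine.
def vibrate_key(key_number: int) -> None:
    print(f"[vibration] key {key_number} pulses")

def speak(text: str) -> None:
    print(f"[audio] {text}")

class NextNoteGuide:
    """Announces the next note to play in the learner's preferred modality."""

    def __init__(self, mode: str = "tactile") -> None:
        self.mode = mode  # "tactile", "audio", or "both"

    def cue(self, key_number: int, note_name: str) -> None:
        if self.mode in ("tactile", "both"):
            vibrate_key(key_number)           # the key to press vibrates
        if self.mode in ("audio", "both"):
            speak(f"Next note: {note_name}")  # spoken cue instead of, or alongside, vibration

# Walking through a short phrase, one cue at a time.
guide = NextNoteGuide(mode="both")
for key, name in [(40, "C4"), (42, "D4"), (44, "E4")]:
    guide.cue(key, name)
```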
Ideas on Form and Location: Participants were divided into two schools of thought. Seven participants pictured a device that could be integrated into existing musical instruments while six other participants imagined a wearable device such as a watch, a bracelet or a pair of glasses. All participants agreed that the form and location of the technology must enable hands-free interactions so musicians and learners can continue to play their instruments while receiving musical information. T9(B) expressed this sentiment and said, “Some sort of a tool that could transfer music notes into a form that I could access immediately. So I could just play without having to memorize and perfect something first would be so, so cool”.
A5(B) and P2(B) highlighted the benefits of designing instruments that incorporate music-teaching functionalities, such as self-playing pianos with built-in tutorials. A11(LV) said, “If you could have an instrument that would tell me how [to play a particular] note, that would [really] help”. Such instruments would enable learners to concentrate on playing music seamlessly, without having to move their fingers away from the instrument to first read the braille music.
A11(LV) imagined a watch-like device worn on the wrist that made non-verbal cues and gestures accessible. They said, “Maybe a watch-type device that can have music programmed into it. The conductor can wear it and as the conductor’s hands move, it could send that movement as vibrations that would tell me to sing”. However, T5(LV) was not convinced that the wrist was the most suitable location for such a device; they said, “Probably not on the wrist. I think that would bother some musicians. I know pianists might not find that beneficial. Maybe it could be something like a band that goes around the upper part of your arm or something that sticks in your pocket.” TP5(B) imagined a technology that enabled sight reading and said, “I think it would be absolutely wonderful if a pair of glasses had a music reading software built into them. A person who is totally blind could be put on the glasses and actually point their eyes at a specific passage. It would read out the measure number, and it would actually play what was in that measure. I think that would be really cool”.
Designing for User Experience: Participants emphasized that the adoption of assistive technology depends on its usability and real-world application. Three participants pointed to portability as an important criterion for BLV musicians and learners, who often may be travelling to and from lessons and performances with heavy instruments and other assistive technologies. Seven participants underscored the importance of developing future technologies as learning tools that place emphasis on engagement. T9(B) said, “Sighted students have so many online games and tools to learn music. I wish there was some sort of a program that could make braille music learning faster and more engaging”. Lastly, seven participants highlighted the importance of designing for flexible learning. A4(LV) imagined a device that allowed them to control the tempo of the music while they were learning to play it. A14(B) added, “If you could start and stop [the music], slow it down or speed it up. It could [make] learning a little bit easier”. Additionally, P1(B) added that this system should automatically adjust the volume of certain parts of the music to hear subtleties that could otherwise be missed.
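The flexible-learning requests above (start/stop, tempo change, boosting quiet parts) map onto a small set of playback controls. The sketch below is a minimal illustration of such a controller under assumed names; the class, methods and part labels are invented for illustration and do not describe an existing tool.

```python
class PracticePlayer:
    """Toy model of the flexible playback controls participants asked for:
    start/stop, tempo scaling, and boosting quiet parts of the music."""

    def __init__(self, parts):
        self.parts = parts          # e.g., {"melody": 1.0, "inner_voice": 1.0}
        self.tempo_scale = 1.0      # 1.0 = original tempo
        self.playing = False

    def start(self):
        self.playing = True

    def stop(self):
        self.playing = False

    def set_tempo(self, scale: float):
        """Slow down (scale < 1.0) or speed up (scale > 1.0) the recording."""
        self.tempo_scale = max(0.25, min(2.0, scale))

    def boost_part(self, name: str, gain: float):
        """Raise the volume of one part so subtleties are not missed."""
        self.parts[name] = gain

# Example: learn a difficult passage at half speed with the inner voice boosted.
player = PracticePlayer({"melody": 1.0, "inner_voice": 1.0})
player.set_tempo(0.5)
player.boost_part("inner_voice", 1.5)
player.start()
```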

4.4 Contention Between Braille Music and Auditory Learning

4.4.1 Braille Music is a Hassle, Auditory Learning is Preferred.

Similar to prior studies [4], BLV musicians, especially amateur musicians, found the braille music code cumbersome to learn and time-consuming and costly to acquire. T9(B) said, “If the student is struggling with just reading the braille alphabet, reading braille music can be really hard. Braille music isn’t super intuitive and not super logical”. A9(B) said, “I don’t know braille music, but obviously, I would have to take my hands off the keys, go over the braille and then come back. Recordings, I can click a button, play it, and stop it. I don’t know how viable it (braille music) would be”. Thirteen other amateur musicians also preferred learning music by ear through audio recordings. Two participants highlighted that physical copies of braille music were expensive and time-consuming to acquire.
Music teachers also perceived braille music to be a hassle to use for music lessons. TP3(B) said, “There are actually teachers out there who will actively say they don’t want to teach using Braille music. They’d rather take gifted [BLV] kids, teach them by ear and let the music reading worry about itself if they get to whatever point. And that concerns me.” Four other music teachers found that BLV musicians develop a high level of technical ability but struggle at university, where they need to know how to read music. TP3(B) and T10(B) reported that the traditional braille music curriculum was boring for advanced students and promoted musical content that was challenging for learners.

4.4.2 Braille Music is Valuable, Auditory Learning is Limited.

Contrary to the findings from Abramo and Pierce [1], five participants reported the value that braille music brings to BLV music learning. T9(B) highlighted the importance of braille music for communication and collaboration between musicians and said, “I could memorize the notes by listening to a recording. But if we were in rehearsal and the conductor said okay start at measure forty-three, how was I going to know where that was (without being able to read the music).” T8(B) noted the use of braille music to learn music more accurately and said, “Braille music is most useful for classical music. Particularly for instrumentalists. In general, you always miss something if you’re only playing by ear. You don’t have the dynamics and the articulation. It is harder to hear if you don’t have the instructions”. P3(LV) added, “I say listening is for speed, and braille is for precision. Yes, you can learn a piece of music very quickly by ear. That is relatively easy to do at the early stage, but with braille music, you have the precision”. T5(LV) added, “I think not teaching braille music when sighted children are learning to read music is a mistake in my experience as a teacher. Eventually, the student is not going to be able to keep up [with more complex music] however good their ear is”. T11(B) spoke about the impact braille music had on a professional musician’s career and said, “They really need to learn the code...this young person did not know the braille code and when she learnt it, it escalated her career and her opportunities to find and learn new music”.

4.5 Role of Human Support for Music Learning

We observed that family members, helpers, fellow musicians, conductors and music teachers play a significant role in supporting music learning. BLV musicians and learners emphasized the importance of receiving accommodations and support during music practice and performance. Blind music teachers highlighted the interpersonal relations they developed with their mixed visual ability students to aid them in teaching music. Additionally, sighted music teachers reported their reliance on sighted family members and helpers to facilitate the learning of BLV music learners. We report our findings by identifying the different roles people assume in providing support across different music contexts:
In Music Classrooms: Participants reported that family members, especially parents, play an important role in motivating and supporting young music learners. TP3(B) said, “I think that in most cases, really young children might express an interest in music, but it’s really the parents who are the motivating factor in getting them into a structured learning program”. Music teachers reported that they rely on parents while teaching music. T1(S) said, “I actually prefer my students to bring one of their parents to class to guide them while I teach”. TP3(B) also echoed similar thoughts and said, “I really rely on parents [when younger kids are developing technique]. Not necessarily during the lessons but for keeping a lookout for [technique] that is problematic”. Music teachers also highlighted the role of helpers in the music classroom. T2(S) said, “[the helper] was one hundred percent fixing things. She was taking the direction [I gave to the mixed ability class] and making sure that the [blind] child was doing it correctly”.
Music learners also emphasized the importance of having a parent with them in the music classroom. T9(B) said, “When I started piano, my mom would go to lessons with me. She would [make sure] my hands were in the right position and she learnt to read music [so she could remind me] if I forgot”. A15(B) added, “If I’m with my mom or sister, I can [ask them] if I’m holding the instrument correctly”.
Participants emphasized the importance of developing a shared communication style between teachers and students. A16(B) said, “The [most important] thing is that you need to figure out what is the best way for a teacher to explain things to their student”. While T9(B), who is an elementary school music teacher, highlighted the value of familiarizing sighted students with assistive devices and said, “One of the things I do with every new group is I explain to them why I use a cane and that they don’t have to panic if it touches them. I also let them hear what it sounds like when my phone reads something. If [young learners] have never interacted with a blind person, they can get nervous because it’s different”.
While Reading and Practicing Music: Participants reported that parents who accompanied them to music lessons supported them during music practice. T9(B), who initially discussed their parents’ involvement during music classes, later also said, “My mom was always there at practice [as well]. She [already] knew what I should be doing. She knew when my hands were in the right position, and she could remind me if I [made a mistake]”. Participants also emphasized the support they received from peers, parents and helpers for music reading. TP5(B) added, “My mom was my eyes. She read everything to me”. T3(B) added, “My mom took a course in braille music so she could braille clarinet music for me”. T8(B) said, “One of my friends is really good at sight reading. [I asked them] to record the music for me with a metronome [so I could listen to it]”. P4(B) added that the support of peers was especially useful for the music they could not decipher by hearing alone; they said, “If it’s really complex, I will find a written source, and I’ll ask a musician friend to walk me through the chords, line by line”. TP1(B) added that they had state-sponsored readers who would sit with them and tell them what the musical notes were.
During Musical Performances: Participants emphasized the active support they received from band members, conductors and helpers during performances, as well as the support they interpreted from the actions of people around them. We found that participants relied on band members and peers to assist with stage etiquette and performance decorum. TP3(B) aptly said, “I can’t look around and compare [what to do during performances]. Am I putting my foot where people can see it? You need [musicians] you can trust, so you can ask some of these questions”. Additionally, participants leaned on helpers to navigate their way during performances. As T9(B) added, “I played the saxophone in marching band. I had a helper who would keep her hand on my shoulder [as we marched]”.
Participants also reported being assisted by fellow band members with non-verbal cues and gestures during performances. T8(B) said, “While playing in an orchestra, [I would ask] the person sitting beside me to tap on my leg or convey the beats to me in some way”. As highlighted in (Section 4.2.1), participants also interpreted non-verbal cues by listening. Furthermore, A8(B) emphasized the importance of getting to know your band members really well and said, “I got to a point where I was [so] in sync with [my band members], I developed an instinct of knowing when to go louder or when to get quieter [based on what my band members played]”.
We found two conflicting viewpoints regarding the role of technology. On one hand, participants expressed a reluctance to augment or replace the relations they built with music teachers and peers with technology. TA1(B) said, “I can’t imagine a technology that will teach without the music teacher. You couldn’t do it with mechanical hands. You couldn’t do it with mechanical people. Blind people already move mechanically. You need to have real people moving in real ways”. While TP4(B) added, “I don’t think there needs to be any specific software to teach blind musicians how to play an instrument, you need a good music teacher just like a sighted student”.
On the other hand, eight participants expressed a desire for technology that would empower them to be self-reliant and independent during music practice. TP2(B) said, “A technology that could check how you played would increase independence in learning. You [currently] need someone to describe [what to play] or record it or record the words or do something for you”. TP3(B) added, “I think if there was a machine that could store my lessons and the feedback from the music teacher [for me to revisit during practice], that would be great”. A11(LV) also added, “If [only] an instrument could tell me what to do, that would be [really] helpful”.

5 Discussion

In this section, we connect our findings with related work and share our vision to (1) make non-verbal cues and gestures accessible through enhanced audio cues, (2) aid music reading by integrating vibrotactile feedback, and (3) support independent and interdependent music learning contexts.

5.1 Making Non-verbal Cues and Gestures Accessible through Enhanced Audio Cues

We found that BLV musicians and learners rely on carefully listening to fellow performers and conductors during performances to interpret non-verbal cues and gestures. Participants reported that fellow performers and conductors synchronized their breathing pattern with the music’s timing, providing valuable clues about when to start and when to stop playing. Additionally, participants attuned their ears to the subtle sound of rustling clothes, which offered insights into body movement and gestures. However, we found that effectively perceiving these clues requires a keen sense of hearing, given the simultaneous performance of music. Additionally, it necessitates a deep familiarity with a fellow performer’s or conductor’s movements and breathing patterns. Furthermore, this may also require adjustments and support from fellow musicians and conductors, such as exaggerating specific movements or ensuring the BLV musician is positioned nearby so that they can hear these clues clearly.
Recent studies have explored the use of vibration-based systems to facilitate musical communication with BLV musicians, aiming to improve synchronization among BLV musicians [55] and establish effective communication between conductors and BLV musicians [3]. In adjacent work, accessibility researchers have explored the integration of spatial audio, contextual audio cues and audio description to enable independent navigation [12, 32, 52] which has been further enhanced by the introduction of transparent listening headphones [31]. However, to our knowledge, no prior work has tackled BLV musical communication with enhanced audio cues. Designing assistive technologies that can support the personal adaptation strategy of listening for non-verbal cues and gestures (Section 4.2.1) has the potential to make musical communication accessible to all BLV musicians and learners. A system that combines contextual audio cues and audio description has the potential to eliminate uncertainty regarding non-verbal cues and gestures, enabling musicians to fully immerse themselves in their musical performance. In addition, the utilization of transparent listening headphones offers a means for BLV musicians to stay attuned to the music being performed while also listening to the audio cues. This is an area of research that requires further exploration as open questions about the cognitive load of enhanced audio cues while performing music remain. Moreover, the appropriate selection of audio cues and audio descriptions corresponding to specific non-verbal cues and gestures has yet to be established.
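As a thought experiment for this direction, the sketch below pairs detected conductor events with either a short earcon or a brief spoken description, delivered over transparent-listening headphones so the performance itself remains audible. Gesture detection is out of scope here, and the event names, sound files and `play`/`speak` callbacks are assumptions for illustration, not an existing system.

```python
# Route detected conductor gestures to short audio cues or spoken descriptions.
# Detection of the gestures and the play()/speak() backends are assumed.

EARCONS = {                      # short, distinct sounds for frequent cues
    "upbeat": "tick_high.wav",
    "downbeat": "tick_low.wav",
    "cutoff": "chime_short.wav",
}

DESCRIPTIONS = {                 # spoken text for less frequent, richer cues
    "crescendo": "growing louder",
    "decrescendo": "getting softer",
    "cue_section": "your entrance in two beats",
}

def announce(gesture: str, play, speak) -> None:
    """Prefer a fast earcon; fall back to a brief audio description."""
    if gesture in EARCONS:
        play(EARCONS[gesture])
    elif gesture in DESCRIPTIONS:
        speak(DESCRIPTIONS[gesture])
    # Unknown gestures are ignored rather than adding noise during performance.

# Example: feed a stream of detected gestures to the announcer.
announce("downbeat", play=print, speak=print)
announce("crescendo", play=print, speak=print)
```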

5.2 Integrating Vibrotactile Feedback to Aid Music Reading

It is evident from our findings that music reading continues to pose a substantial obstacle for BLV musicians and learners. We observed that braille music and auditory learning have distinct benefits and limitations for music reading (Section 4.4). We inferred that the preference for either braille music or auditory learning was influenced by several factors, including individual learning goals, familiarity with braille music code, proficiency in deciphering musical notes by listening and availability of braille music scores and audio recordings.
Our findings, along with previous studies [2, 45, 46], underscore the importance of designing music reading tools that support multi-modal outputs such as audio, print and braille music. Researchers in the field of musical haptics have explored the utilization of vibration intensity modulation [21] and a combination of vibration patterns [57] to convey musical information, such as tempo, dynamics, and articulation as well as simplified music notation through vibrations felt on the body. However, we found a noticeable gap in work exploring the integration of vibration into existing practices of BLV music reading. Combining vibration with braille music and audio recordings presents an opportunity to improve the accessibility of music reading for all BLV musicians. BLV musicians can access nuanced information such as tempo variations, articulation and dynamics through modulations in vibration intensity felt on the body while utilizing audio or braille music to access musical notes. This would be especially useful for amateur musicians and early learners who prefer accessing music through audio recordings. By combining the sensory experience of feeling and hearing the music simultaneously, they can more effectively decipher complexities within the music. Furthermore, the integration of vibration would benefit BLV musicians who rely on memorizing music prior to playing. By conveying select musical information through vibration, it would mimic the experience of sight reading5 and lessen the memory load associated with memorization. This is an area of research that needs further investigation as open questions remain about the application of vibration to convey musical notes, as well as exploration of other factors such as comfort, long-term use and learnability.
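To illustrate the kind of pairing we envision, the sketch below maps standard dynamics and articulation markings to vibration parameters that could run alongside an audio recording or braille score. The specific intensity and duration values are placeholders, and choosing perceptually distinct, comfortable levels is among the open questions noted above.

```python
# Illustrative mapping from dynamics markings to vibration intensity
# (0.0 = off, 1.0 = strongest). Values are placeholders, not measured parameters.
DYNAMICS_TO_INTENSITY = {
    "pp": 0.2, "p": 0.35, "mp": 0.5,
    "mf": 0.65, "f": 0.8, "ff": 1.0,
}

ARTICULATION_TO_DURATION_MS = {
    "staccato": 60,    # short pulse
    "tenuto": 250,     # held pulse
    "accent": 120,     # medium pulse at the marked intensity
}

def vibrate_for(marking: str, articulation: str, send) -> None:
    """Render one note's expressive markings as a single vibration pulse,
    while the note itself is heard from a recording or read in braille."""
    intensity = DYNAMICS_TO_INTENSITY.get(marking, 0.5)
    duration = ARTICULATION_TO_DURATION_MS.get(articulation, 150)
    send(intensity, duration)

# Example: a forte, accented note felt as a strong 120 ms pulse.
vibrate_for("f", "accent", send=lambda i, d: print(f"pulse intensity={i}, {d} ms"))
```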

5.3 Supporting BLV Learners’ Independence and Interdependence

Our findings clearly indicate that people play a significant role in supporting BLV musicians to address music learning challenges. Our BLV participants made personal adaptations that involved touching and listening to their sighted teachers, peers, and parents, and their adaptations were less about modifying ATs. We can infer two reasons as to why there were few AT-related adaptations among the BLV participants who used ATs: (1) The current range of ATs falls short in supporting essential aspects of music learning beyond music notation, and (2) ATs cannot be solely relied upon as the main solution to address the learning challenges encountered by BLV individuals in the context of music. This latter sentiment was clearly present in some BLV participants who emphasized having a good teacher more than having an AT. Even among other participants who saw the value of ATs in BLV music, they imagined using ATs after interacting with teachers and independently practicing music. These findings suggest that the role of ATs can vary depending on how a BLV learner is situated with their sighted support. When a BLV learner is learning with sighted teachers or parents, ATs can support their interdependence; ATs can support their independence when learning individually.
Bennett et al. [5] proposed the interdependence frame of assistive technology. In this frame, AT is one of the means to address accessibility challenges, and all people in the environment use their unique strengths to provide and receive support. The emphasis is on people understanding and applying their expertise, with or without ATs. We can apply this frame to outline how AT can support BLV music learning by supporting people who are assistive, being assisted, or doing both simultaneously.
In recent years, researchers have applied vibration and force-feedback to support technical instruction such as violin bowing [56], breath guidance for vocal training [29] and multi-limb coordination [6, 27]. However, when considered through the Interdependence frame, we can see that technologies that support technical guidance through vibration and force-feedback have not considered the relations between people, such as between student and teacher.
ATs supporting independence and interdependence must be flexible systems that can support a BLV student in both contexts. Accessibility researchers have taken great strides in understanding the requirements of ATs that support interdependence by conducting co-design workshops with mixed-ability groups (rather than investigating only the experiences of people with disabilities independently) [66]. We recommend that future researchers adopt this approach to develop flexible ATs for BLV music learning. For instance, Xia et al. [65] designed ShIFT, a semi-haptic interface that used force feedback to teach fingering patterns on a flute. While this technology enables students to practice on their own, in this scenario it replaces the music teacher rather than supporting the relationship between student and teacher. For such a system to instead support interdependence between student and teacher, it could offer additional functionality that augments their collective capacities (i.e., supporting people to locate or realize their unique resources). This could be accomplished by giving the teacher insight into the student's playing proficiency, highlighting the passages that were especially challenging. Teachers could use this insight to develop new ways to better support the learner.
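As a hedged illustration of this idea, the sketch below aggregates a hypothetical practice log into a per-measure summary a teacher could review before the next lesson. The log format, error threshold and summarize_practice function are assumptions made for illustration only; they do not describe ShIFT or any deployed system.

# Illustrative sketch only: turning a student's independent practice log into
# a summary that supports the student-teacher relationship.
from collections import defaultdict

def summarize_practice(attempts, error_threshold=0.25):
    """attempts: list of dicts like {"measure": 12, "errors": 3, "notes": 16}.
    Returns per-measure error rates and the measures that most need attention."""
    totals = defaultdict(lambda: {"errors": 0, "notes": 0})
    for attempt in attempts:
        totals[attempt["measure"]]["errors"] += attempt["errors"]
        totals[attempt["measure"]]["notes"] += attempt["notes"]

    error_rates = {
        measure: counts["errors"] / counts["notes"]
        for measure, counts in totals.items()
        if counts["notes"] > 0
    }
    # Measures whose error rate meets the threshold, worst first.
    difficult = sorted(
        (m for m, rate in error_rates.items() if rate >= error_threshold),
        key=lambda m: error_rates[m],
        reverse=True,
    )
    return error_rates, difficult

if __name__ == "__main__":
    log = [
        {"measure": 5, "errors": 4, "notes": 12},
        {"measure": 5, "errors": 3, "notes": 12},
        {"measure": 6, "errors": 1, "notes": 16},
    ]
    rates, focus = summarize_practice(log)
    print("Measures to review with the teacher:", focus)

The point of the sketch is the direction of information flow: the technology reports to the teacher rather than standing in for them, so both the student's and the teacher's resources remain in play.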

6 Limitations and Future Work

There are several limitations to the current study. All participants came from WEIRD (Western, Educated, Industrialized, Rich, and Democratic) countries, which gave them greater access to technology and braille music than people from other parts of the world. All participants had experience with Western acoustic musical instruments and limited experience with digital musical instruments. Additionally, only one participant had lived experience with multiple disabilities.
The findings from our study provide ideas and inspiration to inform future technologies for BLV music learning. We recommend that future studies address key challenges for BLV music learning that require further exploration, including understanding non-verbal cues and gestures and supporting technical and conceptual instruction. We also emphasize the role people play in supporting BLV music learners and point to the design of technology that supports existing relationships between people.

7 Conclusion

Previous studies [1, 4, 36, 49] have considered the lived experiences of BLV musicians and recommended inclusive and flexible pedagogical practices to support BLV music learners. This paper expanded on the findings from prior work and identified design ideas for future technologies for BLV music learning. Drawing upon the experience and suggestions of 40 BLV professional musicians, amateur musicians and music teachers (including sighted teachers with experience teaching blind students), we identified five themes: (1) Key Challenges of BLV Music Learning, (2) Personal Adaptations to Overcome Music Learning Challenges, (3) Perspectives on Current and Future Assistive Technologies, (4) Contention Between Braille Music and Auditory Learning, and (5) Role of Human Support for Music Learning. Together, these findings outline a path to make music learning more accessible to BLV people. To this end, we describe opportunities for enhanced audio cues for musical communication, recommend integrating vibrotactile feedback to aid music reading, and propose designing technology that supports independence and interdependence in music learning. We see the potential for designers and researchers to further develop resources that inform the design of technologies for BLV music learning.

Footnotes

1. Sight reading is the ability to read and perform music from sheet notation in real time without prior rehearsal or familiarization with the piece.
2. Perfect pitch, also known as absolute pitch, is the ability to identify or reproduce musical pitches without the aid of a reference note. It is the innate ability to recognize and name a musical note by its pitch alone.
3. Western piano music is conventionally represented by two staves of music stacked on top of one another representing the left and right hand separately.
4. Music articulation refers to the clarity, precision, and distinctness with which musical notes and phrases are performed.
5. Sight reading is the ability to read and perform music on the spot without prior practice or memorization. It involves quickly processing musical symbols such as notes, rhythms, and other markings on the sheet music and translating them into corresponding sounds on an instrument or in singing.

References

[1] Abramo and Pierce. 2013. An ethnographic case study of music learning at a school for the blind. Bulletin of the Council for Research in Music Education 195 (2013), 9.
[2] Fabiha Ahmed, Dennis Kuzminer, Michael Zachor, Lisa Ye, Rachel Josepho, William Christopher Payne, and Amy Hurst. 2021. Sound cells: Rendering visual and braille music in the browser. In The 23rd International ACM SIGACCESS Conference on Computers and Accessibility. ACM, Virtual Event, USA, 1–4.
[3] David Baker, Ann Fomukong-Boden, and Sian Edwards. 2019. ‘Don’t follow them, look at me!’: Contemplating a haptic digital prototype to bridge the conductor and visually impaired performer. Music Education Research 21, 3 (May 2019), 295–314.
[4] David Baker and Lucy Green. 2016. Perceptions of schooling, pedagogy and notation in the lives of visually-impaired musicians. Research Studies in Music Education 38, 2 (Dec. 2016), 193–219.
[5] Cynthia L. Bennett, Erin Brady, and Stacy M. Branham. 2018. Interdependence as a frame for assistive technology research and design. In Proceedings of the 20th International ACM SIGACCESS Conference on Computers and Accessibility. ACM, Galway, Ireland, 161–173.
[6] Anders Bouwer, Simon Holland, and Mat Dalgleish. 2013. The haptic bracelets: Learning multi-limb rhythm skills from haptic stimuli while reading. In Music and Human-Computer Interaction, Simon Holland, Katie Wilkie, Paul Mulholland, and Allan Seago (Eds.). Springer London, London, 101–122. Series Title: Springer Series on Cultural Computing.
[7] Virginia Braun and Victoria Clarke. 2006. Using thematic analysis in psychology. Qualitative Research in Psychology 3, 2 (Jan. 2006), 77–101.
[8] Virginia Braun and Victoria Clarke. 2012. Thematic analysis. In APA Handbook of Research Methods in Psychology, Vol. 2: Research Designs: Quantitative, Qualitative, Neuropsychological, and Biological. American Psychological Association, Washington, DC, US, 57–71.
[9] Stephen A. Brewster and Lorna M. Brown. 2004. Non-visual information display using tactons. In Extended Abstracts of the 2004 Conference on Human Factors and Computing Systems - CHI ’04. ACM Press, Vienna, Austria, 787.
[10] David Byrne. 2022. A worked example of Braun and Clarke’s approach to reflexive thematic analysis. Quality & Quantity 56, 3 (June 2022), 1391–1412.
[11] Kuan-Chung Chen and Syh-Jong Jang. 2010. Motivation in online learning: Testing a model of self-determination theory. Computers in Human Behavior 26, 4 (July 2010), 741–752.
[12] Gregory D. Clemenson, Antonella Maselli, Alexander J. Fiannaca, Amos Miller, and Mar Gonzalez-Franco. 2021. Rethinking GPS navigation: Creating cognitive maps through auditory clues. Scientific Reports 11, 1 (April 2021), 7764.
[13] Nancy Cooper. 1999. A survey of current music inclusion practices and issues in New Jersey. Ohio Music Education Association 26, 2 (1999), 30.
[14] A.-A. Darrow. 1999. Music educators’ perceptions regarding the inclusion of students with severe disabilities in music classrooms. Journal of Music Therapy 36, 4 (Dec. 1999), 254–273.
[15] Lennard J. Davis. 2002. Bending Over Backwards: Essays on Disability and the Body. NYU Press.
[16] Edward L. Deci and Richard M. Ryan. 2012. Handbook of Theories of Social Psychology: Volume 1 - Self-Determination Theory. SAGE Publications Ltd, London, United Kingdom.
[17] Canute 360 Braille e-Reader. 2022. Canute 360 Braille e-Reader. https://canasstech.com/products/canute-360-pre-order
[18] Tiiu Ernits and Kadri Kutsar. 2017. Problems of music education for blind and visually impaired people in Estonia. Problems in Music Pedagogy 16 (2017).
[19] Jan B. F. van Erp, Katja I. Paul, and Tina Mioch. 2020. Tactile working memory capacity of users who are blind in an electronic travel aid application with a vibration belt. ACM Transactions on Accessible Computing 13, 2 (June 2020), 1–14.
[20] Claire L. Galea and Chris Porter. 2018. Accessible choral ensembles for visually impaired singers.
[21] Marcello Giordano, John Sullivan, and Marcelo M. Wanderley. 2018. Design of vibrotactile feedback and stimulation for music performance. In Musical Haptics, Stefano Papetti and Charalampos Saitis (Eds.). Springer International Publishing, Cham, 193–214. Series Title: Springer Series on Touch and Haptic Systems.
[22] David Goldstein. 2000. Music pedagogy for the blind. International Journal of Music Education os-35, 1 (May 2000), 35–39.
[23] Frédéric Gougoux, Franco Lepore, Maryse Lassonde, Patrice Voss, Robert J. Zatorre, and Pascal Belin. 2004. Pitch discrimination in the early blind. Nature 430, 6997 (July 2004), 309.
[24] Roy H. Hamilton, Alvaro Pascual-Leone, and Gottfried Schlaug. 2004. Absolute pitch in blind musicians. NeuroReport 15, 5 (April 2004), 803–806.
[25] Alice M. Hammel. 2001. Special learners in elementary music classrooms: A study of essential teacher competencies. Update: Applications of Research in Music Education 20, 1 (Nov. 2001), 9–13.
[27] Simon Holland, Anders J. Bouwer, Mathew Dalgelish, and Topi M. Hurtig. 2010. Feeling the beat where it counts: Fostering multi-limb rhythm skills with the haptic drum kit. (2010), 8. https://doi.org/978-1-60558-841-4/10/01
[28] Carlton E. Kilpatrick. 2020. Movement, gesture, and singing: A review of literature. Update: Applications of Research in Music Education 38, 3 (June 2020), 29–37.
[29] Yinmiao Li, Ziyue Piao, and Gus Xia. 2021. A wearable haptic interface for breath guidance in vocal training. In NIME 2021. PubPub, Shanghai, China.
[30] Lauren J. Lieberman, Monica Lepore, Maria Lepore-Stevens, and Lindsay Ball. 2019. Physical education for children with visual impairment or blindness. Journal of Physical Education, Recreation & Dance 90, 1 (Jan. 2019), 30–38.
[32] Tiffany Liu, Javier Hernandez, Mar Gonzalez-Franco, Antonella Maselli, Melanie Kneisel, Adam Glass, Jarnail Chudge, and Amos Miller. 2022. Characterizing and predicting engagement of blind and low-vision people with an audio-based navigation app. In CHI Conference on Human Factors in Computing Systems Extended Abstracts. ACM, New Orleans, LA, USA, 1–7.
[33] Leon Lu. 2022. Learning music blind: Understanding the application of technology to support BLV music learning. In The 24th International ACM SIGACCESS Conference on Computers and Accessibility. ACM, Athens, Greece, 1–4.
[34] MAXQDA. 2022. MAXQDA | All-In-One Qualitative & Mixed Methods Data Analysis Tool. https://www.maxqda.com/
[35] Donna M. Mertens. 2007. Transformative paradigm: Mixed methods and social justice. Journal of Mixed Methods Research 1, 3 (July 2007), 212–225.
[36] Frederick W. Moss. 2009. Quality of Experience in Mainstreaming and Full Inclusion of Blind and Visually Impaired High School Instrumental Music Students. Ph.D. Dissertation. University of Michigan. https://deepblue.lib.umich.edu/handle/2027.42/62425
[37] musescore.org. 2022. MuseScore 3 Released. https://musescore.org/en/3.0
[38] musicvi.com. 2022. Categories. https://www.musicvi.com/store/
[39] Lorelli S. Nowell, Jill M. Norris, Deborah E. White, and Nancy J. Moules. 2017. Thematic analysis: Striving to meet the trustworthiness criteria. International Journal of Qualitative Methods 16, 1 (Dec. 2017), 160940691773384.
[40] Megan O’Connell, Lauren J. Lieberman, and Susan Petersen. 2006. The use of tactile modeling and physical guidance as instructional strategies in physical activity for children who are blind. Journal of Visual Impairment & Blindness 100, 8 (Aug. 2006), 471–477.
[42] Sile O’Modhrain and R. Brent Gillespie. 2018. Once more, with feeling: Revisiting the role of touch in performer-instrument interaction. In Musical Haptics, Stefano Papetti and Charalampos Saitis (Eds.). Springer International Publishing, Cham, 11–27. Series Title: Springer Series on Touch and Haptic Systems.
[43] Hyu-Yong Park. 2015. How useful is Braille music?: A critical review. International Journal of Disability, Development and Education 62, 3 (May 2015), 303–318.
[44] Hye Young Park. 2017. Finding meaning through musical growth: Life histories of visually impaired musicians. Musicae Scientiae 21, 4 (Dec. 2017), 405–417.
[45] William Payne. 2022. Sounds and (Braille) cells: Co-designing music technology with blind and visually impaired musicians. In Proceedings of the 19th International Web for All Conference. ACM, Lyon, France, 1–3.
[46] William Payne and Amy Hurst. 2023. “We avoid PDFs”: Improving notation access for blind and visually impaired musicians. In Information for a Better World: Normality, Virtuality, Physicality, Inclusivity, Isaac Sserwanga, Anne Goulding, Heather Moulaison-Sandy, Jia Tina Du, António Lucas Soares, Viviane Hessami, and Rebecca D. Frank (Eds.). Vol. 13972. Springer Nature Switzerland, Cham, 581–597. Series Title: Lecture Notes in Computer Science.
[47] Mahika Phutane, Julie Wright, Brenda Veronica Castro, Lei Shi, Simone R. Stern, Holly M. Lawson, and Shiri Azenkot. 2022. Tactile materials in practice: Understanding the experiences of teachers of the visually impaired. ACM Transactions on Accessible Computing 15, 3 (Sept. 2022), 1–34.
[48] Angela Pino and Laia Viladot. 2019. Teaching-learning resources and supports in the music classroom: Key aspects for the inclusion of visually impaired students. British Journal of Visual Impairment 37, 1 (Jan. 2019), 17–28.
[49] Bruce W. Quaglia. 2015. Planning for student variability: Universal design for learning in the music theory classroom and curriculum. (2015), 21.
[50] IBOS MusicXML Reader. 2022. Presentation of IBOS MusicXML Reader: Design and Navigation - YouTube. https://www.youtube.com/watch?v=J0DqXA9CRV4
[51] Reddit. 2022. Blind and Visually Impaired Community. https://www.reddit.com/r/Blind/
[52] Anne Spencer Ross, Ed Cutrell, Alex Fiannaca, Melanie Kneisel, and Meredith Ringel Morris. 2019. Use cases and impact of audio-based virtual exploration. (2019).
[53] Jenny Seham and Anna J. Yeo. 2015. Extending our vision: Access to inclusive dance education for people with visual impairment. Journal of Dance Education 15, 3 (July 2015), 91–99.
[54] trint.com. 2022. Transcribe video and audio to text | Content editor | Trint. https://trint.com/
[55] Luca Turchet, David Baker, and Tony Stockman. 2021. Musical haptic wearables for synchronisation of visually-impaired performers: A co-design approach. In ACM International Conference on Interactive Media Experiences. ACM, Virtual Event, USA, 20–27.
[56] Janet van der Linden, Erwin Schoonderwaldt, Jon Bird, and Rose Johnson. 2011. MusicJacket—combining motion capture and vibrotactile feedback to teach violin bowing. IEEE Transactions on Instrumentation and Measurement 60, 1 (Jan. 2011), 104–113.
[57] Travis J. West, Alexandra Bachmayer, Sandeep Bhagwati, Joanna Berzowska, and Marcelo M. Wanderley. 2019. The design of the body:Suit:Score, a full-body vibrotactile musical score. In Human Interface and the Management of Information. Information in Intelligent Systems, Sakae Yamamoto and Hirohiko Mori (Eds.). Vol. 11570. Springer International Publishing, Cham, 70–89. Series Title: Lecture Notes in Computer Science.
[58] K. Wolffe and S. Z. Sacks. 1997. The lifestyles of blind, low vision, and sighted youths: A quantitative comparison. Journal of Visual Impairment & Blindness 91, 3 (May 1997), 245–257.
[59] www.andrelouis.com. 2022. QWS homepage. http://www.andrelouis.com/qws/
[60] www.contrapunctus.it. 2022. Welcome to Contrapunctus Project site | Contrapunctus. https://www.contrapunctus.it/
[61] www.cyberbass.com. 2022. CyberBass Home Page. http://www.cyberbass.com/
[62] www.dancingdots.com. 2022. Lime Aloud from Dancing Dots. https://www.dancingdots.com/prodesc/limealoud.htm
[63] www.dancingdots.com. 2022. Lime Lighter: Music-Reading Solution for Low Vision Performers - Dancing Dots. https://dancingdots.com/limelighter/limelightermain.htm
[64] www.sibelius.com. 2022. Sibelius - the leading music composition and notation software. http://www.sibelius.com/helpcenter/article.php?id=444&languageid=1&searchid=2644712
[65] Gus G. Xia, Carter O. Jacobsen, Qiawen Chen, Xing-Dong Yang, and Roger B. Dannenberg. 2018. ShIFT: A semi-haptic interface for flute tutoring. In NIME. 6.
[66] Zeynep Yildiz and Ozge Subasi. 2023. Virtual collaboration tools for mixed-ability workspaces: A cross disability solidarity case from Turkey. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. ACM, Hamburg, Germany, 1–11.
[67] zoom.us. 2022. Video Conferencing, Cloud Phone, Webinars, Chat, Virtual Events | Zoom. https://zoom.us/
