DOI: 10.1145/2542284.2542288

Concert viewing headphones

Published: 19 November 2013

Abstract

We designed concert viewing headphones that let a user listening to and watching a musical performance focus on the particular part he or she wants to hear and see. The headphones are equipped with a projector, an inclination sensor on top of the headphones, and a distance sensor on the outside of the right speaker (Figure 1). For example, when listening to jazz, the user might want to clearly hear and see the guitar or the sax. By moving the head left or right, the user can hear each part as if from a frontal position, and by simply putting a hand behind the ear, the user operates the distance sensor on the headphones and focuses on a particular part he or she wants to hear and see. Previously reported headphones with sensors for detecting the direction the user is facing or the location of the head can enhance the musical presence and create a realistic impression, but they do not control the volume and panoramic potentiometer settings of each part in accordance with the user's wishes [Pachet and Delerue 2000]. We previously developed sound scope headphones that enable users to change the sound mixing depending on their head direction [Hamanaka and Lee 2009], but that system could not handle images. The concert viewing headphones have both image and sound processing functions. The image processing extracts the portion of the image indicated by the user and projects it, free of distortion, on walls located to the front and side of the user. The sound processing creates imaginary microphones for performers who do not have one so that the user can hear the sound of any performer. Testing with images and sounds captured using a fisheye-lens camera and 37 lavalier microphones showed that sound localization was fastest when an inverse square function was used for the sound mixing and that the zoom function was useful for locating the desired part of the performance.
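
The abstract reports that sound localization was fastest when an inverse square function was used for the sound mixing, but it does not spell out the formula. The Python sketch below shows one plausible interpretation, assuming the gain of each part falls off with the inverse square of the angular distance between the user's head direction and that part's position on stage; the part names, azimuths, and the zoom and eps parameters are illustrative assumptions, not values taken from the paper.

# A minimal sketch of inverse-square sound mixing driven by head direction.
# Part names and azimuths are hypothetical; they are not taken from the paper.
PARTS = {
    "guitar": -40.0,  # degrees left of stage center
    "sax": 0.0,       # stage center
    "piano": 25.0,
    "drums": 60.0,
}

def part_gains(head_azimuth_deg, zoom=1.0, eps=5.0):
    """Weight each part by the inverse square of its angular distance from the
    head direction; zoom sharpens the falloff (emulating the hand-behind-the-ear
    gesture) and eps avoids division by zero when the user faces a part directly."""
    raw = {
        name: 1.0 / (eps + zoom * (head_azimuth_deg - az) ** 2)
        for name, az in PARTS.items()
    }
    total = sum(raw.values())
    return {name: g / total for name, g in raw.items()}  # normalize so gains sum to 1

# Facing the sax with a strong zoom: the sax dominates the mix.
print(part_gains(head_azimuth_deg=0.0, zoom=4.0))

Applied to each part's audio track, such gains would emphasize the part the user faces while attenuating, but not silencing, the others.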

References

[1] Pachet, F., and Delerue, O. 2000. On-the-Fly Multi-Track Mixing. In Proceedings of the AES 109th Convention. Los Angeles: Audio Engineering Society.
[2] Hamanaka, M., and Lee, S. H. 2009. Sound Scope Headphones. In SIGGRAPH 2009 Talks (TK-201) / Emerging Technologies (ET-201).
[3] Tachi, S. 2007. TWISTER: Immersive Omnidirectional Autostereoscopic 3D Booth for Mutual Telexistence. In Proceedings of ASIAGRAPH, pp. 1--6.
[4] Iwata, H. 2004. Full-Surround Image Display Technologies. International Journal of Computer Vision 58, 3, pp. 227--235.
[5] Google. 2012. Google Maps with Street View. http://maps.google.com/intl/en/help/maps/streetview/.
[6] Immersive Media. 2013. Immersive Media. http://www.immersivemedia.com/demos/index.php.
[7] Goto, M., Hashiguchi, H., Nishimura, T., and Oka, R. 2003. RWC Music Database: Music Genre Database and Musical Instrument Sound Database. In Proceedings of the 4th International Conference on Music Information Retrieval (ISMIR '03), pp. 229--230.

Published In

SA '13: SIGGRAPH Asia 2013 Emerging Technologies
November 2013, 43 pages
ISBN: 9781450326322
DOI: 10.1145/2542284

    Publisher

    Association for Computing Machinery

    New York, NY, United States

Qualifiers

• Research-article

Conference

SA '13: SIGGRAPH Asia 2013
November 19-22, 2013
Hong Kong, Hong Kong

Acceptance Rates

Overall Acceptance Rate: 178 of 869 submissions, 20%

