Pattern recognition and synthesis for sign language translation system

Published: 31 October 1994
  Abstract

    Sign language is one means of communication for hearing-impaired people. Words and sentences in sign language are expressed mainly through hand gestures. In this report, we describe a sign language translation system that we are developing. The system translates Japanese sign language into Japanese and vice versa. Hand shape and position data are input using a DataGlove; the input hand motions are recognized and translated into Japanese sentences. In the reverse direction, Japanese text is translated into sign language rendered as 3-D computer-graphics animation of sign gestures.
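
    The abstract outlines a bidirectional pipeline: DataGlove sensor frames are recognized as sign words and assembled into Japanese sentences, while Japanese text is mapped back to sign gestures for 3-D animation. The Python sketch below illustrates that data flow only; the class and function names (GloveFrame, recognize_sign, and so on), the template lexicon, and the nearest-template matcher are illustrative assumptions, not the recognition or synthesis methods used in the paper.

    ```python
    from dataclasses import dataclass
    from typing import Dict, List, Tuple

    # A single DataGlove sample: finger-joint flexion angles plus 3-D hand position.
    @dataclass
    class GloveFrame:
        finger_angles: List[float]          # flexion values from the glove sensors
        position: Tuple[float, float, float]  # (x, y, z) hand position from the tracker

    # Hypothetical lexicon: each sign word maps to a reference gesture
    # (a short sequence of frames) and to its Japanese gloss.
    SIGN_LEXICON: Dict[str, List[GloveFrame]] = {}
    JAPANESE_GLOSS: Dict[str, str] = {}

    def frame_distance(a: GloveFrame, b: GloveFrame) -> float:
        """Euclidean distance between two glove frames (illustrative only)."""
        d = sum((x - y) ** 2 for x, y in zip(a.finger_angles, b.finger_angles))
        d += sum((x - y) ** 2 for x, y in zip(a.position, b.position))
        return d ** 0.5

    def recognize_sign(frames: List[GloveFrame]) -> str:
        """Match an input gesture against the lexicon by average frame distance.
        (A stand-in for the paper's recognizer.)"""
        best_word, best_score = "", float("inf")
        for word, template in SIGN_LEXICON.items():
            n = min(len(frames), len(template))
            if n == 0:
                continue
            score = sum(frame_distance(frames[i], template[i]) for i in range(n)) / n
            if score < best_score:
                best_word, best_score = word, score
        return best_word

    def signs_to_japanese(sign_words: List[str]) -> str:
        """Assemble a Japanese sentence from the glosses of recognized signs."""
        return "".join(JAPANESE_GLOSS.get(w, w) for w in sign_words)

    def japanese_to_animation(text: str) -> List[List[GloveFrame]]:
        """Reverse direction: look up the gesture for each word and return the
        frame sequences a 3-D character-animation module would play back."""
        # Word segmentation of Japanese text is assumed to happen elsewhere.
        words = text.split()
        return [SIGN_LEXICON[w] for w in words if w in SIGN_LEXICON]
    ```

    Because the system recognizes continuous hand motion, a real implementation would need temporal alignment of the input against gesture templates rather than the fixed frame-by-frame comparison shown above.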


    Cited By

    • (2018) "Data-driven development of Virtual Sign Language Communication Agents", 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), pp. 370-377. DOI: 10.1109/ROMAN.2018.8525717. Online publication date: 27-Aug-2018.
    • (2006) "Virtual sign animated pedagogic agents to support computer education for deaf learners", ACM SIGACCESS Accessibility and Computing, pp. 40-44. DOI: 10.1145/1196148.1196158. Online publication date: 1-Sep-2006.
    • (2006) "Towards a dialogue system based on recognition and synthesis of Japanese sign language", Gesture and Sign Language in Human-Computer Interaction, pp. 259-271. DOI: 10.1007/BFb0053005. Online publication date: 19-May-2006.
    • (1997) "Description and recognition methods for sign language based on gesture components", Proceedings of the 2nd international conference on Intelligent user interfaces, pp. 97-104. DOI: 10.1145/238218.238310. Online publication date: 6-Jan-1997.

    Published In

    Assets '94: Proceedings of the first annual ACM conference on Assistive technologies
    October 1994
    158 pages
    ISBN:0897916492
    DOI:10.1145/191028
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Conference

    ASSETS94
    ASSETS94: First International ACM/SIGCAPH Conference on Assistive Technologies
    October 31 - November 1, 1994
    Marina Del Rey, California, USA

    Acceptance Rates

    Overall Acceptance Rate: 436 of 1,556 submissions, 28%

    Article Metrics

    • Downloads (Last 12 months): 45
    • Downloads (Last 6 weeks): 9
    Reflects downloads up to 11 Aug 2024
