Linguist, retired from the Linguistic Data Consortium at the University of Pennsylvania. Previously Senior Linguist at Dragon Systems, Inc. Ph.D. in linguistics from Berkeley. Address: Philadelphia, Pennsylvania, United States
HLT '94: Proceedings of the workshop on Human Language Technology, 1994
The goal of this study is to evaluate the potential for using large vocabulary continuous speech recognition as an engine for automatically classifying utterances according to the language being spoken. The problem of language identification is often thought of as being separate from the problem of speech recognition. But in this paper, as in Dragon's earlier work on topic and speaker identification, we explore a unifying approach to all three message classification problems based on the underlying stochastic process which gives rise to speech. We discuss the theoretical framework upon which our message classification systems are built and report on a series of experiments in which this theory is tested, using large vocabulary continuous speech recognition to distinguish English from Spanish.
A left parietal infarct in a prelingually deaf person resulted in an aphasia for both American Sign Language (ASL) and written and finger-spelled English. Originally the patient had a nearly global aphasia affecting all language systems. By five to seven weeks post-onset her symptoms resembled those of hearing aphasics with posterior lesions: fluent but paraphasic signing, anomia, impaired comprehension and repetition, alexia, and agraphia with elements of neologistic jargon. In addition, there was a pronounced sequential movement copying disorder, reduced short-term verbal memory, and acalculia. In general, the patient's sign errors showed a consistent disruption in the structure of ASL signs which parallels the speech errors of oral aphasic patients. We conclude that most aphasic symptoms are not modality-dependent, but rather reflect a disruption of linguistic processes common to all human languages. This case confirms the importance of the left hemisphere in the processing of sign language. Furthermore, the results indicate that the left supramarginal and angular gyri are necessary substrates for the comprehension of visual/gestural languages.
Papers by Mark A. Mandel
PMID: 7066673 [PubMed - indexed for MEDLINE]