Abstract
We propose a new method for user-independent gesture recognition from time-varying images. The method combines relative-motion extraction with discriminant analysis to provide online learning and recognition. Motion information is extracted efficiently and robustly, and the method is computationally inexpensive, allowing real-time operation on a personal computer. Its performance has been tested on several data sets and good generalization has been observed: the method is robust to changes in background and illumination conditions, to the users’ external appearance and spatial location, and it successfully copes with non-uniform gesture performance speed. No manual segmentation of any kind, nor the use of markers, is necessary. With these features, the method could serve as part of more refined human-computer interfaces.
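To make the two stages mentioned in the abstract concrete, the sketch below pairs a crude frame-difference motion feature (only a stand-in for the paper’s relative-motion extraction, whose exact formulation is not reproduced here) with multi-class Fisher discriminant analysis and nearest-centroid classification. This is a minimal offline sketch under stated assumptions, not the authors’ implementation: the function names, the grid pooling, the regularization constant, and the classification rule are illustrative choices.

```python
import numpy as np

def motion_features(frames, grid=(8, 8)):
    """Stand-in for relative-motion extraction (assumption, not the paper's method):
    absolute inter-frame differences of a (T, H, W) grayscale clip, average-pooled
    over a coarse grid and over time, giving one fixed-length vector per gesture.
    Assumes H and W are divisible by the grid dimensions."""
    diffs = np.abs(np.diff(frames.astype(float), axis=0))      # (T-1, H, W)
    n, H, W = diffs.shape
    gh, gw = grid
    pooled = diffs.reshape(n, gh, H // gh, gw, W // gw).mean(axis=(2, 4))
    return pooled.mean(axis=0).ravel()                         # (gh * gw,)

def fit_lda(X, y):
    """Multi-class Fisher discriminant analysis: find projections maximizing
    between-class scatter relative to within-class scatter, then store the
    class centroids in the discriminant space."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))   # within-class scatter
    Sb = np.zeros((d, d))   # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - overall_mean)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # Eigenvectors of Sw^{-1} Sb; small ridge term keeps Sw invertible (assumption).
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(evals.real)[::-1][: len(classes) - 1]
    W = evecs.real[:, order]
    centroids = {c: (X[y == c] @ W).mean(axis=0) for c in classes}
    return W, centroids

def predict(x, W, centroids):
    """Assign a feature vector to the nearest class centroid in discriminant space."""
    z = x @ W
    return min(centroids, key=lambda c: np.linalg.norm(z - centroids[c]))
```

Usage would follow the user-independent setting described in the abstract: compute `motion_features` for clips recorded from several users, fit the discriminant space on a training subset, and classify clips from unseen users with `predict`. The paper’s online learning scheme is not modeled here.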
Author information
Bisser R. Raytchev: He received his B.S. and M.S. degrees in electronics from Tokai University, Japan, in 1995 and 1997, respectively. He is currently a doctoral student in electronics and information sciences at Tsukuba University, Japan. His research interests include biological and computer vision, pattern recognition, and neural networks.
Osamu Hasegawa, Ph.D.: He received the B.E. and M.E. degrees in Mechanical Engineering from the Science University of Tokyo in 1988 and 1990, respectively, and the Ph.D. degree in Electrical Engineering from the University of Tokyo in 1993. He is currently a senior research scientist at the Electrotechnical Laboratory (ETL), Tsukuba, Japan. His research interests include computer vision and multi-modal human interfaces. Dr. Hasegawa is a member of the AAAI, the Institute of Electronics, Information and Communication Engineers, Japan (IEICE), the Information Processing Society of Japan, and others.
Nobuyuki Otsu, Ph.D.: He received the B.S., M.Eng., and Dr.Eng. degrees in Mathematical Engineering from the University of Tokyo in 1969, 1971, and 1981, respectively. Since joining ETL in 1971, he has been engaged in theoretical research on pattern recognition and multivariate data analysis, and their applications to image recognition in particular. After serving as Head of the Mathematical Informatics Section (from 1985) and as ETL Chief Senior Scientist (from 1990), he has been Director of the Machine Understanding Division since 1991 and, concurrently, a professor in the graduate school of Tsukuba University since 1992. He has been involved in the Real World Computing program, directing the project’s R&D as Head of the Real World Intelligence Center at ETL. Dr. Otsu is a member of the Behaviormetric Society of Japan and the IEICE, among others.
About this article
Cite this article
Raytchev, B., Hasegawa, O. & Otsu, N. User-independent gesture recognition by relative-motion extraction and discriminant analysis. New Gener Comput 18, 117–126 (2000). https://doi.org/10.1007/BF03037590