DOI: 10.1145/2212776.2223682

Super Mirror: a kinect interface for ballet dancers

Published: 05 May 2012

Abstract

We propose the Super Mirror, a Kinect-based system that combines the functionality of studio mirrors and prescriptive images to provide the user with instructional feedback in real time. In this study, we developed a working prototype of this system, which records ballet movements (also called positions and poses), captures live motion, and shows the difference between the two.

Supplementary Material

JPG image: wpfile551-3.jpg
Supplemental video: wpfile551-3.mov



Reviews

Reviewer: Partha Pratim Das

This paper introduces Super Mirror, interactive software that provides real-time feedback for ballet dancers. It animates the correct (desired) dance pose at every step of the ballet's choreography and gives feedback on the angular discrepancy between a step performed by a novice dancer and the correct dance step. Super Mirror captures the dancer's motion in real time, and the motion data for the live pose is compared with prerecorded dance poses to obtain the angular difference. The Microsoft Kinect sensor is used to capture motion in the form of depth data, from which the software generates a skeletal model of the dancer showing the positions of the joints. The system compares the angles at the knee and hip joints, using a threshold on how far the real-time angles may deviate from the prerecorded ones. A particular joint position is considered a "hit" if the difference falls below the threshold; otherwise, the correct final position and the movements leading to that position are displayed on the screen. Thresholds can be specified for a movement by setting the values of one or more angles through the system's graphical user interface. In addition, snapshots can be taken to record positional data for the live performer.

A research group from the University of Texas at Austin has been working on Super Mirror, and this paper is a work-in-progress report. The group tested the system on seven ballet poses; it is reported to work well in recognizing four of the seven, while the remaining poses still need improvement. At present, Super Mirror recognizes static poses only. In the future, the authors hope to process dynamic movements, for example a succession of two or more prerecorded poses, for comparison and analysis.

Online Computing Reviews Service
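As a rough illustration of the angle-threshold check described above, the sketch below computes the angle at a single joint from three 3D skeleton points and flags a "hit" when the live angle falls within a chosen tolerance of a prerecorded reference. This is a minimal sketch under stated assumptions: the joint coordinates, the 15-degree threshold, and the function names are illustrative, not details taken from the paper.

    # Minimal sketch of per-joint angle comparison (assumed values, not the paper's implementation).
    import numpy as np

    def joint_angle(a, b, c):
        """Angle in degrees at joint b, formed by the segments b->a and b->c."""
        a, b, c = map(np.asarray, (a, b, c))
        u, v = a - b, c - b
        cos_angle = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

    def is_hit(live_deg, reference_deg, threshold_deg=15.0):
        """A joint counts as a 'hit' when the live angle is within the threshold of the reference."""
        return abs(live_deg - reference_deg) <= threshold_deg

    # Hypothetical 3D joint positions (metres), roughly as a Kinect skeleton might report them.
    live_hip, live_knee, live_ankle = (0.10, 0.90, 2.0), (0.12, 0.50, 2.0), (0.30, 0.15, 2.0)
    ref_hip, ref_knee, ref_ankle = (0.10, 0.90, 2.0), (0.10, 0.50, 2.0), (0.45, 0.20, 2.0)

    live = joint_angle(live_hip, live_knee, live_ankle)
    ref = joint_angle(ref_hip, ref_knee, ref_ankle)
    print(f"live knee angle: {live:.1f} deg, reference: {ref:.1f} deg, hit: {is_hit(live, ref)}")

Presumably the full system repeats a comparison of this kind for each tracked joint on every frame, with the per-angle thresholds set through the GUI as the review describes.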



Published In

CHI EA '12: CHI '12 Extended Abstracts on Human Factors in Computing Systems
May 2012
2864 pages
ISBN: 9781450310161
DOI: 10.1145/2212776

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. 3d
  2. ballet
  3. education
  4. kinect
  5. motion capture

Qualifiers

  • Extended-abstract

Conference

CHI '12

Acceptance Rates

Overall Acceptance Rate 6,164 of 23,696 submissions, 26%

Article Metrics

  • Downloads (last 12 months): 35
  • Downloads (last 6 weeks): 2
Reflects downloads up to 23 Dec 2024

Cited By

  • (2024) Expanding the Design Space of Vision-based Interactive Systems for Group Dance Practice. Proceedings of the 2024 ACM Designing Interactive Systems Conference, pp. 2768-2787. DOI: 10.1145/3643834.3661568. Online publication date: 1-Jul-2024.
  • (2024) Understanding Feedback in Rhythmic Gymnastics Training: An Ethnographic-Informed Study of a Competition Class. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, pp. 1-16. DOI: 10.1145/3613904.3642434. Online publication date: 11-May-2024.
  • (2024) Innovative toolkit for objective evaluation of traditional Thai dancing by intern teachers using Kinect, MediaPipe, and microcontroller technology. Cogent Arts & Humanities, vol. 11, no. 1. DOI: 10.1080/23311983.2024.2433313. Online publication date: 25-Nov-2024.
  • (2023) System for Estimation of Human Anthropometric Parameters Based on Data from Kinect v2 Depth Camera. Sensors, vol. 23, no. 7, article 3459. DOI: 10.3390/s23073459. Online publication date: 25-Mar-2023.
  • (2023) A Simulcast System for Live Streaming and Virtual Avatar Concerts. Journal of the Korea Computer Graphics Society, vol. 29, no. 2, pp. 21-30. DOI: 10.15701/kcgs.2023.29.2.21. Online publication date: 1-Jun-2023.
  • (2023) LearnThatDance: Augmenting TikTok Dance Challenge Videos with an Interactive Practice Support System Powered by Automatically Generated Lesson Plans. Adjunct Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, pp. 1-4. DOI: 10.1145/3586182.3615801. Online publication date: 29-Oct-2023.
  • (2023) Ballet Gesture Recognition and Evaluation System (Posé Ballet): Dynamic Improvement from Laboratory to Art Gallery. 2023 Research, Invention, and Innovation Congress: Innovative Electricals and Electronics (RI2C), pp. 98-103. DOI: 10.1109/RI2C60382.2023.10356000. Online publication date: 24-Aug-2023.
  • (2023) Ballet Gesture Recognition and Evaluation System (Posé Ballet): System Thinking, Design Thinking, and Dynamic Improvement in Three Versions. 2023 27th International Computer Science and Engineering Conference (ICSEC), pp. 362-370. DOI: 10.1109/ICSEC59635.2023.10329663. Online publication date: 14-Sep-2023.
  • (2023) Ballet Gesture Recognition and Evaluation System (Posé Ballet): System Thinking, Design Thinking, and Dynamic Improvement in Three Versions from Laboratory to Art Gallery. Methods and Applications for Modeling and Simulation of Complex Systems, pp. 105-124. DOI: 10.1007/978-981-99-7240-1_9. Online publication date: 13-Oct-2023.
  • (2022) Dance Practice System that Shows What You Would Look Like if You Could Master the Dance. Proceedings of the 8th International Conference on Movement and Computing, pp. 1-8. DOI: 10.1145/3537972.3537991. Online publication date: 22-Jun-2022.
  • Show More Cited By
