guitARhero: Interactive Augmented Reality Guitar Tutorials
Pages 4676–4685
Abstract
This paper presents guitARhero, an Augmented Reality application for interactively teaching guitar playing to beginners through responsive visualizations overlaid on the guitar neck. We support two types of visual guidance, highlighting of the frets that need to be pressed and a 3D hand overlay, as well as two display scenarios, a desktop magic mirror and a video see-through head-mounted display. We conducted a user study with 20 participants to evaluate how well users could follow instructions presented with different guidance and display combinations, and compared these conditions to a baseline in which users followed video instructions. Our study highlights a trade-off between the amount of information provided and visual clarity, which affects the user's ability to interpret and follow instructions for fine-grained tasks. We show that the perceived usefulness of integrating instructions into an HMD view depends strongly on the hardware capabilities and the level of instruction detail.
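To make the fret-highlighting idea concrete, below is a minimal sketch, not the paper's implementation, of how a tab instruction (string, fret) could be mapped to a 3D anchor point on a tracked guitar neck. The scale length, string spacing, and the tracked nut position and neck/fretboard direction vectors are assumed inputs; all names and values are illustrative only.

```python
# Hypothetical sketch: place a fret highlight on a tracked guitar neck.
# Assumes the tracker supplies the nut position, a unit vector along the neck,
# and a unit vector across the fretboard. Constants are typical values, not
# taken from the paper.
import numpy as np

SCALE_LENGTH_M = 0.648      # typical acoustic guitar scale length
STRING_SPACING_M = 0.0105   # approximate spacing between adjacent strings

def fret_distance_from_nut(fret: int, scale_length: float = SCALE_LENGTH_M) -> float:
    """Distance from the nut to the given fret (equal-tempered fret rule)."""
    return scale_length * (1.0 - 2.0 ** (-fret / 12.0))

def highlight_position(string: int, fret: int,
                       nut_pos: np.ndarray,
                       neck_dir: np.ndarray,
                       across_dir: np.ndarray) -> np.ndarray:
    """3D point where a fret highlight would be rendered.

    string: 1 (high e) to 6 (low E); fret: 0 = open string.
    The highlight sits halfway between the target fret and the previous one,
    which is roughly where the finger presses.
    """
    along = 0.5 * (fret_distance_from_nut(fret) +
                   fret_distance_from_nut(max(fret - 1, 0)))
    across = (string - 3.5) * STRING_SPACING_M  # centered on the fretboard
    return nut_pos + along * neck_dir + across * across_dir

if __name__ == "__main__":
    # Example: 2nd string, 1st fret (the C note in a C major chord)
    nut = np.zeros(3)
    p = highlight_position(2, 1, nut, np.array([1.0, 0, 0]), np.array([0, 1.0, 0]))
    print(p)
```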
Publication Information
© 2023 The Authors. This work is licensed under a Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/).
Publisher: IEEE Educational Activities Department, United States
Published: 01 November 2023
Article type: Research article