
Detection and Localisation of Pointing, Pairing and Grouping Gestures for Brainstorming Meeting Applications

  • Conference paper
HCI International 2021 - Posters (HCII 2021)

Abstract

The detection and interpretation of gestures is crucial for blind and visually impaired people (BVIP). In a card-based brainstorming meeting, sighted users employ non-verbal communication when referring to cards on a common workspace, using pointing, grouping, or pairing gestures. While sighted users can easily interpret such gestures, they remain inaccessible to BVIP. Thus, there is a need for capturing, interpreting, and translating gestures for BVIP.

To address this problem, we developed a pointing gesture detection system using Unity with the SteamVR Plugin and HTC Vive. HTC's trackers are attached to a user's hands to measure the hand position in 3D space. With pointing gestures, a user controls a virtual ray that intersects with a virtual whiteboard. This virtual whiteboard is invisible to the sighted users, but its position and size correspond to a physical whiteboard. The intersection of the ray with the virtual whiteboard is calculated, resulting in a pointing trajectory on it. The shape of the trajectory is analyzed to determine which artifacts are selected by the pointing gesture. A pointing gesture is detected when a user points at a card on the screen and then ends the gesture by pointing outside of the screen. A pairing gesture is detected when pointing at one artifact and then at another one before leaving the screen. The grouping gesture is detected when performing an encircling gesture around multiple artifacts before leaving the screen.
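The pipeline described above — a tracked hand defines a ray, the ray is intersected with the plane of the virtual whiteboard, and the resulting trajectory is matched against card positions to classify the gesture — can be sketched as follows. This is a minimal illustration, not the paper's Unity/SteamVR implementation: the function names, the parallel-ray threshold, and the axis-aligned card rectangles are assumptions made for the sketch.

```python
def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Intersect the pointing ray with the virtual whiteboard plane.

    All arguments are 3-tuples. Returns the 3D intersection point, or
    None when the ray is parallel to the plane or the plane lies
    behind the user.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(plane_normal, direction)
    if abs(denom) < 1e-9:                      # ray parallel to the board
        return None
    diff = tuple(p - o for p, o in zip(plane_point, origin))
    t = dot(plane_normal, diff) / denom
    if t < 0:                                  # board is behind the user
        return None
    return tuple(o + t * d for o, d in zip(origin, direction))


def classify_trajectory(hits, card_bounds):
    """Classify an on-board pointing trajectory into a gesture.

    `hits` is the ordered list of 2D board coordinates visited before
    the ray left the board; `card_bounds` maps a card id to its
    (xmin, ymin, xmax, ymax) rectangle. One card hit before leaving the
    board is a pointing gesture, two a pairing gesture, and more than
    two (e.g. an encircling stroke) a grouping gesture.
    """
    visited = []
    for x, y in hits:
        for card, (x0, y0, x1, y1) in card_bounds.items():
            if x0 <= x <= x1 and y0 <= y <= y1 and card not in visited:
                visited.append(card)           # record cards in hit order
    if len(visited) == 1:
        return ("pointing", visited)
    if len(visited) == 2:
        return ("pairing", visited)
    if len(visited) > 2:
        return ("grouping", visited)
    return ("none", visited)
```

In the actual system the ray origin and direction would come from the Vive tracker pose each frame, and the 3D intersection point would be projected into the whiteboard's 2D coordinate system before classification.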




Correspondence to Naina Dhingra.



Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Liechti, S., Dhingra, N., Kunz, A. (2021). Detection and Localisation of Pointing, Pairing and Grouping Gestures for Brainstorming Meeting Applications. In: Stephanidis, C., Antona, M., Ntoa, S. (eds) HCI International 2021 - Posters. HCII 2021. Communications in Computer and Information Science, vol 1420. Springer, Cham. https://doi.org/10.1007/978-3-030-78642-7_4


  • DOI: https://doi.org/10.1007/978-3-030-78642-7_4


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-78641-0

  • Online ISBN: 978-3-030-78642-7

  • eBook Packages: Computer Science (R0)
