
MAPVI: meeting accessibility for persons with visual impairments

Published: 05 June 2019

Abstract

In recent years, the inclusion of persons with visual impairments (PVI) has made tremendous strides, especially with regard to group meetings. However, a significant part of communication is conveyed non-verbally and is commonly inaccessible, for example deictic pointing gestures or the facial expressions and body language of participants. In this vision paper, we present an overview of our project MAPVI, which proposes new technologies for making meetings more accessible to PVI. We explore which relevant information has to be tracked and how it can be sensed for the users. Finally, the captured information is translated into a variety of haptic feedback to make it accessible.
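The abstract outlines a pipeline: sense a non-verbal cue, then translate it into haptic feedback. A minimal illustrative sketch of that idea in Python follows; the event type, the ring-of-motors layout, and the intensity values are assumptions for illustration, not the MAPVI implementation.

```python
# Hypothetical sketch of the sensing-to-haptics pipeline the abstract
# describes: a non-verbal event (here, a deictic pointing gesture) is
# captured and translated into a vibrotactile pattern ("tacton").
# All names and parameters are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class PointingEvent:
    speaker: str
    target: str          # e.g., an item on a shared whiteboard
    azimuth_deg: float   # pointing direction relative to the PVI user

def to_tacton(event: PointingEvent, motors: int = 8) -> list[int]:
    """Map a pointing direction onto a ring of vibration motors.

    Returns a per-motor intensity list (0-255): the motor closest to
    the pointing direction vibrates strongest, its neighbours weakly.
    """
    pattern = [0] * motors
    step = 360 / motors                     # angular width per motor
    idx = round(event.azimuth_deg / step) % motors
    pattern[idx] = 255                      # main direction cue
    pattern[(idx - 1) % motors] = 96        # weaker neighbours smooth
    pattern[(idx + 1) % motors] = 96        # the perceived direction
    return pattern

event = PointingEvent(speaker="Alice", target="sticky note 3", azimuth_deg=90.0)
print(to_tacton(event))  # motor 2 (90° on an 8-motor ring) strongest
```

A real system would feed such patterns to a vibrotactile belt or glove driver; the point here is only the translation step from a sensed cue to a structured tactile message.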




Published In

PETRA '19: Proceedings of the 12th ACM International Conference on PErvasive Technologies Related to Assistive Environments
June 2019
655 pages
ISBN:9781450362320
DOI:10.1145/3316782
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. assistive technologies
  2. haptics
  3. machine learning
  4. meetings

Qualifiers

  • Research-article

Funding Sources

  • FWF
  • DFG
  • SNF

Conference

PETRA '19


Cited By

  • (2024) Exploring Space: User Interfaces for Blind and Visually Impaired People for Spatial and Non-verbal Information. Computers Helping People with Special Needs. DOI: 10.1007/978-3-031-62846-7_32, 267--274. Online publication date: 8-Jul-2024.
  • (2022) Demonstrating MagneTisch: Tangibles in Motion on an Interactive Surface. Proceedings of Mensch und Computer 2022. DOI: 10.1145/3543758.3547520, 590--593. Online publication date: 4-Sep-2022.
  • (2022) Accessible User Interface Concept for Business Meeting Tool Support Including Spatial and Non-verbal Information for Blind and Visually Impaired People. Computers Helping People with Special Needs. DOI: 10.1007/978-3-031-08648-9_37, 321--328. Online publication date: 1-Jul-2022.
  • (2020) Accessible Multimodal Tool Support for Brainstorming Meetings. Computers Helping People with Special Needs. DOI: 10.1007/978-3-030-58805-2_2, 11--20. Online publication date: 4-Sep-2020.
  • (2020) Recognition and Localisation of Pointing Gestures Using a RGB-D Camera. HCI International 2020 - Posters. DOI: 10.1007/978-3-030-50726-8_27, 205--212. Online publication date: 10-Jul-2020.
  • (2019) Res3ATN - Deep 3D Residual Attention Network for Hand Gesture Recognition in Videos. 2019 International Conference on 3D Vision (3DV). DOI: 10.1109/3DV.2019.00061, 491--501. Online publication date: Sep-2019.
