  • Mo W, Dechant M, Marquardt N, Ayobi A, Singh A and Holloway C. Exploring the Design Space of Input Modalities for Working in Mixed Reality on Long-haul Flights. Proceedings of the 2024 ACM Designing Interactive Systems Conference. (2267-2285).

    https://doi.org/10.1145/3643834.3661560

  • Dupré C, Appert C, Rey S, Saidi H and Pietriga E. TriPad: Touch Input in AR on Ordinary Surfaces with Hand Tracking Only. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems. (1-18).

    https://doi.org/10.1145/3613904.3642323

  • Perella-Holfeld F, Faleel S and Irani P. Evaluating design guidelines for hand proximate user interfaces. Proceedings of the 2023 ACM Designing Interactive Systems Conference. (1159-1173).

    https://doi.org/10.1145/3563657.3596117

  • Müller F, Schmitt D, Matviienko A, Schön D, Günther S, Kosch T and Schmitz M. TicTacToes: Assessing Toe Movements as an Input Modality. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. (1-17).

    https://doi.org/10.1145/3544548.3580954

  • Paredes L, Ipsita A, Mesa J, Martinez Garrido R and Ramani K. (2022). StretchAR. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. 6:3. (1-26). Online publication date: 6-Sep-2022.

    https://doi.org/10.1145/3550305

  • Choi M, Sakamoto D and Ono T. Kuiper Belt: Utilizing the “Out-of-natural Angle” Region in the Eye-gaze Interaction for Virtual Reality. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. (1-17).

    https://doi.org/10.1145/3491102.3517725

  • Wu H, Fu S, Yang L and Zhang X. (2020). Exploring frame-based gesture design for immersive VR shopping environments. Behaviour & Information Technology. 10.1080/0144929X.2020.1795261. 41:1. (96-117). Online publication date: 2-Jan-2022.

    https://www.tandfonline.com/doi/full/10.1080/0144929X.2020.1795261

  • Froschauer R, Kurschl W, Wolfartsberger J, Pimminger S, Lindorfer R and Blattner J. (2021). A Human-Centered Assembly Workplace For Industry: Challenges and Lessons Learned. Procedia Computer Science. 10.1016/j.procs.2021.01.166. 180. (290-300).

    https://linkinghub.elsevier.com/retrieve/pii/S1877050921002064

  • Kim Y, Park S and Kim G. (2020). “Blurry Touch Finger”: Touch-Based Interaction for Mobile Virtual Reality with Clip-on Lenses. Applied Sciences. 10.3390/app10217920. 10:21. (7920).

    https://www.mdpi.com/2076-3417/10/21/7920

  • Brasier E, Chapuis O, Ferey N, Vezien J and Appert C. (2020). ARPads: Mid-air Indirect Input for Augmented Reality. 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). 10.1109/ISMAR50242.2020.00060. 978-1-7281-8508-8. (332-343).

    https://ieeexplore.ieee.org/document/9284722/

  • Müller F, Schmitz M, Schmitt D, Günther S, Funk M and Mühlhäuser M. Walk The Line: Leveraging Lateral Shifts of the Walking Path as an Input Modality for Head-Mounted Displays. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. (1-15).

    https://doi.org/10.1145/3313831.3376852

  • Müller F, Günther S and Mühlhäuser M. Around-Body Interaction: Interacting While on the Go. IEEE Pervasive Computing. 10.1109/MPRV.2020.2977850. 19:2. (74-78).

    https://ieeexplore.ieee.org/document/9094289/

  • Oh J, Park J and Park J. FingerTouch: Touch Interaction Using a Fingernail-Mounted Sensor on a Head-Mounted Display for Augmented Reality. IEEE Access. 10.1109/ACCESS.2020.2997972. 8. (101192-101208).

    https://ieeexplore.ieee.org/document/9102253/

  • Wu H, Luo W, Pan N, Nan S, Deng Y, Fu S and Yang L. (2019). Understanding freehand gestures: a study of freehand gestural interaction for immersive VR shopping applications. Human-centric Computing and Information Sciences. 10.1186/s13673-019-0204-7. 9:1. Online publication date: 1-Dec-2019.

    https://hcis-journal.springeropen.com/articles/10.1186/s13673-019-0204-7

  • Liang R, Yang S and Chen B. InDexMo. Proceedings of the 2019 ACM International Symposium on Wearable Computers. (129-134).

    https://doi.org/10.1145/3341163.3347724

  • Bailly C, Leitner F and Nigay L. Head-Controlled Menu in Mixed Reality with a HMD. Human-Computer Interaction – INTERACT 2019. (395-415).

    https://doi.org/10.1007/978-3-030-29390-1_22

  • Müller F, McManus J, Günther S, Schmitz M, Mühlhäuser M and Funk M. Mind the Tap. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. (1-13).

    https://doi.org/10.1145/3290605.3300707

  • Iravantchi Y, Goel M and Harrison C. BeamBand. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. (1-10).

    https://doi.org/10.1145/3290605.3300245

  • Zhang Z, Li Y, Guo J, Weng D, Liu Y and Wang Y. (2018). Vision‐tangible interactive display method for mixed and virtual reality: Toward the human‐centered editable reality. Journal of the Society for Information Display. 10.1002/jsid.747. 27:2. (72-84). Online publication date: 1-Feb-2019.

    https://sid.onlinelibrary.wiley.com/doi/10.1002/jsid.747

  • Azai T, Ushiro S, Li J, Otsuki M, Shibata F and Kimura A. Tap-tap menu. Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology. (1-2).

    https://doi.org/10.1145/3281505.3281561

  • Alallah F, Neshati A, Sakamoto Y, Hasan K, Lank E, Bunt A and Irani P. Performer vs. observer. Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology. (1-9).

    https://doi.org/10.1145/3281505.3281541

  • Soliman M, Mueller F, Hegemann L, Roo J, Theobalt C and Steimle J. FingerInput. Proceedings of the 2018 ACM International Conference on Interactive Surfaces and Spaces. (177-187).

    https://doi.org/10.1145/3279778.3279799

  • Li Y, Li T, Patel R, Yang X and Zhou X. Self-Powered Gesture Recognition with Ambient Light. Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology. (595-608).

    https://doi.org/10.1145/3242587.3242635

  • Zhang Z, Li Y, Guo J, Weng D, Liu Y and Wang Y. (2018). Task‐driven latent active correction for physics‐inspired input method in near‐field mixed reality applications. Journal of the Society for Information Display. 10.1002/jsid.728. 26:8. (496-509). Online publication date: 1-Aug-2018.

    https://sid.onlinelibrary.wiley.com/doi/10.1002/jsid.728

  • Alallah F, Neshati A, Sheibani N, Sakamoto Y, Bunt A, Irani P and Hasan K. Crowdsourcing vs Laboratory-Style Social Acceptability Studies? Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. (1-7).

    https://doi.org/10.1145/3173574.3173884

  • Azai T, Otsuki M, Shibata F and Kimura A. Open Palm Menu. Proceedings of the 9th Augmented Human International Conference. (1-5).

    https://doi.org/10.1145/3174910.3174929

  • Lee L and Hui P. Interaction Methods for Smart Glasses: A Survey. IEEE Access. 10.1109/ACCESS.2018.2831081. 6. (28712-28732).

    https://ieeexplore.ieee.org/document/8368051/

  • Memo A and Zanuttigh P. (2018). Head-mounted gesture controlled interface for human-computer interaction. Multimedia Tools and Applications. 77:1. (27-53). Online publication date: 1-Jan-2018.

    https://doi.org/10.1007/s11042-016-4223-3

  • Gong J, Zhang Y, Zhou X and Yang X. Pyro. Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology. (553-563).

    https://doi.org/10.1145/3126594.3126615

  • Wu H, Wang J and Zhang X. (2017). Combining hidden Markov model and fuzzy neural network for continuous recognition of complex dynamic gestures. The Visual Computer: International Journal of Computer Graphics. 33:10. (1265-1278). Online publication date: 1-Oct-2017.

    https://doi.org/10.1007/s00371-015-1147-2

  • Delamare W, Han T and Irani P. Designing a gaze gesture guiding system. Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services. (1-13).

    https://doi.org/10.1145/3098279.3098561

  • Rapp J and Goyal V. A Few Photons Among Many: Unmixing Signal and Noise for Photon-Efficient Active Imaging. IEEE Transactions on Computational Imaging. 10.1109/TCI.2017.2706028. 3:3. (445-459).

    https://ieeexplore.ieee.org/document/7932527/

  • Azai T, Ogawa S, Otsuki M, Shibata F and Kimura A. Selection and Manipulation Methods for a Menu Widget on the Human Forearm. Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems. (357-360).

    https://doi.org/10.1145/3027063.3052959

  • Jang Y, Jeon I, Kim T and Woo W. Metaphoric Hand Gestures for Orientation-Aware VR Object Manipulation With an Egocentric Viewpoint. IEEE Transactions on Human-Machine Systems. 10.1109/THMS.2016.2611824. (1-15).

    http://ieeexplore.ieee.org/document/7588127/

  • Yu J and Kim G. Blurry (sticky) finger. Proceedings of the 26th International Conference on Artificial Reality and Telexistence and the 21st Eurographics Symposium on Virtual Environments. (49-56).

    https://dl.acm.org/doi/10.5555/3061323.3061335

  • Akkil D and Isokoski P. Accuracy of interpreting pointing gestures in egocentric view. Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing. (262-273).

    https://doi.org/10.1145/2971648.2971687

  • Zhang Z, Weng D, Liu Y and Wang Y. (2016). A Modular Calibration Framework for 3D Interaction System Based on Optical See-Through Head-Mounted Displays in Augmented Reality. 2016 International Conference on Virtual Reality and Visualization (ICVRV). 10.1109/ICVRV.2016.72. 978-1-5090-5188-5. (393-400).

    http://ieeexplore.ieee.org/document/7938227/

  • Hsiao D, Sun M, Ballweber C, Cooper S and Popović Z. Proactive Sensing for Improving Hand Pose Estimation. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. (2348-2352).

    https://doi.org/10.1145/2858036.2858587

  • Park D, Lee Y, Song S, Rhiu I, Kwon S, An Y and Yun M. User centered gesture development for smart lighting. Proceedings of HCI Korea. (146-150).

    https://doi.org/10.17210/hcik.2016.01.146

  • Wu H, Wang J and Zhang X. (2016). User-centered gesture development in TV viewing environment. Multimedia Tools and Applications. 75:2. (733-760). Online publication date: 1-Jan-2016.

    https://doi.org/10.1007/s11042-014-2323-5

  • Wu H and Wang J. (2016). A visual attention-based method to address the midas touch problem existing in gesture-based interaction. The Visual Computer: International Journal of Computer Graphics. 32:1. (123-136). Online publication date: 1-Jan-2016.

    https://doi.org/10.1007/s00371-014-1060-0

  • Kim J, Yun S, Seol D, Park H and Kim Y. An IR Proximity-Based 3D Motion Gesture Sensor for Low-Power Portable Applications. IEEE Sensors Journal. 10.1109/JSEN.2015.2471845. 15:12. (7009-7016).

    http://ieeexplore.ieee.org/document/7217802/

  • Chan L, Chen Y, Hsieh C, Liang R and Chen B. CyclopsRing. Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology. (549-556).

    https://doi.org/10.1145/2807442.2807450

  • Gieser S, Sassaman P, Becker E and Makedon F. Pot hunter. Proceedings of the 8th ACM International Conference on PErvasive Technologies Related to Assistive Environments. (1-4).

    https://doi.org/10.1145/2769493.2769523

  • Delamare W, Coutrix C and Nigay L. Designing guiding systems for gesture-based interaction. Proceedings of the 7th ACM SIGCHI Symposium on Engineering Interactive Computing Systems. (44-53).

    https://doi.org/10.1145/2774225.2774847

  • Chan L, Hsieh C, Chen Y, Yang S, Huang D, Liang R and Chen B. Cyclops. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. (3001-3009).

    https://doi.org/10.1145/2702123.2702464

  • Withana A, Peiris R, Samarasekara N and Nanayakkara S. zSense. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. (3661-3670).

    https://doi.org/10.1145/2702123.2702371

  • Tung Y, Hsu C, Wang H, Chyou S, Lin J, Wu P, Valstar A and Chen M. User-Defined Game Input for Smart Glasses in Public Space. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. (3327-3336).

    https://doi.org/10.1145/2702123.2702214

  • Lee S, Ha G, Cha J, Kim J, Lee H and Kim S. (2015). CyberTouch - Touch and Cursor Interface for VR HMD. HCI International 2015 - Posters’ Extended Abstracts. 10.1007/978-3-319-21380-4_85. (503-507).

    http://link.springer.com/10.1007/978-3-319-21380-4_85

  • He Z and Yang X. Hand-based interaction for object manipulation with augmented reality glasses. Proceedings of the 13th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry. (227-230).

    https://doi.org/10.1145/2670473.2670505

  • Kollee B, Kratz S and Dunnigan A. Exploring gestural interaction in smart spaces using head mounted devices with ego-centric sensing. Proceedings of the 2nd ACM symposium on Spatial user interaction. (40-49).

    https://doi.org/10.1145/2659766.2659781

  • Jalaliniya S, Mardanbeigi D, Pederson T and Hansen D. Head and Eye Movement as Pointing Modalities for Eyewear Computers. Proceedings of the 2014 11th International Conference on Wearable and Implantable Body Sensor Networks Workshops. (50-53).

    https://doi.org/10.1109/BSN.Workshops.2014.14

  • Ham J, Hong J, Jang Y, Ko S and Woo W. (2014). Smart Wristband: Touch-and-Motion–Tracking Wearable 3D Input Device for Smart Glasses. Distributed, Ambient, and Pervasive Interactions. 10.1007/978-3-319-07788-8_11. (109-118).

    http://link.springer.com/10.1007/978-3-319-07788-8_11

  • Colaço A. Sensor design and interaction techniques for gestural input to smart glasses and mobile devices. Adjunct Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology. (49-52).

    https://doi.org/10.1145/2508468.2508474