
Hierarchical Graphical Models for Context-Aware Hybrid Brain-Machine Interfaces

Annu Int Conf IEEE Eng Med Biol Soc. 2018 Jul;2018:1964-1967. doi: 10.1109/EMBC.2018.8512677.

Abstract

We present a novel hierarchical graphical model-based, context-aware hybrid brain-machine interface (hBMI) using probabilistic fusion of electroencephalographic (EEG) and electromyographic (EMG) activities. Based on experimental data collected during stationary executions and subsequent imageries of five different hand gestures with both limbs, we demonstrate the feasibility of the proposed hBMI system through within-session and online across-session classification analyses. Furthermore, we investigate the context-aware extent of the model with a simulated probabilistic approach and highlight potential implications of our work in the field of neurophysiologically driven robotic hand prosthetics.
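To illustrate the general idea of probabilistic fusion of two modalities, the sketch below combines per-class probabilities from separate EEG and EMG classifiers with a naive Bayes product rule under a conditional-independence assumption. This is a minimal, hypothetical example, not the authors' hierarchical graphical model: the gesture labels, the fuse_posteriors() helper, and the uniform prior are all assumptions introduced here for illustration.

```python
# Minimal sketch of decision-level fusion for a hybrid BMI.
# Assumptions (not from the paper): each modality yields per-class
# probabilities, modalities are conditionally independent given the
# gesture, and the class prior is uniform. Labels are illustrative.
import numpy as np

GESTURES = ["rest", "fist", "open", "pinch", "point"]  # hypothetical labels

def fuse_posteriors(p_eeg, p_emg, prior=None):
    """Fuse per-modality class probabilities by a naive Bayes product rule."""
    p_eeg = np.asarray(p_eeg, dtype=float)
    p_emg = np.asarray(p_emg, dtype=float)
    if prior is None:
        prior = np.full(p_eeg.size, 1.0 / p_eeg.size)  # uniform prior
    unnormalized = prior * p_eeg * p_emg
    return unnormalized / unnormalized.sum()  # renormalize to a distribution

if __name__ == "__main__":
    p_eeg = [0.10, 0.40, 0.20, 0.20, 0.10]  # example EEG classifier output
    p_emg = [0.05, 0.60, 0.15, 0.10, 0.10]  # example EMG classifier output
    fused = fuse_posteriors(p_eeg, p_emg)
    print(dict(zip(GESTURES, np.round(fused, 3))))
```

In this toy example the fused distribution concentrates on the class both modalities agree on ("fist"), which is the basic behavior any probabilistic fusion scheme, hierarchical or otherwise, would exhibit.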

Publication types

  • Research Support, N.I.H., Extramural
  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Awareness*
  • Brain-Computer Interfaces*
  • Electroencephalography
  • Gestures
  • Robotics