Abstract.
We present a novel multi-modal bio-sensing platform capable of integrating multiple data streams for use in real-time applications. The system is composed of a central compute module and a companion headset. The compute node collects, time-stamps, and transmits the data while also providing an interface for a wide range of sensors, including electroencephalogram (EEG), photoplethysmogram (PPG), electrocardiogram (ECG), and eye gaze, among others. The companion headset contains the gaze-tracking cameras. By integrating many of these measurement systems into a single accessible package, we are able to explore previously unanswerable questions ranging from open-environment interactions to emotional-response studies. Although some of the integrated sensors were designed from the ground up to fit into a compact form factor, we validate the accuracy of the sensors and find that they perform similarly to, and in some cases better than, alternatives.
1 Introduction
2 System Overview
We use a modular design to increase the flexibility and efficiency of the multi-modal bio-sensing platform's measurements. Selecting a control board that is well supported by the open-source community and offers capable expansion was a priority. To this end, we explored several solutions, including the Arduino, Raspberry Pi, LeMaker Guitar, and other ARM-based embedded controllers. The hardware evaluation metric that determined viability was a system's ability to sustain lower-bound frame rates while collecting data from multiple sensors in real time using the Lab Streaming Layer (LSL) [23]. The last, but one of the most important, evaluation metrics was expandability via general-purpose input/output or communication protocols. After evaluating the different platforms, the Raspberry Pi 3 (RPi3) was identified as the system that best balances cost, support, and capabilities. The sensors selected for preliminary use are explored in detail below.
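As an illustration of how a sensor channel is published from the compute module, the following minimal sketch pushes time-stamped samples over LSL using the pylsl bindings. The stream name, source id, nominal rate, and the read_ppg_sample() helper are hypothetical placeholders, not the platform's actual driver code.

```python
# Minimal sketch: publishing one time-stamped PPG channel over LSL.
import time
from pylsl import StreamInfo, StreamOutlet, local_clock

# Describe the stream: name, type, channel count, nominal rate, format, id.
info = StreamInfo('PPG', 'PPG', 1, 100, 'float32', 'ppg-rpi3-001')
outlet = StreamOutlet(info)

def read_ppg_sample():
    """Placeholder for an ADC read over I2C/SPI; returns one float."""
    return 0.0

while True:
    sample = read_ppg_sample()
    outlet.push_sample([sample], local_clock())  # time-stamp at acquisition
    time.sleep(0.01)  # approximately 100 Hz polling
```

Any LSL-aware client on the network can then resolve and record this stream alongside the other modalities, which is what enables synchronized multi-sensor collection.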
Fig. 2. Miniaturized PPG sensor with scale reference. (A) 3-axis accelerometer, (B) 100 Hz 12-
bit ADC, (C) IR emitter and receiver, (D) third-order filter bank.
Fig. 3. Schematic overview of adaptive noise cancellation integration with PPG.
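To make the scheme of Fig. 3 concrete, the sketch below shows textbook least-mean-squares (LMS) adaptive noise cancellation in the style of Widrow et al. [18], using an accelerometer channel as the motion-noise reference for the PPG. The filter length and step size are illustrative values, not the platform's tuned parameters.

```python
# Sketch of LMS adaptive noise cancellation (after Widrow et al. [18]):
# the accelerometer serves as a reference for the motion artifact.
import numpy as np

def lms_cancel(ppg, accel, n_taps=16, mu=0.01):
    """Return motion-cleaned PPG: the error signal of an LMS filter
    that predicts the motion artifact from the accelerometer."""
    w = np.zeros(n_taps)               # adaptive filter weights
    cleaned = np.zeros_like(ppg)
    for n in range(n_taps, len(ppg)):
        x = accel[n - n_taps:n][::-1]  # most recent reference samples
        y = w @ x                      # estimated motion artifact
        e = ppg[n] - y                 # error = cleaned PPG sample
        w += 2 * mu * e * x            # LMS weight update
        cleaned[n] = e
    return cleaned
```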
The next component of the multi-modal system is a pair of cameras. One camera, an IR-emitting device, accurately captures the pupil location. A pupil-centering algorithm is also integrated into the platform and maintains an accurate pupil position estimate even when the headset is perturbed. The pupil-detection and eye-gaze calibration algorithm developed by Pupil Labs [4] is utilized. Refer to the evaluation section for quantification of tracking accuracy.
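For intuition about the dark-pupil approach such detectors build on, the OpenCV sketch below thresholds the IR eye image and fits an ellipse to the largest dark blob. This is an illustrative simplification, not the Pupil Labs algorithm itself [4], and the threshold value is an assumption.

```python
# Illustrative dark-pupil detection (NOT the Pupil Labs detector [4]):
# segment the dark pupil region and fit an ellipse to its contour.
import cv2

def find_pupil(eye_gray, thresh=40):
    """Return (center, axes, angle) of an ellipse fit to the darkest blob."""
    _, mask = cv2.threshold(eye_gray, thresh, 255, cv2.THRESH_BINARY_INV)
    # OpenCV 4 return signature (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)  # largest dark region
    if len(pupil) < 5:                          # fitEllipse needs >= 5 points
        return None
    return cv2.fitEllipse(pupil)
```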
The second integrated camera in the system is a world-view camera, which provides a wide-angle view of what the wearer is seeing. While small and integrated into the headset, the camera itself is a standard, easily accessible module. With the information retrievable from both the pupil and world cameras, it is possible to retrospectively reconstruct the full view that the user was observing. The primary problem stemming from this type of mass video collection is that the amount of data that must be manually labelled is enormous. Machine-learning tools exist that can label video post hoc, but they limit the types of experiments that can be performed. To create a truly portable system, the system's video can be streamed to a computer and processed using deep-learning frameworks such as You Only Look Once (YOLO) [19], which can label the 20 Pascal VOC [20] object classes in real time. By labelling exactly what the user is looking at and making the labelled data accessible during the experiment, experimental rigidity can be relaxed, allowing more natural, free-flowing behavior to be measured with minimally intrusive cues (Fig. 4).
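A minimal sketch of such a labelling pipeline, running a Darknet YOLO model on world-camera frames through OpenCV's DNN module, is shown below. The .cfg/.weights file names, the confidence threshold, and the use of the class score alone (rather than objectness times class score) are assumptions for illustration.

```python
# Sketch: labelling world-camera frames with a Darknet YOLO model [19]
# via OpenCV's DNN module, trained on the 20 Pascal VOC classes [20].
import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet('yolov2-voc.cfg', 'yolov2-voc.weights')

def detect(frame, conf_thresh=0.5):
    """Return (class_index, confidence, box) tuples for one frame."""
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    out = net.forward()  # rows: [cx, cy, w, h, objectness, class scores...]
    h, w = frame.shape[:2]
    hits = []
    for row in out:
        scores = row[5:]
        cls = int(np.argmax(scores))
        conf = float(scores[cls])
        if conf > conf_thresh:
            cx, cy, bw, bh = row[0] * w, row[1] * h, row[2] * w, row[3] * h
            hits.append((cls, conf, (cx - bw / 2, cy - bh / 2, bw, bh)))
    return hits
```

Pairing each detection with the concurrent gaze estimate from the eye camera is what yields the "what the user is looking at" labels during the experiment.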
Fig. 4. Pupil and world views from the companion headset device (top-left). Deep-learning package used to classify objects in real time (top-right). EEG with real-time ICA and PPG signal capture (bottom panels).
3 Evaluation
The proposed device addresses many of the limitations of existing systems while providing measurement capabilities in a form factor that is convenient for both researchers and subjects. To evaluate the efficacy of the system, the individual components created in this study were evaluated. The off-the-shelf Emotiv Epoc and Microsoft Band are not explicitly evaluated here; instead, the novel PPG and eye-gaze tracking systems are evaluated for effectiveness in their respective areas.
Fig. 8. Angular precision analysis comparing the mean after calibration (red) and after 30 seconds of dynamic head movement to simulate active conditions (blue).
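For reference, the angular error underlying this kind of precision analysis can be computed as the angle between the estimated and ground-truth 3D gaze direction vectors; how those vectors are constructed from calibration targets is assumed here.

```python
# Sketch: angular error (degrees) between estimated and ground-truth
# gaze directions, as used in precision analyses like Fig. 8.
import numpy as np

def angular_error_deg(v_est, v_true):
    """Angle between two 3D gaze direction vectors, in degrees."""
    u = v_est / np.linalg.norm(v_est)
    v = v_true / np.linalg.norm(v_true)
    return np.degrees(np.arccos(np.clip(u @ v, -1.0, 1.0)))
```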
4 Conclusion
There are numerous sensors capable of measuring useful metrics of human behavior and interaction; however, limitations in the collection hardware and software hinder their use in experiments spanning multiple modalities. By developing a low-cost, portable, multi-modal bio-sensing platform capable of interfacing with numerous different sensors, we are able to explore richer experimental questions that were previously inaccessible due to the constrained nature of the measurement hardware. In particular, because of the modular nature of the control board, interface software, and headset, time can be spent pursuing novel research insights rather than wrangling devices and software packages from different manufacturers.
5 References
1. LaFleur, Karl, et al. "Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain–computer interface." Journal of Neural Engineering 10.4 (2013): 046003.
2. Bell, Christian J., et al. "Control of a humanoid robot by a noninvasive brain–computer interface in humans." Journal of Neural Engineering 5.2 (2008): 214.
3. Carlson, Tom, and Jose del R. Millan. "Brain-controlled wheelchairs: a robotic architecture." IEEE Robotics & Automation Magazine 20.1 (2013): 65-73.
4. Kassner, Moritz, William Patera, and Andreas Bulling. "Pupil: An open source platform for pervasive eye tracking and mobile gaze-based interaction." CoRR abs/1405.0006 (2014).
5. Makeig, Scott, et al. "Independent component analysis of electroencephalographic data." Advances in Neural Information Processing Systems (1996): 145-151.
6. Makeig, Scott, et al. "Blind separation of auditory event-related brain responses into independent components." Proceedings of the National Academy of Sciences 94.20 (1997): 10979-10984.
7. Poh, Ming-Zher, Nicholas C. Swenson, and Rosalind W. Picard. "Motion-tolerant magnetic earring sensor and wireless earpiece for wearable photoplethysmography." IEEE Transactions on Information Technology in Biomedicine 14.3 (2010): 786-794.
8. Van der Wall, E. E., and W. H. Van Gilst. "Neurocardiology: close interaction between heart and brain." Netherlands Heart Journal 21.2 (2013): 51-52.
9. Patterson, James A. C., Douglas C. McIlwraith, and Guang-Zhong Yang. "A flexible, low noise reflective PPG sensor platform for ear-worn heart rate monitoring." Sixth International Workshop on Wearable and Implantable Body Sensor Networks (BSN 2009). IEEE, 2009.
10. Samuels, Martin A. "The brain–heart connection." Circulation 116.1 (2007).
11. Morgante, James D., Rahman Zolfaghari, and Scott P. Johnson. "A critical test of temporal and spatial accuracy of the Tobii T60XL eye tracker." Infancy 17.1 (2012): 9-32.
12. Hansen, Dan Witzner, and Qiang Ji. "In the eye of the beholder: A survey of models for eyes and gaze." IEEE Transactions on Pattern Analysis and Machine Intelligence 32.3 (2010): 478-500.
13. Notch Motion Tracking System (https://wearnotch.com/)
14. Microsoft Band (https://www.microsoft.com/microsoft-band/)
15. Da He, David, Eric S. Winokur, and Charles G. Sodini. "A continuous, wearable, and wireless heart monitor using head ballistocardiogram (BCG) and head electrocardiogram (ECG)." 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). IEEE, 2011.
16. He, David Da. A wearable heart monitor at the ear using ballistocardiogram (BCG) and electrocardiogram (ECG) with a nanowatt ECG heartbeat detection circuit. Diss. Massachusetts Institute of Technology, 2013.
17. Vaughan, Theresa M., Jonathan R. Wolpaw, and Emanuel Donchin. "EEG-based communication: Prospects and problems." IEEE Transactions on Rehabilitation Engineering 4.4 (1996): 425-430.
18. Widrow, Bernard, et al. "Adaptive noise cancelling: Principles and applications." Proceedings of the IEEE 63.12 (1975): 1692-1716.
19. Redmon, Joseph, et al. "You only look once: Unified, real-time object detection." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2016.
20. Everingham, Mark, et al. "The PASCAL Visual Object Classes (VOC) challenge." International Journal of Computer Vision 88.2 (2010): 303-338.
21. Bell, Anthony J., and Terrence J. Sejnowski. "An information-maximization approach to blind separation and blind deconvolution." Neural Computation 7.6 (1995): 1129-1159.
22. Kothe, Christian Andreas, and Scott Makeig. "BCILAB: a platform for brain–computer interface development." Journal of Neural Engineering 10.5 (2013): 056014.
23. Kothe, C. "Lab Streaming Layer (LSL)." https://github.com/sccn/labstreaminglayer. Accessed 2015.
24. Hsu, Sheng-Hsiou, et al. "Online recursive independent component analysis for real-time source separation of high-density EEG." 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). IEEE, 2014.
25. MATLAB Signal Processing Toolbox (www.mathworks.com/help/signal/)
26. Devillez, Hélène, Nathalie Guyader, and Anne Guérin-Dugué. "An eye fixation–related potentials analysis of the P300 potential for fixations onto a target object when exploring natural scenes." Journal of Vision 15.13 (2015).
27. Kamienkowski, Juan E., et al. "Fixation-related potentials in visual search: A combined EEG and eye tracking study." Journal of Vision 12.7 (2012): 4.
28. Acqualagna, Laura, and Benjamin Blankertz. "Gaze-independent BCI-spelling using rapid serial visual presentation (RSVP)." Clinical Neurophysiology 124.5 (2013): 901-908.