The relationship between audience and performers is crucial to what makes live events so special. The aim of this work is to develop a new approach to amplifying the link between audiences and performers. Specifically, we explore the use of wearable sensors to gather real-time audience data and augment the visuals of a live dance performance. We used the J!NS MEME, smart glasses with integrated electrodes enabling eye movement analysis (e.g. blink detection) and inertial motion sensing of the head (e.g. nodding recognition). This data is streamed from the audience and visualised live on stage during the performance; we also collected heart rate and eye gaze data from selected audience members. In this paper we present the recorded dataset, including accelerometer, electrooculography (EOG), and gyroscope data from 23 audience members.
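As a rough sketch of how per-wearer glasses data might be streamed to a live visualisation, the Python snippet below defines a hypothetical sample record (accelerometer, gyroscope, and EOG channels) and forwards it as JSON over UDP. The field names, host address, and packet format are illustrative assumptions only; they are not the J!NS MEME protocol or the pipeline used in the performance.

# Hypothetical sketch: forward one wearer's sensor samples to a stage
# visualisation host over UDP. Field names and packet layout are assumed,
# not taken from the J!NS MEME SDK or the actual performance pipeline.
import json
import socket
import time
from dataclasses import dataclass, asdict

VIS_HOST = ("192.168.0.10", 9000)  # assumed address of the visualisation machine

@dataclass
class GlassesSample:
    wearer_id: int      # which audience member the sample comes from
    t: float            # timestamp in seconds
    acc: tuple          # (x, y, z) accelerometer reading
    gyro: tuple         # (x, y, z) gyroscope reading
    eog_l: float        # left-electrode EOG value (arbitrary units)
    eog_r: float        # right-electrode EOG value (arbitrary units)

def stream(samples, sock):
    """Send each sample as a small JSON datagram."""
    for s in samples:
        sock.sendto(json.dumps(asdict(s)).encode("utf-8"), VIS_HOST)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    demo = [GlassesSample(0, time.time(), (0.0, 0.0, 1.0), (0.1, -0.2, 0.0), 12.5, -8.3)]
    stream(demo, sock)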
This paper explores the use of wearable eye-tracking to detect physical activities and location information during assembly and construction tasks involving small groups of up to four people. Large physical activities, such as carrying heavy items and walking, are analysed alongside more precise, hand-tool activities, such as using a drill or a screwdriver. In a first analysis, gaze-invariant features from the eye-tracker are classified (using Naive Bayes) alongside features obtained from wrist-worn accelerometers and microphones. An evaluation is presented using data from an 8-person dataset containing over 600 physical activity events, performed under real-world (noisy) conditions. Despite the challenges of working with complex, and sometimes unreliable, data we show that event-based precision and recall of 0.66 and 0.81 respectively can be achieved by combining all three sensing modalities (using experiment-independent training and temporal smoothing). In a further analysis, we apply ...
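For context on the reported metric, event-based precision and recall score whole activity events rather than individual frames; a common convention counts a predicted event as correct if it overlaps a ground-truth event. The sketch below implements that generic overlap criterion, which may differ in detail from the evaluation procedure used in the paper.

# Illustrative event-based precision/recall: a predicted event counts as a
# true positive if it overlaps any ground-truth event. This is a generic
# overlap rule, not necessarily the paper's exact evaluation procedure.

def overlaps(a, b):
    """Events are (start, end) tuples in seconds."""
    return a[0] < b[1] and b[0] < a[1]

def event_precision_recall(predicted, ground_truth):
    tp_pred = sum(1 for p in predicted if any(overlaps(p, g) for g in ground_truth))
    tp_gt = sum(1 for g in ground_truth if any(overlaps(g, p) for p in predicted))
    precision = tp_pred / len(predicted) if predicted else 0.0
    recall = tp_gt / len(ground_truth) if ground_truth else 0.0
    return precision, recall

if __name__ == "__main__":
    pred = [(0.0, 2.0), (5.0, 6.0), (9.0, 10.0)]
    gt = [(0.5, 2.5), (5.2, 5.8)]
    print(event_precision_recall(pred, gt))  # -> (0.666..., 1.0)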
Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers, 2019
In this demo, we present the smart eyewear toolchain, consisting of smart glasses prototypes and a software platform for cognitive and social interaction assessments in the wild, with several application cases and a demonstration of real-time activity recognition. The platform is designed to work with Jins MEME, smart EOG-enabled glasses. The user software is capable of data logging, posture tracking, and recognition of several activities, such as talking, reading and blinking. During the demonstration we will walk through several applications and studies that the platform has been used for.
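As an illustration of the kind of processing such a toolchain performs, the sketch below detects blinks in a vertical EOG channel by band-pass filtering and picking prominent peaks. The sampling rate, filter band, and thresholds are assumptions for illustration and not the parameters of the actual Jins MEME software.

# Minimal blink-detection sketch on a vertical EOG channel.
# Sampling rate, filter band, and thresholds are illustrative assumptions,
# not the parameters used by the Jins MEME toolchain.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 100.0  # assumed EOG sampling rate in Hz

def detect_blinks(eog_v, fs=FS):
    """Return sample indices of candidate blinks in a vertical EOG trace."""
    # Band-pass to keep the fast deflection caused by a blink,
    # removing slow drift and high-frequency noise.
    b, a = butter(2, [0.5, 10.0], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, eog_v)
    # Blinks appear as short, prominent peaks; enforce a refractory gap.
    peaks, _ = find_peaks(filtered,
                          prominence=2.0 * np.std(filtered),
                          distance=int(0.2 * fs))
    return peaks

if __name__ == "__main__":
    t = np.arange(0, 10, 1 / FS)
    signal = 0.1 * np.random.randn(t.size)
    for blink_time in (2.0, 5.5, 8.0):  # synthetic blink-like bumps
        signal += 3.0 * np.exp(-((t - blink_time) ** 2) / (2 * 0.05 ** 2))
    print(detect_blinks(signal))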
Abstract. Wearable computers promise the ability to access information and computing resources directly from miniature devices embedded in our clothing. The problem lies in how to access the most relevant information without disrupting whatever task it is we are doing. Most existing interfaces, such as keyboards and touch pads, require direct interaction. This is both a physical and cognitive distraction. The problem is particularly acute for the mobile maintenance worker who must access information, such as on-line manuals or schematics, quickly and with minimal distraction. One solution is a wearable computer that monitors the user’s ‘context’ - information such as activity, location and environment. Being ‘context aware’, the wearable would be better placed to offer relevant information to the user as and when it is needed. In this work we focus on recognising one of the most important parts of context: user activity. The contributions of the thesis are twofold. First, we present...
When people interact, they fall into synchrony. This synchrony has been demonstrated in a range of contexts, from walking or playing music together to holding a conversation, and has been linked to prosocial outcomes such as the development of rapport and the efficiency of cooperation. While the basis of synchrony remains unclear, several studies have found synchrony to increase when an interaction is made challenging, potentially providing a means of facilitating interaction. Here we focus on head movement during free conversation. As verbal information is obscured when conversing over background noise, we investigate whether synchrony is greater in high vs. low levels of noise, as well as addressing the effect of background noise complexity. Participants held a series of conversations with unfamiliar interlocutors while seated in a lab, and the background noise level changed every 15-30 s between 54, 60, 66, 72, and 78 dB. We report measures of head movement synchrony recorded via high-reso...
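One simple way to quantify head-movement synchrony between two interlocutors is a windowed, lag-tolerant cross-correlation of their movement time series. The sketch below illustrates that generic idea; it is not the specific high-resolution synchrony measure reported in the paper, and all parameters are assumptions.

# Generic windowed cross-correlation as a simple head-movement synchrony
# measure: for each window, take the peak normalised correlation over a
# small range of lags. Illustrative only; window and lag settings are assumed.
import numpy as np

def windowed_sync(x, y, fs, win_s=5.0, max_lag_s=1.0):
    """Peak lagged correlation between signals x and y, per window."""
    win = int(win_s * fs)
    max_lag = int(max_lag_s * fs)
    scores = []
    for start in range(0, len(x) - win, win):
        a = x[start:start + win]
        b = y[start:start + win]
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        corrs = [np.mean(a[max(0, -lag):win - max(lag, 0)] *
                         b[max(0, lag):win - max(-lag, 0)])
                 for lag in range(-max_lag, max_lag + 1)]
        scores.append(max(corrs))
    return np.array(scores)

if __name__ == "__main__":
    fs = 50.0
    t = np.arange(0, 60, 1 / fs)
    head_a = np.sin(2 * np.pi * 0.5 * t) + 0.3 * np.random.randn(t.size)
    head_b = np.sin(2 * np.pi * 0.5 * (t - 0.2)) + 0.3 * np.random.randn(t.size)
    print(windowed_sync(head_a, head_b, fs).mean())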
Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers, 2019
Communication with others relies on coordinated exchanges of social signals, such as eye gaze and facial displays. However, this can only happen when partners are able to see each other. Although previous studies report that autistic individuals have difficulties in planning eye gaze and making facial displays during conversation, evidence from real-life dyadic tasks is scarce and mixed. Across two studies, we investigate how the eye gaze and facial displays of typical and high-functioning autistic individuals are modulated by the belief in being seen and the potential to show true gaze direction. Participants were recorded with an eye-tracking and video-camera system while they completed a structured Q&A task with a confederate under three social contexts: pre-recorded video, video call and face-to-face. Typical participants gazed less at the confederate and produced more facial displays when they were being watched and when they were speaking. Contrary to our hypotheses, eye gaze and...
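A basic gaze-to-partner measure of the kind used in such analyses is the proportion of eye-tracking samples that fall inside a face region of interest. The sketch below shows that computation with an assumed pixel-coordinate region of interest; it is illustrative only and not the paper's actual analysis code.

# Illustrative sketch: fraction of eye-tracking samples falling inside a
# face region of interest (ROI). ROI and coordinate conventions are assumed.
import numpy as np

def gaze_on_face_ratio(gaze_xy, face_box):
    """gaze_xy: (N, 2) array of gaze points; face_box: (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = face_box
    inside = ((gaze_xy[:, 0] >= x0) & (gaze_xy[:, 0] <= x1) &
              (gaze_xy[:, 1] >= y0) & (gaze_xy[:, 1] <= y1))
    return inside.mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gaze = rng.uniform(0, 1080, size=(1000, 2))     # synthetic gaze points (pixels)
    print(gaze_on_face_ratio(gaze, (400, 200, 700, 600)))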
Conversation between two people involves subtle non-verbal coordination, but the parameters and timing of this coordination remain unclear, which limits our models of social coordination mechanisms. We implemented high-resolution motion capture of human head motion during structured conversations. Using pre-registered analyses, we quantify cross-participant wavelet coherence of head motion as a measure of non-verbal coordination, and report two novel results. First, head pitch (nodding) at 2.6-6.5 Hz shows below-chance coherence between people. This is driven by fast-nodding behaviour from the person listening, and is a newly defined non-verbal behaviour which may act as an important social signal. Second, head pitch movements at 0.2-1.1 Hz show above-chance coherence with a constant lag of around 600 ms between a leader and a follower. This is consistent with reactive (rather than predictive) models of mimicry behaviour. These results provide a step towards the quantification of rea...
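As a simplified stand-in for the wavelet coherence analysis, the sketch below computes Welch-based magnitude-squared coherence between two head-pitch signals and averages it within the two frequency bands mentioned in the abstract. The sampling rate and window length are assumptions, and ordinary spectral coherence does not reproduce the paper's time-resolved, pre-registered wavelet analysis.

# Simplified stand-in for wavelet coherence: Welch-based magnitude-squared
# coherence between two head-pitch signals, averaged within the bands
# mentioned in the abstract. Parameters are illustrative assumptions.
import numpy as np
from scipy.signal import coherence

def band_coherence(pitch_a, pitch_b, fs, band):
    f, cxy = coherence(pitch_a, pitch_b, fs=fs, nperseg=int(8 * fs))
    mask = (f >= band[0]) & (f <= band[1])
    return cxy[mask].mean()

if __name__ == "__main__":
    fs = 60.0                  # assumed motion-capture frame rate
    t = np.arange(0, 120, 1 / fs)
    lag = 0.6                  # 600 ms leader-follower lag, as in the abstract
    leader = np.sin(2 * np.pi * 0.5 * t) + 0.5 * np.random.randn(t.size)
    follower = np.sin(2 * np.pi * 0.5 * (t - lag)) + 0.5 * np.random.randn(t.size)
    for name, band in [("slow nods (0.2-1.1 Hz)", (0.2, 1.1)),
                       ("fast nods (2.6-6.5 Hz)", (2.6, 6.5))]:
        print(name, band_coherence(leader, follower, fs, band))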