- research-article, September 2023
Building Knowledge through Action: Considerations for Machine Learning in the Workplace
ACM Transactions on Computer-Human Interaction (TOCHI), Volume 30, Issue 5, Article No.: 72, Pages 1–51. https://doi.org/10.1145/3584947
Innovations in machine learning are enabling organisational knowledge bases to be automatically generated from working people's activities. The potential for these to shift the ways in which knowledge is produced and shared raises questions about what ...
- research-article, May 2023
Exploring Dwell-time from Human Cognitive Processes for Dwell Selection
Proceedings of the ACM on Human-Computer Interaction (PACMHCI), Volume 7, Issue ETRA, Article No.: 159, Pages 1–15. https://doi.org/10.1145/3591128
In order to develop future implicit interactions, it is important to understand the duration a user needs to recognize a visual object. By providing interactions that are triggered after a user recognizes an object, confusion resulting from the ...
- invited-talk, March 2023
New generation Car Navigation Systems enhancing Human-Computer Interaction and exploiting sensors and machine learning on the smartphone
IUI '23 Companion: Companion Proceedings of the 28th International Conference on Intelligent User Interfaces, Pages 237–239. https://doi.org/10.1145/3581754.3584113
This Doctoral Consortium submission proposes to investigate how to provide user-centred, personalized navigation services by exploiting data collected from smartphones and applying machine-learning techniques. The presented research focuses on inferring ...
- research-article, September 2022
Towards Implicit Interaction in Highly Automated Vehicles - A Systematic Literature Review
Proceedings of the ACM on Human-Computer Interaction (PACMHCI), Volume 6, Issue MHCI, Article No.: 191, Pages 1–21. https://doi.org/10.1145/3546726
The inclusion of in-vehicle sensors and increased intention and state recognition capabilities enable implicit in-vehicle interaction. Starting from a systematic literature review (SLR) on implicit in-vehicle interaction, which resulted in 82 ...
- demonstration, June 2022
Implicit Interaction Approach for Car-related Tasks On Smartphone Applications - A Demo
AVI '22: Proceedings of the 2022 International Conference on Advanced Visual Interfaces, Article No.: 78, Pages 1–3. https://doi.org/10.1145/3531073.3534465
Implicit interaction is a possible approach to improve the user experience of smartphone apps in car-related environments. Indeed, it can enhance safety and avoid unnecessary and repetitive interactions on the user's part. This demo paper presents a ...
- short-paper, June 2022
Implicit Interaction Approach for Car-related Tasks On Smartphone Applications
AVI '22: Proceedings of the 2022 International Conference on Advanced Visual Interfaces, Article No.: 39, Pages 1–5. https://doi.org/10.1145/3531073.3531173
This work proposes an implicit interaction approach to ease implementing basic car-related tasks on a smartphone application. Many car drivers use apps on their smartphones to get support in typical tasks related to car usage, yet some of the available ...
- research-article, January 2022
ML-based re-orientation of smartphone-collected car motion data
Procedia Computer Science (PROCS), Volume 198, Issue C, Pages 237–242. https://doi.org/10.1016/j.procs.2021.12.234
Smartphone sensors can collect data in many different contexts. They make it feasible to obtain large amounts of data at little or no cost because most people own mobile phones. In this work, we focus on the collection of smartphone data in the ...
- short-paper, October 2020
Automating Facilitation and Documentation of Collaborative Ideation Processes
ICMI '20: Proceedings of the 2020 International Conference on Multimodal Interaction, Pages 699–702. https://doi.org/10.1145/3382507.3421158
My research is in the field of computer-supported and enabled innovation processes, in particular focusing on the first phases of ideation in a co-located environment. I'm developing a concept for documenting, tracking and enhancing creative ideation ...
- abstract, July 2020
Careful Design: Implicit Interactions with Care, Taboo, and Humor
DIS '20 Companion: Companion Publication of the 2020 ACM Designing Interactive Systems Conference, Pages 515–519. https://doi.org/10.1145/3393914.3395827
Data-driven technologies increasingly participate in everyday experiences as implicit interactions that are unseen and dynamically configured. My research explores the design and implications of implicit interactions by designing within social relations ...
- abstract, April 2020
Proto-Chair: Posture-Sensing Smart Furniture with 3D-Printed Auxetics
CHI EA '20: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Pages 1–7. https://doi.org/10.1145/3334480.3383036
Public spaces and furniture are among the new domains for applying a data-driven approach to design intervention and improvement. Open space is essentially dynamic, livable and interactive. Various types of people spend time there for various purposes. Therefore, ...
- abstract, September 2019
Visualizing implicit eHMI for autonomous vehicles
AutomotiveUI '19: Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications: Adjunct Proceedings, Pages 475–477. https://doi.org/10.1145/3349263.3349603
Autonomous vehicles' (AVs) interactions with pedestrians remain an ongoing uncertainty. Studies claim the need for explicit external human-machine interfaces (eHMI) such as lights to replace the lack of eye contact with and explicit gestures from ...
- poster, September 2019
I'm listening: the effect of cue difference to elicit user's continuous turn-taking with A.I. agent in TV
UbiComp/ISWC '19 Adjunct: Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers, Pages 153–156. https://doi.org/10.1145/3341162.3343848
This study investigates the effect of different types of cues in terms of modality and message form applied in a turn-taking interaction with a virtual agent. We divided message form into implicit and explicit, and modality into visual and auditory. The ...
- research-article, May 2019
Explicating "Implicit Interaction": An Examination of the Concept and Challenges for Research
CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Paper No.: 417, Pages 1–16. https://doi.org/10.1145/3290605.3300647
The term implicit interaction is often used to denote interactions that differ from traditional purposeful and attention-demanding ways of interacting with computers. However, there is a lack of agreement about the term's precise meaning. This paper ...
- research-article, March 2019
A Wearable Nebula: Material Investigations of Implicit Interaction
TEI '19: Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction, Pages 625–633. https://doi.org/10.1145/3294109.3295623
In this paper, we present the Nebula, a garment that translates intentional gestures and implicit interaction into sound. Nebula is a studded cloak made from a heavy fabric that envelops the wearer with many pendulous folds. We describe the design ...
- research-article, November 2017
Simplifying game mechanics: gaze as an implicit interaction method
SA '17: SIGGRAPH Asia 2017 Technical Briefs, Article No.: 4, Pages 1–4. https://doi.org/10.1145/3145749.3149446
This paper explores the possibilities of using player gaze as an implicit interaction method, to simplify game mechanics in a space shooting video game. First, a set of five experienced players were eye-tracked while playing the game Ikaruga where gaze ...
- research-article, November 2017
Metatation: Annotation as Implicit Interaction to Bridge Close and Distant Reading
ACM Transactions on Computer-Human Interaction (TOCHI), Volume 24, Issue 5, Article No.: 35, Pages 1–41. https://doi.org/10.1145/3131609
In the domain of literary criticism, many critics practice close reading, annotating by hand while performing a detailed analysis of a single text. Often this process employs external resources to aid analysis. In this article, we present a ...
- Work in Progress, June 2017
Leaky Objects: Implicit Information, Unintentional Communication
DIS '17 Companion: Proceedings of the 2017 ACM Conference Companion Publication on Designing Interactive Systems, Pages 182–186. https://doi.org/10.1145/3064857.3079142
This paper introduces the concept of leaky objects to describe the phenomenon in which shared objects unintentionally reveal implicit information about individual or collective users. This leaking of implicit information changes our individual ...
- Work in Progress, June 2017
Musico: Personal Playlists through Peripheral and Implicit Interaction
DIS '17 Companion: Proceedings of the 2017 ACM Conference Companion Publication on Designing Interactive Systems, Pages 121–126. https://doi.org/10.1145/3064857.3079131
While listening to music has been a part of everyday life for ages, access to unlimited numbers of songs has never been as ubiquitous as it has become with the introduction of streaming services and mobile Internet access. However, creating and updating ...
- extended-abstract, September 2016
Smart lighting in dementia care facility
UbiComp '16: Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, Pages 1636–1639. https://doi.org/10.1145/2968219.2968526
The growing number of older people (demographic change) entails more age-related deficits and diseases. One of them is dementia, a complex neurodegenerative syndrome, which affects patients' cognitive abilities (e.g. short-term memory, attention, ...
- research-article, May 2016
EyeGrip: Detecting Targets in a Series of Uni-directional Moving Objects Using Optokinetic Nystagmus Eye Movements
CHI '16: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, Pages 5801–5811. https://doi.org/10.1145/2858036.2858584
EyeGrip proposes a novel yet simple technique of analysing eye movements for automatically detecting the user's objects of interest in a sequence of visual stimuli moving horizontally or vertically in front of the user's view. We assess the ...