- other, July 2023
DJuggling: Sonification of expressive movement performance
SIGGRAPH '23: ACM SIGGRAPH 2023 Real-Time Live!, Article No.: 2, Pages 1–2, https://doi.org/10.1145/3588430.3597246
In the real-time demo, we demonstrate how to create a musical performance using juggling movement as an instrument. We have equipped juggling balls with accelerometers, gyroscopes, and WiFi sensors. The system measures acceleration and rotation in a ...
- abstract, October 2020
Draw Portraits by Music: A Music based Image Style Transformation
MM '20: Proceedings of the 28th ACM International Conference on Multimedia, Pages 4399–4400, https://doi.org/10.1145/3394171.3416340
"Draw portraits by music" is an interactive work of art. Compared with music visualization and image style conversion, it is AI's imitation of human synaesthesia. New portraits gradually appear on the screen, synchronized with music in real time. ...
- abstract, November 2019
Synesthesia Wear: Full-body haptic clothing interface based on two-dimensional signal transmission
- Taichi Furukawa,
- Nobuhisa Hanamitsu,
- Yoichi Kamiyama,
- Hideaki Nii,
- Charalampos Krekoukiotis,
- Kouta Minamizawa,
- Akihito Noda,
- Junko Yamada,
- Keiichi Kitamura,
- Daisuke Niwa,
- Yoshiaki Hirano,
- Tetsuya Mizuguchi
In this paper, we present Synesthesia Wear, a full-body, customizable haptic interface, and demonstrate its capabilities with an untethered spatial computing experience. This wearable not only looks as flexible as ordinary cloth, but also the attached ...
- research-article, September 2019
Syn(es)thetic reality: simulating synesthesia for the non synesthetic
ISWC '19: Proceedings of the 2019 ACM International Symposium on Wearable Computers, Pages 290–295, https://doi.org/10.1145/3341163.3346940
Syn(es)thetic Reality explores a new way of sensing the world by understanding sounds through colors. It looks to simulate projective chromesthesia, an experience of seeing colors involuntarily as a result of sound input. The project achieves this ...
- abstract, May 2019
Translating Affective Touch into Text
CHI EA '19: Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, Paper No.: LBW0175, Pages 1–6, https://doi.org/10.1145/3290607.3313015
This paper presents a game-like experience that translates tactile input into text, which captures the emotional qualities of that touch. We describe the experience and the system that generates it: a plush toy instrumented with pressure sensors, a ...
- abstract, March 2018
The Screaming Sun: Choreographing Synesthesia
TEI '18: Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction, Pages 562–566, https://doi.org/10.1145/3173225.3173308
Digital devices usually contain pre-programmed constraints and behaviours. These behaviours are programmed by the architect of these devices. Analogue devices, on the other hand, are bound by the properties of the components connected to them. ...
- abstract, March 2018
Opto-Phono-Kinesia (OPK): Designing Motion-Based Interaction for Expert Performers
TEI '18: Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction, Pages 487–492, https://doi.org/10.1145/3173225.3173295
Opto-Phono-Kinesia (OPK) is an audio-visual performance piece in which all media elements are controlled by the body movements of a single performer. The title is a play on a possible synesthetic state involving connections between vision, sound and ...
- poster, July 2016
Synesthesia suit: the full body immersive experience
SIGGRAPH '16: ACM SIGGRAPH 2016 Posters, Article No.: 71, Page 1, https://doi.org/10.1145/2945078.2945149
The Synesthesia Suit provides an immersive embodied experience in a Virtual Reality environment with vibro-tactile sensations on the entire body. Each vibro-tactile actuator provides not a simple vibration such as that of a traditional game controller; instead, we designed ...
- other, July 2016
Synesthesia suit: the full body immersive experience
SIGGRAPH '16: ACM SIGGRAPH 2016 VR Village, Article No.: 20, Page 1, https://doi.org/10.1145/2929490.2932629
The Synesthesia Suit provides an immersive embodied experience in a Virtual Reality environment with vibro-tactile sensations on the entire body. Each vibro-tactile actuator provides not a simple vibration such as that of a traditional game controller; instead, we designed ...
- research-article, June 2013
Site Weave: revealing interconnections through music
C&C '13: Proceedings of the 9th ACM Conference on Creativity & Cognition, Pages 418–419, https://doi.org/10.1145/2466627.2481215
Site Weave is an interactive installation exploring the idea that different locations within a site are interconnected through music; that is, they exist in a musical network that creates a natural relationship between them. This network is revealed ...
- poster, April 2013
The sound of light: induced synesthesia for augmenting the photography experience
CHI EA '13: CHI '13 Extended Abstracts on Human Factors in Computing Systems, Pages 745–750, https://doi.org/10.1145/2468356.2468489
This paper presents a novel approach to assist users of digital cameras and augment their photograph-taking experience. To this end, a real-time analysis of the images framed by the camera is conducted to assess composition, exposure and presence of ...
- extended-abstract, May 2012
scoreLight & scoreBots
CHI EA '12: CHI '12 Extended Abstracts on Human Factors in Computing Systems, Pages 1011–1014, https://doi.org/10.1145/2212776.2212373
"scoreLight" and "scoreBots" are two experimental platforms for performative sound design and manipulation. Both are essentially synesthetic interfaces - synesthetic musical instruments - capable of translating free-hand drawings into a sonic language ...
- research-article, October 2010
Synesthetic video: hearing colors, seeing sounds
MindTrek '10: Proceedings of the 14th International Academic MindTrek Conference: Envisioning Future Media Environments, Pages 130–133, https://doi.org/10.1145/1930488.1930515
In this paper we present Synesthetic Video, an interactive video that allows viewers to experience video in cross-sensorial ways, to hear its colors and to influence its visual properties with sound and music, through user interaction or ambient influence. Our ...
- abstract, January 2011
SoLu: hyperinstrument
TEI '11: Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction, Pages 405–406, https://doi.org/10.1145/1935701.1935802
A demonstration of a proposed correspondence between light and sound. The proposal is materially exemplified by means of a new hyperinstrument, which gives its users control over a multi-sensorial algorithmic composition generated in real time. ...
- Article, July 2009
COL.diesis: Transforming Colour into Melody and Implementing the Result in a Colour Sensor Device
VIZ '09: Proceedings of the 2009 Second International Conference in Visualisation, Pages 30–35, https://doi.org/10.1109/VIZ.2009.49
This paper presents a system which explores the reality of colour in relation to sound, dedicated first and foremost to blind people, the challenges they live with, their psychology and their emotions. It is an exploration of the infinite range of colours that ...
- research-article, March 2008
Cheek to Chip: Dancing Robots and AI's Future
- Jean-Julien Aucouturier,
- Katsushi Ikeuchi,
- Hirohisa Hirukawa,
- Shin'ichiro Nakaoka,
- Takaaki Shiratori,
- Shunsuke Kudoh,
- Fumio Kanehiro,
- Tetsuya Ogata,
- Hideki Kozima,
- Hiroshi G. Okuno,
- Marek P. Michalowski,
- Yuta Ogai,
- Takashi Ikegami,
- Kazuhiro Kosuge,
- Takahiro Takeda,
- Yasuhisa Hirata
IEEE Intelligent Systems (IEEECS-INTELLI-NEW), Volume 23, Issue 2, Pages 74–84, https://doi.org/10.1109/MIS.2008.22
More and more AI researchers are trying to make robots dance to music. This installment of T&C features five essays showing how this research addresses issues that are central to dance.
- Article, June 2007
MULTI: multiple user interactive template installation
C&C '07: Proceedings of the 6th ACM SIGCHI Conference on Creativity & Cognition, Page 294, https://doi.org/10.1145/1254960.1255030
The goal is to develop a software tool which simulates the experience of synesthesia to produce concrete, documentable expressions of creativity. A constitutive relationship between synesthesia and creativity (Campen, 2002) and an operational relationship between ...
- Article, December 2006
Where are my legs?: embodiment gaps in avatars
CyberGames '06: Proceedings of the 2006 International Conference on Game Research and Development, Pages 104–111
This paper identifies "gaps" in the manifestation and behaviour of avatars that commonly occur in games. These gaps in embodiment are sometimes surprisingly conspicuous, and cannot always be attributed to simple hardware or software limitations. Most of ...
- Article, June 2006
spinCycle: a color-tracking turntable sequencer
This report presents an interface for musical performance called the spinCycle. spinCycle enables performers to make visual patterns with brightly colored objects on a spinning turntable platter that get translated into musical arrangements in real-...