Open access

Is Multimedia Multisensorial? - A Review of Mulsemedia Systems

Published: 04 September 2018

Abstract

    Mulsemedia—multiple sensorial media—makes it possible to layer sensory stimulation and interaction across multiple sensory channels. The recent upsurge in technology and wearables provides mulsemedia researchers with a vehicle for potentially boundless choice. However, building systems that integrate various senses still raises issues that need to be addressed. This review deals with mulsemedia topics that remain insufficiently explored by previous work, focusing on the multi-multi (multiple media-multiple senses) perspective, in which multiple types of media engage multiple senses. Moreover, it traces the evolution of previously identified challenges in this area and formulates new directions for exploration.
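
    As a rough illustration of what the multi-multi perspective implies for system builders, the sketch below schedules effects for several sensory channels against a single media timeline, with a uniform skew standing in for the inter-stream synchronization error that mulsemedia evaluations typically measure against the audiovisual track. This is a minimal, hypothetical example: SensoryEffect, schedule, and skew_ms are illustrative names and do not come from any system surveyed in the article.

    ```python
    import heapq
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class SensoryEffect:
        # Ordering compares only start_ms, so a heap yields effects in timeline order.
        start_ms: int
        channel: str = field(compare=False)   # e.g. "scent", "haptic", "wind"
        payload: str = field(compare=False)   # device-specific effect description

    def schedule(effects, skew_ms=0):
        """Yield (time_ms, channel, payload) in presentation order.

        skew_ms shifts every effect by a constant offset, a crude stand-in
        for inter-stream synchronization error relative to the media track.
        """
        queue = list(effects)
        heapq.heapify(queue)
        while queue:
            e = heapq.heappop(queue)
            yield e.start_ms + skew_ms, e.channel, e.payload

    # Hypothetical effects authored against one audiovisual timeline.
    effects = [
        SensoryEffect(12_000, "scent", "citrus burst"),
        SensoryEffect(0, "haptic", "seat rumble"),
        SensoryEffect(4_500, "wind", "light breeze"),
    ]

    for t, channel, payload in schedule(effects, skew_ms=250):
        print(f"{t:>6} ms  {channel:<6}  {payload}")
    ```

    A real renderer would replace the print with device commands and would likely need a per-channel rather than uniform skew, since the olfaction-enhanced multimedia literature reports that tolerable synchronization error differs markedly across senses.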




    Published In

    ACM Computing Surveys, Volume 51, Issue 5
    September 2019, 791 pages
    ISSN: 0360-0300
    EISSN: 1557-7341
    DOI: 10.1145/3271482
    Editor: Sartaj Sahni
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 04 September 2018
    Accepted: 01 June 2018
    Revised: 01 March 2018
    Received: 01 September 2017
    Published in CSUR Volume 51, Issue 5


    Author Tags

    1. Mulsemedia
    2. cross-modal
    3. multimedia
    4. multisensory
    5. quality of experience

    Qualifiers

    • Survey
    • Research
    • Refereed

    Funding Sources

    • European Union's Horizon 2020 Research and Innovation program


    Cited By

    • (2024) Sensing Heritage: Exploring Creative Approaches for Capturing, Experiencing and Safeguarding the Sensorial Aspects of Cultural Heritage. Companion Publication of the 2024 ACM Designing Interactive Systems Conference, 445-448. DOI: 10.1145/3656156.3658400. Online publication date: 1-Jul-2024.
    • (2024) Remote Rhythms: Audience-informed insights for designing remote music performances. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 2675-2690. DOI: 10.1145/3643834.3660719. Online publication date: 1-Jul-2024.
    • (2024) Mulseplayer: A Multi-Sensorial Media Content Delivery Solution to Enhance End-User Quality of Experience. 2024 IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB), 1-6. DOI: 10.1109/BMSB62888.2024.10608351. Online publication date: 19-Jun-2024.
    • (2024) Multisensory Metaverse-6G: A New Paradigm of Commerce and Education. IEEE Access 12, 75657-75677. DOI: 10.1109/ACCESS.2024.3392838. Online publication date: 2024.
    • (2024) Using olfactory cues in text materials benefits delayed retention and schemata construction. Scientific Reports 14, 1. DOI: 10.1038/s41598-024-68885-8. Online publication date: 1-Aug-2024.
    • (2024) Advances in Piezoelectret Materials-Based Bidirectional Haptic Communication Devices. Advanced Materials 36, 33. DOI: 10.1002/adma.202405308. Online publication date: 27-Jun-2024.
    • (2023) Challenges and Opportunities of Force Feedback in Music. Arts 12, 4 (147). DOI: 10.3390/arts12040147. Online publication date: 10-Jul-2023.
    • (2023) Welcome to SensoryX 2023. Proceedings of the 2023 ACM International Conference on Interactive Media Experiences Workshops, 43-45. DOI: 10.1145/3604321.3604348. Online publication date: 12-Jun-2023.
    • (2023) Semi-automatic mulsemedia authoring analysis from the user's perspective. Proceedings of the 14th ACM Multimedia Systems Conference, 249-256. DOI: 10.1145/3587819.3590979. Online publication date: 7-Jun-2023.
    • (2023) ICAMUS: Evaluation Criteria of an Interactive Multisensory Authoring Tool. Proceedings of the 2023 ACM International Conference on Interactive Media Experiences, 174-179. DOI: 10.1145/3573381.3596472. Online publication date: 12-Jun-2023.
