Research Article • Open Access

PRECYSE: Predicting Cybersickness using Transformer for Multimodal Time-Series Sensor Data

Published: 15 May 2024

Abstract

Cybersickness, a factor that hinders user immersion in VR, has been the subject of ongoing efforts to predict it with AI. Previous studies have built prediction models with CNNs and LSTMs and analyzed data with attention mechanisms and XAI, yet none has explored a transformer, which can better reflect the spatial and temporal characteristics of the data and thereby benefit both prediction and feature-importance analysis. In this paper, we propose cybersickness prediction models based on a transformer and multimodal time-series sensor data (i.e., eye movement, head movement, and physiological signals), considering sensor data pre-processing and multimodal data fusion methods. We constructed the MSCVR dataset, consisting of normalized sensor data, spectrogram-formatted sensor data, and cybersickness levels collected from 45 participants in a user study. We propose two methods for embedding multimodal time-series sensor data into the transformer: modality-specific spatial and temporal transformer encoders for normalized sensor data (MS-STTN) and a modality-specific spatial-temporal transformer encoder for spectrogram-formatted data (MS-STTS). MS-STTN yielded the highest performance in both the ablation study and the comparison with existing models. Furthermore, by analyzing the importance of data features, we determined their relevance to cybersickness over time, especially the salience of eye-movement features. Our results and insights derived from multimodal time-series sensor data and the transformer model provide a comprehensive understanding of cybersickness and its association with sensor data. Our MSCVR dataset and code are publicly available: https://github.com/dayoung-jeong/PRECYSE.git.
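
To make the modality-specific spatial and temporal encoding idea concrete, the following is a minimal PyTorch sketch, not the authors' MS-STTN implementation (that code is available in the linked repository). The modality names, feature counts, sequence length, number of cybersickness classes, and the concatenation-based late fusion are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class ModalityEncoder(nn.Module):
    """Per-modality encoder: one transformer attends across feature channels
    ("spatial"), another attends across time steps ("temporal")."""

    def __init__(self, seq_len: int, n_features: int, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        # Tokens for the spatial encoder are feature channels (each a length-T series).
        self.feature_proj = nn.Linear(seq_len, d_model)
        # Tokens for the temporal encoder are time steps (each an F-dimensional reading).
        self.time_proj = nn.Linear(n_features, d_model)
        self.spatial = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True), n_layers)
        self.temporal = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True), n_layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features)
        spatial_tokens = self.feature_proj(x.transpose(1, 2))   # (B, F, d_model)
        temporal_tokens = self.time_proj(x)                     # (B, T, d_model)
        s = self.spatial(spatial_tokens).mean(dim=1)            # pooled spatial code
        t = self.temporal(temporal_tokens).mean(dim=1)          # pooled temporal code
        return torch.cat([s, t], dim=-1)                        # (B, 2 * d_model)


class CybersicknessPredictor(nn.Module):
    """Concatenates modality-specific embeddings and predicts a sickness level
    (the 4-class head is an assumption; a regression head would work similarly)."""

    def __init__(self, modality_shapes: dict, n_classes: int = 4, d_model: int = 64):
        super().__init__()
        self.encoders = nn.ModuleDict({
            name: ModalityEncoder(seq_len, n_feat, d_model)
            for name, (seq_len, n_feat) in modality_shapes.items()})
        self.head = nn.Linear(2 * d_model * len(modality_shapes), n_classes)

    def forward(self, inputs: dict) -> torch.Tensor:
        fused = torch.cat([enc(inputs[name]) for name, enc in self.encoders.items()],
                          dim=-1)
        return self.head(fused)


# Hypothetical usage: 64 time steps of eye (6), head (6), and physiological (4) features.
model = CybersicknessPredictor({"eye": (64, 6), "head": (64, 6), "physio": (64, 4)})
batch = {"eye": torch.randn(8, 64, 6), "head": torch.randn(8, 64, 6),
         "physio": torch.randn(8, 64, 4)}
logits = model(batch)   # shape: (8, 4)
```

The point of the sketch is the structure the abstract describes: separate spatial and temporal attention within each modality, followed by fusion of the modality-specific embeddings before prediction.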

Information

Published In

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Volume 8, Issue 2
June 2024
1330 pages
EISSN: 2474-9567
DOI: 10.1145/3665317
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 15 May 2024
Published in IMWUT Volume 8, Issue 2

Author Tags

  1. Cybersickness
  2. Multimodal time-series sensor data
  3. Transformer
  4. Virtual reality

Qualifiers

  • Research-article
  • Research
  • Refereed

Funding Sources

  • The National Research Foundation (NRF)
  • Institute of Information & Communications Technology Planning & Evaluation (IITP)

Bibliometrics & Citations

Article Metrics

  • Downloads (Last 12 months): 1,065
  • Downloads (Last 6 weeks): 151
Reflects downloads up to 20 Feb 2025

Cited By

  • (2025) Exploring the Feasibility of Head-Tracking Data for Cybersickness Prediction in Virtual Reality. Electronics 14(3), 502. https://doi.org/10.3390/electronics14030502. Online publication date: 26-Jan-2025.
  • (2025) CPNet: Real-Time Cybersickness Prediction without Physiological Sensors for Cybersickness Mitigation. ACM Transactions on Sensor Networks. https://doi.org/10.1145/3716386. Online publication date: 5-Feb-2025.
  • (2024) Analysis of Cybersickness through Biosignals: an approach with Symbolic Machine Learning. Proceedings of the 26th Symposium on Virtual and Augmented Reality, 11-20. https://doi.org/10.1145/3691573.3691582. Online publication date: 30-Sep-2024.
  • (2024) Early Prediction of Cybersickness in Virtual Reality Using a Large Language Model for Multimodal Time Series Data. Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 25-29. https://doi.org/10.1145/3675094.3677578. Online publication date: 5-Oct-2024.
