
JackIn Head: Immersive Visual Telepresence System with Omnidirectional Wearable Camera

Published: 01 March 2017

Abstract

Sharing one's own immersive experience over the Internet is one of the ultimate goals of telepresence technology. In this paper, we present JackIn Head, a visual telepresence system featuring an omnidirectional wearable camera with image motion stabilization. Spherical omnidirectional video footage taken around the head of a local user is stabilized and then broadcast to others, allowing remote users to explore the immersive visual environment independently of the local user's head direction. We describe the system design of JackIn Head and report the evaluation results of real-time image stabilization and alleviation of cybersickness. Then, through an exploratory observation study, we investigate how individuals can remotely interact, communicate with, and assist each other with our system. We report our observation and analysis of inter-personal communication, demonstrating the effectiveness of our system in augmenting remote collaboration.
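The key technical step the abstract describes is compensating the wearer's head rotation so that the spherical video delivered to remote users stays fixed to a stable frame of reference. As a rough illustrative sketch only, not the authors' implementation, the Python snippet below remaps an equirectangular frame by the inverse of an estimated head rotation. The orientation estimate R_head, the function name, and the use of NumPy/OpenCV are assumptions for illustration; the paper's own pipeline may differ (e.g., in how orientation is estimated and how the remap is accelerated for real time).

```python
# Hedged sketch: rotational stabilization of an equirectangular (360-degree) frame.
# NOT the JackIn Head implementation; it only illustrates the general idea of
# cancelling the wearer's head rotation before broadcasting the spherical video.
# R_head is assumed to come from some orientation estimate (IMU or visual tracking).

import numpy as np
import cv2


def stabilize_equirectangular(frame, R_head):
    """Remap an equirectangular frame by the inverse of the head rotation.

    frame  : H x W x 3 image in equirectangular projection (360 x 180 degrees).
    R_head : 3x3 rotation matrix mapping camera/head-frame vectors to the world frame.
    """
    h, w = frame.shape[:2]

    # Output pixel grid -> spherical angles (longitude, latitude) in the stabilized frame.
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    lon = (xs / w) * 2.0 * np.pi - np.pi          # [-pi, pi)
    lat = np.pi / 2.0 - (ys / h) * np.pi          # [pi/2, -pi/2)

    # Spherical angles -> unit direction vectors in the world frame.
    dirs = np.stack([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)], axis=-1)       # H x W x 3

    # Rotate world directions into the camera frame: d_cam = R_head^T * d_world.
    dirs_cam = dirs @ R_head                      # row-vector form of R^T * d

    # Camera-frame directions -> source pixel coordinates in the input frame.
    lon_c = np.arctan2(dirs_cam[..., 1], dirs_cam[..., 0])
    lat_c = np.arcsin(np.clip(dirs_cam[..., 2], -1.0, 1.0))
    map_x = ((lon_c + np.pi) / (2.0 * np.pi) * w).astype(np.float32)
    map_y = ((np.pi / 2.0 - lat_c) / np.pi * h).astype(np.float32)

    return cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_WRAP)
```

Running each incoming frame through such a remap before streaming would, in principle, keep world directions stable for the remote viewer regardless of the wearer's head motion, which is the behavior the paper's stabilization and cybersickness evaluations examine.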



Information

Published In

IEEE Transactions on Visualization and Computer Graphics, Volume 23, Issue 3, March 2017, 106 pages

Publisher

IEEE Educational Activities Department, United States

Qualifiers

  • Research-article

Cited By

  • (2024) "Like I was There:" A User Evaluation of an Interpersonal Telepresence System Developed through Value Sensitive Design. Proceedings of the ACM on Human-Computer Interaction, 8(CSCW2), pp. 1-18. DOI: 10.1145/3687015. Online publication date: 7-Nov-2024.
  • (2024) Using Co-Design with Streamers and Viewers to Identify Values and Resolve Tensions in the Design of Interpersonal Wearable Telepresence Systems. Proceedings of the ACM on Human-Computer Interaction, 8(CSCW1), pp. 1-21. DOI: 10.1145/3637425. Online publication date: 26-Apr-2024.
  • (2023) OmniSense: Exploring Novel Input Sensing and Interaction Techniques on Mobile Device with an Omni-Directional Camera. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, pp. 1-18. DOI: 10.1145/3544548.3580747. Online publication date: 19-Apr-2023.
  • (2022) The Impact of Sharing Gaze Behaviours in Collaborative Mixed Reality. Proceedings of the ACM on Human-Computer Interaction, 6(CSCW2), pp. 1-27. DOI: 10.1145/3555564. Online publication date: 11-Nov-2022.
  • (2022) PITAS: Sensing and Actuating Embedded Robotic Sheet for Physical Information Communication. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, pp. 1-16. DOI: 10.1145/3491102.3517532. Online publication date: 29-Apr-2022.
  • (2021) Portable 3D Human Pose Estimation for Human-Human Interaction using a Chest-Mounted Fisheye Camera. Proceedings of the Augmented Humans International Conference 2021, pp. 116-120. DOI: 10.1145/3458709.3458986. Online publication date: 22-Feb-2021.
  • (2021) Bridging the Socio-Technical Gaps in Body-worn Interpersonal Live-Streaming Telepresence through a Critical Review of the Literature. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW1), pp. 1-39. DOI: 10.1145/3449194. Online publication date: 22-Apr-2021.
  • (2021) ShiSha. Proceedings of the ACM on Human-Computer Interaction, 4(CSCW3), pp. 1-22. DOI: 10.1145/3432950. Online publication date: 5-Jan-2021.
  • (2021) eyemR-Vis: Using Bi-Directional Gaze Behavioural Cues to Improve Mixed Reality Remote Collaboration. Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, pp. 1-7. DOI: 10.1145/3411763.3451844. Online publication date: 8-May-2021.
  • (2021) Exploring Human-to-Human Telepresence and the Use of Vibro-Tactile Commands to Guide Human Streamers. Virtual, Augmented and Mixed Reality, pp. 183-202. DOI: 10.1007/978-3-030-77599-5_15. Online publication date: 24-Jul-2021.
