Jun 2016 software

360° Video Viewing Dataset in Head-Mounted Virtual Reality

Description

Our dataset provides both content data (such as image saliency maps and motion maps derived from 360° videos) and sensor data (such as viewer head positions and orientations captured by HMD sensors). We also made an extra effort to align the content and sensor data using the timestamps in the raw log files. The dataset subfolders are described below:

1. Saliency: contains the videos with the per-frame saliency maps computed with Cornia et al.'s model (available at https://arxiv.org/abs/1609.01064); a saliency map indicates how strongly each region of the original video frame attracts viewer attention.

2. Motion: contains the videos with the optical flow computed from each pair of consecutive frames; the optical flow indicates the relative motion between the objects in the 360° video and the viewer.

3. Raw: contains the raw sensor data (raw x, raw y, raw z, raw yaw, raw roll, and raw pitch) with timestamps, captured by OpenTrack while the viewers watched the 360° videos.

4. Orientation: contains the orientation data (raw x, raw y, raw z, raw yaw, raw roll, and raw pitch), aligned with the timestamp of each video frame. The calibrated orientation data (cal. yaw, cal. pitch, and cal. roll) are provided as well.

5. Tile: contains the numbers of the tiles overlapped by the viewer's Field of View (FoV) according to the orientation data, where the tile size is 192x192 pixels. Knowing which tiles overlap the viewer's FoV is useful for optimizing 360° video streaming to HMDs; for example, a system can stream only those tiles to reduce the required bandwidth, or allocate a higher bitrate to them for a better user experience.
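To illustrate how the Tile data relates to the Orientation data, the sketch below maps a yaw/pitch pair to a viewport rectangle on the equirectangular frame and collects the 192x192 tiles it overlaps. The frame resolution and FoV values are assumptions (the dataset's readme documents the actual values), and approximating the HMD viewport as an axis-aligned rectangle is a simplification of the true FoV shape.

```python
# Hypothetical parameters -- only the 192x192 tile size comes from the
# dataset description; the resolution and FoV below are assumptions.
FRAME_W, FRAME_H = 3840, 1920   # assumed equirectangular frame size
TILE = 192                      # tile size stated in the description
FOV_H = FOV_V = 100.0           # assumed FoV in degrees

def fov_tiles(yaw_deg, pitch_deg):
    """Return the set of (row, col) tiles overlapped by the viewer's FoV.

    yaw_deg in [-180, 180), pitch_deg in [-90, 90).
    """
    cx = (yaw_deg + 180.0) / 360.0 * FRAME_W   # viewport center (pixels)
    cy = (90.0 - pitch_deg) / 180.0 * FRAME_H
    half_w = FOV_H / 360.0 * FRAME_W / 2.0     # viewport half-extents
    half_h = FOV_V / 180.0 * FRAME_H / 2.0
    top = max(0, int(cy - half_h))             # clamp pitch at the poles
    bottom = min(FRAME_H - 1, int(cy + half_h))
    tiles = set()
    for row in range(top // TILE, bottom // TILE + 1):
        x = int(cx - half_w)
        while x <= int(cx + half_w):
            tiles.add((row, (x % FRAME_W) // TILE))  # yaw wraps around
            x += TILE
        tiles.add((row, (int(cx + half_w) % FRAME_W) // TILE))
    return tiles
```

Looking straight ahead (yaw 0, pitch 0) selects a centered block of tiles, while a yaw near ±180° produces a tile set that wraps across the left and right edges of the frame.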


Assets

Software (360dataset.zip)

Instructions

General Installation

Software Dependencies:

- OpenTrack (available at https://github.com/opentrack/opentrack): head-tracking software for MS Windows, Linux, and Apple OSX.
- Oculus Software Development Kit (SDK) (available at https://developer.oculus.com/): enables you to build VR experiences for the Oculus Rift in C++.
- Oculus Video (available at https://www.oculus.com/experiences/rift/926562347437041/): enables you to watch 360° videos and your own video files in an immersive way.
- GamingAnywhere (available at http://gaminganywhere.org/index.html): an open-source cloud gaming platform, used as our frame capturer.

General Installation:

None

Other Instructions:

An Oculus Rift DK2, a PC workstation, and a VR-ready GPU are required.

Experimental Installation

Installation:

This dataset consists of content data (10 videos from YouTube) and sensor data (from 50 subjects) for 360° video streaming to HMDs. The details are available in the attached readme file.
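As noted above, the sensor samples are aligned with the video frame times. A minimal sketch of one way to perform such an alignment is shown below, assuming a hypothetical record layout of (timestamp, yaw, pitch, roll) sorted by time; the dataset's actual log format is documented in the readme.

```python
import bisect

def align_to_frames(samples, fps, n_frames):
    """Map each video frame to the nearest sensor sample in time.

    samples: list of (t_seconds, yaw, pitch, roll), sorted by t_seconds
             (a hypothetical layout, not the dataset's actual format).
    Returns one (yaw, pitch, roll) tuple per frame.
    """
    times = [s[0] for s in samples]
    aligned = []
    for i in range(n_frames):
        t = i / fps                      # timestamp of frame i
        j = bisect.bisect_left(times, t)
        if j == 0:                       # frame before the first sample
            k = 0
        elif j >= len(times):            # frame after the last sample
            k = len(times) - 1
        else:                            # pick the closer neighbor
            k = j if times[j] - t < t - times[j - 1] else j - 1
        aligned.append(samples[k][1:])
    return aligned
```

With 25 fps video and sensor samples logged every 40 ms, each frame maps to the sample sharing its timestamp; at mismatched rates, the nearest sample in time is chosen. Linear interpolation between the two neighboring samples would be a natural refinement.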

Parameterization:

No parameters needed.

Workflows:

None

Evaluation:

The scripts generate all of the result data used to plot the figures in the paper.


Data Documentation

The dataset can be used by researchers, engineers, and hobbyists either to optimize existing 360° video streaming applications (such as rate-distortion optimization) or to build novel applications (such as crowd-driven camera movements).


License

Permission to use, copy, modify and distribute this dataset is hereby granted, provided that both the copyright notice and this permission notice appear in all copies of the software, derivative works or modified versions, and any portions thereof, and that both notices appear in supporting documentation.
