VR Video Quality in the Wild
- Submitted by: Wen Wen
- Last updated: Tue, 11/21/2023 - 11:02
- DOI: 10.21227/xgp1-vm07
Abstract
Investigating how people perceive virtual reality videos in the wild (i.e., those captured by everyday users) is a crucial and challenging task in VR-related applications due to complex authentic distortions localized in space and time. Existing panoramic video databases only consider synthetic distortions, assume fixed viewing conditions, and are limited in size. To overcome these shortcomings, we construct the VR Video Quality in the Wild (VRVQW) database, one of the first of its kind, containing 502 user-generated videos with diverse content and distortion characteristics. Based on VRVQW, we conduct a formal psychophysical experiment to record the scanpaths and perceived quality scores from 139 participants under two different viewing conditions. We provide a thorough statistical analysis of the recorded data, observing a significant impact of viewing conditions on both human scanpaths and perceived quality. Moreover, we develop an objective quality assessment model for VR videos based on pseudocylindrical representation and convolution. Results on the proposed VRVQW show that our method is superior to existing video quality assessment models, only underperforming viewport-based models that otherwise rely on human scanpaths for projection. We have made the database and code available at https://github.com/limuhit/VR-Video-Quality-in-the-Wild.
1. 'Video' folder: contains the 502 original panoramic videos.
2. 'MOS_Score' folder: contains the MOS of the panoramic videos under four viewing conditions (two starting points and two exploration times), split into 16 spreadsheets according to the experimental session.
- Starting point I: Session1A.xlsx, Session2A.xlsx, Session3A.xlsx, ..., Session8A.xlsx
- Starting point II: Session1B.xlsx, Session2B.xlsx, Session3B.xlsx, ..., Session8B.xlsx
MOS data format (each session spreadsheet; see the loading sketch below):
- Column A: video name
- Column B: MOS with an exploration time of 7 seconds
- Column C: MOS with an exploration time of 15 seconds
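A minimal sketch of loading one MOS spreadsheet with pandas, assuming the three-column layout above and no header row (adjust `header` if the files contain one); the file path and column names are illustrative, not part of the dataset.

```python
import pandas as pd  # reading .xlsx also requires openpyxl

# Illustrative path; adjust to wherever the dataset is unpacked.
mos_path = "MOS_Score/Session1A.xlsx"

# Columns A-C: video name, MOS at 7 s exploration, MOS at 15 s exploration.
# header=None assumes no header row in the spreadsheet.
mos = pd.read_excel(
    mos_path,
    header=None,
    names=["video_name", "mos_7s", "mos_15s"],
    usecols=[0, 1, 2],
)

print(mos.head())
print("Mean MOS at 7 s exploration: %.2f" % mos["mos_7s"].mean())
```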
3. 'HM_EM' folder: records each subject's head movement (HM) and eye movement (EM), divided into 16 subfolders according to the experimental session; each subfolder contains the viewing data of 20-21 subjects.
- Starting point I: Data_Session1A, Data_Session2A, Data_Session3A, ..., Data_Session8A
- Starting point II: Data_Session1B, Data_Session2B, Data_Session3B, ..., Data_Session8B
Subfolder hierarchy (one numbered folder per subject, one .xlsx file per video; a traversal sketch follows):
- Data_Session1A
  - 001
    - A_Bay.xlsx
    - A_Boat.xlsx
    - A_Coffee.xlsx
    - ...
    - p_ViennaMuseum.xlsx
  - 002
  - 003
  - ...
  - 021
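A sketch of enumerating the per-subject trajectory files inside one session folder, assuming the layout above; the directory name is illustrative.

```python
from pathlib import Path

# Illustrative root; adjust to wherever the dataset is unpacked.
session_dir = Path("HM_EM/Data_Session1A")

# Each numbered subfolder is one subject; each .xlsx inside is the
# HM/EM trajectory recorded while that subject watched one video.
for subject_dir in sorted(session_dir.iterdir()):
    if not subject_dir.is_dir():
        continue
    video_files = sorted(subject_dir.glob("*.xlsx"))
    print(f"Subject {subject_dir.name}: {len(video_files)} video trajectories")
```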
HM and EM data structure (columns of each per-video spreadsheet; see the parsing sketch below):
- Column A: HM_pitch, value range (-90, 90)
- Column B: HM_yaw, value range (-180, 180)
- Column C: HM_roll, value range (-180, 180)
- Column D: EM_latitude, value range (-90, 90)
- Column E: EM_longitude, value range (-180, 180)
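A minimal sketch of reading one trajectory file and converting the eye-movement samples to unit vectors on the sphere, assuming the five-column layout above, no header row, and that the angle values are in degrees (suggested by the ranges, but not stated explicitly); the path and column names are illustrative.

```python
import numpy as np
import pandas as pd  # reading .xlsx also requires openpyxl

# Illustrative path; adjust to wherever the dataset is unpacked.
traj_path = "HM_EM/Data_Session1A/001/A_Bay.xlsx"

# Columns A-E: HM_pitch, HM_yaw, HM_roll, EM_latitude, EM_longitude.
# header=None assumes no header row; angles are assumed to be in degrees.
traj = pd.read_excel(
    traj_path,
    header=None,
    names=["hm_pitch", "hm_yaw", "hm_roll", "em_lat", "em_lon"],
    usecols=[0, 1, 2, 3, 4],
)

# Convert eye-movement latitude/longitude to 3D unit vectors
# (x axis pointing toward latitude 0, longitude 0).
lat = np.deg2rad(traj["em_lat"].to_numpy())
lon = np.deg2rad(traj["em_lon"].to_numpy())
gaze = np.stack(
    [np.cos(lat) * np.cos(lon),   # x
     np.cos(lat) * np.sin(lon),   # y
     np.sin(lat)],                # z
    axis=1,
)

print(traj.head())
print("First gaze direction (unit vector):", gaze[0])
```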