Augmented Reality Streams for Cloud-Based Rendering
A promising technique for realizing augmented reality on future lightweight glasses is to offload computationally intensive rendering tasks to the cloud. This, however, places considerable demands on the network as well as the air interface with respect to latency, reliability, and throughput. For evaluating such architectures and for traffic modelling, a dataset is provided that contains realistic payloads of cloud-rendered augmented reality in the form of video files. Furthermore, the video files were encoded for low-latency streaming, and the timestamps and payloads of the data traffic to be transmitted over the network were recorded.
Provided are the raw video files after rendering, at a resolution of 7200x6360 pixels. For low-latency encoding, libx264 (ffmpeg version 4.2.4) was used with the flags -preset ultrafast -tune zerolatency at a target bitrate of 8 Mbit/s. Streaming takes place with ffmpeg and the custom nut output muxer; the resulting packetized output is sent to a UDP port on localhost. Both the encoded video files and the captured traffic traces are provided.
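The encoding and streaming setup described above can be sketched as a single ffmpeg invocation. This is a minimal reconstruction under stated assumptions: the input filename (raw_render.mp4) and the UDP port (5000) are placeholders, not part of the dataset description; the codec, preset, tune, bitrate, muxer, and localhost destination follow the text.

```shell
# Low-latency encode of a rendered video and packetized streaming over UDP,
# as described in the dataset documentation. Filename and port are hypothetical.
ffmpeg -re -i raw_render.mp4 \
    -c:v libx264 -preset ultrafast -tune zerolatency \
    -b:v 8M \
    -f nut udp://127.0.0.1:5000    # nut output muxer, sent to a UDP port on localhost
```

The -re flag makes ffmpeg read the input at its native frame rate, so the packet timing on the UDP socket approximates a live stream; the traffic traces in the dataset could then be captured on the receiving port (e.g. with tcpdump).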