Augmented Reality Streams for Cloud-Based Rendering

Citation Author(s):
Andreas
Traßl
Technische Universität Dresden; Centre for Tactile Internet with Human-in-the-Loop
Nick
Schwarzenberg
Technische Universität Dresden
Philipp
Schulz
Technische Universität Dresden
Submitted by:
Andreas Trassl
Last updated:
Mon, 07/12/2021 - 07:33
DOI:
10.21227/jjan-tj96

Abstract 

A promising technique for realizing augmented reality on future lightweight glasses is to offload computationally intensive rendering tasks to the cloud. This, however, places considerable demands on the network as well as the air interface with respect to latency, reliability, and throughput. To support the evaluation of such architectures and traffic modelling, this dataset provides realistic payloads of cloud-rendered augmented reality in the form of video files. The video files were encoded for low-latency streaming, and the time stamps and payloads of the data traffic to be transmitted over the network were recorded.

Instructions: 

Provided are the raw video files after rendering, at a resolution of 7200x6360 pixels. For low-latency encoding, libx264 (ffmpeg version 4.2.4) was used with the flags -preset ultrafast -tune zerolatency at a target bitrate of 8 Mbit/s. Streaming is performed with ffmpeg and the custom nut output muxer, and the resulting packetized output is sent to a UDP port on localhost. Both the encoded video files and the captured traffic traces are provided.
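The encode-and-stream step described above can be sketched as a single ffmpeg invocation. The codec, preset, tuning, target bitrate, and nut muxer are taken from the description; the input filename and UDP port are illustrative assumptions, not part of the dataset:

```shell
# Sketch of the low-latency encode-and-stream step (ffmpeg 4.2.4).
# "rendered_input.mp4" and port 5000 are example values, not from the dataset.
# -re reads the input at its native frame rate to emulate live streaming.
ffmpeg -re -i rendered_input.mp4 \
       -c:v libx264 -preset ultrafast -tune zerolatency \
       -b:v 8M \
       -f nut "udp://127.0.0.1:5000"
```

The resulting packet stream on localhost can then be captured (e.g. with tcpdump or Wireshark) to obtain traffic traces of the kind included in the dataset.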