Cloud Computing
This dataset contains client-server Round Trip Time (RTT) delays from an actual cloud gaming tournament run on the infrastructure of the cloud gaming company Swarmio Inc. The dataset can be used for designing algorithms and tuning models for user-server allocation and server selection. To collect the dataset, tournament players were connected to Swarmio servers, and delay measurements were taken in real time under actual networking conditions.
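For illustration, here is a minimal sketch of the greedy server selection such RTT measurements enable, assigning each player to the lowest-RTT server. The CSV file name and the column names ("player_id", "server_id", "rtt_ms") are assumptions for the example, not the dataset's documented schema.

```python
# Greedy server selection sketch: each player goes to the server with the
# lowest measured RTT. Column and file names are hypothetical.
import csv
from collections import defaultdict

def load_rtt(path):
    """Read (player, server) -> RTT in milliseconds from a CSV of measurements."""
    rtt = defaultdict(dict)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rtt[row["player_id"]][row["server_id"]] = float(row["rtt_ms"])
    return rtt

def greedy_assignment(rtt):
    """Pick, for every player, the server with the smallest RTT."""
    return {player: min(servers, key=servers.get) for player, servers in rtt.items()}

if __name__ == "__main__":
    measurements = load_rtt("swarmio_rtt.csv")  # hypothetical file name
    print(greedy_assignment(measurements))
```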
- Categories:

Pen testing is a method to evaluate the security of an application or network by safely exploiting any security vulnerabilities present in the system. These security flaws can be present in various areas such as system configuration settings, login methods, and even end-users' risky behaviors.
- Categories:

5G technologies have enabled new applications on a heterogeneous, distributed edge infrastructure that unifies hardware, networking, and software in support of digitalization. Based on the requirements of Industry 4.0, this infrastructure builds on the cloud and fog computing sharing model and must meet service level agreements in a convenient and optimized way, which requires an orchestration mechanism for dynamic resource allocation.
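To make the orchestration requirement concrete, the following is a minimal sketch of an SLA-aware placement heuristic, not the mechanism the dataset authors propose. Node names, capacities, and latency figures are illustrative assumptions.

```python
# Latency-aware placement sketch: keep a task on a fog node when its latency
# budget and capacity allow it, otherwise fall back to the cloud.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    latency_ms: float   # round-trip latency seen by the application
    free_cpu: float     # CPU cores still available

@dataclass
class Task:
    name: str
    cpu: float          # CPU cores required
    sla_latency_ms: float

def place(task, nodes):
    """Return the lowest-latency node that satisfies the SLA and has capacity."""
    candidates = [n for n in nodes
                  if n.latency_ms <= task.sla_latency_ms and n.free_cpu >= task.cpu]
    if not candidates:
        return None  # SLA cannot be met; caller may queue or reject the task
    best = min(candidates, key=lambda n: n.latency_ms)
    best.free_cpu -= task.cpu
    return best.name

nodes = [Node("fog-edge-1", 5.0, 2.0), Node("cloud-dc-1", 40.0, 64.0)]
print(place(Task("inspection", cpu=1.0, sla_latency_ms=10.0), nodes))   # -> fog-edge-1
print(place(Task("analytics", cpu=8.0, sla_latency_ms=100.0), nodes))   # -> cloud-dc-1
```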
- Categories:

This dataset is used to evaluate PerfSim's accuracy and speed against a real deployment in a Kubernetes cluster running sfc-stress workloads.
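As an illustration of how such a comparison might be scored, here is a minimal sketch that computes the mean absolute percentage error between measured and simulated latencies. The file names and column layout are assumptions, not PerfSim's actual output format.

```python
# Accuracy-comparison sketch: relative error between measured and simulated
# response times. File and column names are hypothetical.
import csv

def mean_absolute_percentage_error(measured, simulated):
    """MAPE between paired measured and simulated values."""
    pairs = list(zip(measured, simulated))
    return 100.0 * sum(abs(m - s) / m for m, s in pairs) / len(pairs)

def load_column(path, column):
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)]

measured = load_column("k8s_measurements.csv", "latency_ms")     # hypothetical files
simulated = load_column("perfsim_predictions.csv", "latency_ms")
print(f"MAPE: {mean_absolute_percentage_error(measured, simulated):.2f}%")
```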
- Categories:
The emerging 5G services offer numerous new opportunities for networked applications. In this study, we seek to answer two key questions: i) is the throughput of mmWave 5G predictable, and ii) can we build "good" machine learning models for 5G throughput prediction? To this end, we conduct a measurement study of commercial mmWave 5G services in a major U.S. city, focusing on the throughput as perceived by applications running on user equipment (UE).
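As a toy baseline for the second question, here is a minimal sketch of a history-based predictor that forecasts the next throughput sample from a moving average and reports its error. The data file is a hypothetical stand-in for the published traces, not part of this dataset's format.

```python
# Moving-average throughput prediction sketch with a simple error report.
import numpy as np

def moving_average_forecast(series, window=5):
    """Predict each sample as the mean of the previous `window` samples."""
    preds = [np.mean(series[i - window:i]) for i in range(window, len(series))]
    return np.array(preds), np.array(series[window:])

if __name__ == "__main__":
    # throughput.npy would hold per-second UE throughput in Mbps (hypothetical).
    series = np.load("throughput.npy")
    preds, actual = moving_average_forecast(series, window=5)
    mae = np.mean(np.abs(preds - actual))
    print(f"Mean absolute error: {mae:.1f} Mbps")
```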
- Categories:
This dataset is a sample of the dataset used for the paper "A network analysis on cloud gaming: Stadia, GeForce Now and PSNow" and contains samples of the gaming sessions.
To access further data, please contact Gianluca Perna at: gianluca.perna@polito.it
- Categories:
The dataset contains memory dump data that is generated continuously. For the experiment we carried out, we implemented a volatile data dump module that generated around 360 VM memory dump images of average size 800 MB each (288 GB in total). These data files are compressed with the gzip utility and further archived into a single 79.5 GB file of memory evidence.
Of this preserved memory dump dataset, 79 files totaling 17.3 GB were generated during the attack, meaning that 21.76% of the data (by size) is potential evidence.
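For working with the archive, the following is a minimal sketch, assuming hypothetical file names, that stream-decompresses one gzip'd dump and recomputes the evidence share quoted above (17.3 GB of 79.5 GB ≈ 21.76%).

```python
# Decompress a gzip'd memory image and recompute the attack-evidence fraction.
import gzip
import shutil

def decompress(src, dst):
    """Stream-decompress a gzip file to disk without loading it into memory."""
    with gzip.open(src, "rb") as fin, open(dst, "wb") as fout:
        shutil.copyfileobj(fin, fout)

attack_gb, total_gb = 17.3, 79.5
print(f"Potential evidence: {100 * attack_gb / total_gb:.2f}% of the stored data")

decompress("vm_dump_001.mem.gz", "vm_dump_001.mem")  # hypothetical file names
```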
- Categories: