Data from simulations in which different radiation search strategies are compared.
This data set contains 100,000 .pcd files, captured by LiDAR (a 3-D imaging sensor), of a vehicle orbiting an indoor field.
The indoor field was built as a 1/60 scale model of an intersection, where two vehicles kept moving along pre-fixed tracks independently of each other.
The size of the vehicles was 0.040 m × 0.035 m × 0.240 m.
We captured the indoor field with two LiDAR sensor units commercialized by Velodyne.
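Since the sequences are distributed as .pcd (Point Cloud Data) files, a small header parser is often enough to inventory them without loading full point clouds. The sketch below is a minimal, illustrative reader for the standard PCD header (field names such as `FIELDS`, `POINTS`, and `DATA` come from the PCD specification, not from this dataset's documentation):

```python
def read_pcd_header(path):
    """Parse the header of a .pcd (Point Cloud Data) file.

    Returns a dict of header fields such as FIELDS, WIDTH, HEIGHT, POINTS.
    Works for both ascii and binary payloads, since the header itself is text.
    """
    header = {}
    with open(path, "rb") as f:
        for raw in f:
            line = raw.decode("ascii", errors="replace").strip()
            if line.startswith("#"):      # comment lines, e.g. "# .PCD v0.7"
                continue
            key, _, value = line.partition(" ")
            header[key] = value
            if key == "DATA":             # DATA is the last header line per the PCD spec
                break
    return header
```

For example, `read_pcd_header("frame_000001.pcd")["POINTS"]` (a hypothetical filename) returns the point count as a string, which is handy for quickly filtering empty or truncated scans across a large archive.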
This dataset contains three simulation models: cantilever_transducer.mph, simply_supported_beam_transducer.mph, and optimized_simply_supported_beam_transducer.mph.
The cantilever_transducer.mph is a cantilever transducer model with a fixed center, which is used to compare its sensitivity-frequency response with that of the optimized simply supported beam transducer model.
The S3 dataset contains the behaviour (sensors, application-usage statistics, and voice) of 21 volunteers interacting with their smartphones for more than 60 days. The users are diverse: males and females in the age range from 18 to 70 were considered in the dataset generation. This wide age range is a key aspect, given the impact of age on smartphone usage. To generate the dataset, the volunteers installed a prototype of the smartphone application on their Android mobile phones.
The data set is compressed into a zip file. Unzip this file in the desired location; inside the main folder you will find the file Readme.md with the instructions and the details of the database.
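The unpacking step above can be scripted with the standard library alone; the sketch below extracts the archive and locates the Readme.md mentioned in the description (the archive filename is a placeholder):

```python
import zipfile
from pathlib import Path

def extract_dataset(archive_path, dest_dir):
    """Extract the dataset zip and return the path of its Readme.md, if present."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(archive_path) as zf:
        zf.extractall(dest)
    matches = sorted(dest.rglob("Readme.md"))
    return matches[0] if matches else None
```

Usage would be along the lines of `extract_dataset("s3_dataset.zip", "data/")`, after which the returned path points at the instructions file inside the main folder.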
A new generation of computer vision, namely event-based or neuromorphic vision, provides a new paradigm for capturing visual data and for the way such data is processed. Event-based vision is a state-of-the-art technology in robot vision. It is particularly promising for use in both mobile robots and drones for visual navigation tasks. Due to the highly novel type of visual sensor used in event-based vision, only a few datasets aimed at visual navigation tasks are publicly available.
The dataset includes the following sequences:
01_forest – Closed loop, Forest trail, No wind, Daytime
02_forest – Closed loop, Forest trail, No wind, Daytime
03_green_meadow – Closed loop, Meadow, grass up to 30 cm, No wind, Daytime
04_green_meadow – Closed loop, Meadow, grass up to 30 cm, Mild wind, Daytime
05_road_asphalt – Closed loop, Asphalt road, No wind, Nighttime
06_plantation – Closed loop, Shrubland, Mild wind, Daytime
07_plantation – Closed loop, Asphalt road, No wind, Nighttime
08_plantation_water – Random movement, Sprinklers (water drops on camera lens), No wind, Nighttime
09_cattle_farm – Closed loop, Cattle farm, Mild wind, Daytime
10_cattle_farm – Closed loop, Cattle farm, Mild wind, Daytime
11_cattle_farm_feed_table – Closed loop, Cattle farm feed table, Mild wind, Daytime
12_cattle_farm_feed_table – Closed loop, Cattle farm feed table, Mild wind, Daytime
13_ditch – Closed loop, Sandy surface, Edge of ditch or drainage channel, No wind, Daytime
14_ditch – Closed loop, Sandy surface, Shore or bank, Strong wind, Daytime
15_young_pines – Closed loop, Sandy surface, Pine coppice, No wind, Daytime
16_winter_cereal_field – Closed loop, Winter wheat sowing, Mild wind, Daytime
17_winter_cereal_field – Closed loop, Winter wheat sowing, Mild wind, Daytime
18_winter_rapeseed_field – Closed loop, Winter rapeseed, Mild wind, Daytime
19_winter_rapeseed_field – Closed loop, Winter rapeseed, Mild wind, Daytime
20_field_with_a_cow – Closed loop, Cows tethered in pasture, Mild wind, Daytime
21_field_with_a_cow – Closed loop, Cows tethered in pasture, Mild wind, Daytime
Each sequence contains the following separately downloadable files:
<..sequence_id..>_video.mp4 – provides an overview of the sequence data (for the DVS and RGB-D sensors).
<..sequence_id..>_data.tar.gz – entire data sequence in raw format (AEDAT2.0 – DVS; images – RGB-D; point clouds in .pcd files – LIDAR; and IMU .csv files with original sensor timestamps). Timestamp conversion formulas are available.
<..sequence_id..>_rawcalib_data.tar.gz – recorded fragments that can be used to perform the calibration independently (intrinsic, extrinsic, and time alignment).
<..sequence_id..>_rosbags.tar.gz – main sequence in ROS bag format. All sensor timestamps are aligned with the DVS with an accuracy of better than 1 ms.
The contents of each archive are described below.
Raw format data
<..sequence_id..>_data.tar.gz contains the following files and folders:
./meta-data/ - all the useful information about the sequence
./meta-data/meta-data.md - detailed information about the sequence, sensors, files, and data formats
./meta-data/cad_model.pdf - sensor placement
./meta-data/<...>_timeconvs.json - coefficients for the timestamp conversion formulas
./meta-data/ground-truth/ - movement ground-truth data, calculated using 3 different Lidar-SLAM algorithms (Cartographer, HDL-Graph, LeGo-LOAM)
./meta-data/calib-params/ - intrinsic and extrinsic calibration parameters
./recording/ - main sequence
./recording/dvs/ - DVS events and IMU data
./recording/lidar/ - Lidar point clouds and IMU data
./recording/realsense/ - Realsense camera RGB frames, depth frames, and IMU data
./recording/sensorboard/ - environmental sensor data (temperature, humidity, air pressure)
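The timestamp conversion coefficients in <...>_timeconvs.json can be applied per sensor stream to map original timestamps onto the aligned timeline. The exact JSON schema is documented in meta-data.md, not here; the sketch below assumes, purely for illustration, a linear model with keys "a" and "b" (ts_aligned = a·ts + b):

```python
def convert_timestamps(ts_list, coeffs):
    """Apply a linear timestamp conversion ts_aligned = a * ts + b.

    `coeffs` is assumed to be a dict with keys "a" and "b", e.g. loaded
    from a *_timeconvs.json file; the actual schema is described in
    meta-data.md and may differ.
    """
    a, b = coeffs["a"], coeffs["b"]
    return [a * t + b for t in ts_list]
```

With `coeffs = {"a": 2.0, "b": 1.0}`, `convert_timestamps([0, 1, 2], coeffs)` yields `[1.0, 3.0, 5.0]`; real coefficients would come from the per-sensor JSON files.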
The <..sequence_id..>_rawcalib_data.tar.gz archive contains the following files and folders:
./imu_alignments/ - IMU recordings of the platform being lifted before and after the main sequence (can be used for custom timestamp alignment)
./solenoids/ - IMU recordings of the solenoid vibrations before and after the main sequence (can be used for custom timestamp alignment)
./lidar_rs/ - Lidar vs. Realsense camera extrinsic calibration, performed by showing both sensors a spherical object (ball)
./dvs_rs/ - DVS and Realsense camera intrinsic and extrinsic calibration frames (checkerboard pattern)
ROS Bag format data
There are six rosbag files for each scene; their contents are as follows:
/dvs/imu, with the corresponding message types:
/lidar/pointcloud, with the corresponding message types:
/tf, with the corresponding message types:
/sensorboard/temperature, with the corresponding message types:
/lego_loam, with the corresponding message types:
/tf, with the corresponding message types:
- Realsense data now also contains 16-bit depth PNG images, located in the folder /recording/realsense/depth_native/
- Added data in ROS bag format
In order to obtain the constants of our PID temperature controller, it was necessary to identify the system. System identification allows us, through experimentation, to find a representation of the plant so that it can be controlled.
The first data file, named "data_2.mat", represents the open-loop test, where the sampling frequency is 100 Hz. These data were useful for finding the period of the pulse-train generator, which is twice the slowest sampling time analyzed between the high pulse and the low pulse of the input.
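Once the plant has been identified and the controller constants obtained, the controller itself reduces to a standard discrete PID loop running at the sampling rate. The sketch below is a generic parallel-form discrete PID, not the authors' implementation; the gains and the 0.01 s period (the 100 Hz rate mentioned above) are placeholders:

```python
class PID:
    """Discrete PID controller (parallel form) with a fixed sample period ts.

    Gains kp, ki, kd are illustrative placeholders; in practice they come
    from the system identification described above.
    """
    def __init__(self, kp, ki, kd, ts):
        self.kp, self.ki, self.kd, self.ts = kp, ki, kd, ts
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        """Compute one control output from the current setpoint and measurement."""
        error = setpoint - measurement
        self.integral += error * self.ts            # rectangular integration
        derivative = (error - self.prev_error) / self.ts
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Calling `PID(2.0, 1.0, 0.0, 0.01).update(1.0, 0.0)` returns the first control action for a unit setpoint step; an anti-windup clamp on the integral term is a common refinement for temperature plants with slow dynamics.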
The LEDNet dataset consists of images of a field area captured with a mobile phone camera.
Images in the dataset show an area where a PCB with 6 LEDs is placed. Each state of the LEDs on the PCB represents a binary digit, with the ON state corresponding to binary 1 and the OFF state corresponding to binary 0. The LEDs read in sequence represent a binary encoding of an analog value.
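The ON/OFF-to-value mapping described above can be expressed as a short decoder. The bit order (first LED as most significant bit) is an assumption here, since the description does not state which end of the sequence is most significant:

```python
def decode_leds(led_states):
    """Decode a sequence of LED states (True = ON = 1, False = OFF = 0)
    into the integer value of the binary encoding.

    The first LED in the sequence is treated as the most significant bit;
    the actual bit order used by LEDNet is an assumption here.
    """
    value = 0
    for state in led_states:
        value = (value << 1) | (1 if state else 0)
    return value
```

For the 6-LED board described above, `decode_leds([True, False, True, False, True, True])` decodes the pattern 101011 to 43, and all six LEDs ON yields the maximum value 63.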
Design and fabrication outsourcing has made integrated circuits vulnerable to malicious modifications by third parties, known as hardware Trojans (HTs). Over the last decade, the use of side-channel measurements for detecting malicious manipulation of the chip has been extensively studied. However, the suggested approaches mostly suffer from two major limitations: reliance on a trusted identical chip (i.e., a golden chip), and the untraceable footprints of subtle hardware Trojans that remain inactive during the testing phase.
See the attached document.
These .MAT files contain MATLAB Tables of raw and preprocessed data. Information detailing the bed system used to collect these signals and the steps used to create the preprocessed data is contained in a publication in Sensors: Carlson, C.; Turpin, V.-R.; Suliman, A.; Ade, C.; Warren, S.; Thompson, D.E. Bed-Based Ballistocardiography: Dataset and Ability to Track Cardiovascular Parameters. Sensors 2021, 21, 156. https://doi.org/10.3390/s21010156.
The reBAP signal is scaled at 100 mmHg/volt. The interbeat interval (IBI), stroke volume (SV), and dP/dt_max are scaled at 1000 ms/volt, 100 mL/volt, and 1 mmHg/s/volt, respectively.
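The voltage-to-unit scalings quoted above amount to a simple multiplication per signal. The sketch below collects them in a lookup table; the signal-name keys are illustrative and do not necessarily match the column names used in the .MAT tables:

```python
# Scaling factors quoted in the dataset description, in physical units per volt.
# Key names are illustrative, not the actual MATLAB Table column names.
SCALE_PER_VOLT = {
    "reBAP": 100.0,     # mmHg per volt
    "IBI": 1000.0,      # ms per volt
    "SV": 100.0,        # mL per volt
    "dPdt_max": 1.0,    # mmHg/s per volt
}

def volts_to_units(signal_name, volts):
    """Convert a recorded voltage sample to physiological units."""
    return SCALE_PER_VOLT[signal_name] * volts
```

For example, a reBAP sample of 1.2 V corresponds to 120 mmHg, and an IBI sample of 0.8 V corresponds to 800 ms.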