Data from simulations in which different radiation search strategies are compared.


Description

This data set contains 100,000 .pcd files, captured by LiDAR (a 3-D imaging sensor), of a vehicle circling an indoor field.

Data Acquisition

The indoor field was built as a 1/60-scale model of an intersection, in which two vehicles kept moving along predefined tracks independently of each other.

The size of the vehicles was 0.040 m × 0.035 m × 0.240 m.

We captured the indoor field with two LiDAR sensor units commercialized by Velodyne.
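As a minimal sketch of how one of the point clouds might be inspected in Python (assuming the open3d package; the file name below is a hypothetical placeholder for an actual .pcd file from the data set):

```python
# Minimal sketch: load and inspect one LiDAR point cloud from the data set.
# Assumes the open3d package; "000001.pcd" is a hypothetical file name.
import numpy as np
import open3d as o3d

pcd = o3d.io.read_point_cloud("000001.pcd")   # parse the PCD file
points = np.asarray(pcd.points)               # N x 3 array of XYZ coordinates

print(f"{points.shape[0]} points")
print("bounding box min/max:", points.min(axis=0), points.max(axis=0))
```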


This dataset contains three simulation models: cantilever_transducer.mph, simply_supported_beam_transducer.mph, and optimized_simply_supported_beam_transducer.mph.

The cantilever_transducer.mph model is a cantilever transducer with a fixed center, used to compare its sensitivity–frequency response with that of the optimized simply supported beam transducer model.


The S3 dataset contains the behaviour (sensors, statistics of applications, and voice) of 21 volunteers interacting with their smartphones over more than 60 days. The users are diverse: males and females between 18 and 70 years of age were considered in the dataset generation. This wide age range is a key aspect, given the impact of age on smartphone usage. To generate the dataset, the volunteers installed a prototype of the smartphone application on their Android mobile phones.

 

Instructions: 

The data set is compressed into a zip file. Unzip it in the desired location; inside the main folder you will find the file Readme.md with the instructions and the details of the database.
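A minimal sketch of this unpacking step in Python (the archive and folder names below are hypothetical placeholders for the actual downloaded file):

```python
# Minimal sketch: extract the dataset archive and locate the Readme.md.
# "S3_dataset.zip" and the extraction folder are hypothetical placeholders.
import zipfile
from pathlib import Path

archive = Path("S3_dataset.zip")
target = Path("S3_dataset")

with zipfile.ZipFile(archive) as zf:
    zf.extractall(target)

# The main folder is expected to contain Readme.md with the database details.
for readme in target.rglob("Readme.md"):
    print(readme)
```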


A new generation of computer vision, namely event-based or neuromorphic vision, provides a new paradigm both for capturing visual data and for the way such data are processed. Event-based vision is a state-of-the-art technology for robot vision, and it is particularly promising for visual navigation tasks in both mobile robots and drones. Because of the highly novel type of visual sensor used in event-based vision, only a few datasets aimed at visual navigation tasks are publicly available.

Instructions: 

The dataset includes the following sequences:

  • 01_forest – Closed loop, Forest trail, No wind, Daytime
  • 02_forest – Closed loop, Forest trail, No wind, Daytime
  • 03_green_meadow – Closed loop, Meadow, grass up to 30 cm, No wind, Daytime
  • 04_green_meadow – Closed loop, Meadow, grass up to 30 cm, Mild wind, Daytime
  • 05_road_asphalt – Closed loop, Asphalt road, No wind, Nighttime
  • 06_plantation – Closed loop, Shrubland, Mild wind, Daytime
  • 07_plantation – Closed loop, Asphalt road, No wind, Nighttime
  • 08_plantation_water – Random movement, Sprinklers (water drops on camera lens), No wind, Nighttime
  • 09_cattle_farm – Closed loop, Cattle farm, Mild wind, Daytime
  • 10_cattle_farm – Closed loop, Cattle farm, Mild wind, Daytime
  • 11_cattle_farm_feed_table – Closed loop, Cattle farm feed table, Mild wind, Daytime
  • 12_cattle_farm_feed_table – Closed loop, Cattle farm feed table, Mild wind, Daytime
  • 13_ditch – Closed loop, Sandy surface, Edge of ditch or drainage channel, No wind, Daytime
  • 14_ditch – Closed loop, Sandy surface, Shore or bank, Strong wind, Daytime
  • 15_young_pines – Closed loop, Sandy surface, Pine coppice, No wind, Daytime
  • 16_winter_cereal_field – Closed loop, Winter wheat sowing, Mild wind, Daytime
  • 17_winter_cereal_field – Closed loop, Winter wheat sowing, Mild wind, Daytime
  • 18_winter_rapeseed_field – Closed loop, Winter rapeseed, Mild wind, Daytime
  • 19_winter_rapeseed_field – Closed loop, Winter rapeseed, Mild wind, Daytime
  • 20_field_with_a_cow – Closed loop, Cows tethered in pasture, Mild wind, Daytime
  • 21_field_with_a_cow – Closed loop, Cows tethered in pasture, Mild wind, Daytime

Each sequence contains the following separately downloadable files:

  • <..sequence_id..>_video.mp4 – provides an overview of the sequence data (for the DVS and RGB-D sensors).
  • <..sequence_id..>_data.tar.gz – entire data sequence in raw format (AEDAT2.0 for DVS, images for RGB-D, point clouds in .pcd files for LIDAR, and IMU CSV files with original sensor timestamps). Timestamp conversion formulas are available.
  • <..sequence_id..>_rawcalib_data.tar.gz – recorded fragments that can be used to perform the calibration independently (intrinsic, extrinsic and time alignment).
  • <..sequence_id..>_rosbags.tar.gz – main sequence in ROS bag format. All sensors timestamps are aligned with DVS with an accuracy of less than 1 ms.

The contents of each archive are described below.

Raw format data

The archive <..sequence_id..>_data.tar.gz contains the following files and folders:

  • ./meta-data/ - all the useful information about the sequence
  • ./meta-data/meta-data.md - detailed information about the sequence, sensors, files, and data formats
  • ./meta-data/cad_model.pdf - sensor placement
  • ./meta-data/<...>_timeconvs.json - coefficients for timestamp conversion formulas
  • ./meta-data/ground-truth/ - movement ground-truth data, calculated using 3 different Lidar-SLAM algorithms (Cartographer, HDL-Graph, LeGo-LOAM)
  • ./meta-data/calib-params/ - intrinsic and extrinsic calibration parameters
  • ./recording/ - main sequence
  • ./recording/dvs/ - DVS events and IMU data
  • ./recording/lidar/ - Lidar point clouds and IMU data
  • ./recording/realsense/ - Realsense camera RGB, Depth frames, and IMU data
  • ./recording/sensorboard/ - environmental sensor data (temperature, humidity, air pressure)

Calibration data

The <..sequence_id..>_rawcalib_data.tar.gz archive contains the following files and folders:

  • ./imu_alignments/ - IMU recordings of the platform lifting before and after the main sequence (can be used for custom timestamp alignment)
  • ./solenoids/ - IMU recordings of the solenoid vibrations before and after the main sequence (can be used for custom timestamp alignment)
  • ./lidar_rs/ - Lidar vs Realsense camera extrinsic calibration by showing both sensors a spherical object (ball)
  • ./dvs_rs/ - DVS and Realsense camera intrinsic and extrinsic calibration frames (checkerboard pattern)

ROS Bag format data

There are six rosbag files for each scene; their contents are as follows (a short reading sketch follows this list):

  • <..sequence_id..>_dvs.bag (topics: /dvs/camera_info, /dvs/events, /dvs/imu, and accordingly message types: sensor_msgs/CameraInfo, dvs_msgs/EventArray, sensor_msgs/Imu).
  • <..sequence_id..>_lidar.bag (topics: /lidar/imu/acc, /lidar/imu/gyro, /lidar/pointcloud, and accordingly message types: sensor_msgs/Imu, sensor_msgs/Imu, sensor_msgs/PointCloud2).
  • <..sequence_id..>_realsense.bag (topics: /realsense/camera_info, /realsense/depth, /realsense/imu/acc, /realsense/imu/gyro, /realsense/rgb, /tf, and accordingly message types: sensor_msgs/CameraInfo, sensor_msgs/Image, sensor_msgs/Imu, sensor_msgs/Imu, sensor_msgs/Image, tf2_msgs/TFMessage).
  • <..sequence_id..>_sensorboard.bag (topics: /sensorboard/air_pressure, /sensorboard/relative_humidity, /sensorboard/temperature, and accordingly message types: sensor_msgs/FluidPressure, sensor_msgs/RelativeHumidity, sensor_msgs/Temperature).
  • <..sequence_id..>_trajectories.bag (topics: /cartographer, /hdl, /lego_loam, and accordingly message types: geometry_msgs/PoseStamped, geometry_msgs/PoseStamped, geometry_msgs/PoseStamped).
  • <..sequence_id..>_data_for_realsense_lidar_calibration.bag (topics: /lidar/pointcloud, /realsense/camera_info, /realsense/depth, /realsense/rgb, /tf, and accordingly message types: sensor_msgs/PointCloud2, sensor_msgs/CameraInfo, sensor_msgs/Image, sensor_msgs/Image, tf2_msgs/TFMessage).
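As a hedged sketch of how the DVS event stream might be read from the per-sequence bag with the ROS1 Python API (assuming a ROS1 environment with the rosbag and dvs_msgs packages available; the bag file name stands in for an actual <..sequence_id..>):

```python
# Minimal sketch: iterate over DVS events in <..sequence_id..>_dvs.bag.
# Assumes a ROS1 environment with the rosbag and dvs_msgs packages installed;
# "01_forest_dvs.bag" is used only as an example file name.
import rosbag

with rosbag.Bag("01_forest_dvs.bag") as bag:
    for topic, msg, t in bag.read_messages(topics=["/dvs/events"]):
        # msg is a dvs_msgs/EventArray; each event carries x, y, timestamp, polarity
        for e in msg.events[:5]:
            print(e.x, e.y, e.ts.to_sec(), e.polarity)
        break  # only peek at the first message in this sketch
```

The other bags can be read in the same way by changing the file name and the topic list to the topics given above.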

Version history

22.06.2021.

  • Realsense data now also contain 16-bit PNG depth images, located in the folder /recording/realsense/depth_native/ (a reading sketch follows below)
  • Added data in rosbag format
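
A minimal sketch for reading one of these 16-bit depth images in Python (assuming OpenCV; the file name is a hypothetical placeholder, and the conversion from raw depth values to metres is documented in meta-data.md rather than assumed here):

```python
# Minimal sketch: load a 16-bit depth PNG from /recording/realsense/depth_native/.
# Assumes OpenCV; "000123.png" is a hypothetical file name, and the raw-value-to-
# metres scale is documented in meta-data.md, not assumed here.
import cv2

depth = cv2.imread("recording/realsense/depth_native/000123.png",
                   cv2.IMREAD_UNCHANGED)
print(depth.dtype, depth.shape)            # expected: uint16, (height, width)
print("raw depth range:", depth.min(), depth.max())
```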

 


In order to obtain the constants of our PID temperature controller, it was necessary to identify the system. System identification allows us, through experimentation, to find a representation of the plant so that it can be controlled.

The first file, "data_2.mat", contains the open-loop test, sampled at 100 Hz. These data were useful for finding the period of the pulse train generator, which is twice the slowest sampling time analyzed between the high pulse and the low pulse of the input.
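A minimal sketch for inspecting the open-loop test data in Python (assuming SciPy can read the MAT file; the variable names stored in data_2.mat are not assumed and are simply listed):

```python
# Minimal sketch: open the open-loop test file and list its variables.
# Assumes the MAT file is readable by SciPy; the variable names inside
# data_2.mat are not assumed here, only enumerated.
from scipy.io import loadmat

data = loadmat("data_2.mat")
fs = 100.0  # sampling frequency of the open-loop test, in Hz

for name, value in data.items():
    if not name.startswith("__"):
        print(name, getattr(value, "shape", None))
```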


The LEDNet dataset consists of images of a field area captured with a mobile phone camera.

Images in the dataset show an area in which a PCB board with 6 LEDs is placed. Each LED state on the PCB represents a binary digit, with the ON state corresponding to binary 1 and the OFF state corresponding to binary 0; all the LEDs in sequence therefore represent a binary encoding of an analog value.
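As an illustration of this encoding, a minimal sketch that converts a 6-LED ON/OFF pattern into its integer value (the bit ordering, most significant bit first, is an assumption; the dataset documentation defines the actual ordering):

```python
# Minimal sketch: decode a 6-LED ON/OFF pattern into an integer.
# Assumes the first LED is the most significant bit; the actual bit ordering
# is defined by the dataset/PCB documentation, not by this sketch.
def decode_leds(states):
    """states: sequence of 6 booleans/ints, where True/1 means the LED is ON."""
    value = 0
    for s in states:
        value = (value << 1) | int(bool(s))
    return value

print(decode_leds([1, 0, 1, 1, 0, 0]))  # 0b101100 -> 44
```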


Design and fabrication outsourcing has made integrated circuits vulnerable to malicious modifications by third parties, known as hardware Trojans (HTs). Over the last decade, the use of side-channel measurements for detecting malicious manipulation of a chip has been studied extensively. However, the suggested approaches mostly suffer from two major limitations: reliance on a trusted identical chip (i.e., a golden chip), and the untraceable footprints of subtle hardware Trojans that remain inactive during the testing phase.

Instructions: 

See the attached document.


The dataset consists of the ISFET sensor data utilized to train ML models for drift compensation.

Instructions: 

This dataset consists of the sensor data used to develop a SPICE macro model of the ISFET, together with the associated documentation.


 

Instructions: 

These .MAT files contain MATLAB Tables of raw and preprocessed data. Information detailing the bed system used to collect these signals and the steps used to create the preprocessed data is given in a publication in Sensors: Carlson, C.; Turpin, V.-R.; Suliman, A.; Ade, C.; Warren, S.; Thompson, D.E. Bed-Based Ballistocardiography: Dataset and Ability to Track Cardiovascular Parameters. Sensors 2021, 21, 156. https://doi.org/10.3390/s21010156.

The reBAP signal is scaled at 100 mmHg/volt. The interbeat interval (IBI), stroke volume (SV), and dP/dt_max are scaled at 1000 ms/volt, 100 mL/volt, and 1 mmHg/s/volt, respectively.
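A minimal sketch of this voltage-to-physical-unit conversion (assuming the signals have already been loaded as NumPy arrays; loading the MATLAB Tables themselves is left to MATLAB or another suitable reader):

```python
# Minimal sketch: convert raw voltages to physical units using the scalings above.
# Assumes the signals are already available as NumPy arrays; the example
# sample values are arbitrary.
import numpy as np

SCALES = {
    "reBAP": 100.0,     # mmHg per volt
    "IBI": 1000.0,      # ms per volt
    "SV": 100.0,        # mL per volt
    "dPdt_max": 1.0,    # mmHg/s per volt
}

rebap_volts = np.array([0.8, 0.9, 1.1])      # arbitrary example samples, in volts
rebap_mmhg = rebap_volts * SCALES["reBAP"]   # -> mmHg
print(rebap_mmhg)
```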

