Data from simulations in which different radiation search strategies are compared.



This data set contains 100,000 .pcd files captured by LiDAR, a 3-D image sensor, of a vehicle circling an indoor field.

Data Acquisition

The indoor field was built as a 1/60 scale model of an intersection, where two vehicles kept moving along pre-fixed tracks independently of each other.

The size of each vehicle was 0.040 m × 0.035 m × 0.240 m.

We captured the indoor field with two LiDAR sensor units commercialized by Velodyne.


This dataset contains three simulation models: cantilever_transducer.mph, simply_supported_beam_transducer.mph and optimized_simply_supported_beam_transducer.mph.

The cantilever_transducer.mph is a cantilever transducer model with a fixed center, which is used to compare its sensitivity-frequency response with that of the optimized simply supported beam transducer model.


A wide range of wearable sensors exist on the market for continuous physiological health monitoring. The type and scope of health data that can be gathered is a function of the sensor modality. Blumio presents a dataset of synchronized data from a reference blood pressure device along with several wearable sensor types: PPG, applanation tonometry, and the Blumio millimeter-wave radar. Data collection was conducted under a set protocol with subjects seated at rest. The study included 115 subjects (age range 20-67 years), resulting in over 19 hours of acquired data.



Participant Recruitment

Potential participants were informed of the study protocol prior to being enrolled. To be included in the study, subjects had to be over the age of 18 and under the age of 90. Informed consent was obtained from all participants. Personal data such as age, gender, height, and weight were collected prior to data collection, and this information, along with the collected sensor readings, was deidentified and stored in conformance with HIPAA.

Data Collection System

Blumio has conducted previous studies measuring arterial pulsations at the radial artery with millimeter-wave FMCW radar [1]. For this study, the developmental stage BGT60TR24B FMCW system (Infineon Technologies AG, Munich, Germany) was worn over the left wrist.

The data collection system also included the CNAP Monitor 500 (CNSystems Medizintechnik GmbH, Graz, Austria) worn on the left arm, a SPT-301 applanation tonometer (Millar Inc, Houston, USA) worn on the right wrist, and a SS4LA PPG transducer (BIOPAC Systems Inc, Goleta, USA) worn on the right hand’s middle digit.

Data Collection Procedures

Study protocol was approved by Western IRB prior to participant recruitment (Western IRB #20193057). All measurements were collected at the Blumio Office in San Mateo, CA. Measurements were performed according to a fixed protocol. Participants were seated at an appropriate height with both arms resting comfortably on a table in front of them. They were asked to rest quietly for 5 minutes in that position. Then, signals from the sensors were recorded simultaneously for a period of 10 minutes. During the signal acquisition period, the participant was asked to maintain a normal breathing frequency and to not speak or move.

Signal Processing

Following collection, the signals were first time-synchronized and then processed according to the steps described below.

The raw IF radar data output was processed utilizing two approaches. First, a standard phase transformation was used. This consisted of performing a Fast Fourier Transform (FFT) on the IF signal and extracting the phase from the appropriate range bin as described in our previous work. Secondly, a proprietary transformation created by Blumio was utilized. The algorithms employ a set of pre-processing and noise-reduction procedures, during which the radar signal is transformed into a univariate pulse waveform.
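The standard phase transformation described above can be sketched as follows. This is an illustrative reconstruction, not Blumio's implementation: the chirp length, range-bin index, and synthetic motion signal are all assumptions made for the demo.

```python
import numpy as np

# Hypothetical parameters -- the actual radar settings are not given in the text.
n_samples = 256          # IF samples per chirp (assumed)
target_bin = 12          # range bin containing the wrist reflection (assumed)

def extract_phase(if_chirps):
    """Standard phase transformation: range-FFT each chirp, then take the
    phase of one range bin across chirps to form a pulse waveform."""
    spectrum = np.fft.fft(if_chirps, axis=1)      # FFT over fast-time samples
    phases = np.angle(spectrum[:, target_bin])    # phase of the chosen bin
    return np.unwrap(phases)                      # remove 2*pi discontinuities

# Synthetic demo: a tone in the target bin, slowly phase-modulated to mimic
# arterial pulsation.
t = np.arange(n_samples)
chirp_phases = 0.5 * np.sin(2 * np.pi * np.arange(100) / 50)
ifs = np.array([np.cos(2 * np.pi * target_bin * t / n_samples + p)
                for p in chirp_phases])
pulse = extract_phase(ifs)   # recovers chirp_phases up to numerical error
```

With a noiseless single-bin tone, the extracted phase series matches the injected modulation; real IF data would first need windowing and clutter removal.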

The auxiliary signals and the reference blood pressure data were extracted from the MP36R unit using the companion AcqKnowledge software (BIOPAC Systems Inc, Goleta, USA).

Dataset Description and Usage Notes

The entire dataset and associated participant health information are freely available for download as a ZIP file. All the sensor data is stored in CSV format. Each CSV file is named after the participant’s assigned identifier. The first column of the CSV contains the timestamp in seconds. For the sake of data analysis, all sensor channels have been time-aligned in the included files. The second column is the reference blood pressure in mmHg from the CNIBP monitor. The third column is the data from the PPG sensor in mV. The fourth column is the data from the applanation tonometer, also in mV. The fifth column is the output of Blumio’s proprietary radar transform algorithm in arbitrary units. The sixth column is the output of the phase radar transformation algorithm in radians. Note that the files vary in length. Certain files have a truncated start due to the CNAP Monitor 500’s initialization period.
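A minimal sketch of parsing one participant file according to the column layout described above. The sample rows are invented for illustration, and we assume plain numeric rows (the text does not say whether the files carry a header line).

```python
import csv
import io

# Two fabricated rows in the six-column layout described in the text.
sample = io.StringIO(
    "0.000,78.2,1.21,0.95,0.10,0.01\n"
    "0.002,78.4,1.25,0.97,0.12,0.02\n"
)
rows = [list(map(float, r)) for r in csv.reader(sample)]

timestamps = [r[0] for r in rows]   # seconds
bp_mmhg    = [r[1] for r in rows]   # CNAP reference blood pressure (mmHg)
ppg_mv     = [r[2] for r in rows]   # PPG sensor (mV)
tono_mv    = [r[3] for r in rows]   # applanation tonometer (mV)
radar_au   = [r[4] for r in rows]   # Blumio proprietary transform (a.u.)
phase_rad  = [r[5] for r in rows]   # phase transformation (radians)
```

For a real file, replace the `io.StringIO` buffer with `open("<participant_id>.csv")`.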

The included participant health information is available in an XLSX summary sheet, tabulated by participant study identifier.


The authors would like to thank the Silicon Valley Innovation Center (SVIC) and the Power & Sensor Systems (PSS) teams at Infineon Technologies AG for providing engineering support during our R&D process.


This work was supported by the Centers for Disease Control and Prevention under grant number 9679554 and by Infineon Technologies AG.


[1] J. Johnson, C. Kim, and O. Shay, "Arterial Pulse Measurement with Wearable Millimeter Wave Device," in IEEE 16th International Conference on Wearable and Implantable Body Sensor Networks (BSN), 2019, pp. 1-4.


The S3 dataset contains the behaviour (sensors, application statistics, and voice) of 21 volunteers interacting with their smartphones for more than 60 days. The user population is diverse: males and females aged 18 to 70 were considered in the dataset generation. This wide age range is a key aspect, given the impact of age on smartphone usage. To generate the dataset, the volunteers installed a prototype of the smartphone application on their Android mobile phones.



The data set is compressed into a ZIP file. Unzip it in the desired location; inside the main folder you will find a file with the instructions and the details of the database.


A new generation of computer vision, namely event-based or neuromorphic vision, provides a new paradigm for capturing visual data and for the way such data is processed. Event-based vision is a state-of-the-art technology in robot vision, particularly promising for visual navigation tasks in both mobile robots and drones. Because event-based vision relies on a highly novel type of visual sensor, only a few datasets aimed at visual navigation tasks are publicly available.


The dataset includes the following sequences:

  • 01_forest – Closed loop, Forest trail, No wind, Daytime
  • 02_forest – Closed loop, Forest trail, No wind, Daytime
  • 03_green_meadow – Closed loop, Meadow, grass up to 30 cm, No wind, Daytime
  • 04_green_meadow – Closed loop, Meadow, grass up to 30 cm, Mild wind, Daytime
  • 05_road_asphalt – Closed loop, Asphalt road, No wind, Nighttime
  • 06_plantation – Closed loop, Shrubland, Mild wind, Daytime
  • 07_plantation – Closed loop, Asphalt road, No wind, Nighttime
  • 08_plantation_water – Random movement, Sprinklers (water drops on camera lens), No wind, Nighttime
  • 09_cattle_farm – Closed loop, Cattle farm, Mild wind, Daytime
  • 10_cattle_farm – Closed loop, Cattle farm, Mild wind, Daytime
  • 11_cattle_farm_feed_table – Closed loop, Cattle farm feed table, Mild wind, Daytime
  • 12_cattle_farm_feed_table – Closed loop, Cattle farm feed table, Mild wind, Daytime
  • 13_ditch – Closed loop, Sandy surface, Edge of ditch or drainage channel, No wind, Daytime
  • 14_ditch – Closed loop, Sandy surface, Shore or bank, Strong wind, Daytime
  • 15_young_pines – Closed loop, Sandy surface, Pine coppice, No wind, Daytime
  • 16_winter_cereal_field – Closed loop, Winter wheat sowing, Mild wind, Daytime
  • 17_winter_cereal_field – Closed loop, Winter wheat sowing, Mild wind, Daytime
  • 18_winter_rapeseed_field – Closed loop, Winter rapeseed, Mild wind, Daytime
  • 19_winter_rapeseed_field – Closed loop, Winter rapeseed, Mild wind, Daytime
  • 20_field_with_a_cow – Closed loop, Cows tethered in pasture, Mild wind, Daytime
  • 21_field_with_a_cow – Closed loop, Cows tethered in pasture, Mild wind, Daytime

Each sequence contains the following separately downloadable files:

  • <..sequence_id..>_video.mp4 – provides an overview of the sequence data (for the DVS and RGB-D sensors).
  • <..sequence_id..>_data.tar.gz – entire data sequence in raw data format (AEDAT2.0 - DVS, images - RGB-D, point clouds in pcd files - LIDAR, and IMU csv files with original sensor timestamps). Timestamp conversion formulas are available.
  • <..sequence_id..>_rawcalib_data.tar.gz – recorded fragments that can be used to perform the calibration independently (intrinsic, extrinsic and time alignment).
  • <..sequence_id..>_rosbags.tar.gz – main sequence in ROS bag format. All sensor timestamps are aligned with the DVS with an accuracy of less than 1 ms.

The contents of each archive are described below.

Raw format data

The archive <..sequence_id..>_data.tar.gz contains the following files and folders:

  • ./meta-data/ - all the useful information about the sequence: detailed information about the sequence, sensors, files, and data formats
  • ./meta-data/cad_model.pdf - sensors placement
  • ./meta-data/<...>_timeconvs.json - coefficients for timestamp conversion formulas
  • ./meta-data/ground-truth/ - movement ground-truth data, calculated using 3 different Lidar-SLAM algorithms (Cartographer, HDL-Graph, LeGo-LOAM)
  • ./meta-data/calib-params/ - intrinsic and extrinsic calibration parameters
  • ./recording/ - main sequence
  • ./recording/dvs/ - DVS events and IMU data
  • ./recording/lidar/ - Lidar point clouds and IMU data
  • ./recording/realsense/ - Realsense camera RGB, Depth frames, and IMU data
  • ./recording/sensorboard/ - environmental sensors data (temperature, humidity, air pressure)
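The timestamp conversion coefficients in <...>_timeconvs.json could be applied as sketched below. The JSON schema is not documented here, so the field names and the linear form t_dvs = a * t_sensor + b are assumptions for illustration only.

```python
import io
import json

# Fabricated example content; the real schema of <...>_timeconvs.json
# may differ (field names "a" and "b" are assumptions).
example = io.StringIO('{"lidar": {"a": 1.000002, "b": -3.5}}')
coeffs = json.load(example)

def to_dvs_time(sensor, t):
    """Convert a raw sensor timestamp to the DVS time base,
    assuming a per-sensor linear model t_dvs = a * t + b."""
    c = coeffs[sensor]
    return c["a"] * t + c["b"]
```

For the released files, the buffer would be replaced with `open(".../meta-data/<...>_timeconvs.json")` and the actual key names checked against the meta-data description.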

Calibration data

The <..sequence_id..>_rawcalib_data.tar.gz archive contains the following files and folders:

  • ./imu_alignments/ - IMU recordings of the platform lifting before and after the main sequence (can be used for custom timestamp alignment)
  • ./solenoids/ - IMU recordings of the solenoid vibrations before and after the main sequence (can be used for custom timestamp alignment)
  • ./lidar_rs/ - Lidar vs Realsense camera extrinsic calibration by showing both sensors a spherical object (ball)
  • ./dvs_rs/ - DVS and Realsense camera intrinsic and extrinsic calibration frames (checkerboard pattern)

ROS Bag format data

There are six rosbag files for each scene; their contents are as follows:

  • <..sequence_id..>_dvs.bag (topics: /dvs/camera_info, /dvs/events, /dvs/imu, with corresponding message types sensor_msgs/CameraInfo, dvs_msgs/EventArray, sensor_msgs/Imu).
  • <..sequence_id..>_lidar.bag (topics: /lidar/imu/acc, /lidar/imu/gyro, /lidar/pointcloud, with corresponding message types sensor_msgs/Imu, sensor_msgs/Imu, sensor_msgs/PointCloud2).
  • <..sequence_id..>_realsense.bag (topics: /realsense/camera_info, /realsense/depth, /realsense/imu/acc, /realsense/imu/gyro, /realsense/rgb, /tf, with corresponding message types sensor_msgs/CameraInfo, sensor_msgs/Image, sensor_msgs/Imu, sensor_msgs/Imu, sensor_msgs/Image, tf2_msgs/TFMessage).
  • <..sequence_id..>_sensorboard.bag (topics: /sensorboard/air_pressure, /sensorboard/relative_humidity, /sensorboard/temperature, with corresponding message types sensor_msgs/FluidPressure, sensor_msgs/RelativeHumidity, sensor_msgs/Temperature).
  • <..sequence_id..>_trajectories.bag (topics: /cartographer, /hdl, /lego_loam, with corresponding message types geometry_msgs/PoseStamped, geometry_msgs/PoseStamped, geometry_msgs/PoseStamped).
  • <..sequence_id..>_data_for_realsense_lidar_calibration.bag (topics: /lidar/pointcloud, /realsense/camera_info, /realsense/depth, /realsense/rgb, /tf, with corresponding message types sensor_msgs/PointCloud2, sensor_msgs/CameraInfo, sensor_msgs/Image, sensor_msgs/Image, tf2_msgs/TFMessage).

Version history


  • Realsense data now also contain 16-bit PNG depth images, located in the folder /recording/realsense/depth_native/
  • Added data in rosbag format



In order to obtain the constants of our PID temperature controller, it was necessary to identify the system. System identification allows us, through experimentation, to find a representation of the plant so that it can be controlled.

The first data file, named "data_2.mat", represents the open-loop test, where the sampling frequency is 100 Hz. These data were useful for finding the period of the pulse-train generator, which is twice the slowest sampling time analyzed between the high pulse and the low pulse of the input.
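The open-loop identification step can be illustrated with a small sketch. This is not the authors' procedure: it fits a first-order model y = K(1 - exp(-t/tau)) to a synthetic step response sampled at 100 Hz, whereas the real analysis would load the response from data_2.mat (e.g. via scipy.io.loadmat) with whatever variable names that file uses.

```python
import numpy as np

fs = 100.0                                   # sampling frequency from the text
t = np.arange(0, 5, 1 / fs)

# Synthetic noiseless first-order step response (K and tau are made up).
K_true, tau_true = 2.0, 0.8
y = K_true * (1 - np.exp(-t / tau_true))

# Classic graphical identification: steady-state gain and 63.2% rise time.
K_est = y[-1]                                # approximate steady-state gain
tau_est = t[np.argmax(y >= 0.632 * K_est)]   # time to reach 63.2% of K
```

On this noiseless response the estimates land close to K_true and tau_true; with measured data some smoothing would be needed first.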


The LEDNet dataset consists of image data of a field area captured with a mobile phone camera.

Images in the dataset show an area where a PCB board carrying 6 LEDs is placed. Each state of the LEDs on the PCB board represents a binary digit, with the ON state corresponding to binary 1 and the OFF state corresponding to binary 0. The LEDs in sequence thus represent a binary encoding of an analog value.
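The binary encoding described above can be decoded as follows. The bit order (MSB first) is an assumption for illustration; the text does not specify which LED is the most significant bit.

```python
def decode_leds(states):
    """Decode a list of 6 LED states (ON=1, OFF=0), assumed MSB first,
    into the integer value they encode."""
    value = 0
    for s in states:
        value = (value << 1) | int(bool(s))  # shift in one bit per LED
    return value
```

For example, the pattern ON-OFF-ON-OFF-ON-OFF encodes 0b101010 = 42, and six LEDs can represent values 0 through 63.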


Design and fabrication outsourcing has made integrated circuits vulnerable to malicious modifications by third parties, known as hardware Trojans (HTs). Over the last decade, the use of side-channel measurements for detecting malicious manipulation of a chip has been extensively studied. However, the suggested approaches mostly suffer from two major limitations: reliance on a trusted identical chip (i.e., a golden chip), and the untraceable footprints of subtle hardware Trojans that remain inactive during the testing phase.


See the attached document.


The dataset consists of the ISFET sensor data utilized to train ML models for drift compensation.


This dataset consists of the sensor data used to develop a SPICE macro model of the ISFET, along with associated documentation.