The growing number and scale of natural and man-made disasters have created an urgent demand for technologies that enhance the safety and efficiency of search and rescue teams. Semi-autonomous rescue robots are beneficial, especially when searching inaccessible terrain or dangerous environments such as collapsed infrastructure. For search and rescue missions in degraded visual conditions or non-line-of-sight scenarios, radar-based approaches can help acquire valuable, otherwise unavailable information.


Objective: The human hand has excellent manipulation ability compared with other primate hands. Without palm movements, the human hand would lose more than 40% of its functions. However, uncovering how palm movements are constituted remains a challenging problem spanning kinesiology, physiology, and engineering science. Methods: By recording palm joint angles during common grasping, gesturing, and manipulation tasks, we built a palm kinematic dataset.


This dataset is created for neural network-based surrogate modeling of power conversion losses. It includes four sets of data (for AC/DC conversion losses under inversion/rectification modes and DC/DC conversion losses during battery charging/discharging, respectively) for the neural network. The raw data are generated using high-fidelity analytical models.


This data was collected during a validation study of our Torso-Dynamics Estimation System (TES). The TES consisted of a Force Sensing Seat (FSS) and an inertial measurement unit (IMU) that measured the kinetics and kinematics of the subject's torso motions. The FSS estimated the 3D forces, 3D moments, and 2D centers of pressure (COPs), while the IMU estimated the 3D torso angles. To validate the TES, the FSS and IMU estimates were compared against gold-standard research equipment (an AMTI force plate and a Qualisys motion capture system, respectively).


This dataset, based on a beam-sweeping experiment in the 60 GHz band in an indoor environment, provides the acquired IQ data samples (which contain the announced TX antenna weighting vector (AWV) index) for each RX AWV index, location, and carrier frequency. We also include the information obtained by processing the PPDUs in the IQ data.


The experimental measurements were performed in an anechoic chamber, to secure LOS transmission, at the Millimeter Wave and Terahertz Technologies Research Laboratories (MILTAL) of the Scientific and Technological Research Council of Turkey (TÜBİTAK), Kocaeli. The dimensions of the anechoic chamber are 7 m × 4 m × 3 m. The dataset includes channel impulse responses between 240 GHz and 300 GHz for transmitter-receiver distances of 20 cm, 30 cm, 40 cm, 50 cm, 60 cm, 70 cm, 80 cm, 90 cm, and 100 cm, and for different misalignment angles.
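For context on the measured band and distance grid, the free-space path loss over these link lengths can be computed with the standard Friis formula. This is a generic textbook calculation, not part of the dataset itself; the function name and the distance grid below simply mirror the description above.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB via the Friis formula: 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

# Path loss across the dataset's 20-100 cm distance grid at the band edges.
for d_cm in range(20, 101, 10):
    lo = fspl_db(d_cm / 100.0, 240e9)
    hi = fspl_db(d_cm / 100.0, 300e9)
    print(f"{d_cm:3d} cm: {lo:.1f} dB @ 240 GHz, {hi:.1f} dB @ 300 GHz")
```

Even at these short ranges the loss exceeds 65 dB, which is why the LOS-secured anechoic setup matters at these frequencies.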


This is a video dataset for the paper "Analysis of ENF Signal Extraction From Videos Acquired by Rolling Shutters," submitted to IEEE Transactions on Information Forensics and Security (T-IFS) and currently under review.

If you use our dataset, please cite our paper as:

Jisoo Choi, Chau-Wai Wong, Hui Su, and Min Wu, "Analysis of ENF signal extraction from videos acquired by rolling shutters," submitted to IEEE Transactions on Information Forensics and Security (T-IFS), under review.


Device fingerprinting is a technique for remote, indirect identification or classification of a device of interest. This database is designated for device fingerprinting by current consumption; it includes current recordings of 22-inch computer displays from 40 devices: 20 Dell P2217H and 20 Dell E2214H. Two signals for each device were sampled independently and sequentially to provide independent training and test sets. Each sampled signal is a 250-second recording at a 50 kHz sampling frequency.


This dataset is created for neural network-based surrogate modeling of power conversion losses. It includes two sets of training and test data (for the AC/DC and DC/DC converters, respectively) for the neural network. The raw data are generated using the PLECS Blockset in the MATLAB/Simulink environment.
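As a minimal sketch of the surrogate-modeling workflow these loss datasets support, the following trains a small neural network to map a converter operating point to a loss value. Everything here is an assumption for illustration: the input features (voltage, current), the toy loss formula standing in for the simulated data, and the network size are all placeholders, not the datasets' actual schema or the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for converter-loss data (assumption: operating
# point -> loss in watts; the real datasets come from PLECS/analytical models).
V = rng.uniform(200, 400, size=(1000, 1))   # DC-link voltage [V]
I = rng.uniform(1, 50, size=(1000, 1))      # load current [A]
X = np.hstack([V, I])
y = 0.02 * I**2 + 1e-4 * V * I + 5.0        # toy conduction + switching loss [W]

# Normalize inputs and targets so the small MLP trains stably.
Xm, Xs = X.mean(0), X.std(0)
ym, ys = y.mean(), y.std()
Xn, yn = (X - Xm) / Xs, (y - ym) / ys

# One-hidden-layer MLP trained with full-batch gradient descent.
H = 16
W1 = rng.normal(0.0, 0.5, (2, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.5, (H, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    h = np.tanh(Xn @ W1 + b1)          # hidden activations
    err = (h @ W2 + b2) - yn           # prediction error, shape (N, 1)
    gW2 = h.T @ err / len(Xn); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1.0 - h**2)   # backprop through tanh
    gW1 = Xn.T @ dh / len(Xn); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

def surrogate_loss(v: float, i: float) -> float:
    """Predicted conversion loss [W] at one operating point."""
    xn = (np.array([[v, i]]) - Xm) / Xs
    return float((np.tanh(xn @ W1 + b1) @ W2 + b2) * ys + ym)

# True toy-model value at (300 V, 25 A) is 0.02*625 + 1e-4*300*25 + 5 = 18.25 W.
print(surrogate_loss(300.0, 25.0))
```

Once trained, such a surrogate replaces repeated circuit simulation with a cheap function evaluation, which is the point of building these datasets.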