These .MAT files contain MATLAB Tables of raw and preprocessed data. Information detailing the bed system used to collect these signals and the steps used to create the preprocessed data is contained in a publication in Sensors – Carlson, C.; Turpin, V.-R.; Suliman, A.; Ade, C.; Warren, S.; Thompson, D.E.; Bed-Based Ballistocardiography: Dataset and Ability to Track Cardiovascular Parameters. Sensors 2021, 21, 156. https://doi.org/10.3390/s21010156.
The reBAP signal is scaled at 100 mmHg/volt. The interbeat interval (IBI), stroke volume (SV), and dP/dt_max signals are scaled at 1000 ms/volt, 100 mL/volt, and 1 mmHg/s/volt, respectively.
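As an illustration, the voltage-to-physical-unit conversion can be sketched in Python. The channel keys below are illustrative names chosen here, not the actual variable names inside the MATLAB Tables:

```python
# Scale factors from the dataset description, in physical units per volt.
# The channel keys are assumptions, not the MATLAB Table variable names.
SCALE = {
    "reBAP": 100.0,    # mmHg per V
    "IBI": 1000.0,     # ms per V
    "SV": 100.0,       # mL per V
    "dPdt_max": 1.0,   # mmHg/s per V, as stated in the description
}

def volts_to_physical(channel, samples):
    """Convert raw voltage samples to the channel's physical units."""
    factor = SCALE[channel]
    return [v * factor for v in samples]
```

For example, a 1.2 V reBAP sample corresponds to 120 mmHg, and a 0.85 V IBI sample corresponds to 850 ms.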
Falls are a major health problem: one in three people over the age of 65 falls each year, often resulting in hip fractures, disability, reduced mobility, hospitalization, and death. A major limitation in fall detection algorithm development is the absence of real-world falls data. Fall detection algorithms are typically trained on simulated fall data that contain a well-balanced number of examples of falls and activities of daily living. However, real-world falls occur infrequently, making them difficult to capture and causing severe data imbalance.
Follow the instructions in the readme file.
The dataset consists of a training and a testing folder with received signal strength (RSS) data, obtained from ray-tracing software (Wireless InSite). There are K=8 anchor nodes and N=12 regions.
- The folder _training contains 8 * 12 = 96 separate .p2m files; each file corresponds to RSS data collected from a grid of user locations (coordinates are given in the .p2m file) with respect to a certain anchor node.
In each .p2m file, the coordinates (X, Y, Z) of the target (the transmitter in this case) are given, together with its distance from the anchor node. The received signal power and phase are then calculated by ray tracing, with the results printed at the end of each row.
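A minimal reader for these files might look like the following Python sketch. The column layout assumed here (point id, X, Y, Z, distance, received power, phase) is inferred from the description above, so it should be checked against the actual .p2m header before use:

```python
def parse_p2m_lines(lines):
    """Parse rows of a .p2m power file.
    Assumed layout (hedged): point_id, X, Y, Z, distance [m],
    received power [dBm], phase [deg]; '#' lines are headers."""
    records = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank and header/comment lines
        f = line.split()
        records.append({
            "id": int(f[0]),
            "x": float(f[1]), "y": float(f[2]), "z": float(f[3]),
            "distance": float(f[4]),
            "power_dbm": float(f[5]),
            "phase_deg": float(f[6]),
        })
    return records

def parse_p2m(path):
    """Parse a .p2m file from disk."""
    with open(path) as fh:
        return parse_p2m_lines(fh)
```

Each returned record then holds one user-location row for the given anchor node.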
This code is provided here for research purposes only. You may use this code/data provided that you cite the following paper:
Pilipović, R.; Risojević, V.; Bulić, P. On the Design of an Energy Efficient Digital IIR A-Weighting Filter Using Approximate Multiplication. Sensors 2021, 21, 732. https://doi.org/10.3390/s21030732
For questions and suggestions, please email Ratko Pilipović (email@example.com).
This dataset comes from a motor imagery (MI) based BCI experiment conducted over seven days, without feedback training, on 20 healthy subjects. The MI tasks include left hand, right hand, both feet, and an idle task.
Twenty healthy subjects (11 males; mean age: 23.2 ± 1.47 years; all right-handed) participated in this study. The recruited subjects were asked to attend seven sessions within two weeks. Each session lasted around 40 minutes and was organized into 6 runs, with short breaks allowed between runs. During each run, subjects performed 40 trials (4 different MI tasks, 10 trials per task, presented in random order), each trial lasting 9 s. The direction of an arrow informed the subjects which task to perform: a left arrow indicated MI of the left hand, a right arrow MI of the right hand, a down arrow MI of both feet, and an up arrow the idle task.
Electroretinography (ERG) has great potential for the early diagnosis of, and intervention in, visual health problems; to date, optical coherence tomography and other diagnostic tests are mainly used instead. Clinically, ERG is an important diagnostic assessment for various retinal diseases, such as hereditary diseases (retinitis pigmentosa, choroideremia, cone dystrophy, etc.), diabetic retinopathies, glaucoma, macular degeneration, and toxic retinopathies. A database of five types of adult and pediatric biomedical electroretinography signals is presented in this study.
WHEN USING THIS RESOURCE, PLEASE CITE THE ORIGINAL PUBLICATION
1. A.E. Zhdanov, A.Yu. Dolganov, E. Lucian, X. Bao, V.I. Borisov, V.N. Kazajkin, V.O. Ponomarev, A.V. Lizunov, L.G. Dorosinskiy, "OculusGraphy: Ocular Examination for Toxicity Evaluation Based on Biomedical Signals," 2020 International Conference on e-Health and Bioengineering (EHB), IASI, 2020, pp. 1-6, doi: 10.1109/EHB50910.2020.9280291.
2. A.E. Zhdanov, A.Yu. Dolganov, V.N. Kazajkin, V.O. Ponomarev, A.V. Lizunov, V.I. Borisov, E. Lucian, X. Bao, L.G. Dorosinskiy, "OculusGraphy: Literature Review on Electrophysiological Research Methods in Ophthalmology and Electroretinograms Processing Using Wavelet Transform," 2020 International Conference on e-Health and Bioengineering (EHB), IASI, 2020, pp. 1-6, doi: 10.1109/EHB50910.2020.9280221.
The file "00 Description of Research Protocols.pdf" contains a description of the protocols used in this study. The file "01 Appendix 1.xlsx" contains the analysis results for the 5 signal types, including the filtered signals and the following information: diagnosis, age, wave amplitude, and wave latency. The file "02 Appendix 2.xlsx" contains a series of signals, with the following information: patient number, signal.
For further questions please contact Mr. Aleksei E. Zhdanov (correspondence e-mail: firstname.lastname@example.org).
We express our most profound appreciation to cand. med. Oleg V. Shilovskikh, CEO of the IRTC Eye Microsurgery Ekaterinburg Center, for the opportunity to publish the database and disseminate scientific knowledge. The decryption of the ERG signal data within the study was supported by RFBR, project numbers 20-07-00498 and 18-29-03088. The processing of the ERG signal data was supported by Act 211 of the Government of the Russian Federation, contract 02.A03.21.0006.
This dataset contains RF signals from drone remote controllers (RCs) of different makes and models. The RF signals transmitted by the drone RCs to communicate with the drones are intercepted and recorded by a passive RF surveillance system, which consists of a high-frequency oscilloscope, directional grid antenna, and low-noise power amplifier. The drones were idle during the data capture process. All the drone RCs transmit signals in the 2.4 GHz band. There are 17 drone RCs from eight different manufacturers and ~1000 RF signals per drone RC, each spanning a duration of 0.25 ms.
The dataset contains ~1000 RF signals in .mat format from the remote controllers (RCs) of the following drones:
- DJI (5): Inspire 1 Pro, Matrice 100, Matrice 600*, Phantom 4 Pro*, Phantom 3
- Spektrum (4): DX5e, DX6e, DX6i, JR X9303
- Futaba (1): T8FG
- Graupner (1): MC32
- HobbyKing (1): HK-T6A
- FlySky (1): FS-T6
- Turnigy (1): 9X
- Jeti Duplex (1): DC-16.
In the dataset, there are two pairs of RCs for the drones indicated by an asterisk above, making a total of 17 drone RCs. Each RF signal contains 5 million samples and spans a time period of 0.25 ms.
The scripts provided with the dataset define a class for creating drone RC objects and build both a database of these objects and a database in table format with all the available information, such as make, model, raw RF signal, sampling frequency, etc. The scripts also include functions to visualize the data and extract a few example features from the raw RF signal (e.g., the transient signal start point). Instructions for using the scripts are included at the top of each script and can also be viewed by typing help scriptName in the MATLAB command window.
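For users working outside MATLAB, a rough Python equivalent of one example feature (the transient start point) might look like the sketch below. The fractional-threshold rule here is an assumption made for illustration, not the method used by the dataset's MATLAB scripts:

```python
import numpy as np

def transient_start(signal, fs, frac=0.05):
    """Naive transient start-point estimate: time of the first sample
    whose magnitude exceeds a fraction of the peak magnitude.
    (Illustrative only; the dataset's scripts may use another method.)"""
    mag = np.abs(np.asarray(signal, dtype=float))
    thresh = frac * mag.max()
    idx = np.argmax(mag >= thresh)  # index of first sample over threshold
    return idx / fs                 # seconds

# Sampling rate implied by the record length:
# 5 million samples over 0.25 ms -> 20 GSa/s.
fs = 5_000_000 / 0.25e-3
```

On a recorded signal loaded from one of the .mat files, transient_start(x, fs) would return the estimated onset time of the RC transmission in seconds.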
The drone RC RF dataset was used in the following papers:
- M. Ezuma, F. Erden, C. Kumar, O. Ozdemir, and I. Guvenc, "Micro-UAV detection and classification from RF fingerprints using machine learning techniques," in Proc. IEEE Aerosp. Conf., Big Sky, MT, Mar. 2019, pp. 1-13.
- M. Ezuma, F. Erden, C. K. Anjinappa, O. Ozdemir, and I. Guvenc, "Detection and classification of UAVs using RF fingerprints in the presence of Wi-Fi and Bluetooth interference," IEEE Open J. Commun. Soc., vol. 1, no. 1, pp. 60-79, Nov. 2019.
- E. Ozturk, F. Erden, and I. Guvenc, "RF-based low-SNR classification of UAVs using convolutional neural networks," arXiv preprint arXiv:2009.05519, Sept. 2020.
Other details regarding the dataset and data collection and processing can be found in the above papers and attached documentation.
- Experiment design: O. Ozdemir and M. Ezuma
- Data collection: M. Ezuma
- Scripts: F. Erden and C. K. Anjinappa
- Documentation: F. Erden
- Supervision, revision, and funding: I. Guvenc
This work was supported in part by NASA through the Federal Award under Grant NNX17AJ94A, and in part by NSF under CNS-1939334 (AERPAW, one of NSF's Platforms for Advanced Wireless Research (PAWR) projects).
MATLAB code for testing a spectrum sensing algorithm based on statistical processing of the instantaneous magnitude (SPIM). The associated scripts allow you to:
- Generate different signals to check the method (FHSS, LFM, CW pulse, etc.).
- Plot the generated signal and the detection threshold, and compare the result with ideal detection.
- Determine the errors for the different hypotheses as a function of SNR.
- Calculate the errors in the determination of the amplitude and frequency for different SNRs.
- Evaluate the probability of detection for different values of the threshold control parameters A and U.
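As a rough illustration of the idea (not the authors' implementation), a magnitude-statistics detector can be sketched in Python. The way the control parameters A and U enter the threshold below is an assumption made for this sketch:

```python
import numpy as np

def detect_spim(x, A=3.0, U=0.0):
    """Toy detector in the spirit of SPIM: compare the instantaneous
    magnitude against a threshold built from its own statistics.
    How A and U combine here is an assumption, not the paper's rule."""
    mag = np.abs(np.asarray(x))
    thresh = mag.mean() + A * mag.std() + U
    return mag > thresh, thresh

# Example: a CW pulse buried in noise.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 0.1, 1000)
x[500:520] += 5.0                      # pulse occupying samples 500-519
mask, thresh = detect_spim(x, A=3.0)   # boolean detection mask per sample
```

Sweeping A (and U) and tallying the detection mask against the known pulse location is one simple way to estimate the detection and false-alarm probabilities the scripts evaluate.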