Visual tracking methods have developed rapidly in recent years. In particular, Discriminative Correlation Filter (DCF) based methods have significantly advanced the state of the art in tracking. This progress is predominantly attributed to powerful features and sophisticated online learning formulations. However, problems arise if the tracker learns from all samples indiscriminately.
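As a hedged illustration of the correlation-filter formulation mentioned above (a minimal MOSSE-style sketch, not the authors' exact method), a single-sample DCF can be trained in closed form in the Fourier domain:

```python
import numpy as np

def train_dcf(x, y, lam=1e-2):
    """Minimal single-sample DCF (MOSSE-style); a sketch only.

    x: training image patch (2-D array), y: desired response map.
    Returns the conjugate filter solving ridge regression in the
    Fourier domain: H* = (Y . conj(X)) / (X . conj(X) + lam).
    """
    X, Y = np.fft.fft2(x), np.fft.fft2(y)
    return (Y * np.conj(X)) / (X * np.conj(X) + lam)

def response(h_conj, z):
    """Correlate a search patch z with the learned filter."""
    return np.real(np.fft.ifft2(h_conj * np.fft.fft2(z)))
```

In a real tracker the filter is updated online frame by frame; the concern raised above is that each new sample then updates the model with equal weight, whether it is reliable or corrupted.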


The CREATE database is composed of 14 hours of multimodal recordings from a mobile robotic platform based on the iRobot Create.


Provided Files

  •   : HDF5 files for Experiment I
  •   : HDF5 files for Experiment II
  •   : HDF5 files for Experiment III
  •   : Preview MP4 videos and PDF images
  •   : Documentation: CAD files, datasheets and images
  •   : Source code for recording, preprocessing and examples


Extract all ZIP archives in the same directory (e.g. $HOME/Data/Create). Examples of source code (MATLAB and Python) for loading and displaying the data are included. For more details about the dataset, see the specifications document in the documentation section.

Dataset File Format

The data is provided as a set of HDF5 files, one per recording session. The files are named to include the location (room) and session identifiers, as well as the recording date and time (ISO 8601 format). The recording sessions related to a particular experiment are stored in a separate folder. Overall, the file hierarchy is as follows:
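As a hedged sketch of working with the per-session HDF5 files (the actual group and dataset names are defined in the specifications document; any names used here are assumptions), the sensor hierarchy of a file can be discovered with h5py in Python:

```python
import h5py

def list_datasets(path):
    """Return (name, shape, dtype) for every dataset in an HDF5 file.

    Useful for discovering the sensor hierarchy of a CREATE session
    file; the real group names are given in the specifications
    document, so this only enumerates whatever the file contains.
    """
    items = []
    with h5py.File(path, "r") as f:
        def visit(name, obj):
            if isinstance(obj, h5py.Dataset):
                items.append((name, obj.shape, str(obj.dtype)))
        f.visititems(visit)
    return items
```

For example, `list_datasets("session.h5")` might report entries such as an accelerometer array and its shape, depending on the file's actual layout.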



Summary of Available Sensors

The following sensors were recorded and made available in the CREATE dataset:

  • Left and right RGB cameras (320x240, JPEG, 30 Hz sampling rate)
  • Left and right optical flow fields (16x12 sparse grid, 30 Hz sampling rate)
  • Left and right microphones (16000 Hz sampling rate, 64 ms frame length)
  • Inertial measurement unit: accelerometer, gyroscope, magnetometer (90 Hz sampling rate)
  • Battery state (50 Hz sampling rate)
  • Left and right motor velocities (50 Hz sampling rate)
  • Infrared and contact sensors (50 Hz sampling rate)
  • Odometry (50 Hz sampling rate)
  • Atmospheric pressure (50 Hz sampling rate)
  • Air temperature (1 Hz sampling rate)


Other relevant information about the recordings is also included:

  • Room location, date and time of the session.
  • Stereo calibration parameters for the RGB cameras.


Summary of Experiments

Experiment I: Navigation in Passive Environments

The robot moved around a room, controlled by the experimenter using a joystick. Each recorded session was approximately 15 min long. There are 4 session recordings per room, with various starting points and trajectories. There were few to no moving objects (including humans) in the room, and the experimenter steered the robot so as not to hit any obstacles.

Experiment II: Navigation in Environments with Passive Human Interactions

In this experiment, the robot moved around a room, controlled by the experimenter using a joystick. Each recorded session was approximately 15 min long. There are 4 session recordings per room, with various starting points and trajectories. Note that, compared to Experiment I, there was a significant number of moving objects (including humans) in the selected rooms.

Experiment III: Navigation in Environments with Active Human Interactions

The robot moved around a room, controlled by the experimenter using a joystick. A second experimenter lifted the robot and changed its position and orientation at random intervals (e.g. once every 10 s). Each recorded session was approximately 15 min long. There are 5 session recordings in a single room.


The authors would like to thank the ERA-NET (CHIST-ERA) and FRQNT organizations for funding this research as part of the European IGLU project.



The dataset is an extensive collection of labeled high-frequency Wi-Fi Received Signal Strength (RSS) measurements corresponding to multiple hand gestures made near a smartphone under different spatial and data traffic scenarios. We open-source the software code and an Android app (Winiff) used to create this dataset; both are available on GitHub. The dataset is created using an artificial traffic induction approach (between the phone and the access point) to obtain useful and meaningful RSS values.


This is the Simulation Data for Power System State Estimation.


Speech detection systems are a type of audio classifier used to recognize, detect, or mark the parts of an audio signal that contain human speech. Here, a novel robust feature named Long-Term Spectral Pseudo-Entropy (LTSPE) is proposed for speech detection; its purpose is to improve performance in combination with other features, increase accuracy, and maintain acceptable computational cost. Experimental results show that combining LTSPE with other features improves detector performance.
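The exact LTSPE definition is given in the accompanying paper; as a hedged sketch only, a long-term spectral entropy feature in a similar spirit can be computed by pooling per-frame power spectra over a context window before taking the entropy. All parameter values below (frame length, context size) are assumptions:

```python
import numpy as np

def long_term_spectral_entropy(x, fs, frame_len=0.064, n_context=10):
    """Sketch of a long-term spectral entropy feature (not the exact
    LTSPE formula from the paper).

    The signal is split into non-overlapping frames; each frame's
    power spectrum is averaged with its n_context predecessors, the
    pooled spectrum is normalized to a probability distribution, and
    its entropy (bits) is the feature for that frame.
    """
    n = int(frame_len * fs)
    frames = [x[i:i + n] for i in range(0, len(x) - n + 1, n)]
    spectra = [np.abs(np.fft.rfft(f * np.hanning(n))) ** 2 for f in frames]
    feats = []
    for i in range(len(spectra)):
        pooled = np.mean(spectra[max(0, i - n_context):i + 1], axis=0)
        p = pooled / (pooled.sum() + 1e-12)
        feats.append(-np.sum(p * np.log2(p + 1e-12)))
    return np.array(feats)
```

Speech tends to have a peaky, low-entropy spectrum, while broadband noise yields high entropy, which is the intuition behind entropy-based detectors.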


Please download the files from here.


1. Source code of the LTSPE feature in MATLAB (.m file)

2. Related paper (pdf)

3. Test.WAV file





Speech/Music Discrimination (SMD) systems are used to mark the segments of an audio signal that contain normal human speech and to distinguish them from segments that do not (such as silence, music, and noise). Under this definition, SMD systems can be considered a specific type of speech activity detection system.


Files are being reviewed; they will be uploaded here soon ...


Recognition of human activities is one of the most promising research areas in artificial intelligence. This has come along with technological advancement in sensing, as well as high demand for applications that are mobile, context-aware, and real-time. We used a smartwatch (Apple Watch) to collect sensory data for 14 ADL (Activities of Daily Living) activities.


This is the data competition hosted by the IEEE Machine Learning for Signal Processing (MLSP) Technical Committee as part of the 27th IEEE International Workshop on Machine Learning for Signal Processing (MLSP 2017), Tokyo, Japan. This year the competition is based on a dataset kindly provided by Petroleum Geo-Services (PGS), on source separation for seismic data acquisition.

Last Updated On: Tue, 05/01/2018 - 15:07
Citation Author(s): IEEE MLSP Technical Committee

The aim of the database is to provide researchers with a collection of real-life power quality impulsive events for testing experiments and measurement instruments. The dataset provides signal recordings from the power network of the University of Cádiz acquired over the last five years (electrical network compliant with UNE-EN 50160:2011).

The dataset offers a diversity of real impulsive events, specifically acquired in order to test power quality instruments according to UNE-IEC 61000-4-11:2005.


Costas arrays are permutation matrices that meet the added Costas condition that, when used as a frequency-hop scheme, allow at most one time-and-frequency-offset signal bin to overlap another.  Databases to various orders have been available for many years.  Here we have a database that is far more extensive than any available before it.  A very powerful and easy-to-use Windows utility with a GUI accompanies the database.
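To make the Costas condition concrete, here is a short sketch (not part of the distributed utility) that checks whether a permutation is a Costas array: every displacement vector between pairs of points in the permutation matrix must be distinct, which is exactly what guarantees at most one overlapping bin under any time-and-frequency offset.

```python
def is_costas(perm):
    """Check the Costas condition for a permutation.

    perm is a permutation of 0..n-1 giving the frequency (row) used
    in each time slot (column). The array is Costas iff all
    displacement vectors (dt, df) between pairs of points (i, perm[i])
    are distinct.
    """
    n = len(perm)
    seen = set()
    for i in range(n):
        for j in range(i + 1, n):
            vec = (j - i, perm[j] - perm[i])
            if vec in seen:
                return False
            seen.add(vec)
    return True
```

For example, `is_costas([0, 1, 3, 2])` returns True, while the identity permutation `[0, 1, 2, 3]` fails because the vector (1, 1) repeats.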


Download the file. It contains the Instructions as a PDF file, the extraction and analysis utility in its own ZIP file, and several information files including an enumeration database in an Excel file.


Unpack this file in a folder that you want to be the location of your Costas array database. Be sure to unpack subfolders, so that you see subfolders /Searches and /Generated when you are done. Folder /Searches contains all Costas arrays to order 29, and folder /Generated contains all generated Costas arrays to order 100. The file contains the extraction and analysis utility. It may be extracted in place or, if the database is on a network drive or other location inconvenient for DLLs, in its own folder anywhere on a local drive such as your C:\ drive. See the Instructions PDF for details.


Then, as you need them, add the remaining archive files; each provides more data for the /Generated folder.


Also included is a file produced by the extraction/analysis utility: a frequency-hop LUB list, useful with PLL-based waveform generators.


For further information, see the file Costas Arrays to Order 1030 INSTRUCTIONS.pdf.