60 GHz FMCW Radar Gesture Dataset

Citation Author(s):
Sarah Seifi (Infineon Technologies AG; Technical University Munich)
Tobias Sukianto (Infineon Technologies AG; Johannes Kepler University)
Cecilia Carbonelli (Infineon Technologies AG)
Submitted by:
Sarah Seifi
DOI:
10.21227/s12w-cc46

Abstract

As the field of human-computer interaction continues to evolve, there is a growing need for new methods of gesture recognition that can be used in a variety of applications, from gaming and entertainment to healthcare and robotics. While traditional methods of gesture recognition rely on cameras or other optical sensors, these systems can be limited by factors such as lighting conditions and occlusions.

To address these challenges, we have developed a new gesture dataset based on radar sensing technology. The dataset includes 21,000 carefully selected gestures, recorded with the BGT60TR13C XENSIV™ 60 GHz Frequency Modulated Continuous Wave (FMCW) radar sensor.

We believe that this dataset will be an invaluable resource for researchers and developers working in the field of gesture recognition. By providing a large and diverse set of radar-based gesture data, we hope to enable the development of new and innovative applications that can enhance human-computer interaction in a wide range of domains.

*Update*: Description of the extended dataset

In addition to the original 21,000 gestures, we have extended the dataset with 24,000 anomalous gestures and 4,000 additional nominal gestures, for a total of 25,000 nominal and 24,000 anomalous gestures. The anomalous gestures were collected from eight users and cover three types of anomalies, with 1,000 gestures per user and anomaly type:

  • Fast executions: gestures performed at a faster pace, lasting approximately 0.1 seconds.
  • Slow executions: gestures performed at a slower pace, lasting approximately 3 seconds.
  • Wrist executions: gestures performed with the wrist instead of a fully extended arm and the shoulder joint.

The extended dataset provides a more comprehensive and diverse set of radar-based gesture data, enabling researchers and developers to evaluate the robustness and adaptability of their models in various scenarios.

Instructions:

The radar system was configured with an operational frequency range spanning 58.5 GHz to 62.5 GHz, providing a range resolution of 37.5 mm and the ability to resolve targets at a maximum range of 1.2 meters. For signal transmission, the radar employed a burst configuration of 32 chirps per burst, a frame rate of 33 Hz, and a pulse repetition time of 300 µs.
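These range figures follow from the chirp bandwidth via the standard FMCW relations. As a quick sanity check, here is a short Python sketch (constants taken from the configuration above; the 64 samples per chirp come from the data format described below):

```
# Derived range parameters from the stated chirp configuration.
C = 3e8                       # speed of light in m/s

f_start = 58.5e9              # chirp start frequency in Hz
f_stop = 62.5e9               # chirp stop frequency in Hz
n_samples = 64                # ADC samples per chirp (see data format below)

bandwidth = f_stop - f_start                  # 4 GHz sweep
range_resolution = C / (2 * bandwidth)        # c / 2B = 0.0375 m
max_range = range_resolution * n_samples / 2  # 64 real-valued samples -> 1.2 m

print(f"Range resolution: {range_resolution * 1e3:.1f} mm")  # 37.5 mm
print(f"Maximum range: {max_range:.1f} m")                   # 1.2 m
```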

The gestures were performed by eight individuals in six different locations, within a field of view of ±45° and at a distance of ≤1 meter from the radar. The dataset covers five gesture types: Swipe Left, Swipe Right, Swipe Up, Swipe Down, and Push, with an average duration of 0.5 seconds, or ten frames, per gesture recording. The dataset is 48.1 GB in size and is named fulldata_zipped.zip.

The indoor locations were a gym, a library, a kitchen, a bedroom, a shared office room, and a closed meeting room.

Each gesture sample is saved as a NumPy array with four dimensions, 100 × 3 × 32 × 64, where the first dimension is the frame length of each recording, the second the number of virtual antennas, the third the number of chirps per frame, and the fourth the number of samples per chirp. Each file is named GestureName_EnvironmentLabel_UserLabel_SampleLabel.npy, encoding the gesture name, the environment label, the user label, and the sample number.
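For illustration, a recording can be loaded and its filename decoded as follows. This is a minimal sketch: the example filename is hypothetical, and the exact sample-label format may differ in the actual files.

```
import numpy as np

# Hypothetical example filename following the naming scheme above.
path = "SwipeLeft_e1_p1_s1.npy"

data = np.load(path)                        # shape: (100, 3, 32, 64)
frames, antennas, chirps, samples = data.shape

# Decode GestureName_EnvironmentLabel_UserLabel_SampleLabel from the filename.
gesture, environment, user, sample = path.removesuffix(".npy").split("_")
```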

*Update*

With the same radar system configuration and gesture types, 4,000 additional nominal gestures from four new participants (p9, p10, p11, and p12) were collected in location e1. The anomalous gestures were collected from eight users (p1, p2, p6, p7, p9, p10, p11, and p12) and comprise the three anomaly types described above.

Each file in the extension is named GestureName_AnomalyLabel_EnvironmentLabel_UserLabel_SampleLabel.npy, encoding the gesture name, the anomaly type, the environment label, the user label, and the sample number. The extended dataset is 46.8 GB in size and is named fullextended_data.zip.

The environmental labels are defined as follows:

  • e1: closed-space meeting room
  • e2: open-space office room
  • e3: library
  • e4: kitchen
  • e5: exercise room
  • e6: bedroom

The anomaly labels are defined as follows:

  • fast: Fast executions (~0.1 seconds)
  • slow: Slow executions (~3 seconds)
  • wrist: Wrist executions (performed with the wrist instead of a fully extended arm and the shoulder joint)

The user labels are defined as follows:

  • p1: Male
  • p2: Female
  • p3: Female
  • p4: Male
  • p5: Male
  • p6: Male
  • p7: Male
  • p8: Male
  • p9: Male
  • p10: Female
  • p11: Male
  • p12: Male
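
Putting the label definitions together, filenames from the extended dataset can be decoded into human-readable metadata along the following lines. This is a minimal sketch: the label tables are transcribed from the lists above, while the example filename and its sample-label format are assumptions.

```
# Label tables transcribed from the definitions above.
ENVIRONMENTS = {
    "e1": "closed-space meeting room",
    "e2": "open-space office room",
    "e3": "library",
    "e4": "kitchen",
    "e5": "exercise room",
    "e6": "bedroom",
}
ANOMALIES = {
    "fast": "fast execution (~0.1 s)",
    "slow": "slow execution (~3 s)",
    "wrist": "wrist execution",
}

def decode_extended(filename):
    """Decode GestureName_AnomalyLabel_EnvironmentLabel_UserLabel_SampleLabel.npy."""
    gesture, anomaly, env, user, sample = filename.removesuffix(".npy").split("_")
    return {
        "gesture": gesture,
        "anomaly": ANOMALIES.get(anomaly, anomaly),
        "environment": ENVIRONMENTS.get(env, env),
        "user": user,
        "sample": sample,
    }

# Hypothetical example filename following the extended naming scheme.
print(decode_extended("SwipeUp_fast_e1_p9_s12.npy"))
```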

Disclaimer  

If you decide to use our dataset, please ensure that you properly cite the following publications associated with its introduction and development:  

 

1. For the first part of the dataset:  

   ```
   @inproceedings{seifi2024interpretable,
     title={Interpretable Rule-Based System for Radar-Based Gesture Sensing: Enhancing Transparency and Personalization in AI},
     author={Seifi, Sarah and Sukianto, Tobias and Carbonelli, Cecilia and Servadei, Lorenzo and Wille, Robert},
     booktitle={2024 21st European Radar Conference (EuRAD)},
     pages={156--159},
     year={2024},
     organization={IEEE}
   }
   ```

 

   ```
   @inproceedings{sukianto2024uncertainty,
     title={An Uncertainty Aware Semi-Supervised Federated Learning Framework for Radar-based Hand Gesture Recognition},
     author={Sukianto, Tobias and Wagner, Matthias and Seifi, Sarah and Carbonelli, Cecilia and Huemer, Mario},
     booktitle={2024 21st European Radar Conference (EuRAD)},
     pages={168--171},
     year={2024},
     organization={IEEE}
   }
   ```

 

2. For the dataset extension, please cite the following article:

   ```
   @article{seifi2025complying,
     title={Complying with the EU AI Act: Innovations in Explainable and User-Centric Hand Gesture Recognition},
     author={Seifi, Sarah and Sukianto, Tobias and Carbonelli, Cecilia and Servadei, Lorenzo and Wille, Robert},
     journal={arXiv preprint arXiv:2503.15528},
     year={2025}
   }
   ```

 

Thank you for responsibly using our dataset.