60 GHz FMCW Radar Gesture Dataset

Citation Author(s):
  • Sarah Seifi (Infineon Technologies AG; Technical University Munich)
  • Tobias Sukianto (Infineon Technologies AG; Johannes Kepler University)
  • Cecilia Carbonelli (Infineon Technologies AG)
Submitted by:
Sarah Seifi
Last updated:
Thu, 12/05/2024 - 11:52
DOI:
10.21227/s12w-cc46

Abstract 

As the field of human-computer interaction continues to evolve, there is a growing need for new methods of gesture recognition that can be used in a variety of applications, from gaming and entertainment to healthcare and robotics. While traditional methods of gesture recognition rely on cameras or other optical sensors, these systems can be limited by factors such as lighting conditions and occlusions.

To address these challenges, we have developed a new gesture dataset based on radar sensing technology. The dataset includes 21,000 carefully selected gestures, recorded using the BGT60TR13C XENSIV™ 60 GHz Frequency Modulated Continuous Wave (FMCW) radar sensor.

We believe that this dataset will be an invaluable resource for researchers and developers working in the field of gesture recognition. By providing a large and diverse set of radar-based gesture data, we hope to enable the development of new and innovative applications that can enhance human-computer interaction in a wide range of domains.

*Update*: Description of the extended dataset

In addition to the original 21,000 gestures, we have extended the dataset to include 12,000 anomalous gestures and 2,000 additional nominal gestures, resulting in a total of 35,000 gestures. The anomalous gestures were collected from six users and include three types of anomalies:

  • Fast executions: 2,000 gestures performed at a faster pace, lasting approximately 0.1 seconds.
  • Slow executions: 2,000 gestures performed at a slower pace, lasting approximately 3 seconds.
  • Wrist executions: 2,000 gestures performed with the wrist instead of a fully extended arm and the shoulder joint.

The extended dataset provides a more comprehensive and diverse set of radar-based gesture data, enabling researchers and developers to evaluate the robustness and adaptability of their models in various scenarios.

Instructions: 

The radar system was configured with an operational frequency range spanning from 58.5 GHz to 62.5 GHz (4 GHz sweep bandwidth), yielding a range resolution of 37.5 mm and the ability to resolve targets at a maximum range of 1.2 meters. For signal transmission, the radar employed a burst configuration of 32 chirps per burst, with a frame rate of 33 Hz and a pulse repetition time of 300 µs.
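The stated range parameters follow directly from the configuration. A minimal sketch (not official Infineon tooling; the number of ADC samples per chirp is taken from the data shape described below):

```python
# Derive the stated radar parameters from the configuration values above.
C = 3e8                      # speed of light, m/s
bandwidth = 62.5e9 - 58.5e9  # 4 GHz sweep bandwidth
n_samples = 64               # ADC samples per chirp (from the recording shape)

range_resolution = C / (2 * bandwidth)          # c / (2B) = 0.0375 m = 37.5 mm
max_range = range_resolution * (n_samples / 2)  # 32 usable range bins -> 1.2 m

print(f"range resolution: {range_resolution * 1e3:.1f} mm")
print(f"maximum range:    {max_range:.1f} m")
```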

The gestures were performed by eight individuals in six different locations, within a field of view of ±45° and at a distance of ≤1 meter from the radar. The dataset covers five gesture types: Swipe Left, Swipe Right, Swipe Up, Swipe Down, and Push, with an average duration of 0.5 seconds or ten frames per gesture recording. The dataset is 48.1 GB in size and is provided as fulldata_zipped.zip.

The indoor locations were a gym, a library, a kitchen, a bedroom, a shared office room, and a closed meeting room.

Each gesture sample is saved as a 4-dimensional numpy array of shape 100 × 3 × 32 × 64, where the dimensions represent, in order, the frame length of each gesture, the number of virtual antennas, the number of chirps in one frame, and the number of samples per chirp. Each numpy file is named GestureName_EnvironmentLabel_UserLabel_SampleLabel.npy, encoding the gesture name, the environmental label, the user label, and the sample number.
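The naming scheme can be decoded with a few lines of Python. This is a sketch assuming the four fields themselves contain no underscores; the concrete file name below (including the sample label "s1") is illustrative, not taken from the dataset listing:

```python
# Decode GestureName_EnvironmentLabel_UserLabel_SampleLabel.npy
import os

def parse_filename(path):
    """Split a dataset file name into its four label fields."""
    stem = os.path.splitext(os.path.basename(path))[0]
    gesture, environment, user, sample = stem.split("_")
    return {"gesture": gesture, "environment": environment,
            "user": user, "sample": sample}

labels = parse_filename("SwipeLeft_e1_p1_s1.npy")
# The recording itself would then be loaded with numpy:
#   data = np.load(path)   # shape (100, 3, 32, 64): frames, antennas, chirps, samples
```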

*Update*

With the same radar system configuration and gesture types, 2,000 additional nominal gestures were collected from two new participants (p9 and p10) in location e1. The anomalous gestures were collected from six users (p1, p2, p6, p7, p9, and p10) and cover the three anomaly types listed below.

Each numpy file in the extension is named GestureName_AnomalyLabel_EnvironmentLabel_UserLabel_SampleLabel.npy, encoding the gesture name, the anomaly type, the environmental label, the user label, and the sample number. The extended dataset is 46.8 GB in size and is provided as fullextended_data.zip.
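The extended naming scheme adds one field after the gesture name. A sketch mirroring the parser above, again assuming underscore-free fields and an illustrative sample label:

```python
# Decode GestureName_AnomalyLabel_EnvironmentLabel_UserLabel_SampleLabel.npy
import os

def parse_extended_filename(path):
    """Split an extended-dataset file name into its five label fields."""
    stem = os.path.splitext(os.path.basename(path))[0]
    gesture, anomaly, environment, user, sample = stem.split("_")
    return {"gesture": gesture, "anomaly": anomaly,
            "environment": environment, "user": user, "sample": sample}

labels = parse_extended_filename("SwipeUp_fast_e1_p9_s12.npy")
```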

The environmental labels are defined as follows:

  • e1: closed-space meeting room
  • e2: open-space office room
  • e3: library
  • e4: kitchen
  • e5: exercise room
  • e6: bedroom

The anomaly labels are defined as follows:

  • fast: Fast executions (~0.1 seconds)
  • slow: Slow executions (~3 seconds)
  • wrist: Wrist executions (performed with the wrist instead of a fully extended arm and the shoulder joint)

The user labels are defined as follows:

  • p1: Male
  • p2: Female
  • p3: Female
  • p4: Male
  • p5: Male
  • p6: Male
  • p7: Male
  • p8: Male
  • p9: Male
  • p10: Female
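The label definitions above can be transcribed into lookup tables so that parsed file names map directly to human-readable descriptions. The `describe` helper is an assumption about how one might combine them, not part of the dataset's tooling:

```python
# Lookup tables transcribed from the label definitions above.
ENVIRONMENTS = {
    "e1": "closed-space meeting room",
    "e2": "open-space office room",
    "e3": "library",
    "e4": "kitchen",
    "e5": "exercise room",
    "e6": "bedroom",
}
ANOMALIES = {
    "fast": "fast execution (~0.1 seconds)",
    "slow": "slow execution (~3 seconds)",
    "wrist": "wrist execution",
}
USERS = {
    "p1": "Male", "p2": "Female", "p3": "Female", "p4": "Male",
    "p5": "Male", "p6": "Male", "p7": "Male", "p8": "Male",
    "p9": "Male", "p10": "Female",
}

def describe(environment, user, anomaly=None):
    """Human-readable summary of a recording's labels."""
    parts = [ENVIRONMENTS[environment], USERS[user]]
    if anomaly is not None:
        parts.append(ANOMALIES[anomaly])
    return ", ".join(parts)
```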