Hand gesture dataset

For gesture recognition, radar sensors provide a unique alternative to other input devices, such as cameras or motion sensors. They combine low sensitivity to lighting conditions, the ability to see through surfaces, and preservation of user privacy with a small form factor and low power consumption. However, radar signals can be noisy, are complex to analyze, and do not transfer well from one radar to another.


Computer vision systems are commonly used to design touch-less human-computer interfaces (HCI) based on dynamic hand gesture recognition (HGR), which has a wide range of applications in domains such as gaming, multimedia, automotive, and home automation. However, automatic HGR is still a challenging task, mostly because of the diversity in how people perform gestures. In addition, publicly available hand gesture datasets are scarce, the gestures are often not acquired with sufficient image quality, and in some cases the gestures are performed incorrectly.


The GesHome dataset consists of 18 hand gestures performed by 20 non-professional subjects of various ages and occupations. Each participant performed each gesture 50 times over 5 days, so GesHome contains 18,000 gesture samples in total. Using the embedded accelerometer and gyroscope, we recorded 3-axial linear acceleration and 3-axial angular velocity at a sampling frequency of 25 Hz. The sessions were video-recorded so the data could be labeled manually with the ELAN tool.
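Recordings of this kind (six IMU channels at 25 Hz, with gestures of varying duration) typically need to be cut to a fixed length before feeding them to a classifier. The sketch below shows one common way to do this; the window length, function name, and zero-padding strategy are illustrative assumptions, not part of the GesHome specification.

```python
import numpy as np

SAMPLE_RATE_HZ = 25   # accelerometer/gyroscope sampling frequency (from the dataset description)
NUM_CHANNELS = 6      # 3-axis linear acceleration + 3-axis angular velocity

def to_fixed_window(signal: np.ndarray, window_s: float = 2.0) -> np.ndarray:
    """Truncate or zero-pad a (T, 6) gesture recording to a fixed-length window.

    `window_s` is a hypothetical choice; pick it from the longest gesture in practice.
    """
    target = int(window_s * SAMPLE_RATE_HZ)  # e.g. 2 s -> 50 samples
    if signal.shape[0] >= target:
        return signal[:target]
    pad = np.zeros((target - signal.shape[0], signal.shape[1]), dtype=signal.dtype)
    return np.vstack([signal, pad])

# Synthetic example: a 1.2 s recording (30 samples) padded up to 50 samples.
recording = np.random.randn(30, NUM_CHANNELS)
fixed = to_fixed_window(recording)
print(fixed.shape)  # (50, 6)
```

Zero-padding keeps short gestures aligned at the start of the window; an alternative is to center the gesture or resample every recording to the same length.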