Hand Gesture Echo Based on Millimeter Wave Radar; JUST; China
Millimeter-wave radar can sense subtle hand movements. However, traditional hand gesture recognition methods are not robust in scenarios with dynamic interference. To address this issue, a robust hand gesture recognition method based on a self-attention time-series neural network is proposed. First, the original radar echo is organized by frame, sequence and channel at the input of the network. To extract features from each frame sequence independently, a one-dimensional time-series neural network is built, with a time-distributed layer used as the wrapper. Then a self-attention mechanism assigns appropriate weights to the frames entered in parallel, capturing inter-frame correlation and suppressing random interference. Finally, a global average pooling layer reduces the number of channels, and a fully connected layer outputs the gesture label. Experimental results show that the proposed method achieves a high recognition rate in the presence of 25% random dynamic interference.
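The pipeline described above (per-frame 1-D feature extraction via a time-distributed wrapper, self-attention across frames, global average pooling, fully connected output) can be sketched in PyTorch, which `Model_torch.py` suggests is the framework used. All layer sizes, kernel widths, and head counts below are illustrative assumptions, not the paper's actual values:

```python
import torch
import torch.nn as nn

class AttenTsNN(nn.Module):
    """Sketch of the Atten-TsNN pipeline; hyperparameters are assumptions."""

    def __init__(self, in_channels=2, n_classes=5, d_model=64):
        super().__init__()
        # 1-D time-series feature extractor, applied to each frame independently
        self.frame_net = nn.Sequential(
            nn.Conv1d(in_channels, d_model, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # one d_model-vector per frame
        )
        # self-attention assigns weights across the frame sequence
        self.attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        self.fc = nn.Linear(d_model, n_classes)

    def forward(self, x):  # x: (batch, frames, channels, seq_len)
        b, f, c, s = x.shape
        # "time-distributed" wrapper: fold the frame axis into the batch axis
        feats = self.frame_net(x.reshape(b * f, c, s)).squeeze(-1)
        feats = feats.reshape(b, f, -1)               # (batch, frames, d_model)
        attn_out, _ = self.attn(feats, feats, feats)  # inter-frame correlation
        pooled = attn_out.mean(dim=1)                 # global average pooling
        return self.fc(pooled)                        # gesture logits
```

The time-distributed behavior is obtained by reshaping `(batch, frames, ...)` into `(batch * frames, ...)` before the shared 1-D extractor, which is the standard PyTorch idiom for applying one sub-network to every frame.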
****** Atten-TsNN Model ******
This is the running program of the Atten-TsNN model. The files function as follows:

The main script:
- loads data and labels from the DataSets folder;
- calls 'FileHandle.py' for preprocessing;
- calls 'Model_torch.py' to initialize the Atten-TsNN model;
- calls 'Trainer.py' for training and validation.

'FileHandle.py': implements data labeling, tensor transformation, dimensional transformation, and so on.
'Model_torch.py': initializes the model parameters.
'Trainer.py': training, validation and testing.
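The training/validation step can be illustrated with a generic PyTorch epoch loop; this is only a conceptual sketch of what a trainer module does, not the actual code in 'Trainer.py', whose interface is not shown here:

```python
import torch
import torch.nn as nn

def train_one_epoch(model, loader, optimizer, device="cpu"):
    """Generic supervised training step; returns training accuracy."""
    criterion = nn.CrossEntropyLoss()
    model.train()
    total, correct = 0, 0
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        logits = model(x)            # (batch, n_classes) gesture logits
        loss = criterion(logits, y)
        loss.backward()
        optimizer.step()
        correct += (logits.argmax(1) == y).sum().item()
        total += y.numel()
    return correct / total
```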
The following is the folder description:
DataSets: the data set, divided into 5 categories by folder, with class.txt as the label file. Because the full data set is too large, only a single sample is given here. For instance, 'adc1_1_Raw_0.bin' is data without interference, and 'Interference1_1.bin' is data with interference.
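A raw '.bin' echo file of this kind can typically be read with NumPy. The exact sample dtype and frame layout of these files are assumptions here (int16 ADC samples are common for mmWave boards); the shape arguments must match how the data was recorded:

```python
import numpy as np

def load_radar_bin(path, n_frames, seq_len, n_channels, dtype=np.int16):
    """Hypothetical loader: reads raw ADC samples and reshapes them into
    the (frames, sequence, channels) layout the network input expects.
    dtype and axis order are assumptions, not taken from the repository."""
    raw = np.fromfile(path, dtype=dtype)
    return raw.reshape(n_frames, seq_len, n_channels)
```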
If you want to visualize, please set hook_flag = True, and the input and output of the Attention layer will be stored in the hook folder in NumPy format.
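Saving a layer's input and output is usually done with a PyTorch forward hook. The sketch below shows the general mechanism on a stand-alone attention layer; the folder name 'hook' is taken from the description above, while the file names and layer sizes are illustrative assumptions:

```python
import os
import numpy as np
import torch
import torch.nn as nn

HOOK_DIR = "hook"  # output folder named in the README
os.makedirs(HOOK_DIR, exist_ok=True)

def save_attention_io(module, inputs, output):
    # dump the attention layer's input and output tensors as .npy files
    np.save(os.path.join(HOOK_DIR, "attn_input.npy"),
            inputs[0].detach().cpu().numpy())
    # MultiheadAttention returns (attn_output, attn_weights)
    out = output[0] if isinstance(output, tuple) else output
    np.save(os.path.join(HOOK_DIR, "attn_output.npy"),
            out.detach().cpu().numpy())

attn = nn.MultiheadAttention(64, num_heads=4, batch_first=True)
handle = attn.register_forward_hook(save_attention_io)
x = torch.randn(1, 32, 64)  # (batch, frames, feature dim)
attn(x, x, x)
handle.remove()  # detach the hook once the tensors are captured
```

The saved arrays can then be loaded with `np.load` for plotting the attention layer's behavior.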