American Sign Language dataset for semantic communications

Citation Author(s):
Vasileios Kouvakis (InnoCube)
Lamprini Mitsiou (InnoCube)
Stylianos E. Trevlakis (InnoCube)
Alexandros-Apostolos A. Boulogeorgos (University of Western Macedonia)
Theodoros Tsiftsis (InnoCube)
Submitted by:
Stylianos E. Trevlakis
Last updated:
Sun, 01/12/2025 - 16:58
DOI:
10.21227/2c1z-8j21
Research Article Link:
License:

Abstract 

The dataset was developed as part of the NANCY project (https://nancy-project.eu/) to support computer vision tasks. It is specifically designed for sign language recognition, with a focus on representing joint and finger positions. The dataset comprises images of hands representing the American Sign Language (ASL) alphabet, excluding the letters "J" and "Z," which involve motion and therefore cannot be captured in static images. A key feature of the dataset is its color-coding: each finger is associated with a distinct color, which makes it easier to extract features and distinguish between fingers and offers a clear advantage over traditional grayscale datasets such as MNIST. Because the images are RGB, the recognition process benefits from richer feature discrimination, and high performance can be achieved with a relatively modest amount of training data. Although RGB images introduce additional complexity in data representation and storage requirements, the gains in accuracy and feature extraction make them a worthwhile choice. The dataset is well suited to gesture recognition, sign language interpretation, and other tasks requiring detailed analysis of joint and finger positions.
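As an illustration of how the color-coding could be exploited, the minimal sketch below isolates the pixels of a single finger by thresholding around its color code. The actual per-finger RGB values are not documented on this page, so the color, tolerance, and file path used here are hypothetical placeholders.

# Minimal sketch: isolating one finger by its color code.
# FINGER_COLOR and the sample path are hypothetical placeholders.
import numpy as np
from PIL import Image

FINGER_COLOR = np.array([255, 0, 0])   # assumption: index finger coded red
TOLERANCE = 40                          # per-channel tolerance in RGB space

def finger_mask(image_path: str) -> np.ndarray:
    """Return a boolean mask of pixels close to the finger's color code."""
    rgb = np.asarray(Image.open(image_path).convert("RGB"), dtype=np.int16)
    return np.all(np.abs(rgb - FINGER_COLOR) <= TOLERANCE, axis=-1)

if __name__ == "__main__":
    mask = finger_mask("train/A/sample_0001.png")   # hypothetical file name
    print(f"Pixels matched: {mask.sum()} of {mask.size}")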

The NANCY project has received funding from the Smart Networks and Services Joint Undertaking (SNS JU) under the European Union's Horizon Europe research and innovation programme under Grant Agreement No 101096456. 

Instructions: 

The dataset offers extensive possibilities for a variety of applications. It can be used to study and analyze the positional relationships of joints and fingers in the context of sign language recognition. It is also a useful resource for training artificial intelligence (AI) and machine learning (ML) models: using the color-coded fingers and detailed key-point annotations, researchers can develop algorithms for gesture detection, sign language interpretation, and other human-computer interaction tasks, as in the training sketch below. The RGB images further support models that investigate the benefits of color information for feature extraction and discrimination, potentially enabling more powerful recognition systems. Finally, the dataset can support research on data efficiency, since its design demonstrates how high performance can be achieved with relatively small training sets.
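The following is a minimal PyTorch training sketch, assuming the images are arranged in one folder per letter (as described in the next paragraph) so that torchvision's ImageFolder can read them. The directory paths, image size, network, and hyper-parameters are illustrative assumptions, not settings documented with the dataset.

# Minimal training sketch, assuming <root>/<letter>/<image>.png layout.
# Paths, image size, and hyper-parameters are illustrative assumptions.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize((64, 64)),
    transforms.ToTensor(),              # keeps all three RGB channels
])

train_set = datasets.ImageFolder("asl_dataset/train", transform=transform)  # hypothetical path
test_set = datasets.ImageFolder("asl_dataset/test", transform=transform)    # hypothetical path
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
test_loader = DataLoader(test_set, batch_size=32)

# Small CNN over RGB input; 24 outputs, one per static ASL letter.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 24),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

# Evaluate on the held-out test split.
correct, total = 0, 0
model.eval()
with torch.no_grad():
    for images, labels in test_loader:
        correct += (model(images).argmax(dim=1) == labels).sum().item()
        total += labels.numel()
print(f"Test accuracy: {correct / total:.3f}")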

The dataset is organized into 24 classes, one for each static letter of the American Sign Language (ASL) alphabet, with separate training and testing folders per letter. Each letter has 440 unique training images and 75 testing images; a layout check is sketched below.
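The short script below verifies this layout by counting the letter folders and the images in each split. The root directory name and the image file extensions are assumptions.

# Quick check of the expected layout: 24 letter folders per split,
# with 440 training and 75 testing images each. Folder names are assumptions.
from pathlib import Path

for split, expected in (("train", 440), ("test", 75)):
    letters = sorted(p for p in Path("asl_dataset", split).iterdir() if p.is_dir())  # hypothetical root
    print(f"{split}: {len(letters)} letter folders (expected 24)")
    for letter_dir in letters:
        n_images = sum(1 for f in letter_dir.iterdir()
                       if f.suffix.lower() in {".png", ".jpg", ".jpeg"})
        if n_images != expected:
            print(f"  {letter_dir.name}: {n_images} images (expected {expected})")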

Funding Agency: 
Smart Networks and Services Joint Undertaking (SNS JU) under the European Union's Horizon Europe research and innovation programme
Grant Number: 
101096456