The ASL Dataset for Real-Time Recognition and Integration with LLM Services

Authors

Abstract

This study investigates the impact of hand gesture recognition techniques on the efficiency of American Sign Language (ASL) interpretation, addressing a critical gap in the existing literature. The research seeks new insight into the challenges of automated sign language recognition, contributing to a deeper understanding of accessible communication for the Deaf and hard-of-hearing community. The study employs a quantitative approach, using a dataset of hand gesture images representing the static letters of the ASL alphabet; data were collected from multiple individuals to ensure diversity and analyzed with machine learning models to evaluate their effectiveness in recognizing ASL signs. The results show that the implemented machine learning models achieved high accuracy in recognizing hand gestures, indicating that person-specific variation does not significantly hinder performance. These findings provide evidence that the proposed dataset and methodology can improve the reliability of sign language recognition systems, with significant implications for the development of more inclusive communication technologies. This research offers a novel perspective on sign language recognition, extending the current understanding of gesture-based communication systems. The study's findings contribute to advances in accessibility technology and highlight directions for future research and practical applications in improving communication for the Deaf and hard-of-hearing community.
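
The abstract reports that person-specific variation did not significantly hinder recognition accuracy. A minimal sketch of how such a signer-independent evaluation might be set up appears below; the asl_dataset/<signer>/<letter>/*.png directory layout, the 64x64 grayscale resolution, and the RBF-kernel SVM are illustrative assumptions, not details taken from the paper. The key idea is to hold out entire signers when splitting, so the test score measures generalization across people rather than memorization of individual hands.

    # Minimal sketch (not the authors' code): signer-independent evaluation
    # of an ASL static-letter classifier. Paths, image size, and the SVM
    # choice are hypothetical assumptions for illustration.
    from pathlib import Path

    import numpy as np
    from PIL import Image
    from sklearn.model_selection import GroupShuffleSplit
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    DATA_DIR = Path("asl_dataset")  # assumed layout: asl_dataset/<signer>/<letter>/*.png
    IMG_SIZE = (64, 64)             # assumed resolution

    X, y, groups = [], [], []
    for img_path in DATA_DIR.glob("*/*/*.png"):
        signer, letter = img_path.parts[-3], img_path.parts[-2]
        img = Image.open(img_path).convert("L").resize(IMG_SIZE)
        X.append(np.asarray(img, dtype=np.float32).ravel() / 255.0)
        y.append(letter)
        groups.append(signer)
    X, y = np.array(X), np.array(y)

    # Hold out entire signers so the test set measures person-independent accuracy.
    split = GroupShuffleSplit(n_splits=1, test_size=0.2, random_state=0)
    train_idx, test_idx = next(split.split(X, y, groups))

    model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    model.fit(X[train_idx], y[train_idx])
    print(f"Signer-independent accuracy: {model.score(X[test_idx], y[test_idx]):.3f}")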

Published

2024-10-29

Section

Applied Informatics