Sign language detection and translation using Smart Glove

Maharjan, Sunila, Shrestha, Subeksha and Fernando, Sandra (2023) Sign language detection and translation using Smart Glove. In: International Conference on Innovative Computing and Communications (ICICC 2024), 16-17 February 2024, Shaheed Sukhdev College of Business Studies, University of Delhi, New Delhi. (In Press)

Sign Language Detection and Translation using Smart Glove.pdf - Accepted Version
Restricted to Repository staff only until 31 January 2026.


Abstract / Description

Communication through sign language is essential for deaf and hard-of-hearing people. However, the communication gap between sign language users and those who do not understand it hampers effective interaction and participation in many facets of society. To address this problem, this research project develops a smart glove-based system for real-time sign language detection and translation. The project's goals include designing and building a smart glove prototype equipped with an MPU-6050 inertial measurement unit and five flex sensors, which precisely capture the intricate finger and hand movements essential to sign language. To process the sensor input, a Recurrent Neural Network (RNN) with Long Short-Term Memory (LSTM) units is developed, enabling the recognition and classification of sign language gestures. The system initially focuses on the American Sign Language (ASL) gestures corresponding to the English letters A through E and the numerals 0 through 9. A significant innovation is the real-time translation of sign language motions into both text and spoken language: a text-to-speech engine is integrated into the system, enabling simultaneous textual and audio output. Gestures made by users are rendered as on-screen text instantly, and the system simultaneously converts that text into speech, facilitating communication with both sign language and non-sign language users. The project concentrates on the detection and translation of individual letters and numbers in sign language; recognition of complete sign language sentences and the differences between sign languages are beyond its scope. Due to time and budget constraints, evaluation focuses on accuracy and performance measures rather than an in-depth user study.
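The pipeline the abstract describes — per-time-step sensor readings from five flex sensors and the MPU-6050 fed through an LSTM, with a softmax over the gesture classes — can be sketched as follows. This is a minimal illustrative forward pass only: the channel count, hidden size, sequence length, and weights are assumptions for the sketch, not the authors' trained model.

```python
import numpy as np

# Hypothetical sketch of the glove pipeline: 11 channels per time step
# (5 flex sensors + 3-axis accelerometer + 3-axis gyroscope from the
# MPU-6050), classified into 15 gesture classes (letters A-E, digits 0-9).
# All sizes and weights here are illustrative, not the paper's values.
N_FEATURES = 11   # 5 flex + 6 IMU channels (assumed)
HIDDEN = 32       # LSTM hidden units (assumed)
N_CLASSES = 15    # A-E plus 0-9

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Randomly initialised LSTM parameters, standing in for trained weights.
W = rng.standard_normal((4 * HIDDEN, N_FEATURES + HIDDEN)) * 0.1
b = np.zeros(4 * HIDDEN)
W_out = rng.standard_normal((N_CLASSES, HIDDEN)) * 0.1
b_out = np.zeros(N_CLASSES)

def lstm_classify(sequence):
    """Run one gesture (timesteps x N_FEATURES) through an LSTM cell,
    then apply a softmax over the final hidden state."""
    h = np.zeros(HIDDEN)
    c = np.zeros(HIDDEN)
    for x_t in sequence:
        # All four gates in one matrix multiply, then split.
        z = W @ np.concatenate([x_t, h]) + b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)   # update cell state
        h = o * np.tanh(c)           # update hidden state
    logits = W_out @ h + b_out
    e = np.exp(logits - logits.max())
    return e / e.sum()               # class probabilities

# One simulated gesture: 40 time steps of sensor readings.
probs = lstm_classify(rng.standard_normal((40, N_FEATURES)))
print("predicted class:", probs.argmax())
```

In a deployed system the argmax class would be mapped to its letter or digit, shown on screen, and passed to the text-to-speech engine; in practice the LSTM itself would be built and trained with a framework such as TensorFlow, as the paper's keywords suggest.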

Item Type: Conference or Workshop Item (Paper)
Uncontrolled Keywords: Artificial Intelligence, machine learning algorithms, RNN, LSTM, Arduino, python, TensorFlow, sensors, ASL, Assistive Technology
Subjects: 600 Technology
Department: School of Computing and Digital Media
Depositing User: Sandra Fernando
Date Deposited: 03 Jan 2024 12:31
Last Modified: 01 Mar 2024 12:32
URI: https://repository.londonmet.ac.uk/id/eprint/9015
