Sign Language Translation using Hand Gesture Detection
Neelam Shrivastava1, Abhishek Jain2, Kopal Garg3, Abhishekh Pratap Singh4, Abhishek Sharma5

1Prof. (Dr.) Neelam Shrivastava, Professor, Department of Computer Science and Engineering, Meerut Institute of Engineering and Technology, Meerut, India.
2Abhishek Jain, Student, Department of Computer Science and Engineering, Meerut Institute of Engineering and Technology, Meerut, India.
3Kopal Garg, Student, Department of Computer Science and Engineering, Meerut Institute of Engineering and Technology, Meerut, India.
4Abhishekh Pratap Singh, Student, Department of Computer Science and Engineering, Meerut Institute of Engineering and Technology, Meerut, India.
5Abhishek Sharma, Student, Department of Computer Science and Engineering, Meerut Institute of Engineering and Technology, Meerut, India.

Manuscript received on May 25, 2020. | Revised Manuscript received on June 29, 2020. | Manuscript published on July 30, 2020. | PP: 509-512 | Volume-9 Issue-2, July 2020. | Retrieval Number: B3620079220/2020©BEIESP | DOI: 10.35940/ijrte.B3620.079220
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: This paper presents a sign-to-speech (voice) converter module for the automatic conversion of American Sign Language (ASL) gestures into English speech and text. It narrows the communication gap between speech-impaired people and others: the thoughts or views that a speech-impaired person expresses through ASL gestures can be understood even when a large communication gap would otherwise prevent it. The module also acts as a translator for people who do not understand sign language, enabling communication in the natural way of speaking. The proposed module is an interactive application developed using Python and its advanced libraries. It uses the system's built-in camera to capture images, analyses those images to predict the meaning of each gesture, and delivers the output as text on the screen and as speech through the system's speaker, which makes the module highly cost-effective. The module recognizes one-handed ASL alphabet gestures (A-Z) with high consistency and fairly high precision and accuracy.
Keywords: ASL, Sign Language, Gesture, Image, Communication.
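
To make the capture-classify-speak pipeline described in the abstract concrete, a minimal Python sketch is given below. The paper does not publish its source code, so the region-of-interest coordinates, the predict_letter placeholder, and the choice of OpenCV for image capture and pyttsx3 for speech output are assumptions made purely for illustration; any trained ASL alphabet classifier could be substituted for the placeholder.

# Minimal sketch of the camera -> gesture prediction -> text/speech pipeline.
# Assumptions: OpenCV (cv2) for capture, pyttsx3 for offline text-to-speech,
# and a hypothetical predict_letter() standing in for the trained classifier.
import cv2          # image capture and preprocessing
import pyttsx3      # text-to-speech output through the system speaker

def predict_letter(gray_roi):
    """Hypothetical placeholder for the ASL alphabet classifier (A-Z).
    A trained model would be called here instead of returning a constant."""
    return "A"

def main():
    cam = cv2.VideoCapture(0)      # built-in camera of the system
    engine = pyttsx3.init()        # speech synthesiser
    try:
        while True:
            ok, frame = cam.read()
            if not ok:
                break
            # Assume the signing hand occupies a fixed region of interest.
            roi = frame[100:400, 100:400]
            gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
            gray = cv2.resize(gray, (64, 64))
            letter = predict_letter(gray)
            # Show the predicted letter on screen and speak it aloud.
            cv2.putText(frame, letter, (30, 60),
                        cv2.FONT_HERSHEY_SIMPLEX, 2, (0, 255, 0), 3)
            cv2.imshow("ASL to speech", frame)
            engine.say(letter)
            engine.runAndWait()
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    finally:
        cam.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    main()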