An Android application that uses gesture recognition to identify the alphabet of American Sign Language
With rising demand for disability solutions, sign language recognition has become an important research problem in the field of computer vision. Current sign language detection systems, however, lack basic characteristics such as accessibility and affordability, which are necessary for people with speech disabilities to interact with their everyday environment. This project focuses on providing a portable and affordable solution for understanding sign language by developing an Android application. The report summarizes the basic theory and the steps involved in developing this Android application, which uses gesture recognition to identify the alphabet of American Sign Language. The project uses image processing techniques to separate the hand from the rest of the background and then applies pattern recognition techniques for gesture recognition. A comprehensive summary of the results obtained from the various tests performed is also provided to illustrate the efficacy of the application.
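The segmentation step mentioned above (separating the hand from the background) is commonly implemented as a skin-color threshold. The sketch below illustrates one such approach in Python using only the standard library; the HSV bounds and the helper names (`is_skin`, `hand_mask`) are illustrative assumptions, not the actual parameters or methods used in this project.

```python
import colorsys

def is_skin(r, g, b):
    """Classify a pixel (0-255 RGB) as skin-colored via a simple HSV threshold.
    These bounds are illustrative assumptions, not the report's parameters."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    # Skin tones tend to cluster at low hues with moderate saturation.
    return h <= 50 / 360.0 and 0.15 <= s <= 0.70 and v >= 0.35

def hand_mask(pixels):
    """Convert an image (list of rows of (r, g, b) tuples) into a binary
    mask that marks candidate hand pixels as 1 and background as 0."""
    return [[1 if is_skin(*px) else 0 for px in row] for row in pixels]

# Tiny 2x2 example: one skin-like pixel, three background pixels.
image = [[(220, 170, 140), (0, 0, 255)],
         [(10, 10, 10), (255, 255, 255)]]
mask = hand_mask(image)  # -> [[1, 0], [0, 0]]
```

In a real pipeline the resulting binary mask would typically be cleaned up with morphological operations before the gesture (pattern recognition) stage classifies the hand shape.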