Human Activity and Sign Language Recognition for Non-Deaf Visually Impaired People

Bharathi G P, Pravin R, Jeevanantham G

Published in International Journal of Advanced Research in Computer Science Engineering and Information Technology

ISSN: 2321-3337          Impact Factor: 1.521         Volume: 6         Issue: 3         Year: 02 April, 2021         Pages: 1515-1518


Abstract

Human activity recognition (HAR) is an active area of research in computer vision, with applications in safety surveillance, health care, and assistive technology. In this project, HAR is applied where it is most crucial: assisting blind people. HAR aims to recognize activities from a series of observations of people's actions and to report those actions to a blind user through earphones. The system also recognizes the sign language of deaf people and translates it into spoken language. Integrating human action recognition with sign language recognition (SLR) therefore assists visually impaired people. Blind people may sense someone moving in front of them, but only within a limited distance; with HAR they can detect nearby people and their actions precisely. The main objective is to recognize and translate human activity. The framework also provides a helping hand for the speech-impaired to communicate with the blind and with the rest of the world, eliminating the intermediary who usually acts as a translator. The integrated HAR and SLR interface is named HASLR.
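As a rough illustration of the activity-recognition step described above, the sketch below classifies fixed-length accelerometer windows with a nearest-centroid rule over simple statistical features. This is a minimal, self-contained example; the paper's actual model, sensors, feature set, and all function names here are assumptions, not the authors' implementation.

```python
# Minimal window-based HAR sketch: classify accelerometer windows by
# comparing simple statistics to per-activity centroids.
import math
from statistics import mean, stdev

def features(window):
    """Mean and standard deviation of accelerometer magnitudes."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    return (mean(mags), stdev(mags))

def train_centroids(labelled_windows):
    """Average the feature vector of every window per activity label."""
    sums = {}
    for label, window in labelled_windows:
        f = features(window)
        s = sums.setdefault(label, [0.0, 0.0, 0])
        s[0] += f[0]; s[1] += f[1]; s[2] += 1
    return {lab: (s[0] / s[2], s[1] / s[2]) for lab, s in sums.items()}

def classify(window, centroids):
    """Return the label whose centroid is nearest in feature space."""
    f = features(window)
    return min(centroids, key=lambda lab:
               (f[0] - centroids[lab][0]) ** 2 + (f[1] - centroids[lab][1]) ** 2)

# Toy data: 'still' windows have low variance, 'walking' high variance.
still = [(0.0, 0.0, 9.8)] * 32
walking = [(0.5 * (-1) ** i, 0.3 * (-1) ** i, 9.8 + (i % 4)) for i in range(32)]
model = train_centroids([("still", still), ("walking", walking)])
print(classify([(0.1, 0.0, 9.8)] * 32, model))  # prints "still"
```

In a complete system, the predicted label would then be converted to speech for the earphone output; the cited works suggest that CNN or LSTM models over raw sensor streams would replace this hand-crafted classifier in practice.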

Keywords

sign language recognition, smartphone, human activity recognition, integrated interface
