Hand Gesture Recognition to Translate into Voice and Text

M. Praveen Kumar, M. Rajadurai, K. Saidinesh, B. Aravindan

Published in International Journal of Advanced Research in Electronics, Communication & Instrumentation Engineering and Development

ISSN: 2347-7210 | Impact Factor: 1.9 | Volume: 3 | Issue: 1 | Date: 06 April 2017 | Pages: 488-493


Abstract

Gesture-XPLAIN addresses the limited communication options of people who know only sign language, enabling them to converse naturally with the rest of the world by transforming their sign-language gestures into a form of verbal communication. The goal is to create a smart-glove system, paired with a mobile device, that continuously recognizes sign-language gestures and translates them into spoken words. The glove is fitted with flex sensors and gyroscope, magnetometer, and accelerometer sensors to capture the movements of the hand and fingers. A low-power ARM Cortex-M4 microcontroller recognizes each movement by acquiring and processing the sensor data and running a sensor fusion algorithm. The system translates the recognized sign into meaningful text, which is then transferred over Bluetooth to a smartphone app that converts it into speech.
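The recognition step described above can be sketched as a nearest-neighbour match of flex-sensor readings against stored gesture templates. The sensor values, distance threshold, and gesture vocabulary below are illustrative assumptions only; the paper's actual system also fuses gyroscope, magnetometer, and accelerometer data on the microcontroller.

```python
# Minimal sketch of the glove's gesture-recognition step.
# Hypothetical data: each reading is five flex-sensor bend values
# (0.0 = finger straight, 1.0 = finger fully bent).

GESTURE_TEMPLATES = {
    "hello":     [0.1, 0.1, 0.1, 0.1, 0.1],   # open hand
    "yes":       [0.9, 0.9, 0.9, 0.9, 0.9],   # closed fist
    "thank you": [0.1, 0.9, 0.9, 0.9, 0.1],   # thumb and pinky extended
}

def classify(reading, max_distance=0.5):
    """Return the template word closest to the reading (Euclidean
    distance), or None if no template is within max_distance."""
    best_word, best_dist = None, float("inf")
    for word, template in GESTURE_TEMPLATES.items():
        dist = sum((r - t) ** 2 for r, t in zip(reading, template)) ** 0.5
        if dist < best_dist:
            best_word, best_dist = word, dist
    return best_word if best_dist <= max_distance else None

# A fist-like reading maps to "yes"; in the described system the
# resulting text would then be sent over Bluetooth to the phone
# app for text-to-speech.
print(classify([0.85, 0.9, 0.95, 0.9, 0.88]))  # -> yes
```

In the actual glove, a template lookup like this would run after the sensor fusion algorithm has produced a stable orientation-plus-bend feature vector, rather than on raw flex readings alone.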

Keywords

gesture, flex sensor, accelerometer, magnetometer

References

[1] Y. Wu and T. S. Huang, "Vision-based gesture recognition: A review," Lecture Notes in Computer Science, vol. 1739, pp. 103–115, 1999.

[2] D. Sturman and D. Zeltzer, "A survey of glove-based input," IEEE Computer Graphics and Applications, 1994.

[3] H. Teleb and G. Chang, "Data glove integration with 3D virtual environments," ICSAI 2012, 2012.

[4] H. Zhou, H. Hu, N. D. Harris, and J. Hammerton, "Applications of wearable inertial sensors in estimation of upper limb movements," Biomedical Signal Processing and Control, vol. 1, pp. 22–32, 2006.

[5] Z. Lu, X. Chen, Q. Li, X. Zhang, and P. Zhou, "A hand gesture recognition framework and wearable gesture-based interaction prototype for mobile devices," IEEE Trans. Human-Machine Systems, vol. 44, no. 2, pp. 293–299, April 2014.