Auditory and Tactile Information Perception

Franklin Jessie Theresa, Aswathy Sunil, Deepika Ashok, Devi R.

Published in International Journal of Advanced Research in Computer Science Engineering and Information Technology

ISSN: 2321-3337 | Impact Factor: 1.521 | Volume: 6 | Issue: 3 | Year: 31 March 2017 | Pages: 1320-1333


Abstract

The proposed system is an interactive aid that conveys visual information to blind and visually impaired people through auditory and tactile channels. Users traverse a two-dimensional layout displayed on a touch screen with their fingers while simultaneously receiving auditory feedback. Sound is the primary notification channel and is used for object localization, identification, and shape; touch is used for kinesthetic feedback and pointing. Raised-dot tactile patterns can also be added to the system. A head-related transfer function provides sound directionality, while sound intensity renders proximity. The main objective is to convey an object's shape, but this becomes more difficult in complex scenery, so each scene component is given a distinct tapping sound, acting as a "virtual cane". Evaluation shows that raised-dot patterns give the best static shape rendition, while spatial sound is better suited to dynamic display, and the proposed configurations outperform those currently in practice. The approach is also expected to carry over to other applications where visual perception is not possible.
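As a rough illustration only, and not the authors' implementation, the Python sketch below simulates the "virtual cane" interaction described above: the finger position on a 2D layout selects the nearest object, which responds with its own distinct tapping sound, while simple left/right gains stand in for a full head-related transfer function and loudness stands in for proximity. All names here (SceneObject, spatial_gains, the play callback) are hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    x: float          # normalized screen coordinates in [0, 1]
    y: float
    tap_sound: str    # identifier of this object's distinct tap sample

def nearest_object(objects, fx, fy):
    """Return the scene object closest to the finger position (fx, fy)."""
    return min(objects, key=lambda o: math.hypot(o.x - fx, o.y - fy))

def spatial_gains(obj, fx, fy, max_dist=0.5):
    """Crude left/right gains: pan by horizontal offset, attenuate by distance."""
    dist = math.hypot(obj.x - fx, obj.y - fy)
    loudness = max(0.0, 1.0 - dist / max_dist)      # closer => louder
    pan = max(-1.0, min(1.0, (obj.x - fx) * 2.0))   # -1 = left, +1 = right
    left = loudness * (1.0 - pan) / 2.0
    right = loudness * (1.0 + pan) / 2.0
    return left, right

def on_finger_moved(objects, fx, fy, play):
    """Called on each touch event; taps the nearest object's sound."""
    obj = nearest_object(objects, fx, fy)
    left, right = spatial_gains(obj, fx, fy)
    if left > 0.0 or right > 0.0:
        play(obj.tap_sound, left, right)   # play() is an assumed audio hook

# Example: two objects; printing stands in for actual audio output.
scene = [SceneObject("door", 0.2, 0.8, "tap_wood"),
         SceneObject("table", 0.7, 0.3, "tap_metal")]
on_finger_moved(scene, 0.65, 0.35,
                lambda s, l, r: print(f"{s}: L={l:.2f} R={r:.2f}"))
```

In a real system the stereo gains would be replaced by HRTF filtering of the tap sample, but the event loop, nearest-object lookup, and per-object sound assignment would look broadly similar.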

Keywords

Sensory substitution, kinesthetic feedback, audio-tactile representation, spatial sound
