Full Text Available (577.74 KB)
This is an Open Access publication published under CSC-OpenAccess Policy.
AudiNect: An Aid for the Autonomous Navigation of Visually Impaired People, Based On Virtual Interface
Mario Salerno, Marco Re, Alessandro Cristini, Gianluca Susi, Marco Bertola, Emiliano Daddario, Francesca Capobianco
Pages 25-33    |    Revised 15-01-2013    |    Published 28-02-2013
Volume 4, Issue 1    |    Publication Date: April 2013
KEYWORDS
Visually Impaired, Blind, Autonomous Navigation, Virtual Interfaces, Kinect, AudiNect
ABSTRACT
This paper presents the realization of a new kind of autonomous navigation aid. The prototype, called AudiNect, is mainly developed as an aid for visually impaired people, though a wider range of applications is also possible. The AudiNect prototype is based on the Kinect device for Xbox 360. From the Kinect output data, appropriate acoustic feedback is generated, so that useful depth information on the 3D frontal scene can be easily extracted and conveyed. To this purpose, a number of basic problems related to the orientation and movement of visually impaired people have been analyzed, through both actual experimentation and a careful review of the literature in the field. Quite satisfactory results have been obtained and discussed, on the basis of tests on blindfolded sighted individuals.
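The abstract's core idea, generating acoustic feedback from Kinect depth data, can be illustrated with a minimal sketch. The snippet below divides a frontal depth frame into vertical zones and maps the nearest obstacle in each zone to a stereo cue (closer obstacle, louder sound; zone position, stereo pan). All names, the zone count, and the 4 m range are illustrative assumptions, not details of the actual AudiNect implementation.

```python
# Hypothetical depth-to-audio mapping sketch (not the AudiNect code):
# each vertical zone of the depth frame yields one (pan, amplitude) cue.

def depth_to_cues(depth_frame, zones=3, max_range_mm=4000):
    """Map a 2D depth frame (mm per pixel, 0 = no reading) to per-zone cues.

    Returns a list of (pan, amplitude) pairs, one per vertical zone:
    pan in [-1, 1] (left to right), amplitude in [0, 1] (far to near).
    """
    height = len(depth_frame)
    width = len(depth_frame[0])
    zone_width = width // zones
    cues = []
    for z in range(zones):
        cols = range(z * zone_width, (z + 1) * zone_width)
        # Nearest valid reading in this zone; 0 means no depth data.
        readings = [depth_frame[r][c] for r in range(height) for c in cols
                    if depth_frame[r][c] > 0]
        nearest = min(readings) if readings else max_range_mm
        # Louder when the obstacle is closer; pan at the zone's center.
        amplitude = max(0.0, 1.0 - nearest / max_range_mm)
        pan = -1.0 + 2.0 * (z + 0.5) / zones
        cues.append((pan, round(amplitude, 3)))
    return cues

# Example: a single obstacle 1 m away in the leftmost zone.
frame = [[0, 0, 0, 0, 0, 0] for _ in range(4)]
frame[2][0] = 1000  # 1000 mm reading
print(depth_to_cues(frame, zones=3))
```

In a real system each cue would drive a synthesizer voice; here the left zone produces a loud, left-panned cue while the empty zones stay silent.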
CITED BY (3)  
1 Takizawa, H., Yamaguchi, S., Aoyagi, M., Ezaki, N., & Mizuno, S. (2015). Kinect cane: an assistive system for the visually impaired based on the concept of object recognition aid. Personal and Ubiquitous Computing, 1-11.
2 Visittrakoon, P., Nhewbang, W., Jenjob, K., Waranusast, R., & Pattanathaburt, P. nuVision: A Mobility Aid for the Navigation of Visually Impaired People using Depth Information.
3 Susi, G., Cristini, A., Salerno, M., & Daddario, E. (2014, November). A low-cost indoor and outdoor terrestrial autonomous navigation model. In Telecommunications Forum Telfor (TELFOR), 2014 22nd (pp. 675-678). IEEE.
Professor Mario Salerno
Faculty of Engineering, Department of Electronics, University of Rome “Tor Vergata”, Rome, 00133, Italy
salerno@uniroma2.it
Associate Professor Marco Re
Faculty of Engineering, Department of Electronics, University of Rome “Tor Vergata”, Rome, 00133, Italy
Mr. Alessandro Cristini
Faculty of Engineering, Department of Electronics, University of Rome “Tor Vergata”, Rome, 00133, Italy
Mr. Gianluca Susi
Faculty of Engineering, Department of Electronics, University of Rome “Tor Vergata”, Rome, 00133, Italy
Mr. Marco Bertola
Faculty of Engineering, Master in Sound Engineering, University of Rome “Tor Vergata”, Rome, 00133, Italy
Mr. Emiliano Daddario
Faculty of Engineering, Master in Sound Engineering, University of Rome “Tor Vergata”, Rome, 00133, Italy
Ms. Francesca Capobianco
Faculty of Engineering, Department of Electronics, University of Rome “Tor Vergata”, Rome, 00133, Italy