Survey on Smart Devices for Visually Impaired People

Ms. Deepthi Mohandas, Department of Computer Science Engineering, Adi Shankara Institute of Engineering and Technology, Mahatma Gandhi University, Kottayam.
Mrs. Divya D, Department of Computer Science Engineering, Adi Shankara Institute of Engineering and Technology, Mahatma Gandhi University, Kottayam.

Abstract – The World Health Organization (WHO) reports that 90% of the information the human brain receives is transmitted through sight alone.
There are 285 million blind or visually impaired people across the world. Technology must serve and continue to advance the welfare of these individuals. The objective of this paper is to compare technological innovations in the field of smart devices for visually impaired people.

I. Introduction
There is a lot of work and research being done to find ways to improve life for partially sighted and blind people. Reading and recognition devices could make smartphones, tablets, and smart glasses into indispensable aids for the visually impaired. High-performance devices are needed to improve the quality of life of visually impaired (VI) people.
With the advancement of modern technologies, different types of devices are available for the mobility of the visually impaired and blind. These devices are known as Electronic Travel Aids (ETAs) [1]. ETAs have sensors which alert the blind in advance about danger with sounds or vibrations. With the introduction of such ETAs, there is an increase in blind users' safety and self-confidence. This paper presents a rough analysis that can be adopted for the design of such devices, examining the most interesting technological developments that could help the visually impaired and the blind.

II. Methodology

A. BlinDar: An Invisible Eye for the Blind
Some of the existing visual aid devices include the K-Sonar [2], UltraCane [3], Palmsonar [4], iSonic cane [5], Laser cane [6], and Virtual Eye (using image processing). These devices are not very user friendly or easy to handle.
Laser canes and Virtual Eye aids in particular are very costly. This paper proposes a device called 'BlinDar', which is an ETA. It aims at improving the life of the blind and helps them navigate on their own without depending on someone else. Unlike before, blind people will now be able to lead life as normally sighted people do. Apart from that, family members of the blind person will also be able to track him at any moment and get his exact location from anywhere. BlinDar is IoT based and is very cost effective, efficient, and user friendly. The device mainly consists of an advanced blind stick built around an Arduino Mega2560 microcontroller.
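As a rough illustration of how such a stick's alert loop could work, the sketch below maps an ultrasonic distance reading to a vibration-motor PWM duty. This is not code from the paper: the 4 m sensing range is taken from the feature list, while the linear distance-to-intensity mapping and the 8-bit duty range are assumptions.

```cpp
#include <cassert>
#include <cstdint>

// Illustrative sketch: the closer the obstacle, the harder the
// wrist-band motor vibrates. 400 cm matches the stick's stated
// near-range sensing limit of 4 m.
const int kMaxRangeCm = 400;

// Returns an 8-bit PWM duty (0 = motor off, 255 = strongest vibration).
uint8_t vibrationDuty(int distanceCm) {
    if (distanceCm >= kMaxRangeCm) return 0;   // nothing within range
    if (distanceCm <= 0) return 255;           // touching: maximum alert
    // Closer obstacle -> larger duty cycle, linearly.
    return static_cast<uint8_t>(255 * (kMaxRangeCm - distanceCm) / kMaxRangeCm);
}
```

On an Arduino Mega2560 the returned duty could be fed directly to `analogWrite` on the pin driving the wrist-band vibrator.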
The proposed system provides the following features:
- An inexpensive navigation smart stick, with a total cost not exceeding $135.
- Fast response of the obstacle sensors in the near range, up to 4 m.
- Lightweight components integrated on the stick, making it user friendly with low power consumption.
- An overall circuit that is not complex and uses the basics of C/C++/Java to program the microcontroller.
- A smart wrist band introduced along with the stick, for cases where the blind user strolls in a noisy place and cannot hear the alert sound from the system. A vibrator attached to the band alerts him to an obstacle by vibrating at different intensities.
- An RF Tx/Rx module for finding the stick in close premises such as the home. If the stick is misplaced, pressing a key on the wrist band activates the buzzer on the stick, which helps the person locate it by the intensity of the sound.
- A GPS module for sharing the location to the cloud; the location is updated every 20 ms.
- An MQ2 gas sensor for detecting fire in the path or at home.

Figure 1: BlinDar smart stick

B. Virtual Eye
Continuously feeding the visually impaired user with the necessary information about his/her surroundings is critical for him/her to "feel" and "sense" things nearby and move around without bumping into objects and losing orientation.
The wearable system presented in this section is designed to provide users with such assistance, applying embedded microprocessors with ultrasonic-range sensors to detect objects around the user. The system consists of two major units: the embedded wearable sensor and the smart phone. Figure 2 shows the system scheme. The embedded wearable sensor unit has four major modules: power, central processing unit (CPU), sensor, and communication. The power module relies on one 9-V DC battery to provide power to the wearable unit. The embedded microprocessor (CPU) selected for developing this prototype was the 32-bit mbed NXP LPC1768 running at 96 MHz from ARM Ltd., due to its low price, small size, low power consumption, and, most of all, its relatively high computing performance. This microcontroller is intended in particular for prototyping all sorts of devices, and offers the flexibility of plenty of peripheral interfaces and flash memory.
The microprocessor includes 512-kB flash, 32-kB RAM, and interfaces including built-in Ethernet, Universal Serial Bus (USB) host and device, Controller Area Network, Serial Peripheral Interface, Inter-Integrated Circuit bus, analog-to-digital converter, digital-to-analog converter, pulse-width modulation, and other input/output interfaces. Most pins can also be used as digital-in and digital-out interfaces, which allows for easy addition of various sensors. The online C++ software-development and compiling environment allows the development team to work on the project anywhere and anytime. The communication module is a Bluetooth chip (or a Wi-Fi chip) that allows the wearable unit to talk to the smart phone to exchange data and instructions. The LPC1768 microprocessor offers flexible hardware interfaces to easily hook up various low-power-consumption Bluetooth and Wi-Fi chips available on the market, such as the RN42-XV Bluetooth and the XBee Wi-Fi module.
The sensor module consists of various smart sensors that can be embedded in the system to obtain data. The ultrasound sensor detects objects and measures object distance from the unit. This is similar to the sensors widely installed on vehicles to prevent collisions.
In this prototype, the Ultrasonic Range Finder XL-MaxSonar-EZ4 was used. Other sensors that are planned but not yet integrated include color sensors to detect traffic lights, motion sensors to identify moving objects in front of the visually impaired user, and a camera that can be used for image processing and identifying objects of interest. The smart phone receives data obtained from the wearable smart-sensor unit, processes the data in real time, and provides the visually impaired user with critical warning signals and other information, in the form of voice and/or vibrating signals, to assist him/her in navigating the surrounding area. For instance, when the sensor detects an object in front of the user within 3 m, the smart phone sends beeps to the user at a higher frequency as the object gets closer. At the same time, the smart phone utilizes its GPS and map function to "talk" to the visually impaired user, providing his/her moving orientation and location (such as street and building names). The wearable smart-sensor unit can be clipped on a pocket, belt, or hat, facing the user's moving direction so that the ultrasound sensor can "see" the objects in front of the user. The user should also carry the smart phone appropriately so that he/she can receive the correct moving-orientation information. These two units talk to each other via Bluetooth; therefore, there is no wired connection between them. Figure 2 provides the details of the circuit connection among the LPC1768 CPU, the MaxSonar ultrasonic-range sensor, and the RN42-XV Bluetooth chip. The ultrasonic sensor's analog output (PIN-AN) is connected to one of the CPU analog inputs (PIN-19) for the CPU to read the distance data of a detected object.
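As a rough illustration of the signal chain just described, the sketch below converts a normalized ADC reading from the sensor's analog pin into a distance, then maps that distance to a beep repetition rate that rises as the object gets closer within 3 m. This is not code from the paper: the Vcc/1024-per-centimeter analog scaling (typical of the XL-MaxSonar series) and the linear 1–10 beeps-per-second mapping are assumptions.

```cpp
#include <cassert>

// The mbed AnalogIn API yields a normalized reading in [0.0, 1.0]
// relative to Vcc; the XL-MaxSonar analog output is roughly
// (Vcc / 1024) per centimeter, so the normalized value maps back
// to centimeters as follows (scaling is an assumption here):
int adcToCentimeters(float normalizedAdc) {
    return static_cast<int>(normalizedAdc * 1024.0f + 0.5f);
}

// Beeps per second for the phone's audio alert: silent beyond the
// 3 m warning threshold, rising linearly to 10 beeps/s at contact.
int beepsPerSecond(int cm) {
    const int kAlertCm = 300;        // 3 m threshold from the text
    if (cm >= kAlertCm) return 0;    // no object close enough
    if (cm < 0) cm = 0;
    return 1 + 9 * (kAlertCm - cm) / kAlertCm;  // 1..10 beeps/s
}
```

On the real device the reading would come over Bluetooth from the LPC1768 rather than from a local ADC, but the mapping logic is the same.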
Figure 2: Virtual eye system structure

C. Smart Guiding Glasses
The proposed system includes a depth camera for acquiring the depth information of the surroundings; an ultrasonic rangefinder consisting of an ultrasonic sensor and an MCU (microcontroller unit) for measuring the obstacle distance; an embedded CPU (Central Processing Unit) board acting as the main processing module, which performs operations such as depth-image processing, data fusion, AR rendering, and guiding-sound synthesis; a pair of AR glasses to display the visual enhancement information; and an earphone to play the guiding sound.
Depth information is acquired with the depth sensor (the initial prototype only uses the depth camera of an RGB-D camera, which includes a depth sensor and an RGB camera). The depth sensor is composed of an infrared laser source that projects non-visible light with a coded pattern, combined with a monochromatic CMOS (Complementary Metal Oxide Semiconductor) image sensor that captures the reflected light. The algorithm that deciphers the reflected-light coding generates the depth information representing the scene. In this work, the depth information is acquired by mounting the depth sensor onto the glasses with an approximate inclination of 30°. This way, considering the height of the camera from the ground to be about 1.65 m and the depth camera's working range to be limited to about 0.4 m to 4 m, the valid distance in the field of view is about 2.692 m, starting about 0.952 m in front of the user.

Figure 3: Smart guiding glasses

III. Conclusion
In this paper, a detailed explanation of the BlinDar smart stick, Virtual Eye, and smart guiding glasses has been given. The life of the visually impaired will become much easier and more independent.
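The ground coverage quoted for the glasses can be sanity-checked with simple trigonometry. The sketch below is an illustration, not the paper's derivation: the paper's 0.952 m / 2.692 m figures also depend on the camera's vertical field of view, which is not stated here, so only the central optical axis is considered.

```cpp
#include <cassert>
#include <cmath>

const double kPi = 3.14159265358979323846;

// Nearest ground point covered along the optical axis: the minimum
// valid depth reading, projected onto the horizontal plane.
double axisCoverageNear(double tiltDeg, double minRange) {
    return minRange * std::cos(tiltDeg * kPi / 180.0);
}

// Farthest ground point along the optical axis for a camera at
// height h, tilted tiltDeg below horizontal: the axis meets the
// ground at slant distance h / sin(tilt), capped by maxRange.
double axisCoverageFar(double h, double tiltDeg, double maxRange) {
    double t = tiltDeg * kPi / 180.0;
    double slant = std::min(h / std::sin(t), maxRange);
    return slant * std::cos(t);
}
```

For h = 1.65 m, a 30° tilt, and a 0.4–4 m depth range, this axis-only estimate gives roughly 0.35 m to 2.86 m of ground coverage, the same order of magnitude as the paper's figures once the full field of view is accounted for.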
A major drawback for the visually impaired is that they often deprive themselves of what they deserve. BlinDar helps by giving them confidence and independence. This product uses IoT, one of the most in-demand technologies in the current scenario. It is user friendly, easily adaptable, and has multipurpose functionality. Future research will primarily focus on developing the application system outside of a controlled laboratory environment. A new focus will be placed on designing the system to be particularly robust in everyday user interactions with the real world, along with real-time, frame-by-frame video processing for object recognition.
This current application, combined with future work, will allow the visually impaired to gain a novel sense of the world around them.

References
[1] Ayat A. Nada, Mahmoud A. Fakhr, Ahmed F. Seddik, "Assistive infrared sensor based smart stick for blind people," 2015 Science and Information Conference (SAI).
[2] Kim S. Y. & Cho K., "Usability and design guidelines of smart canes for users with visual impairments," International Journal of Design, 7(1), pp. 99-110, 2013.
[3] B. Hoyle and D. Waters, "Mobility AT: The Batcane (UltraCane)," in Assistive Technology for Visually Impaired and Blind People, M. A. Hersh and M. A. Johnson, Eds., Springer London, pp. 209-229.
[4] Fernandes H., Costa P., Paredes H., Filipe V. & Barroso J., "Integrating Computer Vision Object Recognition with Location Based Services for the Blind," Universal Access in Human-Computer Interaction: Aging and Assistive Environments, Springer International Publishing, pp. 493-500, 2014.
[5] Ashwini B. Yadav, Leena Bindal, Namhakumar V. U., Namitha K., Harsha H., "Design and Development of Smart Assistive Device for Visually Impaired People," IEEE RTEICT-2016.
[6] Kim L., Park S., Lee S. & Ha S., "An electronic traveler aid for the blind using multiple range sensors," IEICE Electronics Express, Vol. 6, No. 11, pp. 794-799, 2009.