Survey on Smart Devices for Visually Impaired People


Ms. Deepthi Mohandas

Mrs. Divya D

Department of Computer Science Engineering,
Adi Shankara Institute of Engineering and Technology,
Mahatma Gandhi University, Kottayam.




                Abstract – The World Health Organization (WHO) reports that 90% of the information the human brain receives is transmitted through sight alone. There are 285 million blind or visually impaired people across the world, and technology must serve and continue to advance their welfare. The objective of this paper is to compare technological innovations in the field of smart devices for the visually impaired.



                Index Terms – Visually impaired, Electronic Travel Aid, smart stick, ultrasonic sensor, IoT.


I.  Introduction

A great deal of work and research is being done to find ways to improve life for partially sighted and blind people. Reading and recognition devices could make smartphones, tablets, and smart glasses into indispensable aids for the visually impaired.

High-performance devices are needed to improve the quality of life of visually impaired (VI) people.

With the advancement of modern technologies, different types of devices are available to aid the mobility of the visually impaired and blind. These devices are known as Electronic Travel Aids (ETAs) [1]. ETAs have sensors that alert the blind user in advance about danger with sounds or vibrations. With the introduction of such ETAs, there is an increase in the user's safety and self-confidence. This paper presents a rough analysis that can be adopted for the design of such devices, examining the most interesting technological developments that could help the visually impaired and the blind.


II. Methodology


A. BlinDar: An Invisible Eye for the Blind


Existing visual aid devices include the K-Sonar [2], UltraCane [3], Palmsonar [4], iSonic cane [5], Laser cane [6], and Virtual Eye (using image processing). These devices are not especially user-friendly or easy to handle, and laser canes and Virtual Eye aids are also very costly. This paper proposes a device called 'BlinDar', which is an ETA. It aims at improving the life of the blind and helps them navigate without depending on someone else. Unlike before, blind people will now be able to lead life as normal people do. Apart from that, the family members of the blind person will also be able to track him at any moment and get his exact location from anywhere. BlinDar is IoT-based and is very cost-effective, efficient, and user-friendly. The device mainly consists of an advanced blind stick with an Arduino Mega2560 microcontroller.


The proposed system provides the following features:

- A navigation smart stick with a total cost not exceeding $135.
- Obstacle sensors responsive in the near range, up to 4 m.
- Light-weight components integrated on the stick, which makes it user-friendly with low power consumption.
- A simple circuit that can be programmed using the basics of C/C++/Java.
- A smart wrist band introduced along with the stick, for the case where the blind user strolls in a noisy place and cannot hear the alert sound from the system. A vibrator attached to the band alerts him of the obstacle by vibrating at different intensities.
- An RF Tx/Rx module for finding the stick in close premises such as the home. If the stick is misplaced, pressing a key on the wrist band activates a buzzer on the stick, which helps the person locate it by the intensity of the sound.
- A module attached for sharing the location to the cloud; the location is updated every 20 ms.
- An MQ2 gas sensor for detecting fire in the path or at home.
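The distance-dependent alert behavior described above can be sketched in C++. This is a minimal illustration, not the BlinDar firmware: the 4 m sensing range comes from the feature list, but the threshold values and the three-level mapping are assumptions made for the example.

```cpp
#include <cstdint>

// Hypothetical alert levels for the wrist-band vibrator. The feature list
// only says the vibration intensity varies with obstacle distance, so the
// three-level mapping below is an assumed example, not the BlinDar design.
enum class AlertLevel : std::uint8_t { None = 0, Low = 1, Medium = 2, High = 3 };

// Map an ultrasonic distance reading (in metres) to a vibration level.
// Readings beyond the stated 4 m sensing range produce no alert.
AlertLevel alertForDistance(double distance_m) {
    if (distance_m > 4.0) return AlertLevel::None;   // outside sensor range
    if (distance_m > 2.5) return AlertLevel::Low;    // far obstacle
    if (distance_m > 1.0) return AlertLevel::Medium; // approaching obstacle
    return AlertLevel::High;                         // imminent obstacle
}
```

On the actual stick such a mapping would drive the Arduino Mega2560's buzzer and vibrator outputs; here it is reduced to pure logic so the thresholds are easy to inspect.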


Fig. 1: BlinDar smart stick



B. Virtual Eye


Continuously feeding the visually impaired user with
the necessary information about his/her surroundings is critical for him/her to
“feel” and “sense” things nearby and move around without bumping into objects
and losing orientation. The wearable system presented in this section is designed
to provide users with such assistance. The system consists of two major units:
the embedded wearable sensor and the smart phone; Fig. 2 shows the system scheme. The embedded wearable small sensor unit has four major modules: power,
central processing unit (CPU), sensor, and communication. The power module
relies on one 9-V dc battery to provide power to the wearable unit. The system applies an embedded microprocessor with ultrasonic-range sensors to detect objects around the user. The embedded microprocessor (CPU) selected for developing this prototype was the 32-b mbed NXP LPC1768 running at 96 MHz from ARM Ltd.,
due to its low price, small size, low power consumption, and, most of all, its
relatively high computing performance. This microcontroller, in particular, is designed for prototyping all sorts of devices and is armed with the flexibility of plenty of peripheral interfaces and flash memory. The microprocessor includes
512-kB flash, 32-kB RAM, and interfaces including built-in Ethernet, universal serial bus (USB) host and device, controller area network, serial peripheral interface, inter-integrated circuit bus, analog-to-digital converter, digital-to-analog
converter, pulse-width modulation, and other input/output interfaces. Most pins
can also be used as Digital In and Digital Out interfaces, which allows for
easy addition of various sensors. The online C++ software-development and
compiling environment allows the development team to work on the project
anywhere and anytime. The communication module is a Bluetooth chip (or a Wi-Fi
chip) that allows the wearable unit to talk to the smart phone to exchange data
and instructions. The LPC1768 microprocessor offers flexible hardware
interfaces for us to easily hook up various lower-power-consumption Bluetooth
and Wi-Fi chips that are available on the market, such as the RN42-XV Bluetooth
and the XBee Wi-Fi module.
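Since modules such as the RN42-XV expose a plain serial stream to the phone, the data exchange can be as simple as newline-terminated ASCII records. The sketch below illustrates that idea; the "DIST,&lt;cm&gt;" record format is an assumption for illustration, not a protocol described by the authors.

```cpp
#include <string>
#include <cstdio>

// Encode one distance sample (in centimetres) as a line of ASCII text for
// a serial Bluetooth link. The "DIST,<cm>" format is an assumed example;
// the paper does not specify the message layout.
std::string encodeDistanceRecord(int distance_cm) {
    char buf[32];
    std::snprintf(buf, sizeof(buf), "DIST,%d\n", distance_cm);
    return std::string(buf);
}
```

A text-based record like this keeps the phone-side parsing trivial and makes the link easy to debug with any serial terminal.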

The sensor module consists of various smart sensors
that we can embed in the system to obtain data. The ultrasound sensor detects objects and measures object distance from the unit. This is similar to sensors that are
widely installed on vehicles to prevent collisions. In this prototype, we used
Ultrasonic Range Finder XL-MaxSonar-EZ4. Other sensors that we plan to use but
have not yet instituted include color sensors to detect traffic lights, motion
sensors to identify moving objects in front of the visually impaired user, and
a camera that can be used for image processing and identifying objects of
interest. The smart phone receives data obtained from the wearable smart sensor
unit, processes the data in real time, and generates and provides the visually impaired
user with critical warning signals and other information to assist him/her in
navigating the surrounding area in the form of voice and/or vibrating signals.
For instance, when the sensor detects an object in front of the user within 3 m, the smart phone sends beeps to the user at a
higher frequency as an object gets closer. At the same time, the smart phone
utilizes its GPS and map function to “talk” to the  visually impaired user, providing his/her
moving orientation and location (such as street and building names). The
wearable smart-sensor unit can be clipped on a pocket, belt, or hat and faces
the user’s moving direction so that the ultrasound sensor can “see” the objects
in front of the user. The user should also carry the smart phone appropriately
so that he/she can receive the correct moving orientation information. These
two units talk to each other via Bluetooth; therefore, there is no wired
connection between them. Figure 2 provides the details of the circuit
connection among the LPC1768 CPU, the MaxSonar ultrasonic-range sensor, and the
RN42XV Bluetooth chip. The ultrasonic sensor’s analog output (PIN-AN) is
connected to one of the CPU analog inputs (PIN-19) for the CPU to read the
distance data of a detected  object.
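The analog read-out and beep-rate behavior just described can be sketched as follows. Two assumptions are made here: the XL-MaxSonar series scales its analog output at roughly Vcc/1024 per centimetre and shares the ADC's voltage reference, so a normalized 0.0-1.0 reading maps to centimetres by a factor of 1024; and the beep interval shrinks linearly inside the 3 m warning zone, which is an illustrative choice, not the authors' exact scheme.

```cpp
// Convert a normalized ADC reading (0.0-1.0) into centimetres, assuming the
// XL-MaxSonar analog output scales at Vcc/1024 per cm and the sensor shares
// the ADC's voltage reference (an assumption for this sketch).
int distanceCmFromAdc(double normalized_adc) {
    return static_cast<int>(normalized_adc * 1024.0);
}

// Map distance to a beep interval in milliseconds: silent beyond 3 m, and
// faster beeping as the object gets closer. The linear mapping and the
// 100 ms floor are illustrative assumptions, not taken from the paper.
int beepIntervalMs(int distance_cm) {
    if (distance_cm > 300) return 0;        // 0 = silent, object beyond 3 m
    if (distance_cm < 30) return 100;       // fastest beeping when very close
    return 100 + (distance_cm - 30) * 3;    // interval grows with distance
}
```

In the prototype, the CPU would poll the analog pin (PIN-19), convert the reading to a distance, and forward it to the phone, which schedules the beeps.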




Fig. 2: Virtual eye system structure


C. Smart Guiding Glasses


The proposed system
includes a depth camera for acquiring the depth information of the
surroundings, an ultrasonic rangefinder consisting of an ultrasonic sensor and
a MCU (Microprogrammed Control Unit) for measuring the obstacle distance, an
embedded CPU (Central Processing Unit) board acting as main processing module,
which does such operations as depth image processing, data fusion, AR
rendering, guiding sound synthesis, etc., a pair of AR glasses to display the
visual enhancement information and an earphone to play the guiding sound.

Depth information is
acquired with the depth sensor (the initial prototype only uses the depth
camera of an RGB-D camera, which includes a depth sensor and an RGB camera). The depth sensor is composed of an infrared laser source that projects non-visible light with a coded pattern, combined with a monochromatic CMOS (Complementary Metal Oxide Semiconductor) image sensor that captures the reflected light. The
algorithm that deciphers the reflected light coding generates the depth
information representing the scene. In this work, the depth information is
acquired by mounting the depth sensor onto the glasses with an approximate
inclination of 30°. This way, considering the height of the camera above the ground to be about 1.65 m and the depth camera working range to be limited to about 0.4 m to 4 m, the valid distance in the field of view is about 2.692 m, starting about 0.952 m in front of the user.
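The valid-distance figures above can be reproduced with a short calculation, under two assumptions not spelled out in the text: the lower edge of the camera's field of view points about 60° below horizontal (the 30° tilt plus roughly a 30° vertical half field of view), and the far limit is where the 4 m maximum slant range meets the ground.

```cpp
#include <cmath>

// Reproduce the valid ground-distance range for the glasses-mounted depth
// camera: height 1.65 m, maximum depth range 4 m. The 60-degree lower
// field-of-view edge is an assumption (30-degree tilt plus ~30-degree half
// field of view) chosen to match the figures quoted in the text.
struct ValidRange { double near_m, far_m, span_m; };

ValidRange computeValidRange(double height_m, double max_range_m,
                             double lower_edge_deg) {
    const double kPi = 3.14159265358979323846;
    const double rad = lower_edge_deg * kPi / 180.0;
    // Nearest visible ground point: where the lower FOV edge hits the floor.
    const double near_m = height_m / std::tan(rad);
    // Farthest ground point: where the maximum slant range meets the floor.
    const double far_m = std::sqrt(max_range_m * max_range_m - height_m * height_m);
    return {near_m, far_m, far_m - near_m};
}
```

With height 1.65 m, a 4 m maximum range, and a 60° lower edge, this yields a near point of about 0.952 m and a span of about 2.691 m, agreeing with the 0.952 m and 2.692 m quoted above.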

Fig. 3: Smart guiding glasses




III. Conclusion

In this paper, a detailed explanation of the BlinDar smart stick, Virtual Eye, and smart guiding glasses has been given. With such devices, the life of the visually impaired can become much easier and more independent. A major setback for the visually impaired is that they often deprive themselves of the independence they deserve; BlinDar helps by giving them confidence and independence. This product uses IoT technology, one of the most in-demand areas in the current scenario. It is user-friendly, easily adaptable, and has multipurpose functionality.

Future research will primarily focus on developing these application systems outside of a controlled laboratory environment. A new focus will be placed on designing the systems to be particularly robust in everyday user interactions with the real world, along with real-time, frame-by-frame video processing for object recognition. This current work, combined with future work, will allow the visually impaired to gain a novel sense of the world around them.




References

[1] Ayat A. Nada, Mahmoud A. Fakhr, and Ahmed F. Seddik, "Assistive infrared sensor based smart stick for blind people", 2015 Science and Information Conference (SAI).

[2] Kim S. Y. and Cho K., "Usability and design guidelines of smart canes for users with visual impairments", International Journal of Design, 7.1, pp. 99-110, 2013.

[3] B. Hoyle and D. Waters, "Mobility AT: The Batcane (UltraCane)", in Assistive Technology for Visually Impaired and Blind People, M. A. Hersh and M. A. Johnson (eds.), Springer London, pp. 209-229.

[4] Fernandes H., Costa P., Paredes H., Filipe V., and Barroso J., "Integrating Computer Vision Object Recognition with Location Based Services for the Blind", Universal Access in Human-Computer Interaction: Aging and Assistive Environments, Springer International Publishing, pp. 493-500, 2014.

[5] Ashwini B. Yadav, Leena Bindal, Namhakumar V. U., Namitha K., and Harsha H., "Design and Development of Smart Assistive Device for Visually Impaired People", IEEE RTEICT-2016.

[6] Kim L., Park S., Lee S., and Ha S., "An electronic traveler aid for the blind using multiple range sensors", IEICE Electronics Express, Vol. 6, No. 11.

[7] M. King, B. Zhu, and S. Tang, "Optimal path planning", Mobile Robots, vol. 8, no. 2, pp. 520-531, March 2001.



