ISSN No: 2456-2165
Abstract:- Deaf-mute people need to communicate with ordinary people in their everyday lives. They use sign language to do so; however, it is intelligible only to those who have undergone special training. Sign language uses hand gestures and other non-verbal behaviour to convey the intended meaning, combining hand shapes, orientation and movement of the hands or body, and facial expressions simultaneously to fluidly express the speaker's thoughts.

This project is based on the need for an electronic glove that can translate sign language into speech, in order to lower the communication barrier between the mute community and the general public. A wireless electronic glove is used: an ordinary driving glove fitted with a flex sensor along each finger. Mute people can use the glove to perform hand gestures, which are converted into speech so that others can understand their expressions.

I. INTRODUCTION

Human beings interact with one another to convey their ideas and opinions to the people around them, but this is not the case for deaf-mute people. Sign language paves the way for deaf-mute people to communicate: through sign language, conversation is possible for a deaf-mute person without acoustic sounds. To bridge this gap, this project implements a real-time gesture-based speech assistant system for the speech impaired.

Gesture Vocalizer is a tool designed to enable communication among deaf-mute communities and between them and ordinary people. The system can be dynamically reconfigured as a smart device that works for different sign languages. It is essentially a data glove and a microcontroller that can detect almost all movements of a hand and convert certain defined gestures into human-recognizable voice. The aim is an electronic device that translates sign language into speech, making communication between the mute community and the general public viable.

The main components used here are flex sensors, a glove, an LCD display, an Arduino UNO, and speech synthesis; the features and architecture of the Arduino UNO are explained in the next section. A flex sensor is essentially a variable resistor whose terminal resistance increases as it is bent, so it is used to sense bending. An LCD is a flat-panel display technology commonly used in TVs and computer monitors, and also in screens for mobile devices such as laptops, tablets, and smartphones. A USB cable is used to upload the program to the Arduino UNO, and a Bluetooth module is used to communicate with the mobile phone. The glove has a flex sensor attached to each finger to detect bending. When a finger bends, the sensor's internal resistance changes almost linearly with the flex angle; for particular combinations of the resulting inputs, the Arduino produces the programmed output.
Fig. 2: Block diagram

The block diagram of the hand gesture vocalizer for the deaf is shown in Fig. 2. The device has hardware and software parts. The hardware consists of the flex sensors, the Arduino, the LCD display, and the Bluetooth module; the software consists of the Arduino program corresponding to the gestures. The system is divided into three parts:

Gesture input
Processing the data
Voice output using smartphone

Gesture input: Sensors located on the hand of the deaf person convert parameters such as finger bend and hand position angle into electrical signals and provide them to the ATmega328 controller, which takes action according to the signal.

Processing the data: The output of the flex sensors is converted into digital form using the inbuilt analog-to-digital converter of the Arduino UNO. Predefined gestures with corresponding messages are stored in the database of the microcontroller. The Arduino UNO checks whether the input voltage from the flex sensors exceeds the threshold value stored in the database.

Voice output using smartphone: The output from the Arduino is sent to the LCD and to the smartphone via the Bluetooth module. The LCD displays the message assigned to the gesture in the program, and the speech signal is produced by a text-to-speech converter application on the mobile phone.

A. ARDUINO UNO:
Arduino UNO is a microcontroller board based on the ATmega328P. It has 14 digital input/output pins (of which 6 can be used as PWM outputs), 6 analog inputs, a 16 MHz ceramic resonator, a USB connection, a power jack, an ICSP header, and a reset button. It contains everything needed to support the microcontroller; simply connect it to a computer with a USB cable, or power it with an AC-to-DC adapter or a battery, to get started.

B. FLEX SENSORS:
Flex sensors (Fig. 4) measure the amount of deflection or bending and act as the source of input to the microcontroller. They are usually available in two sizes, 2.2 inch and 4.5 inch; although the sizes differ, the basic function remains the same. They are also classified by resistance. When measuring the flex, bend, or angle of an instrument, the flex sensor's internal resistance changes almost linearly with its flex angle.

Fig. 4: Flex sensor

C. LCD 16×2:
A Liquid Crystal Display (Fig. 5) is an electronic display device. A 16×2 LCD displays 16 characters per line, and there are 2 such lines. The LCD has command registers and data registers: the command register stores the command instructions given to the LCD, and the data register stores the data to be displayed. It is used for the user interface.

Fig. 5: LCD
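Since the flex sensor is a variable resistor, the Arduino reads it through a voltage divider on an analog pin. The sketch below (plain C++ rather than Arduino code, with illustrative component and calibration values — the paper does not give the actual resistances) shows how an ADC reading can be converted back to the sensor's resistance and an approximate bend angle, using the roughly linear resistance-vs-angle behaviour described above.

```cpp
// Recovering the flex sensor's resistance from an Arduino ADC reading,
// assuming the sensor sits between Vcc and the analog pin, with a fixed
// resistor R_FIXED from the pin to ground. All values are illustrative.
const double R_FIXED = 47000.0;   // assumed series resistor, ohms
const int    ADC_MAX = 1023;      // Arduino UNO 10-bit ADC full scale

// ADC counts -> sensor resistance.
// Divider: adc = ADC_MAX * R_FIXED / (R_FLEX + R_FIXED), solved for R_FLEX.
double flexResistance(int adc) {
    return R_FIXED * static_cast<double>(ADC_MAX - adc) / adc;
}

// Resistance -> approximate bend angle in degrees, assuming the roughly
// linear resistance-vs-angle behaviour described above. rFlat and rBent90
// would come from calibrating the sensor flat and bent to 90 degrees.
double bendAngleDeg(double r,
                    double rFlat = 25000.0, double rBent90 = 100000.0) {
    return 90.0 * (r - rFlat) / (rBent90 - rFlat);
}
```

On the Arduino itself, `adc` would come from `analogRead()` on the pin wired to each finger's divider.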
E. SPEECH SYNTHESIS:
We have used a Bluetooth module interfaced with the Arduino UNO, which gives an audio output through the smartphone according to the input given by the user.
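The Bluetooth link itself only carries bytes; a common convention with phone text-to-speech apps (an assumption here — the paper does not specify the app's protocol) is to terminate each message with a delimiter so the app knows when a complete message has arrived and can speak it. A minimal framing helper:

```cpp
#include <string>

// Frame a message for a Bluetooth serial text-to-speech app. The '#'
// terminator is an illustrative convention seen in many Arduino Bluetooth
// TTS examples; the actual delimiter depends on the app being used.
std::string frameForTts(const std::string& message) {
    return message + '#';
}

// On the Arduino, the framed string would be written to the Bluetooth
// module's serial port, e.g. btSerial.print(frameForTts("HELLO").c_str());
```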
V. WORKING PRINCIPLE

Fig. 9 shows the setup of the whole project, where the LCD display and the Bluetooth module are interfaced to the Arduino board. The microcontroller is programmed accordingly using the Arduino IDE software. Four flex sensors are connected to the Arduino board, and the Bluetooth module is connected to a Bluetooth app on the smartphone to provide the audio output.

Fig. 10 shows the smartphone result, and Fig. 11 shows some gestures with the corresponding messages.

Fig. 10: Smartphone Result
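The working principle described above — four flex sensors, thresholding each reading, and looking up the bend combination in the stored gesture database — can be sketched in plain C++ as follows. The threshold, the bit encoding, and the gesture-to-message table are all illustrative; the paper does not list the actual values or messages.

```cpp
#include <array>
#include <map>
#include <string>

// Number of flex sensors (one per finger, as in the project).
constexpr int NUM_SENSORS = 4;

// Assumed ADC threshold above which a finger counts as "bent".
constexpr int BEND_THRESHOLD = 500;

// Encode the four readings as a 4-bit pattern: bit i is set if sensor i
// reads above the threshold (finger bent).
int encodeGesture(const std::array<int, NUM_SENSORS>& adc) {
    int pattern = 0;
    for (int i = 0; i < NUM_SENSORS; ++i)
        if (adc[i] > BEND_THRESHOLD) pattern |= (1 << i);
    return pattern;
}

// Illustrative gesture-to-message table (the "database" of the paper).
const std::map<int, std::string> kMessages = {
    {0b0000, "HELLO"},
    {0b0001, "THANK YOU"},
    {0b1111, "I NEED HELP"},
};

// Look up the message for a set of readings; empty if the gesture is unknown.
std::string messageFor(const std::array<int, NUM_SENSORS>& adc) {
    auto it = kMessages.find(encodeGesture(adc));
    return it == kMessages.end() ? std::string() : it->second;
}
```

On the real hardware the readings would come from `analogRead()`, and the resulting string would be shown on the LCD and sent via the Bluetooth module to the phone's text-to-speech app.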
A. CONCLUSIONS
The design and implementation of the Hand Gesture Vocalizer for Deaf system proved to be a challenging, rewarding, and exciting experience. While keeping the objectives of the course in mind, we were able to successfully complete the integration of the hand glove using hardware components (flex sensors, Arduino UNO, Bluetooth module) and software (Arduino IDE). Since the project is Arduino UNO based, its smart features and automations can easily be configured for different applications.