International Journal of Computer, Information Technology & Bioinformatics (IJCITB) ISSN:2278-7593, Volume-1, Issue-1
ROBOTIC ARM CONTROL USING GESTURE AND VOICE
Dr. R. V. Dharaskar, S. A. Chhabria, Sandeep Ganorkar
Abstract—The human-robot voice interface plays a key role in many application fields. Hand gestures are a very natural form of human interaction and can be used effectively in human-computer interaction (HCI). In this paper, we propose a "Human Machine Interfacing Device" that uses hand gestures to communicate with computers and other embedded systems, acting as an intermediary to an appliance. Developments in the field of communication have made it possible to execute computer commands using hand gestures. This paper discusses glove-based techniques that use sensors to measure the positions of the fingers and of the hand in real time. Gesture-based interaction enables effective communication, empowering the physically challenged to interact with machines and computing devices, including 3-D graphic interactions and simulations. This paper focuses on wireless data gloves proposed for gesture recognition, according to which the robot movement takes place.
Keywords - hand gesture technology; human-machine interaction; accelerometer; wireless communication; wireless data glove; PIC18F2510; RF 2.4 GHz transceiver; speech recognition system.
I INTRODUCTION
Many challenges present themselves in the remote control of robots by humans, such as ease of operation, haptic sensing, and telepresence. Telepresence, in which the operator is given the impression of being in the remote environment, is currently most often realized through vision systems [1][4], in which robots either need the supervision and direction of a human being or must collaborate with people to receive and process data in order to start a transaction or finish an assignment. It is envisaged that service and personal-care robots will become more prevalent at home in the near future [3] and will be very useful in assistive operations for human care,

Manuscript received June, 2012. Dr. R. V. Dharaskar, Dept. of Computer Sci. & Engg., G. H. Raisoni College of Engineering, Nagpur (M.S.), India. Prof. S. A. Chhabria, Dept. of Computer Sci. & Engg., G. H. Raisoni College of Engineering, Nagpur (M.S.), India. Mr. Sandeep Ganorkar, Assistant Professor, Department of Computer Technology, K.D.K. College of Engineering, Nagpur (M.S.), India.
particularly the elderly and disabled. Glove-based techniques use bend sensors to detect the movement of the fingers, as well as magnetic/inertial tracking devices to track the pitch, yaw, roll, and acceleration of the whole glove. The KHU-1 data glove [1][3] is capable of transmitting hand-motion signals to a PC through wireless communication. Hand gestures are a very common form of human interaction and can be used efficiently in HCI. The feasibility of controlling home appliances using hand gestures offers a section of the aging population and disabled people an opportunity to lead a more independent life [3]. More sophisticated systems with multiple sensors, such as bend, abduction, and palm-arc sensors, Wii Remote sensors, biometric sensors, IR sensors, and optical sensors, can be used to obtain accurate three-dimensional representations of the hand [3][5].
II LITERATURE REVIEW
For hand-gesture recognition, some researchers have tried to perform the early segmentation process using skin-color histograms. Zhou et al. [6] used overlapping sub-windows to extract invariants for gesture recognition and distinguished them with a local orientation-histogram attribute description indicating the distance from the canonical orientation. This makes the process relatively robust to noise, but considerably more time-consuming. Kuno and Shirai defined seven different stages of hand-gesture recognition, including locating the position of the fingertip. This is not practical when we have not only pointing gestures but also several other gestures, such as grasping. However, the invariants they considered inspired our own. In some similar approaches, the watermark of an image is generated by modifying the invariant vector. For example, Lizhong Gu and Jianbo Su used Zernike moments [6][7] together with a hierarchical classifier to classify hand gestures. This method is not appropriate for the JAST project, since the limited space for movements and actions leaves the hands little freedom.
III RESEARCH METHODOLOGY EMPLOYED
A Background
The robotic arm is designed to be similar to a human arm, with four degrees of freedom. Every part of the arm is actuated by servo motors. In addition, the design provides a flexible robotic-arm system that gives the robot the ability to manipulate objects, for instance in simple pick-and-place operations. As an example [8][9], rather than a therapist providing services to clients within an 80.47-kilometer radius, a uniquely specialized therapist in Chicago, Illinois could provide therapy consultation to a patient in Helena, Montana. With the advent of wearable and portable wireless 3D accelerometer systems, a novel gait-quantification system has been detailed and subsequently tested and evaluated in a homebound environment.
B Data Gloves
There are two types of techniques for gesture recognition: vision-based and glove-based. Glove-based techniques have some advantages over vision-based techniques. Using a data glove is a better idea than a camera, because the user is free to move around within a radius limited only by the range of the wireless link connecting the glove to the computer, unlike a camera, in front of which the user has to stay in position [4]. Light, electric or magnetic fields, and other interference do not affect the performance of the glove [2][11]. The data glove helps users interface with the physical world.
Fig. 1 Wireless Data Glove Microcontroller Program flowchart
C Hand Gestures
Gestures are, in fact [6][12], used for everything from pointing at a person to conveying specific information or implying a message. It happens very often that one cannot fully express feelings or opinions without using additional gestures. Hand gestures, among other relevant domains in HRI, play an important role both as an accompaniment to speech and as a means of input in their own right [13]. Although the synthesis stage can be optimized to improve detection rates, stochastic language models require large training sets and rarely obtain sufficient data for all possible hand-gesture combinations. Inclination sensing uses the gravity vector and its projection on the axes of the accelerometer to determine the tilt angle.
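The inclination-sensing idea can be sketched as follows: when the glove is at rest, gravity is the only acceleration, so the tilt angles fall out of the arctangent of the per-axis projections. This is a minimal illustration, not the paper's firmware; the function name and angle convention are our own.

```python
import math

def tilt_angles(ax, ay, az):
    """Estimate pitch and roll (in degrees) from a static 3-axis
    accelerometer reading (in g), using the projection of the
    gravity vector onto each axis. Valid only while the glove is
    not otherwise accelerating, so gravity dominates the signal."""
    pitch = math.degrees(math.atan2(ax, math.sqrt(ay**2 + az**2)))
    roll = math.degrees(math.atan2(ay, math.sqrt(ax**2 + az**2)))
    return pitch, roll

# Glove held flat: gravity entirely on the z-axis, so no tilt.
print(tilt_angles(0.0, 0.0, 1.0))
# Glove rotated so gravity lies fully on x: pitch is about 90 degrees.
print(tilt_angles(1.0, 0.0, 0.0))
```

Using `atan2` of one axis against the magnitude of the other two keeps the estimate stable near vertical, where a single-axis arcsine would saturate.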
IV APPROACH
In this work, a voice-command system is designed and constructed to manipulate a robot arm. This voice-command system can be applied to other systems as well as robot systems.
Fig 2. Flowchart of GUI panel
The accelerometer used here is a 3-axis accelerometer with a simple analog interface, running at a supply voltage of 3.3 V, which makes it ideal for handheld battery-powered electronics.
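As a rough illustration of how the digitized readings relate back to acceleration, the sketch below assumes a ratiometric analog accelerometer with its zero-g output at half the 3.3 V supply and a sensitivity of 330 mV/g, which is typical of low-g analog parts; the paper does not state the exact sensor's figures, so these constants are assumptions.

```python
VSUPPLY = 3.3            # supply/ADC reference voltage (from the paper)
ADC_BITS = 8             # the PIC result is used as an 8-bit value here
ZERO_G_V = VSUPPLY / 2   # assumed ratiometric zero-g offset
SENS_V_PER_G = 0.33      # assumed sensitivity: 330 mV per g

def counts_to_g(counts):
    """Convert one 8-bit ADC reading into acceleration in g,
    under the assumed offset and sensitivity above."""
    volts = counts * VSUPPLY / (2**ADC_BITS - 1)
    return (volts - ZERO_G_V) / SENS_V_PER_G
```

Under these assumptions a half-scale reading (about 127 counts) maps to roughly 0 g, and full scale (255 counts) to about +5 g, i.e. well beyond the tilt range actually used for gestures.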
Fig. 5 Wireless Data Glove
B Microcontroller
The function of the microcontroller in this application is to act as an interpreter between the hand gestures and the end application. The ADC port converts the analog signals coming from the accelerometer into corresponding 8-bit digital values, and the result is then shifted out through the UART line to the device being controlled. The PIC microcontroller [10][11] achieves throughputs approaching 1 MIPS per MHz by executing powerful instructions in a single clock cycle. The microcontroller ports are used for interfacing with the accelerometer, the motor driver, etc. The sensors capture the position of the human hand; their output is applied to the microcontroller after signal conditioning [7][11]. After processing, the information is sent to the remote robot arm through the wireless link.
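The "shift out through the UART" step implies some framing of the three 8-bit axis samples. The paper does not specify a wire format, so the header byte and XOR checksum below are hypothetical, shown only to make the idea concrete.

```python
def make_frame(x, y, z):
    """Pack one 8-bit sample per axis into a small frame with a
    header byte and a simple XOR checksum: the kind of framing a
    microcontroller might shift out over its UART (hypothetical
    format; the paper does not define one)."""
    for v in (x, y, z):
        if not 0 <= v <= 255:
            raise ValueError("samples must be 8-bit")
    checksum = x ^ y ^ z
    return bytes([0xAA, x, y, z, checksum])

def parse_frame(frame):
    """Receiver side: validate the header and checksum, then
    recover the three axis samples."""
    if len(frame) != 5 or frame[0] != 0xAA:
        raise ValueError("bad frame")
    x, y, z, checksum = frame[1:]
    if x ^ y ^ z != checksum:
        raise ValueError("checksum mismatch")
    return x, y, z
```

A fixed header byte lets the PC side resynchronize after dropped bytes on the RF link, and the checksum rejects corrupted frames rather than moving the arm on bad data.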
Fig. 3 Flowchart of Remote ARM Controller
Fig. 4 Speech Recognition System Program Flowchart
V HARDWARE METHODOLOGY EMPLOYED
A Accelerometer
An accelerometer is an electromechanical device that measures acceleration. A moving body possesses inertia, which tends to resist change in velocity [15][16]. It is this resistance to change in velocity that is the source of the force exerted by the moving body.
Fig. 6 Circuit diagram of robot arm interfacing with PC
Figure 6 is the circuit diagram of the robot arm interfacing with the PC. It is connected to the microcontroller via a serial link.
Data from the human-interface board is processed and sent to the driving circuitry. The MAX232 is a voltage-level-shifter IC and is connected directly to the serial port of the computer. The output circuit operates at 3.3 V to 5 V, while the serial port operates at voltage levels of ±15 V. The Tx and Rx lines between the input and output sides form a closed-loop circuit for serial data transmission. An optocoupler provides isolation between the input and output sides. A 12 MHz crystal oscillator generates a constant oscillation frequency for the PIC16F628A. The command generated by the PC interface software is given as input to the PIC microcontroller program, which generates an output on port pins RB7 to RB4 to control motors M1 and M2 through the DC-motor driving circuitry.
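The command-to-pin step can be sketched as a small lookup from command names to the upper nibble of PORTB. The specific bit assignments and command names below are hypothetical, since the paper does not say which of RB7-RB4 drives which motor direction.

```python
# Hypothetical mapping of PC commands onto the upper nibble of
# PORTB (RB7..RB4 feed the DC-motor driving circuitry for M1/M2).
# The actual pin assignments are not given in the paper.
COMMAND_BITS = {
    "M1_FWD": 0b1000_0000,  # RB7
    "M1_REV": 0b0100_0000,  # RB6
    "M2_FWD": 0b0010_0000,  # RB5
    "M2_REV": 0b0001_0000,  # RB4
    "STOP":   0b0000_0000,  # all drive pins low
}

def portb_value(command):
    """Return the PORTB value for a command, keeping the low
    nibble (RB3..RB0) clear for other uses."""
    return COMMAND_BITS[command]
```

Confining the drive bits to one nibble means the firmware can update all four motor lines with a single masked write to the port register.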
Table No.1 Robotic Arm Position
VI EXPERIMENTS AND RESULTS
The experiment is based on a wireless data glove consisting of an accelerometer, a PIC microcontroller [19][20], and an RF 2.4 GHz wireless module. The analog signals generated by the accelerometer are fed to the ADC of the PIC microcontroller and converted into digital form. The digital signal is then given as input to the tilt-sensing algorithm, a program developed for the PIC microcontroller. The algorithm generates an output that is sent [21] through the RF module to the personal-computer interface to accomplish the final task. The output is shown in Table 1.
Table No.2 Analog Value of Accelerometer
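The tilt-sensing step that maps digitized accelerometer values to arm commands can be sketched as a simple threshold classifier. The 0.5 g threshold and command names here are illustrative stand-ins; the calibrated values would come from the measurements in Tables 1 and 2.

```python
def classify_gesture(x_g, y_g, threshold=0.5):
    """Map glove tilt (x- and y-axis acceleration in g) to one of
    the arm commands. An illustrative threshold classifier, not
    the paper's calibrated firmware."""
    if x_g > threshold:
        return "RIGHT"
    if x_g < -threshold:
        return "LEFT"
    if y_g > threshold:
        return "UP"
    if y_g < -threshold:
        return "DOWN"
    return "REST"  # glove near level: no movement command
```

Checking the x-axis first gives it priority when the glove is tilted diagonally; a refinement would compare magnitudes and pick the dominant axis instead.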
Fig. 8 The GUI output screen of the program
Figure 8 shows the three commands for manual arm control, auto arm control, and voice control. After clicking on manual arm control, the next screen is shown in Figure 9.
Fig 7: Overview of Data Glove and serial interface
The second application is based on voice commands. It involves the recognition of isolated words from a limited vocabulary, used to control the movement of selected parts of the arm. A Kalman filter enhances the first stage of the speech-recognition agent by de-noising the words. The reference-pattern block is created during the training phase of the application, where the user is asked to utter each command word ten times. For each word, ten vectors of parameters are stored, one per repetition. The matching block compares the reference patterns with those extracted from the input signal. In the recognition phase, the application takes the word to be processed, analyzes it, and makes its decision by setting the corresponding bits in the serial-port data register, after which the corresponding LED is turned on.
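The matching block described above can be sketched as a nearest-template classifier over the ten stored vectors per word. This toy version uses mean Euclidean distance over fixed-length feature vectors; the paper's actual feature extraction and distance measure are not specified, so those details are assumptions.

```python
import math

def recognize(word_vec, references):
    """Match a feature vector against stored reference patterns.
    `references` maps each command word to the ten parameter
    vectors recorded during training; the word whose references
    lie closest on average (Euclidean distance) wins. A toy
    stand-in for the paper's matching block, assuming fixed-length
    feature vectors."""
    def dist(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))
    return min(
        references,
        key=lambda w: sum(dist(word_vec, r) for r in references[w])
        / len(references[w]),
    )
```

Averaging over all ten repetitions makes the decision less sensitive to any single noisy training utterance; a rejection threshold on the winning distance would let the system ignore out-of-vocabulary words.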
Fig. 9 The result of the manual arm control command
Figure 9 shows the result when the manual arm control command is executed. The figure shows five buttons: left, right, up, and down movement of the robotic arm, plus an Exit button to leave the program while it is running.
Fig 10. The first result of auto arm control commands
Fig. 17 The output showing the voice command right
Fig. 11 The output of the data glove when it is in the up position
Fig. 18 Robot arm movement: vertical up, middle, and down positions
Fig 12. The output of the data glove when it moves from the middle to the left position
Fig. 13 The output of the data glove when it moves from the middle to the right position
Fig. 14 The output showing the voice command
Fig. 15 The output showing the voice command up
VII CONCLUSION & FUTURE SCOPE
This paper discusses the design of a hand-gesture recognition module for analyzing and classifying hand gestures for HCI, including glove-based techniques. A hand-gesture recognition system using wireless data gloves can be used to solve the problem of supervisory control: it finds a way to map the set of angular measurements delivered by the data glove to a set of predefined hand gestures. Furthermore, it would be advantageous to have a system with a certain amount of flexibility, so that the same system could be used by different people performing varying sets of tasks. The proposed work recognizes isolated words from a limited vocabulary in the presence of background noise. To reduce the effect of stationary noise (mainly environmental noise), a pre-processing stage based on a Kalman filter is added. The application is speaker-dependent and therefore needs a training phase. It should, however, be pointed out that this limit does not depend on the overall approach but only on the method with which the reference patterns were chosen. Future work is oriented toward gesture recognition [21] that makes the system capable of distinguishing different gestures and interpreting them independently. This project presented a glove-based approach to hand-gesture understanding that shifts the focus from traditional and potentially complex syntactic analysis toward understanding hand gestures and their underlying meaning. Future work includes evaluating the performance of this approach in hand-gesture understanding using both a larger vocabulary and a larger knowledge base. In this paper, we presented 3-D hand-motion tracking and gesture recognition via a wireless data glove using an accelerometer. The data glove captures hand motion via tri-axis accelerometer sensors and communicates wirelessly with a PC. We have performed some simple hand-gesture recognition in this study.
Future work also requires faster computation to reduce the time delay of the system, and advanced recognition methods to recognize more complex gestures.
REFERENCES
[1] Mohamed Fezari, Hamza Attoui and Mouldi Bedda, "Toward Hybrid Technique to Enhance Vocal Guiding System for a Manipulator Arm TR45," © 2009 IEEE.
[2] Tan Tian Swee, A. K. Ariff, Sh-Hussain Salleh, Siew Kean Seng and Leong Seng Huat, "Data Gloves Malay Sign Language Recognition System," © 2007 IEEE.
[3] Silas Wan and Hung T. Nguyen, "Human Computer Interaction Using Hand Gesture," 30th Annual International IEEE EMBS Conference, Vancouver, British Columbia, Canada, August 20-24, 2008.
[4] Martin Urban, Peter Bajcsy, Rob Kooper and Jean-Christophe Lementec, "Recognition of Arm Gestures Using Multiple Orientation Sensors," 2004 IEEE Intelligent Transportation Systems Conference, Washington, D.C., USA, October 3-4, 2004.
[5] Ji-Hwan Kim, Nguyen Duc Thang and Tae-Seong Kim, "3-D Hand Motion Tracking and Gesture Recognition Using a Data Glove," IEEE International Symposium on Industrial Electronics (ISIE 2009), Seoul Olympic Parktel, Seoul, Korea, July 2009.
[6] I.-K. Park, J.-H. Kim and K.-S. Hong, "An Implementation of an FPGA-Based Embedded Gesture Recognizer Using a Data Glove," Proceedings of the 2nd International Conference on Ubiquitous Information Management and Communication, Suwon, Korea, January 31 - February 1, 2008, pp. 496-500.
[7] Jiayang Liu, Zhen Wang, Lin Zhong, Jehan Wickramasuriya and Venu Vasudevan, "Accelerometer-Based Personalized Gesture Recognition and Its Applications," © 2009 IEEE.
[8] Dhairya Dand, Sisil Mehta, Shashank Sabesan and Ankit Daftery, "Handicap Assistance Device for Appliance Control Using User-Defined Gestures," © 2010 IEEE, DOI 10.1109/ICMLC.2010.18.
[9] Syed Atif Mehdi and Yasir Niaz Khan, "Sign Language Recognition Using Sensor Gloves," Proceedings of the 9th International Conference on Neural Information Processing (ICONIP '02), Vol.
[10] Youngmo Han, "A Low-Cost Visual Motion Data Glove as an Input Device to Interpret Human Hand Gestures," © 2010 IEEE.
[11] PIC18F2410/2510/4410/4510 Rev. B3 Silicon Errata, 28/40/44-Pin Flash Microcontrollers with 10-Bit A/D and nanoWatt Technology.
[12] Pujan Ziaie, Thomas Müller and Alois Knoll, "A Novel Approach to Hand-Gesture Recognition in a Human-Robot Dialog System," Robotics and Embedded Systems Group, Department of Informatics, Technische Universität München, © 2008 IEEE.
[13] Ben W. Miners, Otman A. Basir and Mohamed S. Kamel, "Understanding Hand Gestures Using Approximate Graph Matching," IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, Vol. 35, No. 2, March 2005, p. 239.
[14] Pierre Jallon, Stephane Bonnet, Michel Antonakios and Regis Guillemaud, "Detection System of Motor Epileptic Seizures Through Motion Analysis with 3D Accelerometers," 31st Annual International Conference of the IEEE EMBS, Minneapolis, Minnesota, USA, September 2-6, 2009.
[15] Robert LeMoyne, Christian Coroian and Timothy Mastroianni, "Wireless Accelerometer System for Quantifying Gait," © 2009 IEEE.
[16] Qiong Fei, Xiaoqiong Li, Tao Wang, Xiongkui Zhang and Guoman Liu, "Real-Time Hand Gesture Recognition System Based on Q6455 DSP Board," © 2009 IEEE, DOI 10.1109/GCIS.2009.97.
[17] Sergio Rodriguez, Artzai Picon and Aritz Villodas, "Robust Vision-Based Hand Tracking Using a Single Camera for Ubiquitous 3D Gesture Interaction," IEEE Symposium on 3D User Interfaces 2010, Waltham, Massachusetts, USA, March 20-21, 2010.
[18] Xuedong Chen and Ou Bai, "Towards Multi-Dimensional Robotic Control via Noninvasive Brain-Computer Interface," © 2009 IEEE.
[19] Wei Dong, Kwang Yong Lim, Young Koon Goh, Kim Doang Nguyen, I-Ming Chen, Song Huat Yeo and Been-Lirn Duh, "A Low-Cost Motion Tracker and Its Error Analysis," 2008 IEEE International Conference on Robotics and Automation, Pasadena, CA, USA, May 19-23, 2008.
[20] Sebastian Trimpe and Raffaello D'Andrea, "Accelerometer-Based Tilt Estimation of a Rigid Body with Only Rotational Degrees of Freedom," 2010 IEEE International Conference on Robotics and Automation, Anchorage, Alaska, USA, May 3-8, 2010.
[21] LI Wen-liang, YI Zhen-guo, ZHOU Wei, ZHU Ying and LIU Jia-xin, "Vehicle Rollover Dynamic Monitoring Based on Tilt Sensor," 2nd International Conference on Industrial and Information Systems, © 2010 IEEE.