Mini Rover - The Unmanned Ground Vehicle
A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF
BE-ELECTRICAL
By
Arif Hussain
Junaid Ahmed
Madiha Jawaid
Under the Supervision of
Dr. Arslan Ahmed
Department of Electrical Engineering Sukkur IBA University
Declaration
This report has been prepared on the basis of our own work; where other published and unpublished source materials have been used, these have been acknowledged or referenced.
Word Count: 10580
Names of students with signatures: Arif Hussain, Junaid Ahmed, Madiha Jawaid
Date of Submission:
Dedication
This thesis is dedicated to our parents and teachers for their endless love, support and encouragement.
Certificate
This project thesis, written by Arif Hussain, Junaid Ahmed and Madiha Jawaid under the direction of their supervisor Dr. Arslan Ahmed and approved by all the members of the thesis committee, has been presented to and accepted by the Head of the Department of Electrical Engineering in partial fulfilment of the requirements of the degree of BACHELOR OF ENGINEERING in ELECTRICAL ENGINEERING.
Project Supervisor
Head of Department
Internal Examiner
External Examiner
Director
Acknowledgement
From the very beginning, we are grateful to Almighty Allah, Who gave us the opportunity, strength, determination and wisdom to achieve our goal. We would like to thank our supervisor Dr. Arslan Ahmed, Assistant Professor, Electrical Engineering Department, Sukkur IBA University, who not only served as our supervisor but also encouraged and challenged us throughout our research project. He patiently guided us through the process, never accepting less than our best efforts. We would also like to thank the HOD of Electrical Engineering, Dr. Faheem Akhtar Chachar, and the faculty members of the Electrical Engineering Department, Sukkur IBA University, for their insightful suggestions and guidance. Many of our academic colleagues have made significant contributions to this project. Most importantly, we express our gratitude to our parents for all their sacrifices; they have fully supported us in this project, and their blessings and prayers have been a great inspiration for us to finish it.
Abstract
The Mini Rover Unmanned Ground Vehicle (UGV) is an exploration vehicle designed to move on the ground; it can detect objects, characterize them and send feedback to a remote station wirelessly without any human interference. It is able to go to places that humans cannot reach, for the sake of exploration and observation. The UGV operates without an on-board human presence. UGVs are used for search and rescue, security and surveillance, and terrain and forest monitoring, because it is inconvenient, inefficient, dangerous, and sometimes impossible to have a human operator present in such environments. The major problems faced by a UGV in a natural terrain environment are reliable sensing, route finding, and object detection and identification. We combine a set of sensors and image processing techniques to counter these problems by getting a more extensive perception of the environment.
Table of Contents
CHAPTER 1  INTRODUCTION TO THE PROJECT .......... 14
1.1 INTRODUCTION .......... 14
1.2 HISTORY .......... 14
1.3 UNMANNED GROUND VEHICLE (UGV) .......... 15
1.4 OBJECTIVES .......... 16
1.5 THESIS CONTRIBUTION .......... 16
1.6 MILESTONES ACHIEVED AND GANTT CHART .......... 17
CHAPTER 2  INTRODUCTION TO ROBOTS .......... 19
2.1 WHY DO WE USE ROBOTS? .......... 19
2.2 TYPES OF ROBOTS .......... 19
2.2.1 Autonomous Robots .......... 20
2.2.2 Semi-autonomous Robots .......... 20
2.2.3 Manual Robots .......... 20
2.3 BASIC FUNCTIONS OF ROBOTS .......... 20
2.4 PARTS OF ROBOTS .......... 21
2.4.1 Locomotion System .......... 21
2.4.2 Actuator System .......... 21
2.4.3 Electric Motors .......... 21
2.4.3.1 Stepper motors .......... 22
2.4.3.2 DC geared motors .......... 22
2.4.3.3 Servo motor .......... 23
2.4.4 H-Bridge .......... 24
2.4.5 Power Supply System .......... 26
2.4.6 Sensor System .......... 27
2.4.7 Signal Processing System .......... 27
2.4.7.1 Arduino .......... 27
2.4.7.2 Raspberry Pi .......... 28
2.5 CONCLUSION .......... 29
CHAPTER 3  COMMUNICATION TOOLS .......... 30
3.1 INTRODUCTION .......... 30
3.2 UGV-to-CONTROL STATION .......... 31
3.3 COMPARISON BETWEEN DIFFERENT TECHNOLOGIES .......... 31
3.4 ZIGBEE (XBee) .......... 32
3.4.1 ZigBee Network .......... 33
3.4.1.1 Coordinator .......... 33
3.4.1.2 Router .......... 33
3.4.1.3 End Device .......... 34
3.4.2 Specifications of ZigBee Network .......... 34
3.5 UGV-to-CONTROL STATION COMMUNICATION LINK .......... 35
3.5.1 ZigBee C .......... 35
3.6 GSM BASED ALARMS .......... 36
3.6.1 GSM Module .......... 37
3.7 LIVE VIDEO STREAMING VIA WLAN .......... 37
3.7.2 Hardware Setup .......... 40
3.8 GETTING LIVE VIDEO STREAMING FROM RASPBERRY PI TO MATLAB .......... 40
3.8.1 MATLAB Support Package for Raspberry Pi Hardware .......... 40
3.9 CONCLUSION .......... 40
CHAPTER 4  OBJECT TRACKING, CLASSIFICATION AND RECOGNITION .......... 41
4.1 INTRODUCTION .......... 41
4.1.1 Basics of Image Processing .......... 41
4.1.1.1 Image .......... 41
4.1.1.2 Analog Image .......... 41
4.1.1.3 Digital Image .......... 41
4.1.2 Digital Image vs Analog Image .......... 42
4.1.2.1 Advantages .......... 42
4.1.3 Analog Image Processing .......... 42
4.1.4 Digital Image Processing .......... 43
4.1.4.1 High Degree of Flexibility .......... 43
4.1.4.2 Image Storage and Transmission .......... 43
4.1.4.3 Image Acquisition .......... 43
4.1.4.4 Image Enhancement .......... 43
4.1.4.5 Colour Image Processing .......... 43
4.1.4.6 Wavelets and Multiresolution Processing .......... 43
4.1.4.7 Compression .......... 44
4.1.4.8 Segmentation .......... 44
4.1.4.9 Representation .......... 44
4.1.4.10 Object Recognition .......... 44
4.3 IMAGE PROCESSING IN ROBOTICS .......... 44
4.3.1 Object detection, identification and tracking .......... 45
4.3.2 Security .......... 45
4.4 BASIS OF IMAGE PROCESSING .......... 45
4.4.1 Image .......... 45
4.4.2 Video .......... 45
4.4.3 Background .......... 45
4.4.4 Foreground .......... 45
4.5 ALGORITHMS INVOLVED IN UNMANNED GROUND VEHICLE .......... 46
4.5.1 Background Subtraction .......... 46
4.5.2 Morphological Operations to Reduce Noise .......... 47
4.5.2.1 Dilation .......... 47
4.5.2.2 Erosion .......... 48
4.5.3 Blob Analysis .......... 48
4.5.4 Kalman Filter .......... 48
4.6 CONCLUSION .......... 48
CHAPTER 5  RESULTS AND DISCUSSIONS .......... 49
5.1 WIRELESS CONTROL OF UGV .......... 49
5.2 GSM BASED ALARMS .......... 49
5.3 IMAGE PROCESSING .......... 49
5.3.2 Live Video Streaming from UGV to MATLAB at Control Station .......... 50
5.3.3 Face Tracking in Live Video Streaming .......... 51
5.3.4 Video Recording .......... 53
5.3.5 Sending Recorded Video through Email .......... 53
5.3.6 Detection of Moving Objects .......... 54
5.4 CONCLUSION .......... 54
CHAPTER 6  CONCLUSION AND FUTURE WORK .......... 55
6.1 CONCLUSION .......... 55
6.2 FUTURE WORK .......... 55
REFERENCES .......... 57
APPENDIXES .......... 60
List of Figures
List of Tables
Table 1-1 Timeline of Project .......... 19
Table 1-2 Gantt chart .......... 20
Table 2-1 Hardware Specifications of Arduino .......... 30
Table 2-2 Hardware Specifications of Raspberry Pi .......... 32
Table 3-1 Comparison between different Technologies .......... 34
Table 3-2 Specifications of ZigBee and ZigBee Pro .......... 36
Table 3-3 Network Specifications of ZigBee .......... 38
CHAPTER 1 INTRODUCTION TO THE PROJECT
1.1 INTRODUCTION
A robot is a programmable machine with the built-in capability of executing a certain set of actions automatically. It can interact with and perceive its environment via a set of sensors and actuators. A robot may be externally controlled, or it may contain an embedded control system. Its basic structure includes locomotion (the mechanical structure), a control system, a communication link and a set of sensors. The basic process a robot follows to carry out predefined tasks is sense, decide and act. Robots are generally categorized as autonomous robots, which do not require external control for the execution of tasks, and remotely operated robots, which are externally controlled by a control station through a wired or wireless link. Robots have numerous potential applications in various fields, with increasing demand. Applications of robots range from military missions such as prior inspection, surveillance and combat; to industrial and home usage, for instance harvesting crops, cleaning floors, inspection and security; to special tasks such as search and rescue operations. In industrial processes, there are a huge number of tasks that demand high degrees of speed, precision and accuracy. Tasks such as packaging, assembly, monitoring and inspection are tiring and repetitive in nature, so there are more chances of error when they are carried out by humans. In most industries these tasks were carried out by human labour for a long time, but nowadays, with the advancement of robotic technology, robots are taking their place in most industries to execute such tasks. Using robots can improve speed, precision and accuracy by performing such specialized and repetitive tasks more efficiently, saving time and human resources. Today there is great advancement in industrial robotic technology: robotic systems are integrated with sensor feedback systems and vision systems, which increases their level of autonomy.
1.2 HISTORY
The journey of remotely controlled robots started in the 1920s. In the 1930s, the USSR created Teletanks, machine-gun-equipped tanks remotely controlled by radio from another tank; these were used during the Winter War (1939-1940) against Finland. During World War II, the British developed a radio-controlled variant of their Matilda II infantry tank in 1941, known as "Black Prince". The first significant mobile robot development effort, named Shakey, was created during the 1960s as a research study for the Defense Advanced Research Projects Agency (DARPA) on artificial intelligence, to test how a robot could comply with orders; this is distinct from modern robots that are autonomous or semi-autonomous. Shakey had a platform with sensors, cameras and computers to help manage its navigational tasks of fetching wooden blocks and putting them in specific regions on command [1]. From 1973 to 1981, Hans Moravec led the Stanford Cart project at the Stanford University AI Lab, exploring navigation and obstacle avoidance problems using an advanced stereo vision system [2]. The Cart's single TV camera was moved to each of nine different positions atop its simple mobility base, and the resulting images were processed by an off-board KL-10 mainframe. Feature extraction and correlation across the images allowed reconstruction of a model of the 3-D scene, which was used to plan an obstacle-free path to the destination. The system was incredibly slow, taking up to 15 minutes to make each one-meter move. Moravec moved to Carnegie Mellon University (CMU) in 1981 and continued his work with the smaller CMU Rover indoor platform [Moravec, 1983]. CMU became a major leader in mobile robot research during the 1980s, with its Navlab vehicle as the focus for much of the work [3].
1.3 UNMANNED GROUND VEHICLE (UGV)
In the broadest sense, a UGV is any piece of mechanized equipment that moves across the surface of the ground and serves as a means for carrying or transporting something, but explicitly does not carry a human being [4]. More specifically, the UGV in this project is an exploration vehicle designed to move on the ground, detect objects, characterize them and send feedback to a remote station wirelessly without any human interference. UGVs are gaining popularity due to their versatile applications in fields such as the military, medicine and industry. Applications of UGVs range from military missions such as prior inspection, surveillance and combat; to industrial and home usage, for instance harvesting crops, cleaning floors, inspection and security; to special tasks such as search and rescue operations. The major problems faced by a UGV in a natural terrain environment are reliable sensing, route finding, and object detection and classification. In this project we combine a set of sensors and image processing techniques to counter these problems by getting a more extensive perception of the environment. In the UGV an Arduino is used as the central computing device, which, when connected to sensors and actuators, makes human-like decisions based on what it can sense. Communication is an integral part of any robotic system; here we create a wireless communication link between the UGV and the control station (such as a laptop) using the ZigBee protocol. The UGV also has an integrated Global System for Mobile Communication (GSM) module, whose purpose is to send message-based alarms when it detects an unfavorable condition such as fire or very high temperature. To get a more extensive, accurate and real-time perception of the environment, a vision system is added to the UGV. In this vision system a Raspberry Pi and an ordinary Pi Camera are used to get real-time video streaming of the area of interest by creating a wireless LAN (WLAN) between the UGV and the control station. On the control station this video stream is fed into MATLAB, and image processing techniques are applied to detect, classify and track the object of interest.
1.4 OBJECTIVES
The aim of this project is to develop a multipurpose, wirelessly controlled UGV which can move across the surface, detect objects and classify them. It can also monitor and inspect an object of interest and, if it detects any undesirable condition, inform the corresponding authority for further immediate action. The main objectives of the thesis are:
To Develop Wirelessly Controlled UGV
Remote Monitoring, Observation and Inspection
Generating GSM based Warnings
Object Detection, Classification and Recognition using Image Processing Techniques
1.5 THESIS CONTRIBUTION
Based on the above objectives, the contribution of this project is summarized as follows:
The thesis presents a comprehensive literature review of UGVs
To develop the UGV which can move on natural terrain and can be wirelessly controlled
Combine set of sensors and Pi-Camera to remotely monitor and inspect object of interest
To generate GSM based alarms to the corresponding authorities if any undesirable condition is detected
Use Image Processing techniques to detect, classify and track objects
Provide issues, challenges and future research directions for the large-scale and commercial realization of this prototype
1.6 MILESTONES ACHIEVED AND GANTT CHART
Table 1-1 contains the details of the milestones achieved, along with their dates.
Table 1-1 Timeline of Project

S. No | Elapsed Time | Milestone | Deliverable
01 | August 2016 | Literature survey | A quick survey was carried out on UGVs to start the project
02 | August 2016 to September 2016 | Purchase of hardware; collection and reading of technical information on the hardware | Collection of components and related material
03 | September 2016 to March 2017 | Practical work, including interfacing of the sensors, camera and ZigBee, and generating alarms when the sensors integrated in the UGV detect an undesirable condition in the environment | Development of the UGV and wireless control system
04 | April 2017 | Integrate the Raspberry Pi and camera in the UGV; get live streaming directly in MATLAB | Object tracking and classification using live video streaming in MATLAB
05 | June 2017 | Object tracking, classification and recognition using image processing algorithms | Apply static background algorithm to the live video feed from the Raspberry Pi; detect moving objects from a moving camera
06 | July 2017 | Final thesis report | A report containing everything about the project
1st Quarter of the year: Literature Review and Experimental Setup
2nd Quarter of the year: Testing, Assembling of UGV Platform and Hardware Integration.
3rd Quarter of the year: Designing, testing and debugging algorithms
4th Quarter of the year: Results Analysis and final documentation
Table 1-2 Gantt Chart (2016-2017)

No | Task Name
01 | Literature Review
02 | Experimental Setup
03 | Testing
04 | Assembling and Hardware Integration
05 | Design of algorithms

In the Gantt chart, the column numbers 1-12 represent the months of the year, with 1 representing August 2016.
CHAPTER 2 INTRODUCTION TO ROBOTS
2.1 WHY DO WE USE ROBOTS?
Robots are machines designed by people to do specific tasks automatically, with speed and precision of measurement. They perceive the environment using their sensors and decide automatically what to do next. They provide protection against danger by performing many tasks that are harmful to human health. Robotics is the combination of electronics, computer science, artificial intelligence and mechanics. It is a branch of technology which deals with the application, operation, design and construction of robots, and uses computer systems for information processing, sensory feedback and control. Such machines can replace humans in dirty, dull and dangerous environments, and may resemble humans in appearance and behaviour. In the 1940s many robots were introduced to handle radioactive materials. At the beginning of the 1960s, the first industrial robot was used to pick up an object and put it down in another place. During the 1970s, Japan saw an explosion of robotics innovation. In this way, robots became part of industry.
2.2 TYPES OF ROBOTS
Robots are mainly classified into three types, as shown in Figure 2-1:
Autonomous Robot
Semi-autonomous Robot
Manual Robot
Figure 2-1 Types of Robots: autonomous, semi-autonomous (a combination of autonomous and manual) and manual robots
2.2.1 Autonomous Robots
Autonomous robots can perform all tasks independently under a given condition. For example:
Unmanned Ground Vehicle
Line Follower Robot
Obstacle Avoider Robot
2.2.2 Semi-autonomous Robots
This type of robot can perform only some tasks independently. It is controlled by its programming and is restricted to what its programming tells it.
2.2.3 Manual Robots
Manual robots are divided into two types: articulated robots and exoskeleton robots. Generally, a manual robot waits for a command from its user; when a command is given, it acts accordingly. For example:
Remote controlled Quad Copter
Remote controlled Unmanned Ground Vehicles
2.3 BASIC FUNCTIONS OF ROBOTS
There are three basic functions of robots, as shown in Figure 2-2:
Sense
Think
Act
Figure 2-2 Basic Functions of Robots
2.4 PARTS OF ROBOTS
Figure 2-3 shows the parts of a robot and how each part is connected to the others. Each part is described in detail below.
Figure 2-3 Parts of Robots
2.4.1 Locomotion System
This system deals with the movement of robots, such as translational and rotational motion. It enables the right, left, backward, forward, climb-up and climb-down movements of the robot. In order to enable such movement, devices known as actuators are needed, which convert electrical energy into mechanical energy. One of the most common actuators is the DC motor [5].
2.4.2 Actuator System
The actuator system comprises the actuator devices that enable the locomotion of a robot. Common actuators include DC motors, stepper motors and servo motors. The actuator system covers the connection, position, location, circuit diagram and orientation of the actuator devices [5].
2.4.3 Electric Motors
An electric motor is a device used to convert electrical power into mechanical power. With its help, a robot can move forward, backward, left and right. There are many types of motor; however, this
section describes the common types used in amateur robotics. Electric motors are categorized as:
Stepper motor
DC geared motor
Servo motor
2.4.3.1 Stepper motors
Figure 2-4 shows a sample stepper motor. A stepper motor rotates step by step. The step count defines the resolution of rotation; for example, 200 steps means the stepper motor completes one full rotation in 200 steps. Each step of the motor has the same size and needs a separate pulse. The speed of a stepper motor depends on the rate of the digital pulses provided to it: as the delay between one step and the next decreases, the speed increases, which is achieved by sending pulses at a higher rate. So the speed of rotation is directly proportional to the frequency of the pulses. Because of this precision, stepper motors have a wide number of applications where precision and accuracy are required; good examples are CD/DVD-ROM drives and 3-D printers.
Figure 2-4 Stepper Motor
2.4.3.2 DC geared motors
Figure 2-5 shows a sample DC geared motor. A DC geared motor contains a gear assembly connected to the motor. The motor speed is measured as the shaft's rotations per minute, represented as RPM. The gear assembly increases the torque and reduces the speed. Reducing the speed of a gear motor to a specific value requires the proper combination of gears in the gear motor. The concept of reducing the speed and increasing the torque with gears is called gear reduction. This insight helps in understanding the details of the gear head as well as the working of the DC motor.
Figure 2-5 DC Geared Motor
The speed of a DC motor depends on the voltage applied to it: an increase in voltage increases the speed. For example, a DC motor may run at its lowest speed at 6 V and its highest possible speed at 12 V. So the speed equation can be expressed as RPM = K1 * V, where K1 is the induced-voltage constant and V is the applied voltage. The operation of the gears can be described by conservation of angular momentum: a gear with a smaller radius has a higher speed (RPM), and a gear with a larger radius has a lower speed (RPM); however, the larger gear provides more torque than the smaller gear, and vice versa. The direction of a rotating gear is always opposite to that of its adjacent gear. In a DC motor, torque and RPM are inversely proportional to each other; hence, the gear with more torque will turn at a lower RPM, and conversely. In geared DC motors, the concept of pulse width modulation (PWM) is applied to control speed [6].
2.4.3.3 Servo motor
A servo motor is an electromechanical device used in applications where high torque and precision are required. It can rotate to a number of angles. It consists of a simple motor that runs through the servo mechanism. The name DC servo motor or AC servo motor refers to the type of current on which it operates. The beauty of the servo motor is its light weight and small package size with high torque. Due to these characteristics, servo motors are used in several applications, such as planes, RC helicopters, toy cars, robotics and machinery. An electrical pulse, processed by circuitry placed beside the motor, decides the position of a servo motor. A sample picture of a servo motor is shown in Figure 2-6.
Figure 2-6 Servo Motor
A servo motor consists of a gear assembly, a DC or AC motor and a control circuit. The gear assembly increases the torque and reduces the speed. To drive the shaft to a particular position, two signals matter beyond those supplying the operating voltage: the control signals. One control signal, provided for example by a potentiometer, is the required shaft position; the other is the current shaft position. A controller (for example an op-amp) computes the error between the current and required positions. The resulting signal is fed to the motor, which keeps rotating in a closed loop until the required position is reached.
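The error-driven positioning loop described above can be sketched as a simple proportional controller. The gain and tolerance values below are illustrative assumptions, not parameters of any real servo:

```cpp
#include <cassert>
#include <cmath>

// One iteration of the servo's feedback loop: the controller measures the
// error between the target and the current shaft angle and moves the shaft
// proportionally toward the target.
double servoStep(double target, double current, double gain) {
    double error = target - current;  // comparator (e.g. op-amp) output
    return current + gain * error;    // motor reduces the error
}

// Repeat the loop until the shaft settles within a tolerance, mirroring the
// "rotate until the required position is achieved" behaviour in the text.
double servoSettle(double target, double current, double gain, double tol) {
    while (std::fabs(target - current) > tol)
        current = servoStep(target, current, gain);
    return current;
}
```

With a gain below 1 the error shrinks every iteration, so the loop converges; a real servo's analog circuit does the same continuously.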
2.4.4 H-Bridge
The output current of an Arduino pin is very low, approximately 40 mA, while driving a motor typically requires more than 1 A. An H-bridge lets the low-power Arduino outputs control the higher voltages and currents delivered to the load. Another advantage of an H-bridge is that it can pass current through the load in either direction, so a motor can rotate both ways. An H-bridge consists of four power transistors with two input lines and two outputs. It has two half-bridges, one formed by Q1 and Q2 and the other by Q3 and Q4, as shown in figure 2-7.
Figure 2-7 H-bridge Circuit
To drive the DC motor in one direction, Q1 and Q4 are turned on while Q2 and Q3 are turned off; this makes the current flow from side A to side B. The reverse combination drives the motor in the reverse direction. The switching elements (Q1-Q4) are generally FETs or bipolar transistors; in high-voltage applications IGBTs are used. Figures 2-8 and 2-9 show the flow of currents. The diodes connected in parallel with each transistor protect the transistors from the current spikes generated by back EMF [6].
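The switching combinations just described amount to a small truth table. The helper below is a hypothetical illustration (not the driver code used in this project) that maps the four transistor states to the resulting motor behaviour, including the forbidden case where both transistors of one half-bridge conduct at once and short the supply:

```cpp
#include <cassert>
#include <string>

// Motor behaviour for a given H-bridge switch pattern.
// Q1/Q2 form one half-bridge, Q3/Q4 the other, as in figure 2-7.
std::string hBridgeState(bool q1, bool q2, bool q3, bool q4) {
    if ((q1 && q2) || (q3 && q4))
        return "shoot-through";  // shorts the supply - never allowed
    if (q1 && !q2 && !q3 && q4)
        return "forward";        // current flows from side A to side B
    if (!q1 && q2 && q3 && !q4)
        return "reverse";        // current flows from side B to side A
    return "coast";              // motor terminals left floating
}
```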
Figure 2-8 Forward Direction Switching
Figure 2-9 Reverse Direction Switching
The dual bidirectional motor driver module is based on the well-known L298 dual H-bridge motor driver IC, shown in figure 2-10. The module allows easy, independent control of two motors of up to 2 A each, in both directions, and supports motor speed control through PWM. It connects to a microcontroller with just a couple of control lines per motor, which makes it ideal for robotic applications.
Figure 2-10 L298N Dual H-bridge Module
2.4.5 Power Supply System
A power supply is required for the operation of a robot. Robotic and most embedded applications need a DC supply, generally 5 V, 9 V or 12 V, and sometimes higher depending on the requirement, such as 18 V, 24 V or 36 V. The most practical option is a battery: it provides DC directly, can be mounted on the robot, and powers it for a duration determined by the battery's capacity [7].
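The last point admits a quick worked estimate: a battery's capacity in mAh divided by the average load current in mA approximates the runtime in hours. The figures below are illustrative, not measurements from this rover:

```cpp
#include <cassert>

// Ideal runtime estimate in hours: capacity (mAh) / average draw (mA).
// Real batteries deliver somewhat less than this under heavy or pulsed load.
double runtimeHours(double capacity_mAh, double load_mA) {
    return capacity_mAh / load_mA;
}
```

For instance, a hypothetical 2200 mAh pack feeding an average 1100 mA drive load would last about two hours at best.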
2.4.6 Sensor System
Sensors provide measurements of temperature, pressure, IR waves, radio waves, heat, etc., and enable a robot to interact with the physical world. The sensor system is the interface between the real world and the electronic system. The signals provided by the sensors are processed by the robot in order to take autonomous decisions [7].
2.4.7 Signal Processing System
Robots need sensor data and other digital and electrical signals for their movement and for analysing the environment. Electronic components are needed to process these signals, including analog and digital devices, Arduino boards and microcontrollers.
2.4.7.1 Arduino
Arduino is a microcontroller board used for designing and implementing systems such as robots, computers and data acquisition systems. Arduino is used to build interactive objects by taking inputs from various switches or sensors and controlling a number of devices, such as lights, motors and other physical outputs. Arduino projects can be standalone, i.e., operating independently, or they can communicate with software running on a personal computer. Arduino offers numerous products, such as boards, kits, shields and accessories, and there are several varieties of boards, such as the Arduino UNO (shown in figure 2-11, with its hardware described in table 2-1), Mega, Micro and Leonardo.
Figure 2-11 Arduino UNO Board
Table 2-1 Hardware Specifications of Arduino [8]

No.  Specification                  Value
01   Microcontroller                ATmega328
02   Operating Voltage              5 V
03   Input Voltage (Recommended)    7-12 V
04   Input Voltage (Limits)         6-20 V
05   Digital I/O Pins               14
06   Analog I/O Pins                6
07   DC Current per I/O Pin         40 mA
08   DC Current for 3.3 V Pin       50 mA
09   Flash Memory                   32 kB
10   SRAM                           2 kB
11   EEPROM                         1 kB
12   Clock Speed                    16 MHz
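As a brief illustration of the UNO's analog interface, its 10-bit ADC maps 0-5 V onto readings from 0 to 1023, so a raw analogRead() value converts back to volts with a single scale factor. The threshold value in the sketch below is an assumption chosen purely for illustration:

```cpp
#include <cassert>
#include <cmath>

// Convert a raw 10-bit ADC reading (0-1023) to volts on a 5 V reference.
double adcToVolts(int raw) {
    return raw * 5.0 / 1023.0;
}

// A reading trips an alarm once its voltage exceeds a chosen threshold -
// the same kind of decision the rover's sensor system makes.
bool sensorAlarm(int raw, double thresholdVolts) {
    return adcToVolts(raw) > thresholdVolts;
}
```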
2.4.7.2 Raspberry Pi
The Raspberry Pi is a low-cost microprocessor-based embedded system with versatile applications. It is a rugged, fully functional computer that can be deployed in rough environments. Another important feature of the Raspberry Pi is that it can be programmed in a variety of languages, such as C, Java and Python. The Raspberry Pi has a slot to interface a camera, called the Camera Serial Interface. The hardware is described in table 2-2.

Table 2-2 Hardware Specifications of Raspberry Pi [9]

No.  Specification  Value
01   Processor      900 MHz quad-core ARM Cortex-A7 CPU
02   RAM            1 GB
03   I/O Pins       40 pins with 4 PWM; 1 UART, 1 SPI and 1 I2C port; camera and display interfaces
2.5 CONCLUSION
Robots are generally a combination of electronic and mechanical parts. This chapter described the types of robots in detail with respect to their functional capability: fully autonomous, semi-autonomous and manual. Furthermore, the parts of a robot, such as actuators and microcontrollers, were described in detail with respect to their working principles and specifications. The project is divided into subsystems (locomotion, actuator, sensor and signal processing) and each subsystem was described with respect to its task.
CHAPTER 3 COMMUNICATION TOOLS
3.1 INTRODUCTION
Communication is a vital part of any system; it is the transfer of signals (messages) between or among different entities. The general structure of a communication system contains a sender, a receiver, a message, a medium and a communication protocol. There are different types of communication systems, such as digital and analog, and wired and wireless. In a wireless communication system, information is transferred without any dedicated electrical conductor; the range varies from a few metres to several kilometres depending on the technology. Wireless connectivity offers several advantages over wired systems, giving a flexibility that is impossible in a wired system. A communication protocol is the set of rules governing the communication. Three communication paths are involved in the UGV, as shown in figure 3-1:
UGV-to-Control Station
Sensor-to-User
Internet (WLAN): Live Video Streaming
Figure 3-1 Types of Communication in UGV
3.2 UGV-to-CONTROL STATION
In tele-operation mode the UGV requires a control station, which controls the movements of the UGV remotely. To send commands to the UGV, a dedicated wireless communication link is created between the UGV and the control station: the control station sends commands and the UGV executes them by carrying out certain predefined actions or tasks. Figure 3-2 shows the laptop and the UGV connected to each other over a wireless link.
Figure 3-2 Communication between the UGV and Control Station
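The command link above can be sketched as a single-character protocol: the station sends one byte, and the rover maps it to an action. The letter-to-action mapping below mirrors the tele-operation sketch in the appendix; the helper function itself is a hypothetical illustration:

```cpp
#include <cassert>
#include <string>

// Decode one command byte received from the control station.
// W/S/A/D drive the rover and '1' stops it, matching the appendix code.
std::string decodeCommand(char c) {
    switch (c) {
        case 'W': return "forward";
        case 'S': return "backward";
        case 'A': return "left";
        case 'D': return "right";
        case '1': return "stop";
        default:  return "ignore";  // unknown bytes are simply dropped
    }
}
```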
3.3 COMPARISON BETWEEN DIFFERENT TECHNOLOGIES
Table 3-1 shows a comparison between some available standards that offer wireless connectivity.

Table 3-1 Comparison Between Different Technologies [11]

No.  Parameter          ZigBee (XBee)     Bluetooth                  WiFi
01   Frequency Band     2.4 GHz           2.4 GHz                    2.4 GHz, 5 GHz
02   Range              100 m             30 m                       30 m
03   Power Consumption  Very Low          Very Low                   High
04   License Type       Free ISM Band     Free ISM Band              Free ISM Band
05   Usage              Embedded Systems  Automation (Old Standard)  Wireless Broadband Access
Among the options available for wireless connectivity between the UGV and the control station, ZigBee is the most suitable because:
It has good range
It interfaces easily with embedded systems
It is an advanced, low-power technology
3.4 ZIGBEE (XBee)
The ZigBee and ZigBee-Pro radio frequency modules are designed on the basis of the IEEE 802.15.4 protocol and support the unique requirements of low-cost, low-power remote sensor systems. The modules require negligible power and give dependable delivery of data between devices. They operate within the 2.4 GHz ISM frequency band and are pin-for-pin compatible with each other. They support full-duplex serial communication.
Figure 3-3 Zigbee Pin Diagram
ZigBee is a good fit for medium ranges: it has a data rate of 250 kbps with a receiver sensitivity of -92 dBm. It has a wide range of applications in robotics, sensor networks and industrial automation. Figure 3-3 shows the pin diagram of a sample Zigbee module, and table 3-3 shows the specifications.
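A quick sanity check on those figures: in practice the serial interface, not the 250 kbps RF rate, usually bounds throughput. With common 8N1 framing each byte costs 10 bits on the UART, so the time to move a payload is easy to estimate (the payload size used below is hypothetical):

```cpp
#include <cassert>
#include <cmath>

// Seconds needed to push `bytes` over a UART at `baud` with 8N1 framing:
// 8 data bits + 1 start bit + 1 stop bit = 10 bits per byte.
double uartTransferSeconds(long bytes, long baud) {
    return bytes * 10.0 / baud;
}
```

For example, 960 bytes at 9600 baud takes a full second, far slower than the same payload over the 250 kbps radio link.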
Table 3-3 Specifications of Zigbee Pro [11]

No.  Parameter                   Value
01   Indoor/Urban Range          30 m
02   Transmit Power Output       1 mW
03   RF Data Rate                250,000 bps
04   Serial Interface Data Rate  1200 bps - 250 kbps
05   Receiver Sensitivity        -92 dBm
06   Supply Voltage              2.8-3.4 V
07   Transmit Current            45 mA @ 3.3 V
08   Idle/Receive Current        50 mA @ 3.3 V
09   Power-down Current
A. MOTOR CONTROL

// Driver pin assignments: the original declarations were truncated in this
// copy, so the pin numbers below are placeholders - adjust to the wiring used.
const int M_right_Motor_Speed = 5;  // right motor PWM (L298N ENA)
const int M_left_Motor_Speed = 6;   // left motor PWM (L298N ENB)
const int M_right_Motor_5v = 7;     // right motor direction (IN1)
const int M_right_Motor_Gnd = 8;    // right motor direction (IN2)
const int M_left_Motor_5v = 9;      // left motor direction (IN3)
const int M_left_Motor_Gnd = 10;    // left motor direction (IN4)
char command = '1';                 // start in the stopped state

void setup() {
  Serial.begin(9600);
  pinMode(M_right_Motor_Speed, OUTPUT);
  pinMode(M_left_Motor_Speed, OUTPUT);
  pinMode(M_right_Motor_5v, OUTPUT);
  pinMode(M_right_Motor_Gnd, OUTPUT);
  pinMode(M_left_Motor_5v, OUTPUT);
  pinMode(M_left_Motor_Gnd, OUTPUT);
}

void loop() {
  // Read the latest single-character command from the serial (XBee) link.
  if (Serial.available() > 0) {
    command = Serial.read();
  }
  switch (command) {
    case 'W': FWD(); break;
    case 'S': BWD(); break;
    case 'D': RHT(); break;
    case 'A': LFT(); break;
    case '1': stopp(); break;
  }
}

void FWD() {
  analogWrite(M_right_Motor_Speed, 170);
  analogWrite(M_left_Motor_Speed, 170);
  digitalWrite(M_right_Motor_5v, HIGH);
  digitalWrite(M_right_Motor_Gnd, LOW);
  digitalWrite(M_left_Motor_5v, HIGH);
  digitalWrite(M_left_Motor_Gnd, LOW);
}

void BWD() {
  analogWrite(M_right_Motor_Speed, 170);
  analogWrite(M_left_Motor_Speed, 170);
  digitalWrite(M_right_Motor_5v, LOW);
  digitalWrite(M_right_Motor_Gnd, HIGH);
  digitalWrite(M_left_Motor_5v, LOW);
  digitalWrite(M_left_Motor_Gnd, HIGH);
}

void LFT() {
  analogWrite(M_right_Motor_Speed, 255);
  analogWrite(M_left_Motor_Speed, 0);
  digitalWrite(M_right_Motor_5v, HIGH);
  digitalWrite(M_right_Motor_Gnd, LOW);
  digitalWrite(M_left_Motor_5v, LOW);
  digitalWrite(M_left_Motor_Gnd, HIGH);
}

void RHT() {
  analogWrite(M_right_Motor_Speed, 0);
  analogWrite(M_left_Motor_Speed, 255);
  digitalWrite(M_right_Motor_5v, LOW);
  digitalWrite(M_right_Motor_Gnd, HIGH);
  digitalWrite(M_left_Motor_5v, HIGH);
  digitalWrite(M_left_Motor_Gnd, LOW);
}

void stopp() {
  analogWrite(M_right_Motor_Speed, 0);
  analogWrite(M_left_Motor_Speed, 0);
  digitalWrite(M_right_Motor_5v, LOW);
  digitalWrite(M_right_Motor_Gnd, LOW);
  digitalWrite(M_left_Motor_5v, LOW);
  digitalWrite(M_left_Motor_Gnd, LOW);
}
B. SENSOR AND ALARM SYSTEM
#include <SoftwareSerial.h>

SoftwareSerial mySerial(8, 9);  // RX, TX to the GSM module
#define fire A0
#define gas A1
#define temperature A2
int initial_State = 0;
int counter = initial_State;
int gassens = initial_State;
int tempsens = initial_State;
int firesens = initial_State;

void setup() {
  mySerial.begin(9600);
  Serial.begin(9600);
  delay(1000);
  pinMode(fire, INPUT);
  pinMode(gas, INPUT);
  pinMode(temperature, INPUT);
}

void loop() {
  gassens = analogRead(gas);
  tempsens = analogRead(temperature);
  firesens = analogRead(fire);

  // Alarm condition: the original expression was truncated in this copy,
  // so the threshold below is a placeholder - tune to the sensors used.
  if ((gassens > 8) && (counter == 0)) {
    forwardMessage();  // report the readings by SMS
    counter = 1;       // send the alert only once
  }

  // Relay any response from the GSM module to the serial monitor.
  if (mySerial.available() > 0) Serial.write(mySerial.read());
}

void forwardMessage() {
  mySerial.println("AT+CMGF=1");  // SMS text mode
  delay(150);
  mySerial.println("AT+CMGS=\"+923083288213\"\r");
  delay(1000);
  mySerial.println();
  mySerial.println("--UGV--");
  delay(200);
  mySerial.print("fire = ");
  delay(150);
  mySerial.println(firesens);
  delay(100);
  mySerial.print("gas = ");
  delay(100);
  mySerial.println(gassens);
  delay(100);
  mySerial.print("temp = ");
  delay(100);
  mySerial.println(tempsens);
  delay(150);
  mySerial.println((char)26);  // Ctrl+Z terminates the SMS
  delay(1000);
}

void recieveMessage() {
  mySerial.println("AT+CNMI=2,2,0,0,0");  // push incoming SMS to serial
  delay(1000);
}
C. IMAGE PROCESSING
C-1 Control Station
% Control station GUI [22]
function varargout = ControlStation(varargin)
gui_Singleton = 1;
gui_State = struct('gui_Name', mfilename, ...
    'gui_Singleton', gui_Singleton, ...
    'gui_OpeningFcn', @ControlStation_OpeningFcn, ...
    'gui_OutputFcn', @ControlStation_OutputFcn, ...
    'gui_LayoutFcn', [], ...
    'gui_Callback', []);
if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end
if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end

function ControlStation_OpeningFcn(hObject, eventdata, handles, varargin)
handles.output = hObject;
guidata(hObject, handles);

function varargout = ControlStation_OutputFcn(hObject, eventdata, handles)
% Draw the background image behind all other GUI elements.
ah = axes('unit', 'normalized', 'position', [0 0 1 1]);
bg = imread('UGV.png');
imagesc(bg);
set(ah, 'handlevisibility', 'off', 'visible', 'off')
uistack(ah, 'bottom');
varargout{1} = handles.output;

function Facetracking_Callback(hObject, eventdata, handles)
run('Facetracking.m')

function Objectdetection_Callback(hObject, eventdata, handles)
run('MotionBasedMultiObjectTrackingExample.m')

function WebCam_Callback(hObject, eventdata, handles)
figure;
run('webCAM.m')

function sendemail_Callback(hObject, eventdata, handles)
run('SendMail.m')

function Record_Callback(hObject, eventdata, handles)
run('VideoCapture.m')
C-2 Facetracking
% Create the face detector object.
clear myPi;
faceDetector = vision.CascadeObjectDetector();
% Create the point tracker object.
pointTracker = vision.PointTracker('MaxBidirectionalError', 2);
% Connect to the Raspberry Pi camera on board the UGV.
myPi = raspi('192.168.8.135', 'pi', 'raspberry');
cam = cameraboard(myPi, 'Resolution', '320x240');
videoFrame = snapshot(cam);
frameSize = size(videoFrame);
videoPlayer = vision.VideoPlayer('Position', [100 100 [frameSize(2), frameSize(1)+30]]);
runLoop = true;
numPts = 0;
frameCount = 0;
while runLoop && frameCount < 1000
    videoFrame = snapshot(cam);
    videoFrameGray = rgb2gray(videoFrame);
    frameCount = frameCount + 1;
    if numPts < 10
        % Detection mode.
        bbox = faceDetector.step(videoFrameGray);
        if ~isempty(bbox)
            points = detectMinEigenFeatures(videoFrameGray, 'ROI', bbox(1, :));
            xyPoints = points.Location;
            numPts = size(xyPoints, 1);
            release(pointTracker);
            initialize(pointTracker, xyPoints, videoFrameGray);
            oldPoints = xyPoints;
            bboxPoints = bbox2points(bbox(1, :));
            bboxPolygon = reshape(bboxPoints', 1, []);
            videoFrame = insertShape(videoFrame, 'Polygon', bboxPolygon, 'LineWidth', 3);
            videoFrame = insertMarker(videoFrame, xyPoints, '+', 'Color', 'white');
        end
    else
        % Tracking mode.
        [xyPoints, isFound] = step(pointTracker, videoFrameGray);
        visiblePoints = xyPoints(isFound, :);
        oldInliers = oldPoints(isFound, :);
        numPts = size(visiblePoints, 1);
        if numPts >= 10
            [xform, oldInliers, visiblePoints] = estimateGeometricTransform(...
                oldInliers, visiblePoints, 'similarity', 'MaxDistance', 4);
            bboxPoints = transformPointsForward(xform, bboxPoints);
            bboxPolygon = reshape(bboxPoints', 1, []);
            videoFrame = insertShape(videoFrame, 'Polygon', bboxPolygon, 'LineWidth', 3);
            videoFrame = insertMarker(videoFrame, visiblePoints, '+', 'Color', 'white');
            oldPoints = visiblePoints;
            setPoints(pointTracker, oldPoints);
        end
    end
    % Display the annotated video frame using the video player object.
    step(videoPlayer, videoFrame);
    % Check whether the video player window has been closed.
    runLoop = isOpen(videoPlayer);
end
% Clean up.
clear cam
release(videoPlayer);
release(pointTracker);
release(faceDetector);
C-3 Live Video Streaming
clear all
close all
myPi = raspi('192.168.8.135', 'pi', 'raspberry');
myCam = cameraboard(myPi, 'Resolution', '320x240');
for i = 1:500
    img = snapshot(myCam);
    image(img);
    title('Live Video Streaming')
    drawnow;
end
C-4 Sending email
myaddress = '[email protected]';
mypassword = 'zealousboy1';
setpref('Internet', 'E_mail', myaddress);
setpref('Internet', 'SMTP_Server', 'smtp.gmail.com');
setpref('Internet', 'SMTP_Username', myaddress);
setpref('Internet', 'SMTP_Password', mypassword);
props = java.lang.System.getProperties;
props.setProperty('mail.smtp.auth', 'true');
props.setProperty('mail.smtp.socketFactory.class', ...
    'javax.net.ssl.SSLSocketFactory');
props.setProperty('mail.smtp.socketFactory.port', '465');
sendmail('[email protected]', 'Video Streaming from UGV', ...
    'Please find the attached video', 'VIDEO.MP4');
C-5 Motion Based Object Detection
function MotionBasedMultiObjectTrackingExample()
% Adapted from the MathWorks motion-based multiple object tracking example.
obj = setupSystemObjects();
tracks = initializeTracks();
nextId = 1;
while ~isDone(obj.reader)
    frame = readFrame();
    [centroids, bboxes, mask] = detectObjects(frame);
    predictNewLocationsOfTracks();
    [assignments, unassignedTracks, unassignedDetections] = ...
        detectionToTrackAssignment();
    updateAssignedTracks();
    updateUnassignedTracks();
    deleteLostTracks();
    createNewTracks();
    displayTrackingResults();
end

% System object creation.
function obj = setupSystemObjects()
    obj.reader = vision.VideoFileReader('2.mp4');
    obj.maskPlayer = vision.VideoPlayer('Position', [740, 400, 700, 400]);
    obj.videoPlayer = vision.VideoPlayer('Position', [20, 400, 700, 400]);
    obj.detector = vision.ForegroundDetector('NumGaussians', 3, ...
        'NumTrainingFrames', 40, 'MinimumBackgroundRatio', 0.7);
    obj.blobAnalyser = vision.BlobAnalysis('BoundingBoxOutputPort', true, ...
        'AreaOutputPort', true, 'CentroidOutputPort', true, ...
        'MinimumBlobArea', 400);
end

% Track initialization: an array of empty tracks.
function tracks = initializeTracks()
    tracks = struct(...
        'id', {}, ...
        'bbox', {}, ...
        'kalmanFilter', {}, ...
        'age', {}, ...
        'totalVisibleCount', {}, ...
        'consecutiveInvisibleCount', {});
end

% Read the next video frame.
function frame = readFrame()
    frame = obj.reader.step();
end

% Detect objects via background subtraction and blob analysis.
function [centroids, bboxes, mask] = detectObjects(frame)
    mask = obj.detector.step(frame);
    mask = imopen(mask, strel('rectangle', [3, 3]));
    mask = imclose(mask, strel('rectangle', [15, 15]));
    mask = imfill(mask, 'holes');
    [~, centroids, bboxes] = obj.blobAnalyser.step(mask);
end