University of Algarve, Faculty of Science and Technology
Detection and tracking of moving objects
Final project for the licenciatura degree in Engenharia de Sistemas e Computação
Advisors: Prof. António Ruano and Eng. Daniel Castro Students: João Xavier and Marco Pacheco
University of Algarve, Faro December 2004
Final project of the licenciatura degree in Engenharia de Sistemas e Computação
The authors declare that the present work is original.
Authors João Xavier
Marco Pacheco
Advisors Prof. António Ruano
Eng. Daniel Castro
University of Algarve, Faro December 2004
Abstract

In this work we present modules for intelligent robot navigation. Feature detection, people tracking, and navigation are essential abilities that allow robots to interact with their surroundings and with persons. The feature detection uses fast algorithms: recursive line fitting and IAV (Internal Angle Variance), the latter being an innovative method developed in this project to accurately detect arcs/circles. A new people tracking methodology makes use of heuristics, and closest pairs are applied to legs in order to keep track of persons. Scene interpretation is achieved so that architectural landmarks are recognized. A visualization tool was developed to better understand the results. Navigation is performed by an implementation of the dynamic window approach, allowing safe navigation in unknown environments while taking into account the dynamic and kinematic constraints of the robot.
Resumo

In this work, modules for intelligent robotic navigation are presented. Feature detection, people tracking and navigation are essential for robots to interact with people. Feature detection uses fast algorithms: recursive line fitting and IAV (Internal Angle Variance), the latter being a new method developed in this project that accurately detects arcs/circles. A new people tracking methodology makes use of heuristics, and the closest-pairs technique is applied to legs in order to resolve the location of persons. Scene interpretation is achieved so that architectural landmarks are recognized. A visualization tool was developed to better understand the results. Navigation is performed by the dynamic velocity window algorithm, in order to allow safe navigation in unknown environments, taking into account the dynamic and kinematic constraints of the robot.
Acknowledgments

We would like to thank our advisors, Eng. Daniel Castro and Prof. António Ruano, without whose guidance and support this work would not have been possible. Their help, suggestions and encouragement were fundamental in every respect for the completion of this work. We would like to thank the Player/Stage authors and community for all the help and support given. Keep up the good work! We would like to thank all our friends at the University of Algarve who helped us with suggestions, creative comments and positive feelings. Last but not least, we would like to thank our families and loved ones for their encouragement and support in all our endeavors. Thanks for everything!
Contents

Abstract  ii
Resumo  iii
Acknowledgments  iv
Table of Contents  v
List of Tables  ix
List of Figures  x
1 Introduction  1
2 Background and related work  4
  2.1 Feature detection  4
    2.1.1 Circle detection  4
    2.1.2 Line detection  5
    2.1.3 Leg detection  5
  2.2 People tracking  5
  2.3 Navigation  6
3 Overview  9
  3.1 Hardware  9
  3.2 Development tools  10
    3.2.1 Candidate development tools  10
    3.2.2 Introduction to Player  11
  3.3 Modules  11
  3.4 Interaction between modules  12
    3.4.1 Feature detection  12
    3.4.2 People tracker  13
    3.4.3 Navigation  13
4 Feature detection  14
  4.1 Range Segmentation  14
  4.2 Circle detection  14
    4.2.1 Detecting lines  16
    4.2.2 Detecting arcs embedded in lines  16
    4.2.3 Detecting "arc-like" shapes  16
    4.2.4 Circle prerequisites  18
  4.3 Leg detection  19
  4.4 Line detection  20
5 People tracking  22
  5.1 Requisites of the people tracker  22
  5.2 Methodology  22
    5.2.1 Closest pairs  23
    5.2.2 Data structures  23
    5.2.3 The people tracking algorithm  25
6 Navigation  28
  6.1 Introduction  28
  6.2 The Dynamic Window Approach  31
    6.2.1 Dynamic velocity search space  31
    6.2.2 The objective function  34
7 Visualization tool  38
8 Experimental results and discussion  39
  8.1 Feature detection  39
    8.1.1 Tests performed  39
    8.1.2 Benchmarks  42
  8.2 People tracking tests  44
    8.2.1 Leg matching  44
    8.2.2 Person tracking  44
  8.3 Navigation  44
    8.3.1 Simulated experiments  46
    8.3.2 Real tests  46
    8.3.3 Benchmarks  49
    8.3.4 Detected problems and possible solutions  51
9 Conclusion and future work  53
  9.1 Future work  54
Appendix A Format of Player data  56
  A.1 Feature encapsulation in Player  56
  A.2 Person tracking encapsulation in Player  57
Appendix B Robotic platform  59
  B.1 SICK LMS200 Laser Measurement System  59
  B.2 Pioneer 2 DX Robot  61
    B.2.1 Electronics  62
    B.2.2 Pioneer DX2 Operating system  63
    B.2.3 PID Controls  64
    B.2.4 Position Integration  64
Appendix C Orthogonal line fitting method  66
Appendix D Motion equations for a differential-drive robot  68
  D.1 General motion equations  68
  D.2 Approximate motion equations  69
List of References  71
List of Tables

1 Available configuration parameters  37
2 Times of feature detection by iteration  42
3 Times of feature detection by segment  42
4 Dynamic window running time by iteration  51
5 Format of fiducial data fields  57
6 Available configuration parameters  57
7 Format of fiducial data fields of the tracker  57
8 Tracker configuration parameters  58
9 Reflectivity of some materials  60
10 Resolutions, max. scanning angles and max. of measured values  61
11 Technical characteristics of the LMS200  62
12 Technical characteristics of the Pioneer 2 DX  65
List of Figures

1 Our robot  10
2 Interaction between the developed player drivers  13
3 The angles of points inside an arc are congruent with respect to the extremes  15
4 The angles inside a line are congruent with respect to the extremes and measure 180°  15
5 Detection of embedded arcs  16
6 The apertures of a "V" shaped corner grow around a local minimum  17
7 Condition that selects segments for circle analysis  19
8 The process of Recursive Line Fitting: a) principal directions, b) first line segment identified, c) principal direction of second segment, d) both lines detected  21
9 The tracker requisites are: handle temporary occlusions of persons (a), be robust in crowded places (b), handle partial occlusions of already classified persons (c), recognize two legs in one single segment (d)  23
10 Closest pairs example  24
11 update/recover persons algorithm  27
12 Circular trajectories of a mobile robot  32
13 Resulting search space  34
14 Calculation of the θ angle for the predicted position  35
15 Example of an algorithm for indoors behavior  37
16 Lines detected (in red) in a real test scenario  40
17 A screen capture of PlayerGL, at the end of our feature detection test. Circle detections of the mobile cylinder are the green circumferences; legs are purple circles. Interpretation of walls, columns and benches is also achieved.  41
18 Plot of total times of the algorithms in the static test  43
19 Plot of total times of the algorithms in the dynamic test  43
20 Association of legs at the end of the test  45
21 Tracking of the person at the end of the test  45
22 Evolution of velocities v, w heading towards a goal  47
23 Evolution of odometry and velocities vs. time  47
24 Collision avoidance while heading towards a goal  48
25 Evolution of odometry and velocities vs. time  48
26 Collision avoidance in a real scenario  50
27 Evolution of odometry and velocities vs. time  50
28 The dynamic window algorithm running times per iteration  51
29 A local minima situation  52
30 Readings from the LMS using 100° and 180° scanning angles  61
31 Internal coordinate system for the Pioneer 2 DX robot  64
Chapter 1 Introduction

For the Robotic Freedom [1] manifesto to become a reality, robots must be able to recognize objects, structures and persons, so that they can perform tasks in a safe and proficient way. Stock replacement, street cleaning and zone security are examples of tasks not well suited to people; they could be delegated to machines so that people can live in more hedonic societies. Although Laser Measurement Systems (LMS) do not provide sufficient perception to accomplish these tasks on their own, they can be a great help, especially for navigation tasks that require fast processing times not achievable with camera-based solutions. Current navigation systems will benefit from detecting indoor (columns, corners, trashcans, doors, people, etc.) and outdoor (car parking poles, walls, cars, etc.) structures for intelligent and safe navigation while performing tasks involving those structures. Many structures are decomposable into shapes of geometric primitives, which can be used for scene interpretation. This geometric perception is important for making spatial inferences, a building block of the Artificial Intelligence required to accomplish complex tasks.

In this work we developed three drivers that are the basis of intelligent navigation: one for the recognition of features (distinctive patterns) from laser data, one for data association and person tracking, and another for local reactive navigation. The chosen primitive features to detect are lines, arcs/circles and legs. Lines and arcs are mostly used for localization, scene interpretation or map building. Leg detection applications range from following people to excluding leg segments from Simultaneous Localization And Mapping (SLAM) scan matching in order to reduce noise. With fast feature extraction techniques, there is processing time left for other tasks, like scan matching, navigation and scene interpretation.

This work proposes a new technique for circle detection that can also be used in line detection, the Internal Angle Variance (IAV). This technique presents a linear complexity O(n), where the average and the standard deviation are the heaviest calculations, and these are also the only parameters the user needs to specify. Compared to other methods, like the Hough transform for circle detection [2], that have parameters to tune such as the accumulator quantization, ours is simpler to implement with lower computational costs, proving to be an excellent method for real-time applications. A recursive line detection algorithm with complexity O(n · log(n)), similar to [3] and [4], was also implemented.

The pattern of a person in a horizontal laser scan depends mostly on the height of the laser. If the laser is above the waist of a person facing it, we get a pattern composed of the person's body and arms. This perspective of body parts has many possible combinations, making it a problem suitable for particle filtering techniques, which take into account the concentration of points around an area. Our laser scan height is about a person's knee; at this height a person appears (with small exceptions) as one or two small arcs (legs) with a diameter of about 0.15 meters, separated by a small distance. We use these geometric properties to conceive a method for fast laser person tracking.

In order to perform safe navigation in unknown environments, a module was implemented which uses the dynamic window algorithm. This method has the advantage of incorporating the dynamic and kinematic constraints of a mobile robot by searching in a restricted velocity space. After the velocity space is found, an objective function O(v, w) is created, incorporating the heading, clearance and velocity criteria. The optimal velocity for the desired behavior is chosen by selecting a local maximum of the objective function.

The robotics research community should have algorithms ready to deploy and compare in an expansible, unconstrained framework. With this idea in mind we developed our algorithms on a standard open robotics platform, Player [5], a network server for robot control. In 2003 the Robotics Engineering Task Force identified Player as the de facto standard open robotics protocol. We also want to encourage the academic community to join this design/development philosophy.
Chapter 2 Background and related work

2.1 Feature detection

The detection of objects can be done by feature-based approaches (using laser scanners) [6, 7, 8]. As discussed in [9], feature-to-feature matching is known to have the shortest run-times of all scan matching techniques. Next we introduce the most relevant research on the detection of the features we work with.

2.1.1 Circle detection
Zhen et al. performed line detection in [10], using the sparse Hough transform, and circle detection through algebraic circle fitting by least squares. Zhang et al. tested two methods for circle detection in [11], one online and the other offline. For online circle detection the unscented Kalman filter was used, and for offline circle detection the Gauss-Newton optimization of the circle parameters was used. Extracting tree trunks was also attempted. None of these approaches solves the problem in an elegant manner that makes use of the geometric properties of arcs, as we propose in this report.
2.1.2 Line detection

Different approaches for extracting line models from range data have been presented. The most popular are clustering algorithms [2, 12] and split-and-merge algorithms [3, 4]. The Hough transform is usually used to cluster collinear points. Split-and-merge algorithms, on the other hand, recursively subdivide the scan into sets of neighboring points that can be approximated by lines.

2.1.3 Leg detection
In [9] it is shown that SLAM accuracy is affected by moving segments; such a system would benefit from the identification of leg-like segments for removal from scan matching. Two main approaches for leg detection are known. The most common approach is scan matching [6], which can only recognize moving persons. The second approach takes advantage of the geometric shape of legs [13], similar to arcs as seen by the laser. Our proposed leg detection is based on the geometric shape approach.

2.2 People tracking

Common methods for people tracking are based on particle filters, which
measure the "concentration" of points around a centroid and assume that each centroid represents an object to be tracked. This method alone does not make use of the geometric properties of the clusters of points to identify the moving objects. In [14] each object is tracked using a particle filter, and Joint Probabilistic Data Association Filters are applied to solve the problem of assigning measurements to the individual objects. Fod et al. [8] track multiple moving people within a workspace using statically mounted laser range-finders. These works use Kalman filters to keep track of objects during occlusions. Lindstrom and Eklundh [15] track moving objects from a mobile platform using Gaussian hypotheses. Both perform the data association using nearest neighbor. Montemerlo et al. [16] address the problem of simultaneous localization and people tracking using range sensors, with conditional particle filters to incorporate the robot's uncertainty about its position, and a nearest-neighbor approach to solve the data association problem.

The determination of the best navigation heuristics for keeping track of moving targets has been addressed by several authors. LaValle et al. [17] describe a probabilistic algorithm that maximizes the probability of future observability of a moving target from a mobile robot. Gonzalez et al. [18] focus on the problem of tracking a moving target in the presence of obstacles. Tadokoro et al. [19] use a given probabilistic model of typical motion behaviors in order to predict future poses of the persons and to adapt the actions of the robot accordingly.
2.3 Navigation

The motion-generating algorithms can be separated into two major approaches:
the motion planners, which compute paths based on a given map of the environment, and the reactive algorithms, which control the movements of the robot based on its sensory input. Since our work uses a reactive method to perform collision avoidance, we will cover the similar sensor-based approaches.

Vector field histogram

In the Vector Field Histogram (VFH), developed by Borenstein et al. [20], an occupancy grid representation is used to model the robot's environment, and the cell values reflect the probability of obstacles. The motion direction and velocity of the robot are computed from the transformation of the occupancy information into a histogram description. This method was later enhanced to VFH+ in [21], adding information about the dynamics of the robot, resulting in smoother robot trajectories and greater reliability. This method has several limitations, such as the difficulty of passing through narrow areas (e.g. doors) and the local minima problem.

Dynamic window approach

The Dynamic Window approach was introduced by D. Fox et al. [22]. It is a method that takes into account the dynamic and kinematic constraints of a mobile robot by searching in a limited velocity space. The basic scheme involves finding the admissible velocities that allow the robot to stop before hitting an obstacle. Using this approach, robust obstacle avoidance behavior has been demonstrated at high velocity in [23]. This method was further expanded in [24] into the Global Dynamic Window, adding a minima-free function (using wave propagation) to the objective function, overcoming the local-minima problem present in many local reactive approaches.

Nearness Diagram Navigation

The Nearness Diagram navigation approach was introduced by J. Minguez and L. Montano [25] and is characterized by high-level information extraction and interpretation of the environment in order to generate the motion commands. This approach presented several advantages when compared to other well-known reactive methods, but it still lacked consideration of the kinematic constraints of the robot. The method was later improved with the introduction of the Ego-Kinematic Space in [26], incorporating kinematic constraints directly in the spatial representation. This improvement has the advantage of being applicable to several other methods that do not consider kinematic constraints, such as the Nearness Diagram and Potential Field based methods.
Collision avoidance using obstacle tracking information

A different methodology has been developed that considers object tracking information along with environment features in order to perform real-time dynamic obstacle avoidance, one of the weaknesses of other methodologies. The Velocity Obstacle, introduced in [27], consists of selecting avoidance maneuvers to avoid static and moving obstacles in the velocity space, based on the current positions and velocities of the robot and obstacles. This approach relies heavily on the accuracy of the object tracker. The avoidance maneuvers are generated by selecting robot velocities outside the Velocity Obstacles, to directly determine potential collisions. This method has been studied in several works such as [2, 28, 29], performing real-time person/obstacle tracking and adapting the subject's velocity to the obstacle. The Velocity Obstacle was recently improved by Large et al. [30], where the method was expanded to the Non-Linear Velocity Obstacle, proving it highly effective, yet dependent on the accuracy of the obstacle tracker.
Chapter 3 Overview

In this chapter we present the hardware made available to us, the rationale behind the choice of robotic development tools, and a description of the modules and their interaction.

3.1 Hardware

Our system is composed of a Laser Measurement System (LMS) mounted on an ActivMedia Pioneer 2-DX mobile robot, which also holds a laptop equipped with a Celeron 500 MHz processor and 128 MB of memory. The connection from the laser to the computer is made with an RS422-to-USB adapter so that higher speeds are achieved. The LMS performs planar range scans covering 180° with 0.5° angular resolution, for a total of 360 scan points, operating at frequencies of 5 Hz or 32 Hz with a maximum empirical error of 0.017 meters and an average error of 0.001 meters. The laser scan height is 0.31 meters, which is slightly below the knee height of a normal adult.

The Pioneer 2 DX is a differential-drive robot, whose movement is based on two wheels placed on each side of the body. It can thus turn on itself and go in any direction. We also have available a Philips webcam capable of captures at a resolution of 640 × 480 pixels at 30 Hz, but it is not currently used in this project. For a more detailed description of the hardware see Appendix B.
Figure 1: Our robot
3.2 Development tools

Instead of developing software from scratch, we searched for development tools for robot applications. In this section we present the candidate development tools we found, the motives for our choice and a small introduction to the chosen one.

3.2.1 Candidate development tools

The development tools should fulfill the following requisites:

• support for our hardware
• an accurate simulator
• an Open Source software license
• recording and playback of log files
• a clean application programming interface (API)

Three choices were taken into account:

• ARIA from ActivMedia Robotics
• CARMEN from Carnegie Mellon University
• Player from the University of Southern California

The only package that fulfilled all the requisites was Player.

3.2.2 Introduction to Player
Citing Player's website [5]:

"The Player/Stage project began at the USC Robotics Research Lab in 1999 to address an internal need for interfacing and simulation for mobile robotic systems (MRS)."

"Player is a device server that provides a powerful, flexible interface to a variety of sensors and actuators. Because Player uses a TCP socket-based client/server model, robot control programs can be written in any programming language and can execute on any computer with network connectivity to the robot."

"Stage is a scaleable multiple robot simulator; it simulates a population of mobile robots moving in and sensing a two-dimensional bitmapped environment, controlled through Player. Stage provides virtual Player robots which interact with simulated rather than physical devices."

3.3 Modules

The modules were developed in a dynamic link library model, known as plugin drivers; this way they do not need to be compiled directly into Player's kernel, thus allowing them to be modified and distributed easily. The plugin drivers run on the server (the laptop on top of the robot) and use Player's interfaces¹ for communication with clients. Both the feature detection and the people tracking modules use the fiducial interface, which was originally conceived to communicate spatial landmarks. The dynamic window module uses the position interface to communicate its position, velocities and goal destination. The guideline for the development of the modules is that they should be useful acting individually as well as a whole, without the redundancy of a bigger all-in-one module.

3.4 Interaction between modules
In this section we present the "big picture" of the interaction between the different modules. Some of the modules are not developed yet; they are represented only to demonstrate future possibilities. We proceed by explaining the interaction between the modules illustrated in Fig. 2.

3.4.1 Feature detection

The process begins with the feature detection driver importing range data from the LMS in order to find lines, arcs/circles and segments with a diameter similar to one or two legs. The reliable odometry driver imports lines, arcs and odometry data for correlation with data from previous instants, reducing the odometry drift error present in the dead reckoning of the robot. A new robot position interface with less error is then provided. Making use of the reliable odometry and of the feature detection we can create a landmark map useful for localization.

¹ sets of specific data structures, commands and configuration
3.4.2 People tracker

The people tracking driver is fed with reliable odometry and possible legs, so it can interpret the movement of possible legs in order to find patterns that resemble persons, and further track those persons.

3.4.3 Navigation

The navigation driver imports person states, confirmed legs and laser ranges in order to plan a trajectory that navigates safely, avoiding collisions and tracking people. Finally, the user can command the robot to go to a destination, follow a person or map an area.
Figure 2: Interaction between the developed player drivers
Chapter 4 Feature detection

This module is composed of two procedures: range segmentation and feature extraction. First the scan data segmentation creates clusters of neighboring points. These segments are then forwarded to the feature extraction procedure, where the following features are considered: circles, lines and people's legs.

4.1 Range Segmentation

Range segmentation is the grouping of consecutive scan points which, due to their proximity, probably belong to the same object. The segmentation criterion is based on the distance between two consecutive points Pi and Pi+1: points belong to the same segment as long as the distance between them is less than a given threshold. Isolated scan points are rejected. Because our laser readings have a maximum statistical error of 0.017 meters, as shown in [31], we do not use any error compensation scheme. A minimal sketch of this segmentation is given below.
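The following C++ sketch illustrates the segmentation just described. The Point type, the function name and the gap threshold value are illustrative assumptions, not the exact implementation; the later sketches in this chapter reuse this Point type.

#include <cmath>
#include <vector>

struct Point { double x, y; };

// Group consecutive scan points whose mutual distance is below a threshold;
// isolated points are rejected, as described above.
std::vector<std::vector<Point>> segmentScan(const std::vector<Point>& scan,
                                            double maxGap /* assumed, e.g. 0.10 m */) {
    std::vector<std::vector<Point>> segments;
    std::vector<Point> current;
    for (const Point& p : scan) {
        if (!current.empty()) {
            double dx = p.x - current.back().x;
            double dy = p.y - current.back().y;
            if (std::hypot(dx, dy) > maxGap) {   // gap found: close the segment
                if (current.size() > 1)          // reject isolated scan points
                    segments.push_back(current);
                current.clear();
            }
        }
        current.push_back(p);
    }
    if (current.size() > 1) segments.push_back(current);
    return segments;
}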
4.2 Circle detection

Circle detection uses a new technique we call the Internal Angle Variance
(IAV). This technique makes use of trigonometric properties of arcs: every point
in an arc has congruent angles (angles with the same measure) with respect to the extremes. An example of this property can be verified in Fig. 3: let P1, P4 be distinct points on a circle, and let P2, P3 be points on the circle lying on the same arc between P1 and P4. Then

m(∠P1 P2 P4) = m(∠P1 P3 P4)    (1)

This is because both angles measure one-half of m(∠P1 O P4).
Figure 3: The angles of points inside an arc are congruent with respect to the extremes
Figure 4: The angles inside a line are congruent with respect to the extremes and measure 180°

The detection of circles involves calculating the mean and standard deviation of these internal angles. Positive detection occurs with standard deviation values below 8.6° and mean aperture values between 90° and 135°. These values were tuned empirically to detect the maximum number of circles while avoiding false positives. The confidence of the analysis increases with three factors:

• the number of in-between points (more than 2)
• the apertures of the in-between points having a low standard deviation
• the average aperture angle being near 90°, which means that more of the circle is visible

Figure 5: Detection of embedded arcs

There are some particular cases and expansions that we will detail next:
4.2.1 Detecting lines

Line detection using IAV follows the same procedure as circle detection, but the mean of the internal angles is 180°, as can be seen in Fig. 4. A minimal sketch of the IAV test for both cases is given below.
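The following sketch illustrates the IAV test with the thresholds quoted above (standard deviation below 8.6°, mean aperture between 90° and 135° for circles, mean near 180° for lines); the exact tolerance around 180° is an assumption.

#include <algorithm>
#include <cmath>
#include <vector>

enum class Shape { Circle, Line, Unknown };

// Classify a segment by the statistics of the internal angles measured at
// each in-between point with respect to the segment extremes.
Shape classifyIAV(const std::vector<Point>& seg) {
    if (seg.size() < 5) return Shape::Unknown;   // need more than 2 in-between points
    const Point& a = seg.front();
    const Point& b = seg.back();
    std::vector<double> apertures;
    for (size_t i = 1; i + 1 < seg.size(); ++i) {
        double ax = a.x - seg[i].x, ay = a.y - seg[i].y;
        double bx = b.x - seg[i].x, by = b.y - seg[i].y;
        double c = (ax * bx + ay * by) /
                   (std::hypot(ax, ay) * std::hypot(bx, by));
        c = std::clamp(c, -1.0, 1.0);
        apertures.push_back(std::acos(c) * 180.0 / M_PI);  // internal angle (deg.)
    }
    double mean = 0.0;
    for (double v : apertures) mean += v;
    mean /= apertures.size();
    double var = 0.0;
    for (double v : apertures) var += (v - mean) * (v - mean);
    double stdev = std::sqrt(var / apertures.size());
    if (stdev < 8.6 && mean >= 90.0 && mean <= 135.0) return Shape::Circle;
    if (stdev < 8.6 && mean > 170.0) return Shape::Line;   // near 180 deg (tolerance assumed)
    return Shape::Unknown;
}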
4.2.2 Detecting arcs embedded in lines

Identifying the round corners of a wall is possible after standard line detection. This works because line detection excludes scan points previously identified as belonging to lines from circle detection. For this technique to work we cannot allow small polygons to be formed, which implies line detection with a minimum peak error and a minimum distance between endpoints. The idea is summarized in Fig. 5.

4.2.3 Detecting "arc-like" shapes
Suppose that we want to detect a tree trunk or an object that resembles a cylinder; this is possible by increasing the threshold on the standard deviation. Experimental tests demonstrated that less regular arcs can be found between 8.6° and 23.0° of standard deviation. When this acceptance threshold is increased, false positives can appear, and further analysis is required to isolate them; an example is the "V" shaped corner in Fig. 6. This case can be isolated by finding the point with the smallest aperture (the middle one in this case) and verifying whether four neighboring points have progressively bigger apertures; if this happens we have a negative detection.

Figure 6: The apertures of a "V" shaped corner grow around a local minimum

Once we are certain that we have detected a circle, all we need is to estimate its center and radius. From analytic geometry, we know that there is a unique circle passing through three points. For these points we use the extremes of the segment, P1 and P3, and the point whose aperture value is closest to the average, P2. Two lines can be formed through two pairs of the three points: the first passes through P1 and P2, the second through P2 and P3. The equations of these two lines are

ya = ma (x − x1) + y1    (2)
yb = mb (x − x2) + y2    (3)

where m is the slope of each line, given by

ma = (y2 − y1) / (x2 − x1)    (4)
mb = (y3 − y2) / (x3 − x2)    (5)
The center of the circle is the intersection of the two lines perpendicular to, and passing through the midpoints of, the lines P1P2 and P2P3. The perpendicular of a line with slope m has slope −1/m, thus the equations of the lines perpendicular to lines a and b and passing through the midpoints of P1P2 and P2P3 are

ya′ = −(1/ma) (x − (x1 + x2)/2) + (y1 + y2)/2    (6)
yb′ = −(1/mb) (x − (x2 + x3)/2) + (y2 + y3)/2    (7)
These two lines intersect at the center; solving for x gives

x = [ma mb (y1 − y3) + mb (x1 + x2) − ma (x2 + x3)] / [2 (mb − ma)]    (8)
The y value of the center is calculated by substituting the x value into one of the equations of the perpendiculars. The value of the radius is the distance from point P2 to the center of the circle. The data we keep from circles is the first and last indexes of the laser scan where the circle was detected, the Cartesian coordinates of the center, the radius, and the standard deviation. A sketch of this center estimation is given below.
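The following is a minimal sketch of the three-point fit, following Eqs. (2)-(8); the Circle type is illustrative and degenerate cases (collinear points, vertical chords) are not handled here.

struct Circle { double cx, cy, r; };

Circle circleFromThreePoints(const Point& p1, const Point& p2, const Point& p3) {
    double ma = (p2.y - p1.y) / (p2.x - p1.x);        // Eq. (4)
    double mb = (p3.y - p2.y) / (p3.x - p2.x);        // Eq. (5)
    double cx = (ma * mb * (p1.y - p3.y) + mb * (p1.x + p2.x)
                 - ma * (p2.x + p3.x)) / (2.0 * (mb - ma));   // Eq. (8)
    double cy = -(cx - (p1.x + p2.x) / 2.0) / ma
                + (p1.y + p2.y) / 2.0;                // substitute into Eq. (6)
    double r = std::hypot(p2.x - cx, p2.y - cy);      // radius: distance to P2
    return Circle{cx, cy, r};
}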
4.2.4 Circle prerequisites

To avoid analyzing all segments for circles, each segment must first satisfy a geometric condition: the middle point must lie inside an area delimited by two lines parallel to the line segment formed by the extremes of the segment, as can be seen in Fig. 7. To make this validation simpler, it is preceded by a rotation around the x axis. The full operation is described by the following equations:

angle = arctan((leftx − rightx) / (lefty − righty))    (9)
Figure 7: Condition that selects segments for circle analysis

point = x cos(angle) − y sin(angle)    (10)
diameter = d(leftmost, rightmost)    (11)
−0.7 × diameter < point < −0.1 × diameter    (12)

where angle is the angle to rotate (robot facing the x axis), left and right are the Cartesian coordinates of the left and right extremes of the segment, d is a Cartesian distance function and diameter is the distance between the leftmost and rightmost points. Note that the delimiting zone is adjusted to include circles of small sizes, where the SNR of the laser is small. When detecting big circles it is possible to shorten this limit for better filtering of the segments to analyze. A minimal sketch of this test is given below.
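A minimal sketch of the pre-filter, following Eqs. (9)-(12) literally; using atan2 (rather than a plain arctangent) to keep the quadrant is an assumption about the original implementation.

bool circleCandidate(const std::vector<Point>& seg) {
    const Point& left = seg.front();
    const Point& right = seg.back();
    double angle = std::atan2(left.x - right.x, left.y - right.y);      // Eq. (9)
    const Point& mid = seg[seg.size() / 2];
    double point = mid.x * std::cos(angle) - mid.y * std::sin(angle);   // Eq. (10)
    double diameter = std::hypot(left.x - right.x, left.y - right.y);   // Eq. (11)
    return point > -0.7 * diameter && point < -0.1 * diameter;          // Eq. (12)
}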
4.3 Leg detection

The proposed leg detection methodology assumes the following about legs:

• they occupy 2 or more laser scan points
• a leg diameter has a maximum value of 0.2 meters
• legs can be partially occluded, so a minimum diameter of 0.04 meters is set
• if the segment is occluded from both sides, we do not consider it

Sometimes segments get classified both as legs and as circles; because our process is not mutually exclusive, that is acceptable. The data we keep from leg detection are the first and last indexes of the laser scan where the leg was detected, and the Cartesian coordinates of the middle point of the segment. A minimal sketch of these checks is given below.
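A minimal sketch of the leg test; the occlusion flags are assumed to come from the segmentation step, and the names are illustrative.

bool isLegCandidate(const std::vector<Point>& seg,
                    bool occludedLeft, bool occludedRight) {
    if (seg.size() < 2) return false;                  // 2 or more scan points
    if (occludedLeft && occludedRight) return false;   // occluded from both sides
    double d = std::hypot(seg.back().x - seg.front().x,
                          seg.back().y - seg.front().y);
    return d >= 0.04 && d <= 0.2;                      // diameter limits in meters
}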
4.4 Line detection

The line detection procedure uses the Recursive Line Fitting algorithm, which tries to fit lines to the segment being analyzed. The algorithm operates as follows:

1. Obtain the line passing through the two extreme points
2. Obtain the point most distant from the line
3. If the fitting error is above a given threshold, split (where the greatest error occurred) and repeat on the left and right sub-scans

The fitting function is the Orthogonal Regression of a Line, as demonstrated in [32] and explained in more detail in Appendix C. This approach tries to find the "principal directions" of the set of points. An example of the process can be seen in Fig. 8. The recursive process break conditions are:

• too few points to analyze
• distance between extremes outside the range specified in the configuration parameter
• fitting error under the threshold
To avoid the creation of small polygons, three rules were set:

• more than 5 points in each analyzed segment
• distance between extremes greater than 0.1 meters
• maximum fitting error of 0.02 meters

A minimal sketch of the recursive splitting is given below. For each line we keep the first and last indexes of the laser scan where the line was detected, the slope, the bias, the number of points analyzed and the fitting error returned.
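A sketch of the recursive splitting using the thresholds above. For brevity the fit here is the line through the segment extremes, not the orthogonal regression of Appendix C.

struct Line { Point a, b; };

void fitLines(const std::vector<Point>& pts, size_t lo, size_t hi,
              std::vector<Line>& out) {
    if (hi - lo + 1 <= 5) return;                      // too few points
    const Point& a = pts[lo];
    const Point& b = pts[hi];
    double len = std::hypot(b.x - a.x, b.y - a.y);
    if (len <= 0.1) return;                            // extremes too close
    size_t worst = lo;
    double maxErr = 0.0;
    for (size_t i = lo + 1; i < hi; ++i) {             // most distant point
        double err = std::fabs((b.x - a.x) * (a.y - pts[i].y) -
                               (a.x - pts[i].x) * (b.y - a.y)) / len;
        if (err > maxErr) { maxErr = err; worst = i; }
    }
    if (maxErr <= 0.02) { out.push_back({a, b}); return; }  // fit accepted
    fitLines(pts, lo, worst, out);                     // split at greatest error
    fitLines(pts, worst, hi, out);
}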
Figure 8: The process of Recursive Line Fitting: a) principal directions, b) first line segment identified, c) principal direction of second segment, d) both lines detected
Chapter 5 People tracking

This chapter describes our approach to the identification and tracking of people.

5.1 Requisites of the people tracker

To handle our test scenarios we established four requisites:

• handle temporary occlusions of persons
• be robust in crowded places
• handle partial occlusions of already classified persons
• recognize two legs in one single segment

5.2 Methodology

This module receives leg features from the feature detection driver and interprets them to compose and track persons. It can be seen as two main procedures:

1. Associate leg features from different samples
2. Recognize and track people
Figure 9: The tracker requisites are: handle temporary occlusions of persons (a), be robust in crowded places (b), handle partial occlusions of already classified persons (c), recognize two legs in one single segment (d)

5.2.1 Closest pairs
The closest pairs algorithm [33] is responsible for associating data, normally to keep track of legs and persons, and also for creating new persons by grouping pairs of nearby legs. The algorithm takes advantage of the way people walk: when a person walks, one leg moves fast in the direction of the torso while the other just bends forward without lifting the foot, and thus moves only slightly. Using closest pairs, the updating process resolves from the objects that moved less to the objects that moved more; see Fig. 10. A minimal sketch of the association step is given below.
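In this sketch, candidate pairs are sorted by distance and matched greedily, so the objects that moved less are resolved first, as described above; the gating threshold and all names are illustrative assumptions.

#include <algorithm>

struct Match { size_t oldIdx, newIdx; double dist; };

std::vector<Match> closestPairs(const std::vector<Point>& oldPos,
                                const std::vector<Point>& newPos,
                                double maxDist) {
    std::vector<Match> all;
    for (size_t i = 0; i < oldPos.size(); ++i)
        for (size_t j = 0; j < newPos.size(); ++j) {
            double d = std::hypot(newPos[j].x - oldPos[i].x,
                                  newPos[j].y - oldPos[i].y);
            if (d <= maxDist) all.push_back({i, j, d});   // "too far, no match"
        }
    std::sort(all.begin(), all.end(),
              [](const Match& a, const Match& b) { return a.dist < b.dist; });
    std::vector<bool> oldUsed(oldPos.size()), newUsed(newPos.size());
    std::vector<Match> result;
    for (const Match& m : all)
        if (!oldUsed[m.oldIdx] && !newUsed[m.newIdx]) {   // nearest unclaimed pair
            oldUsed[m.oldIdx] = newUsed[m.newIdx] = true;
            result.push_back(m);
        }
    return result;
}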
5.2.2 Data structures
To manage the information we defined the following objects:
Figure 10: Closest pairs example

possible two legs: a segment imported from the fiducial detection module that has a diameter similar to two legs. It will be further analyzed if its centroid is near a person.

new segment: a segment imported from the fiducial detection module that has a diameter similar to one leg. It is composed of one state variable, representing the segment centroid.

old segment: a segment that did not belong to any leg candidate or person leg in the previous instant. It is matched with new segments to form leg candidates. It is composed of one state variable, representing the previous segment centroid.

leg candidate: possibly a leg, only waiting for another nearby leg candidate to be promoted to person leg. The object is initialized as static, but if the distance between the initial and the current positions is greater than a given threshold, the object is classified as dynamic. It is composed of three state variables, representing the initial, the previous and the current positions.
person leg: its structure is identical to a leg candidate, but it actually belongs to a person. It contains a pointer to the person that owns it.

person: composed of four state variables, representing the initial, previous, actual and estimated positions. Each person contains two pointers to its person legs, or to void (if the corresponding person leg was not observed). It also contains a counter of the number of instants the person was not observed. The person's future position is estimated as follows:

posek = (leg1 + leg2) / 2    (13)
Δ = posek − posek−1    (14)
posek+1 = posek + Δ    (15)

where posek is the current position and posek+1 is the estimate of the future position. Object management is achieved using lists; we have a list for every class of objects. A minimal sketch of this prediction is given below.
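A minimal sketch of the constant-velocity prediction of Eqs. (13)-(15); the Person fields mirror the state variables described above, with illustrative names.

struct Person {
    Point prev, curr;                           // pose(k-1) and pose(k)
    void updateFromLegs(const Point& leg1, const Point& leg2) {
        prev = curr;
        curr = Point{(leg1.x + leg2.x) / 2.0,   // Eq. (13): midpoint of the legs
                     (leg1.y + leg2.y) / 2.0};
    }
    Point predict() const {                     // Eqs. (14)-(15): pose(k) + delta
        return Point{curr.x + (curr.x - prev.x),
                     curr.y + (curr.y - prev.y)};
    }
};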
5.2.3 The people tracking algorithm
Here is a description of the functions involved in the tracking, in the order they happen:

Lists and counters maintenance: remove old segments and possible two legs. Rename new segments to old segments. Increase the persons' observation counters.

Update new segments list: import new segments and possible two legs. Convert the possible two legs segments to new segments if they are near any person.

Leg match in time: update the positions of the persons' legs and then of the leg candidates with their closest pairs in the new segments list.

Leg cleanup: remove leg candidates and person legs that were not observed.

Create new leg candidates: associate closest pairs of old segments and new segments to create new leg candidates.

Update/recover persons: update (if two person legs are visible) or estimate (if only one or no person legs are visible) the current position of every person. If the person was not seen for some time, remove it. See Fig. 11 for more details.

Create new persons: find closest pairs of leg candidates that do not belong to persons and create new persons with them.

Fill fiducial: fill the fiducial list and send it to the Player client for visualization.
Require: persons ≥ 1
for all persons do
    L ⇐ number of visible legs
    if L = 2 then
        D ⇐ distance between this person's legs
        if D is normal then
            update person
        else if D is far and only one leg moves then
            update person with moving leg
        else
            remove the person
        end if
    else if L = 1 then
        approximate the person position to the visible leg position
    else if L = 0 then
        update person position with the previous estimate, without resetting the person-seen counter
    end if
    if person wasn't seen for some time then
        remove it
    end if
end for
Figure 11: update/recover persons algorithm
Chapter 6 Navigation

6.1 Introduction

One of the major objectives of mobile robotics is to build robots that can perform missions in cluttered environments. The system must be able to sense its environment, react to sudden changes and reorganize itself dynamically, so it can carry out its mission in spite of the adverse changes that may appear. Simmons, in [34], elaborated a list of guidelines that modern collision avoidance systems should comply with:

• The robot should navigate safely, even in the face of noisy sensors and dead-reckoning error.
• The robot should be goal-directed while trying to avoid obstacles.
• The method must be computationally efficient, to run in real time on board the robot.
• The dynamics of the robot should be taken into account, to enable it to travel at high speeds in crowded environments.
• The method should explicitly try to maximize the forward progress of the robot.
• The method should simultaneously control both the direction and the speed of the robot.

The motion-generating algorithms for mobile robotics can be divided into planning algorithms and obstacle-avoidance algorithms. While the planning algorithms consider a map of the surrounding environment to compute a path from the robot's current position to the goal, the obstacle-avoidance algorithms usually use sensory information to determine the movements that will prevent collisions with the obstacles in the environment. For most applications in mobile robotics the environment is partially or completely unknown and changes with time. Under such circumstances, the trajectories generated by planning algorithms become inadequate and re-planning is necessary to reach the goal. Therefore, in this kind of environment, real-time obstacle avoidance algorithms tend to be much more effective. Purely reactive obstacle avoidance, however, may not result in the behavior required to accomplish the robot's task, due to the possible lack of available space and the dynamic and kinematic constraints of the robot. This chapter covers this particular aspect of the design of a robotic system: the safe execution of a collision-free motion to a goal position. In order to achieve this, we propose an extension to the Dynamic Window approach.

Design rationale

Our collision avoidance system has several requisites: it has to be a local reactive system, it has to be compatible with our development platform (the Pioneer 2DX robot with a SICK LMS200 laser range finder and the Player software), and it has to take into account the dynamic and kinematic capabilities of the robot. Although there is a wide range of techniques and approaches that perform collision avoidance, due to these requisites only two of the previous approaches were taken into consideration: the dynamic window and the velocity-obstacle (VO). One of the particularities of the VO approach is the incorporation of the object-tracking information directly in the velocity space. It is a Cartesian-based system, meaning that the velocity commands have to be given as Vx and Vy (Cartesian), not as (v, w). Our robotic platform only accepts velocity commands based on the translational v and rotational w velocities, and although it would be possible to perform a Cartesian to (v, w) approximation, it would imply an undesired calculation overhead in each velocity-search iteration. The Dynamic Window has the advantage of working natively in the (v, w) velocity space, thus being the chosen approach for our system.
6.2 The Dynamic Window Approach

In this work, an implementation of the Dynamic Window approach is used to perform real-time reactive collision avoidance. This method was originally introduced in [22] and later expanded in [24]. It is a method that takes into account the kinematic and dynamic constraints of a differential-drive robotic platform, derived from the dynamic equations of mobile robots. The kinematic and dynamic constraints are taken into account by directly searching the velocity space of the robot. The search space is the set of tuples (v, w) of translational velocities v and rotational velocities w that are achievable by the robot within a certain time interval. Among all velocity tuples, those that allow the robot to stop before colliding with an obstacle are selected, given the current position, velocity and acceleration capabilities of the robot. These velocities are called the admissible velocities. Restricting the search to a dynamic window further reduces the search space in accordance with the dynamic limitations of the robot. The dynamic window contains those velocities that can be achieved by the robot, given its current velocity and its acceleration, within a given time interval. This time interval corresponds to a servo tick of the control loop. The dynamic window is a rectangle, since the acceleration capabilities for translation and steering are independent. To determine the next motion command, all admissible velocities within the dynamic window are considered. Among those, the velocity is chosen that maximizes the alignment of the robot with the target and the length of the trajectory until an obstacle is reached.

6.2.1 Dynamic velocity search space
Circular trajectories

As mentioned in Appendix D, it is possible to approximate the trajectory of a differential-drive robot by a sequence of circular arcs, known as curvatures, each determined by a velocity vector (vi, wi). For the generation of a trajectory to a goal point over the next n time intervals, the robot has to determine the velocities (vi, wi), one for each of the n time intervals [t0..tn]. The curvatures have to be carefully chosen, in order to avoid any possible obstacle while still maintaining the heading towards the goal whenever possible.

Figure 12: Circular trajectories of a mobile robot

Admissible velocities

Obstacles close to the robot impose restrictions on the rotational and translational velocities the robot is able to reach. The maximum admissible velocity on a certain curvature depends on the distance to the next obstacle on that curvature. Therefore, for a velocity defined by (v, w), the function dist(v, w) represents the distance to the closest obstacle on the corresponding curvature. A velocity is considered admissible if the robot is able to stop before it reaches the obstacle. Let v̇b and ẇb be the accelerations available for braking. Then the set Va of admissible velocities is defined by:
Va = {(v, w) | v ≤ √(2 · dist(v, w) · v̇b) ∧ w ≤ √(2 · dist(v, w) · ẇb)}    (16)
Dynamic window

In order to take into account the limited accelerations executable by the motors, the search space is reduced to the dynamic window, which contains only the velocities that can be reached within the next time interval. Let t be the time interval during which the accelerations v̇ and ẇ are applied, and (va, wa) the actual velocity; the dynamic window Vd is defined by:

Vd = {(v, w) | v ∈ [va − v̇ · t, va + v̇ · t] ∧ w ∈ [wa − ẇ · t, wa + ẇ · t]}    (17)
The dynamic window Vd is centered around the current velocity, and its dimensions depend on the accelerations v̇ and ẇ that can be exerted during the time interval t. All the curvatures located outside the dynamic window cannot be reached within the next time interval and thus are not considered by the algorithm.

Resulting search space

The restrictions on the velocity space described above result in the velocity space Vr pictured in Fig. 13. Let Vs be the velocity space, Va the admissible velocities and Vd the dynamic window; then Vr is the intersection of those limited velocities, namely:

Vr = Vs ∩ Va ∩ Vd    (18)
Fig. 13 illustrates the resulting velocity space in a typical indoor scenario, where the robot (in the center of the Vr window) is approaching a narrow corridor, resulting in a partially "forbidden" zone on the left side of the resulting window. A minimal sketch of the window computation is given below.
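A minimal sketch of this computation: the dynamic window of Eq. (17) clipped against the global limits Vs, plus the admissibility test of Eq. (16). The dist stub and all names are illustrative.

struct VelWindow { double vMin, vMax, wMin, wMax; };

double dist(double v, double w);   // distance to the closest obstacle on the curvature

VelWindow dynamicWindow(double va, double wa, double vdot, double wdot, double t,
                        double vLo, double vHi, double wLo, double wHi) {
    return VelWindow{std::max(vLo, va - vdot * t),   // Eq. (17) clipped to Vs
                     std::min(vHi, va + vdot * t),
                     std::max(wLo, wa - wdot * t),
                     std::min(wHi, wa + wdot * t)};
}

bool admissible(double v, double w, double vdotB, double wdotB) {
    double d = dist(v, w);                           // Eq. (16): can the robot stop?
    return v <= std::sqrt(2.0 * d * vdotB) &&
           std::fabs(w) <= std::sqrt(2.0 * d * wdotB);
}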
Figure 13: Resulting search space

6.2.2 The objective function

After the determination of the resulting search space Vr, the algorithm proceeds to the selection of the velocity. In order to include the goal heading, clearance and velocity criteria, the objective function O(v, w) is created and the velocity tuple (v, w) that maximizes it is determined:

O(v, w) = α · heading(v, w) + β · distance(v, w) + γ · velocity(v, w)    (19)
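A sketch of the search for the maximizing tuple, building on the window sketch above; the three criterion functions are left as stubs (they are detailed in the remainder of this section) and the grid step is an assumption.

#include <limits>
#include <utility>

double headingScore(double v, double w);    // normalized heading(v, w)
double distanceScore(double v, double w);   // normalized distance(v, w)
double velocityScore(double v, double w);   // normalized velocity(v, w)

std::pair<double, double> bestVelocity(const VelWindow& vr,
                                       double alpha, double beta, double gamma,
                                       double vdotB, double wdotB,
                                       double step) {
    double best = -std::numeric_limits<double>::infinity();
    std::pair<double, double> cmd{0.0, 0.0};
    for (double v = vr.vMin; v <= vr.vMax; v += step)
        for (double w = vr.wMin; w <= vr.wMax; w += step) {
            if (!admissible(v, w, vdotB, wdotB)) continue;  // outside Va
            double o = alpha * headingScore(v, w) +
                       beta * distanceScore(v, w) +
                       gamma * velocityScore(v, w);         // Eq. (19)
            if (o > best) { best = o; cmd = {v, w}; }
        }
    return cmd;
}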
Target heading

The function heading(v, w) measures the predicted alignment of the robot with the goal. To determine the predicted position, we assume that the angular velocity of the robot is constant during the next time interval. Let θ be the current robot orientation, dt the sampling time and w the current angular velocity; then θaux is the predicted rotation for the next moment, as pictured in Fig. 14.
Figure 14: Calculation of the θ angle for the predicted position
θaux = θodom + dt · w    (20)
Then, using two auxiliary variables xaux and yaux and multiplying them by a rotation matrix, we get the following predicted x and y values:
xaux = xodom + (v/w) · [cos(w · dt) · sin(θaux) + sin(w · dt) · cos(θaux) − sin(θaux)]    (21)
yaux = yodom + (v/w) · [sin(w · dt) · sin(θaux) − cos(w · dt) · cos(θaux) + cos(θaux)]    (22)
Then, by calculating the arctangent between (xgoal, ygoal) and the predicted (xaux, yaux), we get the θ value for every v and w contained in the dynamic window.

Distance to obstacles

The function dist(v, w) represents the distance to the closest obstacle that intersects the curvature defined by the tuple (v, w). If no obstacle is found on that curvature, the function returns a large constant. The length of each considered radius is calculated by Eqs. 23-25, applied to the ranges returned by the laser range finder.
a = laser[angle] · cos(angle)    (23)
b = laser[angle] · sin(angle)    (24)
radius = (1/2) · (b²/a + a)    (25)
The radius of the curvature corresponding to a velocity tuple is given by Eq. 26.

radius = vi / wi    (26)
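A sketch of this matching, following Eqs. (23)-(26) literally; the tolerance, the large constant and the handling of near-zero w are assumptions, and the degenerate case a ≈ 0 is not handled.

double distOnCurvature(const std::vector<double>& ranges, double angleStepRad,
                       double v, double w,
                       double tol, double farAway) {
    if (std::fabs(w) < 1e-6) return farAway;        // straight path handled apart
    double target = v / w;                          // Eq. (26)
    for (size_t i = 0; i < ranges.size(); ++i) {
        double angle = i * angleStepRad;
        double a = ranges[i] * std::cos(angle);     // Eq. (23)
        double b = ranges[i] * std::sin(angle);     // Eq. (24)
        double r = 0.5 * (b * b / a + a);           // Eq. (25)
        if (std::fabs(r - target) < tol)
            return ranges[i];                       // obstacle on this curvature
    }
    return farAway;                                 // no obstacle: large constant
}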
The calculation is made by scanning all the considered v and w in the dynamic window, using Eq. 26 to find a match among the radii returned by Eq. 25. Due to the small dimension of the resulting Vr and the discretization of the laser readings, the amount of calculation in each iteration of the algorithm is reduced to a minimum.

Velocity

The function velocity(v, w) is used to evaluate the progress of the robot on the corresponding trajectory. This function is of primordial importance when used in conjunction with the parameter γ, regulated by well-chosen context-related heuristics, which result in acceleration/deceleration behaviors that vary in narrow or wide spaces, for example.

Normalization and parameter tuning

All three components of the objective function are normalized to [0, 1] and calculated with appropriate values of α, β and γ, depending on the desired behavior for the current location. As an example, Fig. 15 pictures a possible algorithm for indoor behavior. Two different behaviors are set: Head for clearance and Head for goal.
behaviour ⇐ {head for goal, head for clearance}
if behaviour = head for goal then
    α ⇐ 0.8, β ⇐ 0.2, γ ⇐ 0.1
else if behaviour = head for clearance then
    α ⇐ 0.2, β ⇐ 0.8, γ ⇐ 0.1
end if
if front_dist < threshold or side_dist < threshold then
    γ ⇐ −0.1
end if
if behaviour = head for goal and goal_dist < threshold then
    γ ⇐ −0.1
end if
if front_dist < threshold_stop then
    γ ⇐ −0.2
end if
Figure 15: Example of an algorithm for indoors behavior

In a wide space, the priority is given to the heading, so α assumes a value much higher than the other parameters. If the robot is traversing a narrow corridor, the priority is given to the maximization of clearance, so priority is given to the parameter β, while the parameter γ is tuned to induce deceleration. In the Head for goal behavior, a deceleration is exerted when the goal is near. In this example a heavy deceleration is exerted when the front distance is below a threshold. The available configuration parameters are described in Table 1.
Table 1: Available configuration parameters

Name       Default        Meaning
angledisc  10             laser discretization for radius fitting (deg.)
vdotmax    500            maximum translational acceleration (mm/s²)
wdotmax    300            maximum angular acceleration (deg./s²)
vini       0              minimum possible translational velocity (mm/s)
vend       1000           maximum possible translational velocity (mm/s)
wini       -90            minimum angular velocity (deg./s)
wend       90             maximum angular velocity (deg./s)
robotrad   500            radius of the robot (mm)
v          100            initial translational velocity (mm/s)
w          0              initial angular velocity (deg./s)
goaldist   500            minimum goal distance until the robot stops (mm)
goalpath   goal1..goaln   goal array, defined in the command prompt (mm)
Chapter 7 Visualization tool

To test the accuracy of our system, a visualization tool named PlayerGL was developed in OpenGL [35]. PlayerGL has the following capabilities:

• two types of camera views (plan and projection) with full freedom of movement
• on-the-fly configuration of the laser and fiducial devices
• two grid types, polar and Cartesian
• screen logging of detected features

PlayerGL is an invaluable tool for testing, comparing and tuning the presented algorithms.
Chapter 8 Experimental results and discussion

The experiments were carried out with the LMS performing planar range scans covering 180° with 0.5° angular resolution, for a total of 360 points, operating at frequencies of about 5 Hz with a maximum empirical error of 0.017 meters and an average error of 0.001 meters. The laser scan height is 0.31 meters, which is slightly below the knee height of a normal adult. Some screen captures of PlayerGL are provided; in them, our robot is represented as the black figure with a blue square (representing the laser). The Euclidean grid unit is meters. The algorithms run on a Celeron 500 MHz laptop and the visualization runs on a client.

8.1 Feature detection

8.1.1 Tests performed
To illustrate the accuracy of our methods, two tests are shown: one with the robot moving in a corridor with many lines to detect, showing the possibilities of SLAM; and another for arc/circle and leg detection, showing the possibilities of scene interpretation.

Dynamic test

In this test the robot starts with a 180° rotation, then follows the corridor for about 10 meters, with some students appearing in the laser scan while we search for lines. The snapshot in Fig. 16 shows lines perfectly detected in red. Leg detection is represented in green to demonstrate the possibilities of people tracking and of excluding leg segments from SLAM trackers for noise removal.
Figure 16: Lines detected (in red) in a real test scenario
Static test
This test occurred in a hallway entrance, with some architectonic entities: two benches, two columns, walls and corners. Benches are formed by two cylinders with a radius equal to the column radius. In this test a student pulls a perfectly round cylinder (with a radius equal to the columns') mounted on a wheeled platform around two columns. Without being planned, a female student (wearing jeans) passed by the lower right corner. One challenge is to detect the mobile cylinder correctly along the whole course; right now many false positive benches occur when the mobile cylinder is at a distance from other cylinders equal to the bench distance.

Figure 17: A screen capture of PlayerGL at the end of our feature detection test. Circle detections of the mobile cylinder are the green circumferences; legs are purple circles. Interpretation of walls, columns and benches is also achieved.

The results in Fig. 17 show that the circle detection (green circumferences) was accurate, and no objects in the background were misdetected as circles. Scene interpretation is done when possible, so that the architectonic entities present (walls, corners, columns, benches) are also visualized. Detected legs are represented as the tiny purple circles. It is possible to see the tracks of the student pulling the mobile cylinder as well as the girl passing in the corner. If the environment is not crowded those tracks should suffice for following a person.

8.1.2 Benchmarks
Looking for the worst-case scenario, we timed the algorithms during the above tests; the results should give a good idea of the expected performance in normal conditions. We measured times by iteration (overall time to analyze all the segments) and by individual segment.

Table 2: Times of feature detection by iteration
                               Dynamic (s)   Static (s)
Lines    Mean                  0.001125      0.001050
         Standard deviation    0.000237      0.000047
         Maximum               0.001940      0.001201
Circles  Mean                  0.000362      0.000846
         Standard deviation    0.000148      0.000046
         Maximum               0.000628      0.001082
Totals   Mean                  0.001667      0.002064
         Standard deviation    0.000318      0.000086
         Maximum               0.002756      0.002281

Table 3: Times of feature detection by segment
                               Dynamic (s)   Static (s)
Lines    Mean                  0.000007      0.000029
         Standard deviation    0.000013      0.000092
         Maximum               0.000170      0.000998
Circles  Mean                  0.000005      0.000004
         Standard deviation    0.000005      0.000006
         Maximum               0.000145      0.000120
The totals in Table 2 consist of the sum of the times for the whole process (lines, circles, segmentation and leg detection). In the static test of Fig. 18 the timings are mostly stable, the lines taking a fraction more time than the circles. In the dynamic test of Fig. 19 we notice a bigger instability in the values, mainly for lines, because of the changing scenario. The difference in timings between lines and circles is due to the circle pre-conditions, which spare a lot of duty cycles of the ALU (Arithmetic Logic Unit); this is evident until approximately iteration 30.
Figure 18: Plot of total times of the algorithms in the static test
Figure 19: Plot of total times of the algorithms in the dynamic test
The two peaks in the total time are due to Player being a multithreaded application that is also running over a wireless connection.
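For reference, the sketch below shows one way such per-iteration timings can be collected around the feature-extraction calls. The routine names are placeholders rather than the project's actual functions, and std::chrono stands in for whatever timer the original code used.

#include <chrono>

// Stand-ins for the feature-detection routines; names are ours.
struct Scan { /* one laser scan */ };
void extract_lines(const Scan&);
void extract_circles(const Scan&);

// Time one iteration (all segments of one scan) in seconds, as in
// Table 2. Per-segment times (Table 3) would wrap the inner calls
// the same way.
double time_iteration(const Scan& scan) {
    auto t0 = std::chrono::steady_clock::now();
    extract_lines(scan);    // recursive line fitting
    extract_circles(scan);  // IAV arc/circle detection
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double>(t1 - t0).count();
}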
8.2 People tracking tests

We used the static scenario to test the person tracking algorithm. The
student begins to move from the point labeled "start pose" until reaching the "end pose", all while pulling the mobile cylinder.

8.2.1 Leg matching
Leg tracking imports the leg features from Fig. 17 and associates them using closest pairs. The result of this data association can be seen in Fig. 20. The purple circles represent the leg objects and the red line between them their movements. The label "leg occlusion" demonstrates that the system survived a discontinuity in the leg association. The label "leg switch" shows an exchange of legs: the "right" leg became the "left".
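A minimal sketch of the kind of closest-pair association used here is given below. It greedily links each previously tracked leg to the nearest unclaimed new detection within a gating radius; the structure names and the gating parameter are illustrative, not taken from the driver's source.

#include <cmath>
#include <vector>

struct Leg { double x, y; int id; };

// Greedy closest-pair association between the legs tracked in the
// previous scan and the legs detected in the current one. Each track
// adopts the nearest unclaimed detection within max_radius (mm).
void associate_legs(std::vector<Leg>& tracks,
                    const std::vector<Leg>& detections,
                    double max_radius) {
    std::vector<bool> claimed(detections.size(), false);
    for (Leg& t : tracks) {
        double best = max_radius;
        int best_j = -1;
        for (size_t j = 0; j < detections.size(); ++j) {
            if (claimed[j]) continue;
            double d = std::hypot(detections[j].x - t.x,
                                  detections[j].y - t.y);
            if (d < best) { best = d; best_j = static_cast<int>(j); }
        }
        if (best_j >= 0) {              // update the track in place
            t.x = detections[best_j].x;
            t.y = detections[best_j].y;
            claimed[best_j] = true;
        }                               // otherwise the leg is occluded
    }
}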
8.2.2 Person tracking
The red line represents the centroid of the person. The label "leg occlusion" shows some error (blue line) due to estimating the centroid of the person with just one leg instead of two. The labels "occlusion by a dragged cylinder" and "occlusion by column" show recoveries from occlusion.
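The centroid estimate behind this error is straightforward, as the sketch below illustrates; the names are ours. With both legs visible the person position is the midpoint of the legs, and with a single visible leg the estimate collapses to that leg's position, which explains the deviation seen under "leg occlusion".

// Person position estimated from the visible legs. With two legs the
// centroid is their midpoint; with one leg the estimate degrades to
// that leg alone, producing the error visible in Fig. 21.
struct Point { double x, y; };

Point person_centroid(const Point* left, const Point* right) {
    if (left && right)
        return { (left->x + right->x) / 2.0,
                 (left->y + right->y) / 2.0 };
    const Point* only = left ? left : right;  // assumes one leg visible
    return *only;
}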
8.3 Navigation

Extensive tests were carried out in a wide range of simulated scenarios using the Stage simulator, covering conditions bound to be found in everyday environments, such as corridors, hallways and living rooms. The algorithm was also implemented on a Pioneer 2 DX robot and tested in the corridors of the University of Algarve with positive results.
Figure 20: Association of legs at the end of the test. The labels mark the start pose, end pose, leg occlusion, leg switch, occlusions by columns and the occlusion by the dragged cylinder.

Figure 21: Tracking of the person at the end of the test, with the same labels as Fig. 20.
8.3.1 Simulated experiments

Fig. 22 illustrates the variation in velocity of the robot, defined by the tuple (v, w), while heading towards a goal. In this experiment the maximum translational velocity is 1 m/s and the angular velocity is bound within the interval [-90°/s, 90°/s]. The robot starts by turning at the maximum rotational velocity in order to maximize the heading towards the goal, while accelerating. Then an obstacle avoidance maneuver occurs, implying turning and deceleration. After the obstacle is avoided, the robot accelerates to the maximum velocity, while resuming the maximum heading towards the goal. When the goal is near, the translational velocity decreases, in order to stop at the correct coordinates.

In the next simulated experiment, pictured in Fig. 24, the robot is set to a goal located at the end of a corridor with obstacles and varying width. Again the maximum translational velocity is 1 m/s and the angular velocity is bound within the interval [-90°/s, 90°/s]. The robot keeps the maximum velocity until the narrow section of the corridor, where a deceleration occurs. The corridor is traversed at low translational speed (0.4 m/s). Near the end of the corridor, the robot accelerates as the walls start to get wider. When the obstacles are found, some obstacle avoidance maneuvers occur, with variations in angular velocity and a decrease in translational velocity in order to avoid each obstacle. After the obstacles are avoided, the robot heads towards the goal, with a decrease in translational velocity when the goal is near.

8.3.2 Real tests

After successful experiments in the simulator, the collision avoidance system was implemented on a Pioneer 2 DX robot and tested indoors at the University of the Algarve.
Figure 22: Evolution of velocities v, w heading towards a goal (annotated with instantaneous velocities along the path from start to goal)

Figure 23: Evolution of odometry and velocities vs. time
Figure 24: Collision avoidance while heading towards a goal (annotated with instantaneous velocities along the corridor from start to goal)

Figure 25: Evolution of odometry and velocities vs. time
The experiment, pictured in Fig. 26, consisted of placing the robot at one end of a corridor with the goal set 5 m in front, at the other end of the corridor. Some obstacles were placed in front of the robot, in order to test the collision avoidance system. In this experiment, the maximum velocity is 0.5 m/s and the angular velocity is bound within the interval [-90°/s, 90°/s]. Several collision avoidance maneuvers occurred, implying some decelerations, yet the robot managed to run at an average speed of 0.30 m/s with a maximum speed of 0.5 m/s. Some oscillation in the angular speeds was detected, due to the fact that a differential drive robot is very sensitive to the relative velocity of each wheel and to slight variations in the ground plane; therefore the robot had to perform a number of angular velocity corrections in order to keep heading towards the goal direction.

Analysing the translational speed, as pictured in Fig. 27, the robot accelerates until it reaches an obstacle (7 s), then a collision avoidance maneuver occurs, implying a deceleration (9 s); the robot then accelerates slightly until it reaches the second obstacle (12 s), when a major deceleration occurs due to the third obstacle. After the obstacles are avoided, the robot accelerates until the goal is relatively near. Then a deceleration occurs in order to stop within the distance specified by the goaldist parameter. The robot succeeded in reaching the minimum distance to the goal specified in the goaldist parameter (set to stop at a distance under 0.5 m to the goal).

8.3.3 Benchmarks
The algorithm was timed during the test. Running times per iteration were measured, meaning the time consumed by the algorithm to obtain an optimal velocity from the objective window O(v, w). As shown in Fig. 28 and in Table 4, the algorithm is quite fast, running at an average of 0.0019 s per iteration.
Figure 26: Collision avoidance in a real scenario (annotated with instantaneous velocities along the corridor, past the obstacles, to the goal)

Figure 27: Evolution of odometry and velocities vs. time
Figure 28: The dynamic window algorithm running times per iteration

Table 4: Dynamic window running time by iteration
Statistic            Running time (s)
Maximum              0.00260
Minimum              0.00120
Average              0.00190
Standard deviation   0.00034

8.3.4 Detected problems and possible solutions
The dynamic window was extensively tested in a wide range of possible conditions. While it proved to be an effective method of performing collision avoidance, it still presents some issues that we tried to solve and improve whenever possible. One of the detected problems was the tendency of the robot to cut corners. This problem arose due to the fact that the method reduces the robot to a point in the center of the dynamic window, calculated in the admissible velocity space. This situation was solved by decreasing the radius to the robot width and by adding positive reinforcement to the angular velocities that allow a collision-free trajectory. Another situation, inherent to local obstacle avoidance methods, was also detected: in certain conditions the robot got stuck in local minima, such as the U-shaped trap of Fig. 29. This happens when the robot heads towards a goal in a U-shaped scenario and thus gets "trapped". While this problem is somewhat inevitable in a local navigation algorithm where the map is unknown, especially in long corridor-like structures, it can be mitigated
by applying look-ahead algorithms to the local reactive navigation, such as the A*, introduced in [36].

Figure 29: A local minimum situation (the robot starts at "start" and is trapped by the walls before reaching "goal")

Other approaches have been developed in order to extend the local navigation schema to a global one, such as [24], where a potential function is defined by specifying a function with no or few local minima. These approaches offer excellent results, but with the limitation of needing a priori map knowledge.
Chapter 9
Conclusion and future work

In this work algorithms were developed and implemented to make robot navigation more intelligent. We propose algorithms for feature detection that are fast and accurate. The Internal Angle Variance is a novel technique to extract features from laser data; it is accurate in circle detection and very helpful in detecting other features. Person tracking is based on heuristics and proved to be robust and fast, fulfilling the requisites.

The dynamic window algorithm was implemented in order to perform collision avoidance in unknown environments. As in the original dynamic window approach, the algorithm took into consideration the dynamic and kinematic constraints of the robot to build the velocity search space, where an optimal velocity was chosen by maximizing an objective function based on heading, clearance and velocity. This allowed the robot to perform collision avoidance maneuvers while maintaining its course towards the goal, whenever possible. The algorithm was extensively tested in the Stage simulator and on a Pioneer 2 DX robot with good results. The performed tests proved that the collision avoidance system allows the robot to safely navigate in a cluttered and unknown
environment at relatively high speeds.

The developed Player drivers provide an abstraction layer for other layers like scene interpretation, navigation, cooperative robot geometric map building, etc. Distribution with the Player robot server ensures that these algorithms will be further tested and improved.

9.1 Future work

Following this work it would be interesting to develop a reliable odometry module based on features, and on a combination of features and particle filters; it would be interesting to compare the advantages of each in terms of speed and precision of results. The scene interpretation module could be improved using Markov chains to associate the laser data with possible known objects, correlating spatial displacement and also the dynamics of the object. This would also involve the creation of a taxonomic scheme for defining known structures. Localization by landmarks with scene fingerprints will be a must if we want the robot to "wake" and automatically recognize where it is on the map based on unique landmarks. Sensor fusion, combining the laser with camera(s), will greatly improve the possibilities of safe navigation and of object recognition and tracking.

Possible improvements in the navigation and collision avoidance procedures include joining the odometry information with the detected landmarks, thus performing navigation by features. This would somewhat mitigate the odometry errors and improve the general accuracy of the navigation. By further placing some sensory system on the back of the robot, negative translational velocities could be considered, and therefore it would be possible to develop behaviors that would allow the robot to "escape" the local
minima traps by going backwards. The collision avoidance system would benefit from the integration of information from the tracking of moving objects, as it would improve the avoidance of dynamic obstacles, such as persons and moving robots. The extension of PlayerGL under a modular philosophy would create many possibilities to interact with the scene and to provide feedback on the robot's status. The ability to deploy 3D objects (robots, furniture, etc.) to the taxonomy database and to PlayerGL, and 3D object import from Blender 3D [37], would be practical.
Appendix A
Format of Player data

A.1 Feature encapsulation in Player
Player defines data structures and procedures for communicating between clients and a server. Of the available data structures we chose the fiducial interface for communicating with our client. This interface was originally conceived to provide access to devices that detect coded fiducials (markers) in the environment. It communicates to the client a list of detected fiducials, each with one id, one type and 6 arbitrary data fields. We altered the fiducial proxy on the client side so that no default unit conversions were applied to the data fields, allowing the fields to act like a list of 6 integers. The id field is used to define a unique key for the detected feature; the type field, as the name suggests, is an integer representing the detected geometrical primitive; the other 6 fields further characterize the feature. See Table 5 for detailed information.

Table 5: Format of fiducial data fields
                        Feature Type
Field #   Line       Circle    Leg
1         laser index of first point
2         laser index of last point
3         slope      centerx   middlex
4         bias       centery   middley
5         # points   radius    —
6         error      std       —

When the driver class is instantiated it reads parameters from a file. If some parameters are not defined in the file, they are filled with default values specified in the program source code. It is also possible to change/read the configuration on the fly by sending a configuration request. Four configuration requests are possible: parameters change, parameters query, position of the laser, and field of view changes.

Table 6: Available configuration parameters
Name             Default   Meaning
laser            0         Index of the laser device to be used
seg_threshold    80        Segmentation threshold (mm)
circles          1         Algorithm for circles (0 disables)
lines            1         Algorithm for lines (0 disables)
legs             1         Algorithm for legs (0 disables)
circle_std_max   0.15      Maximum standard deviation allowed
circle_max_len   99999     Analyze segment for circles if length (mm) is inferior
circle_min_len   150       Analyze segment for circles if length (mm) is superior
circle_min_pts   4         Minimum number of points for circle analysis
line_min_len     150       Analyze segment for lines if length (mm) is superior
line_min_pts     5         Minimum number of points for line analysis
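To illustrate the encapsulation, the sketch below packs a detected circle into the six integer data fields following the layout of Table 5. The structure and function names are ours for illustration; only the field layout comes from the table.

// Packing a detected circle into a Player fiducial item, following the
// field layout of Table 5 (fields 1-2: laser indices, 3-4: center,
// 5: radius, 6: standard deviation). Names are illustrative only.
struct FiducialItem {
    int id;        // unique key of the detected feature
    int type;      // geometric primitive code
    int fields[6]; // feature parameters, no unit conversion applied
};

enum { TYPE_LINE = 0, TYPE_CIRCLE = 1, TYPE_LEG = 2 }; // assumed codes

FiducialItem encode_circle(int id, int first_idx, int last_idx,
                           int cx_mm, int cy_mm, int radius_mm,
                           int std_mm) {
    FiducialItem f;
    f.id = id;
    f.type = TYPE_CIRCLE;
    f.fields[0] = first_idx;  // laser index of first point
    f.fields[1] = last_idx;   // laser index of last point
    f.fields[2] = cx_mm;      // center x
    f.fields[3] = cy_mm;      // center y
    f.fields[4] = radius_mm;  // radius
    f.fields[5] = std_mm;     // standard deviation of the fit
    return f;
}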
A.2 Person tracking encapsulation in Player
Table 7: Format of fiducial data fields of the tracker
                            Feature Type
Field #   Person Tracker   Footsteps              Person Legs   Leg candidate
1         pastx            leftx                  middlex       —
2         presentx         rightx                 middley       —
3         futurex          index of last point    presentx      —
4         pasty            lefty                  —             —
5         presenty         righty                 —             —
6         futurey          index of first point   presenty      —

When the driver class is instantiated it reads parameters from a file. If some parameters are not defined in the file, they are filled with default values specified in the program source code. It is also possible to change/read the configuration on the fly by sending a configuration request. Four configuration requests are possible: parameters change, parameters query, position of the laser, and field of view changes.

Table 8: Tracker configuration parameters
Name                 Default   Meaning
inc_fid_index        0         Index of the fiducial device to be read
leg_clean_ticks      1         Number of consecutive occluded instants needed for leg tracker removal
person_clean_ticks   10        Number of consecutive occluded instants needed for person tracker removal
leg_update_radius    600       Leg search radius (mm)
person_radius        600       Person search radius (mm)
Appendix B
Robotic platform

This appendix describes our working platform, composed of hardware and software.

B.1 SICK LMS200 Laser Measurement System
The SICK LMS 200 laser scanner, described in [38], is a non-contact measurement system that scans its surroundings two-dimensionally. It does not require reflectors or position marks. The LMS operates by measuring the time of flight of laser light pulses: a pulsed laser beam is emitted and reflected when it meets an object. The reflection is registered by the scanner's receiver. The time between transmission and reception of the impulse is directly proportional to the distance between the scanner and the object (time of flight). The pulsed laser beam is deflected by an internal rotating mirror so that a fan-shaped scan is made of the surrounding area (laser radar). The contour of the target object is determined from the sequence of impulses received. The measurement data is available in real time for further evaluation via a serial interface. In a radial field of vision, a light impulse (spot) is emitted every 0.25°, 0.5° or 1°, depending on the current configuration. These values correspond to the LMS's angular
resolutions¹. The range of the scanner depends on the reflectivity of the target object and the transmission strength of the scanner. Reflectivity values for some well-known materials are given in Table 9.

Table 9: Reflectivity of some materials
Material                      Reflectivity
Cardboard, matt black         10%
Grey cardboard                20%
Wood (raw pine, dirty)        40%
PVC, grey                     50%
Paper, matt white             80%
Aluminium, anodised, black    110..150%
Steel, rust-free shiny        120..150%
Steel, very shiny             140..200%
Reflectors                    > 2000%

Laser scanner measurement data is used for object measurement and for determining position. These measurement data correspond to the surrounding contour scanned by the device and are given out in binary format via the RS 422 interface. Fundamentally, the distance value per individual impulse (spot) is evaluated. This means that a distance value is provided every 0.25°, 0.5° or 1°, depending on the current angular resolution of the scanner. As the individual values are given out in sequence (beginning with value 1), particular angular positions can be allocated on the basis of the values' positions in the data string. The LMS readings start on the right side, turning towards the left, as can be seen in Fig. 30. In its default state the scanner is set to the "measured values on request" mode and a 9,600 baud transfer rate (changes can be made using the appropriate telegram commands). The data interface of the scanner is used for setting parameters with a PC and for data exchange for external software evaluation.

¹ The resolution of a measuring device is the smallest possible distance, different from zero, between two consecutive individual measurement values.
Table 10: Resolutions, max. scanning angles and max. number of measured values
Angular resolution     0.25°   0.25°   0.5°        1°
Max. scanning angle    100°    180°    100°/180°   100°/180°
Max. measured values   401     721     361         181

Figure 30: Readings from the LMS using 100° and 180° scanning angles

The interface type may be selected as RS 232 or RS 422 by bridging in the connection plug. The electronics of the scanner is fed directly from a stabilised 24 V DC mains adapter. The LMS 200 scanner heater is regulated internally by a thermostat; an unregulated 24 V DC mains adapter is sufficient to power the heater (the current uptake of the active heater is approximately 5 A). The most relevant characteristics of the SICK LMS200 are displayed in Table 11.
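Since readings arrive as sequential distance values starting at the right side of the scan, converting a reading index to a Cartesian point is a small exercise. The sketch below assumes a scan starting at 0° on the robot's right and sweeping left, which matches Fig. 30; the function name is ours.

#include <cmath>

struct Point2D { double x, y; };

// Convert the i-th reading of a scan to robot-centered Cartesian
// coordinates. Readings start on the right side (0 deg) and sweep
// left, one value every res_deg degrees (0.25, 0.5 or 1).
Point2D reading_to_point(int i, double range_mm, double res_deg) {
    const double kPi = 3.14159265358979;
    double angle_rad = (i * res_deg) * kPi / 180.0;  // 0 = right side
    return { range_mm * std::cos(angle_rad),
             range_mm * std::sin(angle_rad) };
}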
Pioneer 2 DX Robot
The Pioneer 2 robot contains a diferencial drive system that use high-speed, high-torque, reversible-DC motors, each equipped with a high-resolution optical quadrature shaft encoder for precise position and speed sensing and advanced dead-reckoning.
Table 11: Technical characteristics of the LMS200
General characteristics
Maximum range                   80 m
Angular resolution              0.25°, 0.5°, 1° (selectable)
Response time                   53 ms, 26 ms, 13 ms
Measurement resolution          10 mm
System error                    typical ±15 mm at a range of 1..8 m; typical ±4 cm at a range of 1..20 m
Statistical error (std. dev.)   typical ±5 mm at a range ≤ 8 m; typical ±10 mm at a range of 1..20 m

Electrical
Data interface                  RS232 / RS422 (configurable)
Transfer rate                   9.6 / 19.2 / 38.4 / 500 kbaud
Supply voltage                  24 V DC, current requirement max. 6 A (cyclic)
Power consumption               approximately 20 W
Laser protection class          1 (eye safe)
Operating ambient temperature   -30°C..+50°C

B.2.1 Electronics
The Pioneer 2 DX, described in [39], is a mobile robot that contains all the basic components for sensing and navigation in a real-world environment, including battery power, drive motors and wheels, position/speed encoders, and integrated sensors and accessories that are all managed via an onboard microcontroller and mobile-robot server software.
Pioneer 2’s standard electronics reside on two main boards: a microcontroller and a motor-power distribution board. Each sonar array also has a controller board mounted in its base. A Main Power switch at the back of the robot controls power for the entire system. Processor control switches and indicators fit through the Console. Inside the robot, mounted to the battery box, is the Motor-Power board. It
supplies both the 12 and 5 volt direct-current (VDC) power requirements of the robot's systems. The standard Motor-Power board has a 12-pin User-Power connector that supports four sets of 5 and 12 VDC power ports (total 1.5 ampere) for custom accessories. The Pioneer 2 microcontroller has a 20 MHz Siemens 88C166 microprocessor with integrated 32K FLASH-ROM. It also has 32K of dynamic RAM, two RS232-compatible serial ports, several digital, analog-to-digital and PSU I/O user-accessible ports, and an eight-bit expansion bus. The robot's microcontroller has two serial ports and three connectors. One connector, labeled SERIAL, is a standard 9-pin D-SUB receptacle located on the Console and is used for direct RS232-compatible serial data communication between the microcontroller and a client computer. This Host serial port shares its three-line transmit, receive, and ground connections with one of the two serial connectors inside the robot.

B.2.2 Pioneer 2 DX operating system
ActivMedia robots use a client-server mobile robot-control architecture. In this model, the robot's controller runs the Pioneer 2 Operating System (P2OS), which manages all the low-level details of the mobile robot's systems. These include operating the motors, firing the sonars, collecting sonar and wheel encoder data, and so on, all on command from, and reporting to, a separate client application such as Player/Stage, ARIA or Saphira. With this client/server architecture, robotics application developers do not need to know many details about a particular robot server, because the client insulates them from this lowest level of control.
B.2.3 PID Controls

The Pioneer drive servers use a common Proportional-Integral-Derivative (PID) control system to adjust the PWM pulse width at the drivers and, consequently, the power delivered to the motors. The motor duty cycle is 200 µs; the pulse width is proportional, 0-500. The P2OS drive servers recalculate and adjust the robot's trajectory and speed every 5 ms, based on feedback from the wheel encoders. The default PID values for both translation and rotation are stored in flash.
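As a reminder of what such a loop computes, here is a generic discrete PID step of the kind P2OS applies every 5 ms; the gains and names are illustrative, not the stored P2OS defaults.

// One step of a generic discrete PID velocity loop, evaluated at a
// fixed period (P2OS uses 5 ms). Gains kp, ki, kd are illustrative.
struct PID {
    double kp, ki, kd;
    double integral = 0.0, prev_error = 0.0;

    // setpoint and measurement in encoder counts per period; the
    // returned command would be clamped to the 0-500 pulse-width range.
    double step(double setpoint, double measured, double dt) {
        double error = setpoint - measured;
        integral += error * dt;
        double derivative = (error - prev_error) / dt;
        prev_error = error;
        return kp * error + ki * integral + kd * derivative;
    }
};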
Figure 31: Internal coordinate system for the Pioneer 2 DX robot
B.2.4 Position Integration

Pioneer keeps track of its position and orientation based on dead-reckoning from wheel motion, expressed in an internal coordinate position according to the coordinate system shown in Fig. 31. Registration between external and internal coordinates deteriorates rapidly with movement, due to gearbox play, wheel imbalance, slippage, and many other real-world factors. In practice we can rely on the dead-reckoning ability of the robot only for a short range, on the order of several meters and one or two revolutions, depending on the surface; soft surfaces tend to be worse than hard floors.
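A minimal sketch of this kind of dead-reckoning update is shown below, assuming a differential drive with known encoder resolution and axle length; the names and the midpoint-update form are generic, not taken from P2OS.

#include <cmath>

// Dead-reckoning pose update for a differential-drive robot from
// incremental encoder ticks. Parameters are generic placeholders.
struct Pose { double x, y, theta; }; // mm, mm, rad

void integrate_odometry(Pose& p, long dticks_left, long dticks_right,
                        double mm_per_tick, double axle_mm) {
    double dl = dticks_left  * mm_per_tick;  // left wheel travel
    double dr = dticks_right * mm_per_tick;  // right wheel travel
    double dc = (dl + dr) / 2.0;             // center displacement
    double dtheta = (dr - dl) / axle_mm;     // heading change
    // midpoint approximation of the arc traveled in this interval
    p.x += dc * std::cos(p.theta + dtheta / 2.0);
    p.y += dc * std::sin(p.theta + dtheta / 2.0);
    p.theta += dtheta;
}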
The most relevant characteristics of the Pioneer 2 DX are displayed in Table 12.

Table 12: Technical characteristics of the Pioneer 2 DX
Physical characteristics
Length                 44 cm          Width               33 cm
Height                 22 cm          Clearance           5.1 cm
Weight                 9 kg           Payload             20 kg

Power
Batteries              3 x 12 V DC    Charge              252 watt-hrs
Run time               3-4 hours      Recharge time       6 hours/battery

Mobility
Wheels (diameter)      165 mm         Wheels (width)      37 mm
Steering               Differential   Gear ratio          19.7:1
Swing                  32 cm          Turn                0 cm
Max. translate speed   1.6 m/s        Max. rotate speed   300°/s
Appendix C
Orthogonal line fitting method

Let (x, y) be the Cartesian coordinates of n points:

P_i = (x_i, y_i), \quad i \in [1, n] \qquad (27)

First we compute the average of the x values and the average of the y values, calling them \bar{X} and \bar{Y} respectively. The point (\bar{X}, \bar{Y}) is the centroid of the set of points:

\bar{X} = \frac{1}{n} \sum_{i=1}^{n} P_{x,i} \qquad (28)

\bar{Y} = \frac{1}{n} \sum_{i=1}^{n} P_{y,i} \qquad (29)

Subtracting \bar{X} from each of the x values and \bar{Y} from each of the y values, we obtain a list of n data points whose centroid is (0, 0):

O_i = (x_i - \bar{X}, \; y_i - \bar{Y}), \quad i \in [1, n] \qquad (30)

We must then solve the quadratic equation for the slope, m^2 + Am - 1 = 0, where

A = \frac{\sum_{i=1}^{n} \left( O_{i,x}^2 - O_{i,y}^2 \right)}{\sum_{i=1}^{n} O_{i,x} \cdot O_{i,y}} \qquad (31)

which gives two slopes:

m_1 = \frac{-A + \sqrt{A^2 + 4}}{2} \qquad (32)

m_2 = \frac{-A - \sqrt{A^2 + 4}}{2} \qquad (33)

Each slope has a corresponding bias:

b_1 = \bar{Y} - m_1 \cdot \bar{X} \qquad (34)

b_2 = \bar{Y} - m_2 \cdot \bar{X} \qquad (35)

The maximum fitting error of each candidate line is:

r_1 = \max_{i \in [1,n]} \frac{\left| P_{i,x} \cdot m_1 - P_{i,y} + b_1 \right|}{\sqrt{m_1^2 + 1}} \qquad (36)

r_2 = \max_{i \in [1,n]} \frac{\left| P_{i,x} \cdot m_2 - P_{i,y} + b_2 \right|}{\sqrt{m_2^2 + 1}} \qquad (37)

The r with the smallest value indicates the line that best fits our point set.
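A direct transcription of Eqs. 27-37 into code might look like the sketch below. It is our illustration of the method, with no claim of matching the project's driver source, and it assumes at least two non-collinear points with a nonzero cross-moment.

#include <algorithm>
#include <cmath>
#include <vector>

struct Pt { double x, y; };
struct LineFit { double m, b, r; }; // slope, bias, max orthogonal error

// Orthogonal (perpendicular) line fit following Eqs. 27-37: center the
// points, solve m^2 + A*m - 1 = 0, and keep the root whose line has
// the smallest maximum point-to-line distance.
LineFit fit_line(const std::vector<Pt>& pts) {
    const double n = static_cast<double>(pts.size());
    double xbar = 0.0, ybar = 0.0;
    for (const Pt& p : pts) { xbar += p.x; ybar += p.y; }
    xbar /= n; ybar /= n;                        // centroid (Eqs. 28-29)

    double sxx = 0.0, syy = 0.0, sxy = 0.0;      // centered moments
    for (const Pt& p : pts) {
        double ox = p.x - xbar, oy = p.y - ybar; // Eq. 30
        sxx += ox * ox; syy += oy * oy; sxy += ox * oy;
    }
    double A = (sxx - syy) / sxy;                // Eq. 31 (sxy != 0 assumed)

    LineFit best{0.0, 0.0, 1e300};
    for (double m : { (-A + std::sqrt(A * A + 4)) / 2.0,     // Eq. 32
                      (-A - std::sqrt(A * A + 4)) / 2.0 }) { // Eq. 33
        double b = ybar - m * xbar;                          // Eqs. 34-35
        double r = 0.0;
        for (const Pt& p : pts)                              // Eqs. 36-37
            r = std::max(r, std::fabs(p.x * m - p.y + b)
                            / std::sqrt(m * m + 1));
        if (r < best.r) best = {m, b, r};
    }
    return best;
}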
Appendix D
Motion equations for a differential-drive robot

This appendix describes the fundamental motion equations for a differential-drive mobile robot [22], under the assumption that the rotational and translational velocities of the robot can be controlled independently and that the robot's trajectories can be described by a finite sequence of circle segments. These assumptions are very convenient for checking possible collisions, since the intersection of circles with obstacles is quite simple to calculate.

D.1 General motion equations

Let x(t) and y(t) denote the robot's coordinates at a time instant t, θ(t) the robot's orientation (heading direction), and the triplet <x, y, θ> the kinematic configuration of the robot. Let x(t_0), x(t_n), y(t_0), y(t_n) denote the x and y coordinates of the robot at times t_0 and t_n respectively, and let v(t) denote the translational velocity of the robot at time t and w(t) its rotational velocity. We can then express x(t_n) and y(t_n) as functions of x(t_0), y(t_0) and θ(t):
x(t_n) = x(t_0) + \int_{t_0}^{t_n} v(t) \cdot \cos\theta(t) \, dt \qquad (38)

y(t_n) = y(t_0) + \int_{t_0}^{t_n} v(t) \cdot \sin\theta(t) \, dt \qquad (39)
As stated in the latter equations, the velocity v(t) depends on the initial translational velocity v(t_0) at t_0 and on the translational acceleration \dot{v}(\hat{t}) in the time interval \hat{t} \in [t_0, t]. In the same way, the orientation θ(t) is a function of the initial orientation θ(t_0), the rotational velocity w(t_0) at the instant t_0, and the rotational acceleration \dot{w}(\hat{t}) in the time interval \hat{t} \in [t_0, t]. By substituting v(t) and θ(t) by the initial kinematic and dynamic values v(t_0), θ(t_0) and w(t_0) and by the accelerations \dot{v}(\hat{t}) and \dot{w}(\hat{t}), we reach the expression¹:
x(t_n) = x(t_0) + \int_{t_0}^{t_n} \left( v(t_0) + \int_{t_0}^{t} \dot{v}(\hat{t}) \, d\hat{t} \right) \cdot \cos\!\left( \theta(t_0) + \int_{t_0}^{t} \left( w(t_0) + \int_{t_0}^{\hat{t}} \dot{w}(\tilde{t}) \, d\tilde{t} \right) d\hat{t} \right) dt \qquad (40)

The equations are now in a form where the trajectory of the robot depends exclusively on the initial dynamic configuration at the time instant t_0 and on the accelerations, which we assume to be controllable.

D.2 Approximate motion equations
Assuming that the trajectory of a robot can be approximated by piecewise circular arcs, and considering a sufficiently small time interval [t_i, t_{i+1}], we reach the following motion equation:
x(t_n) = x(t_0) + \sum_{i=0}^{n-1} \int_{t_i}^{t_{i+1}} v_i \cdot \cos\!\left( \theta(t_i) + w_i \cdot (\hat{t} - t_i) \right) d\hat{t} \qquad (41)

¹ The derivation of y(t_n) is analogous, thus we only describe the derivation for the x coordinate.
which, by solving the integral, can be simplified to the x and y coordinate equations:

x(t_n) = x(t_0) + \sum_{i=0}^{n-1} F_x^i(t_{i+1}) \qquad (42)

y(t_n) = y(t_0) + \sum_{i=0}^{n-1} F_y^i(t_{i+1}) \qquad (43)
where

F_x^i(t) = \begin{cases} -\dfrac{v_i}{w_i} \left( \sin\theta(t_i) - \sin(\theta(t_i) + w_i \cdot (t - t_i)) \right) & \text{if } w_i \neq 0 \\ v_i \cdot \cos\theta(t_i) \cdot (t - t_i) & \text{if } w_i = 0 \end{cases} \qquad (44)

F_y^i(t) = \begin{cases} \dfrac{v_i}{w_i} \left( \cos\theta(t_i) - \cos(\theta(t_i) + w_i \cdot (t - t_i)) \right) & \text{if } w_i \neq 0 \\ v_i \cdot \sin\theta(t_i) \cdot (t - t_i) & \text{if } w_i = 0 \end{cases} \qquad (45)
Note that if w_i = 0 the robot will follow a straight line, and if w_i \neq 0 the robot's trajectory will describe a circle, as can be seen by considering the following equations:

M_x^i = -\frac{v_i \cdot \sin\theta(t_i)}{w_i} \qquad (46)

M_y^i = \frac{v_i \cdot \cos\theta(t_i)}{w_i} \qquad (47)
for which the following relation holds:

\left( F_x^i - M_x^i \right)^2 + \left( F_y^i - M_y^i \right)^2 = \left( \frac{v_i}{w_i} \right)^2 \qquad (48)

Eq. 48 demonstrates that the i-th trajectory is a circle M^i about (M_x^i, M_y^i) with radius M_r^i = v_i / w_i. Hence, assuming constant velocities over small time intervals, we can approximate the trajectory of a robot by a sequence of circular arcs and straight line segments.
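As a worked illustration of Eqs. 42-45, the sketch below propagates the robot pose over one constant-velocity interval; it is our own rendering of the equations, not project code.

#include <cmath>

struct State { double x, y, theta; }; // position and heading

// Propagate the pose over one interval [t_i, t_i + dt] with constant
// translational velocity v and rotational velocity w (Eqs. 42-45).
// For w = 0 the motion is a straight segment; otherwise it is an arc
// of radius v / w centered at (M_x, M_y) as in Eqs. 46-48.
State propagate(State s, double v, double w, double dt) {
    if (std::fabs(w) < 1e-9) {          // straight-line case
        s.x += v * std::cos(s.theta) * dt;
        s.y += v * std::sin(s.theta) * dt;
    } else {                            // circular-arc case
        double th1 = s.theta + w * dt;
        s.x += -(v / w) * (std::sin(s.theta) - std::sin(th1));
        s.y +=  (v / w) * (std::cos(s.theta) - std::cos(th1));
        s.theta = th1;
    }
    return s;
}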
List of References

[1] M. Brain, "Robotic freedom." http://marshallbrain.com/robotic-freedom.htm.

[2] D. Castro, U. Nunes, and A. Ruano, "Feature extraction for moving objects tracking system in indoor environments," in Proc. 5th IFAC/EURON Symposium on Intelligent Autonomous Vehicles, (Lisboa), July 5-7, 2004.

[3] A. Mendes, L. C. Bento, and U. Nunes, "Multi-target detection and tracking with a laser scanner," in Proc. IEEE Intelligent Vehicles Symposium, (Parma), June 2004.

[4] T. Einsele, Localization in indoor environments using a panoramic laser range finder. PhD thesis, Technical University of München, September 2001.

[5] B. P. Gerkey, R. T. Vaughan, and A. Howard, "The Player/Stage project." http://playerstage.sf.net.

[6] D. Schulz, W. Burgard, D. Fox, and A. Cremers, "Tracking multiple moving targets with a mobile robot using particle filters and statistical data association," in Proc. IEEE Int. Conf. on Robotics and Automation, (Seoul), pp. 1665-1670, 2001.

[7] B. Kluge, C. Köhler, and E. Prassler, "Fast and robust tracking of multiple moving objects with a laser range finder," in Proc. IEEE Int. Conf. on Robotics and Automation, (Seoul), pp. 1683-1688, 2001.

[8] A. Fod, A. Howard, and M. J. Mataric, "A laser-based people tracker," in Proc. IEEE Int. Conf. on Robotics and Automation, (Washington D.C.), pp. 3024-3029, May 2002.
[9] C.-C. Wang and C. Thorpe, "Simultaneous localization and mapping with detection and tracking of moving objects," in Proc. IEEE Int. Conf. on Robotics and Automation, (Washington D.C.), pp. 2918-2924, 2002.

[10] Z. Song, Y. Q. Chen, L. Ma, and Y. C. Chung, "Some sensing and perception techniques for an omni-directional ground vehicle with a laser scanner," in IEEE International Symposium on Intelligent Control (IEEE ISIC), (Vancouver, British Columbia), pp. 690-695, October 27-30, 2002.

[11] S. Zhang, L. Xie, M. Adams, and T. Fan, "Geometrical feature extraction using 2D range scanners," in International Conference on Control & Automation, (Montreal, Canada), 2003.

[12] K. Arras and R. Siegwart, "Feature extraction and scene interpretation for map-based navigation and map building," in Proc. of SPIE, Mobile Robotics XII, vol. 3210, 1997.

[13] K. Ch. Fuerstenberg, D. T. Linzmeier, and K. C. J. Dietmayer, "Pedestrian recognition and tracking of vehicles using a vehicle based multilayer laserscanner," in Proceedings of ITS 2003, 10th World Congress on Intelligent Transport Systems, (Madrid, Spain), 2003.

[14] D. Schulz, W. Burgard, D. Fox, and A. Cremers, "Tracking multiple moving objects with a mobile robot using particle filters and statistical data association," 2001.

[15] M. Lindstrom and J. Eklundh, "Detecting and tracking moving objects from a mobile platform using a laser range scanner," in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2001.

[16] M. Montemerlo, S. Thrun, and W. Whittaker, "Conditional particle filters for simultaneous mobile robot localization and people-tracking," in IEEE International Conference on Robotics and Automation (ICRA), 2002.

[17] S. M. LaValle, H. H. Gonzalez-Banos, and J.-C. Latombe, "Motion strategies for maintaining visibility of a moving target," in IEEE International Conference on Robotics and Automation (ICRA), 1997.

[18] H. H. Gonzalez-Banos, C.-Y. Lee, and J.-C. Latombe, "Real-time combinatorial tracking of a target moving unpredictably among obstacles," in IEEE International Conference on Robotics and Automation (ICRA), 2002.
[19] S. Tadokoro et al., "On motion planning of mobile robots which coexist and cooperate with humans," in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 1995.

[20] J. Borenstein and Y. Koren, "The vector field histogram - fast obstacle avoidance for mobile robots," IEEE Transactions on Robotics and Automation, vol. 7, no. 3, pp. 278-288, June 1991.

[21] I. Ulrich and J. Borenstein, "VFH+: Reliable obstacle avoidance for fast mobile robots," in Proceedings of the 1998 IEEE Int. Conference on Robotics and Automation, (Leuven, Belgium), pp. 1572-1577, May 16-21, 1998.

[22] D. Fox, W. Burgard, and S. Thrun, "The dynamic window approach to collision avoidance," IEEE Robotics and Automation Magazine, 1997.

[23] J. Borenstein and Y. Koren, "A hybrid collision avoidance method for mobile robots," in Proceedings of the IEEE Int. Conf. on Robotics and Automation (ICRA), 1998.

[24] O. Brock and O. Khatib, "High speed navigation using the global dynamic window approach," in Proceedings of the IEEE Int. Conf. on Robotics and Automation (ICRA), pp. 341-346, 1999.

[25] J. Minguez and L. Montano, "Nearness diagram navigation: A new real time collision avoidance approach for holonomic and non-holonomic mobile robots," Internal Report RR-00-14, University of Zaragoza, p. 60, February 2000.

[26] J. Minguez, L. Montano, and J. Santos-Victor, "Reactive navigation for non-holonomic robots using the ego kinematic space," in ICRA 2002, 2002.

[27] P. Fiorini and Z. Shiller, "Motion planning in dynamic environments using velocity obstacles," International Journal of Robotics Research, pp. 760-772, July 1998.

[28] E. Prassler, D. Bank, and B. Kluge, "Key technologies in robot assistants: Motion coordination between a human and a mobile robot."

[29] Z. Shiller, F. Large, and S. Sekhavat, "Motion planning in dynamic environments: Obstacles moving along arbitrary trajectories," in IEEE Int. Conference on Robotics and Automation (ICRA 2001), (Seoul), pp. 3716-3721, 2001.
[30] Z. Shiller, F. Large, and S. Sekhavat, "Using non-linear velocity obstacles to plan motions in a dynamic environment," in Proc. of the International Conference on Automation, Robotics and Computer Vision, (Singapore), 2002.

[31] C. Ye and J. Borenstein, "Characterization of a 2-D laser scanner for mobile robot obstacle negotiation," in Proceedings of the 2002 IEEE International Conference on Robotics and Automation, (Washington D.C., USA), pp. 2512-2518, May 2002.

[32] MathPages, "Perpendicular regression of a line." http://mathpages.com/home/kmath110.htm.

[33] D. Eppstein, "Closest pairs algorithms." http://www.ics.uci.edu/~eppstein/projects/pairs/.

[34] R. Simmons, "The curvature-velocity method for local obstacle avoidance," in International Conference on Robotics and Automation, April 1996.

[35] "OpenGL." http://www.opengl.org.

[36] I. Ulrich and J. Borenstein, "VFH*: Local obstacle avoidance with look-ahead verification," 2000.

[37] "Blender." http://www.blender3d.com/.

[38] SICK AG, Industrial Safety Systems, "SICK: Proximity laser scanner." Technical Description, May 2002.

[39] ActivMedia Robotics, LLC, "Pioneer 2 / PeopleBot," 2002.