Proceedings of the 2002 IEEE/RSJ Intl. Conference on Intelligent Robots and Systems, EPFL, Lausanne, Switzerland, October 2002

Sensor Management for Local Obstacle Detection in Mobile Robots



Juan C. Alvarez, Rafael C. González, Diego A. Prieto, Andrei Shkel, V. Lumelsky

Dept. of Electrical & Computer Eng., University of Oviedo, Gijón, Spain, [email protected]

Dept. of Electrical & Computer Eng., University of California, Irvine, USA, [email protected]
Dept. of Mechanical Engineering, University of Wisconsin, Madison, USA, [email protected]

Abstract

In experimental robotics it is common to complement the motion planning algorithm with a local obstacle avoidance module. We are concerned here with the specification of a range sensor specifically designed for this task. Such a sensor has special requirements: it has to guarantee detection and response, and it has to guarantee throughput, providing information at a rate proportional to the robot velocity. An analysis is presented that allows the selection of the sensor requirements for collision avoidance tasks in mobile robots. The design takes robot dynamics into account, is compatible with fast motion (only the indispensable zones of the environment are explored), and avoids unnecessary velocity reductions. Experiments show the possibilities and limitations of the approach.

1 Introduction

This paper is concerned with the design of a range sensor for collision-avoidance tasks in mobile robots. Because robot motion planning is a difficult problem, some simplifying assumptions are usually made, such as a stationary environment, no actuator dynamics, flat terrain, or perfect sensor information. Since real robots easily violate these assumptions, it is common to complement the motion planning algorithm with a "local obstacle avoidance" module. This is a sensor-based task whose aim is to let the robot reach intermediate goals, as provided by the global planner, while avoiding obstacles that are "unexpected" from the planner's point of view. The nature of the task imposes two key requirements on the sensor: it has to guarantee throughput, providing information at a rate fast enough to cope with the robot's velocity and dynamics, and it has to guarantee detection, providing enough information to let the robot react adequately to any potential danger. Note the difficulty of guaranteeing detection in Figure 1-left: a sensor oriented toward the direction of motion does not sweep some of the possible motion trajectories, a tunnel-vision problem [1].

Throughput and detection guarantee are opposing demands. If we try to ensure the latter by enlarging the scanned area in the robot's surroundings, we increase the time taken to process the sensor data, proportionally decreasing the reactivity to the external world. This trade-off can be addressed by a "moving while processing" strategy, but that adversely affects the performance of the motion planner [2].

Our approach is based on finding the minimum information needed to guarantee the safety of motion. The problem formulation and its limits are discussed in Section 2. Section 3 is devoted to the analysis of the sensor requirements; robot dynamics and the motion control policy are included in the design. The solution includes a sensor management strategy that pans the sensor in order to eliminate the tunnel effect (Section 4). The resulting specifications can be implemented with various sensor technologies, such as ultrasonic sensors or CCD cameras. Experimental results (Section 5) show the feasibility of the proposed strategy, allowing a mobile robot to move at high average speeds.
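To make the throughput/detection trade-off concrete, the following is a minimal sketch, not the model used in the paper: it assumes a scan-and-process time that grows linearly with the scanned aperture, and all function names and numeric values are illustrative assumptions. It shows how enlarging the field of view lengthens the sensing cycle and eats into the distance available for braking at a given speed.

```python
# Illustrative sketch of the throughput vs. detection trade-off (assumed model).
import math

def cycle_time(fov_rad, t_per_beam=0.002, beam_step=math.radians(0.5), t_overhead=0.01):
    """Assumed sensing-plus-processing time for one scan of aperture fov_rad (s):
    one beam every beam_step radians, plus a fixed overhead."""
    n_beams = max(1, int(fov_rad / beam_step))
    return t_overhead + n_beams * t_per_beam

def reaction_margin(d_f, v, fov_rad, a_max=1.0):
    """Distance left after one blind sensing cycle and full braking from speed v (m).

    d_f   : sensed depth of field (m)
    v     : robot speed (m/s)
    a_max : maximum deceleration (m/s^2)
    A negative margin means an obstacle first seen at range d_f could not be avoided.
    """
    blind = v * cycle_time(fov_rad)   # distance travelled while scanning/processing
    brake = v * v / (2.0 * a_max)     # braking distance
    return d_f - blind - brake

if __name__ == "__main__":
    # Widening the scanned aperture shrinks the safety margin at constant speed.
    for fov_deg in (30, 90, 180):
        m = reaction_margin(d_f=3.0, v=1.0, fov_rad=math.radians(fov_deg))
        print(f"FOV {fov_deg:3d} deg -> margin {m:+.2f} m")
```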


Figure 1: Design of a "collision-detection" sensor. Left: the tunnel vision problem. Right: the sensor Field of Regard, defined by the Field of View $\gamma$, the Depth of Field $d_f$, and the pan angle direction $\rho$.

2 Problem Definition

We address the problem of a robot moving in a planar 2D Euclidean environment with an associated Cartesian frame. The robot has a circular shape with radius $r$, and its configuration is represented by $(x, y, \theta)$: the coordinates of its center and its orientation relative to the world coordinate frame. The orientation, a mobile frame attached to the robot center, and the instantaneous velocity vector are collinear. The robot is equipped with range sensors, which can sweep an area in front of the robot called the field of regard, see Figure 1-right. It is defined by the aperture angle (field of view, $\gamma$) and its maximum range (depth of field, $d_f$). The sensor is able to pan, which is represented by a rotation angle $\rho$ relative to the mobile frame. The "frontal swept line" (FSL) is chosen to be perpendicular to and symmetrical about the mobile frame, and it is defined by the parameters $(d_f, \gamma, \rho)$. This paper is about the task-dependent selection of these three parameters. Robot motion decisions are computed in short, fixed time periods [...]

[...] the risk factor tends to one, and the risk of a lateral detection of a harmless obstacle vanishes; there is no active detection with the FSL. 2) In the intermediate case, the unnecessary detection risk is diminished, because part of the FSL starts to be employed in "useful" detection.
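As a rough illustration of the quantities just defined, here is a minimal sketch: the data-structure names, the symbol T for the decision period, the guarded-motion condition, and all numeric values are assumptions, not the specifications derived in the paper. It encodes the field of regard $(d_f, \gamma, \rho)$ and the robot configuration $(x, y, \theta)$, computes the endpoints of the frontal swept line, and checks that the robot could stop within the sensed depth of field after one decision period.

```python
from dataclasses import dataclass
import math

@dataclass
class FieldOfRegard:
    d_f: float    # depth of field (m)
    gamma: float  # field of view / aperture angle (rad)
    rho: float    # pan angle relative to the robot frame (rad)

@dataclass
class RobotState:
    x: float
    y: float
    theta: float  # orientation, collinear with the velocity vector (rad)
    v: float      # forward speed (m/s)

def fsl_endpoints(state: RobotState, s: FieldOfRegard):
    """Endpoints of the frontal swept line (FSL): the points at range d_f along
    the two extreme beam directions; the segment joining them is perpendicular
    to the pan direction theta + rho and symmetric about it."""
    pts = []
    for sign in (-1.0, 1.0):
        ang = state.theta + s.rho + sign * s.gamma / 2.0
        pts.append((state.x + s.d_f * math.cos(ang),
                    state.y + s.d_f * math.sin(ang)))
    return pts

def speed_is_safe(state: RobotState, s: FieldOfRegard, T: float, a_max: float) -> bool:
    """Assumed guarded-motion check: the distance covered during one decision
    period T plus the braking distance must fit inside the depth of field."""
    return state.v * T + state.v ** 2 / (2.0 * a_max) <= s.d_f

if __name__ == "__main__":
    sensor = FieldOfRegard(d_f=3.0, gamma=math.radians(60), rho=0.0)
    robot = RobotState(x=0.0, y=0.0, theta=0.0, v=1.2)
    print(fsl_endpoints(robot, sensor))
    print("safe:", speed_is_safe(robot, sensor, T=0.1, a_max=1.0))
```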


[Figure: field-of-regard geometry, showing areas A1, A2, and A3, radii $r_1$ and $r_2$, the depth of field $d_f$, and the half-aperture $\gamma/2$.]
