Stereo Vision Based Navigation for Sun-Synchronous Exploration

Chris Urmson, M. Bernardine Dias
Robotics Institute, Carnegie Mellon University, Pittsburgh, PA 15213, USA

Abstract - This paper describes the navigation system used on a prototype sun-synchronous robot. Sun-synchrony is a concept that will enable exploration missions by solar-powered rovers that could last months or years. This requires a robust navigation system capable of traversing natural terrain. The obstacle avoidance and navigation algorithms are presented in detail with results from field experiments in the Canadian Arctic.


Fig. 1. Hyperion at the field experiment site.

1 Introduction

Sun-synchronous exploration is an energy-cognizant strategy for planetary exploration [8][16]. This concept will enable exploration missions by solar powered rovers that could last months or years. Hyperion, a solar powered robot, was developed to demonstrate this idea on Earth. In July 2001 Hyperion was deployed to Devon Island, above the Arctic Circle, to perform experiments in sun-synchrony. The culmination of these experiments was a pair of sun-synchronous routes, which Hyperion planned and executed [17]. This paper describes the obstacle detection and navigation systems utilized on Hyperion and presents relevant results of the field experiments.

1.1 Hyperion

The name Hyperion comes from Greek mythology, and roughly translates to "He who follows the Sun". The robot is completely solar powered, drawing its energy through a large vertically mounted solar panel. Hyperion is driven by four independently controlled wheels and steers using a passively actuated joint between its front axle and the body. A combination of stereo vision and laser range finding is used to sense the terrain. To estimate its location, Hyperion fuses information from a carrier-phase DGPS, wheel odometry, and a roll/pitch sensor. The mechanical and sensing configuration of Hyperion is optimized for sun-synchronous exploration near the poles of Earth.

1.2 Navigation Problem

Hyperion's navigation problem can be divided into two parts: generating a sun-synchronous route and then executing it. The problem of sun-synchrony is addressed by a mission planner that provides high-level, low-resolution plans using available digital elevation maps [13]. This planner searches for an optimal route that ensures Hyperion will have enough energy to complete the objectives that make up its current mission. However, it is only capable of planning for features that are detectable at the resolution of the map (typically 25 m or greater), such as hills and valleys. To complement the mission planner, a finer-resolution situational planner (the navigator) is required to navigate the robot around obstacles that are too small to appear in the high-level maps (or are unknown at the time of mission specification) while achieving waypoints determined by the mission planner. The navigator is the focus of this paper.

Critical to the design and implementation of a navigator are the characteristics of the environment it is to operate in. The high arctic terrain that Hyperion is designed for is typical of many regions of the Moon and Mars. The terrain is mostly rolling, with discrete large boulders and impassable rocks interspersed at low density. The ground generally consists of loose shattered rocks or hard-packed soil. This type of environment allows for many optimizations of the navigation algorithm that would be inappropriate in more challenging terrain.

2 Related Work

A majority of work in robot navigation software has dealt with robots operating on a flat plane with discrete obstacles. In recent years, there has been increased interest in algorithms capable of driving robots over rough terrain. Much of this work also maintains the assumption that space can be easily divided into what is and what is not traversable [1][4][12]. Unfortunately, natural terrain rarely provides this simple distinction. There is often a gradation between terrain that is easily traversed and that which is completely impassable. Without modeling this variability, a robot will either be paralyzed by being overly cautious or will take undue risks. The abrupt boundary between traversable and impassable can also lead to potentially dangerous instabilities in the navigation algorithm when the terrain is viewed from slightly different perspectives.

Researchers are beginning to explore descriptions of terrain that better encode its variability. Continuous measures of terrain have been implemented as fuzzy sets [7][5]. These representations are generally combined with fuzzy logic controllers, which suffer local minima problems similar to potential field approaches. There have also been several efforts that describe traversability as a continuous value based on a weighted combination of terrain metrics [2][9]. Generally, systems that use obstacle detection incorporate it as a reactive behavior with little global planning capability [6]. By combining a path planner with a local obstacle detection system, useful behavior can be generated in unknown or partially known environments.

The novel elements of this work include the serialization of the obstacle avoidance and navigation algorithms and the utilization of complementary sensors to avoid considering terrain pessimistically. In previous work, the output of local obstacle detection software has been combined in parallel with global planning information through an arbiter. In our approach, drive arcs are selected utilizing the obstacle avoidance and global information in serial, avoiding potential conflicts between global and local situational awareness.

3 Software Architecture

Hyperion's software modules run as loosely coupled processes communicating through a TCP/IP based publish/subscribe protocol. The anonymous nature of this system assisted in development, as it was possible to test various system modules without requiring the entire robot to be operational. Furthermore, this approach to module communications allowed for the development of simple, decoupled testing tools. These tools subscribe to standard communication messages, essentially eavesdropping on the internal state of the system, and can be easily started and stopped during runtime without affecting other processes.

Hyperion utilizes a two-tiered obstacle detection system to guarantee safety. The nature of the terrain encourages the use of an optimistic navigation system, allowing terrain that is unsensed by the stereo vision system to be traversed. For this approach to be safe, a highly reliable backup system must be in place. On Hyperion, a laser range finder is configured as a "virtual bumper": it prevents the robot from hitting any undetected obstacles by stopping the robot when it detects an imminent collision.
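In pseudocode, the virtual bumper reduces to a range gate over each laser scan. The sketch below is illustrative only: the scan representation, stopping distance, and corridor width are assumptions, not values reported for Hyperion.

```python
import math

# Illustrative parameters; the actual values used on Hyperion are not given here.
STOP_DISTANCE_M = 1.0         # assumed stopping distance along the direction of travel
CORRIDOR_HALF_WIDTH_M = 0.9   # assumed half-width covering the robot body plus margin

def virtual_bumper_trip(ranges_m, angles_rad):
    """Return True if any laser return lies inside the stop corridor ahead of the robot.

    ranges_m, angles_rad: parallel sequences describing one planar laser scan in the
    robot frame (angle 0 = straight ahead).
    """
    for r, a in zip(ranges_m, angles_rad):
        x = r * math.cos(a)   # forward distance to the return
        y = r * math.sin(a)   # lateral offset of the return
        if 0.0 < x < STOP_DISTANCE_M and abs(y) < CORRIDOR_HALF_WIDTH_M:
            return True       # imminent collision: command a stop
    return False
```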

Fig. 2. Diagram of module interconnections.

4 Stereo Vision Based Obstacle Detection

The physical characteristics of a robot play a critical role in determining the configuration of its sensors. Key parameters are the stopping distance, the speed, and the height at which the stereo vision cameras can be placed [3]. Optimizing the characteristics of a stereo vision head involves tradeoffs between range accuracy, minimum viewing distance, stereo rate, and field of view.

To maximize the effective field of view of the stereo system, the cameras are mounted to steer with the front axle of the robot. This allows the use of relatively narrow (60°) field of view cameras that effectively have a much wider field of view. The stereo vision head is configured to maximize situational awareness while maintaining reasonable computational requirements. Hyperion utilizes a 20 cm stereo baseline that yields a range resolution of approximately 15 cm at 7 m. Limiting the number of disparities the stereo correlation engine searches dramatically reduces the time taken to process each frame, but also limits the minimum distance at which range data can be acquired. Due to this effect, Hyperion is unable to reliably recover accurate range information closer than 1 m from the cameras. This arrangement provides the optimal tradeoff between accuracy at range and minimum sensing distance.
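The quoted figures (20 cm baseline, roughly 15 cm range resolution at 7 m) are consistent with the standard first-order stereo depth-error relation ΔZ ≈ Z²·Δd / (f·B). The sketch below simply evaluates that relation; the focal length and sub-pixel disparity step are illustrative assumptions, not parameters reported for Hyperion's cameras.

```python
import math

def depth_resolution(z_m, baseline_m, focal_px, disparity_step_px):
    """First-order stereo depth uncertainty at range z_m.

    Uses dZ ~= Z^2 * dd / (f * B), with the disparity step dd and the focal
    length f both expressed in pixels and the baseline B in meters.
    """
    return (z_m ** 2) * disparity_step_px / (focal_px * baseline_m)

# Illustrative parameters only (assumed, not taken from the paper): a 640-pixel-wide
# imager with a 60 degree horizontal field of view and quarter-pixel disparity interpolation.
focal_px = 320.0 / math.tan(math.radians(30.0))        # roughly 554 px
print(depth_resolution(7.0, 0.20, focal_px, 0.25))     # roughly 0.11 m at 7 m
```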

4.1 Stereo Filtering

Point errors due to miscorrelations are a common problem in 3-D data generated by stereo vision. One approach to preventing miscorrelation is to filter the resulting disparity image by rejecting regions where the input image doesn't have much visual texture. A similar technique is to filter pixels where the correlation is not sufficiently unique. Instead of filtering the disparity image based on the characteristics of the input images, the filtering on Hyperion utilizes only the output disparity data. The world Hyperion navigates in can be modeled as a (relatively) smoothly varying 2½-D surface, so any large step or spike in the disparity data can be assumed to be noise and should be discarded. This approach allows the algorithm to utilize regions of low visual texture while still providing reliable data. To check for steps in the disparity image, each disparity pixel is compared with its neighbors; if its disparity differs from any of its neighbors by more than ¼ pixel, it is discarded. This algorithm can be implemented easily and efficiently. Small patches of bad data are removed by this approach, but the skeleton of large patches will remain.
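A minimal sketch of this neighbor-difference filter, assuming the disparity map is a NumPy float array; handling of pixels already marked invalid by the correlator is omitted.

```python
import numpy as np

def filter_disparity_steps(disparity, max_step=0.25, invalid=-1.0):
    """Discard disparity pixels that differ from any 4-connected neighbor by
    more than max_step (in pixels), treating such steps or spikes as noise.

    disparity: 2-D float array; returns a copy with rejected pixels set to `invalid`.
    """
    d = disparity.astype(float)
    ok = np.ones_like(d, dtype=bool)
    # Compare each pixel with its left/right and up/down neighbors.
    ok[:, 1:]  &= np.abs(d[:, 1:]  - d[:, :-1]) <= max_step
    ok[:, :-1] &= np.abs(d[:, :-1] - d[:, 1:])  <= max_step
    ok[1:, :]  &= np.abs(d[1:, :]  - d[:-1, :]) <= max_step
    ok[:-1, :] &= np.abs(d[:-1, :] - d[1:, :])  <= max_step
    out = d.copy()
    out[~ok] = invalid
    return out
```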

A final filtering step compares the computed height of each pixel to the expected ground plane. If it is too far from the ground plane, the pixel is discarded. The combination of these two approaches dramatically reduced the number of false positive obstacles detected by the stereo mapping system.

4.2 Traversability Analysis

Hyperion's obstacle detection algorithm is derived from MORPHIN [15]. For each stereo pair, the algorithm generates a traversability map for the local terrain. To do this, the local terrain is divided into cells that are 12.5 cm square. Groups of cells are combined into overlapping robot-sized patches, and for each patch the algorithm finds the plane that best represents the perceived terrain. Traversability is determined from three metrics: slope, roughness, and "step height". Each metric is normalized to the range [0, 1], and the traversability of the patch is taken to be the worst of the three values.

Slope is the steeper of the pitch or roll the vehicle would encounter on the plane, and is calculated directly from the plane fit. Hyperion is designed to climb slopes and navigate side slopes of up to 15°; any cell with a slope greater than this is considered impassable. Roughness is calculated as the chi-squared residual of the plane fit. The step height metric is an estimate of the variance of the height of points in the cell; it is intended to detect step features in the terrain, such as large boulders.

Regions of the stereo footprint that contain more points generally generate a better estimate of the actual terrain. To capture this, a certainty value is computed for each patch. Certainty is calculated as a function of the number of points in a patch and the evenness of the distribution of the points over the patch.
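The per-patch analysis can be sketched as a least-squares plane fit followed by the three normalized metrics. In the sketch below the 15° slope limit comes from the text above, while the roughness and step-height normalization constants and the exact residual formulation are illustrative assumptions; the certainty term is omitted.

```python
import numpy as np

SLOPE_LIMIT_DEG = 15.0   # vehicle slope limit quoted above
ROUGHNESS_LIMIT = 0.05   # assumed mean-squared residual (m^2) treated as fully rough
STEP_LIMIT_M2 = 0.04     # assumed height variance (m^2) treated as an impassable step

def patch_hazard(points):
    """Evaluate one robot-sized patch of 3-D terrain points (N x 3 array, meters).

    Returns a value in [0, 1]: the worst of normalized slope, roughness
    (plane-fit residual), and step height (variance of point heights).
    """
    xyz = np.asarray(points, dtype=float)
    # Least-squares plane z = a*x + b*y + c.
    A = np.c_[xyz[:, 0], xyz[:, 1], np.ones(len(xyz))]
    (a, b, c), *_ = np.linalg.lstsq(A, xyz[:, 2], rcond=None)

    slope_deg = np.degrees(np.arctan(np.hypot(a, b)))                 # steepest tilt of the plane
    residual = np.mean((A @ np.array([a, b, c]) - xyz[:, 2]) ** 2)    # roughness proxy
    step = np.var(xyz[:, 2])                                          # step-height proxy

    metrics = [min(slope_deg / SLOPE_LIMIT_DEG, 1.0),
               min(residual / ROUGHNESS_LIMIT, 1.0),
               min(step / STEP_LIMIT_M2, 1.0)]
    return max(metrics)   # the patch is only as good as its worst metric
```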

5 Navigation

The navigator enables Hyperion to plan traversable paths to goals generated by the mission planner or the operator. This process is illustrated in Fig. 3. The navigation control structure on Hyperion consists of three embedded loops. The outermost loop handles initialization and waits for either a goal (or several goals) or a command to quit. The second loop is entered when the goal queue is found to be non-empty (i.e. when one or more goals are submitted by the planner/operator). For each goal in the queue, a new map is set up and the third loop is entered but not exited until the goal is either reached by the robot or cancelled by the operator.
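As a structural sketch, the three loops map onto control flow of the following shape; the names (new_map, reached, cancelled, plan_and_drive_one_cycle) are placeholders for the actual interfaces, not Hyperion's code.

```python
def navigator_main(goal_queue, robot):
    """Structural sketch of the navigator's three embedded loops (placeholder interfaces)."""
    while True:                                   # outer loop: initialize, then wait
        request = goal_queue.get()                # blocks until goals (or a quit command) arrive
        if request == "quit":
            break
        for goal in request:                      # second loop: one fresh map per goal
            nav_map = robot.new_map(goal)
            # Third loop: not exited until the goal is reached or cancelled by the operator.
            while not (robot.reached(goal) or robot.cancelled(goal)):
                robot.plan_and_drive_one_cycle(nav_map, goal)
```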


Fig. 3. An illustration of the relationship between the Navigator maps and the Mission Planner map.

5.1 Path Evaluation

Path planning to a goal is performed using the D* algorithm [10][11]. An arc-set of 15 arcs with radii ranging from -4 m to +4 m is used; this range is determined by the minimum radius that can be safely steered using the stereo cameras. A default speed of 30 cm/s is used by the navigator, but this can be changed by an operator at any point during execution. The navigator utilizes 100 m x 100 m square maps with a cell size of 25 cm.

The navigator uses a combination of traversability and certainty (c) to evaluate the cost of paths. Each navigation map cell maps directly to four cells in the traversability map, with traversability values t1 to t4. The cost for a navigation map cell is:

cost = (1 − min(t1, t2, t3, t4)) × c

The basic procedure of the navigator when evaluating paths is as follows (a sketch of the arc-selection step follows the list):

• Update the robot position via the state estimator.
• Obtain stereo information from the stereo mapper.
• Insert the relevant information into the D* map at the corresponding position.
• Update the position once again to account for movement of the robot during stereo data evaluation.
• Evaluate the cost along each of the arcs, vetoing any arc that collides with an obstacle.
• Choose the arc that results in the lowest cost (the total sum of the cost along the arc plus the cost from the end of the arc to the nearest point in the goal region). If several arcs with equal cost are found, the first of these arcs to be evaluated is chosen.
• Send the radius, speed, and time for this arc to the controller. The radius is the radius of the arc with the lowest cost. The speed is the current speed. The time is the estimated time to complete a cycle in the control loop (pessimistically set to 2.0 s).
• If the goal is not reached, repeat the procedure. (Thus, the robot travels a fraction of the chosen arc, and then the entire procedure is repeated.)
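The arc evaluation and selection step can be sketched as below, under a few assumptions: arcs are represented as lists of sampled (x, y) points, per-cell costs come from a map lookup, and the cost-to-go from an arc endpoint is supplied by the D* planner. All names are placeholders rather than Hyperion's actual interfaces.

```python
import math

OBST = 9  # discretized obstacle cost (the navigator uses integer levels from FREE to OBST)

def evaluate_arc(arc_points, cost_map, cost_to_go):
    """Sum the map cost along one arc; return None if the arc hits an obstacle (vetoed)."""
    total = 0
    for (x, y) in arc_points:
        c = cost_map.cost_at(x, y)           # placeholder lookup into the D* map
        if c >= OBST:
            return None                      # arc collides with an obstacle: veto it
        total += c
    end = arc_points[-1]
    return total + cost_to_go(end)           # cost along the arc + cost from its end to the goal

def choose_arc(arcs, cost_map, cost_to_go):
    """Pick the first arc with the lowest finite cost, matching the tie-break described above."""
    best, best_cost = None, math.inf
    for arc in arcs:
        cost = evaluate_arc(arc, cost_map, cost_to_go)
        if cost is not None and cost < best_cost:   # strict '<' keeps the first of equal-cost arcs
            best, best_cost = arc, cost
    return best
```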

5.2 Traversability Scaling

Costs are discretized into a specified number of integer values (we used 10) ranging from FREE to OBST. When transferring the stereo cost values generated by the stereo mapper to corresponding cost values in the D* map, the following algorithm was used: if the stereo cost is greater than the obstacle threshold (0.6), the scaled cost is set to OBST; if the stereo cost is less than the free space threshold (0.2), the scaled cost is set to FREE; otherwise, the scaled cost is set to (stereo cost × (OBST − FREE)) + FREE. Thus, any cell with a cost of OBST is considered an obstacle, any cell with a cost of FREE is considered free space, and cells with costs between FREE and OBST indicate terrain that is more difficult than clear flat terrain but is still traversable by the robot.
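A sketch of this scaling rule, assuming FREE and OBST are the integer endpoints of the ten discrete cost levels (the specific endpoint values here are illustrative):

```python
FREE, OBST = 0, 9            # assumed endpoints of the 10 discrete cost levels
OBSTACLE_THRESHOLD = 0.6     # stereo cost above this is treated as an obstacle
FREE_THRESHOLD = 0.2         # stereo cost below this is treated as free space

def scale_stereo_cost(stereo_cost):
    """Map a continuous stereo traversability cost in [0, 1] to a discrete D* map cost."""
    if stereo_cost > OBSTACLE_THRESHOLD:
        return OBST
    if stereo_cost < FREE_THRESHOLD:
        return FREE
    # Linear interpolation, rounded because map costs are integers.
    return int(round(stereo_cost * (OBST - FREE))) + FREE
```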

5.3 Goal Regions

Waypoints are specified as goal regions whose dimensions and orientation are specified at runtime. In general, the map is oriented along the direction in which the robot should travel to optimize sun angle, with the goal region aligned perpendicular to the preferred direction of travel.

5.4 Response to Faults

Two faults are detected and reported to the health monitor: OFF_MAP_FAULT and NO_ARCS_FAULT. The OFF_MAP_FAULT is generated whenever the robot is moved off the map; this can occur when the robot is not in autonomous mode. If an operator drives the robot off the map, the OFF_MAP_FAULT is triggered and the navigator keeps checking for the fault condition until it is cleared. When the robot is back on the map, the navigator continues with normal operations and stops triggering the fault. When the navigator discovers a situation where all of its arcs are blocked, it triggers a NO_ARCS_FAULT. Here too, the navigator keeps checking until the error condition clears. Neither of these conditions actually disables the navigator; they simply cause the output arcs to be discarded. At this point, the navigator cannot autonomously recover from these faults; instead, an operator needs to handle recovery from error situations.
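The fault behavior amounts to latching each condition, discarding the output arcs while it holds, and re-checking every cycle until it clears. A minimal sketch with placeholder predicates and a hypothetical health-monitor interface:

```python
OFF_MAP_FAULT = "OFF_MAP_FAULT"
NO_ARCS_FAULT = "NO_ARCS_FAULT"

def check_faults(robot_on_map, any_arc_clear, health_monitor):
    """Report navigation faults; while either fault holds, output arcs are discarded.

    robot_on_map, any_arc_clear: booleans computed elsewhere each planning cycle.
    Returns True if it is safe to send the chosen arc to the controller.
    """
    ok = True
    if not robot_on_map:
        health_monitor.report(OFF_MAP_FAULT)   # e.g. an operator drove the robot off the map
        ok = False
    if not any_arc_clear:
        health_monitor.report(NO_ARCS_FAULT)   # every candidate arc is blocked
        ok = False
    return ok   # the navigator keeps running and re-checks on the next cycle
```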

6 Results

To evaluate the concept of sun-synchronous navigation, we deployed Hyperion to Haughton Crater on Devon Island, in Arctic Canada. As part of an end-to-end system test, the mission planner generated a sequence of waypoints marking out a 6 km sun-synchronous circuit that the navigator was to execute over a period of 24 hours. Human operators monitored progress. For recharging cycles between waypoints, operators would orient the robot as required by the mission planner. Further information regarding this experiment can be found in other papers [13][17].

Stereo Mapping Accuracy

The raw stereo mapping data generated by the stereo algorithm had planar errors ranging from -5 cm to +20 cm, increasing with distance from the cameras. In practice, this error didn't affect the validity of maps generated by the stereo obstacle detection algorithm. Given the cell size of our traversability map, obstacles were mislocated by at most 4/5 of a cell. As the robot travels towards obstacles, their positions are revised repeatedly and updated appropriately in the navigation map. In practice the stereo based obstacle detection was more reliable than the "virtual bumper", which generated a large number of false positives due to an algorithm that was overly simplistic for the terrain.


Algorithm Speed

The navigation loop executes in 0.7 s to 1.5 s. The largest individual components of this time were the stereo matching (0.1 to 0.15 s) and the traversability analysis (0.15 to 0.3 s). Variability in the timing was primarily due to I/O blocking: the navigator often had to wait for pose information from the state estimator. This jitter in the control frequency was not a problem because of the slow robot speed.

6.1 Field Experiment Results

The majority of Hyperion's traverse was performed autonomously. Post analysis of mission telemetry revealed that Hyperion drove 6.1 km, operating autonomously for 90% of the time. At no point was it necessary for human observers to physically interact with the robot. Tele-operators found that manually adjusting the arc speed Hyperion used had an impact on its ability to navigate through dense obstacle fields. Due to the decoupling of forward speed and steering, Hyperion can change its heading in a much shorter distance when traveling at lower speeds, allowing it to navigate tight constraints that it would not be capable of at higher speeds.

Fig. 5. A plot of obstacle density vs. waypoint.

The average density of rough or impassable terrain was 6.9%. That is, 6.9% of the total surface area sensed by the robot was not smooth terrain. During several legs, the density was considerably higher, reaching a maximum of 34.2%.

The navigator and obstacle detection algorithms proved very reliable. The level of performance was evident in the navigator's ability to traverse very cluttered terrain successfully. A reasonable metric for the efficiency of the navigator is heading error: the difference between the optimal heading and the actual heading. Over the 24 hour traverse, the robot had an average heading error of 21.0°. In the context of sun-synchrony, this meant the robot received 93% of the maximum energy the mission planner expected it to receive.

Fig. 6. The navigation map for a traverse where Hyperion traversed a very cluttered rock field.

Fig. 4. A histogram of heading error over a 24 hour circuit.

One problem observed was a phenomenon where the robot would hook right just before entering a goal region. This problem was due to several arcs evaluating to the same cost; the navigator would then arbitrarily choose the rightmost arc. This problem has been fixed so that the navigator preferentially selects a straight-ahead arc.

6.2 Use of Debugging Tools

A key lesson from this experience is that high quality debugging tools are vital. Critical to the success of the obstacle detection algorithm was the development of tools that allowed in-depth visualization of its inner workings. This software greatly improved the efficiency with which appropriate parameters for the terrain classification could be determined and also greatly simplified the debugging of the algorithm. Without such tools, the tuning of the navigation system would have been extremely difficult.

Fig. 8. 3D view of terrain rendered by a debugging tool.

7 Future Work

7.1 Variable Speed Control

Operators can control the velocity at which the navigator drives the robot. In tight quarters, slowing the robot improved its ability to navigate around obstacles due to the relative increase in nimbleness. By monitoring the local obstacle density, Hyperion could have autonomously decreased its speed as necessary. This would further reduce the required human input and would likely result in a more robust robot that does not sacrifice speed unnecessarily.

Fig. 7. Hyperion in a dense rock field.


7.2 Rough Terrain

Even when utilizing continuous values to describe the traversability of terrain, there are many subtleties about the terrain that are missed. Often the robot is required to consider obstacles pessimistically; in some cases, the shape of an obstacle would allow a rover to traverse it safely even though the statistical analysis indicates that it is an unsafe region for the rover to drive. By mapping the highly dimensional characteristics of the terrain to a low dimensional space, as is performed in the traversability analysis, subtle but important characteristics of the terrain can be missed. By choosing a different mapping (and mapping to a higher dimensional space), it is likely that terrain could be better represented. This will become particularly important as mobile robots operate in more difficult regions with greater reliability requirements.

7.3 Generalization of the Algorithm

The NASA Jet Propulsion Laboratory is developing a common object based framework, called CLARAty, to allow robotic software to be more portable between software and hardware platforms [14]. As part of this effort, the navigation algorithm described in this paper will be ported to this framework. This will lead to more robust and generally useful software while providing a framework within which various navigation algorithms can be more directly compared.

Acknowledgments

This paper describes the work of the Sun-Synchronous Navigation project, and all of its members are important contributors. The authors acknowledge and thank Dimi Apostolopoulos, Jesse Boley, Stewart Moorehead, Benjamin Shamah, Reid Simmons, Sanjiv Singh, Surya Singh, Tony Stentz, James Teza, Paul Tompkins, Vandi Verma, Michael Wagner and David Wilkinson. Field experimentation was conducted in collaboration with the NASA Haughton-Mars project, Pascal Lee, Principal Investigator. This work is supported by NASA under grant NAG9-1256, Chris Culbert, program manager. The support of Melvin Montemerlo was vital to the completion of this work.

References

[1] P. Belluta et al. "Terrain Perception for Demo III", Proc. IEEE Intelligent Vehicles Symposium, Dearborn, USA, October 2000.
[2] J. Biesiadecki et al. "The Athena SDM Rover: A Testbed for Mars Rover Mobility", Proc. International Symposium on Artificial Intelligence, Robotics and Automation in Space, St-Hubert, Canada, June 2001.
[3] A. Kelly and A. Stentz. "Rough Terrain Autonomous Mobility – Part 1: A Theoretical Analysis of Requirements", Autonomous Robots, No. 5, May 1998, pp. 129-161.
[4] S. Laubach and J. Burdick. "An Autonomous Sensor-Based Path-Planner for Planetary Microrovers", Proc. IEEE International Conference on Robotics and Automation, Detroit, USA, May 1999.
[5] A. Martin-Alvarez et al. "Fuzzy Reactive Piloting for Continuous Driving of Long Range Autonomous Planetary Micro-Rovers", Proc. IEEE Aerospace Conference, 1999.
[6] H. Seraji et al. "Safe Navigation on Hazardous Terrain", Proc. IEEE International Conference on Robotics and Automation, Seoul, Korea, May 2001.
[7] H. Seraji. "Traversability Index: A New Concept for Planetary Rovers", Proc. IEEE International Conference on Robotics and Automation, Detroit, USA, May 1999.
[8] K. Shillcutt. "Solar Based Navigation for Robotic Explorers", Ph.D. thesis, CMU-RI-TR-00-25, October 2000.
[9] S. Singh et al. "Recent Progress in Local and Global Traversability for Planetary Rovers", Proc. IEEE International Conference on Robotics and Automation, San Francisco, USA, April 2000.
[10] A. Stentz. "Optimal and Efficient Path Planning for Partially-Known Environments", Proc. IEEE International Conference on Robotics and Automation, volume 4, pp. 3310-3317, 1994.
[11] A. Stentz. "The Focused D* Algorithm for Real-Time Planning", Proc. International Joint Conference on Artificial Intelligence, August 1995.
[12] A. Stentz and M. Hebert. "A Complete Navigation System for Goal Acquisition in Unknown Environments", Proc. IEEE/RSJ International Conference on Intelligent Robotic Systems, August 1995.
[13] P. Tompkins et al. "Mission Planning for the Sun-Synchronous Navigation Field Experiment", Submitted to Proc. IEEE International Conference on Robotics and Automation, Washington, USA, May 2002.
[14] R. Volpe et al. "The CLARAty Architecture for Robotic Autonomy", Proc. IEEE Aerospace Conference, 2001.
[15] D. Wettergreen et al. "Developing Nomad for Robotic Exploration of the Atacama Desert", Robotics and Autonomous Systems, February 1999.
[16] D. Wettergreen et al. "Robotic Planetary Exploration by Sun-Synchronous Navigation", Proc. International Symposium on Artificial Intelligence, Robotics and Automation in Space, St-Hubert, Canada, June 2001.
[17] D. Wettergreen et al. "Experiments in Sun-Synchronous Navigation", Submitted to Proc. IEEE International Conference on Robotics and Automation, Washington, USA, May 2002.
