Systems for Automated Launch and Recovery of an Unmanned Aerial Vehicle from Ships at Sea

Matt Garratt, Hemanshu R. Pota, Andrew Lambert and Sebastien Eckersley-Maslin
The University of New South Wales at the Australian Defence Force Academy, Canberra, ACT 2600, Australia
{m.garratt,h.pota,a.lambert}@adfa.edu.au, [email protected]

ABSTRACT

Sensors and systems for a fully autonomous unmanned helicopter have been developed with the aim of completely automating the landing and launch of a small unmanned helicopter from the deck of a ship. Our approach is novel in that we are targeting a much smaller helicopter than previously considered for maritime operations. Our tests so far have been completed on a Yamaha RMAX unmanned helicopter with an all-up weight of about 80kg including autonomy systems. In this paper we describe new sensors to meet this challenge, including a new deck orientation and ranging sensor and a small beacon tracking sensor using FPGA-based image processing technology. We describe a technique for predicting ship motion that will later assist us with trajectory planning for the helicopter landing approach. Finally, a lightweight deck securing system will be presented that will prevent toppling and sliding of the helicopter on a moving deck.

BIOGRAPHY

Matt Garratt spent ten years service in the Royal Australian Navy, working as a dual specialist in alternate aeronautical and marine engineering postings. On leaving the service, he spent two years working as a consultant for LEAP Australia in the field of computer-aided engineering. He left the commercial world to work as a control systems engineer on a DARPA-funded project to build an autonomous helicopter using biologically inspired vision, which led to successful visual control of a helicopter in hover and forward flight in 2000. Since 2001, he has been with the University of New South Wales as a lecturer in the School of Aerospace, Civil and Mechanical Engineering. His main research areas are helicopter dynamics and sensing and control for autonomous systems.

Hemanshu Roy Pota received the B.E. degree from Sardar Vallabhbhai Regional College of Engineering and Technology, Surat, India, in 1979, the M.E. degree from the Indian Institute of Science, Bangalore, India, in 1981, and the PhD degree from the University of Newcastle, Australia, in 1985, all in electrical engineering. He is currently an Associate Professor with the University of New South Wales at the Australian Defence Force Academy, Canberra, Australia. He has held visiting appointments at the University of Delaware, Iowa State University, Kansas State University, Old Dominion University, the University of California, San Diego and the Centre for AI and Robotics, Bangalore, India. He has a continuing interest in power system dynamics and control, and in the modelling and control of mechanical systems such as flexible structures, acoustical systems and UAVs.

Andrew Lambert lectures in Electrical Engineering at The University of New South Wales at the Australian Defence Force Academy (UNSW@ADFA), Canberra, Australia. He specialises in signal and image processing, optoelectronics and high speed DSP and FPGA hardware.
He obtained his BSc(Hons) in Physics from the University of Otago, New Zealand in 1987, and his PhD on Optical Image Processing from UNSW in Electrical Engineering in 1997. His active research area is imaging through turbulence for astronomical and surveillance applications, with recent expansion of the image processing and adaptive optics to ophthalmology. This involves turbulence sensing, image restoration, development of electro-optical and digital hardware, path modelling, and signal processing using FPGAs.

Sebastien Eckersley-Maslin is a Weapons Electronics Engineering Officer with the Royal Australian Navy and is studying for a Masters in Engineering at the Australian Defence Force Academy. Lieutenant Eckersley-Maslin joined the Navy in 2000 and was top of his class in his Weapons Electronic Application courses. He served in the guided missile frigate HMAS SYDNEY for 18 months as a Weapons and Combat Systems Engineer before joining the Air Warfare Destroyer program in Jan 2006, where he currently works on the Combat System Architecture.


1 Introduction

When considering systems which would allow VTOL UAVs to operate from small ships, a number of challenges arise. Of these, we address two major problems. Firstly, there is a need to know, with accuracy, the relative position and orientation of the ship's deck. Secondly, as the deck is continually moving, a system is required that will secure the UAV to the deck up until the instant of launch and from the moment of touchdown. Without such a system, the aircraft is liable to topple or slide off the deck as soon as the ship rolls or pitches excessively. This is especially critical for small vessels at sea, which have significant motion. To address these problems, we have prototyped a number of systems which could enable small UAVs of the order of 100kg to operate from ships less than 60m in length.

Determining the relative position and orientation of a pitching and heaving deck is a major challenge. Previous attempts have predominantly focused on using either (a) relative GPS, which is subject to satellite availability, signal blockage from the ship's superstructure, multipath errors and jamming [3, 9], or (b) radar guidance, which is expensive and reliant on substantial radio frequency emissions. As part of the US Department of Defense Joint Precision Approach and Landing System (JPALS) project [8], research is being funded on means to make Shipboard Relative GPS (SRGPS) sufficiently robust in the presence of errors to permit automatic landings on aircraft carriers [5, 9, 13]. This work is progressing, but it is likely that, for robust operations on small ships, ship-based navigation beacons known as pseudolites may be required to augment satellite signals close to the ship [18]. The two main ship-launched VTOL UAVs under development, the Northrop Grumman Firescout and the Bell Eagle Eye, make use of the UAV Common Automatic Recovery System (UCARS) developed by Sierra Nevada Corporation [25].
The UCARS uses a millimetre-wave radar on the ship in combination with a transponder mounted on the aircraft to track the trajectory of the UAV. While the UCARS is effective, it requires the use of radar emissions, which can be undesirable in a tactical situation. It is also expensive and requires significant infrastructure on the ship which, while affordable on an aircraft carrier, may not be suited to a small patrol boat. We are aiming to field systems that could be used on much smaller vessels that experience greater ship motion and where expensive ship-based infrastructure cannot be justified. In addition, we are constrained to using lightweight systems onboard the UAV owing to the much reduced payload capability of a small VTOL UAV (30kg for the RMAX). In our approach, we have assumed that the UAV can complete its mission, including waypointing, using a mature navigation technology such as GPS. Sub-metre accuracy relative to a moving base (i.e. the ship) is achievable with current COTS solutions such as the NovAtel moving base option [1, 27], which requires a single GPS receiver on the ship and another on the UAV with a low bandwidth data link between them. We assume that this technology is sufficient to get the UAV to a keypoint over the deck of the ship.

For the final descent and landing phase we propose a new precision guidance system that incorporates an accurate estimate of the ship motion and a sensor for measuring the instantaneous orientation and location of the landing deck. We have tested a simple yet robust algorithm for predicting ship motion which will be used in conjunction with the new sensors to assist in trajectory planning, with the ultimate aim of landing the UAV on the deck at an instant when the deck motion is at a minimum. As the primary sensor, we have modified an existing laser rangefinder device with a novel scanning mechanism which determines both the distance to and the orientation of the deck in one cycle. We have constructed a downward-looking optical sensor to complement the laser system. The sensor comprises a digital camera interfaced to a Field Programmable Gate Array (FPGA) on a single printed circuit board. A narrow-band light source on the deck is detected by the digital camera and tracked by the FPGA to provide a relative bearing to the deck from the helicopter. By combining the optical sensor bearing with the information from the laser system, the exact position of the helicopter relative to the deck can be found. In this paper we discuss the algorithms used and present flight test results for the positioning system.

One of the most difficult requirements of sea operations is the need to restrain the helicopter so that, from the moment of touchdown and until just prior to launch, it is prevented from toppling and sliding due to ship motion. For this purpose, we have designed and flight tested a series of four spring-loaded probes which engage with a deck grid to positively lock the helicopter to the deck upon touchdown and immediately prior to launch.
This system requires no moving parts on the ship and has been shown to positively secure the helicopter up to a roll angle of 39◦. We have also begun construction of a three-degree-of-freedom moving deck platform to simulate the motion of a seagoing vessel. The deck will be capable of pitch and roll up to 25◦ and heave up to 1m, and will be used to dynamically test the deck probe system. Although originally designed for testing recovery and launch systems, we propose that such a platform could also be used to eliminate most of the ship motion by moving in opposite phase to the motion of the ship. By cancelling out the most troublesome motions and keeping the deck level, the recovery envelope for the UAV could be expanded greatly. We give an overview of the deck recovery systems and their applications.

2 System Overview

Experiments for this research were conducted on the 80kg Yamaha R-Max helicopter shown in Figure 1. This platform has been used for autonomous helicopter

22nd International UAV Systems Conference, Bristol, April 2007

research at a number of other institutions including Georgia Tech [7], UC Berkeley [14], Linköping University [22], Carnegie Mellon University [12] and NASA/US Army [19]. The Yamaha R-Max platform is predominantly used for agricultural work in Japan, although a fully autonomous version has been marketed for airborne surveillance. A number of variants have been produced but the underlying systems are similar in each model. Our vehicle is an L-15 R-Max with a 30kg payload and an endurance of approximately one hour. The performance of the R-Max makes it an ideal platform for research.

The R-Max comes with a stability augmentation system based on an attitude control inner loop, known as the Yamaha Attitude Control System (YACS). The YACS on our RMAX has been configured to output inertial information to our system via an RS-232 link, including the output of three fibre optic rate gyroscopes and three accelerometers. We have added a MicroStrain 3DM-GX1 attitude sensor, which incorporates a 3-axis magnetometer, to the RMAX to provide heading information to the flight computer. Our avionics system includes a processing unit comprising a PC104 computer which is interfaced to the laser rangefinder, a radio modem, a NovAtel RT2 RTK DGPS and the YACS. Currently our PC104 computer runs the RTLinuxPro real-time operating system and flight control software that performs sensor fusion and generates the control inputs to drive the collective, cyclic and tail rotor servos. The PC104 software includes an algorithm to find the plane of the deck from the laser data. Figure 2 shows the avionics system architecture for the UNSW@ADFA RMAX. An 8-port PC104 serial card is necessary to connect the PC104 to all of the RMAX systems and the sensors. The native USB port has been used to implement data-logging using a COTS memory stick.
This allows high bandwidth flight test data to be recorded in flight and then transferred to a PC workstation immediately after landing for analysis. Memory sticks are now available with 4GB capacity, enabling practically unlimited amounts of flight test data to be recorded.

3 Laser Rangefinder Sensor

Our deck landing sensor comprises a Laser Rangefinder (LRF) with a rotating mirror as shown in Figure 3. Owing to the orientation of the axis of the mirror and the angle of the mirror face to that axis, the laser scans a conical pattern centred on the deck below. As the laser traces out a circle or ellipse of points on the deck, an array of 3D coordinates is assembled that defines the intersection of the laser scan pattern and the deck. Each scan takes place in less than 40 milliseconds and typically comprises 100 points. As the range accuracy of each point on the deck is better than 2cm in practice, the error in the deck position is small and suitable for guiding the trajectory of the helicopter as it descends to the deck. A plane fitted through these points, using an appropriate technique such as least squares, then defines the relative distance and orientation of the deck with respect to the helicopter.

We used an AccuRange 4000 Laser Rangefinder from Acuity for this project. This rangefinder uses a modulated beam to measure range using a time-of-flight method. The 20 milliwatt beam is transmitted by a laser diode at a wavelength of 780nm. The manufacturer states a range of a few cm up to 16.5m with an accuracy of 2.5mm. Although Acuity provide a line-scanner system of their own, we needed to replace this system to obtain the conical scan pattern peculiar to this application. A mirror was machined out of a block of aluminium with a reflecting face cut at 86.5◦ to the axis of the motor. This provides a cone with an included angle of 7.0◦ as the mirror spins. The mirror was hand-polished and then electroplated with silver to provide a reflectivity to the laser light of 95%. Figure 4 shows the mirror assembled on the drive motor.

To obtain a fast enough scan rate, we wanted to spin the mirror at at least 1500RPM, or 25 cycles per second. As the mirror is not symmetrical about the axis, its imbalance needed to be addressed to operate at these speeds. We therefore designed a balancing collar out of stainless steel that offsets the static and dynamic imbalance of the mirror. The collar is of constant thickness, but the end profile was machined to maintain the centre of gravity of each longitudinal slice of the combined collar/mirror assembly on the axis of the shaft. Once assembled, the balance of the assembly was finely adjusted by adding tiny weights and removing material where required. The entire assembly with LRF and mirror was mounted under the belly of the RMAX using four cable mounts for vibration isolation. The mounts were tuned to attenuate vibrations at the main rotor frequency and above. The mirror is mounted directly onto the shaft of a small DC motor, which is itself mounted at 45◦ to the beam of the laser rangefinder.
The speed of the motor can be adjusted by changing the input voltage using a multi-position switch. An optical encoder is fitted on the shaft of the motor. The encoder outputs a quadrature pulse train with a precision of 4096 pulses per revolution. An index pulse is triggered once per revolution for synchronisation purposes. The pulse train from the encoder is monitored by an analogue safety interlock system that automatically disrupts power to the laser if the mirror speed falls below 1000RPM. This stops the laser from being concentrated onto a single spot for too long and causing a hazard to observers. The rangefinder and encoder signals are read into a PC104 form factor High Speed InterFace (HSIF) manufactured by Acuity. The HSIF uses an ISA bus interface to communicate with the CPU and enables a sample rate of up to 50,000 samples per second. For our application, to limit the data processing requirements, we have used a sample rate of only 2kHz, but there is no reason why this could not be increased for better accuracy. The HSIF comes standard with a 2K hardware buffer.
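The interlock itself is analogue in the flight hardware, but the rule it enforces can be sketched in a few lines of Python (the function names here are illustrative, not part of the flight software):

```python
def mirror_rpm(index_period_s):
    """Mirror speed computed from the time between successive
    once-per-revolution encoder index pulses."""
    return 60.0 / index_period_s

def laser_permitted(index_period_s, min_rpm=1000.0):
    """Software analogue of the safety interlock: the laser is only
    enabled while the mirror spins at 1000 RPM or above."""
    return mirror_rpm(index_period_s) >= min_rpm
```

At the nominal 1500 RPM the index pulses arrive every 40 ms, matching the scan period quoted above.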



Figure 1: UNSW@ADFA RMAX in flight during laser-rangefinding experiments

Figure 2: RMAX Avionics Architecture

Figure 3: Laser Scanning System


Figure 4: Rotating Mirror Assembly

A half-full interrupt on the HSIF triggers a real-time driver program on the PC104 to download the next series of samples and load them into RAM. As each sample comprises 8 bytes, including amplitude, range, encoder and intensity fields, the buffer becomes half-full after 128 samples are received. A control thread running on the PC104 executes at 100Hz and checks for the latest data in RAM each time it is woken up. With a sample rate of 2kHz, the interrupt is triggered about every 64ms and takes about 1ms to execute. The processing takes place after the full data set for each rotation of the mirror is received, which occurs about every 40ms. Combining all of the latencies together results in a maximum latency of approximately 10 + 64 + 1 + 40 = 115ms.
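The latency budget can be tallied in a short sketch, using the sample rate, buffer size and timings given above (the variable names are illustrative):

```python
SAMPLE_RATE_HZ = 2000      # laser sample rate used in flight
HALF_BUFFER_SAMPLES = 128  # 2K-byte buffer of 8-byte samples is half full at 128 samples
CONTROL_PERIOD_MS = 10     # 100 Hz control thread wake-up period
ISR_MS = 1                 # measured interrupt execution time
SCAN_MS = 40               # one full mirror revolution at 25 rev/s

# time for the HSIF buffer to half-fill and raise its interrupt
half_fill_ms = 1000 * HALF_BUFFER_SAMPLES // SAMPLE_RATE_HZ

# worst case: control-loop wait + buffer fill + interrupt service + scan assembly
worst_case_ms = CONTROL_PERIOD_MS + half_fill_ms + ISR_MS + SCAN_MS
```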

3.1 Deck Measurement Algorithm

For each scanned sample, a distance measurement and encoder output are taken from the laser apparatus. The range measurement is corrected for known errors due to changes in temperature, ambient light and reflectivity. The range measurement is scaled into metres using a look-up table based on calibration data. The encoder measurement is converted to an azimuth angle measured from a reference point designated as the nose of the vehicle. Given an encoder with 4096 discrete positions per revolution, the azimuth angle (ψ) is calculated from the encoder output (E) using equation (1).

ψ = 2πE / 4096    (1)

The range (r) and azimuth angle are then converted into a three-dimensional position relative to the aircraft axes system, taking into account the included angle (α) of the laser cone. The aircraft body axes are a right-handed axes system fixed at the sensor position and rotating with the vehicle. The x-body axis is aligned with the length of the helicopter so that the positive x direction points forwards from the nose. The y-axis passes out to the right, parallel to the straight line which would join the wing tips if it were an aircraft.

The z-axis points vertically down when the helicopter is flying level. Equation (2) below provides the transformation from range/azimuth to the 3D body-axes coordinate system [xb, yb, zb].

xb = r sin(α) sin(ψ)
yb = r sin(α) cos(ψ)
zb = −r cos(α)    (2)
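Equations (1) and (2) translate directly into code; the flight software implements them in C on the PC104, but an equivalent Python sketch (with illustrative names) is:

```python
import math

ENCODER_COUNTS = 4096       # encoder positions per mirror revolution
ALPHA = math.radians(7.0)   # included angle of the laser cone, as used in eq (2)

def encoder_to_azimuth(e):
    """Equation (1): encoder output E to azimuth angle (radians)."""
    return 2.0 * math.pi * e / ENCODER_COUNTS

def to_body_axes(r, psi, alpha=ALPHA):
    """Equation (2): range and azimuth to body-axes coordinates [xb, yb, zb]."""
    xb = r * math.sin(alpha) * math.sin(psi)
    yb = r * math.sin(alpha) * math.cos(psi)
    zb = -r * math.cos(alpha)
    return xb, yb, zb
```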

If desired, each 3D point can be adjusted for the attitude (pitch, roll, yaw) of the flight vehicle as measured by the vehicle's attitude reference system. After this transformation, the points are in global coordinates defined relative to the earth-based axes system [xg, yg, zg]. Each 3D point is stored into a buffer in the processing unit memory. After a complete scan, the buffer of 3D points is passed to a software subroutine which calculates the plane of best fit to the stored points. A plane in 3D space is described by equation (3); by determining the coefficients K1, K2 and K3 the plane of the surface is defined.

K1 x + K2 y + K3 z = 1    (3)

To determine the coefficients describing the plane we use a least-squares method. The objective of the least-squares minimisation is to find the value of the plane coefficient vector λ = [K1 K2 K3]ᵀ such that the sum of the squares of the error residuals (R) in equation (4) is minimised.

R = Σ_{i=1..n} (1 − K1 xi − K2 yi − K3 zi)²    (4)

To implement this, the coordinates of the scan points are arranged in matrix form as per equation (5).

A = [ x1 y1 z1
      x2 y2 z2
      ⋮
      xn yn zn ]    (5)

Equation (6) is the solution to the least-squares problem.

λ = (AᵀA)⁻¹ Aᵀ b,  where b = [1 1 … 1]ᵀ    (6)

Once the equation of the plane is found from equation (6), or by some other means, the instantaneous height H of the vehicle above the surface can be found using equation (7).

H = −1/K3    (7)

Likewise the roll and pitch of the surface can be found from the equation of the plane. If we define the pitch angle (θ) as the inclination of the plane from the y-axis and the roll angle (φ) as the inclination of the plane from the x-axis, then equation (8) defines the orientation of the plane.

θ = arcsin(K1),  φ = arctan(K2/K3)    (8)
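Equations (3) to (8) amount to one linear least-squares solve followed by two angle extractions. A minimal Python/numpy sketch, following the paper's equations literally (the function name is illustrative):

```python
import numpy as np

def fit_deck_plane(points):
    """Fit K1*x + K2*y + K3*z = 1 to a scan of 3D points and recover the
    height and orientation of the deck, per equations (3)-(8)."""
    A = np.asarray(points, dtype=float)          # n x 3 matrix of eq (5)
    b = np.ones(len(A))                          # right-hand side of eq (3)
    lam, *_ = np.linalg.lstsq(A, b, rcond=None)  # least-squares solution, eq (6)
    K1, K2, K3 = lam
    H = -1.0 / K3                                # height above the surface, eq (7)
    theta = np.arcsin(K1)                        # pitch of the plane, eq (8)
    phi = np.arctan(K2 / K3)                     # roll of the plane, eq (8)
    return H, theta, phi
```

For a level deck 5m below the sensor, all scan points satisfy z = −5 and the fit returns H = 5 with zero pitch and roll.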

When the helicopter is flying at altitude, the circle drawn on the deck below will be quite large, and for a small deck this can cause the circle to partially fall off the edge of the deck. Large helicopter pitch and roll changes can also translate the circle so that part of it is no longer on the deck. To deal with this, we have two strategies. Firstly, in the production version, we intend to adjust the mounting bracket design so that the laser is tilted forwards during hover, such that the rear arc of the circle lies more or less underneath the centre of gravity of the helicopter. This way, regardless of altitude, there will always be part of the circle that lies on the deck, provided the helicopter is overhead. Parts of the circle that are not within a predefined radius of a point directly below the helicopter will be disregarded. Secondly, we have refined the deck estimation algorithm to disregard points that are not within some tolerance of the current deck estimate. The aim is to eliminate parts of the circle which are not on the deck, by neglecting those points which are not close to the plane where we believe the deck to be. This is achieved using a simple iterative process. First, the distance between every scanned point and the estimated plane of the deck is calculated. Any points outside a certain tolerance are ignored. If the percentage of points being ignored becomes too high or too low, the tolerance is adjusted. A new deck estimate is then calculated and stored. A further strategy can be used in conjunction with the visual sensor. As the vision sensor provides a measure of the helicopter's lateral and longitudinal position with respect to the centre of the deck, it can be used to eliminate parts of the circle which are known a priori to lie outside the known boundaries of the landing platform.
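One pass of the iterative gating described above might look like the following sketch (the tolerance value, bounds and adaptation factors are illustrative assumptions, not the flight values):

```python
import numpy as np

def refine_deck_estimate(points, k_prev, tol=0.2, max_reject=0.5, min_reject=0.05):
    """Discard scan points far from the previous plane estimate k_prev,
    adapt the tolerance if the rejection rate is out of bounds, then re-fit
    the plane K . p = 1 to the surviving points."""
    pts = np.asarray(points, dtype=float)
    k_prev = np.asarray(k_prev, dtype=float)
    # perpendicular distance of each point from the plane k_prev . p = 1
    dist = np.abs(pts @ k_prev - 1.0) / np.linalg.norm(k_prev)
    keep = dist < tol
    rejected = 1.0 - keep.mean()
    if rejected > max_reject:      # gate too tight: widen it
        tol *= 2.0
        keep = dist < tol
    elif rejected < min_reject:    # gate too loose: tighten it
        tol *= 0.5
        keep = dist < tol
    k_new, *_ = np.linalg.lstsq(pts[keep], np.ones(keep.sum()), rcond=None)
    return k_new, tol
```

With a handful of deck points at z = −5 and two off-deck points at z = −8, the off-deck points are gated out and the refit recovers the deck plane.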

3.2 Sensor Simulation Results

We have developed a Simulink™ representation of the Laser Rangefinder sensor, including a C S-function block implementation of the deck estimation algorithm. The deck position calculation algorithm was run with various amounts of white noise added to the range data to test the tolerance of the algorithm to noise. The set of error results presented in Figure 5 are from simulations of the scanning algorithm at a 10 metre height above a flat moving platform. The platform was simulated to move at a sinusoidal rate in both roll and pitch. As seen from the results, there is an almost linear relationship between the roll and pitch calculation error and the amount of simulated range noise. Importantly, the algorithm is seen to be very resilient to noise, demonstrated by less than 0.5 degrees of error with up to 10cm of noise in the laser rangefinder. Increasing the range noise to an unrealistic level of 50cm only increases the roll and pitch calculation error to within 1.5 degrees.
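Our simulation was built in Simulink, but an equivalent experiment can be sketched in a few lines of Python/numpy: simulate one 100-point scan at 10m over a level deck (level rather than sinusoidally moving, for brevity), add white range noise, fit the plane and measure the orientation error against the known truth of zero:

```python
import numpy as np

rng = np.random.default_rng(0)
ALPHA = np.radians(7.0)   # included cone angle, as in the flight hardware
HEIGHT = 10.0             # simulated height above a flat, level deck
PSI = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)  # ~100 points per scan

def scan_orientation_error_deg(noise_std):
    """Fit the deck plane to one simulated scan with white range noise and
    return the combined roll/pitch error in degrees (true attitude is zero)."""
    r = HEIGHT / np.cos(ALPHA) + rng.normal(0.0, noise_std, PSI.size)
    pts = np.column_stack([r * np.sin(ALPHA) * np.sin(PSI),
                           r * np.sin(ALPHA) * np.cos(PSI),
                           -r * np.cos(ALPHA)])
    k, *_ = np.linalg.lstsq(pts, np.ones(PSI.size), rcond=None)
    theta = np.arcsin(k[0])
    phi = np.arctan(k[1] / k[2])
    return float(np.degrees(np.hypot(theta, phi)))
```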

3.3 Flight Test of Laser Sensor

Our flight test experiments consisted of three parts: determination of attitude, determination of height, and closed-loop control of height. In the first part we wanted to determine how well the laser sensor worked at measuring orientation. As we did not have access to a moving deck platform at the time, we chose instead to use a stationary ground plane and stimulate the attitude of the helicopter through the pilot controls. We did not transform the coordinates of the scanned points into earth-centred coordinates, so that the attitude of the measured deck plane would correspond to the attitude of the helicopter. By comparing the attitude measured by the Yamaha Attitude Sensor (YAS) with the deck orientation estimated by the laser system we were able to check that the system was working. The RMAX was flown over a disused car park and made to roll and pitch whilst in hover. The comparison of the laser output and the YAS is shown in Figure 6a/b. There is a clear correlation between the two sensing systems, with some minor deviations. The performance of the sensor is particularly encouraging considering the very small included angle of the laser cone. For this test, we used an included angle of only 7◦, which meant the laser circle on the ground was only 12cm in diameter when flying at a height of 1m. Small bumps in the ground of a few cm can therefore result in significant attitude errors. We are currently constructing a new mirror with a 20◦ included angle, which will improve the accuracy. The downside of a bigger included angle is that more of the circle would lie off the edge of a small deck at altitude. Also, for this experiment, we did not adjust the coordinates of the helicopter for its lateral and longitudinal motion. There were times when the speed of the helicopter exceeded 1m/s, resulting in a displacement of over 4cm during a complete 360◦ scan.
With a circle diameter of 12cm, this would warp the circle by 25% and bias the attitude results.

Figure 5: Effect of Noise on Laser Rangefinder (LRF): deck orientation error (deg) versus range noise (m)

The second test of the system was to check that the height output of the algorithm was reasonable and compared well with a second measurement of height. The helicopter was placed in vertical flight, with climbs and descents set up using pilot collective. The output of the sensor was compared against the altitude output of the NovAtel DGPS system after subtracting the elevation of the ground under the helicopter from the DGPS altitude. The results of this comparison, in Figure 6c, show a very close match between the two systems.

The final tests involved using the laser system to control the height of the helicopter. A simple PID controller was used to control the collective pitch of the main rotor to keep a constant altitude. The controller gains were tuned manually starting from a low value. On handover to the helicopter flight computer, the desired altitude fed to the control system was set to the current altitude measured by the laser system. The control system was able to keep the helicopter to within 10cm of the desired altitude despite strong wind gusts on the day. The error in altitude is shown in Figure 6d.

The same controller was then extended to control of landing. In this case, with control handed over to the PID controller, a command sent from the ground control station initiated a gradual change in reference height. The flight control software decreased the desired height by 1cm per second. In response, the helicopter descended from its initial hover height until it came in contact with the ground. For future work, this descent speed will be increased severalfold as it was very conservative. A plot of the height of the helicopter as it descended is shown in Figure 6e.
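The height loop is a textbook PID from laser-height error to collective command, with a ramped reference for landing. A Python sketch of the structure (the gains and class name here are placeholders, not the tuned flight values):

```python
class HeightPID:
    """PID from laser height error to a collective command increment.
    Gains are illustrative only; the flight gains were hand-tuned."""

    def __init__(self, kp=0.5, ki=0.05, kd=0.2, dt=0.01):  # 100 Hz control loop
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = None

    def update(self, h_ref, h_meas):
        err = h_ref - h_meas
        self.integral += err * self.dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def landing_reference(h0, t, rate=0.01):
    """Reference height ramping down at 1 cm/s from the hover height h0,
    clamped at ground level."""
    return max(0.0, h0 - rate * t)
```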

4 Visual Tracking Sensor

The concept of using vision to land a UAV on a moving deck is not new. Saripalli et al. at the University of Southern California have landed a small autonomous helicopter on a helipad using vision and inertial information [26]. In their work, a pattern comprised of polygons painted on the helipad was identified and tracked using a fixed threshold to pick out the pattern from the background. The helicopter was able to track the pattern even when the helipad was moving; however, the motion was switched off for the actual landing. The US Naval Postgraduate School has devised a ship tracking system using an infrared camera [21]. A long way from the ship, the heat signature of the ship (e.g. from the smoke stack) can be used to track the relative position of the ship. Close to the ship, three known hotspots on the ship are tracked by the camera, and their relative azimuth and elevation in the field of view are used to solve for the position and orientation of the ship relative to the helicopter. Using a fixed-wing UAV, the output of the tracking algorithm was compared against data from a DGPS system and found to provide position accuracy of about 1m at close range. In previous work, one of the authors [17] has used a system of three visual landmarks on the ground to control a small 7kg helicopter in hover. Such systems could all be used in theory to hover over the deck of the ship; however, all of these systems suffer from the problem of losing track. For the problem of a moving ship's deck, there is concern that sea spray could obscure parts of a pattern, or that with the combined motion of the ship and helicopter, parts of the pattern could disappear from the field of view at times. For best accuracy, the desire is to have as much of the field of view as possible taken up by the pattern. However, a larger pattern in the field of view results in a greater likelihood of part of the pattern being lost due to relative motion effects. For this reason, we propose using a single light source, or beacon, as the target, which would be centred in the field of view where practicable. Such a target would be the least likely to disappear from view, and the use of a single bright point light source simplifies the tracking problem significantly.
In conjunction with our laser sensor, a single beacon is all that is required to fix the position of the helicopter with respect to the centre of the deck. Whilst the yaw angle of the helicopter is not determined using the combination of a point target and

22nd International UAV Systems Conference, Bristol, April 2007

Systems for Automated Launch and Recovery of an Unmanned Aerial Vehicle from Ships at Sea

Figure 6: Laser Rangefinder (LRF) Flight Test Data. (a) LRF roll angle versus YAS roll angle; (b) LRF pitch attitude versus YAS pitch attitude; (c) LRF deck height estimation versus DGPS; (d) closed-loop height error; (e) height during landing.


laser scanner, we have found the yaw loop very easy to control using a simple PD feedback scheme based on heading angle, and all that is required is a system to tell the helicopter what heading to steer to match the course of the ship. This would only require a slow update, as ships take a long time to change course when underway. To simplify the visual detection problem further, we have been experimenting with a bright LED for the beacon. Rejection of unwanted specular reflections and other lights is possible by choosing a narrow-band filter matched to the spectral output of the beacon, which in turn is ideally chosen where OH- absorption lines are minimised. We have settled on a peak wavelength of 650 nm, which performs better than the near infra-red wavelengths, which would suffer more absorption in this waterborne application. Our beacon tracking system has been designed to be lightweight, self-contained, and low in power consumption. The latter is achieved through the use of a megapixel CMOS image sensor which uses 363 mW, and is helped by locating all the necessary image processing and coordinate determination within a single Xilinx Spartan IIE FPGA. The FPGA interfaces to the flight control system using RS232, and provides extra diagnostics in the form of XGA video imagery to an external monitor or PAL composite video to a 2.4 GHz video transmitter to the ground. The Philips I2C bus is used to configure the sensor on the fly, and the FPGA translates RS232 characters from the host using a data locked loop for character bursts which alter the configuration. These variations may be computed within the FPGA. It in turn delivers the reliable coordinates of the beacon within the image field to the flight computer using RS232. The image sensor used is a monochrome megapixel 1/2" format Micron MT9M001, with 5.2 µm square 10-bit pixels and a published sensitivity of 2.1 V/lux-sec.
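As a sketch of the simple PD heading loop described above, the following shows one plausible form; the gain values and function names are illustrative assumptions, not flight-tested values from the paper.

```python
import math

def wrap_angle(a):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def pd_yaw_command(heading, heading_ref, yaw_rate, kp=1.5, kd=0.4):
    """Simple PD feedback on heading error; damping comes from the
    measured yaw rate. Gains here are illustrative only."""
    error = wrap_angle(heading_ref - heading)
    return kp * error - kd * yaw_rate
```

The reference heading would be updated slowly from the ship's reported course; angle wrapping avoids a spurious large command when the error crosses ±180 degrees.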
The optics used to image onto the sensor define the positioning accuracy and initial capture range of the beacon. We have used a machine-vision CS-mount lens with 6 mm focal length and red narrowband optical filters to improve rejection of the specular reflections expected in the water environment. The CS mount allows for other lenses to achieve a desired FOV and f-number. With this lens, the FOV of the XGA field (which may be moved around the sensor) is 42 degrees, the full 1/2" format sensor sees 52 degrees, and the pixel FOV is 147 arc sec. At an altitude of 10 metres this implies a 7 mm pixel size on the deck, and hence the beacon occupies less than one pixel in unperturbed air for a stable platform at this height, but will obviously look larger due to the movement occurring during the integration time. At one metre above the deck, the 5 mm diameter beacon will occupy a region of approximately 7x7 pixels. We are still undecided whether we will narrow the beam pattern, increasing the output aperture, in which case more of the sensor will be illuminated by the beacon, but our algorithms will work regardless. To enhance performance, the image sensor must be
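The pixel footprint figures above follow directly from the 147 arc sec per-pixel FOV; a small check of the geometry (small-angle approximation, with hypothetical helper names):

```python
import math

PIXEL_IFOV = 147 / 3600 * math.pi / 180  # 147 arc sec per pixel, in radians

def ground_sample(altitude_m):
    """Footprint of one pixel on the deck at a given altitude (small-angle)."""
    return altitude_m * PIXEL_IFOV

def beacon_extent_px(beacon_diameter_m, altitude_m):
    """Approximate width of the beacon image in pixels."""
    return beacon_diameter_m / ground_sample(altitude_m)

print(round(ground_sample(10.0) * 1000, 1))  # ~7.1 mm per pixel at 10 m
print(round(beacon_extent_px(0.005, 1.0)))   # ~7 pixels for a 5 mm beacon at 1 m
```

Both values agree with the 7 mm pixel size and 7x7 pixel beacon region quoted in the text.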

controlled in exposure and frame rate, either remotely over the RS232 link or automatically with a simple algorithm within the FPGA. The exposure can be adjusted in steps of 25.4 µs for the fastest possible pixel clock, up to the full frame rate. To determine this we have decided on a radiation pattern that approximates cos^4(θ). The beacon will appear 42% dimmer at the corners of the capture zone, viewed from a level platform, than when it is landed upon, ignoring atmospheric attenuation (0.003 cm^-1 at 650 nm). When the landing platform is tilting there will be further reductions, so we have designed in an extra angular variation of 20% between these two, bringing the light to 10% of its maximum. This, and a further atmospheric attenuation to 10%, are easily accommodated in the 10-bit pixel range of the sensor for a fixed exposure time. The image sensor has anti-blooming capability so can be overexposed without detriment, and this can be augmented by changing the electronic rolling exposure. Under reasonable conditions the exposure is optimal at a single row time for an f1.4 aperture with a 500 mW beacon. The 10-bit pixel data is delivered to the adjacent FPGA, upon which the first of two pipelined algorithms is run. This algorithm monitors the live pixel stream to locate adjacent pixels of brightness above a threshold, and determines whether to consider these as contenders for the beacon by checking for the existence of similar-brightness pixels on the following lines. In this manner, areas are approved if the bright pixels have another on one adjacent edge. As an example, in Figure 8 the algorithm has found four regions (1, 2, 3 and 4) that have sufficient brightness and meet the requirement for adjoining pixels to be considered part of the tracked light. For regions 5 and 6, the connection between the pixels is not good enough to create a single region.
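A software sketch of this adjacency-based region detection follows; it is a simplified model of the FPGA pipeline (function and variable names are our own, and the corner case of two regions merging into one is deliberately not handled):

```python
def detect_beacon(frame, threshold):
    """Single-pass bright-region detector, sketching the streaming pipeline.

    frame: list of rows of pixel values. Runs of above-threshold pixels are
    merged with overlapping runs on the previous row, growing a bounding box
    per region; the centre of the largest region is returned, or None.
    """
    regions = []  # each region: [left, top, right, bottom]
    prev = []     # (left, right, region_index) runs on the previous row
    for y, row in enumerate(frame):
        cur, x = [], 0
        while x < len(row):
            if row[x] >= threshold:
                left = x
                while x < len(row) and row[x] >= threshold:
                    x += 1
                right = x - 1
                # merge with the first overlapping run on the previous row
                idx = None
                for pl, pr, pi in prev:
                    if pl <= right and pr >= left:
                        idx = pi
                        break
                if idx is None:
                    idx = len(regions)
                    regions.append([left, y, right, y])
                else:
                    r = regions[idx]
                    r[0], r[2], r[3] = min(r[0], left), max(r[2], right), y
                cur.append((left, right, idx))
            else:
                x += 1
        prev = cur
    if not regions:
        return None
    # pick the largest-area bounding box and return its centre
    l, t, r, b = max(regions, key=lambda r: (r[2] - r[0] + 1) * (r[3] - r[1] + 1))
    return ((l + r) / 2, (t + b) / 2)
```

As in the hardware, only the previous row's runs need to be remembered (the single-line-depth FIFO), and each region is summarised by its bounding box rather than a full pixel mask.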
As the determination of the area of each region must be made as pixels are rastered from the sensor, the left-most column and run length within that row are remembered for each of a finite number of regions, as is the first encountered row (top). As rows are analysed, the boundaries of each region are moved outwards to the farthest contiguous breadth and height, so that the left, top, right and bottom bounds encompass each region contender in the frame. From these the rectangular area is obtained, and the horizontal and vertical centres are easily determined. A single-line-depth FIFO is all that is required for the vertical comparison. The threshold is updated at the end of each frame to a fixed percentage of the peak pixel value within that frame, which is remembered during the streaming, and is applied during the following frame on the assumption that there will be little interframe change. The computed centres of the regions are analysed when the rectangular extremity is located, with the centres and areas stored in an asynchronous FIFO that is reset at the end of the frame. The largest-area spot is chosen for each frame. Its centre is forwarded to the second stage of the pipeline, which runs a temporal filter of 4 to 8 frames in length to void those areas that are


Figure 7: The beacon tracking sensor combines a megapixel CMOS sensor and Xilinx FPGA for a low profile, lightweight, and economical unit.


Figure 8: The tracking algorithm determines bright regions from the live pixel stream using adjacency rules.

not stationary to within a 10 by 10 pixel location over this time. At 50 frames per second, this is a period of 80-160 ms, which is short enough to miss atmosphere-induced scintillation effects and motion caused in the image field by aircraft and landing platform alike. By reducing the region of interest it is possible to increase this frame rate and hence limit motion susceptibility, particularly once the beacon is locked in. The viable centre of the beacon is then forwarded to the flight computer over RS232. Should no reliable beacon be determined from this stage, no coordinate is sent in that frame time. Finally, for diagnostics, a red cross hair is superimposed on the video output at the chosen centre, and a green square overlay indicates a locked beacon. An indication of this imagery is shown in Figure 9. During this graphical diagnosis, the frame rate of 50 Hz and the used region of the sensor conform to the current XGA graphics requirements. All the processing is VHDL code and schematic based within the FPGA, and has been developed in Altium Designer 6. Tests in a variety of light levels within the laboratory have shown visually apparent robustness, even in the presence of strobing between fluorescent lights and the frame rate. We have been able to track the beacon from one hundred metres away in bright sunlight.
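The temporal lock stage described above can be sketched as follows; the class name and default parameters are illustrative assumptions, with the 4-8 frame depth and 10-pixel window taken from the text.

```python
from collections import deque

class BeaconLockFilter:
    """Reject candidate centres that do not stay within a small window
    over the last few frames (a sketch of the second pipeline stage)."""

    def __init__(self, frames=6, window=10):
        self.history = deque(maxlen=frames)  # recent candidate centres
        self.window = window                 # allowed drift, in pixels

    def update(self, centre):
        """Feed one frame's candidate centre, or None if no spot was found.
        Returns the centre once locked, else None (no coordinate sent)."""
        self.history.append(centre)
        if len(self.history) < self.history.maxlen or None in self.history:
            return None
        xs = [c[0] for c in self.history]
        ys = [c[1] for c in self.history]
        if max(xs) - min(xs) <= self.window and max(ys) - min(ys) <= self.window:
            return self.history[-1]
        return None
```

A frame with no detected spot simply breaks the lock, matching the behaviour where no coordinate is sent for that frame time.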

5 Ship-motion Prediction

Ship-motion estimation and prediction is of interest for autonomous landing of aircraft and fire control systems [23, 24]. In our specific application, we would like to estimate the ship motion so that the UAV can track the mean position of the target landing spot rather than trying to match the ship's motion throughout the approach. By maintaining an estimate of the ship's motion, we can subtract the estimate from the measured relative position to obtain the mean deck trajectory. Predicting the deck motion is also very useful, as it allows us to build the future position of the ship into our trajectory planning algorithm. We will discuss progress in developing a ship-motion prediction technique for this purpose. Prediction using a Kalman filtering technique is presented in [24], where ship-motion along each degree of freedom is modelled as a second-order system with Gaussian noise as its input. The power spectral density functions of ship-motion are obtained from past measurements. In [4], a second-order system is again used as a model for ship-motion, but instead of Gaussian noise, a sum of sinusoids is used as the input. The input is estimated by first using a Kalman filter to estimate the system states and then "inverting" the system. This estimate then updates the parameters of the input sinusoid function. The input function is then applied to the second-order dynamics to predict ship-motion. In [23], ship-motion is modelled with a single


Figure 9: Typical diagnosis of the beacon tracking is achieved on a computer monitor with graphical overlay, all generated from the FPGA. The two images shown here are for an overexposed image with no filter (left) and a correctly exposed image (right).

sinusoid, and a recursive algorithm is used to continuously update the sinusoid's amplitude, frequency, and phase. Once the complete sinusoid describing the motion is obtained, it is used to make the prediction. A step-ahead auto-regressive moving average (ARMA) predictor, with the model order selected using a multi-level recursive method, is discussed in [16]; it uses time-series analysis techniques. Here we consider the problem of predicting ship-motion 2-5 seconds ahead, using the measured motion data as a time series. To get an idea of the type of predictor needed, we have started using simulated ship-motion data produced by the FREYDYN 8.0 software package for an 8,500 ton LPA-class amphibious platform. There are two plots of this data in Figure 10; one plot is the actual ship-pitch and the other is the same ship-pitch shifted to the left by 5 seconds. We need a predictor which will take the measured pitch data and output the left-shifted plot. This is the prediction problem we need to solve, i.e., a scheme for looking at the measured data plot and outputting the left-shifted plot. One option is to fit a sum of sinusoids to the measured data and use that function to predict future values. From the plots in Figure 10 we see that although the frequency of the sinusoids is relatively constant, the magnitude changes by a significant factor, and this makes it extremely difficult to fit sinusoidal functions to the data. We have tried this approach but did not have much success with it. A look at Figure 10 tells us that the input and output are shifted in phase: the output leads the input by about 30 degrees.
We know that dynamical systems can provide such a phase lead; a good example is an RC circuit. With that motivation we choose a second-order system as our predictor:

G(z) = Y(z)/U(z) = (b_0 + b_1 z^{-1} + b_2 z^{-2}) / (1 + a_1 z^{-1} + a_2 z^{-2})    (9)

We use the recursive least squares (RLS) method to obtain the parameters of the predictor G(z). The predictor parameter and input vectors can be written as:

θ = [b_0, b_1, b_2, -a_1, -a_2]^T,
φ(t) = [u(t), u(t-1), u(t-2), y(t-1), y(t-2)]^T.

With these definitions the predictor dynamic model in (9) can be given as:

y(t) = θ^T φ(t) + v(t)    (10)

where y(t) is the predicted output, φ(t) is the input vector, θ are the model parameters, and v(t) is a disturbance. The RLS minimises

(1/N) Σ_{t=1}^{N} α_t [ y(t) - θ^T φ(t) ]^2

where {α_t} is a sequence of positive numbers giving weightings to the observations. From [15] the RLS algorithm is:

θ̂(t) = θ̂(t-1) + L(t) [ y(t) - θ̂^T(t-1) φ(t) ]    (11)

L(t) = P(t-1) φ(t) / ( 1/α_t + φ^T(t) P(t-1) φ(t) )    (12)

P(t) = P(t-1) - P(t-1) φ(t) φ^T(t) P(t-1) / ( 1/α_t + φ^T(t) P(t-1) φ(t) )    (13)

Figure 11 compares the actual ship-pitch and the predicted value using the RLS algorithm in (11)–(13). The predictor is trained using the first thousand samples. Figure 11 also shows the predicted mean position obtained by averaging the predicted pitch values. The Bode plot of the predictor is shown in Figure 12. From the plot it can be seen that the predictor is essentially a phase lead system.
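As an illustration of how the RLS recursion in (11)-(13) can be implemented, the following trains the second-order predictor on a synthetic pitch series; the FREYDYN data is not reproduced here, so the sinusoid, sample counts, initial covariance, and uniform weighting α_t = 1 are illustrative assumptions.

```python
import math

def rls_step(theta, P, phi, y, alpha=1.0):
    """One recursive least squares update, following eqs (11)-(13).
    theta: parameter estimate, P: covariance matrix, phi: regressor, y: target."""
    n = len(phi)
    Pphi = [sum(P[i][j] * phi[j] for j in range(n)) for i in range(n)]  # P(t-1)phi(t)
    denom = 1.0 / alpha + sum(phi[i] * Pphi[i] for i in range(n))       # shared denominator
    L = [p / denom for p in Pphi]                                       # gain, eq (12)
    err = y - sum(th * p for th, p in zip(theta, phi))                  # innovation, eq (11)
    theta = [th + l * err for th, l in zip(theta, L)]
    P = [[P[i][j] - Pphi[i] * Pphi[j] / denom for j in range(n)]        # eq (13)
         for i in range(n)]
    return theta, P

# Train on a synthetic "pitch" series: the target is the same series shifted
# ahead by 20 samples (5 s at the 0.25 s sample interval of Figure 10).
n_samples, shift = 1200, 20
u = [0.1 * math.sin(0.08 * k) for k in range(n_samples + shift + 1)]
theta = [0.0] * 5
P = [[1e3 if i == j else 0.0 for j in range(5)] for i in range(5)]
for t in range(2, n_samples):
    phi = [u[t], u[t - 1], u[t - 2], u[t - 1 + shift], u[t - 2 + shift]]
    theta, P = rls_step(theta, P, phi, u[t + shift])
```

On this noiseless series the recursion converges after a few hundred samples, after which the one-step prediction error is negligible; real ship-motion data would of course leave a residual.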

6 Deck Recovery System

A number of systems for securing a manned helicopter to the deck of a ship have been developed since the


Figure 10: Predictor input-output (dashed: input); pitch angle versus samples, sample interval 0.25 s.

Figure 11: Predicted pitch versus actual pitch (dashed: actual) and predicted mean pitch; pitch angle versus samples.

Figure 12: Bode plot of the predictor (magnitude in dB and phase in degrees versus frequency in rad/sec).


1960s. For example, Indal Technologies has developed the Aircraft/Ship Integrated Secure and Traverse (ASIST) shipboard helicopter recovery system [2]. The ASIST system secures manned helicopters to the deck of the ship immediately on landing and expands the ability of the helicopter to operate in rough weather without toppling and sliding. Such systems are mechanically complex, comprising deck rails, winches, hydraulic systems, deck-mounted tracking cameras and control hardware, and require a special retractable probe and winch system on the helicopter. For a small UAV, such a large system is completely impractical. The 1400 kg Firescout uses a deck restraint called the Light Harpoon, designed by DCN International [11]. The system uses a passive grid on the deck with a belly-mounted probe (or harpoon). A mechanical locking system in the head of the harpoon automatically locks when the aircraft lands and is released by the operator just before takeoff. The 1400 kg Eagle Eye can be equipped with a deck arrest system which is electronically activated upon touchdown [10]. The system comprises a hook which engages with a NATO-standard grid [20]. The hook drops down on an extendable arm from the bottom of the UAV and retracts for flight. For ship landings, the 190 kg Canadair CL-227 Sea Sentinel [6] VTOL UAV used another system, consisting of a probe mounted on each foot of the landing undercarriage and an electrically operated latch on each probe that would engage with the deck grid on landing to stop the probes from pulling out of the grid. Once again, these systems involve electrically actuated components that are too complex and heavy for use on a small UAV which could weigh less than 100 kg. Our prototyped system is far simpler and more compact by comparison, and consists of four spring-loaded probes which engage with a deck grid.
Each probe is located in a cylindrical housing welded to the top of a circular landing foot of 10 cm diameter. The landing feet are attached to the ends of the port and starboard undercarriage legs. When landing on a normal flat landing area such as tarmac, the weight of the helicopter overcomes the spring tension and pushes the probes inside their housings. When landing on a deck grid with holes in it, the probes push through the holes and prevent sideways motion of the helicopter. On launch, the probes simply pull out of the holes. Two different grid arrangements were tested: a grid with square holes, and a grid with overlapping countersunk circular holes forming a hexagonal pattern. The grid spacing between the centres of the holes in both cases was 25 mm. A diagram and photo of one of the probes is shown in Figure 13. Before use on the helicopter, static topple testing was completed using a gimballed test rig which allowed the grid to be rolled about any axis. The pitch and roll topple characteristics were tested under a variety of take-off weights using both the original skid undercarriage and the modified landing gear fitted with probes. A mock-up of the RMAX helicopter was

constructed using a spare undercarriage and a wooden frame. Weights were added at points on the frame to replicate the RMAX take-off weight and vertical center of gravity location. Under static testing with the test rig, we found that the original landing skids could withstand a pitch of 13◦ and a roll of 12◦ before they would start to slide, using the minimum and maximum take off weights respectively. When the landing gear was changed to the four-probe design, the maximum permissible pitch and roll limits increased to 33◦ and 39◦ respectively. The latter limits were due to the probes starting to pull out of the grid. For flight tests, an indicator pin was installed on each landing probe to give a visual indication of the position of the probe and whether locked to the grid or not. Each indicator pin was also coupled to a microswitch, which activates a high intensity LED mounted on the rear of the helicopter to provide a convenient indication of the probe status to the operator. We noted that the spring loaded probes tended to align themselves with the holes in the grid upon landing. Occasionally one of the four probes did not lock down fully, but usually the vibration of the helicopter was sufficient to move the probe into a gap. In any case, two probes locking into the grid is enough to stop the helicopter from sliding. A series of 50 landings both parallel to and at 45◦ to the grid was completed. Of these landings, 40 resulted in all 4 probes engaging with the grid, 5 resulted in 3 probes engaging and 5 landings resulted in 2 probes engaging. Because of the natural spring of the undercarriage and the weight of the helicopter, the probes push out against the outboard sides of the holes in the grid and are held in place by friction. This helps to prevent toppling of the helicopter as well as sliding. 
However, on a couple of occasions the probes did cling to the grid on take-off due to this effect, and this created an asymmetric pull on the helicopter during the launch. We believe this problem could be fixed with a slightly greater grid spacing and a slight redesign of the probe tip shape. This would also help to increase the percentage of landings with all four probes engaging.
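The measured slide and topple angles can be cross-checked against simple static limits; the friction coefficient, skid half-track, and CG height below are illustrative guesses, since the paper does not give these dimensions.

```python
import math

def slide_angle_deg(mu):
    """Deck angle at which an unrestrained skid starts to slide
    (simple Coulomb friction model: tan(theta) = mu)."""
    return math.degrees(math.atan(mu))

def topple_angle_deg(half_track_m, cg_height_m):
    """Deck angle at which the helicopter topples about the downhill
    skid: tan(theta) = half-track / CG height."""
    return math.degrees(math.atan(half_track_m / cg_height_m))

# Illustrative numbers only; the RMAX track and CG height are assumptions.
print(round(slide_angle_deg(0.22), 1))         # ~12.4 deg, of the order of the 12-13 deg measured
print(round(topple_angle_deg(0.35, 0.55), 1))  # ~32.5 deg
```

Under these assumed dimensions, sliding limits the unrestrained skids well before toppling would, which is consistent with the probes raising the usable deck angle so markedly.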

7 Moving Deck Simulator

In order to validate the ability of the RMAX helicopter to land on a moving deck, a prototype deck mock-up capable of simultaneous heave, pitch and roll is under construction (Figure 14). The system is driven by three electrical linear actuators, each capable of delivering up to 3000 N. The actuators are orientated vertically at the corners of an equilateral triangle. At the centre of the deck is a universal joint which permits pitch and roll but not yaw. The lower end of the universal joint is attached to an I-beam which can move vertically. The I-beam is guided by a series of nylon rollers which prevent it from yawing or translating horizontally. The weight of the deck and other fixtures is counterbalanced with steel weights, so that actuator thrust is required only to accelerate


Figure 13: Deck Probe and Grid System. Probe engaged with square grid (left), cross section of probe (centre), alternate hexagonal grid (right).

the structure and not to counteract gravity. A deck of 3 m x 3 m will be used for our future experiments. We have simulated the motion of the moving deck in the Recurdyn dynamics simulation package and have shown that the following minimum specifications can be achieved without overstressing the structure or exceeding the limits of the actuators:

1. Heave of 1 m;

2. Roll and pitch of 25°; and

3. A period of 7 seconds or greater.

A spin-off application of the moving deck simulator is as a means to compensate the deck of an actual ship for ship motion. If the three-actuator controller were coupled to an inertial attitude reference, it would be possible to cancel out most of the deck motion of a vessel, and hence enable a significant expansion of the launch and recovery envelope of the helicopter. Whilst this approach is not practical for manned helicopters and larger VTOL UAVs, it is quite achievable with a small UAV like the RMAX. We have been able to purchase all of the components and materials for construction of the moving deck whilst staying within a project budget of US $30,000.

8 Conclusions and Future Work

We have described systems for enabling a rotary-wing UAV to launch from and land on a moving ship's deck at sea. These systems are smaller and less costly to produce than existing systems and will allow much smaller UAVs and ships to be integrated with each other. In future work, we will use the combined image sensor and laser rangefinder to perform an autonomous landing onto a moving deck platform.

9 Acknowledgements

This work has been supported by an Australian Research Council Linkage project grant in conjunction with industry funding from UAV Australia Pty Ltd. We would like to thank Julien Haven, Fabien Moitrier and Clement Farabet from INSA (Lyon), who have contributed to developing the FPGA hardware for the image tracking sensor. We would like to thank the Australian Defence Science and Technology Organisation for providing the LPA ship motion data used for testing our ship motion prediction algorithm.

References

[1] Relative moving baseline software RMBS. Technical report, NovAtel Inc. www.novatel.com/products/waypoint techreports.htm.

[2] A.R. Feldman and R.G. Langlois. Autonomous straightening and traversing of shipboard helicopters. Journal of Field Robotics, 23(2):123–139, 2006.

[3] B. Pervan, F. Chan, D. Gebre-Egziabher, S. Pullen, P. Enge, and G. Colby. Performance analysis of carrier-phase DGPS navigation for shipboard landing of aircraft. Journal of the Institute of Navigation, 50(3), 2003.

[4] Jai Chul Chung, Zeungnam Bien, and Young Seog Kim. A note on ship-motion prediction based on wave-excitation input estimation. IEEE Journal of Oceanic Engineering, 15(3):244–250, July 1990.

[5] D. Belton, S. Butcher, G. Ffoulkes-Jones, and J. Blanda. Helicopter recovery to a moving platform using a GPS relative positioning system. In Proceedings of the 12th International Technical Meeting of the Satellite Division of the Institute of Navigation, pages 1769–1776, Nashville, 14-17 Sep 1999.

[6] W.H. Eilerston. Remotely piloted vehicle/vertical attitude take-off and landing demonstration vehicle. In Proceedings of the 31st Annual National Forum of the American Helicopter Society, Washington, DC, May 1975.

[7] E.N. Johnson and D.P. Schrage. The Georgia Tech unmanned aerial research vehicle: GTMax. In Proceedings of the AIAA Guidance, Navigation, and Control Conference, Austin, Texas, Aug. 11-14 2003.

[8] US Air Force. Joint Precision Approach and Landing System (JPALS) operational requirements document. USAF 002-94-1.


Figure 14: Moving deck demonstrator under construction (left). Dynamic computer simulation of moving deck demonstrator (right).

[9] K.L. Gold and A.K. Brown. A hybrid integrity solution for precision landing and guidance. In Position Location and Navigation Symposium, pages 165–174. IEEE, 26-29 April 2004.

[10] Bell Helicopter. Bell Eagle Eye UAV Pocket Guide. http://www.bellhelicopter.textron.com/en/aircraft/military/pdf/EagleEye PG 05 web.pdf.

[11] Kell-Strom Tool Co. Inc. A Safer Landing and Takeoff with the Light Helicopter Landing Aid. http://www.kell-strom.com/tools/dcn/dcnpg03.htm.

[12] J. Charles. CMU's autonomous helicopter explores new territory. Intelligent Systems and Their Applications, 13(5):85–87, 1998.

[13] K. Gold and A. Brown. An array of digital antenna elements for mitigation of multipath for carrier landings. In Proceedings of the ION 2005 National Technical Meeting, pages 26–28, San Diego, CA, Jan 2005.

[14] H.J. Kim, D.H. Shim, and S. Sastry. Flying robots: Modeling, control, and decision making. In International Conference on Robotics and Automation, May 2002.

[15] Lennart Ljung and Torsten Söderström. Theory and Practice of Recursive Identification. The MIT Press, Cambridge, Massachusetts, 1987.

[16] Jie Ma, Teng Li, and Guobin Li. Comparison of representative methods for time series prediction. In Proceedings of the 2006 IEEE International Conference on Mechatronics and Automation, pages 2448–2453, Luoyang, China, June 2006.

[17] M.A. Garratt and J.S. Chahl. Visual control of a rotary wing UAV. In UAV Asia-Pacific Conference. Flight International, Feb 2003.

[18] P.Y. Montgomery, D.G. Lawrence, K.R. Zimmerman, H.S. Cobb, G.M. Gutt, C.E. Cohen, and B.W. Parkinson. UAV application of IBLS for autonomous takeoff and landing. In Proceedings of the 12th International Technical Meeting of the Satellite Division of the Institute of Navigation, pages 1541–1547, 1999.

[19] M. Whalley, M. Freed, M. Takahashi, D. Christian, A. Patterson-Hine, G. Schulein, and R. Harris. The NASA/Army autonomous rotorcraft project. In Proceedings of the American Helicopter Society 59th Annual Forum, 2003.

[20] NATO. Ship Helicopter Harpoon/Grid Rapid Securing System, STANAG 1276 HOS edition.

[21] O.A. Yakimenko, I.I. Kaminer, W.J. Lentz, and P.A. Ghyzel. Unmanned aircraft navigation for shipboard landing using infrared vision. IEEE Transactions on Aerospace and Electronic Systems, 38(4):1181–1200, Oct 2002.

[22] P. Doherty. Advanced research with autonomous unmanned aerial vehicles. In Proceedings of the 9th International Conference on Knowledge Representation and Reasoning, 2004.

[23] W.S. Ra and I.H. Whang. Real-time long-term prediction of ship motion for fire control applications. Electronics Letters, 42(18), August 2006.

[24] Mehahen M. Sidar and Brian F. Doolin. On the feasibility of real-time prediction of aircraft carrier motion at sea. IEEE Transactions on Automatic Control, 28(3):350–356, March 1983.

[25] Sierra Nevada Corporation. Automatic Recovery System. http://www.sncorp.com/prod/atc/uav/default.shtml.

[26] S. Saripalli, J.F. Montgomery, and G.S. Sukhatme. Visually guided landing of an unmanned aerial vehicle. IEEE Transactions on Robotics and Automation, 19(3):371–380, 2003.

[27] T. Ford, M. Hardesty, and M. Bobye. Helicopter Ship Board Landing System. Technical report, NovAtel Inc., 2004? http://www.novatel.com/products/alltechpapers.htm.
