Rapid Flight Test Prototyping System and the Fleet of UAVs and MAVs at the Naval Postgraduate School

Isaac I. Kaminer,* Oleg A. Yakimenko,† Vladimir N. Dobrokhodov,‡ and Kevin D. Jones§
Naval Postgraduate School, Monterey, CA 93943

This paper describes the development and application of a rapid prototyping system for flight testing of autonomous flight algorithms for unmanned air vehicles (UAVs) at the Naval Postgraduate School. The system provides a small team with the ability to rapidly prototype new theoretical concepts and flight-test their performance in realistic mission scenarios. The original development was done in the MATRIXx Xmath/SystemBuild environment almost a decade ago. Currently, the system has been converted to the MathWorks MATLAB/Simulink development environment. The fleet of UAVs currently available at the NPS ranges from fixed-wing airplanes with wingspans of up to 3.5m down to flapping-wing micro UAVs (MAVs) with wingspans as small as 23cm. The paper describes the hardware and software tools currently adopted for the system and briefly discusses the variety of projects it has supported, including path following algorithms, voice control, vision-based navigation for shipboard landing, autoland system development, etc.

I. Introduction

The past two decades have witnessed a dramatic increase in the utilization of unmanned air vehicles (UAVs) by the armed forces, both in the US and abroad. More recently, many researchers in the academic community have realized the usefulness of UAVs as both teaching and research tools. The development of UAVs and their flight control systems requires addressing a number of engineering problems across a wide range of issues that include weight and energy restrictions, portability, risk factors, electronic interference, vibrations, and manpower, to name but a few. Furthermore, the testing of new algorithms, sensor packages, and vehicles is truly a multi-disciplinary effort that borrows from many branches of the engineering sciences, including aeronautical, electrical, and computer engineering. This effort is costly and time consuming, and has the potential for catastrophic failure. When successfully done, however, it provides developmental information, insight, and field data that are unavailable from other sources. Since all theoretical and numerical results must be verified by some form of experiment, flight-testing is clearly the best way to achieve those goals. Motivated by these considerations, and as a contribution towards the development of a versatile set-up for advanced UAV system design and testing, the UAV lab at the Naval Postgraduate School (NPS) developed a so-called Rapid Flight Test Prototyping System (RFTPS) for a prototype UAV. This paper briefly describes the RFTPS, including the available UAVs, and explains how it is being used as a rapid proof-of-concept tool for testing new guidance, navigation, and control algorithms for different applications. The paper starts with a general discussion of the RFTPS, including the main motivation behind its development, system capabilities, hardware description, and the fleet of vehicles currently employed at the NPS for different research and development projects.
The second part focuses on the RFTPS capabilities that were demonstrated on several occasions when new guidance, navigation, and control (GNC) algorithms were taken from theoretical development to flight test in a very short time.

* Associate Professor, Dept. of Mechanical and Astronautical Engineering, Code MAE/Ka, Senior Member AIAA.
† Research Associate Professor, Dept. of Mechanical and Astronautical Engineering, Code MAE/Yk, Associate Fellow AIAA.
‡ Research Associate Professor, Dept. of Mechanical and Astronautical Engineering, Code MAE/Dv, Member AIAA.
§ Research Associate Professor, Dept. of Mechanical and Astronautical Engineering, Code MAE/Jo, Senior Member AIAA.

1 American Institute of Aeronautics and Astronautics

II. Hardware Description

The RFTPS consists of a test-bed unmanned air vehicle equipped with an avionics suite necessary for autonomous flight, and a ground station responsible for flight control of the UAV and flight data collection. A functional block diagram of the original RFTPS setup is shown in Fig.1.1,2

Figure 1. RFTPS hardware architecture.

The key decision when designing the RFTPS was to use off-the-shelf technology as much as possible, thus exploiting the economy of scale of a number of commercial industries. Furthermore, since the UAV development program is to span many years and to draw on the talents of NPS students in the future, the RFTPS emphasizes high-level algorithm design. Low-level code and device driver generation is therefore kept to a minimum, the vast majority of the code "writing" being done via autocode tools. The system architecture is open, providing the ability to add, remove, or change real-time input/output. Computational power can be increased as mission requirements dictate. The telemetry links are secure, yet low power and unobtrusive to the public, thus dispensing with the need for special authorizations from government authorities. The onboard components are lightweight and low power, allowing for the inclusion of additional payload.

A. Components

The RFTPS setup consists of four main parts:
- avionics control system located onboard the UAV,
- ground station,
- pilot manual control, and
- operator interface.
These four elements provide a reliable way to fly the UAV and enable the end user to program desired routes for the UAV via waypoints. For takeoff and landing of the UAV a pilot-in-the-loop mode is enabled, in which the ground station only retransmits the signal generated by the pilot's manual control. Since the RFTPS was first introduced a decade ago, the actual components of the system have been continuously upgraded to accommodate current technologies.
Some details on the original and subsequent, upgraded setups can be found in Refs.1-4. Moreover, speaking of onboard avionics, different UAVs available at the NPS have slightly different sets of sensors and/or autopilots. The following describes one of the latest setups, employing Cloud Cap Technology, Inc. (www.cloudcaptech.com) hardware, which is currently used for several projects. The Piccolo avionics, shown in Fig.2, is an autopilot designed to track the commanded path transmitted by the ground station. It generates the required signals for the control surfaces of the airplane (ailerons, elevator, and rudder) and for the engine's thrust. The main processor is an MPC555 microcontroller based on the PowerPC architecture, delivering 40MHz operation including hardware floating point. The sensor group of the system includes: three rate gyros and three accelerometers used to determine the UAV's attitude; a Motorola G12 global positioning system (GPS) receiver to determine its geodetic position; and a set of dynamic and static pressure sensors coupled with a thermometer to determine the airplane's true airspeed and altitude. The data link to the ground station is provided by a 900MHz/2.4GHz radio modem (as opposed to the PWM link of the original RFTPS setup1,2 shown in Fig.1). All the data transmitted is wrapped in a custom-developed two-layer serial protocol.

Figure 2. Piccolo avionics.

The ground station for the current setup is actually composed of two


ground stations: the Piccolo ground station, which is part of the complete Piccolo system, and the NPS ground station, where all the NPS GNC algorithms developed in MATLAB/Simulink are executed. The Piccolo ground station has two very important roles in the system: first, to provide the communication link between the avionics, the operator interface, and the pilot manual control; second, to convert the intentions of the end user, captured through the operator interface, into meaningful commands for the autopilot. The main processor, as in the avionics, is an MPC555 microcontroller based on the PowerPC architecture; it is equipped with an SMB connector for the GPS antenna and a BNC connector for the UHF communications antenna. The connection to the operator interface is done through a standard 9-pin serial cable. External communication interfaces of the NPS ground station, low-level communication protocols, and device drivers for the sensors and actuators were developed in ANSI C and implemented as Level-2 S-Functions inside the Simulink models running under the MathWorks xPC Target. The Real-Time Workshop tools were used to generate, compile, and download the control algorithm into the target computer running the real-time operating kernel. This technique offers the convenient capability to perform hardware-in-the-loop (HITL) simulation under different scenarios in order to prove the feasibility of the control schemes proposed in the projects prior to the actual flight. The central role in the entire hardware environment, in terms of GNC algorithm development and execution, is carried by the NPS ground station, a real-time control computer (target PC). It is a Pentium® MMX-based PC104 form-factor computer manufactured by WinSystems Inc., based on a 266MHz processor chip. It contains a 256Mb disk-on-chip memory drive; an Ethernet controller; four independent, full-duplex RS-232 serial ports; a USB port; and a PC/104 expansion slot for extension of the system capability.
The MS-DOS compatible operating system runs the MathWorks xPC Target kernel. Real-Time Workshop provides automatic code generation and seamless implementation of the models developed in Simulink to run a control algorithm in real time on the target PC. Communication with the target PC during program execution is based on the RS-232 and UDP protocols supported by the PC104 board. The Piccolo operator interface consists of a portable computer running the Windows 2000 operating system and custom-developed software that allows the end user to configure and operate the Piccolo system. The operator interface software has nine configuration screens that provide the end user with an interface to:
- monitor the telemetry data from all sensors aboard the UAV and Piccolo's ground station;
- choose among several control modes (manual, autonomous with fixed control commands, fully autonomous);
- program and display a desired route for the UAV via waypoints;
- display the current position of the UAV on a geographical chart;
- perform a preflight checklist;
- calibrate control surfaces;
- impose limits on the control inputs and the UAV's states;
- interactively adjust the desired gains of the control system in the autopilot.
The pilot manual control is mainly used for takeoff and landing, as well as for tuning the gains of the autopilot. It provides the end user with a way to override the commands generated by the ground station and allows a qualified UAV pilot to take control of the UAV. It is a commercial Futaba® radio remote control typically used for mid-size hobby radio control (RC) airplanes.

B. Fleet of Available UAVs and MAVs

The NPS UAV program currently has three mid-size UAVs, one small UAV, and several micro UAVs (MAVs) in its repertoire. The first UAV adopted into the program was a variant of the fiber-optic guided-remote (FOG-R) or FROG UAV, shown in Fig.3.
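All telemetry in the current setup travels over a custom-developed two-layer serial protocol (Section II.A). Cloud Cap's actual protocol is proprietary and not documented here; purely to illustrate the idea, the sketch below (in Python rather than the ANSI C used in the RFTPS, with entirely hypothetical field layouts) shows the kind of outer framing layer — sync marker, stream id, length, and CRC — such a link needs to survive a noisy radio modem.

```python
import struct
import zlib

SYNC = b"\xA5\x5A"  # hypothetical frame-start marker, not the real protocol's

def wrap_frame(stream_id: int, payload: bytes) -> bytes:
    """Outer layer: sync, stream id, length, payload, CRC32 over id+len+payload."""
    header = struct.pack("<BH", stream_id, len(payload))
    crc = struct.pack("<I", zlib.crc32(header + payload) & 0xFFFFFFFF)
    return SYNC + header + payload + crc

def unwrap_frame(frame: bytes):
    """Validate and strip the outer layer; returns (stream_id, payload) or None."""
    if not frame.startswith(SYNC) or len(frame) < 9:
        return None
    stream_id, length = struct.unpack_from("<BH", frame, 2)
    payload = frame[5:5 + length]
    (crc,) = struct.unpack_from("<I", frame, 5 + length)
    if zlib.crc32(frame[2:5 + length]) & 0xFFFFFFFF != crc:
        return None  # frame corrupted on the radio link
    return stream_id, payload
```

The inner layer (message types, payload contents) would sit inside `payload`; the CRC check is what lets the ground station silently drop frames mangled by RF interference.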
Figure 3. FOG-R UAV.

The FROG was a military-grade production UAV with a 3.2m wingspan, a 41kg takeoff weight (roughly 11kg of it payload), and about one hour of endurance. While the original FOG-R was piloted via a fiber-optic line dispensed from a reel in the aft fuselage, the UAV lab's FROG was controlled with commercial off-the-shelf (COTS) hobby-industry RC equipment, with 5-channel operation: ailerons, elevator, throttle, rudder, and flaps. The trainer functionality of the RC equipment was exploited to allow one or more channels to be controlled by a slave transmitter or a ground-based autopilot system. Thus, an exogenous source (the RFTPS) could be given control of one, some, or all of the control actuators


of the aircraft using the same RC link currently controlling the aircraft. This first UAV was also equipped with the BTA autopilot. The sensor suite onboard the air vehicle consisted of an inertial measurement unit (IMU), a differential GPS (DGPS) receiver, elevator, aileron, and rudder actuator position sensors, and angle-of-attack, sideslip-angle, pitot-static, and static-pressure air data sensors. The IMU included a three-axis rate gyro, a three-axis accelerometer, a magnetic heading indicator, and a two-axis pendulum that measured the vehicle angular rates, accelerations, and attitude, respectively. The sensor data were processed by a navigation filter inside the IMU. The IMU also provided a four-channel analog-to-digital converter which was used to capture data from any four of the following sensors: elevator, aileron, or rudder actuator position, angle of attack, sideslip angle, dynamic pressure, or static pressure. When two other FROG UAVs became available they were also equipped with similar but more accurate sensor suites. One of them had a Humphrey vertical gyro, a Trimble DGPS, and 8-12 micron infrared (IR) and visible-spectrum video cameras installed. More recently the NPS UAV lab acquired the FOG-R replacement, the Tern UAV (by BAI Aerosystems, Inc.), shown in Fig.4. The Tern UAV has a similar build, but is somewhat larger, with a 3.5m wingspan, a 59kg maximum takeoff weight, and four-hour endurance. The Tern has been fitted with a Piccolo autopilot, but due to the high weight of the UAV airframe and under-powered flight performance, as well as a high cost, it has fallen out of favor.

Figure 4. Tern UAV with Piccolo autopilot.

To keep airframe cost and production times manageable, the remaining aircraft in the NPS fleet are modified COTS hobby aircraft. The first in this series is the Tiger 60, shown in Fig.5, an almost-ready-to-fly (ARF) sport trainer with a 1.8m wingspan and 3.6kg weight.
The prefab airframe can be purchased for about $200, and can be made ready to fly with another $300 to $400 of hardware and about 15 or 20 hours of labor. The plane is powered by a two-stroke 10cm3 glow-fuel engine, which has been very reliable and has more than enough power, but the use of glow fuel limits the endurance to about 15 minutes. The Piccolo autopilot was installed, and the gains of the PID controllers in all channels were quickly tuned, allowing the plane to fly waypoints autonomously. Because of its small size and excellent handling qualities, the aircraft is well suited to small fields and can typically handle higher winds than the larger vehicles in our fleet. The Tiger UAV is currently used to test the NPS autopilot and autoland algorithm (Section III.G). Because of its small size, the Tiger cannot carry much beyond the autopilot; consequently larger vehicles are used when additional payloads are required.

Figure 5. Tiger 60 ARF UAV fitted with Piccolo autopilot.

The next vehicle in the fleet is a modified Senior Telemaster UAV, with a 2.5m wingspan and 8kg weight, shown in Fig.6. The Telemaster UAV was built from a kit and was heavily modified to suit our needs, including a reinforced two-piece bolt-on wing with enlarged ailerons, removable tail surfaces, and a much larger fuselage to house payload components. It is powered by a 23cm3 two-stroke gasoline engine, and with a 150g gas tank the plane has an endurance of about three hours. The Telemaster uses the Piccolo autopilot, and the gains were tuned for autonomous waypoint flight during the first flight. Its primary payload is a gimbaled camera, mounted just under the nose. The custom pan-tilt unit is driven by COTS hobby servos, commanded via a Freewave serial link, and provides 360º of pan and 90º of tilt.

Figure 6. Modified Senior Telemaster UAV with gimbaled camera.

The gimbal currently has a high-resolution low-light black-and-white camera, but it is designed to accept two cameras at a time, eventually adding either a color camera or an IR camera. Video is transmitted to the ground using an 800mW 2.4GHz link from Electra Enterprises, with either an omni antenna for 1.6km to 3.2km range or a high-gain tracking patch for 8km to 16km range. In Fig.6 the pan-tilt gimbal can be seen under the nose, with a low-light high-resolution camera mounted on one side. Note the engine exhausts upward to keep the camera clean. A close-up of the gimbal is shown in Fig.7, and a close-up of the equipment bay in Fig.8 shows the aft fuselage, revealing slide-in cards with the video transmitter, the serial-to-PWM card that drives the gimbal servos, the Freewave radio that receives the serial commands for the gimbal from the ground, and the power conversion for the servos.

Figure 7. Gimbal detail in the Telemaster UAV.

The electronics components are mounted to lightweight aircraft-plywood trays which slide into EPT foam blocks, allowing for vibration isolation and easy access to all of the components. A hyde-softmount is used to isolate engine vibration from the airframe, both increasing video quality and the lifespan of the avionics. At idle the vibration is still quite noticeable in the video, but above about 10% throttle the vibration is almost completely damped, providing very clean video.

Figure 8. Electronics bay in the Telemaster.

Another UAV that has also been used at the NPS UAV lab is the Office of Naval Research/Advanced Ceramic Research (ACR) small tactical SWARM UAV and its successor, the Silver Fox UAV (Fig.9a). The Silver Fox UAV has a 1.8m-long fuselage with detachable 2.4m-wide wings and tail fins that fit into a super-sized golf-bag carrying case. Weighing only 9kg and powered by a model airplane engine, it can soar upwards of 300m and has the payload capacity for different payloads, including state-of-the-art "eyes in the sky" camera technology and other small detection systems. Its flight endurance of several hours enables it to cover large areas of territory. Launched using a small compressed-air-powered launcher (Fig.9b) and flying autonomously using DGPS, the Silver Fox employs an autoland system developed by the NPS with ACR of Tucson, AZ, through the SBIR program.

Figure 9. Silver Fox UAV (a) and ACR-developed launcher (b).

On the other end of the scale, about a decade of research in low-Reynolds-number, flapping-wing propulsion has led to the very unconventional vehicle shown in Fig.10. The flapping-wing MAV shown has a wingspan of 23cm and weighs under 15g, including a 3-channel radio and battery power for about 15 to 20 minutes of flight.

Figure 10. Flapping-wing MAV.


The unusual design is often referred to as bio-inspired: while it does not look like anything in nature, it emulates some facets of animal behavior to improve performance. The MAV is not generally considered to be an ornithopter, as it does not produce lift and thrust from the same components. Instead, the vehicle produces most of its lift from the large fixed wing, and a biplane pair of flapping wings is used to produce thrust. The flapping wings operate in counterphase to emulate flapping in ground effect, and this provides the additional benefit of aerodynamic and mechanical balancing. The flow they entrain, while producing thrust, has the additional benefit of preventing flow separation over the main wing. Flow separation over lifting surfaces is usually unavoidable at these low Reynolds numbers (2×10^4), causing partial or complete stall and loss of aerodynamic efficiency. However, wind tunnel experiments have shown that the flow over a completely stalled wing is reattached within about 1/10 of a second once the wings start flapping. This phenomenon is also seen in the gust response in flight. The MAV is essentially a stall-proof design while under power. Using a popular figure of merit (FOM) adopted by the MAV and small-helicopter community (FOM = lift capacity in grams / shaft power required in Watts), the flapping-wing design yields a 60% higher FOM than conventional helicopter designs with the same disk loading. Currently the MAV carries no payload other than the avionics and power system needed for flight, and the size is driven entirely by these components. Somewhat smaller models have been built and flown using smaller batteries and actuators. In order to carry any kind of useful sensor payload the size will undoubtedly have to increase. The models excel at lower flight speeds, making them ideal for flight in confined areas, such as under the cover of trees or inside buildings.
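The FOM comparison above is simple arithmetic; a short sketch makes the units explicit. The numbers below are purely illustrative placeholders, not measured values from the flight tests.

```python
def figure_of_merit(lift_grams: float, shaft_power_watts: float) -> float:
    """FOM used by the MAV/small-helicopter community: grams of lift per Watt."""
    return lift_grams / shaft_power_watts

# Illustrative numbers only (not data from the paper):
heli_fom = figure_of_merit(15.0, 3.0)   # a notional small rotor: 5 g/W
flapping_fom = heli_fom * 1.6           # the reported 60% advantage at equal disk loading
```

At equal disk loading, the 60% gain means the flapping-wing vehicle lifts the same mass on roughly 5/8 of the shaft power, which is where the extra endurance at these scales comes from.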
To make use of this, a sensor package will need to be developed allowing for autonomous flight without GPS and without line-of-sight to the ground control. More details about the micro air vehicle developments at the NPS may be found in Ref.5.
Table 1 summarizes the major characteristics of the NPS's UAV/MAV fleet.

Table 1. Fleet of UAVs and MAV available at the NPS.

UAV          Wing span, m   Wing type       Max takeoff weight, kg   Endurance, hour   Payload, kg
FOG-R        3.2            high wing       41                       1                 11
Tern         3.5            high wing       59                       4
Tiger 60     1.8            low wing        3.6                      0.25
Telemaster   2.5            high wing       8                        3
Silver Fox   2.4            middle wing     9                        3
MAV          0.23           flapping wing   0.015                    0.3

C. RFTPS Capabilities

Summarizing this section, the following lists the RFTPS capabilities:
- Within the RFTPS environment, one can synthesize, analyze, and simulate guidance, navigation, control, and mission management algorithms using a high-level development language;
- Algorithms are seamlessly moved from the high-level design and simulation environment to the real-time processor;
- The RFTPS utilizes industry-standard I/O including digital-to-analog, analog-to-digital, serial, RF, and pulse-width modulation capabilities;
- The RFTPS is portable, easily fitting in a van. In general, testing occurs at fields away from the immediate vicinity of the Naval Postgraduate School;
- The unmanned air vehicle can be flown manually, autonomously, or using a combination of the two. For instance, automatic control of the lateral axis can be tested while the elevator and throttle are controlled manually;
- All I/O and internal algorithm variables can be monitored, collected, and analyzed within the RFTPS environment.

III. Applications

This section provides a brief description of different experiments that were carried out for the Office of Naval Research, NASA, the Air Force, and the Department of Homeland Security using different UAVs. NPS students, mostly Navy officers, were extensively involved in them, so that almost every research effort ended with an M.S. thesis. It should be noted that in each of the projects three main software-related problems were always addressed. The first is to establish effective communication between the various types of hardware and the target computer (a very time-consuming part if hardware changes are involved). The second relates to the design and implementation of an effective controller stabilizing the vehicle using various control techniques (inner control loop). Only the third problem deals with the development of the GNC algorithm itself (outer control loop).
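The inner/outer-loop split above is the standard cascade structure. As a toy sketch of the idea (gains and signal names are invented for illustration, not values from any NPS project): the outer guidance loop turns a path-following error into a rate command, and the inner loop turns the rate error into a surface deflection.

```python
def outer_loop(cross_track_error_m: float) -> float:
    """Guidance (outer loop): map cross-track error to a yaw-rate command.
    Gain is illustrative only."""
    K_guidance = 0.2
    return -K_guidance * cross_track_error_m

def inner_loop(yaw_rate_cmd: float, yaw_rate_meas: float) -> float:
    """Stabilization (inner loop): map yaw-rate error to an aileron command.
    Gain is illustrative only."""
    K_stab = 0.5
    return K_stab * (yaw_rate_cmd - yaw_rate_meas)

# One pass through the cascade: 10 m right of track, currently not turning:
aileron = inner_loop(outer_loop(10.0), 0.0)   # -> -1.0 (command a left turn)
```

The practical point of the split is the one the text makes: the inner loop can be tuned and frozen first (problem two), after which new guidance laws (problem three) only ever see a well-behaved rate-command interface.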


A. UAV Model Identification

One of the first projects was to develop a routine for model identification. Corresponding analytical methods were developed and used for this purpose.6 (Later on, panel codes were also employed to develop the aerodynamic derivative coefficients7.) Then a series of flight tests was conducted to test the validity of the analytical model. The control surface inputs and the corresponding aircraft response measured by various IMU sensors were stored by the RFTPS and used for parameter identification. The conventional configuration of the FROG UAV, whose model was developed first, suggested that the parameter identification problem could be decoupled by identifying the longitudinal and lateral/directional dynamics separately. Maximum-likelihood parameter identification was used to refine existing analytical estimates of the stability and control derivatives. The results for a few key longitudinal stability derivatives of the FROG UAV are compared to their analytic estimates in Table 2. Figure 11a compares the pitch rate response of the refined model to an elevator doublet with the in-flight response.

Table 2. Comparison of selected longitudinal derivatives.

Derivative   Analytic   Experimental   ∆, %
CLα          4.3        4.09           5.13
Cmα          -0.417     -0.557         -25.1
CLδe         -1.12      -0.391         186
Cmδe         -1.62      -1.05          54.3
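The flight-test work used maximum-likelihood identification; as a simpler stand-in that conveys the idea, the sketch below fits a hypothetical one-degree-of-freedom pitch model, q̇ = M_q·q + M_δe·δe, to doublet data by equation-error least squares. All numbers are synthetic, not FROG values.

```python
def estimate_derivatives(q, delta_e, dt):
    """Least-squares fit of q_dot = M_q*q + M_de*delta_e from time histories."""
    n = len(q) - 1
    q_dot = [(q[k + 1] - q[k]) / dt for k in range(n)]   # finite-difference rate
    Sxx = Sxu = Suu = Sxy = Suy = 0.0
    for k in range(n):
        Sxx += q[k] * q[k]
        Sxu += q[k] * delta_e[k]
        Suu += delta_e[k] * delta_e[k]
        Sxy += q[k] * q_dot[k]
        Suy += delta_e[k] * q_dot[k]
    det = Sxx * Suu - Sxu * Sxu                          # 2x2 normal equations
    return ((Sxy * Suu - Suy * Sxu) / det,
            (Suy * Sxx - Sxy * Sxu) / det)

# Synthetic doublet response with known truth M_q = -2.0, M_de = -8.0:
dt, n = 0.01, 2000
delta_e = [0.05 if (k * dt) % 4 < 2 else -0.05 for k in range(n)]
q = [0.0] * n
for k in range(n - 1):
    q[k + 1] = q[k] + dt * (-2.0 * q[k] - 8.0 * delta_e[k])
M_q_hat, M_de_hat = estimate_derivatives(q, delta_e, dt)
```

On noise-free synthetic data the fit recovers the true derivatives exactly; maximum likelihood earns its keep on real flight data, where sensor noise biases a plain equation-error fit like this one.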


Figure 11. Pitch rate response to an elevator doublet (a), and yaw rate response to a step signal sent to the autopilot's lateral channel (b).

A similar process was repeated for the lateral axis. Aileron and rudder doublets were executed while aileron and rudder position, roll and yaw rates, and sideslip angle were measured. The results for a few key lateral stability derivatives of the FROG UAV are compared to the analytic estimates in Table 3.

Table 3. Comparison of selected lateral derivatives.

Derivative   Analytic   Experimental   ∆, %
CYβ          -0.31      -0.987         -68.6
Clβ          -0.051     -0.094         -45.7
Cnβ          0.058      0.176          -67.0
Clδa         0.181      0.339          -24.3

B. Autopilot Model Identification

For the FROG UAV the control laws of the autopilot needed to be identified as well. The autopilot was considered to be a "black box" containing both sensors and controller logic.6 The intent was to model the unit as closely as possible without disassembling it. Since the autopilot controls vertical speed using the elevators and yaw rate using the ailerons, it was natural to decouple the problem into separate identification problems for the lateral and longitudinal channels. The lateral channel employs a rate gyro to track yaw rate commands via feedback to the ailerons. The generalized autopilot structure (for each channel) is shown in Fig.12. In order to identify the dynamics of the block labeled T, the unit was rotated at differing yaw rates. This provided a variable-frequency signal to the feedback path with frequency content covering 0 to 20 radians per second. The commanded yaw rate was held constant. The feedback loop was broken at the summing junction of the commanded and measured signals, and the feedback signal was captured.

Figure 12. Generalized autopilot structure.

Parameter identification algorithms were used to determine that the feedback loop dynamics could be approximated as

    Tlat(s) = (1 - 0.1s)/(1 + s).

With the autopilot on, a step command in yaw rate was transmitted to the vehicle in flight. The vehicle yaw rate, as measured by the IMU, was recorded and used to estimate the value of the gain Klat at 0.25. Figure 11b shows that a simulation using the identified Tlat(s) and Klat matches the real flight data fairly well. The longitudinal channel of the autopilot senses the rate of change of static pressure in order to control the vehicle's vertical velocity via feedback to the elevator. Flight test data capturing the response of the vehicle to a step input in climb rate command were used for model identification. The structure of the longitudinal channel turns out to be the same, with

    Tlon(s) = 0.45s/(1 + 0.45s) and Klon = 0.25.

C. Path Following Algorithm Development

The next project, after the UAV model had been established, was the design and flight testing of nonlinear inertial path following algorithms. Figure 13 shows the generalized feedback diagram of the developed integrated guidance and control algorithm. Figure 14 presents flight test results of tracking a circle in the presence of turbulence.8

Figure 13. Feedback diagram.
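The identified lag Tlat(s) = (1 − 0.1s)/(1 + s) from Section B is easy to sanity-check numerically. One standard first-order realization splits the lead-lag into a direct feedthrough plus a lag state, y = −0.1u + x with ẋ = −x + 1.1u (algebraically equivalent to Tlat); the sketch below Euler-integrates it and is illustrative only, not the flight-test code.

```python
def tlat_step_response(dt: float, t_end: float):
    """Euler-integrate T(s) = (1 - 0.1s)/(1 + s) for a unit step input."""
    x, u = 0.0, 1.0                 # lag state and step input
    y = []
    t = 0.0
    while t <= t_end:
        y.append(-0.1 * u + x)      # output: feedthrough plus lag state
        x += dt * (-x + 1.1 * u)    # first-order lag dynamics
        t += dt
    return y

response = tlat_step_response(0.001, 8.0)
# The right-half-plane zero (-0.1s term) gives an initial inverse response
# of -0.1 before the output settles at the DC gain of +1.
```

That initial wrong-way response is exactly the kind of feature a step test in flight would reveal, which is why the identified model reproduces the Figure 11b data well.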

Figure 14. Flight test results of circle tracking.

Fig.15 shows the results of tracking a trajectory representing a river bed with the use of a vision-based guidance algorithm.9 A map of the river to be tracked is also included in the figure.

Figure 15. River tracking trajectory.

D. Voice Control Experiment

The voice control experiment using the ViA wearable PC (Fig.16a,b) was carried out next.10 The Voice Control System (VCS) was comprised principally of commercially available components. In addition to the standard RFTPS, the VCS required the use of a laptop computer to translate commands between the ViA wearable PC and the Sun workstation. The VCS software programs that run on the laptop and wearable computers are the only custom-made components of the system. The ViA wearable system featured two methods of entering commands during normal operation. Commands could be entered in the traditional manual method utilizing the touch-screen display (Fig.16c) or by voice via the audio headset and voice recognition software. The set of available commands is presented in Table 4.

Table 4. The set of available commands.

Voice command       Action command
"RIGHT STANDARD"    Standard rate right turn
"RIGHT HALF"        Half rate right turn
"LEFT STANDARD"     Standard rate left turn
"LEFT HALF"         Half rate left turn
"CENTER"            Constant heading
"CLIMB FULL"        Standard rate climb
"CLIMB HALF"        Half rate climb
"DESCEND FULL"      Standard rate descent
"DESCEND HALF"      Half rate descent
"LEVEL"             Constant altitude

The ViA wearable and voice recognition software performed exceptionally well. Once the user determined the proper microphone placement and speaking volume level, the software recognized almost 100% of voice commands. Figure 17a illustrates the lateral commands and resulting aircraft response for one of the test runs. The top graph represents the input command from the VCS before it is augmented by the variable gain. A value of one represents a standard rate right turn; negative one, a standard rate left turn. The middle graph is the resulting PWM Futaba command transmitted. The final graph shows GPS heading and demonstrates that the aircraft does indeed turn in the direction commanded. Figure 17b details the longitudinal results for the same run. Here, a value of negative one represents a full climb; -0.5 represents a half rate descent.

Figure 16. ViA wearable computer (a), ViA hand-held display (b), and touch-screen display (c).

Figure 17. Examples of lateral and longitudinal command execution.
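In software, Table 4 reduces to a lookup from recognized phrase to a normalized rate command. A hedged sketch follows: the lateral encoding matches the convention described in the text (+1 right standard, −1 left standard); the longitudinal signs are an assumption based on the text's report that negative one represented a full climb, and the actual VCS code may differ.

```python
# Lateral signs per the paper's convention; longitudinal signs are assumed.
LATERAL = {
    "RIGHT STANDARD": 1.0, "RIGHT HALF": 0.5,
    "LEFT STANDARD": -1.0, "LEFT HALF": -0.5,
    "CENTER": 0.0,
}
LONGITUDINAL = {
    "CLIMB FULL": -1.0, "CLIMB HALF": -0.5,
    "DESCEND FULL": 1.0, "DESCEND HALF": 0.5,
    "LEVEL": 0.0,
}

def dispatch(phrase: str):
    """Return an (axis, value) command for a recognized phrase, or None."""
    phrase = phrase.strip().upper()
    if phrase in LATERAL:
        return ("lateral", LATERAL[phrase])
    if phrase in LONGITUDINAL:
        return ("longitudinal", LONGITUDINAL[phrase])
    return None
```

Returning None for unrecognized phrases matters in practice: a misheard command should leave the current mode alone rather than perturb the aircraft.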

E. Integrated IR Vision/Inertial Navigation System Design

As discussed in Section II, an enhanced sensor suite includes sensors such as visible-spectrum or IR cameras (see examples of the images taken with these sensors in Fig.18). These sensors were used to demonstrate the capabilities of new navigation algorithms that integrate vision, GPS, and inertial (all passive) sensors and explicitly address:
- performance measures compatible with the sensor specifications and with control system requirements,
- the nonlinear nature of the underlying sensor geometry (e.g., vision sensors),
- physical characteristics of the sensors,
- the multi-rate nature of the data generated by the passive sensors,
- robustness with respect to out-of-frame events and occlusions.
To allow vision navigation, frame-grabbing and image-processing cards were integrated into the RFTPS. New nonlinear filter structures were introduced to estimate the position and velocity of the autonomous UAV with respect to a moving ship, based on measurements provided by IR vision and inertial sensors.11 These results were later extended to deal with occlusions and out-of-frame events that arise when the vision system loses its target temporarily.12 The idea was to use the natural hot spot of the ship's smokestack, as well as additional hot spots that may be added, to assist in the task of determining the relative position and orientation of the UAV with respect to a ship during autonomous landing.13 Ordinary barbeques were used to emulate the ship's smokestack in the flight experiment. Several state-of-the-art numerical algorithms, including IR camera image processing, were developed and successfully flight-tested.14 The accuracy of the resulting solution was compared with the DGPS position obtained from the Trimble receiver and proved to be quite reasonable (Fig.19).
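The filters of Refs.11-14 are beyond a short example, but the nonlinear sensor geometry mentioned above starts with one small step: converting a detected hot-spot pixel into a line-of-sight direction in the camera frame. A minimal pinhole-camera sketch, with assumed intrinsics (not the actual IR camera's calibration):

```python
import math

def pixel_to_los(u: float, v: float, fx: float, fy: float, cx: float, cy: float):
    """Unit line-of-sight vector in the camera frame for pixel (u, v), given
    assumed pinhole intrinsics: focal lengths fx, fy and principal point (cx, cy)."""
    x = (u - cx) / fx
    y = (v - cy) / fy
    n = math.sqrt(x * x + y * y + 1.0)
    return (x / n, y / n, 1.0 / n)

# A hot spot imaged at the principal point lies on the optical axis:
boresight = pixel_to_los(320.0, 240.0, 600.0, 600.0, 320.0, 240.0)  # -> (0.0, 0.0, 1.0)
```

A bearing-only measurement like this is exactly what makes the estimation problem nonlinear: the filter must fuse unit vectors from the camera with linear inertial measurements, which is why the cited work introduces dedicated nonlinear filter structures.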

Figure 18. Examples of images provided by the conventional camera (a) and the IR camera (b, c).

Figure 19. Comparison of the newly developed algorithm's solution with the true (DGPS) position of the UAV.

F. Demonstration of Real-Time Linked UAV Observations and Atmospheric Model Prediction in Chem/Bio Attack Response

Under a project funded by the newly organized Department of Homeland Security, the integration of components for a near-real-time decision aid designed to enable small units to respond in a focused way to a chem/bio attack was evaluated. The main goal of the project was to demonstrate the utility of combining, in real time, meteorological and oceanographic prediction data with the UAV's navigation and sensor measurements. The Tern UAV was instrumented with meteorological data sensors and programmed to fly a certain trajectory pattern. The UAV's principal role was both to map the effective plume dispersion in the atmosphere and to provide the


wind estimation for the prediction of chem/bio agent dispersion. The data acquisition role was to make meteorological observations with tolerable errors in the atmospheric parameters of 1.0 m/s, 1.0º, 1.0ºK, 5.0 m, 1.0 millibar, 0.005º, and 5.0% for wind speed, wind direction, temperature, altitude, pressure, latitude/longitude, and relative humidity, respectively. The wind estimation model was implemented in the Simulink Real-Time Workshop environment, providing the capability to execute the model on a remote computer in real time (see Fig.20). The data on the spatial position of the aircraft were used to solve in real time the simple vector equation

W^i = V^i_GPS − R^i_b R^b_a V_a,

where W^i is the wind velocity vector calculated in the inertial frame, V^i_GPS is the inertial velocity vector measured by GPS, V_a is the airspeed vector, and R^b_a and R^i_b are the rotation matrices transforming velocity from the airframe to the body frame and from the body to the inertial frame, respectively. A successful demonstration of the system's capabilities in the fall of 2002 proved the feasibility and high effectiveness of linking the UAV's sampled navigation and meteo/chem/bio data with the wind model for producing a fine-resolution forecast of chem/bio agent dispersion. The average error in determining the wind magnitude (compared with data obtained traditionally from balloons) was under 3 m/s, while the average error in determining the wind direction was less than 5º.7,15

Figure 20. Wind estimation model.
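For concreteness, the wind-triangle relation used by the wind estimation model can be sketched in a few lines (a Python stand-in for the Simulink block; the Euler-angle convention and the numbers are illustrative, and the airframe-to-body rotation is assumed already folded into the body-frame airspeed vector):

```python
import numpy as np

def rot_body_to_inertial(phi, theta, psi):
    """Body-to-inertial rotation matrix for Z-Y-X (yaw-pitch-roll) Euler angles."""
    cf, sf = np.cos(phi), np.sin(phi)
    ct, st = np.cos(theta), np.sin(theta)
    cp, sp = np.cos(psi), np.sin(psi)
    return np.array([
        [ct*cp, sf*st*cp - cf*sp, cf*st*cp + sf*sp],
        [ct*sp, sf*st*sp + cf*cp, cf*st*sp - sf*cp],
        [-st,   sf*ct,            cf*ct]])

def wind_estimate(v_gps_inertial, v_air_body, R_body_to_inertial):
    """W^i = V^i_GPS - R^i_b V_a: wind is GPS velocity minus rotated airspeed."""
    return v_gps_inertial - R_body_to_inertial @ v_air_body

# straight-and-level flight due "x": 20 m/s over the ground, 18 m/s airspeed
R = rot_body_to_inertial(0.0, 0.0, 0.0)
w = wind_estimate(np.array([20.0, 0.0, 0.0]), np.array([18.0, 0.0, 0.0]), R)
# w = [2, 0, 0] m/s: a 2 m/s tailwind
```

The same subtraction runs at every GPS update; the attitude angles come from the onboard inertial solution.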

G. Shipboard Autoland System Development

The current research extends the previous work on integrated vision/inertial navigation system design and addresses the development of a shipboard autoland system for multiple UAVs. The typical mission scenario includes a ship under way that has launched and now needs to recover a team of small UAVs (the specific UAVs considered in this project were Silver Fox UAVs). It is assumed that initially the UAVs are flying in formation towards the ship. The approach for sequential autoland of the UAV formation consists of:16
i) real-time trajectory generation for each UAV so as to bring it to the top of the glideslope from its place in the formation during a specified time slot (Segment 1 in Fig.21); these trajectories must guarantee deconfliction, and the time slots are selected to provide the team aboard the ship sufficient time to retrieve each UAV from the net;
ii) real-time glideslope generation to bring each UAV from the top of the glideslope to the center of the net moving with the ship (this Segment 2 is actually computed first, to provide the final point for Segment 1);
iii) a control strategy to force each UAV to track the trajectories developed in steps i and ii.
So far, all supporting algorithms have been developed and tested in simulation (including HITL simulation). Several real landings at a predetermined stationary touchdown spot (a non-moving recovery-net prototype) were also performed. Figure 22 provides an example of near-optimal collision-free trajectories for multiple UAVs computed in real time using the direct method of calculus of variations17 modified to accommodate deconfliction constraints for multiple UAVs.18 Figure 23 shows the results of a Monte Carlo simulation for a single UAV, including the distribution of net impact points. As proved by simulation, the available sensor suite aboard the UAV and the developed algorithms provide sufficient accuracy to land the UAVs within the net's frame. Flight tests are currently under way.

Figure 21. Small UAV shipboard autoland strategy: Segment 1, glideslope capture (from any initial condition); Segment 2, stabilized glideslope tracking; two DGPS receivers at the net's corners and a barometer; autoland initiation point; engine shut down ~25 m before the net.
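The Segment 2 geometry (a glideslope terminating at a net moving with the ship) can be sketched as follows; the glideslope angle, length, and the flat-earth frame with z as altitude are all illustrative assumptions, not the flight-tested parameters:

```python
import numpy as np

def glideslope_capture_point(net_pos, ship_vel, t_go, slope_deg=6.0, length=300.0):
    """Predict where the net will be t_go seconds from now (ship under way),
    then back up 'length' meters along the glideslope, opposite the ship's
    course, to get the top-of-glideslope point that Segment 1 must reach.
    Frame: x/y horizontal, z altitude (up)."""
    net_future = net_pos + ship_vel * t_go
    course = ship_vel[:2] / np.linalg.norm(ship_vel[:2])  # unit horizontal course
    slope = np.radians(slope_deg)
    top = net_future.copy()
    top[:2] -= course * length * np.cos(slope)  # behind the net on the course line
    top[2] += length * np.sin(slope)            # above the net
    return top

# ship heading along +x at 5 m/s, net currently at the origin, 60 s time slot
top = glideslope_capture_point(np.array([0.0, 0.0, 0.0]),
                               np.array([5.0, 0.0, 0.0]), 60.0)
```

Because this final point moves with the ship, it is computed first, and the Segment 1 trajectories are then generated backward from it for each UAV's time slot.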


Figure 22. Scheduled cooperative glideslope capture by a group of three UAVs.

Figure 23. Monte Carlo analysis for a single UAV (3-D projection of net impact points).

H. Automatic Target Tracking System Design

Another ongoing project, which also extends the previous work on integrated IR vision/inertial navigation system design, is devoted to automatic target tracking (ATT) from live video obtained from an autonomously piloted reconnaissance platform with a gimbaled camera. The Telemaster UAV carries a two-axis gimbaled camera that acquires video of the territory and sends it to the ATT computer in real time (see Fig.24). During the mission, the operator of the ATT computer may identify a target of interest with a joystick click on the real-time video where a target is present. The target, which appears inside a small rectangular polygon, can then be tracked by engaging the track mode. The position of the target is identified by two Cartesian coordinates in the camera frame. This information is processed by the control algorithm, and commands are transmitted to the onboard gimbaled platform to correct the pan and tilt of the camera in the local body frame so that the target appears at the center of the frame throughout tracking. The automated motion tracking software by PerceptiVU, Inc. supports the target tracking part of the project. When the tracker senses a target in motion within the fixed camera scene, the gimbaled camera tracks the object as it moves within the scene. Two cameras can be linked to the system to provide low-light and IR sensing capability, thus extending the reconnaissance performance of the whole system.

Figure 24. Architecture of the target tracking system (gimbaled camera and Piccolo avionics aboard the UAV; 900 MHz Piccolo protocol and 2.4 GHz video links; Piccolo ground station with pilot manual control; ATT computer and operator interface at the NPS ground station; full-duplex serial links carrying gimbal control and guidance commands).
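The re-centering loop described above amounts to mapping the target's pixel offset from the frame center into pan and tilt corrections. A minimal proportional sketch (frame size, field of view, and gain are hypothetical; PerceptiVU's tracker supplies the pixel coordinates):

```python
def gimbal_command(target_px, frame_size=(640, 480), fov_deg=(40.0, 30.0), gain=0.5):
    """Map the tracked target's pixel position to pan/tilt rate commands that
    drive it back to the frame center (small-angle pixel-to-degree mapping)."""
    err_pan = (target_px[0] - frame_size[0] / 2.0) * fov_deg[0] / frame_size[0]
    err_tilt = (target_px[1] - frame_size[1] / 2.0) * fov_deg[1] / frame_size[1]
    return gain * err_pan, gain * err_tilt   # deg/s toward re-centering

# target right of and below the frame center -> pan right, tilt down
pan, tilt = gimbal_command((400, 300))
# pan = 2.5 deg/s, tilt = 1.875 deg/s
```

A real loop would add rate limits and a deadband around the center to avoid chattering, but the proportional term captures the idea.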


An autonomous flight capability around the chosen target is supported by algorithms currently being developed by the authors. Flight tests for this project are also under way.

IV. Conclusion

The developed RFTPS has proven itself a powerful, portable, rugged, and effective tool both in the field and in the lab. It ensures safe and reliable control of the UAV and provides excellent opportunities for rapid flight testing of sensors and new algorithms. Its high-level interface allows NPS students to become easily involved in the design process. In addition, the fleet of UAVs and MAVs currently available at the NPS makes it possible to perform challenging and timely research projects.

Acknowledgements

The authors would like to thank all PhD and graduate students who were involved in the different projects. They would also like to thank the Office of Naval Research for funding most of these projects.

References
1. Hallberg, E., Kaminer, I., and Pascoal, A., "Development of a Flight Test System for UAV's," IEEE Control Systems, Feb. 1999.
2. Hallberg, E., Komlosy, J., Rivers, T., Watson, M., Meeks, D., Lentz, J., Kaminer, I., and Yakimenko, O., "Development and Applications of Rapid Flight-Test Prototyping System for Unmanned Air Vehicles," IEEE International Congress on Instrumentation in Aerospace Simulation Facilities, Toulouse, France, June 14-17, 1999.
3. Flood, C.H., "Design and Evaluation of a Digital Flight Control System for the FROG Unmanned Aerial Vehicle," Master's Thesis, Aeronautics and Astronautics Dept., Naval Postgraduate School (NPS), Monterey, CA, September 2001.
4. Lim, B.-A., "Design and Rapid Prototyping of Flight Control and Navigation System for an Unmanned Aerial Vehicle," Master's Thesis, NPS, March 2002.
5. Jones, K.D., Bradshaw, C.J., Papadopoulos, J., and Platzer, M.F., "Improved Performance and Control of Flapping-Wing Propelled Micro Air Vehicles," 42nd AIAA Aerospace Sciences Meeting and Exhibit, AIAA Paper 2004-0399, Reno, Nevada, January 5-8, 2004.
6. Papageorgiou, E., "Development of a Dynamic Model for a UAV," Master's Thesis, NPS, March 1997.
7. Sir, C., "Real-Time Wind Estimation and Display for Chem/Bio Attack Response Using UAV Data," Master's Thesis, NPS, June 2003.
8. Kaminer, I., Pascoal, A., Hallberg, E., and Silvestre, C., "Trajectory Tracking for Autonomous Vehicles: An Integrated Approach to Guidance and Control," AIAA Journal of Guidance, Control, and Dynamics, Vol.21, No.1, 1998, pp.29-38.
9. Watson, M., "Vision Guidance Controller for Unmanned Air Vehicle," Master's Thesis, NPS, December 1998.
10. Komlosy, J., "Application of Rapid Prototyping to the Design and Testing of UAV Flight Control Systems," Master's Thesis, NPS, March 1998.
11. Hespanha, J., Yakimenko, O., Kaminer, I., and Pascoal, A., "Linear Parametrically Varying Systems with Brief Instabilities: An Application to Integrated Vision/IMU Navigation," IEEE Transactions on Control Systems Technology, Vol.40, No.3, 2004, pp.
12. Kaminer, I., Pascoal, A., Kang, W., and Yakimenko, O.A., "Application of Nonlinear Filtering to Navigation System Design Using Passive Sensors," IEEE Transactions on Aerospace and Electronic Systems, Vol.37, No.1, 2001, pp.158-172.
13. Yakimenko, O.A., Kaminer, I.I., Lentz, W.J., and Ghyzel, P.A., "Unmanned Aircraft Navigation for Shipboard Landing using Infrared Vision," IEEE Transactions on Aerospace and Electronic Systems, Vol.38, No.4, 2002, pp.1181-1200.
14. Ghyzel, P.A., "Vision-Based Navigation for Autonomous Landing of Unmanned Aerial Vehicles," Master's Thesis, NPS, September 2000.
15. Tan, K.L., "Precision Air Data Support for Chem/Bio Attack Response," Master's Thesis, NPS, March 2003.
16. Lizarraga, M.I., "Autonomous Landing System for an UAV," Master's Thesis, NPS, March 2004.
17. Yakimenko, O., "Direct Method for Rapid Prototyping of Near-Optimal Aircraft Trajectories," AIAA Journal of Guidance, Control, and Dynamics, Vol.23, No.5, 2000, pp.865-875.
18. Kaminer, I.I., Yakimenko, O.A., Dobrokhodov, V.N., Lizarraga, M.I., and Pascoal, A.M., "Cooperative Control of Small UAVs for Naval Applications," to appear in Proc. 43rd IEEE Conference on Decision and Control, Paradise Island, Bahamas, December 14-17, 2004.
