Model-Driven In-the-Loop Validation: Simulation-Based Testing of UAV Software Using Virtual Environments

Florian Mutter, Stefanie Gareis, Bernhard Schätz, Andreas Bayha, Franziska Grüneis, Michael Kanis, Dagmar Koss
fortiss GmbH, Guerickestr. 25, 80805 München, Germany
{mutter,grueneis,schaetz,bayha,gareis,kanis,koss}@fortiss.org

Abstract—With the availability of off-the-shelf quadrocopter platforms, the implementation of autonomous unmanned aerial vehicles (UAVs) has been substantially simplified. Such UAVs can explore inaccessible terrain and provide information about the local situation in a specified target area. For the early development of an autonomous aerial vehicle, virtual integration of the system makes it possible to test a software implementation without endangering the hardware or the environment. Key elements of such a virtual test environment for UAV software are the modeling and simulation of the environment and the hardware platform, as well as the integration of the software into the simulation.

I. INTRODUCTION

The increased computational power available in current embedded processors has drastically simplified the implementation of robotic hardware by providing off-the-shelf basic components with complex control functionality. In the field of UAVs (unmanned aerial vehicles), this has led to low-cost quadrocopter platforms. UAVs are vehicles that are operated without a pilot on board. Most UAVs are remotely controlled by a pilot, but autonomously operating UAVs are increasingly being explored, specifically UAVs that are used for scientific observations from the air or to explore inaccessible areas [1].

The FALTER project [2] implements an autonomous UAV that is intended to be used indoors. The major use case for the FALTER UAV is the exploration of areas in buildings that are not accessible to humans, e.g., to explore a factory site after an accident. To that end, the FALTER unit is equipped with different sensors to investigate the terrain and situation. Unlike most autonomously operating units, FALTER is designed for indoor exploration; therefore the GPS-based navigation commonly used in autopilot approaches for UAVs [1] cannot be used. The unit operates on its own after it has received the mission data from mission control. It starts at a given point with a rough floor plan of the building under consideration and tries to find a way to a given target area. After collecting situation data, it returns to the start point and transmits the data to the mission control station. Equipped with a large collection of sensors – including gyroscopes and accelerometers for position estimation, and infrared sensors for collision avoidance – the FALTER unit autonomously controls its actuators – the propellers – to achieve the mission goal.

Obviously, the development of such autonomous behavior requires the implementation of complex algorithmic functionality, from collecting the sensor data and controlling the actuators, via the execution of flight commands, up to the complete (re-)planning of the mission. Testing such complex functionality poses threats to the equipment under development as well as to the test environment and test personnel, since the UAV can move at substantial speed, with propeller rotations of up to 5000 rpm. As the system under development has to operate autonomously, classical stimulus-response black-box testing is not adequate. Rather, the system under test must continuously interact with its environment in a feedback-loop fashion, issuing commands via its actuators based on the data collected via its sensors. To avoid such damage during the development and test of the software and still support in-the-loop testing, virtual integration of the software can be used, i.e., the process of bringing a system into service without actually building it, by simulating it with all components of interest included. This makes it possible to adequately and safely test the system under development on the software as well as on the hardware level: either by simulating hardware, equipment, and environment ('software-in-the-loop'), or by simulating only (part of) the equipment and the environment ('hardware-in-the-loop').

A. Overview and Contribution

In the following, an approach to support early testing of embedded software via software-in-the-loop virtual integration is presented. The approach consists of
• the definition of a platform model and an environment model capable of expressing UAVs with range sensors,
• the implementation of such a combined platform and environment model using Matlab/Simulink,
• the simulation and visualization of this combined model based on the VRML extension of Simulink.
To that end, the entire hardware platform, the equipment under development, and the environment are modeled in software and simulated. This method provides a fast way to obtain test results while using the unmodified software under development, and does not need expensive testbed equipment. After relating the presented approach to other research in this field, the remainder of this contribution is structured as follows: Section II shows how simulatable models for the hardware platform as well as the environment can be

built, with a specific focus on the modeling of ultrasonic distance sensors. Section III shows how the software under development is integrated into a simulation of these models based on Matlab/Simulink, and how the execution of the simulation can be visualized. Section IV gives a short discussion of the achieved results and experiences, followed by a brief summary, and concludes with a discussion of future work.

B. Related Work

In-the-loop simulation is commonly used in the embedded industry to reduce development time and cost. To that end, a simulation is constructed using a plant model to be controlled by the system under development, built either explicitly as a mathematical representation of the plant or from prerecorded observations of the plant. The work presented here focuses on the construction of an explicit platform and environment model for UAVs that makes it possible to include the simulation of range sensors. In contrast to approaches like [3], the approach presented here does not deal with the validation of a model using simulation. Furthermore, since test goal and test oracle are comparatively simple, the generation of suitable test scenarios is, unlike in [4], not in the focus of this contribution. Rather, as in [5], a model of the platform and the environment is used to validate the behavior of a system using an in-the-loop simulation. In contrast to the latter, however, here a 3D simulation is used.

In the virtual integration and simulation of UAV software, the focus is generally put mainly on the modeling of the aerodynamics. Little or no support is offered for the simulation of the sensor interfaces specifically used in the autonomous control of UAVs. Therefore, even extensive virtual test beds like [6] or [7] use simulation engines provided for flight simulators, but provide no facilities to simulate sensors other than gyroscopes, accelerometers, altimeters, GPS, etc., and thus not the range sensors needed in the FALTER project. As a result, current virtual test beds are mainly suited for detailed tests of low-level control functionality – for example, of the self-stabilization properties of a flight control – but do not provide the functionality for in-the-loop testing of autonomous control algorithms.

The construction of simulation models of sensors like the ultrasonic sensors used in the FALTER project is generally only found in the field of sensor and/or filter development. Here, the sensor is modeled in very fine-grained detail to describe the signal propagation at the transducer level [8]. However, this level of detail is not suitable to efficiently support in-the-loop simulation of systems with multiple sensors and autonomous behavior, as in the FALTER unit, in a complex environment like a building with several walls. In contrast to the approaches mentioned above, the approach presented here makes it possible to perform functional testing of autonomous UAVs with range sensors via virtual integration, at a level of abstraction sufficient to enable efficient simulation of the platform under development and complex environments.

Figure 1. The FALTER unit

II. PLATFORM AND ENVIRONMENT MODEL

To support a virtual software-in-the-loop test, a model of the FALTER platform – i.e., the unit without the software under test – as well as a model of the environment the unit is operating in – i.e., the building including additional obstacles – are needed. The first model mainly has to deal with the dynamics of a moving UAV, while the second model mainly has to address the aspects of sensing objects. In the following, these two models are described, after a brief overview of the FALTER unit.

A. FALTER Overview

The FALTER unit, as shown in Figure 1, is based on two main hardware components. The first is the quadrocopter platform L3-ME from HiSystems GmbH; the second is the RoBoard RB-100 from DMP Electronics Inc. The L3-ME is a self-assembly kit for a quadrocopter. The controller that comes with the quadrocopter platform is the preassembled FlightCtrl. It has gyroscopes, a 3D accelerometer, and an atmospheric pressure meter installed. The FlightCtrl is responsible for holding the unit in the air. In the FALTER unit setup, the FlightCtrl receives commands from the RoBoard and translates these commands into signals for every motor. The accelerometer can measure three axes at the same time. For the rotations around the three axes, three gyroscopes are installed. The FlightCtrl has a hold-flight-level mode which is used to keep the unit at a specified flight level. The controller responsible for the autonomous flight is a RoBoard RB-100, using a Vortex86DX CPU running at 1000 MHz with 256 MB DRAM. The Pulse Width Modulation interface is used to transmit the commands from the RoBoard to the FlightCtrl. An I²C interface is used to connect the ultrasonic range finders. The three front range finders are used to triangulate the position of an object that is detected in front of the copter. Two sensors face the left and the right side of the copter to measure the distance to objects that the unit is currently passing. Another range finder measures the distance to the ground, supporting the air pressure meter in holding the flight level. One is pointed to the rear of the unit and one to the top to keep enough distance to the ceiling.

Figure 2. Axes and Moments

B. Platform Model

For the FALTER project, the path planning and sensor data aggregation are the main parts that need to be tested. Since the project is not about the control algorithms that stabilize the unit, or about any other low-level control of a quadrocopter, these aspects are not a key part of the simulation. Therefore the flight dynamics are not modeled to a degree of detail corresponding to the real dynamics of the system, but only in a simplified way. The unit is represented as a single point mass; aerodynamic aspects are not modeled at all. Consequently, the unit can stand perfectly still in the air, without drifting from its position, if no commands are issued. While these simplifications do not reflect the behavior of the unit at a fine-grained level of control, they limit the complexity and make it possible to test the path finding and sensor data fusion algorithms.

Due to the simplifications mentioned above, it is possible to use an equations-of-motion block from the off-the-shelf Simulink Aerospace library that does not need to be substantially extended. The six-degrees-of-freedom (6DoF) model represents a single point mass and reacts to three forces and three moments. The FALTER unit is designed to send the control commands from the high-level controller to the original hardware, with its low-level flight stabilization. To limit the complexity of the flight control algorithms, the software is modeled with only three degrees of freedom. The unit can move up and down along the z-axis by setting the engine power. It can yaw around the z-axis, and it can pitch to move forward along the x-axis, as shown in Figure 2. In the simulation, the yaw command and the engine power are directly converted to a moment around the z-axis and a force along the z-axis of the copter. When a pitch command is sent by the software, the value is used to split the force that is generated by the engines into two parts: one part along the z-axis of the copter and another along the x-axis. This eliminates the pitch movement that would be seen on the real hardware, but also reduces the need to simulate the flight stabilization algorithms shipped with the original hardware platform.
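To make this mapping concrete, the following Python sketch shows how the three commands could be converted into the force and moment vectors fed to the 6DoF block. It is a minimal illustration of the decomposition described above, not the project's actual implementation; the gains and the interpretation of the pitch command as a tilt angle are assumptions.

```python
import numpy as np

# Assumed gains; the paper gives no concrete values.
K_THRUST = 20.0   # engine command -> total engine force [N]
K_YAW = 0.5       # yaw command -> moment around the body z-axis [Nm]

def commands_to_forces(gas, yaw, pitch):
    """Map the three simplified flight commands to the forces and
    moments expected by the 6DoF point-mass block.

    A pitch command does not tilt the simulated unit; instead it
    splits the engine force into a z-component (lift) and an
    x-component (forward motion), as described in Section II-B.
    """
    thrust = K_THRUST * gas
    force = np.array([thrust * np.sin(pitch),    # along the body x-axis
                      0.0,                       # no lateral force
                      thrust * np.cos(pitch)])   # along the body z-axis
    moment = np.array([0.0, 0.0, K_YAW * yaw])   # yaw moment around z
    return force, moment
```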

C. Environment Model

The main sensors of the FALTER unit for determining its position and detecting obstacles are the eight Devantech SRF08 ultrasonic sensors, used to measure ranges from 3 cm to 6 m. They are connected via the I²C bus and can receive and record up to 17 echo signals. To determine its position in the room, the FALTER unit also uses gyroscopes and the accelerometer. In the simulation, the sensor values for the latter two sensor types come directly from the 6DoF model provided by the Aerospace Blockset of Simulink. For the ultrasonic sensors, no off-the-shelf simulation models are provided.

The position of an ultrasonic sensor is given by the position of the FALTER unit and an offset determined by the design of the unit, i.e., the location of the sensor on the unit. The direction of every range finder is calculated from the direction of the copter. The three front range finders head in the same direction as the unit itself. To get the directions of the range finders facing the sides, the direction of the FALTER unit is rotated by 90° around the z-axis of the copter. The same applies to the ground-facing range finder, which is rotated by 90° around the y-axis of the copter.

To simulate an SRF08 range finder, the beam pattern is divided into several single lines, as shown in Figure 3. The corresponding sensor lines of one range finder are derived from its direction vector. To construct an equivalent mathematical model, each sensor is represented by a collection of vectors, obtained by applying the corresponding rotation matrices to the direction vector. To get a sensor beam pattern similar to the pattern of the SRF08 range finder, two circles, each with eight lines, are built around the center beam. This makes a total of 17 lines originating from the position of the sensor, as shown in Figure 3. The length of every line represents the maximum range and is used to determine whether a facing wall is too far away to be measured.

After the sensor beam lines are generated, the intersections with the walls need to be determined. Each wall is represented by a vertex and two direction vectors from that vertex; the length of each direction vector also gives the length of the corresponding edge of the wall. Thus a wall is represented by the vertex v and the edges e and f. To calculate the intersection between one sensor line and the wall, the equations of the line and of the plane need to be equated. Once the intersection point between one sensor line and one wall is known, a check is needed whether the intersection point lies within the bounded wall, because the method above returns an intersection point in all cases where the line is not parallel to the wall. To check whether a point lies in the quadrangle given by the two vectors e and f, the dot product is used. For points within the boundary, the distance is calculated and returned; it is simply the distance between the position vector of the sensor and the intersection point on the wall. To limit the range of the sensor, the distance is compared to the length of the vector representing the corresponding sensor beam line; if the distance is greater than this length, 0 is returned. This is done for all sensor beam lines. If the intersection point does not lie within the wall, or if the line is parallel to the wall, 0 is returned as well.

The result of the above calculations is a matrix with the distances to every wall in range of the sensor. This matrix is then minimized, and values that lie close together are combined, in order to imitate the result of a real SRF08 range finder. Except for measurement errors, the SRF08 returns only one value when it faces a wall; if the angle between the sensor and the wall becomes more acute, more values are returned. To reflect this behavior, all values that differ by not more than 10 cm are combined into one.
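As an illustration of the intersection test, consider the following NumPy sketch. It is a simplified reconstruction of the procedure described above, not the original Simulink implementation: the construction of the 17 beam lines via rotation matrices is omitted, the function names are invented, and the containment check assumes the wall's edge vectors e and f are perpendicular, as for rectangular walls.

```python
import numpy as np

def ray_wall_distance(origin, direction, v, e, f, max_range):
    """Distance along one sensor beam line to a wall given by the
    corner vertex v and the edge vectors e and f (Section II-C).
    Returns 0.0 if the beam is parallel to the wall, the hit lies
    outside the bounded wall, or it is farther away than max_range."""
    n = np.cross(e, f)                   # normal of the wall plane
    denom = np.dot(n, direction)
    if abs(denom) < 1e-9:                # beam parallel to the wall
        return 0.0
    t = np.dot(n, v - origin) / denom    # equate line and plane equations
    if t <= 0.0:                         # intersection behind the sensor
        return 0.0
    p = origin + t * direction           # intersection with the plane
    w = p - v                            # containment check via dot products
    s = np.dot(w, e) / np.dot(e, e)
    u = np.dot(w, f) / np.dot(f, f)
    if not (0.0 <= s <= 1.0 and 0.0 <= u <= 1.0):
        return 0.0                       # outside the bounded wall
    dist = np.linalg.norm(p - origin)
    return dist if dist <= max_range else 0.0

def combine_echoes(distances, tol=0.10):
    """Collapse hits that differ by at most 10 cm into one reading,
    imitating the single echo an SRF08 returns from a flat wall."""
    merged = []
    for d in sorted(x for x in distances if x > 0.0):
        if not merged or d - merged[-1] > tol:
            merged.append(d)             # keep the closest hit of a cluster
    return merged
```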

Figure 3. Simulated Sensor Beam – Top and Side View

III. IMPLEMENTATION

To simulate and visualize the models described in Section II, Matlab/Simulink is utilized as the implementation framework. To obtain simulatable models, the Simulink core library is applied to formalize the sensor and environment model; the Aerospace Blockset is applied to formalize the dynamics of the platform model. For the virtual integration of the control software into the simulation, the S-Function mechanism of Simulink is used. For the visualization of the simulation, the VRML extension of Simulink is used, together with standard functionalities provided by Simulink to display simulation results.

A. Simulation

Figure 4 shows the overall architecture of the simulation. The simulation is implemented in the form of Simulink blocks, with each block capturing a part of the model under simulation. The model of the environment of the FALTER unit, describing walls and obstacles according to the representation described in Section II-C, is captured in the 'Environment' part. The dynamics of the FALTER unit are captured in the 'Physical Model', including the 6DoF and flight dynamics parts, as described in Section II-B. The remaining parts of Figure 4 describe the software and hardware parts of the unit relevant for the autonomous flight management. The 'SRF08 Sensors' and 'Split Signal' blocks capture the part of the model dealing with the sensor functionality, using the mathematical representation described in Section II-C. All of the above-mentioned blocks are built using basic blocks from the Simulink core library.

The two remaining blocks – 'Control Task' and 'Flight Management Task' – capture the models dealing with the software parts of the FALTER unit, in the form of the hardware abstraction layer and the flight management software. They are implemented in the form of S-Function blocks. S-Functions provide a simple mechanism to embed additional functionality into Simulink; S-Function blocks can be used like other blocks from the core library to build a simulatable model. In the FALTER project, S-Functions are a key part of the simulation, besides the simulation of the sensors and of the copter's movement. The code that runs on the RoBoard under RT-Linux is included in the simulation in the form of an S-Function block. Using an S-Function to include the code has the advantage that the code running in the simulation and the code running on the FALTER unit are exactly the same.

To integrate the code that runs on the RoBoard, two S-Functions are used: one calls the hardware abstraction layer code, and one the flight management code. The code implementing the hardware abstraction layer is embedded in a dedicated S-Function and called with a sample time of 0.05, corresponding to a period of 50 milliseconds; the control code is intended to run with the same time settings on the RoBoard. This sample time is chosen to match the real SRF08 range finder's cycle time of 50 ms. The S-Function for the hardware abstraction layer has three interfaces for the interprocess communication with the flight management task: the obstacles that were detected by the SRF08 ultrasonic sensors ('OBSTACLES'), the consolidated environment data collected from the accelerometers and the gyroscopes ('ENV_DATA_POINTER'), and the control commands for the hardware abstraction layer issued by the flight management code ('CTRL_REF'). Furthermore, to communicate with the FALTER unit platform, the corresponding interfaces for reading out the range sensors, accelerometers, and gyroscopes, as well as for controlling the unit movements, are provided.
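The interplay of the two S-Functions with the rest of the model can be pictured as a fixed-step loop. The following Python sketch is purely conceptual: hal_step, fm_step, and plant_step are invented placeholders for the hardware abstraction layer, the flight management code, and the combined physical/environment model, not the project's actual API.

```python
SAMPLE_TIME = 0.05  # 50 ms, matching the SRF08 cycle time

def run_simulation(duration, hal_step, fm_step, plant_step):
    """Conceptual fixed-step co-simulation loop mirroring the two
    S-Function blocks: each 50 ms step calls the hardware abstraction
    layer (HAL), the flight management (FM) code, and the plant."""
    t = 0.0
    ctrl_ref = None                            # CTRL_REF interface
    sensors = plant_step(None, SAMPLE_TIME)    # initial sensor readings
    while t < duration:
        # HAL task: consume sensor values, publish OBSTACLES and
        # ENV_DATA_POINTER, translate CTRL_REF into actuator commands.
        obstacles, env_data, actuator_cmds = hal_step(sensors, ctrl_ref)
        # Flight management task: plan and issue new control commands.
        ctrl_ref = fm_step(obstacles, env_data)
        # Physical model and environment: advance the plant by 50 ms.
        sensors = plant_step(actuator_cmds, SAMPLE_TIME)
        t += SAMPLE_TIME
```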

Figure 4. Overall Architecture of the Simulation Model

The S-Function encapsulating the flight management code uses an interface matching the corresponding part of the hardware abstraction layer and is also called every 50 milliseconds. Since the flight management code communicates only with the hardware abstraction layer, it neither receives signals from nor sends signals to any other block. Although not shown here, additional signals can be introduced for debugging purposes to access internal data of the software, e.g., the map constructed dynamically in the flight management task. This helps to visualize the actual state of the copter during the virtual execution of the software.

B. Visualization

The visualization component of the simulation serves two purposes: it makes it possible to observe the overall behavior in the virtual environment, and to get a detailed view of how the

FALTER unit perceives its environment through the sensors and how the unit interacts with it through its actuators. Figure 5 shows an example of the concrete user interface used during a simulation run. The interface includes different views of the FALTER unit in a 3D model of the virtual environment, such as the top view and the centered view shown in the upper left half of Figure 5. These views can also depict the sensing zones of the range sensors. Such visualizations support a straightforward observation of the overall behavior of the unit. To get a more detailed view of how the unit perceives its environment and how it interacts with it, the elements in the lower left side and the right half can be used. Plots and gauges give direct readings of sensors and actuators, such as the measured height or the commanded speed. Furthermore, as mentioned above but not shown in the user interface in Figure 5, information about the internal state of the FALTER unit can also be visualized during execution.

Figure 5. User Interface of the Simulator

For the implementation of the plots and gauges, core functionalities of Simulink are used. To provide a 3D visualization of the simulation, Simulink 3D Animation is used. Using this framework, the visualization component reads the environment model, creates a VRML representation of the virtual environment, and generates a 3D animation of it, including a virtual FALTER unit.

IV. CONCLUSION

As discussed in the following subsections, the presented solution demonstrates an effective and efficient approach to simulation-based in-the-loop validation of UAV software. While several aspects – such as the flight mechanics or the sensors used – are specific to the FALTER unit, some general principles can be observed.

A. Results

Using the approach described above, the control software of the FALTER unit could be integrated into the simulation without further modification. The implementation made it possible to simulate the behavior of the unit in a building with more than 80 walls and additional obstacles in real time, using a standard PC as the simulation platform (Intel Core 2 Duo at 3 GHz with 4 GB RAM). The runs included the simulation of the unit and the environment as well as the immediate visualization of the results.

The simulation proved to be an effective tool to validate the functionality of the control software. Especially the

effects of the noise-to-signal ratio of the sensors (distance sensors and position sensors) could be tested, supporting functional testing of the software using a model of low-noise sensors and robustness testing using a model of high-noise sensors. Furthermore, the simulation makes it possible to evaluate the effect of using alternative hardware components (for example, improved accelerometers and gyroscopes) before actually deploying these components.

B. Experiences

The architecture of the presented approach – splitting the overall simulation into a model of the environment, a model of the platform, and the control software – proved to be a suitable framework for the simulation of autonomous systems. The overall architecture follows the standard design of a control loop: the model of the environment corresponds to the 'Environment' block – capturing the walls and movable objects – plus parts of the 'Physical Model': the position 'Xe', the orientation 'DCM', the inclination 'Angle', and the acceleration 'Ab'. The remainder of the 'Physical Model' block corresponds to the actuator model. The 'SRF08 Sensors' block corresponds to the sensor model; inclination and acceleration can be considered to be measured by a trivial sensor. The platform model consists of the 'Control Task'; the control software corresponds to the 'Flight Management Task'.
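The low-noise and high-noise sensor models mentioned in the Results subsection can be realized as a simple perturbation of the simulated distances. The following Python sketch is purely illustrative; the paper does not specify its noise model, and all names and parameter values here are assumptions.

```python
import numpy as np

def noisy_range(true_distance, sigma, dropout, rng):
    """Illustrative range sensor noise model: Gaussian measurement
    noise plus occasional missing echoes. A small sigma models the
    low-noise case used for functional testing, a large sigma the
    high-noise case used for robustness testing."""
    if rng.random() < dropout:
        return 0.0                     # no echo received
    return max(0.0, true_distance + rng.normal(0.0, sigma))

rng = np.random.default_rng(42)
functional = noisy_range(2.0, sigma=0.01, dropout=0.001, rng=rng)  # low noise
robustness = noisy_range(2.0, sigma=0.15, dropout=0.05, rng=rng)   # high noise
```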

Figure 6. Overall Architecture for Model-Based In-the-Loop Simulation (Control Software, Platform Model, Sensor Model, Actuator Model, Environment Model)

Using a model-based approach based on such a framework makes it possible to identify reusable model parts suitable for a range of applications. For example, the model of an ultrasonic sensor can be reused for other kinds of distance sensors, such as infrared or laser-based measurements.

C. Summary and Outlook

The presented approach provides a means of software-in-the-loop testing of autonomous UAVs, using virtual integration of the controlling software with a model of the platform and environment. In contrast to other approaches, it supports an efficient simulation of the behavior of a UAV covering the interaction between the UAV and its environment, including complex sensors such as ultrasonic range sensors. It furthermore provides means for the visualization of the results of the simulation. A detailed description of the virtual integration can be found in [9]. Furthermore, detailed descriptions of the hardware abstraction layer, the path planning, and the mission control can be found in [10], [11], and [12], respectively.

While the current implementation supports the core features needed for early testing, there are several possibilities for improvement. Currently, the testing scenarios – i.e., the model of the building and of possible obstacles – are constructed manually. Here, as described in [4], a more automated approach to systematically explore the state space of the planning algorithm could be applied. Furthermore, currently only static objects – walls as well as obstacles – are supported; for a more realistic setting, moving obstacles should also be included. Finally, only box-like objects are supported, and the possibly different reflection properties of different materials are not considered. A more detailed environment model could allow more realistic simulations, at the cost of adding a substantial computational load to those simulations.

REFERENCES

[1] H. Chao, Y. Cao, and Y. Chen, "Autopilots for Small Unmanned Aerial Vehicles: A Survey," International Journal of Control, Automation, and Systems, vol. 8, 2010.

[2] "FALTER project website," 2010, www.fortiss.org/en/research/software-systemsengineering/demonstrators/falter.html.
[3] "Testing and Validation of Simulink Models with Reactis," Reactive Systems, Inc., Tech. Rep. RSIIR 1.10, 2010.
[4] Z. Saigol, F. Py, K. Rajan, C. McGann, J. Wyatt, and R. Dearden, "Randomized Testing for Robotic Plan Execution for Autonomous Systems," in Autonomous Underwater Vehicles. IEEE, 2010.
[5] J. Shen and H. Hu, "A Matlab-based Simulator for Autonomous Mobile Robots," in 3rd Innovative Production Machines and Systems International Conference, D. T. Pham, E. Eldukhri, and A. J. Soroka, Eds., 2007.
[6] E. N. Johnson, D. P. Schrage, J. Prasad, and G. J. Vachtsevanos, "UAV Flight Test Programs at Georgia Tech," Georgia Institute of Technology, Tech. Rep., 2004.
[7] R. D. Garcia, "Designing an Autonomous Helicopter Testbed: From Conception Through Implementation," Ph.D. dissertation, University of South Florida, 2008.
[8] A. Bilgin, "A simulation model of indoor environments for ultrasonic sensors," Master's thesis, Bilkent University, 2003.
[9] F. Mutter, "FALTER – Flight unit for Autonomous Location and Terrain Exploration: Virtual Integration," Bachelor's thesis, Technische Universität München, 2010.
[10] A. Bayha, "FALTER – Flight unit for Autonomous Location and Terrain Exploration: Hardware Abstraction Layer," Bachelor's thesis, Technische Universität München, 2010.
[11] F. Grüneis, "FALTER – Flight unit for Autonomous Location and Terrain Exploration: Path Planning," Bachelor's thesis, Technische Universität München, 2010.
[12] M. Kanis, "FALTER – Flight unit for Autonomous Location and Terrain Exploration: Mission Control," Bachelor's thesis, Technische Universität München, 2010.
