Delay and Dropout Tolerant State Estimation for MAVs

Frédéric Bourgeois, Laurent Kneip, Stephan Weiss and Roland Siegwart

Autonomous Systems Lab, ETH Zurich

Abstract This paper presents a filter-based position and velocity estimation for an aerial vehicle, fusing inertial measurements with delayed, dropout-susceptible vision measurements, without a priori knowledge of the exact, variable time delay. The data from the two sensors, which run at different rates, is transmitted via independent wireless links to a ground station. A synchronization between the two communication paths makes it possible to determine the image transmission and processing time. The computational complexity of the algorithm is kept low. The images are processed by a Visual SLAM algorithm that builds up a map of the area and simultaneously tracks the pose of the camera. With delays of up to 230 ms and a dropout rate of 16% in the vision data, we show that with the presented filter a quadrotor can be stabilized and kept in the region of a setpoint with a simple PID controller.

1 Introduction

The use of filters for state estimation is common in many robotic control applications. This paper presents a Kalman filter based estimation of position, velocity, and acceleration for Micro Aerial Vehicles (MAVs) in Cartesian space. All measurements may run at different sampling rates.


The Visual SLAM position measurement has a moderate and time-variant delay, which is compensated up to the smallest delay among the different measurements, namely that of the inertial measurement. The filter is tolerant to sensor dropouts, and the aspect of an embedded implementation is also taken into account. The designed filter is a continuation of the work around the framework presented by Blösch et al. [4], where a downward-looking monocular camera is used for vision-based localization and stabilization of an MAV. While navigating in an unstructured environment, the Visual SLAM algorithm [9] running on the ground station builds up a map of the area and simultaneously tracks the pose of the camera (location and rotation). The camera image is transmitted to the ground station via a wired connection. The present work also takes into account the frequently sampled acceleration measurements from an Inertial Measurement Unit (IMU) and fuses them with the position measurements using the mentioned filter. Additionally, the cable is replaced by a wireless link, which introduces a larger delay of the position information and a less reliable and deterministic communication affected by occasional dropouts.

Accurate state estimation is an important subject of current research in the domain of MAVs. In general, due to the unbounded accumulation of integration errors [5], translational position and velocity cannot be estimated from a low-cost IMU for more than a few seconds. Therefore, the fusion of additional sensors is needed for a drift-free estimation of these quantities. For MAVs, a common sensor system consists of a set of proprioceptive sensors, typically accelerometers and gyros, and an exteroceptive sensor like a Global Positioning System (GPS), camera, laser, or ultrasonic sensor [8]. Fusing IMU and GPS data is an ongoing research field (see for example [14], [1]). However, the precision of the estimation is only in the range of some decimeters. Another main approach for MAVs is the use of vision data to obtain information about the absolute position or orientation (see [11], [12]). In [13], an observer is designed that fuses inertial and visual data for the estimation of the camera orientation with respect to an inertial system; moreover, the authors show how orientation and position estimation can be decoupled from each other. Cheviron et al. [6] present a filter using similar sensors to provide a pose and velocity estimation; however, their sensor model includes no delay. Two main drawbacks of vision systems are the considerable amount of data generated by the camera, which must be transmitted or recorded, and the subsequent processing of the images themselves. Algorithms for Visual SLAM, optical flow, or stereo vision typically require high computational power.

Fusing sensors introduces the problem that the two sensor classes are sampled at different rates. Lowering the overall sampling rate to the slowest sensor frequency is the easiest solution, which however leads to a lack of dynamics due to the Nyquist sampling constraint. A better solution is given by multi-rate filters. Armesto et al. [2] propose two different interfaces to deal with multi-rate filtering, which can easily be incorporated in Kalman filters. Multi-rate holds generate frequently sampled input signals from a sequence of measurements sampled at a lower frequency using zero-, first-, or higher-order holds. Multi-rate samplers, in contrast, generate a size-varying measurement output vector by appending measurements only if they are available at the current time step.

Regarding the delay of the measurements, Larsen et al. [10] summarize various methods for fusing them in a Kalman filter and compare their computational burden. They also propose a method of their own, which extrapolates the delayed measurement forward to the present time and calculates an optimal gain; each delayed measurement requires two matrix multiplications per time step. State augmentation methods can also handle delays optimally [7]. However, the system order increases with the delay period, so these methods are only used for systems with small, known, and constant delays. With the same disadvantage of a growing state dimension, Van der Merwe et al. [15] propose a Sigma-point Kalman filter based latency compensation to fuse lagged GPS with inertial measurements for an Unmanned Aerial Vehicle (UAV) application.

The novel contribution of this work is the combination of sensor-fusion, multi-rate, and delay-compensation techniques, resulting in a delay and dropout tolerant filter-based estimation of an MAV's position, velocity, and acceleration for robust wireless MAV hovering control. A synchronization between the independent communication paths of the two sensors makes it possible to determine the image transmission and processing time. With this information, we are able to deal with the variable delay.

Section 2 begins with the design of the multiple parts of the filter. Each part, namely the underlying model, multi-rate filtering, and delay compensation, is discussed in a separate subsection. Section 3 presents the MAV used for the experiments and gives some hardware and software specifications. The experimental results are shown and discussed in Sect. 4. Finally, Sect. 5 summarizes this work and concludes with an outlook on possible future work.

2 Filter Design

This section presents the design of the filter, namely the motion model and the systems used for enabling multi-rate filtering and delay compensation.

2.1 Motion Model

The presented filter is based on an EKF with a non-linear discrete-time system given in the general form:

$$\begin{aligned} x[k+1] &= f_k(x[k], u[k], w[k]) \\ y[k] &= h_k(x[k]) + \nu[k] \end{aligned} \tag{1}$$

where $f_k(\cdot)$ and $h_k(\cdot)$ are non-linear functions, and $w[k]$ and $\nu[k]$ are Gaussian distributed white noise with covariances $Q[k]$ and $R[k]$, respectively. The state vector $x[k]$ consists of position $p[k]$, velocity $v[k]$, and acceleration $a[k]$ in Cartesian coordinates, and the output vector $y[k]$ of the position $p_m[k]$ from the vision algorithm and the acceleration measurement $a_m[k]$ from the IMU. The model has no input signal.

$$x[k] = \begin{bmatrix} p[k] & v[k] & a[k] \end{bmatrix}^T \tag{2}$$

$$y[k] = \begin{bmatrix} p_m[k] & a_m[k] \end{bmatrix}^T \tag{3}$$

$$w[k] = \begin{bmatrix} j[k] & \alpha[k] \end{bmatrix}^T \tag{4}$$

The system noise is composed of jerks $j[k]$ and angular accelerations $\alpha[k]$.

The underlying model $f_k(\cdot)$ is a centripetal acceleration model taking into account both the tangential and centripetal acceleration. The model is the same as in [2].

$$\begin{aligned} p[k+1] &= p[k] + T_s\, v[k] + \tfrac{T_s^2}{2}\, a[k] + \tfrac{T_s^3}{6}\, j[k] \\ v[k+1] &= v[k] + T_s\, a[k] + \tfrac{T_s^2}{2}\, j[k] \\ a[k+1] &= a[k] + T_s\, j[k] + T_s\, (\alpha[k] \times v[k]) + T_s\, (\omega[k] \times a[k]) \end{aligned} \tag{5}$$

$T_s$ is the sampling period and $\omega[k]$ is the angular rate taken from the IMU. The measurement function $h_k(\cdot)$ is in our case linear and simply

$$p_m[k] = p[k] \qquad a_m[k] = a[k] \tag{6}$$
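For illustration, the prediction step of Eq. (5) can be written down directly. The following is a minimal Python/NumPy sketch under our own naming, not the authors' implementation; the jerk $j[k]$ and angular acceleration $\alpha[k]$ enter as process noise and are therefore set to zero in the nominal prediction.

```python
import numpy as np

def predict_state(p, v, a, omega, Ts):
    """One prediction step of the centripetal acceleration model, Eq. (5).

    p, v, a : position, velocity, acceleration in the map frame (3-vectors)
    omega   : angular rate from the IMU (3-vector)
    Ts      : sampling period in seconds

    The jerk j[k] and angular acceleration alpha[k] are modeled as
    process noise, so they are zero in the nominal prediction.
    """
    p_next = p + Ts * v + 0.5 * Ts**2 * a
    v_next = v + Ts * a
    a_next = a + Ts * np.cross(omega, a)  # centripetal coupling term
    return p_next, v_next, a_next

# Example: slow horizontal drift with a small angular rate about z
p, v, a = np.zeros(3), np.array([0.1, 0.0, 0.0]), np.zeros(3)
p, v, a = predict_state(p, v, a, omega=np.array([0.0, 0.0, 0.2]), Ts=0.015)
```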

All equations are formulated in the frame M of the Visual SLAM map. The position measurement can be taken as is. The acceleration measurement, however, must first be rotated from the helicopter frame H. Figure 1 shows the rotation consisting of two steps. Using the Tait-Bryan angle convention, where $\varphi$, $\theta$, and $\psi$ denote the roll, pitch, and yaw angle of the helicopter, respectively, the following matrix describes the rotation from the helicopter frame H into the inertial frame I:

$$R_{IH}(\varphi, \theta, \psi) = \mathrm{Rot}_z(\psi) \cdot \mathrm{Rot}_y(\theta) \cdot \mathrm{Rot}_x(\varphi) \tag{7}$$

$\mathrm{Rot}_{x,y,z}$ denotes a basic rotation about the x, the once-rotated y, and the twice-rotated z axis, respectively. The second rotation $R_{MI}$ is a constant rotation from the inertial frame I into the map frame M; it results from the way the camera is mounted on the helicopter. Additionally, the gravity effect must be compensated. If $a_{IMU}[k]$ denotes the raw acceleration measurement from the IMU, then

$$a_m[k] = R_{MI} \cdot R_{IH}(\varphi, \theta, \psi)_k \cdot a_{IMU}[k] - \begin{bmatrix} 0 & 0 & g \end{bmatrix}^T \tag{8}$$


Fig. 1: Rotation of the raw acceleration data, first from helicopter H to inertial frame I, then to map frame M.

Note that in Sect. 3.1, Eq. (8) is modified so that the yaw drift is also taken into account. Also note that with this general formulation, the filter can be used for applications not related to MAVs as well.
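For concreteness, Eqs. (7) and (8) can be sketched as follows in Python/NumPy. The values of $R_{MI}$ and $g$ here are placeholders of our own (identity mounting rotation and 9.81 m/s²); the actual mounting rotation depends on the hardware.

```python
import numpy as np

def rot_x(t):  # basic rotation about x
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(t):  # basic rotation about y
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(t):  # basic rotation about z
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def R_IH(roll, pitch, yaw):
    """Eq. (7): helicopter frame H -> inertial frame I (Tait-Bryan angles)."""
    return rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)

R_MI = np.eye(3)  # constant mounting rotation I -> M; identity is a placeholder
g = 9.81          # assumed gravity constant in m/s^2

def compensate(a_imu, roll, pitch, yaw):
    """Eq. (8): rotate raw IMU acceleration into map frame M, remove gravity."""
    return R_MI @ R_IH(roll, pitch, yaw) @ a_imu - np.array([0.0, 0.0, g])
```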

2.2 Multi-rate Filtering

Since the motion model has no inputs, only multi-rate samplers are needed (see Sect. 1). The idea is to vary the size of the measurement vector $y[k]$ and the covariance matrix $R[k]$ depending on which measurement is available at the current time step. The Kalman gain and update step are computed accordingly. Table 1 shows the four possibilities for our model; the time index k is omitted. $R_p$ and $R_a$ denote the covariances of the position and the acceleration measurement, respectively. We assume that there is no correlation between $R_p$ and $R_a$, although mechanical vibrations may cause a dependency. If neither the position nor the acceleration measurement is available (case 1), only the prediction step is performed.

Table 1: Multi-rate sampler

Case | Pos. | Acc. | y            | R
-----|------|------|--------------|----------------
1    | no   | no   | []           | []
2    | yes  | no   | p_m          | R_p
3    | no   | yes  | a_m          | R_a
4*   | yes  | yes  | [p_m  a_m]^T | diag(R_p, R_a)

(* case 4 is reduced to case 2 in practice, see text)

To reduce the computational burden when both measurements are available (case 4), only the position measurement is considered (case 2). Since the sampling rate of the IMU is higher than the camera frame rate, omitting the acceleration measurement in this special case is assumed to have little impact.
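A minimal sketch of the size-varying update step follows, assuming the 9-dimensional state of Eq. (2); the function and variable names are ours, not the authors'.

```python
import numpy as np

def multirate_update(x, P, p_meas, a_meas, Rp, Ra):
    """EKF update with a size-varying measurement vector (Table 1).

    x       : state [p, v, a] (9-vector), P : 9x9 covariance
    p_meas  : position measurement (3-vector), or None on dropout
    a_meas  : acceleration measurement (3-vector), or None
    Case 4 is reduced to case 2: if the position is available,
    the acceleration measurement is omitted.
    """
    H_p = np.hstack([np.eye(3), np.zeros((3, 6))])  # selects p from x
    H_a = np.hstack([np.zeros((3, 6)), np.eye(3)])  # selects a from x

    if p_meas is not None:            # cases 2 and 4*
        H, R, y = H_p, Rp, p_meas
    elif a_meas is not None:          # case 3
        H, R, y = H_a, Ra, a_meas
    else:                             # case 1: prediction only
        return x, P

    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x + K @ (y - H @ x)
    P = (np.eye(9) - K @ H) @ P
    return x, P
```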


2.3 Delay Compensation

A delay compensation for the position measurement $p_m[k]$ is included in the filter. The delay is time-variant and is mainly determined by two factors, namely the wireless communication and the processing time of the camera image. In particular, the processing time can vary depending on the processor load. The measurement $p_m[k]$ is available at time $t_k$, but actually represents the position at time $t_l = t_k - \Delta t$, with $\Delta t$ being the variable time delay. At time $t_l$, when the camera frame is taken, the image is tagged with an ID. This ID is sent together with the IMU data down to the filter, where the current state estimate $\hat{x}[l]$ and the ID are stored in a buffer (see Fig. 2). As a consequence, the delay is only compensated up to the delay of the IMU data transmission, which is assumed to be small. When the position data becomes available at time $t_k$, the residual is calculated using the value from the buffer with the corresponding ID. The error is then fused using the normal Kalman filter update rule, i.e.,

$$\hat{x}[k] = \hat{x}[k|k-1] + K \left( p_m[k] - h(\hat{x}[l]) \right) \tag{9}$$


In other words, the correct innovation $p_m[k] - h(\hat{x}[l])$ is fused with a later state prediction $\hat{x}[k|k-1]$ and is thus non-optimal. However, the low computational burden of this method is a main advantage and outweighs this drawback.


Fig. 2: Scheme of the filter. The two sensors are sampled at different rates $T_V$ and $T_I$, and transmitted via two different communication paths to the filter. The ID of the camera frame is sent over the faster path.


The approach holds under the assumption that the drift of the residual during the delay period is small. Note that the method allows non-delayed measurements from other sensors to be fused during the delay period.
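A minimal sketch of this buffering scheme in Python follows; the class interface and the ring-buffer size are our assumptions, not specifics of the authors' implementation. The returned residual would then be fused with the current prediction via Eq. (9).

```python
import numpy as np

class DelayCompensator:
    """Buffer state estimates by frame ID so a delayed position
    measurement can be fused against the state at capture time (Eq. 9)."""

    def __init__(self, max_entries=64):
        self.buffer = {}                 # frame ID -> state estimate x_hat[l]
        self.max_entries = max_entries   # assumed buffer size

    def on_frame_id(self, frame_id, x_hat):
        """Called when the ID arrives over the fast IMU link at time t_l."""
        if len(self.buffer) >= self.max_entries:
            self.buffer.pop(next(iter(self.buffer)))  # drop oldest entry
        self.buffer[frame_id] = x_hat.copy()

    def innovation(self, frame_id, p_meas, H):
        """Residual of the delayed measurement against the buffered state.
        Returns None if the ID was never received (dropout)."""
        x_l = self.buffer.pop(frame_id, None)
        if x_l is None:
            return None
        return p_meas - H @ x_l          # p_m[k] - h(x_hat[l])
```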

3 Experimental Setup

This section presents the complete setup for testing the filter introduced in Sect. 2. It describes the details of the platform, the integration of a simple controller around the filter for hovering experiments, and the specifications of the environment in which the experiments were carried out.

3.1 Platform

Fig. 3: Hummingbird quadrotor from AscTec with mounted Gumstix™ Overo flying over an unstructured environment.

The platform used in this work is the Hummingbird from Ascending Technologies [3] with an overall weight of 0.67 kg (see Fig. 3). This quadrotor is equipped with various sensors and an internal estimation of the roll, pitch, and yaw angles. These values are used by an onboard controller to stabilize the helicopter at a desired attitude. All IMU sensor data and the frame ID (see Sect. 2.3) are sent at a rate of about 50 Hz to a ground station via an XBee digital radio. Additionally, a monocular PointGrey FireFly MV USB 2.0 camera is mounted at the bottom side of the helicopter, gathering greyscale 752×480 pixel images. It is connected to a Gumstix™ Overo, also mounted on the helicopter, featuring an OMAP 3530 processor with a 32-bit ARM Cortex-A8 CPU clocked at 600 MHz. The images are captured at 15 fps and sent via a unidirectional UDP data stream over an 802.11n WLAN connection to the ground station, where the Visual SLAM algorithm and the filter are implemented.


The ground station features an Intel Core 2 Duo 2.66 GHz processor running a Linux OS. The SLAM is configured such that if parts of an image are not transmitted correctly, the image is discarded. The scale ambiguity of the monocular vision system is not tackled in this work: the image scale factor is set by hand, and the environment is chosen sufficiently small to assume the value to be constant. The helicopter-internal estimation of the yaw angle $\psi_{IMU}$ is prone to drift, which affects the rotation of the acceleration measurements. To tackle this problem, an additional filter estimates the drift using the yaw angle $\psi_{VIS}$ gathered from the Visual SLAM as a reference. The following measurement model is used, where D denotes the drift:

$$\begin{aligned} \psi_{IMU} &= \psi_{real} - D + \nu_{IMU} \\ \psi_{VIS} &= \psi_{real} + \nu_{VIS} \\ \Delta\psi &= \psi_{VIS} - \psi_{IMU} = D + \nu \end{aligned} \tag{10}$$

$\nu_{VIS}$ and $\nu_{IMU}$ are independent Gaussian distributed noise terms and can be merged into a single noise $\nu$. The estimate of the yaw angle is then

$$\hat{\psi} = \psi_{IMU} + \hat{D} \tag{11}$$

where $\hat{D}$ is the estimated drift.

$R_{IH}(\varphi, \theta, \psi)$ is then replaced by $R_{IH}(\varphi, \theta, \hat{\psi})$ in Eq. (8).
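One way to realize this drift estimator is a scalar Kalman filter with a random-walk drift model; the following sketch and its noise parameters are our assumptions, since the paper does not specify the filter type.

```python
class YawDriftFilter:
    """Scalar Kalman filter estimating the yaw drift D from
    delta_psi = psi_VIS - psi_IMU = D + nu (Eq. 10).
    D is modeled as a random walk; q and r are assumed noise values."""

    def __init__(self, q=1e-6, r=1e-3):
        self.D, self.P = 0.0, 1.0   # drift estimate and its variance
        self.q, self.r = q, r       # process and measurement noise variances

    def update(self, psi_vis, psi_imu):
        self.P += self.q                      # random-walk prediction
        K = self.P / (self.P + self.r)        # scalar Kalman gain
        self.D += K * ((psi_vis - psi_imu) - self.D)
        self.P *= (1.0 - K)
        return psi_imu + self.D               # corrected yaw, Eq. (11)
```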

3.2 Controller


Fig. 4: Closed-loop scheme of the system, including the two designed filters (position, attitude, and yaw controllers; input transformation; state and yaw estimation; SLAM tracking and mapping).

Based on the estimated position and velocity, the horizontal position and the altitude of the helicopter are stabilized by a simple PID controller expressed in the Visual SLAM frame M (see Fig. 4). The desired force vector $F^*$ is then transformed into the control inputs accepted by the onboard controller (see [4]), i.e., the desired roll $\varphi^*$, pitch $\theta^*$, and thrust $T^*$, and transmitted to the helicopter at 20 Hz via the bidirectional digital radio. The yaw angle is controlled such that the angle at initialization is held during the whole flight, i.e., there is no rotation about the vertical axis. This is realized by controlling the desired yaw angular rate $\dot{\psi}^*$ with a bang-bang controller.

Fig. 5: Experimental environment: (a) real environment, (b) distorted view from the camera with map grid, (c) map of the environment.
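As an illustration of this control structure, a minimal per-axis position PID sketch follows; the class name, gains, and sampling period are our assumptions, not values from the paper. Feeding the filter's velocity estimate into the derivative term mirrors the observation in Sect. 4 that the smooth velocity estimate allows a high derivative gain without jerky motions.

```python
import numpy as np

class PositionPID:
    """Per-axis PID on the position error in the map frame M.
    Gains and sampling period are assumed values, for illustration only."""

    def __init__(self, kp=0.3, ki=0.05, kd=0.2, Ts=0.05):
        self.kp, self.ki, self.kd, self.Ts = kp, ki, kd, Ts
        self.integral = np.zeros(3)

    def force(self, p_set, p_est, v_est):
        """Desired force vector F* from position setpoint and estimates."""
        err = p_set - p_est
        self.integral += err * self.Ts
        # Derivative term acts on the estimated velocity, which is smoother
        # than differentiating the raw position measurement.
        return self.kp * err + self.ki * self.integral - self.kd * v_est
```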

3.3 Experiment Environment

All experiments are done in an indoor laboratory. The helicopter is secured by a nylon wire attached to the top of the vehicle. The unstructured environment and the corresponding map from the Visual SLAM can be seen in Fig. 5. The map is initialized and built before the experiments start; this way, the SLAM can focus on the tracking task.

4 Experiment Results and Discussion

Figures 6 to 10 all refer to the same experiment, namely a hovering flight of 60 s around a given setpoint. The filter parameters are set based on experiments. The overall sampling period of the filter is set to $T_s = 0.015$ s, which is slightly faster than the IMU rate; this way, no IMU packages are missed due to buffering issues. The covariance matrices $R[k]$ and $Q[k]$ are assumed to be constant. The values for the acceleration were obtained from data captured while the helicopter was not moving. An estimate of the position measurement uncertainty was obtained by analyzing various experiments and finding values leading to good filter performance. The values for the system process noise were estimated in the same way.


$$R_p = I_{3\times3} \cdot 10^{-7} \qquad R_a = \begin{bmatrix} 5 & 0 & 0 \\ 0 & 5 & 0 \\ 0 & 0 & 7 \end{bmatrix} \cdot 10^{-4} \tag{12}$$

$$Q = \begin{bmatrix} I_{3\times3} \cdot 10^{-2} & 0_{3\times3} \\ 0_{3\times3} & I_{3\times3} \cdot 10^{-1} \end{bmatrix} \tag{13}$$
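In code, the covariances of Eqs. (12) and (13) can be assembled directly; a brief NumPy sketch:

```python
import numpy as np

# Measurement covariances, Eq. (12)
Rp = np.eye(3) * 1e-7                 # position (Visual SLAM)
Ra = np.diag([5.0, 5.0, 7.0]) * 1e-4  # acceleration (IMU)

# Process noise covariance, Eq. (13): jerk and angular acceleration blocks
Q = np.block([
    [np.eye(3) * 1e-2, np.zeros((3, 3))],
    [np.zeros((3, 3)), np.eye(3) * 1e-1],
])
```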

The assumption that omitting the acceleration measurement when the position measurement is also available does not seriously affect the quality of the results is confirmed to be valid. With the same data logged from the experiment in this section, two offline filters are compared, one omitting the acceleration and the other not. The differences are in the range of $10^{-4}$ m and $10^{-3}$ m/s for the position and velocity, respectively, and thus negligible.

The elapsed time between the notification that a camera frame has been captured and the instant when the position measurement is available is observed to be between 0 and 230 ms. This is the maximal delay that can be compensated by the filter and does not necessarily match the real delay; e.g., dropouts or corrupted packages in the IMU communication delay the incoming information that a frame was taken. The effect of the delay compensation is shown in Fig. 6, where the online filter is compared with an offline filter with the delay compensation turned off. The time lag between the two filters is about 150 ms.


Fig. 6: (I) Comparison between the online filter and an offline filter without delay compensation. The time lag is about 150 ms. The crosses denote the position measurements without delay. (II) Longer position measurement dropouts take place at t = 42 s (460 ms) and t = 45.5 s (410 ms).

The amount of dropouts in the position measurement with the current setup is about 16%, where a dropout is counted only if at least two frames are skipped. The dropouts do not necessarily come from the wireless connection; the Visual SLAM may also skip frames if the processor load is too high. The dropouts are irregular, and no correlation is found with other variables, such as the absolute helicopter velocity. In this experiment, the maximal number of frames skipped in a row is 4 (this occurred 6 times, e.g., at t = 42 s), causing a maximal interval of 460 ms between two position measurements. During this period, the estimate drifts due to the integration of the acceleration measurement, as seen in Fig. 6, and is corrected when the position measurement is again available.


Fig. 7: Estimated position from the filter in comparison to the non-delayed position measurements from the Visual SLAM.

Figures 7 to 9 show the estimates of the position, velocity, and acceleration in Cartesian space together with the corresponding measurement. 'SLAM w/o delay' denotes the position measurement or its derivative, respectively, without delay, i.e., at the time $t_l$ when the ID arrives at the filter; the differentiation and the time shift were done offline. 'IMU' labels the acceleration measurement. Occasional overshooting due to the delay of the measurements is observable in the position. The velocity estimate shows a smooth curve in comparison to the derivative of the position measurement. This has also been observed in the flight behavior when the derivative part of the PID was set to a high value, resulting in non-jerky motions. From the acceleration estimate, the correct rotation of the acceleration measurement can be seen.

The quadrotor can be stabilized with a simple PID controller (see Fig. 10). However, the position deviation from the setpoint has an RMS of about 20 cm in the horizontal plane and 6 cm in the vertical axis, which mainly occurs due to the overshoots in the position. We presume that better results can be achieved with more complex control techniques, which is however not the purpose of this work.



Fig. 8: Estimated velocity from the filter in comparison to the derivative of the non-delayed position measurements.

5 Conclusion and Future Work

In this paper, we designed a filter for the estimation of the position, velocity, and acceleration in Cartesian space for an aerial vehicle application. It fuses inertial and vision data. The variable delay of the dropout-susceptible position measurement is successfully compensated up to the delay of the fastest communication channel, without any a priori knowledge of the exact time delay itself. During dropouts in the wireless transmission of the camera images, the filter is updated with acceleration measurements in order to maintain an estimate of the helicopter's state. With the estimated position and velocity, an MAV can be stabilized using a simple PID controller.

Future work envisages an embedded implementation. By this means, a better knowledge of the delay is obtained, since the frame ID contained in the IMU package is no longer sent over a digital radio link but is available immediately, as is the IMU data itself. A compression of the camera images on the helicopter might be an interesting point to investigate as well. The benefits are a reduced network load, thus a possible reduction of the dropout frequency, and an increase of the frame rate; the major drawback is the additional delay introduced by the compression. The controller and the filter work separately, hence other control techniques could be investigated to improve the stability of the helicopter.



Fig. 9: Estimated acceleration from the filter in comparison to the acceleration measurements from the IMU.


Fig. 10: Position in space during a 60 s hovering flight. The cross marks the setpoint. The RMS of the deviation from the setpoint is 20 cm in the x and y axis, and 6 cm in the z axis.

Acknowledgements The research leading to these results has received funding from the European Community’s Seventh Framework Programme (FP7/2007-2013) under grant agreement n. 231855 (sFly), and from the Swiss National Science Foundation under grant agreement n. 200021 125017/1.


References

1. Abdelkrim, N., Aouf, N., Tsourdos, A., White, B.: Robust nonlinear filtering for INS/GPS UAV localization. In: Proceedings of the 16th Mediterranean Conference on Control and Automation, pp. 695-702. IEEE Control Systems Society, Ajaccio, Corsica, France (2008)
2. Armesto, L., Chroust, S., Vincze, M., Tornero, J.: Multi-rate fusion with vision and inertial sensors. In: Proceedings of the IEEE International Conference on Robotics and Automation, pp. 193-199. New Orleans, LA, USA (2004)
3. AscTec Hummingbird Quadrotor Helicopter, Ascending Technologies GmbH, http://www.asctec.de
4. Blösch, M., Weiss, S., Scaramuzza, D., Siegwart, R.: Vision based MAV navigation in unknown and unstructured environments. In: Proceedings of the IEEE International Conference on Robotics and Automation. Anchorage, Alaska, USA (2010)
5. Bryson, M., Sukkarieh, S.: Vehicle model aided inertial navigation for a UAV using low-cost sensors. In: Proceedings of the Australasian Conference on Robotics and Automation. Australian Robotics and Automation Association, Canberra, Australia (2004)
6. Cheviron, T., Hamel, T., Mahony, R.E., Baldwin, G.: Robust nonlinear fusion of inertial and visual data for position, velocity and attitude estimation of UAV. In: Proceedings of the IEEE International Conference on Robotics and Automation, pp. 2010-2016. Roma, Italy (2007)
7. Hsiao, F.H., Pan, S.-T.: Robust Kalman filter synthesis for uncertain multiple time-delay stochastic systems. Journal of Dynamic Systems, Measurement, and Control 118(4), 803-808 (1996)
8. Kingston, D.B., Beard, R.W.: Real-time attitude and position estimation for small UAVs using low-cost sensors. In: Proceedings of the AIAA 3rd Unmanned Unlimited Technical Conference, Workshop and Exhibit, AIAA-2004-6488. Chicago, Illinois, USA (2004)
9. Klein, G., Murray, D.: Parallel tracking and mapping for small AR workspaces. In: Proceedings of the 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, pp. 1-10. IEEE Computer Society, Washington, DC, USA (2007)
10. Larsen, T.D., Andersen, N.A., Ravn, O., Poulsen, N.K.: Incorporation of time delayed measurements in a discrete-time Kalman filter. In: Proceedings of the 37th IEEE Conference on Decision & Control, pp. 3972-3977. Tampa, Florida, USA (1998)
11. Lobo, J., Dias, J.: Integration of inertial information with vision towards robot autonomy. In: Proceedings of the IEEE International Symposium on Industrial Electronics, pp. 825-830. Guimaraes, Portugal (1997)
12. Nützi, G., Weiss, S., Scaramuzza, D., Siegwart, R.: Fusion of IMU and vision for absolute scale estimation in monocular SLAM. In: International Conference on Unmanned Aerial Vehicles. Dubai (2010)
13. Rehbinder, H., Ghosh, B.K.: Pose estimation using line-based dynamic vision and inertial sensors. IEEE Transactions on Automatic Control 48(2), 186-199 (2003)
14. Sukkarieh, S., Nebot, E.M., Durrant-Whyte, H.F.: A high integrity IMU/GPS navigation loop for autonomous land vehicle applications. IEEE Transactions on Robotics and Automation 15(3), 572-578 (1999)
15. Van der Merwe, R., Wan, E., Julier, S.: Sigma-point Kalman filters for nonlinear estimation and sensor-fusion: Applications to integrated navigation. In: Proceedings of the AIAA Guidance, Navigation, and Control Conference and Exhibit. Providence, Rhode Island (2004)
16. Yun, B., Peng, K., Chen, B.: Enhancement of GPS signals for automatic control of a UAV helicopter system. In: Proceedings of the IEEE International Conference on Control and Automation, pp. 1185-1189. IAENG, Hong Kong (2007)
