Trajectory Estimation of Bat Flight Using a Multi-View Camera System

Matt J. Bender∗, Hunter G. McClelland†, Gerardo Bledt‡, Andrew Kurdila§, Tomonari Furukawa¶, and Rolf Mueller‖
Virginia Tech, Blacksburg, VA, 24060, USA

The high maneuverability and stability of biological fliers have re-ignited research in flapping-wing flight mechanics over the last ten years. Research to date has studied the kinematics, dynamics, and aerodynamics of bats to determine how they maneuver. This research requires motion capture studies of a specimen in flight to determine wing and body motion. A key difficulty in using motion capture techniques is frequent point occlusion, which is caused by the highly articulated nature of bat wings during flight. In this paper we present extended and unscented Kalman filter algorithms derived for multi-view motion capture experiments with highly redundant camera configurations. Two experimental data sets are studied. The first contains bat ear motion capture data obtained using two low-distortion cameras; this data is used to validate the Kalman filter algorithms. The second contains bat flight data captured with five high-distortion cameras, which is used for inertial trajectory reconstruction.

∗PhD Student, Mechanical Engineering, 144 Durham Hall, Blacksburg, VA 24060, AIAA Student Member
†PhD Student, Mechanical Engineering, 144 Durham Hall, Blacksburg, VA 24060
‡BS Student, Mechanical Engineering and Computer Science, 144 Durham Hall, Blacksburg, VA 24060
§W. Martin Johnson Professor, Mechanical Engineering, 141 Durham Hall, Blacksburg, VA 24060
¶Professor, Mechanical Engineering, 225 Goodwin Hall, Blacksburg, VA 24060
‖Associate Professor, Mechanical Engineering, 312 ICTAS II, 1075 Life Science Circle, Blacksburg, VA 24060; Director and Taishen Professor, SDU-VT International Laboratory, Shandong University, Jinan, Shandong, China
Nomenclature

α = Flap Angle
˜(·) = Measurement
A = State Transition Matrix
C = Coriolis Matrix
d = Distance
f = Frequency
h(·, ·) = Observation Function
ĥ(·) = Measurement Model
H^c_0 = Homogeneous Transform from Inertial Basis (0) to Camera Basis (c)
Ĥ = Jacobian of the Measurement Model
i = Point Index Variable
I = Identity Matrix
j = Camera Index Variable
K = Camera Calibration Matrix
λ = Z-coordinate of Feature Point in Camera Basis
m = Number of Cameras
M = Mass Matrix
n = Number of Points
Q_k = Correction Covariance at time step k
φ = Pixel Coordinates of Feature Point
Π₀ = Canonical Projection Matrix
R_k = Prediction Covariance at time step k
Σ = State Covariance Matrix
τ = Joint Torque Vector
θ = Joint Angle Vector
UT = Unscented Transform
v = Point Velocity
V = Potential Energy Function
v = Measurement Noise Vector
w = Process Noise Vector
x = Point in Inertial Coordinates
x_k = State Vector [3n × 1] at time step k
X_k = Matrix of Sample Points
y_k = Observation Vector [2nm × 1] at time step k
Y_k = Measurement of Predicted Sample Points
I. Introduction
A number of investigators have studied the complexity of flapping flight in bats, where experiments are based on placing markers on bat wings and subsequently collecting video of various flight regimes.1,2,3,4,5 The video is used to reconstruct three-dimensional trajectories of the fiducial markers in inertial space. These trajectories are subsequently used to study the complexity of the highly articulated motion observed during bat flight,6 and they are likewise used to construct boundary conditions for high-fidelity simulations of the fluid dynamics surrounding flapping wings.7,8,9,10

In this paper we address one particular difficulty that arises during the measurement and characterization of complex, articulated bat motion from multiple image sequences recorded from different viewpoints. Typically, trajectories of fiducial markers experience periodic intervals during which they cannot be seen from the viewpoint of a single camera, or even from the viewpoints of a limited number of cameras. This problem of self-occlusion is pervasive in imaging experiments of large-amplitude, highly articulated bat
motion. Evidence suggests that capture volume optimization accounting for both resolution and occlusion is more sensitive to occlusion than to camera resolution.11 Examples of such self-occlusion are evident in Figure 1a.
Figure 1. Point occlusions are common in bat flight. (a) Two overlaid camera views of a single bat in flight. Top left: a downward-facing camera which can see the back of the bat and the top of the wing in most frames. Bottom left: a side-facing camera which can periodically view the top, bottom, and side of the bat, but never sees the back. (b) Looking down the flight tunnel; 15 cameras are visible in this test.
Over the past two years, the investigators have established an experimental setup for capturing bat flight motion that is based on 30 high-speed, low-cost, low-resolution video cameras (GoPro Hero 3+ Black, 120 fps). The multi-camera imaging facility is depicted in Figure 1b. With this setup, the flight of a bat can be captured over a distance of about 4 meters. The cameras are arranged on the wall of a cylindrical flight tunnel and capture the flying bat from various viewing directions. In this way, even large deformations or articulations of the wings do not lead to total occlusion of any wing parts. The authors have developed experimental protocols wherein roughly 180 distinct marker points are distributed over the upper and lower wing surfaces, the body, and the head of the bats to provide high-resolution fiducial markers for motion studies. Figure 1a illustrates the initial form of the data obtained during a typical experiment, from the point of view of two different cameras. Since the facility includes 30 cameras, the loss of observations due to self-occlusion that occurs during large-displacement, multi-body, articulated motion of bat wings is minimized.

Raw video sequences such as those depicted in Figure 1a are first post-processed to identify correspondence of fiducial markers among the highly redundant collection of cameras. Pixel coordinates of the identified feature points from redundant camera views are used to estimate inertial trajectories of fiducial markers in three spatial dimensions. The inertial trajectories, in turn, serve as the input for identification of a full-motion, multi-body dynamics model. By employing a recursive representation of the motion model, an efficient numerical formulation of system kinematics and dynamics is achieved. Sample trajectories obtained from the identified motion of the bat wing are depicted in Figure 2a. Figure 2b plots the identified angle motion at the elbow when only a few cameras are used for motion identification. These plots were created using data obtained from researchers at Brown University, as noted in the acknowledgments. This last plot illustrates some of the inherent difficulties that the proposed experimental facility has been designed to address. While the overall motion can appear qualitatively correct, as in Figure 2a, the detailed plot of joint motion illustrates the substantial variance from the mean due to measurement noise. In addition, substantial portions of the trajectory cannot be identified owing to a lack of observability of feature points during some portions of the flight trajectory.

The goal of this research is to derive and test a general methodology for the estimation of fiducial point markers on the wings of bats in flight using a highly redundant, multi-view camera imaging system. We accomplish this goal by developing a motion model and sensor model for use in extended and unscented
[Figure 2 appears here: (a) reconstruction of a bat wing in motion, a 3D plot (X, Y, Z in cm) with the humerus, radius, digits 3-5, and body labeled; (b) angular displacement of the shoulder (deg) versus timestep, with a gap labeled "Point Occlusion".]
Figure 2. Identification of Flapping Motion. Left: a rigid-link, open, kinematic-chain robotic model has been fit to the point cloud generated by 3D reconstruction of the motion capture data. Right: one of the identified shoulder angles. The line is a smooth curve fit to the experimental data; breaks in this curve fit indicate the regions which could not be identified due to point occlusion.
Kalman filters. We begin in Section II by presenting the full equations of motion of the bat, along with a discussion of the difficulties in obtaining various terms in this class of motion models. Then, in Section III.A, we make some simplifying assumptions to arrive at a motion model which is practically implementable. In Section III.B we derive a sensor model which assumes that radial camera distortion is removed prior to incorporation of tracked image points in the Kalman filters. The update laws for the extended and unscented Kalman filters are then presented in Sections III.C and III.D, respectively. These filters perform seamless data fusion of multiple observations of the fiducial markers to yield estimates of inertial trajectories. Our methods are novel in that they use many low-cost cameras, as opposed to a few high-precision cameras, to estimate bat flight. As discussed previously, for the highly occluded motions of bat flight, more viewing angles are preferable to more resolution. Ultimately, we wish to understand the kinematics and dynamics of complex, articulated, multi-body, flapping-wing flight, and this paper lays a foundation for the numerical methods used to process raw motion capture footage into inertial trajectories.
II. Dynamics of Bat Flight
While self-occlusion is an inherent issue that makes our problem difficult, bat flight is also difficult to study because of its complex and uncertain dynamics. The full dynamics of the bat include complex aerodynamic loads,12 anisotropic wing membrane properties,13 and flexible wing bones.14 To begin modeling bat flight, a key simplifying assumption must be made: the bat bioskeletal system can be represented as a collection of rigid bodies interconnected by ideal joints. Under this assumption, Equations 1 and 2 are prototypical of models that govern the dynamics of a bat in flight:

M(θ)θ̈ + C(θ, θ̇)θ̇ + ∂V(θ)/∂θ = τ_a(θ, θ̇) + D(θ)τ_c   (1)

y = h(θ, θ̇)   (2)

For an n degree-of-freedom model of bat flight, these equations consist of an n × n nonlinear generalized mass matrix M(θ), an n × n nonlinear matrix C(θ, θ̇) used to form centripetal and Coriolis contributions, a scalar potential energy function V(θ), the n-vector τ_a(θ, θ̇) of generalized forces due to aerodynamic loads, the n × n nonlinear control influence matrix D(θ), the input forces and torques τ_c, and the n-vector of generalized coordinates θ. The second equation above, Equation 2, expresses how the Cartesian outputs y depend in a nonlinear fashion on the generalized coordinates θ and their derivatives θ̇. The specific form of
the entries of Equation 1 has been discussed in control problems15,16 and in the development of bat-inspired robotic mechanisms.17,18 Roughly speaking, the contributions to the equations of motion fall into two categories: geometric and aerodynamic. The mass matrix, centripetal matrix, potential energy, and control influence matrix arise from the underlying problem geometry. That is, these terms can be understood as coordinate realizations of an evolution law on a suitably defined smooth manifold, and methods for their determination are well known.19,20 On the other hand, the generalized forces τ_a depend inherently on the particular model used to represent flapping-wing aerodynamics. Many alternatives have been studied for these aerodynamic contributions; an excellent survey of low-order models used to represent the aerodynamics of flapping flight exists.21

In principle, the governing Equations 1 and 2 could be used as a motion model for the predictor in a filtering estimate of the inertial trajectories of the fiducial markers. Several issues, however, make a direct implementation of this form of the equations of motion intractable: τ_a and τ_c are unknown, non-negligible, and unmeasurable in real time. In recent years, computational fluid dynamics analysis and particle image velocimetry (PIV) have been conducted on models of bats and on living specimens to study their aerodynamics. Such studies provide estimates of aerodynamic contributions from computationally intensive batch calculations; these methods are not feasible for on-line estimation of aerodynamic loads. Additionally, EEG and other tests have been conducted to determine the control forces applied by the muscles during flight maneuvers. Researchers at Brown University have conducted tests which show that the stiffness of the wing membrane is actively controlled during flight.22 Like the aerodynamics studies, however, these results only give an approximation of the actual values of the controls during flight and, again, are not suitable for building on-line estimates. Since the aerodynamic and control contributions are significant and unknown, a different motion model will be used for the Kalman filters developed in this paper.
III. Trajectory Estimation of Bat Flight

Motion tracking has been used widely in recent years to study articulated motion of animals because it can estimate posture without the use of mechanical sensors which can constrain motion. Much of this work relies on tracking fiducial markers in image space and combining multiple observations using Kalman or particle filtering.23,24 This section develops the motion and sensor models for Kalman filtering and then explicitly details the extended and unscented Kalman filter algorithms.

A. Motion Model
Human motion studies commonly use a random-walk model23,24 due to difficulties, similar to those discussed in Section II, in implementing the full dynamics. For our implementation of Kalman filters for bat flight, we use a random-walk model as well:

x_{k+1} = x_k + w   (3)
The motion model does not predict the direction of motion, but rather assumes that the state does not change dramatically over a typical time step, which is a valid assumption for systems with inertia. While this motion model has a large degree of uncertainty, a conservative estimate for the noise in the system can be determined by careful study of bats during flight. Our experiment was conducted with great Himalayan leaf-nosed bats (Hipposideros armiger). The specimen used in this experiment had a wingspan of 0.5 m. During straight and level flight, it flaps its wings at a rate of 4 Hz through an angle of 90 degrees, producing a body velocity of approximately 3 m/s. The cameras used to capture the motion were recording at 120 fps. From this information, we can develop an estimate of the largest motion which will occur between frames:

d_m = √(d_v² + d_f²) = √( (v_f/f_c)² + (α_f l_w f_f/f_c)² )   (4)

In Equation 4, d_m is the estimated distance a point can travel between frames, d_v is the distance due to forward velocity, d_f is the distance due to flapping, v_f is the velocity of flight, f_c is the camera frame rate, α_f is the flap angle, l_w is the length of the wingspan, and f_f is the frequency of the flap cycle. Assuming this motion is possible
in all directions, that there is no cross covariance, and specifying a confidence interval of 95% yields the covariance

R_k = (d_m² / 7.8147) I   (5)

where I is the n × n identity matrix and 7.8147 is a scaling factor for a 95% confidence interval of a 3-DOF χ² cumulative distribution. This covariance is used in the prediction step of the Kalman filters presented later; note that this matrix is constant. The ear motion data uses the same formulation, but with different characteristic parameter values: the flight speed is zero, the ear length is 0.05 m, the ear motion occurs at a frequency of 10 Hz, the ear bends through an angle of 90°, and the camera frame rate is 250 fps.
B. Sensor Model
The next portion of the probabilistic filter incorporates the sensor model, which is used in the correction step of the algorithm. While the sensor model for a camera is nonlinear, there are well-established equations for projecting a point x in inertial space into camera pixel coordinates.25 The location of the feature in image space, φ, is known to be

φ = [ψ_x, ψ_y]^T = (1/λ) ^cK Π₀ H^c_0 x   (6)

where λ is the distance from the focal point of the camera to the point x in inertial coordinates, ^cK is the camera calibration matrix, Π₀ is the canonical projection matrix, and H^c_0 is the homogeneous transform from the inertial basis to the camera basis. In our multi-view camera system, the j-th camera observation of the i-th point is thus given by

^{c_j}φ_{p_i} = [ψ_x, ψ_y]^T_{p_i} = (1/^{c_j}λ_{p_i}) ^{c_j}K Π₀ H^{c_j}_0 x_{p_i}   (7)
Given m cameras, all ^{c_j}φ_{p_i} are stacked such that the vector form for all observations of all feature points is

h(x_k) = [ ^{c_1}φ_{p_1}^T, …, ^{c_1}φ_{p_n}^T, …, ^{c_m}φ_{p_1}^T, …, ^{c_m}φ_{p_n}^T ]^T
       = [ (1/^{c_1}λ_{p_1,k}) ^{c_1}K Π₀ H^{c_1}_0 x_{p_1,k}; …; (1/^{c_1}λ_{p_n,k}) ^{c_1}K Π₀ H^{c_1}_0 x_{p_n,k}; …; (1/^{c_m}λ_{p_1,k}) ^{c_m}K Π₀ H^{c_m}_0 x_{p_1,k}; …; (1/^{c_m}λ_{p_n,k}) ^{c_m}K Π₀ H^{c_m}_0 x_{p_n,k} ]   (8)
In this equation, h is the complete measurement function. As a result, the sensor model in the Kalman filters can be expressed with this deterministic model and additive noise as

y_k = h(x_k) + v   (9)
The noise vector in this measurement model can be approximated using the errors in the calibration parameters. The first-order approximation of the sensor model covariance is

Q_k = (∂h(x_k)/∂p) Σ_p (∂h(x_k)/∂p)^T   (10)

where p is the vector of parameters which have uncertainty (e.g., the intrinsic and extrinsic parameters of the calibration) and Σ_p is a diagonal matrix of parameter covariances. The values for Σ_p are produced by the camera calibration.26 Now that we have developed a motion model and sensor model for our experiment, we can present the update laws for the extended and unscented Kalman filters.
C. Extended Kalman Filter
The EKF update law is,
x̄_k = A_k x_{k-1} ≡ Mean Prediction   (11)
Σ̄_k = A_k Σ_{k-1} A_k^T + R_k ≡ Covariance Prediction   (12)
K_k = Σ̄_k Ĥ_k^T ( Ĥ_k Σ̄_k Ĥ_k^T + Q_k )^{-1} ≡ Kalman Gain Calculation   (13)
x_k = x̄_k + K_k ( ỹ_k − h(x̄_k) ) ≡ State Correction   (14)
Σ_k = ( I − K_k Ĥ_k ) Σ̄_k ≡ Covariance Correction   (15)
The Jacobian of the sensor model for a single camera measurement of a single point takes the standard form

∂ ^{c_j}h(x_{p_i}) / ∂x_{p_i} = (1/^{c_j}λ_{p_i}) ^{c_j}K Π₀ H^{c_j}_0 − (1/^{c_j}λ_{p_i}²) ^{c_j}K Π₀ H^{c_j}_0 x_{p_i} ẑ^T H^{c_j}_0   (16)

where ẑ = [0 0 1 0]^T. The Jacobian for a single camera, which observes multiple points, is then the block-diagonal matrix

^{c_j}Ĥ = diag( ∂ ^{c_j}h(x_{p_1}) / ∂x_{p_1}, …, ∂ ^{c_j}h(x_{p_n}) / ∂x_{p_n} )   (17)

The complete Jacobian for all sensor measurements of all points is

Ĥ = [ ^{c_1}Ĥ^T … ^{c_m}Ĥ^T ]^T   (18)
For this implementation, frequent point occlusions are expected. To account for occlusions, we omit rows of the sensor model, its Jacobian, and the observation vector. Similarly, we omit rows and columns from the sensor covariance matrix so that the dimensions remain consistent. This rescaling procedure eliminates the points which are occluded in certain camera views from the correction step of the algorithm. As long as two cameras can view a particular point, the correction will still be meaningful.
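A compact sketch of one EKF cycle with this occlusion handling is shown below. It is our own illustration of Equations 11-15 under the random-walk motion model (so A_k = I), with a boolean visibility mask standing in for the row-omission bookkeeping; the function names are ours:

```python
import numpy as np

def ekf_step(x, P, y_meas, visible, h_fun, H_fun, R, Q):
    """One EKF predict/correct cycle, Eqs. (11)-(15), dropping occluded rows.

    x, P: prior state (3n,) and covariance (3n, 3n).
    y_meas: stacked pixel observations (2nm,).
    visible: boolean mask (2nm,), True where the feature was observed.
    h_fun, H_fun: measurement function and its Jacobian (Eqs. 8, 16-18).
    R, Q: process and measurement covariances (Eqs. 5, 10).
    """
    # Prediction under the random-walk model: A_k = I (Eqs. 11-12).
    x_bar = x
    P_bar = P + R
    # Omit rows for occluded observations so dimensions stay consistent.
    H = H_fun(x_bar)[visible, :]
    y = y_meas[visible]
    y_hat = h_fun(x_bar)[visible]
    Qv = Q[np.ix_(visible, visible)]
    # Kalman gain and correction (Eqs. 13-15).
    S = H @ P_bar @ H.T + Qv
    K = P_bar @ H.T @ np.linalg.inv(S)
    x_new = x_bar + K @ (y - y_hat)
    P_new = (np.eye(len(x)) - K @ H) @ P_bar
    return x_new, P_new
```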
D. Unscented Kalman Filter
We wish to compare the performance of the extended Kalman filter to the unscented Kalman filter because the latter can often perform better in the presence of non-Gaussian noise or classes of nonlinearities.27 Using the same motion model, sensor model, and covariance functions, the UKF update law is,
X_{k-1} = UT(x_{k-1}, Σ_{k-1}) ≡ Unscented Transform of Previous Mean   (19)
X*_k = A_k X_{k-1} ≡ Prediction of Sample Points   (20)
x̄_k = Σ_{i=0}^{2n} w_m[i] X*_k[i] ≡ Mean Prediction   (21)
Σ̄_k = Σ_{i=0}^{2n} w_c[i] ( X*_k[i] − x̄_k )( X*_k[i] − x̄_k )^T + R_k ≡ Covariance Prediction   (22)
X̄_k = UT(x̄_k, Σ̄_k) ≡ Unscented Transform of Prediction   (23)
Ȳ_k = h(X̄_k) ≡ Measurement on Predicted Sample Points   (24)
ȳ_k = Σ_{i=0}^{2n} w_m[i] Ȳ_k[i] ≡ Measurement Prediction   (25)
S_k = Σ_{i=0}^{2n} w_c[i] ( Ȳ_k[i] − ȳ_k )( Ȳ_k[i] − ȳ_k )^T + Q_k ≡ Predicted Measurement Covariance   (26)
K_k = ( Σ_{i=0}^{2n} w_c[i] ( X̄_k[i] − x̄_k )( Ȳ_k[i] − ȳ_k )^T ) S_k^{-1} ≡ Kalman Gain Calculation   (27)
x_k = x̄_k + K_k ( ỹ_k − ȳ_k ) ≡ State Correction   (28)
Σ_k = Σ̄_k − K_k S_k K_k^T ≡ Covariance Correction   (29)
As in the extended Kalman filter presented above, we account for occlusions by trimming the observation vector ỹ_k and reshaping the Kalman gain and sensor covariance matrices, K_k and Q_k, respectively.

IV. Results
The EKF and UKF algorithms developed above are applied to two sets of motion capture data: ear motion and straight flapping-wing flight. The purpose of considering the ear data is to validate the performance of the Kalman filters against stereo triangulation performed on data captured using two high-quality cameras. After the performance of the filters is established for the bat ear data, the results for flight trajectory estimation are presented.

A. Bat Ear Data
The first set of motion capture data includes head and ear motions of a great Himalayan leaf-nosed bat (Hipposideros armiger) echolocating a presented sonar target. The head and ear motion was captured using two GigaView cameras with a frame rate of 250 fps and a resolution of 720×1280 pixels. These cameras use lenses with minimal distortion, so rectification of the images was not deemed necessary. For the ear motion data set, the results of both the EKF and UKF were compared to stereo triangulation. The trajectory reconstructions of all three methods are shown in Figure 3a, the error between the EKF and the stereo triangulation is shown in Figure 3b, and the error between the UKF and stereo triangulation is shown in Figure 3c. The trajectories estimated via the EKF and UKF algorithms are within 1 mm of the trajectories produced by stereo triangulation. Furthermore, the errors are approximately constant over the entire dataset. Given the similarity between the stereo triangulation and Kalman filter reconstructions, we conclude that the filters converge to the correct trajectory.

Furthermore, if the identified trajectories are projected back into image space, the error between the observation (the original point in the image) and the identified point location can be determined. Figure 4a shows the re-projection of the EKF, UKF, and stereo-triangulated trajectories on an original image. Figure 4b shows the error between these re-projections and the original image features. As shown in the figures, all three methods produce re-projections within 5 pixels of the original feature locations. Note that a 5-pixel re-projection error in image space corresponds to roughly 2 mm of uncertainty in the 3D location of the feature point. Thus, we conclude that the Kalman filter implementation developed here is satisfactory.
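This re-projection check is simple to reproduce: each estimated 3D point is projected back into every camera with the model of Equation 6 and compared against the tracked pixel locations. A minimal sketch (our own helper, reusing project_point from the listing in Section III.B):

```python
import numpy as np

def reprojection_errors(points_3d, pixels, cameras):
    """Pixel-space re-projection error for estimated inertial points.

    points_3d: (n, 3) estimated marker positions.
    pixels: dict mapping camera index -> (n, 2) tracked pixel coordinates.
    cameras: dict mapping camera index -> (K, H_c0) calibration pair.
    Returns a dict of (n,) error magnitudes per camera.
    """
    errors = {}
    for j, (K, H_c0) in cameras.items():
        proj = np.array([project_point(K, H_c0, x)[0] for x in points_3d])
        errors[j] = np.linalg.norm(proj - pixels[j], axis=1)
    return errors
```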
[Figure 3 appears here: (a) ear marker inertial trajectories (x, y, z in mm) reconstructed by EKF, UKF, and stereo triangulation for the points Ear Tip, Ear 1-4, Eye, and Head 1-3; (b) magnitude of error (mm) between EKF and stereo triangulation versus timestep; (c) magnitude of error (mm) between UKF and stereo triangulation versus timestep.]
Figure 3. EKF, UKF, and stereo triangulation inertial trajectory reconstruction. In (a), the inertial reconstructions using all three methods are shown. In (b) and (c), the errors between stereo triangulation and the EKF and UKF reconstructions, respectively, are shown. The results of the Kalman filters match the stereo reconstruction well (error < 1 mm).
[Figure 4 appears here: (a) re-projection of the ear points (image features, stereo, EKF, and UKF re-projections) on an original image; (b) re-projection error in pixels, with all methods within ±5 pixels in x and y.]
Figure 4. Reprojection error of EKF, UKF, and stereo reconstruction methods. The white ear outline in (a) is for visualization only and was not tracked using the outline points. The three head points show minimal motion, which is desirable for studying ear motion. The re-projection error in (b) is within 5 pixels in x and y.
This ear motion data was used as an initial training set for the development, implementation, and validation of the Kalman filter software and algorithms. The ultimate goal was to produce inertial trajectory estimates of marker locations for the flapping flight experiments.
B. Bat Flight Data
The second set of data describes the same species of bat flying straight and level through the flight tunnel. The flight motion was recorded with a number of GoPro Hero3+ Black cameras at a frame rate of 120 fps and a resolution of 720×1280 pixels. These cameras use the factory lens, which introduces significant distortion. The distortion was removed prior to 3D reconstruction using a standard five-parameter distortion model.28,29,30 For both data sets, feature recognition of fiducial markers was conducted prior to performing inertial trajectory reconstruction. Two preprocessing steps were necessary before the Kalman filters could be applied to the bat flight data. First, radial distortion was removed from the images, as mentioned above. Second, structure bundle adjustment was performed on the extrinsic parameters computed by the camera calibration toolbox.26

The inertial trajectories estimated by the Kalman filters are shown in Figure 5a. These trajectories are then projected into image space and the re-projection error is calculated. Again, this re-projection error is presented as an uncertainty in the 3D location of the estimated trajectories. Plots (b) through (f) of Figure 5 show the re-projection
errors for five cameras. The re-projection errors for camera 1 are notably lower than those of the other cameras. This artifact is due to the fact that each camera was calibrated with respect to camera 1; in other words, the cameras are calibrated pairwise with camera 1 as the base camera. Thus, the camera 1 frame is assumed to be the world coordinate system, and all of the extrinsic calibration error is assumed to exist in the other cameras. The re-projection errors induce uncertainties in 3D of approximately 2 cm or less for all cameras, and camera 1 has an associated uncertainty of less than 3 mm.

Additionally, we can qualitatively evaluate the estimated trajectories by projecting them into image space. Figure 6 shows the image features, the EKF re-projection, and the UKF re-projection. Figure 6a further demonstrates the accuracy of the re-projection in camera 1: the image features and estimated points are almost coincident. These figures also demonstrate the need for a large number of cameras to observe this motion. Even with 5 cameras of data, occlusion is still present in the middle of the flap cycle presented here. This portion of the data captures the upstroke of the wing, which is the most difficult portion of the flight to capture. While this work was not able to overcome the effects of occlusion entirely, it demonstrates the need for a highly redundant imaging system.
V. Conclusions
In this paper we present the formulation of trajectory estimation of bat flight from a multi-view camera system using filtering techniques. A motion model based on a random walk and a sensor model incorporating intrinsic and extrinsic calibration parameters were developed for the multi-view camera system, and the trajectory estimation was implemented with both extended and unscented Kalman filters. We utilized two data sets in this paper: ear motion and straight-and-level flapping flight. The bat ear motion data was used to validate the performance and formulation of the Kalman filters: the re-projection error of the estimated trajectories was less than 2 mm for the ear data, which validates the formulation of the filters. Inertial trajectories of fiducial markers were then determined for straight and level bat flight. The re-projection error of the flight trajectories was less than 20 mm, which is approximately four times the size of the markers used in the experiment. Future work will utilize calibration methods better suited to the multi-camera system to reduce this error further. Finally, this work has demonstrated the need for a redundant camera system to adequately capture bat flight motion.
VI. Acknowledgments
The foundation of this research was motion capture data collected at Shandong University in Jinan, China, in the summer of 2014. The authors would like to thank the following students for assistance in processing data: Alex Matta, Ma Nuo, Xiaoyan Yin, Jin Zhen, and Pei Xuan Li. The authors would also like to thank Dr. Kenny Breuer for supplying the initial test data used to develop the motion identification algorithms and to create Figure 2.
[Figure 5 appears here: (a) inertial trajectory reconstruction (x, y, z in mm) of the points Body 2, RShoulder, Elbow, Wrist, and Digits 3-5 using both EKF and UKF; (b)-(f) re-projection errors (mm in x and y) for cameras 1, 2, 3, 4, and 6.]
Figure 5. The inertial trajectories are reconstructed in (a) using both UKF and EKF methods. The blue lines are for visualization of the bat wing; the bat is flying in the negative x direction. The re-projection errors in each camera are shown in (b) through (f), where the ellipses represent 95% uncertainty ellipses for the EKF (dashed) and UKF (solid) re-projection errors. The re-projection error shown in (b) is less than 3 mm, while the re-projection errors shown in (c) through (f) are less than 20 mm.
[Figure 6 appears here: re-projections of the inertial trajectories (image features, EKF re-projection, UKF re-projection) in cameras 1 through 5, with the points Body 2, RShoulder, Elbow, Wrist, and Digits 3-5 labeled in each view.]

Figure 6. Reprojection of Inertial Trajectories. Body points follow an approximately linear trajectory. The wing points undergo nonlinear motions whose severity increases with distance from the body.
References

1 Tian, X., Iriarte-Diaz, J., Middleton, K., Galvao, R., Israeli, E., Roemer, A., Sullivan, A., Song, A., Swartz, S., and Breuer, K., "Direct Measurements of the Kinematics and Dynamics of Bat Flight," Bioinspiration & Biomimetics, Vol. 1, 2006.
2 Iriarte-Diaz, J. and Swartz, S., "Kinematics of Slow Turn Maneuvering in the Fruit Bat Cynopterus brachyotis," J. Exp. Biol., Vol. 211, 2008, pp. 3478–3489.
3 Bergou, A., Swartz, S., Breuer, K., and Taubin, G., "3D Reconstruction of Bat Flight Kinematics from Sparse Multiple Views," IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Vol. 1, IEEE, 2011, pp. 238–245.
4 Iriarte-Diaz, J., Riskin, D., Willis, D., Breuer, K., and Swartz, S., "Whole-Body Kinematics of a Fruit Bat Reveal the Influence of Wing Inertia on Body Accelerations," J. Exp. Biol., Vol. 214, 2011, pp. 1546–1553.
5 Hubel, T., Hristov, N., Swartz, S., and Breuer, K., "Changes in Kinematics and Aerodynamics over a Range of Speeds in Tadarida brasiliensis, the Brazilian Free-Tailed Bat," J. R. Soc. Interface, Vol. 9, 2012, pp. 1120–1130.
6 Riskin, D., Bahlman, J., Hubel, T., Ratcliffe, J., Kunz, T., and Swartz, S., "Bats Go Head-Under-Heels: The Biomechanics of Landing on a Ceiling," J. Exp. Biol., Vol. 212, 2009, pp. 945–953.
7 Muijres, F., Bowlin, M., Johansson, L., and Hedenstrom, A., "Vortex Wake, Downwash Distribution, Aerodynamic Performance and Wingbeat Kinematics in Slow-Flying Pied Flycatcher," J. R. Soc. Interface, Vol. 9, 2012, pp. 292–303.
8 Muijres, F. T., Spedding, G., Winter, Y., and Hedenstrom, A., "Actuator Disk Model and Span Efficiency of Flapping Flight in Bats Based on Time-Resolved PIV Measurements," Experiments in Fluids, Vol. 51, 2011, pp. 511–525.
9 Hubel, T., Swartz, S., and Breuer, K., "Wake Structure and Wing Kinematics: The Flight of the Lesser Dog-Faced Fruit Bat, Cynopterus brachyotis," J. Exp. Biol., Vol. 213, 2010, pp. 3427–3440.
10 Hedenstrom, A., Muijres, F., von Busse, R., Johansson, L., Winter, Y., and Spedding, G., "High-Speed Stereo DPIV Measurement of Wakes of Two Bat Species Flying Freely in a Wind Tunnel," Experiments in Fluids, Vol. 46, 2009, pp. 923–932.
11 Chen, X. and Davis, J., "Camera Placement Considering Occlusion for Robust Motion Capture," Stanford University Computer Science Technical Report, 2000.
12 Viswanath, K. and Tafti, D. K., "Effect of Stroke Deviation on Forward Flapping Flight," AIAA Journal, Vol. 51, 2013, pp. 145–160.
13 Swartz, S., "Mechanical Properties of Bat Wing Membrane Skin," Journal of Zoology, Vol. 239, 1996, pp. 357–378.
14 Swartz, S., Skin and Bones: The Mechanical Properties of Bat Wing Tissues, Smithsonian Institution Press, 1998, pp. 109–126.
15 Bayandor, J., Bledt, G., Dadashi, S., Kurdila, A., Murphy, I., and Lei, Y., "Adaptive Control for Bioinspired Flapping Wing Robots," American Control Conference (ACC), June 2013, pp. 609–614.
16 Dadashi, S., Gregory, J., Lei, Y., Bender, M., Kurdila, A., Bayandor, J., and Müller, R., "Adaptive Control of a Flapping Wing Robot Inspired by Bat Flight," AIAA SciTech: Guidance, Navigation, and Control, AIAA, 2014.
17 Colorado, J., Barrientos, A., Rossi, C., and Breuer, K. S., "Biomechanics of Smart Wings in a Bat Robot: Morphing Wings Using SMA Actuators," Bioinspiration & Biomimetics, Vol. 7, No. 3, 2012, 036006.
18 Bahlman, J. W., Swartz, S. M., and Breuer, K. S., "Design and Characterization of a Multi-Articulated Robotic Bat Wing," Bioinspiration & Biomimetics, Vol. 8, No. 1, 2013, 016009.
19 Spong, M. W., Hutchinson, S., and Vidyasagar, M., Robot Modeling and Control, Wiley, 2006.
20 Bullo, F. and Lewis, A., Geometric Control of Mechanical Systems, Springer, 2004.
21 Orlowski, C. T. and Girard, A. R., "Modeling and Simulation of Nonlinear Dynamics of Flapping Wing Micro Air Vehicles," AIAA Journal, Vol. 49, 2011, pp. 969–981.
22 Swartz, S. M., Groves, M. S., Kim, H. D., and Walsh, W. R., "Mechanical Properties of Bat Wing Membrane Skin," Journal of Zoology, Vol. 239, No. 2, 1996, pp. 357–378.
23 Hauberg, S., Lauze, F., and Pedersen, K. S., "Unscented Kalman Filtering on Riemannian Manifolds," Journal of Mathematical Imaging and Vision, Vol. 46, No. 1, August 2013, pp. 103–120.
24 Canton-Ferrer, C., Casas, J., and Pardas, M., "Towards a Low-Cost Multi-Camera Marker-Based Human Motion Capture System," 16th IEEE International Conference on Image Processing (ICIP), Nov. 2009, pp. 2581–2584.
25 Ma, Y., Soatto, S., Košecká, J., and Sastry, S. S., An Invitation to 3-D Vision: From Images to Geometric Models, Springer, 1st ed., 2004.
26 Bouguet, J.-Y., "Camera Calibration Toolbox for Matlab," http://www.vision.caltech.edu/bouguetj/calib_doc/index.html, Dec. 2013.
27 Julier, S. J. and Uhlmann, J. K., "A New Extension of the Kalman Filter to Nonlinear Systems," Proc. SPIE: Signal Processing, Sensor Fusion, and Target Recognition VI, Vol. 3068, 1997, pp. 182–193.
28 Zhang, Z., "Flexible Camera Calibration by Viewing a Plane from Unknown Orientations," Proceedings of the Seventh IEEE International Conference on Computer Vision, Vol. 1, 1999, pp. 666–673.
29 Clarke, T. A. and Fryer, J. G., "The Development of Camera Calibration Methods and Models," The Photogrammetric Record, Vol. 16, No. 91, 1998, pp. 51–66.
30 Heikkila, J. and Silven, O., "A Four-Step Camera Calibration Procedure with Implicit Image Correction," Proceedings of the 1997 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1997, pp. 1106–1112.