JOURNAL OF MULTIMEDIA, VOL. 1, NO. 5, AUGUST 2006


Two-Stage PCA Extracts Spatiotemporal Features for Gait Recognition

Sandhitsu R. Das, Robert C. Wilson, Maciej T. Lazarewicz and Leif H. Finkel
Department of Bioengineering, University of Pennsylvania, Philadelphia, PA, USA
Email: [email protected]

Abstract— We propose a technique for gait recognition from motion capture data based on two successive stages of principal component analysis (PCA) on kinematic data. The first stage of PCA provides a low-dimensional representation of gait. Components of this representation closely correspond to particular spatiotemporal features of gait that we have shown to be important for visual recognition of gait in a separate psychophysical study. A second stage of PCA captures the shape of the trajectory within the low-dimensional space during a given gait cycle across different individuals or gaits. The projection space of the second stage of PCA has distinguishable clusters corresponding to the individual identity and type of gait. Despite the simple eigen-analysis based approach, promising recognition performance is obtained.

Index Terms— Gait recognition, principal component analysis, motion features.

I. INTRODUCTION

Human gait, and the characteristic movement patterns of animals in general, provide a challenging paradigm for studying spatiotemporal pattern recognition. These movements are characterized by unique patterns of relative motion of different body parts. Motion of each part is articulated, yet constrained by the laws of biomechanics. In the biological vision literature, it is an open question what kinds of intermediate- to high-level spatiotemporal features drive recognition of biological motion. Gunnar Johansson was the first to systematically study biological motion recognition by the visual system [1]. He introduced what are known as point-light displays (PLD). Human subjects wearing dark attire were filmed walking and running in a dark room with lights attached to their body joints, so that only the discrete movements of the lights were visible. He showed that observers were able to identify human gait from the motion of these point-lights. Subsequently, researchers have shown that in addition to discriminating different gaits (walking, running, limping), observers could distinguish the identity [2], gender [3], and emotion [4] of the walker as well. Even though structural information is minimized in a point-light video – there is little form information in a single static frame, particularly when distracting noise dots are added to the display – it has been suggested that both form- and motion-based features are used in recognition [5]. Also, observers can detect the presence of a walker within a fraction of the gait cycle [6].

© 2006 ACADEMY PUBLISHER

Gait recognition has been a longstanding research topic in computer vision, both using marker data ([7], [8]) as in PLD and using full-body videos ([9], [10], [11]). In general, interest in human motion analysis and recognition has come from several different application domains. Vision-based technologies such as smart surveillance systems ([12], [13]), intelligent human-machine interfaces [14], and biometrics using gait [15] are examples of computer vision applications that make use of human motion analysis. In biomechanics, kinesiologists develop models of human body movements and use gait analysis to treat patients with gait disorders and to increase movement efficiency [16]. Reviews of early work in human motion analysis can be found in [17] and [18]. Gait recognition-based person identification research in computer vision advanced recently with the HumanID project [19], as reviewed by [15] and [12]. Most gait recognition research has concentrated either on image processing/machine learning methodology, or on psychophysical aspects of the perception of biological motion (such as the role of temporal information [20], or the contributions of structural vs. kinematic cues [2]). A key to progress will be to identify human psychophysical mechanisms and incorporate efficient versions of these mechanisms into recognition systems [10]. In psychophysical studies [21], we have found evidence suggesting that the perception of gait depends upon the detection of specific "motion features" that characterize the relative motion of body parts. Focusing on the legs, we identified three particular motion features: (1) anti-symmetrical movement of the thighs, (2) knee flexion followed by extension during the swing phase of gait, and (3) relative absence of knee movement during the pivot phase compared to the swing phase of gait. Interestingly, these three features are the core features of Cutting's [22] algorithm for generating realistic-looking point-light displays of human gait.
While the visual system most likely makes use of additional motion features, these three motion features define a low-dimensional manifold of motion parameters, as described by orientation angles of limb segments, and the shape of this manifold can be used for accurate recognition and classification of gait. Such manifolds, commonly known as cyclograms [23], have been shown to be sensitive descriptors of gait in biomechanics [24] and computer vision [25] applications.


We tested human observers' sensitivity to perturbations in these motion features in upright vs. inverted point-light walker displays. Since it is well known that display inversion impedes recognition [26], we hypothesized that observers should be more sensitive to features used for recognition in upright displays than in inverted displays. This is indeed what we found psychophysically [21]. In these experiments, perturbations to a variety of potential motion features were introduced. Perturbations that led to distortions of the low-dimensional cyclogram manifold were detectable at much lower thresholds than similar motion perturbations that did not distort the manifold, suggesting that the cyclogram manifold represents critical information used by the visual system for recognition. Most critically for the present study, we found that the first two principal components of gait data have close correspondence with the identified motion features [27]. It has been shown that PCA can be successfully employed to represent movement data in a low-dimensional space ([28], [29]). We can therefore use the low-dimensional manifold defined by the first few principal components to perform a second stage of principal component analysis that describes deviations in the manifold across individuals or types of gait. This second PCA stage primarily distinguishes the temporal characteristics of the motion. We demonstrate high accuracy for gait and identity discrimination tasks, and show that the two-stage PC representation yields insights into the contributions of various parameters to different recognition tasks. Preliminary versions of the results described in this paper have appeared in [30].

Figure 1. Subject running on a treadmill. The thigh angle θ is defined as the orientation of the thigh relative to the vertical – it is negative when the thigh is behind the torso and positive when the thigh is in front of the torso. The knee angle κ is defined as the joint angle at the knee – zero means a fully extended knee, and a positive angle means a flexed knee.

II. METHODS

Gait data was obtained using the ReActor2 motion capture system (http://www.ascension-tech.com). Six human subjects (3 male, 3 female) walked, jogged, ran and limped on a Quinton Hyperdrive Club Track treadmill. Limping was simulated by tying weights to one ankle of the subject. Infrared emitters placed at 30 body positions provided 3D spatial positions of markers at 33 frames/s with a spatial resolution of 3 mm. The locations of 13 major body parts (shoulders, elbows, hands, hips, knees, feet, and head) were calculated from this data and projected onto the sagittal plane. Angles and angular velocities were calculated from the data using the angle definitions in Figure 1. We concentrate on the lower limbs, as they provide more salient information on gait than other body parts [31]. An example of how the thigh and knee angles vary with time during two consecutive gait cycles is shown in Figure 2.

III. TWO-STAGE PCA

Figure 2. Thigh and knee angles over one gait cycle for running. The left and right thigh angles vary antisymmetrically. The knee flexes and then extends during the swing phase.
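The thigh and knee angles plotted in Figure 2 can be computed from sagittal-plane joint positions of the kind described above. The sketch below is illustrative rather than the authors' code; the function name, the coordinate convention (x along the walking direction, y up), and the sign handling are my own assumptions:

```python
import numpy as np

def thigh_and_knee_angles(hip, knee, ankle):
    """Thigh angle theta (vs. vertical, positive in front) and knee angle kappa
    (0 = fully extended, positive = flexed), in degrees.

    hip, knee, ankle: (x, y) sagittal-plane coordinates with x pointing in the
    walking direction and y pointing up. Hypothetical helper, not from the paper.
    """
    thigh_vec = np.asarray(knee, float) - np.asarray(hip, float)    # down the thigh
    shank_vec = np.asarray(ankle, float) - np.asarray(knee, float)  # down the shank
    # Signed angle of the thigh measured from the downward vertical (0, -1):
    theta = np.degrees(np.arctan2(thigh_vec[0], -thigh_vec[1]))
    # Knee angle: deviation of the shank from the thigh's line (0 when straight).
    shank_angle = np.degrees(np.arctan2(shank_vec[0], -shank_vec[1]))
    kappa = theta - shank_angle
    return theta, kappa

# A straight vertical leg gives theta = 0 and kappa = 0; pulling the ankle
# backward (knee flexion) makes kappa positive under this convention.
# Angular velocities could then be taken with np.gradient over the frame times.
```
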

In the first stage, we performed principal component analysis [32] on a 6-dimensional dataset consisting of left and right thigh angles, the inter-thigh angle and inter-thigh angular velocity, and the left knee angle and left knee angular velocity. Sixty seconds of data for each subject was used for each treadmill speed and type of gait (running, walking, etc.). Each data point is a 6-dimensional vector consisting of values $x_{ij}$ of variable $i$ ($= 1, 2, \ldots, 6$) at time $j$. If each such data point is denoted by $X_j = [x_{1j}, x_{2j}, \ldots, x_{6j}]$ and the principal components are denoted by $P_k$ ($k = 1..6$), then the projections $Y_{kj}$ of $X_j$ on $P_k$ are given by

$$Y_{kj} = \langle X_j, P_k \rangle \quad (1)$$

where $\langle \cdot, \cdot \rangle$ represents the dot product. For every 60-second segment, $P_k$ and $Y_{kj}$ were calculated separately.

In the second stage, the projections $Y_{1j}$ and $Y_{2j}$ of the original data onto the first two principal components were considered. $Y_{1j}$ and $Y_{2j}$ represent a "D"-shaped 2D manifold (Figure 6). In order to capture information about the temporal variability of the gait data throughout the gait cycle, the projection values for each cycle of gait were considered as a time series. If we represent the gait cycle by $N$ uniformly spaced time points, each data point $\hat{Y}^l$ constructed from the $l$-th gait cycle for the second PCA is given by

$$\hat{Y}^l = [\hat{Y}_1^l \;\; \hat{Y}_2^l] \quad (2)$$

where we have introduced the variables $\hat{Y}_1^l = [Y_{11}^l \, Y_{12}^l \ldots Y_{1N}^l]$ and $\hat{Y}_2^l = [Y_{21}^l \, Y_{22}^l \ldots Y_{2N}^l]$, which represent the time series of projections onto the first and second PCA dimensions. This is pictorially explained in Figure 3.

Figure 3. Construction of each data point $\hat{Y}^l$ for the second PCA decomposition. The manifold plotted here represents the projection of data points $X_j$ onto the first two PCs $P_1$ and $P_2$. The color represents the phase of gait. For each gait cycle $l$ divided into $N$ normalized time points, the projection values $Y_{1j}^l$ and $Y_{2j}^l$ are collated together as shown in the vertical box to construct the $2N$-dimensional vector $\hat{Y}^l$.

We now perform a second PCA decomposition with each data point $\hat{Y}^l$ representing a gait cycle. The principal components thus obtained after the second stage of PCA are denoted by

$$\hat{Q}^m = [\hat{Q}_1^m \;\; \hat{Q}_2^m] \quad (3)$$

where $m = 1..2N$ is the second-stage principal component number, $\hat{Q}_1^m = [Q_{11}^m \, Q_{12}^m \ldots Q_{1N}^m]$ is the part of component $\hat{Q}^m$ that acts on the first dimension (i.e., $\hat{Y}_1^l$), and $\hat{Q}_2^m = [Q_{21}^m \, Q_{22}^m \ldots Q_{2N}^m]$ is the part that acts on the second dimension, $\hat{Y}_2^l$. Therefore, the projection $\hat{Z}_{lm}$ of $\hat{Y}^l$ on $\hat{Q}^m$ is given by

$$\hat{Z}_{lm} = \langle \hat{Y}^l, \hat{Q}^m \rangle = \langle \hat{Y}_1^l, \hat{Q}_1^m \rangle + \langle \hat{Y}_2^l, \hat{Q}_2^m \rangle \quad (4)$$

All results presented here are based on projections onto the first three principal components of this second PCA stage, viz. $\hat{Z}_{l1}$, $\hat{Z}_{l2}$, and $\hat{Z}_{l3}$. A similar two-stage PCA method has been used for classification of spatiotemporal patterns of neuronal activity in turtle cortex [33].
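Equations (1)-(4) can be put together in a short script. The following is a minimal sketch with synthetic data standing in for the motion capture recordings; the variable names and the plain-SVD PCA are my own choices, not the authors' implementation:

```python
import numpy as np

def pca(X, n_components):
    """Plain PCA via SVD on mean-centered data; returns (components, projections)."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components]            # principal axes, shape (n_components, d)
    return P, Xc @ P.T               # projections of every data point

# Synthetic stand-in for the kinematic data: L gait cycles, N time points each,
# 6 variables per frame (the real data comes from motion capture).
rng = np.random.default_rng(0)
L_cycles, N, n_vars = 50, 20, 6
t = np.linspace(0, 2 * np.pi, N, endpoint=False)
base = np.stack([np.sin(t + k) for k in range(n_vars)], axis=1)      # (N, 6)
frames = np.concatenate([base + 0.05 * rng.standard_normal((N, n_vars))
                         for _ in range(L_cycles)])                  # (L*N, 6)

# Stage 1: PCA on the 6-D frames; keep the first two projections Y1, Y2 (eq. 1).
P, Y = pca(frames, n_components=2)

# Stage 2: concatenate the two projection time series of each cycle into one
# 2N-dimensional vector Y_hat (eq. 2) and run PCA again (eqs. 3-4).
Y_hat = Y.reshape(L_cycles, N, 2).transpose(0, 2, 1).reshape(L_cycles, 2 * N)
Q, Z = pca(Y_hat, n_components=3)    # Z[l] = (Z_l1, Z_l2, Z_l3)
print(Z.shape)                       # (50, 3)
```

In a real application the reshape step would be preceded by segmenting the data into gait cycles and resampling each cycle to the same N time points, which is what makes the representation time-normalized.
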

IV. RESULTS

A. First Stage PCA

Figure 4 and Figure 5 show the results of the first stage of PCA. The projection space has the appearance of a "D"-shaped manifold (Figure 5). The "D" is traversed once every gait cycle in the counterclockwise direction. The semicircular portion of the "D" corresponds to the swing phase and the vertical stem corresponds to the pivot phase. The first stage of PCA thus extracts a low-dimensional representation of gait with similar properties to the gait cyclogram [23-25]. In Figure 5, PC1 captures the differences in relative motion between the first and second halves of the swing phase, during which the knee flexes and extends, respectively (red and magenta points), as described by our second motion feature. PC2, on the other hand, captures the differences between the swing and pivot phases. The projection values for PC2 are near zero during pivot and go through a wide range of values during swing, indicating the relative asymmetry in knee motion between the two phases, as described by our third motion feature. Indeed, the "D" shape of the phase space of thigh and knee angle provides a canonical low-dimensional manifold representation of a gamut of gait variables. It captures the majority of the variance in the data (Figure 4b) and provides an optimal way of describing spatiotemporal relationships between gait variables. There is, in fact, a close correspondence between the motion features identified psychophysically and the principal components [27]. In Figure 4a, the loadings of PC1 on the left and right thigh angles (variables 1 and 2) are approximately equal and opposite, representing the first motion feature of antisymmetric thigh motion.
Loadings of PC1 on left thigh angle (variable 1) and left knee velocity (variable 6) are of opposite signs, meaning that when the thigh angle is negative (behind the body, as in the first half of swing), knee angular velocity is positive (flexion); similarly, when the left thigh angle is positive (in front of the body, as in the second half of swing), knee angular velocity is negative (extension). This describes the second motion feature. The projection of PC2 is only significant and positive during the swing phase and near zero during the pivot phase (see Figure 5). PC2 shows positive loadings on both left knee angle (variable 5) and inter-thigh velocity (variable 4). This translates to positive knee angle (knee movement) only during positive inter-thigh angular velocity (swing phase), describing the third motion feature of asymmetry of movement between the pivot and swing phases.

Figure 4. Results of the first stage of Principal Component Analysis on data consisting of thigh angles, inter-thigh angle and angular velocity, and knee angle and angular velocity over 9240 gait frames. (a) Loadings of the first three PCs on each of the six kinematic variables. (b) Scree plot showing the percentage of variation in the data explained by the principal components.

Figure 5. Results of Principal Component Analysis on data consisting of thigh angles, inter-thigh angle and angular velocity, and knee angle and angular velocity over 9240 gait frames. Projection of all data points onto the first two principal component (PC) axes. Different colors indicate different phases of gait. The two-dimensional manifold has a "D" shape and describes 80% of the total variance in the data. PC1 (y-axis) separates the two halves of the swing phase: projection values are negative for the first half (red points) and positive for the second half (magenta points). PC2 (x-axis) separates swing and pivot: projection values are close to zero (blue points) during pivot, signifying minimal knee motion, and are positive during swing.

B. Second Stage PCA

Figure 6 shows the variation of the projections $Y_{1j}$ and $Y_{2j}$ after the first stage of PCA. The appearance of the "D"-like manifold contains information about the type of gait, the identity of the subject, etc. This information is extracted by the second stage of PCA. We used projections in the eigenspace after two-stage PCA to perform various classification tasks. For all the results reported here, linear discriminant analysis [34] based on the first three principal components was performed.

Gait Classification: The presence of a "D"-like manifold in PC space is the signature of a gait-like motion. The "D" manifolds for walking versus running, shown in Figure 6, differ subtly in their shape characteristics. Differentiation between similar manifolds representing gaits or individuals is extracted by a second stage of PCA. Figure 7 shows the results of gait classification for four types of gait (walking, running, jogging and limping) for one subject. There are distinguishable clusters representing different gaits.

Identity Recognition: Figure 8 shows results of identity classification on 6 running subjects. The variations in the appearance of the "D" manifold and the temporal dynamics of how the manifold is traversed during a gait cycle are more subtle in this case (compare the two panels in the top row of Figure 6). Nevertheless, more than 90% accuracy in classification was obtained in most cases.
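The linear discriminant step used for these classification tasks can be sketched as follows. This is a minimal Gaussian LDA with a shared covariance, written from the standard textbook formulation rather than the authors' exact implementation; all names are illustrative:

```python
import numpy as np

def lda_fit(Z, labels):
    """Fit a minimal Gaussian LDA: per-class means plus one pooled covariance."""
    classes = np.unique(labels)
    means = np.array([Z[labels == c].mean(axis=0) for c in classes])
    # Pooled within-class covariance (lightly regularized for stability).
    resid = np.concatenate([Z[labels == c] - m for c, m in zip(classes, means)])
    cov = resid.T @ resid / len(Z) + 1e-6 * np.eye(Z.shape[1])
    return classes, means, np.linalg.inv(cov)

def lda_predict(model, Z):
    classes, means, cov_inv = model
    # The discriminant is linear in Z because the covariance is shared:
    # score_c(z) = z^T S^-1 mu_c - 0.5 * mu_c^T S^-1 mu_c  (equal priors).
    scores = Z @ cov_inv @ means.T - 0.5 * np.sum(means @ cov_inv * means, axis=1)
    return classes[np.argmax(scores, axis=1)]

# Toy usage: two well-separated clusters standing in for the Z projections.
rng = np.random.default_rng(1)
Z = np.concatenate([rng.standard_normal((30, 3)),
                    rng.standard_normal((30, 3)) + 6.0])
labels = np.repeat([0, 1], 30)
model = lda_fit(Z, labels)
acc = (lda_predict(model, Z) == labels).mean()
print(acc)   # 1.0 on this well-separated toy example
```
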

Figure 6. Manifolds in PC space after the first stage of PCA. Top row: running; bottom row: walking. Left column: subject 1; right column: subject 2. The manifold is traversed once every gait cycle in the counterclockwise direction; color represents the phase of the gait cycle. 2000 frames of gait data were used in each panel.

Figure 7. Clusters for gait classification. The classification rate was 99% in this case.

Figure 8. Recognition of different people from their running gaits. Points corresponding to gait cycles of each individual are color coded in the second-stage PC space; points for each individual are distinguishably clustered. Classification accuracy for 528 gait cycles in these 6 subjects was 96.7%.

The "D" manifold defined by the projection space of the first two PCs of the first stage contains information about gait type and identity, as borne out by the distinguishable clusters in Figure 7 and Figure 8. These clusters are manifested in the projection space of the second-stage PCs, which are constructed from projection values along the "D" manifold. Therefore, in Figure 9 we look at how information present in different parts (different phases of gait) of the "D" manifold is integrated by the second-stage PCs $\hat{Q}^1$ and $\hat{Q}^2$.

All four panels of Figure 9 plot the same points, namely the projections of each time point of the 528 gait cycles that make up the "D" manifold. However, in each panel the points are color coded according to the loadings of one second-stage PC ($\hat{Q}^1$ or $\hat{Q}^2$) on one first-stage PC projection ($Y_{1j}$ or $Y_{2j}$). For example, the top left panel has colors proportional to the loadings of $\hat{Q}^1$ on $Y_{1j}$. These values signify the relative contribution of the x/y location on the "D" manifold to the discriminative projection axes $\hat{Z}_{lm}$. Similarly for the other three panels: the top right panel has colors proportional to the loadings of $\hat{Q}^1$ on $Y_{2j}$, the bottom left to the loadings of $\hat{Q}^2$ on $Y_{1j}$, and the bottom right to the loadings of $\hat{Q}^2$ on $Y_{2j}$.

Figure 9. Loadings of the first two second-stage principal components on different parts of the "D" manifold (i.e., different times in the gait cycle) for identity recognition from running gaits. Red saturation levels indicate positive loadings and blue saturation levels indicate negative loadings; green indicates loadings close to zero. We can see that the second-stage PCs combine the variability of first-stage PC projections at different phases of the gait cycle with different weights.

Therefore, the different patterns of colors in the four panels of Figure 9 signify that the information carried by the projections $Y_{1j}$ and $Y_{2j}$ (x and y locations on the "D" manifold, respectively) is integrated with different relative weights by the PCs $\hat{Q}^1$ and $\hat{Q}^2$ to generate the eventual discriminating metrics, the projections $\hat{Z}_{lm}$. Moreover, in each panel, the variation in color as the manifold is traversed once every gait cycle at $N$ equally spaced time points indicates that different phases of gait differ in the way they contribute towards the formation of the discriminating metric. Hot and cold colors indicate positive and negative loadings, both of which signify substantial contribution towards $\hat{Z}_{lm}$ values. Neutral colors (green) indicate loadings close to zero, meaning little contribution to discrimination.

As an example of how different parts of the gait cycle contribute differently to second-stage PCs, observe that $\hat{Q}^1$ has high positive loadings on the y locations on the "D" manifold during the mid-swing part of the gait cycle (top left). This would measure the variability of limb positions around the mid-swing phase of gait. $\hat{Q}^1$ also has significant positive and negative loadings on the x locations around the bottom and top parts of the "D" manifold, respectively (top right). This would produce an estimate of the amount of tilt in the semicircular part of the "D". The second PC $\hat{Q}^2$, on the other hand, has a high positive loading on the vertical stem part of the "D" that represents the pivot phase of gait; this would translate to a measure of how the leg moves during this phase (bottom left). These observations based on the loadings of the second-stage PCs therefore reveal the relative importance of the information available in different phases of the gait cycle for a particular recognition task.

Another way to look at the discriminating ability of the PC projection spaces is depicted in Figure 10. The left panel shows the 2D discrimination space for identity recognition from running gaits. We consider four points in this space, approximately situated at the four corners of the area enclosing all the distinguishable clusters. These points, when back-projected onto the $Y$ space, reveal the different types of "D" manifolds that can be discriminated. This is exactly what is shown in the right panel, where color represents the phase of gait. Observe that the four points in the discrimination space in the left panel give rise to very different manifolds in the $Y$ space. This qualitatively captures the discriminating ability of the PC projection space.

Figure 10. The left panel shows the projection plane defined by the first two second-stage PCs that discriminate subjects from their running gait. Different colored clusters represent different people. The four orange markers denote approximate bounding points of the classification space. Their back-projections to "D" manifold space are plotted in the right panel with corresponding marker types.

It is also interesting to note that, depending on the type of gait used for identity recognition, different parts of the gait cycle and/or different properties of the manifold contain discriminating information. One way to look at the importance of different parts of the gait cycle for recognition is to use data from only part of the gait cycle in the second-stage PCA. The results for identity recognition using only half of the gait cycle are shown in Figure 11. Observe that discriminating individuals from their running gait is in general easier than from other gaits. Also, the first half of the gait cycle seems to contain more discriminating information for running than for walking or jogging (lower error rates). Note that the relatively higher error rates for certain conditions are due to only half of the gait cycle being used.

Figure 11. Average identity recognition error rates using different gaits when data were taken from the early (starting phase = 0), middle (starting phase = 0.25), and late (starting phase = 0.5) half of the gait cycle.

A similar analysis can be done for gait type recognition. In Figure 12, data was provided for different fractions of a gait cycle, with the cycle beginning at various phases of gait (e.g., start of swing, midway through swing, etc.). The initial portions of the cycle contain more discriminatory information about the type of gait than the later parts.

V. DISCUSSION

We have developed a PCA-based gait representation that generates components corresponding to a set of psychophysically identified features used in the visual recognition of biological motion. We perform a two-stage principal component analysis in which the first stage extracts salient information, in the form of a "D"-shaped manifold, concerning the relative motion of the limbs. This is followed by a second PCA stage that discriminates differences in the temporal trajectory of data points along the manifold. Despite this straightforward eigen-approach, promising recognition accuracies are obtained for classifying both gait type and identity, and informative task-specific discriminating properties emerge.

Figure 12. Error rates for gait type recognition when different starting phases and different fractions of gait cycle were used. Note that the error increases as we go towards later parts of the cycle.
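The partial-cycle analyses behind Figures 11 and 12 amount to restricting the second-stage input vectors to a window of the gait cycle before re-running the PCA. A minimal sketch, assuming the [Y1(1..N), Y2(1..N)] layout of the 2N-dimensional cycle vectors; the function name and indexing are my own:

```python
import numpy as np

def phase_window(Y_hat, N, start_phase, fraction):
    """Select the sub-range [start_phase, start_phase + fraction) of the gait
    cycle from vectors laid out as [Y1(1..N), Y2(1..N)]. Illustrative helper."""
    i0 = int(start_phase * N)
    i1 = int((start_phase + fraction) * N)
    idx = np.arange(i0, i1)
    # Keep the same time window in both projection time series.
    return Y_hat[:, np.concatenate([idx, N + idx])]

# e.g. the middle half of the cycle (starting phase = 0.25, as in Figure 11):
Y_hat = np.arange(40, dtype=float).reshape(2, 20)    # two toy cycles, N = 10
half = phase_window(Y_hat, N=10, start_phase=0.25, fraction=0.5)
print(half.shape)                                     # (2, 10)
```

The windowed vectors would then be fed to the second-stage PCA and classifier exactly as the full-cycle vectors are.
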

Joint angles, or orientation angles of the limb segments, have traditionally been used in kinesiology to describe human gait [23]. In biomechanics, angle-angle diagrams, also known as cyclograms [24], have been used for the description of gait. Tanawongsuwan and Bobick used joint angle trajectories derived from motion capture data for gait recognition [8]. In a practical computer vision application where marker data may not be available, calculation of joint angles from real images may be difficult. However, gait analysis and recognition from real images, where joint angles are extracted as a preprocessing step, have recently been attempted successfully by Yoo et al. [35] using trigonometric models and, more recently, by Su and Huang [36].

Computer vision researchers have used both traditional machine vision techniques and techniques that take advantage of the unique properties of periodic gait. Boyd and others have exploited the periodic nature of gait to define gait as a periodic sequence of events with predictable relative timings and used it for gait recognition [37-39]. Others have tried to use the spatial symmetry inherent in gait motion, for example the antisymmetric movement of the thighs, for gait recognition [10, 40, 41]. From a feature-based recognition point of view, these are probably the closest to the approach we present here.

The other aspect of our algorithm is the use of a data reduction technique, principal component analysis (PCA). Orrite-Urunuela et al. used PCA for gait recognition from silhouette data [42]. Others have used similar data representation and classification methods, such as a more generalized version of PCA [43] or locally linear embedding (LLE) [44]. Note that our manifold representation normalizes the range of the original angle variables as well as the temporal evolution of the PCs in a gait cycle. This makes the representation time-warp invariant [45] but discards potentially discriminating information such as stride length, focusing instead on the relative motion of the limbs on a normalized time scale; nevertheless, the classification performance is good. A more sophisticated classification technique in the second stage may improve performance further. Also, instead of performing a second stage of PCA, the manifolds generated after the first stage could be classified directly using other manifold recognition techniques such as locally linear embedding [44]. Another approach would be to utilize the shapes of the cyclogram manifolds directly, without performing any kind of eigen-analysis on the angle data. For example, the "D"-shaped cyclogram defined by thigh angle vs. knee angle varies among subjects and between different gaits [46]; a shape recognition algorithm could be applied directly to these cyclograms in order to perform recognition [47]. The set of variables we used is a reduced set that describes the relative motions of the limb segments; including more variables may also improve performance.
Our dataset is relatively small, and for large-scale biometric applications the representations and/or algorithms may have to be extended, for example by including more form information, to achieve maximal performance. Also, we have used motion capture data, with the assumption that accurate joint position and gait cycle data are available. Our intention is to present a proof of concept based on the use of perceptually salient information. This two-stage analysis can be applied to any spatiotemporal sequence dataset, and future work may reveal that this approach has utility for a wider class of problems in motion-based recognition.

ACKNOWLEDGMENT

We thank the Center for Human Modeling and Simulation for use of the ReActor2 system. This research was supported by the DoD Multidisciplinary University Research Initiative (MURI) program administered by the Office of Naval Research under grant N00014-01-1-0625.

REFERENCES

[1] G. Johansson, Visual perception of biological motion and a model for its analysis, Percept Psychophys, 14, 201-11 (1973).


[2] N. F. Troje, C. Westhoff, M. Lavrov, Person identification from biological motion: effects of structural and kinematic cues., Percept Psychophys, 67(4), 667-75 (2005).

[17] Aggarwal J. .. K. .., Cai Q..., Human motion analysis: a review, Nonrigid and Articulated Motion Workshop, 1997 Proceedings , IEEE, , pp. 90-102 (1997).

[3] J. W. Davis, H. Gao, An expressive threemode principal components model for gender recognition., J Vis, 4(5), 362-77 (2004).

[18] D. M. Gavrila, The Visual Analysis of Human Movement: A Survey, Computer Vision and Image Understanding, 73(1), 82-98 (1999).

[4] A. P. Atkinson, W. H. Dittrich, A. J. Gemmell, A. W. Young, Emotion perception from dynamic and static body expressions in point-light and full-light displays, Perception, 33(6), 717-46 (2004).

[5] M. A. Giese, T. Poggio, Neural mechanisms for the recognition of biological movements, Nat Rev Neurosci, 4(3), 179-92 (2003).

[6] M. Pavlova, W. Lutzenberger, A. Sokolov, N. Birbaumer, Dissociable cortical processing of recognizable and non-recognizable biological movement: analysing gamma MEG activity, Cereb Cortex, 14(2), 181-8 (2004).

[7] A. Y. Johnson, A. F. Bobick, A Multi-View Method for Gait Recognition Using Static Body Parameters, LNCS, 2091, 301-11 (2001).

[8] R. Tanawongsuwan, A. Bobick, Gait recognition from time-normalized joint-angle trajectories in the walking plane, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2001) (2001).

[9] R. Collins, R. Gross, J. Shi, Silhouette-based Human Identification from Body Shape and Gait, Proc. IEEE Conf. on Automatic Face and Gesture Recognition, pp. 366-71 (2002).

[10] J. B. Hayfron-Acquah, M. S. Nixon, J. N. Carter, Automatic Gait Recognition by Symmetry Analysis, Pattern Recognition Letters, 24(13), 2175-83 (2003).

[11] L. Lee, W. E. L. Grimson, Gait Analysis for Recognition and Classification, Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition (FGR'02) (2002).

[12] N. V. Boulgouris, D. Hatzinakos, K. N. Plataniotis, Gait recognition: a challenging signal processing technology for biometric identification, IEEE Signal Processing Magazine, 22(6), 78-90 (2005).

[13] A. Hampapur, L. Brown, J. Connell, A. Ekin, N. Haas, M. Lu, H. Merkl, S. Pankanti, Smart video surveillance: exploring the concept of multiscale spatiotemporal tracking, IEEE Signal Processing Magazine, 22(2), 38-51 (2005).

[14] M. Turk, Visual interaction with lifelike characters, Proceedings of the Second International Conference on Automatic Face and Gesture Recognition, pp. 368-73 (1996).

[15] M. S. Nixon, J. N. Carter, Advances in automatic gait recognition, Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition, pp. 139-44 (2004).

[16] T. Calvert, A. Chapman, Analysis and synthesis of human movement, in Handbook of Pattern Recognition and Image Processing (Academic Press, San Diego, 1994).

[19] S. Sarkar, P. J. Phillips, Z. Liu, I. R. Vega, P. Grother, K. W. Bowyer, The humanID gait challenge problem: data sets, performance, and analysis, IEEE Transactions on Pattern Analysis and Machine Intelligence, 27(2), 162-77 (2005).


[20] H. Hill, F. E. Pollick, Exaggerating temporal differences enhances recognition of individuals from point light displays, Psychol Sci, 11(3), 223-8 (2000).

[21] S. R. Das, M. T. Lazarewicz, R. C. Wilson, L. H. Finkel, Motion Features for Gait Recognition and Prediction, submitted (2005).

[22] J. E. Cutting, A program to generate synthetic walkers as dynamic point-light displays, Behav Res Methods Instrum, 10, 91-4 (1978).

[23] D. W. Grieve, The assessment of gait, Physiotherapy, 55(11), 452-60 (1969).

[24] A. Goswami, A new gait parameterization technique by means of cyclogram moments: application to human slope walking, Gait Posture, 8(1), 15-36 (1998).

[25] J.-H. Yoo, M. S. Nixon, C. J. Harris, Model-driven statistical analysis of human gait motion, Proceedings of the IEEE International Conference on Image Processing, 1, pp. 285-8 (2002).

[26] M. Pavlova, A. Sokolov, Orientation specificity in biological motion perception, Percept Psychophys, 62(5), 889-99 (2000).

[27] S. R. Das, M. T. Lazarewicz, L. H. Finkel, Insights into Biological Motion Recognition from Principal Component Analysis of Human Gait, Proceedings of BMES 2004 (2004).

[28] A. Daffertshofer, C. J. Lamoth, O. G. Meijer, P. J. Beek, PCA in studying coordination and variability: a tutorial, Clin Biomech (Bristol, Avon), 19(4), 415-28 (2004).

[29] N. A. Borghese, L. Bianchi, F. Lacquaniti, Kinematic determinants of human locomotion, J Physiol, 494(Pt 3), 863-79 (1996).

[30] S. R. Das, R. C. Wilson, M. T. Lazarewicz, L. H. Finkel, Gait Recognition by Two-Stage Principal Component Analysis, Proceedings of the 7th IEEE International Conference on Automatic Face and Gesture Recognition (2006).

[31] T. Todd, Perception of Gait, J Exp Psychol Hum Percept Perform, 9(1), 31-42 (1983).

[32] J. E. Jackson, A User's Guide to Principal Components (John Wiley and Sons, Inc., 1991).

[33] Z. Nenadic, B. K. Ghosh, P. S. Ulinski, Modeling and estimation problems in the turtle visual cortex, IEEE Trans Biomed Eng, 49(8), 753-62 (2002).

[34] W. J. Krzanowski, Principles of Multivariate Analysis (Oxford University Press, 1988).


[35] J. Yoo, M. Nixon, C. Harris, Extracting Gait Signatures based on Anatomical Knowledge, BMVA Symposium on Advancing Biometric Technologies 2002 (2002).

[36] H. Su, F. Huang, Human Gait Recognition Based on Motion Analysis, Proceedings of the 2005 International Conference on Machine Learning and Cybernetics, 7, pp. 4464-8 (2005).

[37] J. E. Boyd, Video phase-locked loops in gait recognition, Proceedings of the Eighth IEEE International Conference on Computer Vision (ICCV 2001), 1, pp. 696-703 (2001).

[38] J. E. Boyd, J. J. Little, Phase in model-free perception of gait, Proceedings of the IEEE Workshop on Human Motion, pp. 3-10 (2000).

[39] A. N. Rajagopalan, R. Chellappa, Higher-order Spectral Analysis of Human Motion, ICIP 2000, III, pp. 230-3 (2000).

[40] R. Cutler, L. Davis, Robust periodic motion and motion symmetry detection, IEEE CVPR 2000 (2000).

[41] L. Havasi, Z. Szlavik, T. Sziranyi, Pedestrian detection using derived third-order symmetry of legs, ICCVG 2004 (2004).

[42] C. Orrite-Urunuela, J. M. del Rincon, J. E. Herrero-Jaraba, G. Rogez, 2D silhouette and 3D skeletal models for human detection and tracking, Proceedings of the 17th IEEE International Conference on Pattern Recognition, 4, pp. 244-7 (2004).

[43] P. C. Cattin, Biometric Authentication System Using Human Gait, PhD thesis, Swiss Federal Institute of Technology (2002).

[44] H. Li, C. Shi, X. Li, LLE Based Gait Recognition, Proceedings of the 2005 International Conference on Machine Learning and Cybernetics, 7, pp. 4516-21 (2005).

[45] J. J. Hopfield, C. D. Brody, What is a moment? "Cortical" sensory integration over a brief interval, Proc Natl Acad Sci U S A, 97(25), 13919-24 (2000).

[46] S. R. Das, M. T. Lazarewicz, R. C. Wilson, L. H. Finkel, Sensitivity to motion features in upright and inverted point-light displays [Abstract], J Vis, 6(6).

[47] R. C. Wilson, S. R. Das, L. H. Finkel, Motion as Shape: A Novel Method for Recognition and Prediction of Biological Motion, Proceedings of the British Machine Vision Conference (2006).

Sandhitsu R. Das received his B.Tech. and M.Tech. degrees from the Indian Institute of Technology, Kanpur, India, in 1997 and 1999, respectively, and the Ph.D. degree from the University of Pennsylvania, Philadelphia, in 2006. He is currently a postdoctoral fellow at the University of Pennsylvania. His research interests are human and machine vision, object recognition, and medical image processing.

Robert C. Wilson obtained B.A. and M.Sci. degrees in Chemistry from Cambridge University in 2002 and an M.S. in Bioengineering from the University of Pennsylvania in 2003. Since then he has been working towards his Ph.D. in Bioengineering at the University of Pennsylvania, investigating biologically plausible neural networks for gait recognition and Bayesian inference.


Maciej T. Lazarewicz received his M.D. in 1999 and M.S. in Mathematics in 1995 from the University of Gdansk in Poland. He held a postdoctoral appointment in computational neuroscience at George Mason University with Giorgio Ascoli, and is currently a Research Associate at the University of Pennsylvania. His current research combines theoretical and experimental approaches to investigate dynamical processing in neural circuits and behavior. Specific interests include biological motion recognition, schizophrenia, attention, and cortical circuitry. Leif H. Finkel is a Professor of Bioengineering at the University of Pennsylvania interested in computational modeling of cortical circuitry in normal perception and in disease states including schizophrenia and epilepsy. He received an M.D. and a Ph.D. from the University of Pennsylvania, then worked with Dr. Gerald Edelman on the theory of Neural Darwinism at Rockefeller University. The Neuroengineering Research Laboratory at Penn carries out modeling and translational studies directed at a range of perceptual, cognitive, and clinical aspects of brain disease. Further information is available online at www.neuroengineering.upenn.edu.
