Fuzzy Shape Matching with Boundary Signatures

Mustafa, Adnan A. “Fuzzy Shape Matching with Boundary Signatures”. Pattern Recognition Letters, Volume 23, Issue 12, October 2002, 1473-1482.


Adnan A. Y. Mustafa
Kuwait University, Department of Mechanical Engineering, P. O. Box 5969 - Safat, Kuwait 13060

Email: [email protected]

Abstract. In this paper we present a fuzzy approach to shape identification based on matching boundary signatures. Boundary signatures are surface feature vectors that reflect the probability of occurrence of a boundary feature for a surface (or an object). Four types of surface boundary signatures are deployed for matching: the Angle Boundary Signature, the Curvature Boundary Signature, the Distance Boundary Signature and the Parameter Boundary Signature. These four signatures are constructed from local and global geometric shape attributes of the boundary. Matching of objects to models is accomplished by the boundary signature error, which is a weighted sum of the four boundary signature errors obtained by comparing the object's signatures to a model's signatures. A fuzzy approach is then employed to evaluate the performance of the matching. Tests conducted on simulated shapes as well as on shapes extracted from real image scenes are reported.

Keywords: Model-based vision; Shape matching; Signatures

1 Introduction
An important aspect of any robust object recognition system is that the system should not fail when partial object information is missing. A typical example is object occlusion, where only a part of the object is visible to the camera (or vision sensor). Here, the shape of the object and its boundary are no longer similar to the original shape and are hence more difficult to match and identify correctly. Such a case usually results in object mismatch and incorrect object hypotheses, leading to incorrect recognition. However, depending on how much of the boundary is missing, the object may still retain important features of its boundary, so that matching using signatures can still produce correct hypotheses.
In our previous work, Mustafa (1996) (1999), we introduced surface signatures as robust feature descriptors for surface matching. A surface signature is a feature vector that reflects the probability of occurrence of a surface feature on a given surface. Surface signatures are scale and rotation invariant. We showed that by using surface signatures correct identification was possible under partial occlusion and shadows. Two types of surface signatures were previously employed: surface curvature signatures and surface spectral signatures, which statistically represent surface curvature features and surface color features, respectively.
We recently introduced surface boundary signatures in Mustafa (2001a) (2001b) as an extension to our surface signature formulation. Four types of surface boundary signatures were deployed for matching: the Angle Boundary Signature, the Curvature Boundary Signature, the Distance Boundary Signature and the Parameter Boundary Signature. These four signatures were constructed from local and global geometric shape attributes of the boundary. The performance of each of the boundary signatures was tested and reported.

In this paper we continue our shape matching work by introducing the boundary signature error, which measures the discrepancy between different objects based on the four boundary signatures listed above. We then evaluate our matching results using a fuzzy approach that measures not only how well an object matches its correct hypothesis but also how well it matches all models of the database. Testing was conducted on simulated shapes as well as shapes extracted from real image scenes.
The remainder of this paper is divided into five sections: Section 2 presents a brief review of related literature, Section 3 briefly describes the concept of boundary signatures and presents the four boundary signatures, Section 4 describes the matching procedure employed, Section 5 presents the results of the tests conducted, and the paper concludes with Section 6.

2 Literature Review
The literature on shape analysis methods is vast (see Loncaric (1998)). Due to space limitations we present a sample of recent papers that are relevant to our work. Hong et al. (1998) presented an indexing approach to 2-D object description and recognition that is invariant to rotation, translation, scale, and partial occlusion. The scheme is based on three polygonal approximations of object boundaries from which local object structural features (lines and arcs) are extracted. Ozcan and Mohan (1997) presented a computationally efficient approach which utilizes genetic algorithms and attributed string representation. Attributed strings were used to represent the outline features of shapes. Roh and Kweon (1998) devised a contour shape signature descriptor for the recognition of planar curved objects in noisy scenes. The descriptor, consisting of five-point invariants, was used to index a hash table. Nishida (1998) proposed an algorithm for matching and recognition of deformed closed contours based on structural features. Nishida (2001) later presented a structural feature indexing method for retrieval of model shapes that have parts similar to a given query shape, but is sensitive to noise, scale and shape deformations. The contours are described by a few components with rich features. Shaked (2001) presented a weak-affine re-sampling method for polygonal approximations of smooth curves and proposed a signature scale space matching scheme for recognizing objects.


Fig. 1. The feature vectors extracted from the boundary.

Mokhtarian et al. (1996a) used the maxima of curvature zero-crossing contours of the curvature scale space image as a feature vector to represent the shapes of object boundary contours, which was at the core of the object recognition system described in (1996b). They later extended their work to cases of shallow concavities in Abbasi et al. (2000). Kovalev and Petrou (1996) extracted features from co-occurrence matrices containing descriptions and representations of some basic image structures. The extracted features express quantitatively the relative abundance of some elementary structures. Kovalev et al. (1998) later used relative distance and relative slope orientation histograms to extract three features for class discrimination of objects.
In our work we employ four types of boundary signatures that are constructed from local and global geometric shape attributes of the boundary. While some similarities exist between our approach and that of Kovalev et al., ours differs from theirs in several respects; in particular, 1) our signatures not only measure boundary distances and slope curvature but also measure boundary bending and boundary shape orientation offset, 2) rather than matching based on features extracted from the signatures, our approach is more comprehensive in that we compare the total signature profile using the metrics described in Mustafa (1996), and 3) we use a fuzzy approach to matching and identification.

3 Boundary Signatures
The boundary (Γ) of any surface (or object) consists of a finite ordered sequence of points (γi) that define the shape of the surface (or object), see Fig. 1,

Γ = { γi = (xi, yi), i = 0, …, N - 1 }    (1)

Several assumptions are made about Γ:
• Γ is closed, i.e. γ0 follows γN-1
• Γ has a single point thickness (i.e. has been thinned)
• Γ is traversed in a counter-clockwise sense (the object is to the left)
• Γ does not contain any internal holes

Surface boundary signatures are constructed from features extracted from the boundary. A surface signature is a feature vector that reflects the probability of occurrence of the feature for a given surface. In general, if S denotes a surface signature of size N, and Si denotes the ith component of S, then by definition Σi Si = 1. We employ four types of surface shape signatures, or surface boundary signatures: the Angle Boundary Signature (SAB), the Curvature Boundary Signature (SCB), the Distance Boundary Signature (SDB) and the Parameter Boundary Signature (SPB). These four signatures are constructed from local and global geometric shape attributes of the boundary (Γ), as described below. All four signatures SDB, SAB, SCB and SPB are rotation, translation and scale invariant.

3.1 Boundary Features
We extract four types of features from an object's boundary that characterize the general shape of the object (see Fig. 1):

1. The Boundary Inter-distance. This represents the distance (dij) between a pair of points, γj and γi, on Γ and is given by,

dij = d(γi, γj) = √( (xj - xi)² + (yj - yi)² )    (2)

where γi and γj have coordinates (xi, yi) and (xj, yj), respectively.

2. Chord Bending. Chord bending (βij) represents the amount of parameter bending between a pair of points on Γ. For any pair of points γi, γj ∈ Γ, the chord bending is defined as the ratio of the distance between the pair of points (dij) to the parametric distance between the pair of points (pij),

βij = β(γi, γj) = dij / pij    (3)

Since Γ is closed, two different values of p exist between any pair of points, one in the counter-clockwise direction (pij⁺) and the other in the clockwise direction (pij⁻). The shorter of the two distances is taken as the parameter distance between the two points. Hence,

pij = min( pij⁺, pij⁻ )    (4)

producing,

βij = dij / min( pij⁺, pij⁻ )    (5)

3. Local Curvature. The local curvature (φi) represents the amount of local boundary bending, computed from the local angle of the tangent,

φi = cos⁻¹( v̂i-1 · v̂i ) sign( v̂i-1 × v̂i )    (6)

where

v̂i = vi / |vi|    (7)

vi = (xi+1 - xi) i + (yi+1 - yi) j    (8)

4. Relative Orientation. The relative orientation (θij) of a point γj with respect to γi is given by,

θij = θ(γi, γj) = arctan( yj - yi, xj - xi )    (9)

The boundary features are calculated for every pair of boundary points, producing a feature matrix for every feature type, except for the local curvature, which is calculated at every boundary point. All matrices are symmetric. These features can be used effectively for recognition with occlusion. For example, plots of the distance matrix d for a circle (dcircle) and an ellipse (dellipse) are shown in Fig. 2. For illustrative purposes, each plot is shown from two viewpoints: a side view and a top view.
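The feature definitions above translate directly into code. The following is a minimal sketch (in Python/NumPy, our own illustration rather than the paper's implementation; the function name and the symbol reconstruction of Eqs. (2)-(9) are ours) that computes the inter-distance, chord-bending and relative-orientation matrices and the local-curvature vector for a thinned, counter-clockwise boundary:

```python
import numpy as np

def boundary_features(points):
    """Boundary features of Eqs. (2)-(9) for a closed, counter-clockwise
    boundary given as an (N, 2) array of (x, y) points. Returns the
    inter-distance matrix d, the chord-bending matrix beta, the
    relative-orientation matrix theta and the local-curvature vector phi."""
    pts = np.asarray(points, dtype=float)

    # Eq. (2): Euclidean inter-distance between every pair of points.
    diff = pts[None, :, :] - pts[:, None, :]          # diff[i, j] = gamma_j - gamma_i
    d = np.hypot(diff[..., 0], diff[..., 1])

    # Parametric (arc-length) distances around the closed boundary, Eq. (4):
    # take the shorter of the counter-clockwise and clockwise path lengths.
    seg = np.hypot(*(np.roll(pts, -1, axis=0) - pts).T)   # segment lengths
    s = np.concatenate(([0.0], np.cumsum(seg[:-1])))      # arc length to each point
    p_ccw = np.abs(s[None, :] - s[:, None])
    p = np.minimum(p_ccw, seg.sum() - p_ccw)

    # Eq. (5): chord bending = chord distance / parametric distance.
    beta = np.divide(d, p, out=np.zeros_like(d), where=p > 0)

    # Eq. (9): relative orientation of point j with respect to point i.
    theta = np.arctan2(diff[..., 1], diff[..., 0])

    # Eqs. (6)-(8): local curvature from the angle between consecutive unit
    # tangents, signed by the z-component of their cross product.
    v = np.roll(pts, -1, axis=0) - pts
    v_hat = v / np.linalg.norm(v, axis=1, keepdims=True)
    v_prev = np.roll(v_hat, 1, axis=0)
    dot = np.clip((v_prev * v_hat).sum(axis=1), -1.0, 1.0)
    cross = v_prev[:, 0] * v_hat[:, 1] - v_prev[:, 1] * v_hat[:, 0]
    phi = np.arccos(dot) * np.sign(cross)

    return d, beta, theta, phi
```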

Fig. 2. Plots of d for a circle, dcircle (top), and an ellipse with eccentricity 0.995, dellipse (bottom).

Because of the symmetry inherent in the shape of a circle, dcircle has the unique property of being the only profile containing diagonal lines with constant values. As a circle is stretched in a given direction, evolving into an ellipse, dcircle loses its symmetry, producing dellipse as illustrated.
In general, when a portion of an object's boundary is missing, d will change from its original profile. However, if the amount of missing boundary is insignificant then d will not change by much. This is illustrated in Fig. 3, which shows three variations of a circle with different amounts of the original boundary missing, along with their corresponding d. We see that as the amount of missing boundary increases, the similarity of d to the original d decreases. Let α denote the fraction of the boundary that is missing. When α = 0.1, dcircle_0.9 is very similar to dcircle. Even at α = 0.5, dcircle_0.5 is very similar to dcircle, implying that the shape can be identified as being (part of) a circle, or at least as having circular shape, by using d. It is this similarity in d and the other boundary features described in this paper that we exploit to arrive at successful matching when partial boundary information is missing.


3.2 The Distance Boundary Signature
Let the distance function (dij) between any pair of points, γj and γi, on Γ be given by Eq. (2) above. If the distance function is normalized with respect to the maximum distance, a metric that is scale invariant is obtained,

d̄(γi, γj) = d(γi, γj) / max( d(γi, γj) )    (10)

Let d̂ denote the inverse distance mapping of d̄, i.e.

d̂(u) = f⁻¹( d̄(γi, γj) )    (11)

where u ∈ d̄. Since d̄ is multi-modal, its inverse function f⁻¹ is not mathematically a function. However, since we are only interested in constructing a statistical profile of d̄, i.e. its signature, f⁻¹ is constructed by segmenting d̄ at all local maximum points and all local minimum points, producing Nd piece-wise functions which are then combined to obtain f⁻¹ (Mustafa (2000)). The inverse distance function is subsequently normalized with respect to the maximum inverse distance value, producing the normalized inverse distance function (d̃),

d̃(u) = d̂(u) / max( d̂(u) )    (12)

The Distance Boundary Signature (SDB), which represents the frequency of occurrence of the normalized Euclidean distance between a pair of points at a given distance, is then simply obtained by,

SDB = d d̃(u) / du    (13)

Because of the discrete nature of images, we construct the signature by,

SDBi = d̃( (i+1)/MDB ) - d̃( i/MDB )    (14)

where MDB is the number of bins of SDB, with bin size 1/MDB.
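Operationally, Eqs. (10)-(14) amount to building a normalized histogram of the pair-wise boundary distances after scaling by the maximum distance. The sketch below is our own illustration of that reading (the bin count MDB and the function name are assumptions); by construction the signature components sum to one. Signatures for all models can be precomputed this way and stored off-line, as described in Section 5.

```python
import numpy as np

def distance_boundary_signature(points, m_db=32):
    """Approximate the Distance Boundary Signature (SDB) of Eqs. (10)-(14):
    a normalized histogram, with m_db bins, of the pair-wise boundary
    distances after scaling by the maximum distance (Eq. (10))."""
    pts = np.asarray(points, dtype=float)
    diff = pts[None, :, :] - pts[:, None, :]
    d = np.hypot(diff[..., 0], diff[..., 1])          # Eq. (2)
    iu = np.triu_indices_from(d, k=1)                 # each pair counted once
    d_bar = d[iu] / d[iu].max()                       # Eq. (10)
    hist, _ = np.histogram(d_bar, bins=m_db, range=(0.0, 1.0))
    return hist / hist.sum()                          # components sum to 1

# Example: SDB of a circle sampled at 200 boundary points.
t = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
circle = np.column_stack([np.cos(t), np.sin(t)])
print(distance_boundary_signature(circle).round(3))
```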

3.3 Other Signatures
The development of the remaining signatures follows an analysis similar to that described above. The Parameter Boundary Signature (SPB) represents the frequency of occurrence of the chord bending between a pair of boundary points at a given value. Chord bending is defined as the ratio of the distance between a pair of boundary points to the parameter distance between the pair of points. The Curvature (or Tangent) Boundary Signature (SCB) represents the frequency of occurrence of the local boundary curvature at a point on Γ at a given angle. The Angle (or Direction) Boundary Signature (SAB) represents the frequency of occurrence of a pair of boundary points oriented relative to each other at a certain angle.
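Since each signature is, in effect, a normalized frequency profile of one boundary feature, the remaining three signatures can be assembled with the same histogram machinery. A minimal sketch follows (our own illustration; the bin counts, the value ranges and the reuse of the hypothetical boundary_features helper from the Section 3.1 sketch are assumptions):

```python
import numpy as np

def signature(values, bins, value_range):
    """Normalized frequency profile (signature) of a 1-D feature sample."""
    hist, _ = np.histogram(values, bins=bins, range=value_range)
    return hist / hist.sum()

def boundary_signatures(points, bins=32):
    """Build SDB, SPB, SCB and SAB from the boundary features of Section 3.1."""
    d, beta, theta, phi = boundary_features(points)     # helper sketched earlier
    iu = np.triu_indices_from(d, k=1)                   # each point pair once
    s_db = signature(d[iu] / d[iu].max(), bins, (0.0, 1.0))    # distance
    s_pb = signature(beta[iu], bins, (0.0, 1.0))               # chord bending
    s_cb = signature(phi, bins, (-np.pi, np.pi))               # local curvature
    s_ab = signature(theta[iu], bins, (-np.pi, np.pi))         # relative orientation
    return s_db, s_pb, s_cb, s_ab
```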


4 Signature Matching
Matching of objects to models is accomplished by comparing their boundary signatures. The signature match error (E) for each boundary signature type is calculated by comparing the object's signature to the model's signature using the four error metrics described in Mustafa (1999). These metrics compare the signature profiles based on data distance, variance, spread and correlation. The four error metrics are then combined to form the signature error, which gives a weighted signature error. We will refer to the correct model that an object should match to as the model-match.

4.1 The Boundary Match Error
As described above, the differences between an object (o) and a model (m) based on shape attributes are measured by matching their four boundary signatures (SAB, SCB, SDB and SPB), which measure the differences (i.e. errors) based on the four boundary features: boundary angle, boundary curvature, boundary distance and boundary bending. These errors are combined to produce the object boundary match error (EB),

EB(o, m) = EB · WB^T    (15)

where EB = (EAB ECB EDB EPB) and WB = (wAB wCB wDB wPB). EAB, ECB, EDB and EPB are the angle, curvature, distance and parameter boundary signature match errors, respectively, and wAB, wCB, wDB and wPB are the error weights for the respective signature errors.

Fig. 3. Surface plots of the distance matrix (d) for a circle with different amounts of its border missing (from top to bottom): dcircle_0.9 (α = 0.1), dcircle_0.7 (α = 0.3) and dcircle_0.5 (α = 0.5).
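The combination step of Eq. (15) is a plain weighted sum. In the sketch below the four profile-comparison metrics of Mustafa (1999) are not reproduced; a simple L1 difference of signature profiles stands in for the per-signature match error, and the weight values are placeholders, so this illustrates only the combination step, not the original error metrics:

```python
import numpy as np

def signature_error(s_obj, s_model):
    """Stand-in per-signature error: L1 difference of two signature profiles.
    (The original work combines four metrics -- data distance, variance,
    spread and correlation -- described in Mustafa (1999).)"""
    return 0.5 * np.abs(np.asarray(s_obj) - np.asarray(s_model)).sum()

def boundary_match_error(obj_sigs, model_sigs, weights=(0.25, 0.25, 0.25, 0.25)):
    """Eq. (15): weighted sum of the angle, curvature, distance and parameter
    boundary signature match errors. Weight values here are placeholders."""
    errors = np.array([signature_error(o, m) for o, m in zip(obj_sigs, model_sigs)])
    return float(errors @ np.asarray(weights))
```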

4.2 The Fuzzy Match Error and the Fuzzy Match Efficiency
Rather than evaluating our matches solely on whether the correct hypothesis (the model-match) was produced, we use a fuzzy approach to matching. In this approach, the degree of matching of an object to each model in the database is taken into consideration when evaluating the goodness of the match. This is accomplished by constructing fuzzy membership sets for all models in the model database which reflect the joint similarity between every pair of models. Fig. 4 shows the fuzzy membership values for the Circle model.

Fig. 4. Circle membership values for several objects, from left to right: Circle = 1.000, Decagon = 0.959, Nonagon = 0.956, Octagon = 0.913, Heptagon = 0.870, Star10 = 0.791, Hexagon = 0.778, Pentagon = 0.702, Star9 = 0.570 and Diamond = 0.174.

The fuzzy match error (Ef) measures the amount of error in matching an object to a model based on its fuzzy membership value. Given that the model-match of object o is model m*, the fuzzy match error of object o is,

Ef(o, m*) = Σ_{i=1}^{M} (1 - CI(o, mi)) μ(mi, m*)    (16)

where CI(o, mi) is the model percentile of matching o to mi. CI is defined as the signature matching error percentile of matching model mi to object o and is calculated by,

CI(mi, o) = (r(mi, o) - 1) / (M - 1)    (17)

where r(mi, o) is the matching rank of model mi when matched to object o, and M is the number of models in the database. Hence, a CI value of n% for a particular model indicates that n% of the models produced lower matching errors than this particular model. μ(mi, m*) is the fuzzy membership value of mi belonging to m*. The joint fuzzy membership values μ(mi, mj) for all models are calculated off-line in a pre-matching phase.
The fuzzy match efficiency (ηf) measures how good the fuzzy match error is with respect to its two extreme matches, the best possible case and the worst case,

ηf(o, m*) = (Ef(o, m*) - Ef,min(o, m*)) / (Ef,max(o, m*) - Ef,min(o, m*))    (18)

where Ef,max(o, m*) and Ef,min(o, m*) are the maximum and minimum possible fuzzy match errors between o and m*.
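A compact sketch of the fuzzy evaluation step is given below. It follows the reconstruction of Eqs. (16)-(18) above; the way the extreme values Ef,max and Ef,min are obtained here (by pairing the rank profile that agrees most, respectively least, with the membership values) is our assumption and is not spelled out in the text:

```python
import numpy as np

def fuzzy_match_error(ranks, mu_star):
    """Eqs. (16)-(17): ranks[i] is the rank (1 = best) of model i when matched
    to the object; mu_star[i] is the membership of model i in the model-match m*."""
    M = len(mu_star)
    ci = (np.asarray(ranks, dtype=float) - 1.0) / (M - 1.0)   # Eq. (17)
    return float(np.sum((1.0 - ci) * np.asarray(mu_star)))    # Eq. (16)

def fuzzy_match_efficiency(ranks, mu_star):
    """Eq. (18): position of Ef between its worst and best possible values.
    The extremes are built (as an assumption) from the rank orderings that
    least/most agree with the membership profile."""
    mu_star = np.asarray(mu_star, dtype=float)
    M = len(mu_star)
    e_f = fuzzy_match_error(ranks, mu_star)
    best_ranks = np.empty(M)
    best_ranks[np.argsort(-mu_star)] = np.arange(1, M + 1)    # high membership -> good rank
    worst_ranks = np.empty(M)
    worst_ranks[np.argsort(mu_star)] = np.arange(1, M + 1)    # high membership -> bad rank
    e_max = fuzzy_match_error(best_ranks, mu_star)
    e_min = fuzzy_match_error(worst_ranks, mu_star)
    return (e_f - e_min) / (e_max - e_min)
```

With this reading, a ranking that mirrors the membership profile of the model-match drives ηf toward 1, while a ranking that contradicts it drives ηf toward 0.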

4.3 Recognition Rate and Recognition Efficiency
We define the signature recognition rate (ρS) as the percentage of correct hypotheses found for a given set using a particular signature type. The signature recognition efficiency (ηS) is defined as the efficiency of a signature in matching a surface to its model-match surface for a given surface and is calculated by,

ηS = (1 - AMCI) · 100%    (19)

where AMCI is the average model-match percentile of a set. The overall scene recognition rate (ρ) is the overall percentage of correct hypotheses found for a given scene set. The scene recognition efficiency (η) is defined as the efficiency of matching a surface to its model-match surface for a given scene.

5 Experimental Results
A model database was constructed for 41 common shapes as shown in Fig. 5. These models can be categorized into six groups:

1. Regular Polygons. This set consists of seven regular polygons: Diamond, Pentagon, Hexagon, Heptagon, Octagon, Nonagon and Decagon.
2. Simple Geometric Shapes. These shapes include Square, Rectangle, Triangle, Equilateral Triangle, Bi-lateral Triangle, Circle and Ellipse.
3. Star Shapes. These are 'star'-like shapes with the number of protrusions varying from 5 to 10: Star-5, Star-6, Star-7, Star-8, Star-9 and Star-10.
4. Sun Shapes. These are 'sun'-like shapes with the number of protrusions varying between 4 and 20: Sun-4, Sun-5, Sun-6, Sun-7, Sun-8, Sun-9, Sun-10, Sun-15 and Sun-20.
5. Letters. These shapes include a sample of alphabet letters: N, H, E, 2, T, L, C and K.
6. Symbols. These include a sample of symbols: Arrow, Gamma, Plus and ZL.

Fig. 5. Database models, from top to bottom and left to right. (Top row): Diamond, Pentagon, Hexagon, Heptagon, Octagon, Nonagon and Decagon. (Second row): N, ZL, H, Arrow, E, 2, T and K. (Third row): L, C, Gamma, Square, Rectangle, Triangle, Equilateral Triangle, Bi-lateral Triangle and Plus. (Fourth row): Sun-4, Sun-5, Sun-6, Sun-7, Sun-8, Sun-9, Sun-10, Sun-15 and Sun-20. (Bottom row): Star-5, Star-6, Star-7, Star-8, Star-9, Star-10, Circle and Ellipse.

Fig. 6. Boundary signatures for model C, top to bottom and left to right: SAB, SDB, SPB, SCB.

Several models share great shape similarity. For example, the larger polygons, Octagon, Nonagon and Decagon, are similar to a Circle. Also, a Diamond is similar to a Square, and Star-10 is to a large extent similar to a Circle. The star shapes are, in general, similar to the sun shapes when the number of protrusions is small, but differ greatly when the number of protrusions is large. The model database can be easily extended to include more complicated objects. Future research will concentrate on adding more complex high curvature shapes (e.g. a banana shape).
Boundary signatures for all models were constructed off-line and stored in the model database. Boundary signatures for a sample model are shown in Fig. 6. Tests were conducted on objects of different shapes and sizes. All test objects extracted from image scenes were initially pre-processed to ensure that all boundary assumptions (Section 3) are satisfied. In particular, all boundaries must be connected and closed; this is achieved by filling any boundary gaps with straight line segments. The first series of tests was conducted to evaluate the performance of each of the four boundary signatures. The second series was conducted on real image scenes to evaluate the matching performance in practice.

Fig. 7. Sample objects.

5.1 Simulated Shapes
Images consisting of simulated shapes were generated for these tests. The simulated objects are of random sizes and orientations. Two sets of objects were generated: the first set consisted of objects that are completely visible (Fig. 7), while the second set consisted of incomplete objects simulating occlusion (Fig. 8). Boundary signatures for these objects were constructed and used for the subsequent matching stage. We present a brief report on the performance of the signatures here; a complete report can be found in Mustafa (2001).

5.1.1 Matching Complete Objects
Testing was done on 135 instances of the models at random scales and rotations. SDB produced the best signature recognition rate with an excellent value (ρDB = 91.9%), followed by SPB (ρPB = 63.7%), while SCB and SAB produced weak results (ρCB = 15.6% and ρAB = 8.1%). SDB and SPB had excellent signature efficiencies (ηDB = 99.1% and ηPB = 91.1%) while SAB and SCB had very poor efficiencies (ηAB and ηCB < 60%). These results clearly indicate that the distance and chord bending features are more important in distinguishing between different objects than the other features.
Matching based on the signature boundary error produced a low ρ = 63% but an excellent η = 92.6%. Recall that these results assume a single correct hypothesis and do not take into account how well the object matched with all models. The fuzzy match error, which takes into account the overall matching with respect to all models, produced fuzzy match efficiencies (ηf) in the range 0.411-1.000, with an excellent average fuzzy match efficiency of 0.908. In fact, 68.1% of the objects produced excellent fuzzy match efficiencies (ηf > 0.9), 84.4% of the objects produced strong fuzzy match efficiencies (ηf > 0.8) and only 3% of the objects produced weak fuzzy match efficiencies (ηf < 0.5).

5.1.2 Matching Incomplete Objects
41 incomplete objects were tested, with α varying from 0.12 to 0.61 and an average value of α = 0.37 (recall that α denotes the fraction of boundary missing). Over the total range of α examined, all signatures were found to have weak signature recognition rates (ρDB = 19.5%, ρPB = 14.6%, ρCB = 7.3% and ρAB = 2.4%). The signature efficiencies varied from η = 53.6% to η = 78.4% (ηPB = 78.4%, ηDB = 76.0%, ηCB = 65.9% and ηAB = 53.6%). However, the performance of the signatures improved greatly when the objects were limited to those with α < 0.3. In this case, the performance of SDB and SPB improved considerably (ρDB = 66.7%, ρPB = 16.6%, ηDB = 92.3% and ηPB = 77.7%).
Matching based on the signature boundary error produced ρ = 27.1% and η = 76%. ηf was in the range 0.045-1.000, with an average value of ηf = 0.779. Here 47.1% of the objects produced ηf > 0.9, 71.4% of the objects produced ηf > 0.8 and only 14.3% of the objects produced ηf < 0.5. If only objects with α < 0.3 are considered, the results improve greatly: in this case the ηf values are in the range 0.801-1.000, with an excellent average value of 0.954.

Fig. 8. A sample of the incomplete objects.

5.2 Real Images
Fig. 9 and Fig. 10 show the two real image scenes, the House Scene and the Coin Scene, that were used for testing. The House Scene consists of a house under construction with several visible windows, while the Coin Scene consists of many coins, some of which are completely visible while many others are only partially visible due to occlusion. Image segmentation was performed by applying a Kirsch edge operator and discarding boundaries smaller than a pre-defined threshold.
13 objects were extracted from the House Scene, as shown in its extracted objects image. These objects are the windows appearing in the scene, which are either rectangular or square in shape. Object matching was then performed against the model database. This produced only four correct hypotheses (ρ = 30.8%). However, the signature efficiency was very good, with a value of η = 83.1%. The square windows matched much better than the rectangular windows, which is due to the fact that the rectangular windows have a different length/width ratio than the rectangular model (length/width = 2). Evaluating the match with the fuzzy efficiency produced ηf in the range 0.740-0.932, with an average value of 0.837. Here 30.8% of the objects produced ηf > 0.9 and 61.5% of the objects produced ηf > 0.8. The lowest fuzzy match efficiencies were for the small distorted windows (objects #6, #7 and #14), where ηf was between 0.740 and 0.772. Object #5, which had part of its boundary distorted due to partial occlusion, produced ηf = 0.794.
16 objects were extracted from the Coin Scene, as shown in its extracted objects image. Object matching was then performed against the model database. Matching for the Coin Scene based on the signature boundary error produced ρ = 27.1% and η = 76%. ηf had an average value of 0.825. Here 62.5% of the objects produced ηf > 0.9 and only 12.5% of the objects produced ηf < 0.5. The lowest fuzzy match efficiency was for object #14 (ηf = 0.084), which has about 60% of its boundary missing (α ≈ 0.6), followed by object #12 (ηf = 0.390) with α ≈ 0.5; all remaining objects had ηf > 0.7. If only objects with α < 0.3 are considered, the average ηf increases to 0.935.

6 Conclusion
In this paper we have introduced the boundary signature error, which is a weighted sum of the four boundary signature errors. This error was used to match objects to models. A fuzzy approach was employed in our matching strategy that measured not only how well the object matched its correct hypothesis but also how well it matched the remaining models of the database. This was accomplished by introducing the fuzzy match error (Ef), which measures the amount of error in matching an object to a model based on its fuzzy membership value, and the fuzzy match efficiency (ηf), which measures how good the fuzzy match error is with respect to its extreme values.
Testing conducted on simulated shapes with random sizes and rotations produced excellent results, with an average value of ηf > 0.90. When partially visible objects were tested the average value of ηf decreased to 0.78; however, when an object retains at least 70% of its boundary the average value of ηf increases to more than 0.90. Tests on real images with occlusions and distortions produced strong matching results with ηf greater than 0.82. If only objects that retain at least 70% of their boundaries are considered, the average value of ηf again increases to more than 0.90.
Matching efficiency can be further increased by expanding the model database to include additional models. For instance, adding a rectangle with a length/width ratio of about 6 would greatly enhance the matching performance of the system with respect to the House Scene. Future research will concentrate on recognition of complex high curvature shapes (e.g. a banana shape) as well as on exploring other boundary features that can enhance the matching efficiency of the system.

Acknowledgment. The author would like to thank the Kuwait University Research Administration for providing financial support for this research (KU Grants EM-109 and EM-112).

References
Hong, D., Sarkodie-Gyan, T., Campbell, A. and Yan, Y. 1998. "A Prototype Indexing Approach to 2-D Object Description and Recognition". Pattern Recognition, Vol. 31, No. 6, pp. 699-725.
Kovalev, V. and Petrou, M. 1996. "Multidimensional Co-occurrence Matrices for Object Recognition and Matching". Graphical Models and Image Processing, Vol. 58, No. 3, pp. 187-197.
Kovalev, V., Petrou, M. and Bondar, Y. 1998. "Using Orientation Tokens for Object Recognition". Pattern Recognition Letters, 19, pp. 1125-1132.
Loncaric, S. 1998. "A survey of shape analysis techniques". Pattern Recognition, Vol. 31, No. 8, pp. 983-1001.
Mokhtarian, F. 1996a. "Silhouette-Based Object Recognition with Occlusion through Curvature Scale Space". ECCV, Cambridge, UK, pp. 567-578.
Mokhtarian, F., Abbasi, S. and Kittler, J. 1996b. "Robust and Efficient Shape Indexing through Curvature Scale Space". BMVC, Edinburgh, UK, pp. 53-62.
Mokhtarian, F., Abbasi, S. and Kittler, J. 2000. "Enhancing CSS-based Shape Retrieval for Objects with Shallow Concavities". Image and Vision Computing, 18, pp. 199-211.
Mustafa, A. A., Shapiro, L. G. and Ganter, M. A. 1996. "3D Object Recognition from Color Intensity Images". In the 13th International Conference on Pattern Recognition, Vienna, Austria, pp. 627-631, August 25-30.


Fig. 9. The House image (left) and its extracted objects (right).

Fig. 10. The Coin image (left) and its extracted objects (right).

Mustafa, A. A., Shapiro, L. G. and Ganter, M. A. 1999. "3D Object Identification with Color and Curvature Signatures". Pattern Recognition, Vol. 32, No. 3, pp. 1-17.
Mustafa, Adnan A. 2000. "Object Matching using Surface Boundary Signatures". Technical Report #EM-112R1, Department of Mechanical Engineering, Kuwait University, June.
Mustafa, Adnan A. 2001a. "Object Matching using Shape Surface Signatures". SPIE Conference on Machine Vision Applications in Industrial Inspection IX, San Jose, CA, Jan. 22-23.
Mustafa, Adnan A. 2001b. "Matching Incomplete Objects using Boundary Signatures". 4th International Workshop on Visual Form, Capri, Italy, May 28-30.
Nishida, H. 1998. "Matching and Recognition of Deformed Closed Contours Based on Structural Transformation Models". Pattern Recognition, Vol. 31, No. 10, pp. 1557-1571.
Nishida, H. 2001. "Robust Structural Indexing Through Quasi-Invariant Shape Signatures and Feature Generation". Proc. of the 4th International Workshop on Visual Form, Capri, Italy, pp. 696-705.
Ozcan, E. and Mohan, C. 1997. "Partial shape matching using genetic algorithms". Pattern Recognition Letters, Vol. 18, pp. 987-992.
Roh, K. and Kweon, I. 1998. "2D Object Recognition Using Invariant Contour Descriptor and Projective Refinement". Pattern Recognition, Vol. 31, No. 4, pp. 441-455.
Shaked, D. 2001. "Invariant Signatures from Polygonal Approximations of Smooth Curves". Proc. of the 4th International Workshop on Visual Form, Capri, Italy, pp. 451-460.
