Int J Wireless Inf Networks (2010) 17:34–41 DOI 10.1007/s10776-010-0114-0

Evolving Fuzzy Modeling for MANETs Using Lightweight Online Unsupervised Learning

Anna Lekova

Received: 10 March 2009 / Accepted: 24 February 2010 / Published online: 19 March 2010
© Springer Science+Business Media, LLC 2010

Abstract The study presents a methodology for evolving fuzzy modeling tasks in Mobile Ad hoc Networks (MANETs) based on distributed data-driven fuzzy clustering and reasoning. Fuzzy clustering is exploited for the purpose of learning fuzzy inference rules online. This calls for a one-pass Lightweight Evolving Fuzzy Clustering Method (LEFCM) suitable for deployment on mobile devices with constrained resources in MANETs. There is no standard method to determine the optimal number of fuzzy rules, and most fuzzy systems still apply a trial-and-error method that is unsuitable for online modeling tasks. The proposed methodology addresses the issues of uncertainty, simplicity and the speed needed to run in a non-intrusive way. It estimates online the number of clusters and their centers in the input data space, and accordingly the fuzzy rules, by online adaptation of the LEFCM threshold value that affects the number of clusters. Adaptation is based on a combination of geometrical and statistical analyses, as well as on incorporating a multidimensional fuzzy membership degree into the clustering process. The proposed LEFCM is validated using traditional cluster validity indexes and tested on real data sets.

Keywords Evolving systems · Data-driven fuzzy models · Online evolving clustering · Mobile Ad hoc networks · Unsupervised learning · Knowledge and data integration

A. Lekova (✉), Institute of Control and System Research, Bulgarian Academy of Sciences, Acad. G. Bonchev str., bl. 2, 1113 Sofia, Bulgaria. e-mail: [email protected]


1 Introduction

Mobile Ad hoc Networks (MANETs) are multi-hop wireless networks without fixed infrastructure, formed by mobile nodes. Due to the dynamic nature of the network topology and the restricted resources, quality of service and routing stability are the main concerns in MANETs. If we learn the incoming spatial and temporal context online, we can adapt MANET services dynamically and improve MANET performance. A key factor for reliable performance is the manner in which service protocols adapt to route changes caused by mobility. The complexity of the problem and the lack of knowledge about the functional dependences of future mobility states over time make the use of analytical modeling questionable. We focus on fuzzy logic systems [1], since they are able to work on incomplete data and introduce some flexibility into the modeling of mobility. In our previous studies we used offline fuzzy C-means clustering as a data mining technique to extract group fuzzy models from MANET mobility traces recorded over simulations of events during an emergency and rescue scenario. Continuing our study for another rescue scenario [2] calls for intelligent models that are able to evolve as they operate. A new paradigm of evolving computational intelligence systems (ECIS), viewed through the prism of the knowledge and data integration approach, is introduced in [3]. Evolving systems develop their structure and functionality dynamically from a continuous input data stream in an online and adaptive mode. ECIS use different online clustering methods [4, 5], a well-known technique for unsupervised learning. Evolving clustering approaches do not need the number of clusters to be pre-specified. Their algorithms are one-pass rather than batch-mode, they are fast, and they do not keep any information about past samples; they are therefore suitable to be placed on mobile nodes without wasting resources during the run of the MANET. We focus on fuzzy clustering [6–8], since the clusters we are searching for are not well defined and possess smooth boundaries; the membership to a cluster is a real value in the range [0, 1]. Algorithms for real-time clustering and generation of rules from data are presented in [7, 8]; however, their feasibility has not been tested in ad hoc and wireless sensor networks.

Many works in MANETs report self-organization and intelligence. Most of them rely on a fixed rule base or on neural networks trained offline and do not adapt to the environment. The often used intelligent agents exploit reinforcement learning; however, the time-consuming feedback from the environment is not a suitable paradigm for adaptive modeling of continuous dynamic processes in MANETs. A review of fuzzy models in MANETs is presented in [9]; only one of them performs adaptive, online knowledge and data integration [10]. Since we seek more simplicity in the fuzzy structure identification, we have developed one of the most lightweight distance-based variants of the Evolving Clustering Method (ECM) [4]. We enhanced it so that it can be used by MANET service protocols or applications under high dynamism and restricted resources.

The contribution of the proposed study is a methodology for evolving fuzzy modeling tasks in MANETs based on data-driven fuzzy clustering and reasoning. A lightweight online evolving fuzzy clustering method (LEFCM) is used as a technique to automatically identify the fuzzy rule base. LEFCM enhances the evolving distance-based connectionist clustering method of Kasabov and Song [4] to fit the above mentioned MANET concerns by addressing the issues of simplicity and speed to run in a non-intrusive way, uncertainty, and online estimation of the number of clusters and their centers in the input data space. We propose a novel multidimensional membership function and incorporate fuzzy membership degrees and a reference center for each cluster into the clustering process. The reference center is the statistical arithmetic mean to which new input vectors are referred in terms of distance. The key idea is that if the input example is close to the reference center and has a high multidimensional membership degree to a cluster, then the ECM distance threshold is set so as to produce a cluster update instead of the creation of a new cluster.

The work is organized as follows. In Sect. 2 we describe our enhancements of ECM and the fuzzy reasoning model. In Sect. 3 we give a brief overview of the data sets used and the validation of the proposed methodology, followed by a discussion of the results. In Sect. 4 we describe related works, and conclusions follow.


2 Evolving Fuzzy Modeling

In this section we describe our enhancements of ECM [4], with an example in 3-D space. The fuzzy reasoning model is given in Sect. 2.3.

2.1 Kasabov and Song's Evolving Clustering Method

In the online clustering process, the given data set consists of input vectors (examples) X = {x_1, ..., x_p}, which are p points in q-dimensional space, x \in R^q. Examples come from a data stream one by one, and the algorithm starts with an empty set of clusters. When a new cluster C^k is created (k is an index related to the time instant), the current input vector is assigned to be the cluster center (Cc) and the cluster radius (Ru) is initially set to zero. Afterwards, this cluster is either updated or a new cluster is created, depending on a distance threshold value (Dthr) that affects the number of clusters. The step-by-step background of the ECM algorithm is described in [4]. Although the algorithm does not keep any information about past examples, finding the optimal number of clusters, i.e., the optimal Dthr value, needs an offline optimization in batch mode. This is done by incrementing the value of Dthr in the range 0–1 by a small step of 0.01 or 0.02 and computing the Xie-Beni index [11]; the number of clusters corresponding to the minimum value of the Xie-Beni index indicates the optimal number of clusters. In MANETs with constrained resources, however, this methodology is not applicable, since we can neither keep any information about past examples nor perform any complex validity-index computations.
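To make the contrast with the online setting concrete, the batch procedure described above can be sketched in a few lines. This is an illustrative Python sketch only: `ecm_cluster` and `xie_beni` are hypothetical placeholders for an ECM implementation and for the Eq. 5 index, not routines from the paper.

```python
import numpy as np

def offline_dthr_sweep(data, ecm_cluster, xie_beni, step=0.02):
    """Batch-mode search for the Dthr that minimizes the Xie-Beni index.

    `ecm_cluster(data, dthr)` is assumed to return (centers, memberships);
    `xie_beni(data, centers, memberships)` is assumed to implement Eq. 5.
    Both are placeholders for routines the paper only describes in prose.
    """
    best = None
    for dthr in np.arange(0.1, 1.0 + 1e-9, step):
        centers, memberships = ecm_cluster(data, dthr)
        if len(centers) < 2:          # the index needs at least two clusters
            continue
        score = xie_beni(data, centers, memberships)
        if best is None or score < best[0]:
            best = (score, dthr, len(centers))
    return best  # (index value, selected Dthr, number of clusters)
```

The sweep requires the whole data set to be available and re-clustered for every candidate Dthr, which is exactly what a node in a running MANET cannot afford.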


2.2 Overview of the Lightweight Evolving Fuzzy Clustering Method

As seen in Sect. 2.1, the maximum distance from any cluster center to the examples that belong to that cluster is not greater than the distance threshold value Dthr. Therefore, Dthr affects the number of clusters by creating new clusters or by updating existing ones after changing their centers' positions and increasing their radii. Estimating the number of clusters online therefore means online adaptation of Dthr. The key idea is that if the input example is close to a cluster and to its reference center, then Dthr is set so as to produce a cluster update instead of the creation of a new cluster. We incorporate the fuzzy membership degrees and the distance to the reference center of each cluster into the clustering process. An input vector is bound to each cluster by means of a membership degree (Md), which is a number between 0 and 1; this makes clustering more accurate in the case of overlapping clusters. A novel multidimensional membership function (Eq. 1) is proposed for obtaining the degree of membership between data and cluster centers:

Md_{ij} = 1 - \frac{\|x_i - Cc_j\|_{\min}^{2/(m-1)}}{\sum_{j=1}^{n} \|x_i - Cc_j\|^{2/(m-1)}}, \quad x \in R^q,    (1)

where \|x_i - Cc_j\| = 1 when \|x_i - Cc_j\| > 1, and m \in [1, \infty) is the weighting exponent coefficient which determines how much clusters may overlap; we set m equal to 2. The idea of Eq. 1 is that the Md of the current input vector to each cluster depends on its Md to all clusters. In other words, if an input is close to two or more clusters, its Md is smaller than if the input is close to only one cluster. In Fig. 1, D11 = D21; however, X1 has a bigger Md than X2, since X2 is close to Cc2 and D22 plays a significant role in decreasing the Md of X2: it does not belong to Cc1 with as high a degree as X1 does. In addition, isolated points will be discarded, since their Md is too low for them to be clustered into the existing clusters; this causes the creation of a new cluster, which is eliminated later if the number of samples in it remains below a predefined number, e.g., two. This is essential for the learning process, since online clustering methods have problems related to the impact of noise and the quality of the training data.

Fig. 1 Geometrical interpretation of the multidimensional membership function

We define a reference center (the arithmetic mean) Mc_j for each cluster j, described by its center coordinates in the q-dimensional space:

Mc_j = \frac{1}{cp} \sum_{i=1}^{cp} x_i,    (2)

where x \in R^q and cp is the number of input data in cluster j.

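For illustration, Eqs. 1 and 2 can be computed as in the following Python sketch (ours, not the author's code; the function names are invented). The numerator is taken as the capped distance to the cluster under consideration, which is the reading implied by the Fig. 1 discussion, and the reference center is maintained incrementally so that no past samples need to be stored.

```python
import numpy as np

def membership_degrees(x, centers, m=2.0):
    """Membership of input x to each existing cluster, following Eq. 1.

    Distances are capped at 1, as stated below Eq. 1; m is the weighting
    exponent controlling cluster overlap (the paper sets m = 2).
    """
    d = np.minimum(np.linalg.norm(np.asarray(centers) - x, axis=1), 1.0)
    p = 2.0 / (m - 1.0)
    denom = np.sum(d ** p)
    if denom == 0.0:                 # x coincides with every center
        return np.ones(len(centers))
    return 1.0 - (d ** p) / denom    # one degree per cluster, in [0, 1]

def update_reference_center(mc, count, x):
    """Running arithmetic-mean reference center Mc_j of a cluster (Eq. 2)."""
    count += 1
    return mc + (np.asarray(x) - mc) / count, count
```

With m = 2 the exponent 2/(m-1) reduces to 2, so only squared distances are needed and the per-sample cost stays linear in the number of clusters.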
When the current input vector x_i arrives, the distances between this new input and the mean center of each cluster, DMc_{ij} = \|x_i - Mc_j\|, are calculated. We introduce two new thresholds: (1) an Md threshold (Mdthr) in the range [0.4–0.6]; and (2) a Mean threshold (Mthr) in the range [0.5–0.7]. When DMc_{ij} is less than or equal to Mthr we may increase the radius of the cluster; otherwise a new cluster is created. In other words, a cluster is not updated any more if DMc_{ij} exceeds the acceptable tolerance to its mean center. We established that for Mdthr = 0.5 and Mthr = 0.6 we obtain optimal (or at least feasible) cluster centers and radii according to the validity indexes often used in the literature [11–13]. The nature of the application data can additionally help in choosing the right thresholds; for instance, when the data are not well partitioned in the input space, i.e., sparse data exist, Mthr should be about 0.7. Increasing Mdthr to 0.6 or decreasing Mthr to 0.5 results in more clusters. The LEFCM algorithm consists of the following steps (a code sketch of the complete procedure is given after the step list):


• Step 0: Set a small value for Dthr, e.g., 0.1. Normalize the input vector to the range [0–1] using Eq. 3:

x_{i,norm} = \frac{x_i - x_{\min}}{x_{\max} - x_{\min}},    (3)

where the range of x_i is [x_{\min}, x_{\max}] and x \in R^q. Create the first cluster C_1^0 by taking the position of the first example from the input stream as the first cluster center Cc_1^0, and set its cluster radius Ru_1^0 = 0. Add the value of x_i to Mc_j and set the number of input vectors classified in this cluster to 1.
• Step 1: If all examples of the data stream have been processed, the algorithm finishes. Otherwise, the current input example x_i is taken. After normalization of the input vector to the range [0–1], the distances between this example and all n already created cluster centers Cc_j, D_{ij} = \|x_i - Cc_j\|, j = 1, ..., n, are calculated.
• Step 2: If there is any distance value D_{ij} = \|x_i - Cc_j\| equal to or less than at least one of the radii Ru_j, the current example belongs to the cluster C_m with the minimum distance D_{im} = \|x_i - Cc_m\| = \min_j \|x_i - Cc_j\|, subject to the constraint D_{ij} \le Ru_j, j = 1, ..., n. Add the value of x_i to Mc_j and increment the number of inputs in cluster j by one. In this case neither a new cluster is created nor any existing cluster updated, and the algorithm returns to Step 1. Else:
• Step 3: Find the cluster C_a^k (with center Cc_a^k and cluster radius Ru_a^k) among all existing clusters by calculating the values S_{ij}^k = D_{ij} + Ru_j^k, j = 1, ..., n, and choosing the cluster center Cc_a^k with the minimum value S_{ia}^k = \min_j S_{ij}^k.





• Step 4: Calculate Md according to Eq. 1 and the distances between the new input and the mean center of each cluster, DMc_{ij} = \|x_i - Mc_j\|.
• Step 5: If Md is greater than Mdthr and DMc_{ij} \le Mthr, then set Dthr = S_{ia}/2. This means that the input example is very close to that cluster and to its mean center, and Dthr is set so as to produce a cluster update instead of the creation of a new cluster.


Fig. 2 A brief clustering process using LEFCM with input vectors X1 to X3 in a 3-D space: the vector X1 causes LEFCM to create a new cluster C0 with center Cc0. Examples X2 and X3 update the cluster center Cc0 first to Cc1 and then to Cc2





• Step 6: If S_{ia} is greater than 2 × Dthr, the example does not belong to any existing cluster. A new cluster is created in the same way as described in Step 0, and the algorithm returns to Step 1.
• Step 7: If S_{ia}^k is not greater than 2 × Dthr, the cluster C_a^k is updated by moving its center Cc_a^k and increasing the value of its radius Ru_a^k. The updated radius Ru_a^{k+1} is set equal to S_{ia}/2, and the new center Cc_a^{k+1} is located on the line connecting x_i and Cc_a^k so that the distance from the new center to the point x_i is equal to Ru_a^{k+1} (the cases of Cc1 and Cc2 in Fig. 2). Add the value of x_i to Mc_j and increment the number of inputs in cluster j. The algorithm returns to Step 1. More mathematical details for this step, as well as the generalization to q-dimensional space, are given in the Appendix.
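The step list above can be condensed into a single one-pass routine. The sketch below is our own illustrative Python rendering, not the author's implementation: class and parameter names (`LEFCM`, `Cluster`, `mdthr`, `mthr`) are invented, the membership degree follows the per-cluster reading of Eq. 1, the Dthr adaptation of Step 5 is applied only to the current decision, and the later elimination of clusters that keep fewer than a predefined number of samples is omitted.

```python
import numpy as np

class Cluster:
    def __init__(self, center):
        self.center = np.asarray(center, dtype=float)   # Cc
        self.radius = 0.0                                # Ru
        self.mean = self.center.copy()                   # reference center Mc (Eq. 2)
        self.count = 1                                   # number of samples assigned

class LEFCM:
    """One-pass sketch of the lightweight evolving fuzzy clustering method."""

    def __init__(self, x_min, x_max, dthr=0.1, mdthr=0.5, mthr=0.6, m=2.0):
        # Feature ranges [x_min, x_max] are assumed to be known in advance.
        self.x_min, self.x_max = np.asarray(x_min, float), np.asarray(x_max, float)
        self.dthr, self.mdthr, self.mthr, self.m = dthr, mdthr, mthr, m
        self.clusters = []

    def _normalize(self, x):                             # Eq. 3, used in Steps 0-1
        return (np.asarray(x, float) - self.x_min) / (self.x_max - self.x_min)

    def _membership(self, x, j):                         # Eq. 1, per-cluster reading
        d = np.array([min(np.linalg.norm(x - c.center), 1.0) for c in self.clusters])
        p = 2.0 / (self.m - 1.0)
        denom = np.sum(d ** p)
        return 1.0 if denom == 0.0 else 1.0 - (d[j] ** p) / denom

    def process(self, x):
        x = self._normalize(x)
        if not self.clusters:                            # Step 0: first example
            self.clusters.append(Cluster(x))
            return
        dist = np.array([np.linalg.norm(x - c.center) for c in self.clusters])

        # Step 2: x already falls inside an existing cluster -> only update its mean.
        inside = [j for j, c in enumerate(self.clusters) if dist[j] <= c.radius]
        if inside:
            c = self.clusters[min(inside, key=lambda j: dist[j])]
            c.count += 1
            c.mean += (x - c.mean) / c.count
            return

        # Step 3: candidate cluster a minimizing S_ij = D_ij + Ru_j.
        s = dist + np.array([c.radius for c in self.clusters])
        a = int(np.argmin(s))
        ca = self.clusters[a]

        # Steps 4-5: adapt Dthr (for this decision only) when x is close to the
        # candidate cluster and to its mean center.
        md = self._membership(x, a)
        dmc = np.linalg.norm(x - ca.mean)
        dthr = s[a] / 2.0 if (md > self.mdthr and dmc <= self.mthr) else self.dthr

        if s[a] > 2.0 * dthr:                            # Step 6: create a new cluster
            self.clusters.append(Cluster(x))
            return

        # Step 7: update cluster a: new radius S/2, center moved toward x.
        new_radius = s[a] / 2.0
        direction = (ca.center - x) / dist[a]            # unit vector from x to old center
        ca.center = x + direction * new_radius
        ca.radius = new_radius
        ca.count += 1
        ca.mean += (x - ca.mean) / ca.count
```

Note that the adaptation in Steps 4-5 only relaxes the creation test for the single candidate cluster, so the routine still needs no history of past samples.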

The tables in Sect. 3.2 present the results for the number of clusters and their centers, as well as the corresponding validity indexes, after varying Dthr in ECM and after varying Mdthr and Mthr in LEFCM.

2.3 Evolving Fuzzy Reasoning Model Based on LEFCM

Dynamic conditions in MANETs make the use of analytical modeling questionable. At the same time, data-driven Artificial Intelligence (AI) modeling approaches are capable of representing non-linear relationships and of adapting to and managing uncertainties by learning, predicting and controlling system parameters from empirical data without human participation. A learning algorithm for MANETs needs to be designed to build a reasoning engine that can model system behavior by learning and adapting itself online, in a distributed way, using only packets from the data stream at a given time instant. The reasoning engine cannot be trained on labeled input/output pairs because of the MANET dynamism. For instance, in modeling a MANET anomaly-detection system, the normal profiles of system behavior are not known a priori; in modeling a system for mobility prediction, the movement patterns of mobile nodes over time are not labeled a priori. This leads us to take advantage of unsupervised learning algorithms that seek to summarize and organize the key features of the data in order to identify a model based on fuzzy rules and input linguistic variables. The system behavior is determined by selected candidate features as inputs and outputs:

F: (i_1, i_2, ..., i_z) \rightarrow (o_1, o_2, ..., o_{p-z})

The function F maps input parameters to the output behavior of the model using fuzzy inference and membership functions. Certain dimensions of the vectors are used as inputs (i_z) and certain dimensions as outputs (o_{p-z}). Each determined cluster corresponds to one rule of the model. The parameter(s) that we control or predict are set in the consequent part, while the remaining parameters go into the antecedent part of the fuzzy rules. Membership degrees to clusters are placed into the antecedents of the fuzzy rules. The model consists of fuzzy rules FR_j of the form

FR_j: IF x_i belongs to the j-th cluster with cluster center Cc_j THEN o_i,    (4)

where x_i is the input vector and o_i is the model output. Instead of generating an input membership function for each input dimension, a multidimensional membership degree to a specific behavior (expressed by a cluster) is computed directly for the entire antecedent part by Eq. 1. The degree Md defines the degree of fulfillment of the j-th fuzzy rule applied to the corresponding fuzzy output value. The crisp output value is calculated by aggregation of the fuzzy outputs of the rules affected by the current sample. If we are interested in only one value, the maxima defuzzification rule is used.
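A minimal sketch of the resulting reasoning step is given below, assuming the membership routine shown earlier and one consequent value stored per rule. The names and the weighted-average aggregation are our assumptions; the paper itself specifies only the rule form of Eq. 4 and maxima defuzzification for a single output.

```python
import numpy as np

def infer(x_in, rule_centers_in, rule_outputs, membership_degrees, mode="max"):
    """Fuzzy inference over the rules obtained by LEFCM (Eq. 4).

    `rule_centers_in` holds the input-dimension coordinates of each cluster
    center Cc_j, `rule_outputs` the consequent value o_j associated with each
    rule, and `membership_degrees` is the Eq. 1 routine sketched earlier.
    """
    md = membership_degrees(x_in, rule_centers_in)    # degree of fulfillment per rule
    if mode == "max":                                 # maxima defuzzification
        return rule_outputs[int(np.argmax(md))]
    # otherwise: one possible aggregation of the fuzzy outputs of the affected rules
    return float(np.dot(md, rule_outputs) / (np.sum(md) + 1e-12))
```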

3 Experimental Results


In this section we give a brief overview of one real data set and a data set from a realistic DA scenario. Validation of the proposed methodology is described and followed by a demonstration of the variation of the number of clusters and validation indexes.


3.1 Data Sets

3.1.1 Iris Data

This data set represents three categories of irises described by four feature values [14]. We chose this data set because very few clustering techniques reported in the research community actually come up with three clusters for these data; most find two. The clustered Iris data set for three features (petal width, petal length and sepal width) and the corresponding clusters and their centers are shown in Fig. 3.

Fig. 3 Clustering of IRIS data

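For reference, the three-feature Iris subset and the Eq. 3 normalization can be prepared as follows. This is a sketch assuming scikit-learn is available and is not part of the paper.

```python
from sklearn.datasets import load_iris

# scikit-learn orders the columns: sepal length, sepal width, petal length, petal width.
# Keep petal width, petal length and sepal width, the three features used for Fig. 3.
X = load_iris().data[:, [3, 2, 1]]

# Eq. 3 min-max normalization to [0, 1], applied per feature. In a truly online
# setting the feature ranges [x_min, x_max] are assumed to be known in advance.
x_min, x_max = X.min(axis=0), X.max(axis=0)
X_norm = (X - x_min) / (x_max - x_min)
```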
3.1.2 DA Data

We utilize the context of the maneuver that took place in May 2005 in Cologne, Germany [2]. Each mobile node extracts from its coordinates the mean and the trend in the fluctuations of its movement patterns for a certain period of time. The DA data represent six repetitive categories of node movement patterns described by three feature values (the mean and the trend in node movement for the past eight time units, as well as the trend in node movement for the next eight time units). The corresponding cluster centers are shown in Fig. 4.

Fig. 4 Clustering of data from the DA scenario

3.2 Results and Comparison

We validated the proposed solution by computing the following cluster validity measures: Eq. 5, Xie-Beni [11]; Eq. 6, Kwon [12]; and Eq. 7, the PBM index [13]:

m_{XB} = \frac{\sum_{j=1}^{n} \sum_{i=1}^{p} Md_{ij}^2 \|x_i - Cc_j\|^2}{p \cdot \min_{i \neq k} \|Cc_i - Cc_k\|^2}    (5)

m_{K} = \frac{\sum_{j=1}^{n} \sum_{i=1}^{p} Md_{ij}^2 \|x_i - Cc_j\|^2 + \frac{1}{n} \sum_{i=1}^{n} \|Cc_i - \overline{Cc}\|^2}{\min_{i \neq k} \|Cc_i - Cc_k\|^2}, \quad \text{where } \overline{Cc} = \frac{1}{p} \sum_{j=1}^{p} x_j    (6)

PBM = \left( \frac{1}{c} \cdot \frac{\sum_{i=1}^{p} Md_{i1} \|x_i - Cc_1\|}{\sum_{j=1}^{n} \sum_{i=1}^{p} Md_{ij} \|x_i - Cc_j\|} \cdot \max_{i \neq k} \|Cc_i - Cc_k\| \right)^2    (7)

The performance of the fuzzy PBM index in determining the proper number of clusters for different data sets is compared with the m_XB index in [13]. The authors show that the PBM index outperforms m_XB, for which reason we use it as the main criterion; the maximum value of PBM indicates the optimal number of clusters. Table 1 illustrates the variation of the validity indexes with the number of clusters and Dthr for offline clustering validation. We start by choosing a small value of Dthr, 0.1, and increment it by a small amount; the number of clusters corresponding to the maximum value of the PBM index indicates the optimal number of clusters. Table 2 illustrates the variations for online clustering validation: we choose a small value of Dthr, 0.1, and vary both thresholds within their ranges [0.4–0.6] and [0.5–0.7]. As can be seen, most of the clustering results in Table 2 are feasible. The high value (in bold) of PBM in Table 1 (rows 4 and 5) equals those in rows 1 and 2 of Table 2, and the results in rows 3 and 4 even demonstrate a better partition of the input data. The results for the DA data are similar; they are not presented here because of the lack of space. The number of clusters in the input data space is 6 or 7, and again the partitioning of the input data is feasible if the thresholds are within the above mentioned ranges.
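The three indexes of Eqs. 5-7 can be computed as in the sketch below. This is our own code; the function name and the use of the first cluster as the PBM reference term (mirroring the Md_i1 / Cc_1 notation of Eq. 7) are assumptions.

```python
import numpy as np

def validity_indexes(X, centers, Md):
    """Xie-Beni (Eq. 5), Kwon (Eq. 6) and PBM (Eq. 7) for a fuzzy partition.

    X: (p, q) data, centers: (n, q) cluster centers Cc, Md: (p, n) membership
    degrees. At least two clusters are assumed.
    """
    p, n = Md.shape
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)   # (p, n) distances
    sep = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=2)
    min_sep = np.min(sep[~np.eye(n, dtype=bool)])                     # min ||Cc_i - Cc_k||, i != k

    compact = np.sum((Md ** 2) * (d ** 2))              # sum_j sum_i Md_ij^2 ||x_i - Cc_j||^2
    m_xb = compact / (p * min_sep ** 2)                 # Eq. 5

    cc_bar = X.mean(axis=0)                             # grand mean of the data
    penalty = np.mean(np.linalg.norm(centers - cc_bar, axis=1) ** 2)
    m_k = (compact + penalty) / (min_sep ** 2)          # Eq. 6

    e1 = np.sum(Md[:, 0] * d[:, 0])                     # sum_i Md_i1 ||x_i - Cc_1||
    en = np.sum(Md * d)                                 # sum_j sum_i Md_ij ||x_i - Cc_j||
    d_max = np.max(sep)                                 # max ||Cc_i - Cc_k||
    pbm = ((1.0 / n) * (e1 / en) * d_max) ** 2          # Eq. 7
    return m_xb, m_k, pbm
```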

3.3 Discussion


The feasibility of the proposed fuzzy model obtained online has been tested in the NS2 network simulator [15] for the enhancement of multicast routing. Multicasting has recently attracted a lot of attention in DA applications, since it plays a significant role in supporting group-oriented applications. The On-Demand Multicast Routing Protocol (ODMRP) performs best in terms of throughput; however, this protocol suffers from heavy overhead. We enhance ODMRP by reducing the flooding of control messages after deploying fuzzy clustering and reasoning on the mobile nodes. The clustering algorithm on each node learns the trend in the movement patterns over time in order to create fuzzy inference rules for the prediction of stable links. A fuzzy-inference rule base was implemented to generate the route and forward-group time parameters of ODMRP. The improved performance (less control overhead without additional end-to-end delay) demonstrates that the algorithm is not intrusive, does not waste network resources and is lightweight enough. Further approximations for sqrt(), sin(), cos(), acos() and atan(), together with the use of bitwise operations, might lighten LEFCM additionally.


Table 1 Variation of the validity indexes with the Dthr threshold (offline approach, IRIS data)

| Dthr | Number of rules | Cluster center coordinates | m_XB | m_K | PBM |
|------|-----------------|----------------------------|------|-----|-----|
| 0.2  | 11 |  | 84.25 | 1141.25 | 0.002 |
| 0.4  | 4  | 4.91; 3.35; 1.26 / 5.04; 2.43; 3.42 / 6.14; 2.72; 4.64 / 7.26; 2.81; 6.04 | 1.62 | 60.51 | 0.04 |
| 0.5  | 3  | 4.91; 3.35; 1.26 / 5.49; 2.61; 3.88 / 6.99; 2.75; 5.71 | 0.53 | 26.55 | 0.17 |
| 0.55 | 3  | 4.91; 3.35; 1.26 / 5.64; 2.64; 4.06 / 6.97; 2.74; 5.67 | 0.43 | 21.32 | 0.20 |
| 0.6  | 3  | 4.91; 3.35; 1.26 / 5.64; 2.64; 4.06 / 6.97; 2.74; 5.67 | 0.43 | 21.32 | 0.20 |
| 0.7  | 2  | 5.62; 3.01; 2.63 / 6.42; 2.97; 5.57 | 0.12 | 9.40 | 0.11 |

Bold values indicate all results from clustering that are not optimal but feasible

Table 2 Variation of the validity indexes with the Mdthr and Mthr thresholds (online approach, IRIS data)

| Mdthr | Mthr | Number of rules | Cluster center coordinates | m_XB | m_K | PBM |
|-------|------|-----------------|----------------------------|------|-----|-----|
| 0.4 | 0.5 | 3 | 4.91; 3.35; 1.26 / 5.64; 2.64; 4.06 / 5.64; 2.64; 4.06 | 0.43 | 21.32 | 0.20 |
| 0.5 | 0.5 | 3 | 4.91; 3.35; 1.26 / 5.64; 2.64; 4.06 / 5.64; 2.64; 4.06 | 0.43 | 21.32 | 0.20 |
| 0.4 | 0.6 | 3 | 4.91; 3.35; 1.26 / 5.89; 2.69; 4.39 / 7.22; 2.99; 6.02 | 0.34 | 17.11 | 0.26 |
| 0.5 | 0.6 | 3 | 4.91; 3.35; 1.26 / 5.89; 2.69; 4.39 / 7.22; 2.99; 6.02 | 0.34 | 17.11 | 0.26 |
| 0.5 | 0.7 | 2 | 4.91; 3.35; 1.26 / 6.04; 2.78; 4.57 | 0.06 | 4.47 | 0.19 |

Bold values indicate all results from clustering that are not optimal but feasible

4 Related Works

In this section related works are presented. They fall into two groups: the first considers evolving computational intelligence systems and one-pass online evolving clustering algorithms for unsupervised learning; the second reviews the deployment of fuzzy systems in MANETs from the viewpoint of self-evolving software.

Recent achievements in the development of evolving fuzzy and neuro-fuzzy intelligent systems [8] are suitable for the integration of new data with existing models that can be incrementally adapted to future incoming data. Clustering for data-space partitioning is used as a technique to automatically identify the structure of fuzzy or neuro-fuzzy systems [4–7]. In [3] two working examples of ECIS, namely the Evolving Connectionist System (ECoS) and the Evolving Fuzzy System (EFS), are compared. They use similar online distance-based clustering methods, [4] and [5], which have their merits and demerits for different modeling tasks and data. EFS, having only one algorithmic parameter, is more robust and less sensitive to noise and outliers, but more conservative in producing new clusters/rules; ECoS, using two algorithmic parameters for the evolution of rules/neurons, is more flexible. We chose the clustering algorithm of ECoS [4] because the recursive calculation of the input data potential in [5] requires past samples to be kept. In Sect. 2.1 we reviewed in detail the ECM algorithm [4] and its disadvantage for direct application in MANETs: in order to find the optimal cluster centers, and thus the optimal fuzzy model, the distance threshold of the clustering algorithm has to be varied between 0.1 and 1 offline, in batch mode. Ravi et al. [6] propose the evolving fuzzy clustering method (EFCM), a fuzzy variant of ECM. It computes the membership values of each input vector (sample) as it arrives from the input stream, signifying its degree of belonging to the existing clusters, which makes it more accurate in the case of overlapping clusters. The proposed study differs from [6] in the lighter formulation of Md and in its participation in the clustering process for the online estimation of the number of clusters by adaptation of Dthr. Unsupervised learning is closely related to the problem of density estimation in statistics, and many works also exploit a reference mean point in the clustering process, e.g., the 'Minimum Distance to Mean' classification method [16]. However, these algorithms do not recognize differences in the variance of clusters, which determines their relative 'size' in feature space: for training sets, input vectors near the edge of a 'larger' cluster may be closer to the center of a nearby 'smaller' cluster than to their own cluster center, resulting in the misclassification of some inputs. The recursive estimation of the data density at a data point, resembling the probability distribution in kernel density estimation, in [17] overcomes this problem; however, the recursive calculation of the input data potential requires past samples to be kept. The clustering algorithm presented here overcomes the misclassification and the unknown number of clusters through a combination of statistical and geometrical analyses involving the radii in LEFCM, without keeping past samples.

Related works dealing with fuzzy reasoning in MANETs are reviewed in [9]; however, most of the models need to be reprogrammed when the environment changes, while the dynamism of the MANET topology requires self-regeneration of the mobile software as it runs. Usually, in the works reporting online learning or structure and knowledge adaptation, intelligent agents run autonomously on each node, collect packets from the data stream and exchange information through lightweight messages. Unsupervised learning is carried out using Reinforcement Learning (RL) [18], Fuzzy Neural Networks [10] or Self-Organizing Maps (SOM). RL, the approach most often used in MANETs, attempts to find a policy that maps states of the world to the actions the agent ought to take in those states. The agent learns how effective its actions are through feedback extracted from the environment, which delays the process of knowledge and data integration and makes these models unsuitable for the continuous clustering of new data and for evolving structures in MANETs. Most relevant to our work is the evolving fuzzy neural network for a Bluetooth routing protocol [10]; however, we find our solution for fuzzy structure identification simpler, faster and optimized for the lack of resources in wireless ad hoc and sensor networks.


5 Conclusion

The presented methodology can be applied to online distributed data analysis or modeling tasks in wireless ad hoc or sensor networks whenever software that evolves by itself in an unsupervised way is required. The presented evolving fuzzy modeling addresses the MANET need for online learning of the incoming spatial and temporal context. It is based on a distributed data-driven evolving clustering method that addresses the issue of fuzziness and estimates online the optimal number of clusters and their centers. The proposed fuzzy structure identification is lightweight and thus suitable to be implemented on mobile devices, so that mobile services and applications can be configured to adapt continuously to MANET topology changes. The methodology might be used to predict incoming connectivity failures, the density of nodes and the traffic in a local scope, in order to find the 'reliable' nodes that can perform the network services in the best way.

Acknowledgments The research has been supported by DFG grant N 436BUL112/08. Thanks also go to Prof. Peter Martini, Nils Aschenbruck and Elmar Gerhards-Padilla from the Institute of Computer Science IV, Bonn University, for their strong support and for the provided data from a realistic disaster area scenario.

Appendix: Math Details (Sect. 2.2, Step 7 of the LEFCM Description)

1. X1 is the first three-dimensional input vector, with coordinates x1, y1, z1; it becomes the first cluster center Cc1.
2. X2 is a new three-dimensional input vector with coordinates x2, y2, z2. D is the Euclidean distance between the current example and the already created cluster center:

D = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2 + (z_2 - z_1)^2}

3. d_{XY} is the projection of D onto the XY plane:

d_{XY} = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2}

4. If x_1 > x_2, swap x_1 and x_2; if y_1 > y_2, swap y_1 and y_2; if z_1 > z_2, swap z_1 and z_2.
5. \beta is the angle between D and its XY projection (see Fig. 2):

\tan\beta = \frac{z_2 - z_1}{d_{XY}}, \quad \beta = \arctan\frac{z_2 - z_1}{d_{XY}}

6. \sin\beta = \frac{z_2 - z_1}{D}, \quad D = \frac{z_2 - z_1}{\sin\beta}

7. z_1 = z_2 - S \sin\beta

8. \alpha = \arccos\frac{x_2 - x_1}{d_{XY}}

9. S_{XY} is the projection of S onto the XY plane: S_{XY} = S \cos\beta, and

x_1 = x_2 - S_{XY}\cos\alpha, \quad y_1 = y_2 - S_{XY}\sin\alpha

Generalization in q-dimensional space

The Euclidean distance between q-dimensional input vectors P = (p_1, p_2, ..., p_q) and Q = (q_1, q_2, ..., q_q) in Euclidean q-space is defined as

D = \sqrt{(p_1 - q_1)^2 + \cdots + (p_q - q_q)^2} = \sqrt{\sum_{i=1}^{q} (p_i - q_i)^2}

By decomposing D and S into all their projections in q-space, the angles \beta_n and \alpha_{p_i p_j} are obtained, as well as d_{p_i p_j} and S_{p_i p_j}. Thus, the new radius and the center coordinates of the updated sphere in q-dimensional space are calculated.
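As an illustration of the Step 7 geometry, the following sketch (ours; the function names are invented) computes the updated center both via the 3-D angle decomposition of the appendix and in the equivalent vector form that extends directly to q dimensions.

```python
import math
import numpy as np

def updated_center_3d(old_center, x, s):
    """Step 7 update in 3-D via the appendix angle decomposition.

    The new center lies on the line from the new input x toward the old
    center, at distance s/2 from x (the new radius). atan2 takes care of
    the sign bookkeeping that the appendix handles with coordinate swaps.
    """
    x1, y1, z1 = old_center
    x2, y2, z2 = x
    d_xy = math.hypot(x1 - x2, y1 - y2)          # projection of D onto the XY plane
    beta = math.atan2(z1 - z2, d_xy)             # angle between D and its XY projection
    alpha = math.atan2(y1 - y2, x1 - x2)         # direction of the XY projection
    r = s / 2.0                                  # new radius Ru = S/2
    s_xy = r * math.cos(beta)                    # projection of the displacement onto XY
    return (x2 + s_xy * math.cos(alpha),
            y2 + s_xy * math.sin(alpha),
            z2 + r * math.sin(beta)), r

def updated_center_q(old_center, x, s):
    """The same update written in vector form, valid in any q-dimensional space."""
    old_center, x = np.asarray(old_center, float), np.asarray(x, float)
    d = np.linalg.norm(old_center - x)           # Step 7 is only reached when d > 0
    r = s / 2.0
    return x + (old_center - x) * (r / d), r
```

The vector form avoids the trigonometric calls altogether and needs only one square root, which may be preferable on constrained nodes given the remark in Sect. 3.3 about approximating sqrt(), sin(), cos(), acos() and atan().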

References

1. L. Zadeh, Fuzzy sets, Information and Control, Vol. 8, No. 3, pp. 338–353, 1965.
2. N. Aschenbruck, E. Gerhards-Padilla, M. Gerharz, M. Frank and P. Martini, Modelling mobility in disaster area scenarios. In 10th ACM International Workshop on Modeling, Analysis and Simulation of Wireless and Mobile Systems, pp. 4–12, 2007.
3. P. Angelov and N. Kasabov, Evolving computational intelligence systems. In 1st International Workshop on Genetic Fuzzy Systems, 2005.
4. N. Kasabov and Q. Song, DENFIS: dynamic evolving neural-fuzzy inference system and its application for time-series prediction, IEEE Transactions on Fuzzy Systems, Vol. 10, No. 2, pp. 144–154, 2002.
5. P. Angelov, Evolving Takagi-Sugeno fuzzy systems from streaming data, eTS+. In Evolving Intelligent Systems: Methodology and Applications, Wiley, New York, 2010.
6. V. Ravi, E. Srinivas and N. Kasabov, On-line evolving fuzzy clustering. In International Conference on Computational Intelligence and Multimedia Applications, pp. 347–351, 2007.
7. J. de Oliveira and W. Pedrycz, Advances in Fuzzy Clustering and Applications, Wiley, Chichester, 2007.
8. P. Angelov, D. Filev and N. Kasabov, editors, Evolving Intelligent Systems: Methodology and Applications, Wiley, New York, 2010.
9. E. Natsheh, A survey on fuzzy reasoning applications for routing protocols in wireless Ad Hoc networks, International Journal of Business Data Communications and Networking, Vol. 4, No. 2, pp. 22–37, 2008.
10. C. Huang, A Bluetooth routing protocol using evolving fuzzy neural networks, International Journal of Wireless Information Networks, Vol. 11, No. 3, 2004.
11. X. Xie and G. Beni, A validity measure for fuzzy clustering, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 13, No. 8, pp. 841–847, 1991.
12. S. Kwon, Cluster validity index for fuzzy clustering, Electronics Letters, Vol. 34, No. 22, pp. 2176–2177, 1998.
13. M. Pakhira, S. Bandyopadhyay and U. Maulik, Validity index for crisp and fuzzy clusters, Pattern Recognition, Vol. 37, No. 3, pp. 487–501, 2004.
14. Iris Data Set in UCI Machine Learning Repository. http://archive.ics.uci.edu/ml/datasets/Iris.
15. The VINT project: network simulator (NS-2). http://www.isi.edu/nsnam/ns.
16. T. Lillesand and R. Kiefer, Minimum Distance to Means Classifier, Digital Image Processing, Wiley, New York, 1994, pp. 590–591.
17. R. Ramezani, P. Angelov and X. Zhou, A fast approach to novelty detection in video streams using recursive density estimation. In 4th International IEEE Conference on Intelligent Systems, pp. 14-2–14-7, 2008.
18. J. Martyna, Fuzzy reinforcement learning for routing in wireless sensor networks. In B. Reusch, editor, Computational Intelligence. Theory and Applications, Springer-Verlag, Berlin, Heidelberg, New York, 2006, pp. 637–645.

Author Biography

Anna Lekova was born in Sofia, Bulgaria. She received her MSc in Computer Science from the Technical University of Sofia in 1988 and her PhD in CAD/CAE/CAM from the Technical University of Sofia in 1995. Currently, she is Head of the Hybrid Systems and Management Department at the Institute of Control and System Research, Bulgarian Academy of Sciences. Her current research interests are in fuzzy logic for mobile control and computing, and the design of intelligent services and applications for mobile ad hoc networks.
