Joint Tracking and Classification of Airbourne Objects using Particle Filters and the Continuous Transferable Belief Model

Gavin Powell & David Marshall
The Geometric Computing & Computer Vision Group, School of Computer Science, Cardiff University, Cardiff, UK
{G.R.Powell, Dave.Marshall}@cs.cf.ac.uk

Philippe Smets
IRIDIA, Université libre de Bruxelles, 50 Av Roosevelt, CP 194-6, 1050 Bruxelles, Belgium

Abstract - This paper describes the integration of a particle filter and a continuous version of the transferable belief model. The output from the particle filter is used as input to the transferable belief model. The transferable belief model's continuous nature allows for the prior knowledge over the classification space to be incorporated within the system. Classification of objects is demonstrated within the paper and compared to the more classical Bayesian classification routine. This is the first time that such an approach has been taken to jointly classify and track targets. We show that there is a great deal of flexibility built into the continuous transferable belief model and, in our comparison with a Bayesian classifier, we show that our novel approach offers a more robust classification output that is less influenced by noise.

Keywords: Tracking, particle filter, transferable belief model, classification.

Branko Ristic
Intelligence, Surveillance and Reconnaissance, DSTO, Australia
[email protected]

Simon Maskell QinetiQ Malvern Technology Centre, Malvern, Worcestershire, UK [email protected]

1 Introduction

Recent advances with the transferable belief model (TBM) have led to the inclusion of belief functions that operate within the continuous domain [1]. This is a substantial improvement over previous discrete versions of the TBM and has made such approaches usable in many more applications. We present in this paper one such application, where we combine it with a particle filter for joint tracking and classification of moving objects. The continuous transferable belief model (cTBM) is a powerful tool that can utilise all of the features of the TBM, such as the transfer of beliefs and combination of belief functions, but also allows probability density functions to be output and used to articulate prior information. Previous applications of the cTBM are limited to land vehicle positioning [2] and model based classification [3]. We will also identify some deficiencies of a recursive application of the cTBM, such as unwanted convergence and empty set dominance, and show how these deficiencies can be overcome.

Our approach to jointly tracking and classifying using particle filters [4] differs from more classical approaches [5]. Within the cTBM framework each of the particles of the particle filter is able to contribute to the classification stage: each particle outputs a classification distribution. These distributions are not fused using a Bayesian approach, which can be thought of as taking the mean classification distribution over the particles, as we regard this as throwing away valuable data. Instead, the individual classifications of the particles are combined using the cTBM to produce a fused classification output, which is updated recursively in time as shown in Figure 1.

Figure 1. Data flow through the system: an object is observed by a sensor; the particle filter receives a noisy measurement; each particle outputs a classification; all particles' classifications are fused together in the cTBM for time n; the classifications from times 0..n are fused for an overall classification, which is used to update the particle filter.

2 Paper Outline

In Section 3 we present the basic elements, or building blocks, of the TBM. In Section 4 we show how the model has been extended for use in the continuous domain through the cTBM; Section 4 also discusses how probability density functions can be used as prior information within the cTBM. Section 5 describes how the output of a particle filter can be used as input to a cTBM. Section 6 presents the results, highlights deficiencies of previous applications of the cTBM and demonstrates that the suggested strategies do indeed address these deficiencies. Our conclusions are drawn in Section 7.

3 The basic elements of the transferable belief model

A brief summary of the TBM is now given. We refer readers to other introductions [8] for full details.

The TBM began life in the 1970's and is an extension of work by A. P. Dempster [6] and Glenn Shafer [7], which later became known as Dempster-Shafer Theory (DST). DST itself is a generalisation of Bayesian theory and is based on two ideas: first, that beliefs are created from subjective probabilities, and second, that such information can be fused using Dempster's rule of combination. The TBM works on two levels: 1) the credal level, where beliefs are entertained and quantified by belief functions; 2) the pignistic level, where beliefs are used to make decisions and are quantified by probability functions. The credal level derives its name from the way that beliefs can be transferred when new information is received.

The set of all possible aircraft classes is Ω = {A-10, B-2, F-16}. A basic belief assignment (bba) is given by m : 2^Ω → [0,1] with

Σ_{A⊆Ω} m(A) = 1,

where m(A) is the basic belief mass (bbm) given to A. Every A ⊆ Ω such that m(A) > 0 is a focal set: every subset for which there is some support that the real state lies within it is given a mass (bbm), and it is these subsets that are the focal sets. A key point is that the TBM can allocate masses to subsets of Ω, rather than only to the mutually exclusive hypotheses (the elements), as in probability theory. This allows ignorance to be accounted for within the model: the greater the number of elements within a focal subset, the greater the ignorance being expressed. A focal set with one element is called a singleton. We quantify our belief that the actual world belongs to the subset A by bel(A), given by the sum of all bbm's that support A:

bel(A) = Σ_{∅≠B⊆A} m(B),   ∀A ⊆ Ω, A ≠ ∅    (1)

The Generalised Bayesian Theorem (GBT) [9] allows us to calculate conditional beliefs, much as Bayes' theorem does with probabilities. The GBT is a generalisation of Bayes' theorem in which all conditional probabilities are replaced by belief functions, and the a priori belief function can be vacuous, m(Ω) = 1, stating that we are completely ignorant a priori. Suppose we are completely ignorant over some space Θ, and so have no idea which θ_i ∈ Θ is the true case, but for each θ_i ∈ Θ we do know what our beliefs are on another space X. The belief function over the space X given that θ_i is the case is written bel_X(·|θ_i), and the plausibility of x ∈ X being the true case given that θ_i has already occurred is written pl_X(x|θ_i). Then

bel_Θ(θ|x) = Π_{θ_i∈θ̄} (bel_X(x|θ_i) + m(∅|θ_i)) − Π_{θ_i∈Θ} (bel_X(x|θ_i) + m(∅|θ_i))    (2)

where θ̄ denotes the complement of θ, and

pl_Θ(θ|x) = 1 − Π_{θ_i∈θ} (1 − pl_X(x|θ_i))    (3)

If some belief is known a priori it can simply be combined through conjunctive combination [8].
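These definitions can be made concrete with a minimal sketch. The masses below are illustrative, not taken from the paper; bel(A) is computed as in Equation 1, with the plausibility pl(A) shown for comparison.

```python
# A bba over the frame Omega = {A-10, B-2, F-16}; masses are illustrative.
OMEGA = frozenset({"A-10", "B-2", "F-16"})

m = {  # mass on a singleton, a two-element focal set, and Omega (ignorance)
    frozenset({"F-16"}): 0.5,
    frozenset({"A-10", "F-16"}): 0.25,
    OMEGA: 0.25,
}
assert abs(sum(m.values()) - 1.0) < 1e-12  # masses must sum to 1

def bel(A):
    """Equation 1: sum of m(B) over all non-empty subsets B of A."""
    return sum(mass for B, mass in m.items() if B and B <= A)

def pl(A):
    """Plausibility: sum of m(B) over all focal sets B intersecting A."""
    return sum(mass for B, mass in m.items() if B & A)

print(bel(frozenset({"F-16"})))          # 0.5: only the singleton supports it
print(bel(frozenset({"A-10", "F-16"})))  # 0.75
print(pl(frozenset({"F-16"})))           # 1.0: every focal set intersects it
```

Note how bel({F-16}) excludes the mass on {A-10, F-16} and Ω, since that mass could equally support A-10: this is the ignorance that probability theory cannot express directly.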

4 The continuous transferable belief model

Thus far we have given some of the major elements of the TBM applied to finite and discrete spaces. Many applications, however, involve continuous spaces. The difficulty of a fine discretisation can be seen by considering the rotation of an object, with look angle discretised into 360 segments, each one degree wide. This results in 2^360 subsets in the power set. Computing power and memory considerations dictate that 2^10-2^20 is an upper limit on the size of this power set for real time operation. It is therefore necessary to consider clever methods to exploit the continuous nature of the domain.

4.1 Moving away from the discrete

We consider intervals in the range [0,1] ⊂ ℝ, with a finite number of focal sets (sets with a bbm > 0). We will use the triangle space shown in Figure 2 to plot and visualise these intervals.

Figure 2. Point K = (a,b) inside the triangle

This triangle defines an interval [a,b] ⊆ [0,1]. A point K in the triangle represents an interval [a,b] within [0,1], as shown in Figure 2 [2]. Such a point has a mass associated with it. The belief of an interval [a,b] is defined to be the sum of the mass of all other intervals [x,y] where x ≥ a AND y ≤ b. The plausibility is defined to be the sum of the mass of all intervals [x,y] that intersect [a,b], i.e. where x ≤ b AND y ≥ a. The commonality is the sum of the mass of all intervals where x ≤ a AND y ≥ b. The commonality denotes the belief that is 'free to flow' anywhere: it contains all intervals that could be within [a,b], straddling [a,b] or outside [a,b], and so shows belief (mass) that is vague with regard to the interval [a,b]. A singleton value is represented by [a,b] where a = b, whose point K lies on the diagonal of the triangle. The relevant masses used to calculate the bel, commonality and pl for the interval [0.2,0.7] are given in Table 1 [2], and the corresponding points are plotted in the triangle space in Figure 3.

Table 1. A bba with six focal sets: their masses, interval limits, and inclusion (X) in the belief for the interval [0.2, 0.7]

  i      m_A    a_i   b_i   bel
  1      0.07   0.3   0.4    X
  2      0.18   0.1   0.9
  3      0.25   0.1   0.8
  4      0.15   0.4   0.9
  5      0.05   0.4   0.5    X
  6      0.30   0.6   0.8
  total  1.0               0.12

Figure 3. Triangle space representation of Table 1

We can calculate the probability density function over this triangle space using Equation 4:

Betf(s) = Σ_{A: s∈A⊆[0,1]} [m(A) / (a* − a_*)] · 1/(1 − m(∅))    (4)

where a* − a_* represents the interval length. If we relax the requirement that the number of focal sets be finite, we have a mass density over our triangle and, instead of summing, we integrate to find the bel, pl etc. The pignistic density now becomes

Betf(a) = lim_{ε→0} ∫_{x=0}^{x=a} ∫_{y=a+ε}^{y=1} [f_{T[0,1]}(x,y) / (y − x)] dx dy    (5)

Extension of this approach to the whole of the real axis is described in [1].

4.2 The least committed basic belief density

For the continuous case a bba becomes a basic belief density (bbd) [1]. There exist multiple belief functions that project to the same probability density function; when defining a belief that projects to a given probability density function there are therefore multiple candidates, and we choose the least committed belief. We define the least committed belief to be the belief that maximises the commonality function. This commonality articulates the mass that is free to flow: the associated mass could support the proposition being considered, partially support it, or not support it at all. So by maximising the commonality we are hedging our bets as much as possible; we are creating the underlying belief function that is 'least committed'. The least committed basic belief density is in fact a line on the triangle space which starts at (μ, μ), the mean of the pignistic density function, and is denoted φ(u), where u is the distance along the line from (μ, μ). For a unimodal symmetrical density this is a straight line, as shown in Figure 4 [2].
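As a numerical check on the interval definitions of Section 4.1, the following sketch recovers the bel value of Table 1 for the query interval [0.2, 0.7]. The helper names are ours, and interval 6 is taken as [0.6, 0.8] so that a ≤ b.

```python
# Focal intervals of Table 1 as (mass, a_i, b_i) triples.
focal = [
    (0.07, 0.3, 0.4),
    (0.18, 0.1, 0.9),
    (0.25, 0.1, 0.8),
    (0.15, 0.4, 0.9),
    (0.05, 0.4, 0.5),
    (0.30, 0.6, 0.8),
]

def bel(a, b):
    # mass of focal intervals [x, y] wholly inside [a, b]
    return sum(m for m, x, y in focal if x >= a and y <= b)

def pl(a, b):
    # mass of focal intervals intersecting [a, b]
    return sum(m for m, x, y in focal if x <= b and y >= a)

def q(a, b):
    # commonality: mass of focal intervals containing [a, b]
    return sum(m for m, x, y in focal if x <= a and y >= b)

print(round(bel(0.2, 0.7), 2))  # 0.12, matching the bel column of Table 1
print(round(q(0.2, 0.7), 2))    # 0.43: intervals 2 and 3 contain [0.2, 0.7]
print(round(pl(0.2, 0.7), 2))   # 1.0: every focal interval overlaps it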

Figure 4. The focal sets of the least committed basic belief density induced by a normal pignistic density, using the same triangle space as Figure 2. The point (μ, μ) is denoted by a donut, and the focal intervals form a line rising from it.

As can be seen in Figure 4, there is a relationship between the probability density function and the focal sets of the least committed bbd. The relationship between Betf(s) and φ(u) is

Betf(s) = ∫_{u=s}^{∞} [φ(u) / (u − ū)] du    (6)

where Betf(u) = Betf(ū), and through differentiation of Equation 6 we obtain

φ(s) = −(s − s̄) · dBetf(s)/ds    (7)

5 Joint tracking and classification using a particle filter and the cTBM

We show a scenario where we are tracking and identifying aircraft based on their air speeds, using the integrated particle filter and cTBM with prior information in the form of PDF's. The integration of the cTBM with a particle filter is novel. The classification technique that we base ours on is taken from a paper by Smets [2]. If we want to classify a target on its speed, we can construct pdf's of target speed conditional on target class, Betf(x|c_i). Currently we use Gaussian distributions for simplicity, but any distribution can be used. Using the least committed bbd theory we can construct the relevant underlying conditional bbd's for each of these pdf's, giving us m(x|c_i), pl(x|c_i), bel(x|c_i), etc. It is in fact the likelihoods that are required: l(c_i|x) = pl(x|c_i). The bbd is given by

φ(y) = (2y²/√(2π)) e^{−y²/2}    (8)

and the plausibility by

pl(y) = (2y/√(2π)) e^{−y²/2} + erfc(y/√2)    (9)

where y = (x − μ)/σ and erfc(s) = (2/√π) ∫_s^∞ e^{−t²} dt.

Using the GBT mentioned previously and described by Smets [1,2,9], we can obtain m(A|x), which can be used for making decisions on the class when given the speed of the target:

m(A|x) = Π_{c_i∈A} pl(x|c_i) · Π_{c_i∈Ā} [1 − pl(x|c_i)]    (10)

where Ā is the complement of A.

5.1 Integration of the continuous transferable belief model and a particle filter

We assume that every particle in the particle filter outputs a value of a variable that is being tracked, upon which a classification can be made within the cTBM. These classification outputs are weighted according to the particle weights and combined to create an overall classification. The resulting classification is then combined with the classification output from the previous time step to obtain a classification decision based on all the data.

The particle filter at time t = k has a state x_k = [posx, posy, velx, vely]^T, which is approximated by N samples, or particles, S_k^j = [x_k^j, w_k^j], where j = 1…N and w_k^j is the particle weight. For each particle we calculate a conditional belief function using Equation 10. We use the weight associated with each particle as a measure of how reliable that particle is, and weight the belief function through 'discounting' [9]; discounting is a process that allows an uncertainty over a belief to be taken into account within the cTBM framework. The discounting factor α is given by (1 − α) = w_k^j. To discount our belief function we use:

m^α(A|x) = (1 − α) · m(A|x),   ∀A ⊆ Ω, A ≠ Ω
m^α(Ω|x) = (1 − α) · m(Ω|x) + α    (11)

The bba given by application of Equation 10 will assign mass to the empty set ∅. In an open world, mass given to the empty set indicates that the true object is not one that we have covered with the prior pdf's. Since we use the particles' states to provide input to the cTBM, this occurs frequently with the outlying particles, and when the cTBM is used recursively these regular empty set masses give rise to an unrecoverable convergence on the empty set. For this reason, after discounting we remove the mass assigned to the empty set and redistribute it to Ω.

We combine the bba's from all of the particles by recursively applying a closed world normalised combination, Dempster's rule:

m_{1⊕2}(A) = Σ_{B∩C=A} m_1(B) m_2(C) / (1 − K)    (12)

where

K = Σ_{B∩C=∅} m_1(B) m_2(C)    (13)

We use the normalised version of the combination rule for the same reason as mentioned above with regard to empty set masses: used recursively, the unnormalised rule lets the mass associated with the empty set quickly dominate. A separate problem is that the mass on a singleton set can converge to 1.0; to prevent this we simply put an upper threshold of 0.99 on the mass associated with the singleton sets, and if at any point they exceed this we remove the additional mass and place it in Ω. This simple measure means that the cTBM can recover, or change its mind, about its classification. Once we have combined the bba's from all of the particles we have a fused belief function for that particular time k. To obtain a fused belief function for all time steps 1…k we use Equation 12 again to fuse these together, see Figure 5.

To obtain our classifications we use the pignistic transform on the fused conditional masses:

BetP{c_i|x} = Σ_{A: c_i∈A} (1/|A|) · m(A|x) / [1 − m(∅|x)]    (14)

The pignistic transform provides class probabilities and is a proper probability function.

6 Results

We have chosen to highlight the features of our approach through classifying a moving aircraft. We show the tracking and classification results of three aircraft: an A-10 tankbuster, a B-2 stealth bomber and an F-16 fighter jet. We also graphically demonstrate the outcome of the convergence and empty set domination problems. A track is created and PDF models chosen so as to highlight some of the features of our approach. The observed track produces positional coordinates for the object at each time frame, from which we deduce a velocity. The changes in position are presented to the particle filter for it to track with, and the velocity output of the particle filter is used for classification. One example of the tracks is given in Figure 6.
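One time step of the fusion described in Section 5.1 can be sketched as follows: Equation 9's plausibilities feed the GBT bba of Equation 10, which is discounted (Equation 11), fused across particles with Dempster's rule (Equations 12 and 13), capped for convergence protection, and read out with the pignistic transform (Equation 14). The Gaussian speed models and particle values below are illustrative, not the paper's, and the function names are ours.

```python
import math
from itertools import combinations

CLASSES = ("A-10", "B-2", "F-16")
OMEGA = frozenset(CLASSES)
MODELS = {"A-10": (5.0, 1.0), "B-2": (9.0, 1.0), "F-16": (14.0, 2.0)}  # (mu, sigma)

def pl_class(x, c):
    """Equation 9: plausibility of speed x under class c's Gaussian model."""
    mu, sigma = MODELS[c]
    y = abs(x - mu) / sigma
    return (2 * y / math.sqrt(2 * math.pi)) * math.exp(-y * y / 2) \
        + math.erfc(y / math.sqrt(2))

def gbt_bba(x):
    """Equation 10: m(A|x) for every subset A of Omega (empty set included)."""
    pls = {c: pl_class(x, c) for c in CLASSES}
    bba = {}
    for r in range(len(CLASSES) + 1):
        for A in combinations(CLASSES, r):
            A = frozenset(A)
            mass = 1.0
            for c in CLASSES:
                mass *= pls[c] if c in A else (1.0 - pls[c])
            bba[A] = mass
    return bba

def discount(bba, w):
    """Equation 11 with (1 - alpha) = w, then empty-set mass moved to Omega."""
    out = {A: w * m for A, m in bba.items()}
    out[OMEGA] = out.get(OMEGA, 0.0) + (1.0 - w) + out.pop(frozenset(), 0.0)
    return out

def dempster(m1, m2):
    """Equations 12 and 13: normalised conjunctive combination."""
    out, K = {}, 0.0
    for B, mB in m1.items():
        for C, mC in m2.items():
            A = B & C
            if A:
                out[A] = out.get(A, 0.0) + mB * mC
            else:
                K += mB * mC
    return {A: m / (1.0 - K) for A, m in out.items()}

def cap_singletons(bba, cap=0.99):
    """Convergence protection: excess singleton mass is moved to Omega."""
    out = dict(bba)
    for c in CLASSES:
        s = frozenset({c})
        if out.get(s, 0.0) > cap:
            out[OMEGA] = out.get(OMEGA, 0.0) + out[s] - cap
            out[s] = cap
    return out

def betp(bba):
    """Equation 14: pignistic class probabilities."""
    norm = 1.0 - bba.get(frozenset(), 0.0)
    return {c: sum(m / len(A) for A, m in bba.items() if c in A) / norm
            for c in CLASSES}

# One time step: fuse (speed, weight) pairs from three notional particles.
particles = [(13.2, 0.5), (14.1, 0.3), (12.7, 0.2)]
fused = {OMEGA: 1.0}  # vacuous prior
for x, w in particles:
    fused = cap_singletons(dempster(fused, discount(gbt_bba(x), w)))
probs = betp(fused)
print(max(probs, key=probs.get))  # F-16
```

Fusing the result forward through time is then one more `dempster` call per step, combining this time step's fused bba with the running bba for times 0..k-1, as in Figure 5.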

At each time step k
begin
    for each particle j = 1..N
    begin
        bba_time_k = combine bba of particle j WITH bba_time_k
    end
    bba_time_0..k = combine bba_time_k WITH bba_time_0..k-1
end

Figure 5. Pseudo code for the fusion of bba's

Figure 6. A-10 motion

Figure 7. Prior PDF's for the track shown in Figure 6 (probability P against velocity for classes B-2, A-10 and F-16)

6.1 Scenario 1

The observed velocity for the track shown in Figure 6 is shown in Figure 8, along with the velocity estimated by the particle filter.


Figure 8. Observed and estimated velocities of track shown in Figure 6

Figure 11. Prior PDF's for target model's velocities


It should be noted that the particle filter was initiated with samples from a broad prior; hence the poor initial estimates of the object state. This appears to be a very simple classification task, which the cTBM is shown to succeed at in Figure 9, but within the first 20 frames there is confusion as to the aircraft type. This is solely due to the particle filter settling down from its poor initialisation.

6.2 Scenario 2


Figure 12. Observed and estimated velocities for aircraft shown in Figure 10

Figure 13 shows the classification output from the cTBM when we do not prevent the singleton sets from converging to 1.


Figure 9. Classification of aircraft shown in Figure 6 using prior PDF's shown in Figure 7

An object motion of an A-10 is shown in Figure 10.

Figure 13. Classification of object shown in Figure 10 using prior PDF's shown in Figure 11

Figure 10. B-2 motion

We used the prior PDF's shown in Figure 11 and have the observed and estimated velocities shown in Figure 12.

By frame 20 we can see that aircraft type A-10 has been classified and that that particular singleton set has converged to 1. Note that the system is now unable to change its mind in any way. It can be argued that if the system changes its mind then either our models are incorrect or an open world should be assumed. However, the use of the particle filter can introduce errors, which necessitates a recovery mechanism. A closed world where mass is prevented from converging in singleton sets is a more robust approach. The simplest method is to set an upper limit close to 1 for the bbm's of the underlying belief function; we call this convergence protection. Later in the time series the velocity suggests that the target is in fact an F-16, but due to the previous convergence the cTBM is unable to recover.
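The latch-up that convergence protection prevents can be seen in a toy calculation (invented numbers, not the paper's data): under Dempster's rule, a singleton whose mass has reached 1.0 can never lose it, however strong later evidence for another class is.

```python
def dempster(m1, m2):
    """Normalised conjunctive combination (Equations 12 and 13)."""
    out, conflict = {}, 0.0
    for B, mB in m1.items():
        for C, mC in m2.items():
            A = B & C
            if A:
                out[A] = out.get(A, 0.0) + mB * mC
            else:
                conflict += mB * mC
    return {A: v / (1.0 - conflict) for A, v in out.items()}

A10, F16 = frozenset({"A-10"}), frozenset({"F-16"})
BOTH = frozenset({"A-10", "F-16"})

converged = {A10: 1.0}                    # singleton has converged to 1
evidence_for_f16 = {F16: 0.9, BOTH: 0.1}  # strong evidence for the F-16

latched = dempster(converged, evidence_for_f16)
# All mass remains on {A-10}: the classifier can never change its mind.

capped = {A10: 0.99, BOTH: 0.01}          # convergence protection applied
recovered = dempster(capped, evidence_for_f16)
# Now {F-16} receives non-zero mass, so repeated evidence can overturn
# the initial classification.
```

With the 0.99 cap, the small residual mass on Ω-like supersets lets conflicting evidence re-open the decision over successive updates.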

6.3 Scenario 3


A more ambiguous example is now presented, and the TBM results are compared to a more classical Bayesian classification, which classifies on the estimated target velocity from the particle filter. The aircraft's motion is shown in Figure 14.


Figure 17. Classification of object shown in Figure 14 using prior PDF's shown in Figure 11 and with convergence protection

Figure 16 and Figure 17 show how the cTBM is affected when one of the singletons' masses converges to 1. The output appears to latch, unable to recover from an erroneous initial conclusion: if the initial classification is incorrect, the true classification will never be reached, however obvious it may become.

Figure 14. F-16 object motion


Figure 18. Prior PDF's for target model's velocities


Figure 15. Estimated and observed velocities for target shown in Figure 14

We now try to classify the same object motion given in Figure 14, but using the altered PDF models shown in Figure 18, with both the cTBM approach described in this paper and a simple Bayesian classifier that uses the same PDF's and the mean estimate of velocity derived from the particles within the particle filter.
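The Bayesian baseline, as we read it, is a recursive Bayes update of class probabilities driven by the mean velocity estimate from the particle filter. The sketch below uses illustrative Gaussian models and measurements; names are ours.

```python
import math

MODELS = {"A-10": (5.0, 1.0), "B-2": (9.0, 1.0), "F-16": (14.0, 2.0)}  # (mu, sigma)

def gauss(x, mu, sigma):
    """Gaussian likelihood of velocity x under one class model."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def bayes_step(prior, v):
    """p(c | v_1..k) proportional to p(c | v_1..k-1) * p(v_k | c)."""
    post = {c: prior[c] * gauss(v, *MODELS[c]) for c in MODELS}
    z = sum(post.values())
    return {c: p / z for c, p in post.items()}

p = {c: 1.0 / 3 for c in MODELS}  # uniform prior over the classes
for v in [13.0, 13.8, 14.5]:      # mean velocity estimates per frame
    p = bayes_step(p, v)
print(max(p, key=p.get))          # F-16
```

Because each update multiplies by the likelihood of the single mean estimate, one noisy outlier can swing the posterior sharply, which is the sensitivity the comparison below illustrates.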


Figure 16. Classification of object shown in Figure 14 using prior PDF's shown in Figure 11 and with no convergence protection


Figure 19. Classification of object shown in Figure 14 using prior PDF's shown in Figure 18 and with convergence protection


Figure 20. Classification of object shown in Figure 14 using prior PDF's shown in Figure 18 and using Bayesian classification

Comparison between Figure 19 and Figure 20 highlights how a frame-wise classification technique can be very susceptible to noisy inputs and over-sensitive to outlying measurements. Both techniques give a similar classification output until approximately time 600; from this point we can see a little more how the two techniques differ. The cTBM is concerned with how close the measurement is to the mean of the PDF for each class, as well as its probability, which differs from the Bayesian approach, which looks only at the probability for that speed.

6.4 Discussion

If we used an open world example here, a large mass would be given to the empty set due to the presence of conflicting information. The target is one of our modelled targets, but the cTBM is not sure which. Intuitively, it should pick F-16, as this is the only model that completely covers the velocity measurements; the problem is that such intuitive performance is not currently output by the cTBM. Within the cTBM there will also be masses assigned to non-singleton sets, which can be analysed to give us Type A or Type B classifications. Even though we are normalising away the empty set, its values prior to normalisation can be used to show conflict within the measurements, which also denotes the possibility of the target not being a member of the target set.

7 Conclusions

We have shown how a particle filter and the continuous transferable belief model have been integrated, for the first time, within our system. We have noted the weak points of the cTBM in such a heavily recursive application and proposed methods to address them. When compared to a simple Bayesian classifier, our method is shown to be more robust for classification. By using the cTBM we have a truly flexible approach to classification: its ability to transfer beliefs when the world we are working within changes is a very useful feature for object classification, and one that will be looked at in the future. We found three avenues for future research. Firstly, the lack of intelligent memory in the cTBM: no knowledge of a target's behaviour is stored, so it is possible for a target to behave unlike a particular class yet still be classified as a member of that class. Secondly, the empty set has a very important role to play in the cTBM in articulating conflict and the presence of a model that is not a member of the model set. Thirdly, the recursive updating of beliefs in an open world remains a challenge.

Acknowledgements

Thanks go to the Ministry of Defence, Data Information Fusion Defence Technology Centre for funding project 4.10, Statistical Fusion of Battlefield Data, of which this paper is a part. Also a special thanks to the late Philippe Smets for his assistance, and to his family for their hospitality.

References

[1] P. Smets, Belief functions on real numbers, International Journal of Approximate Reasoning, Vol. 40, No. 3, pp. 181-223, 2005.
[2] F. Caron, P. Smets, E. Duflos and P. Vanheeghe, Multisensor data fusion in the frame of the TBM on reals. Application to land vehicle positioning, Proc. International Conference on Information Fusion, Philadelphia, USA, 2005.
[3] P. Smets and B. Ristic, Belief function theory on the continuous space with an application to model based classification, Proc. Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU), Italy, pp. 1119-1126, July 2004.
[4] S. Arulampalam, S. Maskell, N. Gordon and T. Clapp, A tutorial on particle filters for on-line non-linear/non-Gaussian Bayesian tracking, IEEE Transactions on Signal Processing, Vol. 50, No. 2, pp. 174-188, February 2002.
[5] N. Gordon, S. Maskell and T. Kirubarajan, Efficient particle filters for joint tracking and classification, Proc. SPIE: Signal and Data Processing of Small Targets, pp. 439-449, 2002.
[6] A. P. Dempster, A generalization of Bayesian inference, Journal of the Royal Statistical Society, Series B, Vol. 30, pp. 205-247, 1968.
[7] G. Shafer, A Mathematical Theory of Evidence, Princeton University Press, Princeton, NJ, 1976.
[8] P. Smets and R. Kennes, The transferable belief model, Artificial Intelligence, Vol. 66, pp. 191-234, 1994.
[9] P. Smets, Belief functions: the disjunctive rule of combination and the generalized Bayesian theorem, International Journal of Approximate Reasoning, Vol. 9, pp. 1-35, 1993.