J. Appl. Prob. 35, 633–641 (1998)
Printed in Israel
© Applied Probability Trust 1998
EXPLICIT SUFFICIENT INVARIANTS FOR AN INTERACTING PARTICLE SYSTEM

YOSHIAKI ITOH,∗ Institute of Statistical Mathematics, Tokyo
COLIN MALLOWS,∗∗ AT&T Laboratories, Florham Park
LARRY SHEPP,∗∗∗ Rutgers University
Abstract

We introduce a new class of interacting particle systems on a graph G. Suppose initially there are N_i(0) particles at each vertex i of G, and that the particles interact to form a Markov chain: at each instant two particles are chosen at random, and if these are at adjacent vertices of G, one particle jumps to the other particle's vertex, each with probability 1/2. The process N enters a death state after a finite time when all the particles are in some independent subset of the vertices of G, i.e. a set of vertices with no edges between any two of them. The problem is to find the distribution of the death state, η_i = N_i(∞), as a function of N_i(0). We are able to obtain, for some special graphs, the limiting distribution of N_i if the total number of particles N → ∞ in such a way that the fraction, N_i(0)/N = ξ_i, at each vertex is held fixed as N → ∞. In particular we can obtain the limit law for the graph S_2, the two-leaf star which has three vertices and two edges.

Keywords: Interacting particle systems; stochastic invariants; graph processes

AMS 1991 Subject Classification: Primary 60K35, 60J70, 58F07, 05C80
Received 29 May 1996; revision received 16 April 1997.
∗ Postal address: The Institute of Statistical Mathematics, 4-6-7 Minami-Azabu, Minato-ku, Tokyo 106-8589, Japan. Email address: [email protected]
∗∗ Postal address: AT&T Laboratories, 180 Park Avenue, Florham Park, NJ 07932, USA.
∗∗∗ Postal address: Department of Statistics, Rutgers University, Hill Center, Piscataway, NJ 08903, USA.

1. A new class of multiparticle systems

With an eye to modelling applications to genetics, to contagion, to voting, etc., and gaining a better understanding of multiparticle systems by searching for the few rare problems of this type allowing explicit solution, we formulate a new class of processes as follows. In its discrete version, we associate with any graph G a stochastic process N(t) = (N_i(t), i ∈ G), t ≥ 0, where N_i(t) is the integer number of particles at vertex i of G at time t ≥ 0. Say we start from given integers N_i^0 = N_i(0), i ∈ G. The total number of particles in the system remains N = Σ_{i∈G} N_i(t) = Σ_{i∈G} N_i^0 for all t ≥ 0. At each instant of time, two of the N particles are chosen at random, and if these two happen to be at adjacent vertices, then one of the two particles jumps to the other one's vertex, each with probability 1/2. This process might be called the 'political positions process', where the vertices of G represent various political positions and an advocate for one position may attract people with neighbouring views. This
has some similarity to the voter model in interacting particle systems (Liggett (1985), Bramson and Griffeath (1989)). For the complete graph, the model is that of Moran (1958) for the Fisher–Wright random sampling effect in population genetics. In the more general case the model might be applied to study speciation in biology as well as political positionings. For example, consider a genetic system for m alleles A_i, i = 1, 2, . . . , m, in which zygotes A_i A_j are fertile for j = i − 1, i, i + 1 and infertile for the other j. This problem was studied numerically by Nei et al. (1982), considering the Fisher–Wright random sampling effect with some selection structure. Our present model has a random sampling effect depending on the structure of a graph, which could be a natural simplified model of the genetic problem. The graph R_{2k}, which is a regular polygon with 2k vertices and all edges present except those joining opposite vertices, is a special case of our genetic model and will be studied in Section 3.

Eventually the number of particles at some vertex becomes zero, and thereafter that vertex remains empty and the process lives on the subgraph of G with this vertex removed. The process continues until all vertices are removed except those of an independent subset I of G. (A non-empty subset I ⊂ G is called independent if there are no edges between points of I.) As soon as all the particles lie in some I, the process stops and no further interaction occurs. The problem is to find the probability distribution of the death state as a function of the initial state N_i(0). Let τ be the first time to reach some independent set I(τ), and denote by N_i(τ) the number of particles in i at time τ. We would like to determine P(N(τ) | N(0)) and P(I(τ) | N(0)). It is easy to see that N_i(t) is a martingale for each i, so that

E N_i(τ) = E N_i(0) = N_i(0).   (1.1)
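To make the discrete dynamics concrete, here is a minimal Python sketch (the helper name `simulate` and the particular numbers are purely illustrative, not part of the analysis) that runs the chain on an arbitrary graph until the occupied vertices form an independent set, and uses the two-leaf star S_2 to illustrate the consequence E N_1(τ) = N_1(0) of (1.1).

    import random

    def simulate(adj, counts):
        """Run the discrete particle process until the occupied vertices form an
        independent set; adj[v] is the neighbour set of v, counts[v] = N_v(0)."""
        counts = dict(counts)
        while True:
            occupied = [v for v, c in counts.items() if c > 0]
            # death state: no two occupied vertices are adjacent
            if all(u not in adj[v] for u in occupied for v in occupied if u != v):
                return counts
            # choose two distinct particles uniformly at random
            particles = [v for v in occupied for _ in range(counts[v])]
            a, b = random.sample(range(len(particles)), 2)
            u, v = particles[a], particles[b]
            if v in adj[u]:                    # they interact only if adjacent
                if random.random() < 0.5:      # one jumps to the other's vertex
                    counts[u] += 1; counts[v] -= 1
                else:
                    counts[u] -= 1; counts[v] += 1

    # Two-leaf star S_2 (hub 0): the empirical mean of N_1(tau) should be near N_1(0).
    adj = {0: {1, 2}, 1: {0}, 2: {0}}
    runs = [simulate(adj, {0: 4, 1: 3, 2: 3}) for _ in range(2000)]
    print(sum(r[1] for r in runs) / len(runs))   # close to 3, by (1.1)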
However, determining P(I(τ) | N(0)) appears to be a very difficult problem even for the simple graph S_2 (Figure 1), with three vertices and two edges. For any complete graph, K_r, it is trivial. Indeed for the complete graph the only independent sets are singletons, and we have P(I(τ) = {i}) = N_i(0)/N. For other graphs G, where there are other than singleton independent sets, the problem is not so easy; indeed in the discrete case we cannot find the solution for the first non-trivial such graph, the two-leaf star, which we will call S_2. Denote the vertices of S_2 by 0, 1, 2, with 0 at the centre joined to both 1 and 2. It is easy to see that N_0(t) is a martingale so that, as in (1.1),

P{N_0(τ) = N} = N_0(0)/N.   (1.2)
Of course (1.2) holds for any vertex 0 in any graph G provided 0 is connected to all other points. But this argument does not yield the distribution of N_1(τ) for S_2, because there are many death states in which N_1(τ) has a non-zero value and E N_1(τ) yields only the mean value. It is (apparently) very difficult to determine the distribution of N_1(τ) even in this simple case, except when the total number of particles, N = Σ_{i∈S_2} N_i(t), is very small.

Although the discrete version of the process is thus not amenable to explicit calculation, the situation is different for the continuous version obtained by letting N → ∞ in such a way that, for each i ∈ G, N_i(0)/N is held fixed. The following Ito process is then obtained. Let G be any graph, and let ξ_i ≥ 0, i ∈ G, with Σ_{i∈G} ξ_i = 1, be given. We will define X_i(t), i ∈ G, t ≥ 0, with X_i(0) = ξ_i, as the solution to the stochastic Ito equation for t ≥ 0,

dX_i = Σ_{j∈N_i} √(X_i X_j) dB_{ij},   i ∈ G,   (1.3)
where N_i is the set of neighbours of i in G, and B_{ij} are independent standard Wiener processes for distinct pairs {i, j}, with the skew-symmetry property if the order is reversed,

B_{ji}(t) = −B_{ij}(t),   t ≥ 0.   (1.4)
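For readers who want to see (1.3)–(1.4) in action, the following crude Euler–Maruyama sketch (our own illustration, not a scheme from the paper) simulates the diffusion on a graph. One independent Brownian increment is drawn per edge and applied with opposite signs at its two endpoints, which enforces B_ji = −B_ij and conserves Σ_i X_i; clipping and renormalising after each step is a rough way to keep the path on the simplex.

    import numpy as np

    def diffusion_step(x, adj, dt, rng):
        """One Euler-Maruyama step of dX_i = sum_{j in N_i} sqrt(X_i X_j) dB_ij."""
        dx = np.zeros(len(x))
        for i in range(len(x)):
            for j in adj[i]:
                if j > i:                                  # one increment per edge
                    db = rng.normal(0.0, np.sqrt(dt))
                    move = np.sqrt(max(x[i] * x[j], 0.0)) * db
                    dx[i] += move                          # B_ij acts on vertex i
                    dx[j] -= move                          # and with opposite sign on j
        x = np.clip(x + dx, 0.0, None)
        return x / x.sum()                                 # crude projection back to the simplex

    # S_2 with hub 0: iterate until the occupied set is (numerically) independent.
    adj = {0: [1, 2], 1: [0], 2: [0]}
    rng = np.random.default_rng(1)
    x = np.array([0.4, 0.3, 0.3])                          # (xi_0, xi_1, xi_2)
    for _ in range(200000):
        x = diffusion_step(x, adj, 1e-4, rng)
        if x[0] < 1e-9 or x[1] + x[2] < 1e-9:              # death states of S_2
            break
    print(x)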
Existence and uniqueness of a process X_i(t), i ∈ G, t ≥ 0, satisfying (1.3) follows for the complete graph case from the works by Ethier (1976) and Sato (1976). The existence and uniqueness of the diffusion for an arbitrary graph is a consequence of a result given by Ethier and Kurtz (1986). Their Proposition 1.3.5 applies, and the existence and uniqueness follow. By (1.6) given later and Ito's formula, the diffusion in question has generator

A f(x) = (1/2) Σ_{i,j=1}^{r} a_{ij}(x) D_i D_j f(x)   (1.5)
with a_{ii}(x) = x_i Σ_{j∈N_i} x_j, a_{ij}(x) = −x_i x_j if i ≠ j and j ∈ N_i, and a_{ij}(x) = 0 otherwise. It is straightforward to show that (a_{ij}(x)) is non-negative definite, hence A is dissipative. Because these coefficients are quadratic polynomials in the components of x, A maps the space of polynomials of degree at most n into itself. We have verified the conditions of the proposition, and consequently A is closable and its closure generates a Feller semigroup on C(K), K being the simplex corresponding to the vertices of the graph. This suffices for existence and uniqueness of solutions of the martingale problem for A, or equivalently, existence and uniqueness of weak solutions of the stochastic Ito equation (1.3).

Thus, it is clear from (1.3) and (1.4) that Σ_{i∈G} dX_i = 0, so that Σ_{i∈G} X_i(t) = 1 for all t ≥ 0, and that there exists a first time τ ≥ 0 for which {i : X_i(τ) > 0} = I(τ) is an independent subset of G and P(τ < ∞) = 1, i.e. the situation is the same for X as for N. Indeed if the total number of particles N = Σ_{i∈G} N_i(0) in the discrete process tends to infinity in such a way that N_i(0)/N is held fixed for each i ∈ G, then the limiting process of N(t)/N is X given by (1.3); see e.g. Ethier (1976) and Sato (1976).

Nonlinear integrable systems have been a central theme in Newtonian mechanics since Liouville showed that a system with n degrees of freedom is integrable if n independent invariants are known. The Toda lattice is known to have n independent invariants and to be integrable in Liouville's sense (Flaschka (1974), Henon (1974), Bogoyavlensky (1988)). The KdV equation was shown to have infinitely many invariants by Gardner, Greene, Kruskal, Lax, Miura, and Zabusky; see Arnold (1989) p. 543 and Lax (1968). In the stochastic models considered here and in Itoh (1979, 1993), the notion of 'invariant' is replaced by 'martingale'; although the stochastic models are quite different, the analogy is quite strong. Having a sufficient set of martingale 'invariants' is what enables us to obtain the limiting distribution of the Markov chain.

It appears to be always true that for any graph G there are enough martingales, polynomial functions of X, to determine the distribution of I(τ) and X(τ) uniquely. Indeed for every n ≥ 1 there seem to be one or more homogeneous polynomials of degree n in X_i, i ∈ G, which are martingales. At least this appears to be the case for the examples we will study here, though we have no general proof. On the other hand it seems not to be easy in general to find these polynomials. Of course (Σ_{i∈G} X_i)^n is trivially a martingale. It is clear from (1.3) that for each i ∈ G, X_i(t) is a martingale, since there is no dt term on the right in (1.3). Using the multiplication table for Ito calculus, McKean (1969) p. 44, gives
(dX_i)^2 = X_i Σ_{j∈N_i} X_j dt   (1.6a)

(dX_i)(dX_j) = −X_i X_j dt if j ∈ N_i,   (dX_i)(dX_j) = 0 if j ∉ N_i.   (1.6b)
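The non-negative definiteness of (a_{ij}(x)) asserted above can be seen from the decomposition a(x) = Σ_{edges {i,j}} x_i x_j (e_i − e_j)(e_i − e_j)^T, which is just the covariation structure (1.6) written edge by edge. A quick numerical spot check (our own sketch, not from the paper) on the in-line graph L_4:

    import numpy as np

    def diffusion_matrix(x, adj):
        """a_ii = x_i * sum_{j in N_i} x_j, a_ij = -x_i x_j for j in N_i, else 0."""
        a = np.zeros((len(x), len(x)))
        for i in range(len(x)):
            for j in adj[i]:
                a[i, i] += x[i] * x[j]
                a[i, j] = -x[i] * x[j]
        return a

    adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}     # the in-line graph L_4
    x = np.random.default_rng(0).dirichlet(np.ones(4))
    eigenvalues = np.linalg.eigvalsh(diffusion_matrix(x, adj))
    print(eigenvalues.min() >= -1e-12)               # expect True: non-negative definite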
It now follows from (1.6) that if n ≥ 2 and Y(t) is a homogeneous polynomial of degree n in X_i(t), i ∈ G, i.e. a sum of terms of the form Π_{i∈G} X_i^{a(i)}, where the a(i) are non-negative integer exponents with Σ_{i∈G} a(i) = n, then dY(t) = Q dB + R dt, where R is again a homogeneous polynomial of the same degree n, and Q is the Brownian term. The above conjecture is that there are enough homogeneous polynomials Y for which R = 0, so that Y is a martingale, and so that the resulting equalities EY(τ) = Y(0) determine the distribution of I(τ) and X(τ). We will study one graph where this conjecture holds and can be made to produce the limiting death state law. Despite the paucity of examples, we would be surprised if the conjecture does not hold in general, although finding the right coefficients will surely not be easy for general graphs.
Figure 1: The two-leaf star, S_2; the in-line graph, L_4; the star graph, S_r.
The following is a set of homogeneous polynomial martingales for the star graph, S_r, with r + 1 nodes, where there are r leaves each connected to a central root, 0 (Figure 1). Let α_1 + · · · + α_r = 0 and n ≥ r. For any real numbers α_i, i = 1, 2, . . . , r, define Y = Y_n^α by

Y(t) = Σ_{i_1≥1,...,i_r≥1; i_1+···+i_r=n} C(n; i) C(n − r; i − 1) (α_1 X_1(t))^{i_1} · · · (α_r X_r(t))^{i_r},   (1.7)

where i = (i_1, . . . , i_r), i − 1 = (i_1 − 1, . . . , i_r − 1), C(n; i) = n!/(i_1! · · · i_r!) is the multinomial coefficient, and the sum is over all integer partitions of n, so that Y_n^α is a homogeneous polynomial of degree n. To show this is a martingale, note that since (1.6b) gives dX_i dX_j = 0 for i ≠ 0, j ≠ 0,

dY(t) = Q dB + Σ_{j=1}^{r} Σ_i C(n; i) C(n − r; i − 1) (i_j(i_j − 1)/2) α_j^{i_j} X_j^{i_j−1} X_0 Π_{k≠j} (α_k X_k)^{i_k} dt.   (1.8)
In the j-th sum replace i_j by i_j + 1 and rearrange the multinomial coefficients, to obtain that each term in the sum over j is identical except that the j-th term has a factor α_j. Thus Σ_j α_j factors out, and since Σ_j α_j = 0, there is no dt term in dY and so Y is a martingale. The reader may wonder how we ever guessed (1.7); the only insight we can give is that in the case r = 2 we noticed that X_1 X_2 is a martingale for S_2, and this seemed curious and worthy of extension.
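The vanishing of the dt term in (1.8) can also be checked symbolically. The sketch below (sympy, our own verification rather than part of the paper) builds Y_n^α from (1.7) for the star S_3 with n = 4, applies the drift part of the generator (which for the star reduces to (1/2) Σ_j x_j x_0 ∂²/∂x_j², since Y does not involve x_0 and distinct leaves have zero covariation by (1.6b)), and confirms that it vanishes when α_1 + α_2 + α_3 = 0.

    import sympy as sp
    from itertools import product

    r, n = 3, 4
    x = sp.symbols('x0:4')          # x0 = hub, x1, x2, x3 = leaves
    alpha = sp.symbols('a1:4')      # alpha_1, alpha_2, alpha_3

    def multinom(total, parts):
        m = sp.factorial(total)
        for p in parts:
            m /= sp.factorial(p)
        return m

    # Y_n^alpha from (1.7): sum over i_1,...,i_r >= 1 with i_1 + ... + i_r = n
    Y = 0
    for i in product(range(1, n + 1), repeat=r):
        if sum(i) != n:
            continue
        term = multinom(n, i) * multinom(n - r, [k - 1 for k in i])
        for k in range(r):
            term *= (alpha[k] * x[k + 1]) ** i[k]
        Y += term

    # Drift of Y for the star: (1/2) sum_j x_j x_0 d^2 Y / dx_j^2
    drift = sum(sp.Rational(1, 2) * x[j] * x[0] * sp.diff(Y, x[j], 2)
                for j in range(1, r + 1))
    drift = drift.subs(alpha[2], -alpha[0] - alpha[1])   # impose alpha_1 + alpha_2 + alpha_3 = 0
    print(sp.simplify(drift) == 0)                       # expect True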
The simplest case of this is r = 2, and in this case we can use the resulting family of martingales to determine the limiting distribution explicitly. It is no loss to take α_1 = −1, α_2 = 1, and in this case there is one martingale for each n ≥ 2, given by

Y_n(t) = Σ_{i=1}^{n−1} C(n, i) C(n − 2, i − 1) (−1)^i X_1^i(t) X_2^{n−i}(t).   (1.9)
In the next section we will use the martingale property

E Y_n(τ) = E Y_n(0)   (1.10)
to obtain the laws of I(τ) and X(τ) as a function of ξ = X(0). It is clear that the corresponding equations (1.10) for the S_r graph for all α in (1.8) would also determine all the moments of X(τ), but it does not seem easy to carry this out explicitly.

Remark. It is perhaps remarkable that an infinite family of homogeneous martingales can be found in the continuous case but not in the discrete case. Putting N_i in place of X_i on the right of (1.9) gives a martingale in the discrete case for n = 2, 3, 4, 5. For n ≥ 6 there are non-homogeneous martingales; for example, for S_2 with n = 9, we find the martingale

N_1^8 N_2 − 28 N_1^7 N_2^2 + 196 N_1^6 N_2^3 − 490 N_1^5 N_2^4 + 490 N_1^4 N_2^5 − 196 N_1^3 N_2^6 + 28 N_1^2 N_2^7 − N_1 N_2^8 + 420 N_1^5 N_2^2 − 1890 N_1^4 N_2^3 + 1890 N_1^3 N_2^4 − 420 N_1^2 N_2^5 − 1722 N_1^3 N_2^2 + 1722 N_1^2 N_2^3   (1.11)
and we have found a formula for the terms of degree n − 2, but the general form escapes us. Of course the limit law of N(τ) is not the same as that of X(τ), since the former is purely discrete. The glib advantage of using X instead of N is that the second derivative operator is simpler than the second difference.

2. The limit for S_2

We now determine the law of X(τ) for the graph S_2. From (1.10), for n ≥ 2

Σ_{i=1}^{n−1} C(n, i) C(n − 2, i − 1) (−1)^i E X_1^i(τ) X_2^{n−i}(τ) = Σ_{i=1}^{n−1} C(n, i) C(n − 2, i − 1) (−1)^i X_1^i(0) X_2^{n−i}(0).   (2.1)
Since i ≥ 1 in (2.1), we have

X_1^i(τ) X_2^{n−i}(τ) = X_1^i(τ) (1 − X_1(τ))^{n−i}   (2.2)
because if X_1(τ) = 0, which happens in the death state (X_1, X_0, X_2) = (0, 1, 0), then (2.2) clearly holds (since i ≥ 1). In any other death state, X_1(τ) + X_2(τ) = 1, since none of the mass is at 0, i.e. X_0(τ) = 0 and X_1(τ) + X_0(τ) + X_2(τ) = 1. Thus for n ≥ 2

∫_0^1 Σ_{i=1}^{n−1} C(n, i) C(n − 2, i − 1) (−1)^i x^i (1 − x)^{n−i} µ(dx) = Σ_{i=1}^{n−1} C(n, i) C(n − 2, i − 1) (−1)^i ξ_1^i ξ_2^{n−i}   (2.3)
where ξ_i = X_i(0) and µ is the distribution of X_1(τ). Note that if ξ_1 + ξ_2 = 1, X(0) is already a death state. We want to obtain µ(dx) = P{X_1(τ) ∈ dx}, and note that µ(dx) = µ(dx; ξ_1, ξ_2) and µ(dx) = δ(ξ_1, 0, ξ_2) if ξ_1 + ξ_2 = 1. Also we saw in (1.2) that

P{X_1(τ) = X_2(τ) = 0} = P{X_0(τ) = 1} = ξ_0 = 1 − ξ_1 − ξ_2   (2.4)
and since X_1(t) is a martingale,

E(X_1(τ)) = ξ_1 = ∫_0^1 x dµ(x).   (2.5)
It is clear that for j ≥ 2 the moments ∫_0^1 x^j µ(dx) can be obtained successively from (2.3), and that these moments determine µ since the moment problem on [0, 1] is determinate. How to actually do it? With Y_n as in (1.9), define for any u the process

Z_u(t) = Σ_{n≥2} (u^n/n) Y_n(t),   t ≥ 0.   (2.6)
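As a sketch of the successive-moment computation just mentioned (our own illustration; the variable names are hypothetical), note that each relation (2.3) is linear in the moments m_j = ∫ x^j µ(dx), and the coefficient of the new moment m_n is (−1)^n Σ_i C(n, i) C(n − 2, i − 1) ≠ 0, so m_n can be solved for in terms of m_0 = 1, m_1 = ξ_1 (from (2.5)) and the earlier moments. The generating function (2.6)–(2.7) taken up next packages the same information in closed form.

    from math import comb

    def moments(xi1, xi2, kmax):
        """m_j = int_0^1 x^j mu(dx) for j = 0..kmax, solved successively from (2.3)."""
        m = [1.0, xi1]                               # m_0 = 1, m_1 = xi_1 by (2.5)
        for n in range(2, kmax + 1):
            rhs = sum(comb(n, i) * comb(n - 2, i - 1) * (-1) ** i
                      * xi1 ** i * xi2 ** (n - i) for i in range(1, n))
            # expand the left side of (2.3) in powers of x: sum_j coef[j] * m_j
            coef = [0.0] * (n + 1)
            for i in range(1, n):
                c = comb(n, i) * comb(n - 2, i - 1) * (-1) ** i
                for k in range(n - i + 1):
                    coef[i + k] += c * comb(n - i, k) * (-1) ** k
            known = sum(coef[j] * m[j] for j in range(n))
            m.append((rhs - known) / coef[n])        # coef[n] = (-1)^n * C(2n-2, n-1) != 0
        return m

    print(moments(0.3, 0.3, 6))                      # m_2 = xi_1 (1 - xi_2), etc.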
The following identity is valid for |v| < 1/4, 0 ≤ x ≤ 1,

Σ_{n≥2} (v^n/n) Σ_{i=1}^{n−1} C(n, i) C(n − 2, i − 1) (−1)^i x^i (1 − x)^{n−i} = xv + ((1 − v)/2) [1 − √(1 + 4xv/(1 − v)^2)]   (2.7)
and may be seen by expanding (1 − x)^{n−i} and interchanging sums. Multiplying in (2.3) by u^n/n and summing over n ≥ 2, we find using (2.7) with v = u, that for |u| < 1/4,

E Z_u(τ) = ∫_0^1 { xu + ((1 − u)/2) [1 − √(1 + 4xu/(1 − u)^2)] } µ(dx).   (2.8)

If we use (2.7) now with v = u(ξ_1 + ξ_2) and x = ξ_1/(ξ_1 + ξ_2), we find

Z_u(0) = ξ_1 u + ((1 − u(ξ_1 + ξ_2))/2) [1 − √(1 + 4uξ_1/(1 − u(ξ_1 + ξ_2))^2)].   (2.9)
Since for |u| < 1/4, Z_u is also a bounded martingale, Z_u(0) = E Z_u(τ), and so after some rearranging using ∫ µ(dx) = 1, ∫ x µ(dx) = ξ_1, we arrive at the following equation for µ(dx) = P(X_1(τ) ∈ dx), valid for |u| < 1/4:

∫_0^1 √((1 − u)^2 + 4ux) µ(dx) = u(ξ_1 + ξ_2 − 1) + √(1 + 2u(ξ_1 − ξ_2) + u^2(ξ_1 + ξ_2)^2).   (2.10)
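The generating-function identity (2.7), on which (2.8)–(2.10) rest, is easy to double-check by comparing Taylor coefficients; the sympy sketch below (our own check, with an arbitrary fixed value of x) does this up to order 8.

    import sympy as sp

    v = sp.symbols('v')
    x = sp.Rational(1, 3)          # any fixed 0 <= x <= 1 works for a spot check
    N = 8                          # compare coefficients of v^2, ..., v^N

    # Left side of (2.7)
    lhs = sum(v**n / n * sum(sp.binomial(n, i) * sp.binomial(n - 2, i - 1)
                             * (-1)**i * x**i * (1 - x)**(n - i)
                             for i in range(1, n))
              for n in range(2, N + 1))

    # Right side of (2.7), Taylor-expanded in v to the same order
    rhs = x*v + (1 - v)/2 * (1 - sp.sqrt(1 + 4*x*v/(1 - v)**2))
    rhs_poly = sp.series(rhs, v, 0, N + 1).removeO()

    print(sp.expand(lhs - rhs_poly))   # expect 0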
We can turn (2.10) into Abel's equation for µ. Divide the integrand by 2√u and define w and θ in terms of u by

u = e^{iθ},   w = −(1 − u)^2/(4u) = (1 − cos θ)/2.   (2.11)
Then (2.10) is the same as

∫_0^1 √(x − w) µ(dx) = ((ξ_1 + ξ_2 − 1)/2) √u + (1/2) √(1/u + 2(ξ_1 − ξ_2) + u(ξ_1 + ξ_2)^2).   (2.12)
Now (2.11) and (2.12) hold for large negative w and by analytic continuation (2.12) holds for w < 0, i.e. 0 < u < 1. We would like to continue to w > 0. The left side of (2.12) is
analytic in the upper half plane of w which corresponds to the region {|u| < 1, Im(u) > 0}. Now if we let w tend to the real axis, 0 < w < 1, we get by continuity that (2.12) continues to hold for u, w, and θ, linked as in (2.11) with 0 ≤ θ ≤ π. Since √(x − w) = −i√(w − x) for 0 ≤ x < w, if we take the imaginary part in (2.12) for 0 < w < 1, we find

∫_0^w √(w − x) µ(dx) = ((1 − ξ_1 − ξ_2)/2) √w − (1/2) Im √(A + iB),   0 ≤ w ≤ 1   (2.13)

with A = A(w) and B = B(w) given by

A = (1 − 2w)(1 + (ξ_1 + ξ_2)^2) + 2(ξ_1 − ξ_2)   (2.14)

B = 2√(w(1 − w)) ((ξ_1 + ξ_2)^2 − 1)   (2.15)

since sin(θ/2) = √w, cos θ = 1 − 2w, sin θ = 2√(w(1 − w)). Since

√(A + iB) = √((A + √(A^2 + B^2))/2) + i (B/√2) (A + √(A^2 + B^2))^{−1/2}   (2.16)

we find

∫_0^w √(w − x) µ(dx) = ((1 − ξ_1 − ξ_2)/2) √w − (B/(2√2)) (A + √(A^2 + B^2))^{−1/2}.   (2.17)
We show that µ has point mass p_0 at x = 0, p_1 at x = 1, and that for 0 < x < 1, µ(dx) = f(x) dx, with f(x) absolutely continuous. We can identify p_0 and p_1 as follows. For 0 < w < 1 the left side of (2.17) is

p_0 √w + ∫_{0+}^w √(w − x) µ(dx).   (2.18)

So p_0 must be the limit, as w → 0, of the right side of (2.17) divided by √w. Thus we obtain immediately

p_0 = P{X_1(τ) = 0} = ((1 − ξ_1 − ξ_2)/2) [1 + (1 + ξ_1 + ξ_2)/((1 + ξ_1 + ξ_2)^2 − 4ξ_2)^{1/2}].   (2.19)

Now we have observed in (2.4) that

P{X_0(τ) = 1} = P{X_1(τ) = X_2(τ) = 0} = ξ_0 = 1 − ξ_1 − ξ_2   (2.20)

so that

P{X_2(τ) = 1} = P{X_1(τ) = 0} − P{X_0(τ) = 1} = ((1 − ξ_1 − ξ_2)/2) [−1 + (1 + ξ_1 + ξ_2)/((1 + ξ_1 + ξ_2)^2 − 4ξ_2)^{1/2}].   (2.21)

By symmetry we have (interchanging 1 and 2) the point mass of µ at x = 1,

p_1 = P{X_1(τ) = 1} = ((1 − ξ_1 − ξ_2)/2) [−1 + (1 + ξ_1 + ξ_2)/((1 + ξ_1 + ξ_2)^2 − 4ξ_1)^{1/2}].   (2.22)
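The point masses (2.19) and (2.22) are easy to compare with a direct simulation of the discrete chain on S_2 with a moderately large number of particles. The sketch below (our own illustration, with small particle numbers and run counts chosen only to keep the runtime modest, so the agreement is approximate) implements the closed forms and the chain side by side.

    import random

    def p0(xi1, xi2):
        """P{X_1(tau) = 0} for S_2, equation (2.19)."""
        s = xi1 + xi2
        return 0.5 * (1 - s) * (1 + (1 + s) / ((1 + s) ** 2 - 4 * xi2) ** 0.5)

    def p1(xi1, xi2):
        """P{X_1(tau) = 1} for S_2, equation (2.22)."""
        s = xi1 + xi2
        return 0.5 * (1 - s) * (-1 + (1 + s) / ((1 + s) ** 2 - 4 * xi1) ** 0.5)

    def run_s2(n0, n1, n2):
        """Discrete S_2 chain (hub 0, leaves 1, 2); returns the death state."""
        n, total = [n0, n1, n2], n0 + n1 + n2
        while n[0] > 0 and n[1] + n[2] > 0:              # not yet an independent set
            i, j = random.sample(range(total), 2)        # two distinct particles
            u = 0 if i < n[0] else (1 if i < n[0] + n[1] else 2)
            v = 0 if j < n[0] else (1 if j < n[0] + n[1] else 2)
            if u != v and 0 in (u, v):                   # adjacent only via the hub
                gain, lose = (u, v) if random.random() < 0.5 else (v, u)
                n[gain] += 1
                n[lose] -= 1
        return n

    random.seed(2)
    xi1 = xi2 = 0.3
    runs = [run_s2(24, 18, 18) for _ in range(500)]      # N = 60 particles
    print(sum(r[1] == 60 for r in runs) / 500, p1(xi1, xi2))   # all mass at leaf 1
    print(sum(r[1] == 0 for r in runs) / 500, p0(xi1, xi2))    # leaf 1 empties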
Finally, subtracting p_0 √w from (2.17) we have

∫_{0+}^w √(w − x) µ(dx) = g(w) = {1 − (ξ_1 + ξ_2)^2} [ √(w(1 − w))/(2√(A^2 + B^2) + 2A)^{1/2} − √w/(2((1 + ξ_1 + ξ_2)^2 − 4ξ_2)^{1/2}) ],   0 < w < 1,   (2.23)
and so by the Abel inversion formula (Whittaker and Watson (1927)), µ(dx) = f(x) dx with

f(x) = (d^2/dx^2) ∫_0^x (x − t) f(t) dt = (d^2/dx^2) (2/π) ∫_0^x g(w)/√(x − w) dw.   (2.24)

It does not seem possible to perform the integration in (2.24) in closed form. For later use we record (2.22), defining ϕ(ξ_1, ξ_2):

P{X_1(τ) = 1} = ϕ(ξ_1, ξ_2) = ((1 − ξ_1 − ξ_2)/2) [−1 + (1 + ξ_1 + ξ_2)/((1 + ξ_1 + ξ_2)^2 − 4ξ_1)^{1/2}],   (2.25)
valid for 0 ≤ ξ_1, 0 ≤ ξ_2, ξ_1 + ξ_2 ≤ 1. The reader may well wonder how we ever thought of (2.6) as the way to solve the moment problem, and the only insight we can give is that we have solved many other problems.

3. Some corollaries for other graphs

The method of embedded graphs and embedded Markov processes allows us to draw corollaries for some other graphs from the results of Section 2 for the simple graph S_2. For example suppose G has a pair of vertices, say 1 and 2, not connected to each other but each connected to all the other vertices (e.g. a complete graph with some edges removed, but not those connecting 1 and 2 to the other vertices). Then if the continuous graph process (1.3) starts in state ξ_i, i ∈ G, then P{X_1(τ) ∈ dx} = µ(dx; ξ_1, ξ_2) and P(I(τ) = {1}) = ϕ(ξ_1, ξ_2), as given in (2.25). The proof is clear; if we identify all i ∈ G, i ≠ 1, 2, then we have an embedded Markov process on S_2 with X_0 = Σ_{i≠1,2} X_i.

A second (similar but more specific) example is the graph R_{2k}, a regular polygon with 2k vertices and all edges present except those joining opposite vertices. The independent sets consist of singletons and doubleton pairs of opposite vertices. We can find the distribution of I(τ) and X(τ) from those obtained for S_2 in Section 2. Indeed by the last paragraph, if 1 and 2 are taken to be any pair of opposite vertices, then P{I(τ) = {1}} and P{I(τ) = {1, 2}} are immediately obtained as ϕ(ξ_1, ξ_2) and 1 − (1 − ξ_1 − ξ_2) − ϕ(ξ_1, ξ_2) − ϕ(ξ_2, ξ_1) = ξ_1 + ξ_2 − ϕ(ξ_1, ξ_2) − ϕ(ξ_2, ξ_1) respectively (subtracting from 1 the probabilities of the other three death states of the embedded S_2 process).

Recall that S_r is the star graph with leaves {1, . . . , r} and hub 0; every subset of {1, . . . , r} is independent. Although we have not solved the moment problem completely, we can obtain the distribution of the limiting support set I(τ), i.e. for any subset J of {1, 2, . . . , r} we can give P_{S_r}{I(τ) = J}. We identify with the mark 1 all points in J and identify with mark 2 all points in {1, 2, . . . , r} − J. The induced process remains Markovian but is now simply an S_2 graph process. It follows immediately from (2.25) that

P_{S_r}{I(τ) ⊂ J} = P_{S_2}(I(τ) = {1}) = ϕ( Σ_{i∈J} ξ_i, Σ_{i∈{1,...,r}−J} ξ_i )   (3.1)
with ϕ again as in (2.25). It is now easy to obtain P{I(τ) = J} by the inclusion–exclusion formula:

P{I = J} = P{I ⊂ J} − Σ_{J_1} P{I ⊂ J_1} + Σ_{J_2} P{I ⊂ J_2} − · · ·   (3.2)

where the sum Σ_{J_ℓ} P(I ⊂ J_ℓ) is over all subsets J_ℓ ⊂ J for which J − J_ℓ has exactly ℓ elements, and the value in (3.1) is summed with J replaced by J_ℓ.

Although these remarks extend the results of Section 2 to many special graphs, it seems out of reach to determine explicitly the distribution of I(τ), much less that of X(τ), for an arbitrary graph G. Probably the next most complicated graph to try would be the in-line graph, L_4, on four vertices and three edges (Figure 1). L_4 has seven independent subgraphs: four singletons and {1, 3}, {2, 4}, {1, 4}. What are the respective probabilities of the limiting support set I(τ) for the graph L_4? There doesn't seem to be any way to reduce this graph to S_2. Somewhat more symmetrical would be to put L_4 on a circle by including an edge between 4 and 1, so now there would be six independent sets. But this graph is R_4, which does reduce to S_2 as we have just seen.

Acknowledgement

We are grateful to the referees for correcting some errors, and also for the observation that existence and uniqueness of our stochastic differential equations (which we had left open) follows easily from results in the book of Ethier and Kurtz (1986).

References

Arnold, V. I. (1989). Mathematical Methods of Classical Mechanics. Springer, New York.
Bogoyavlensky, O. I. (1988). Integrable discretization of the KdV equation. Phys. Lett. A 134, 34–38.
Bramson, M. and Griffeath, D. (1989). Flux and fixation in cyclic particle systems. Ann. Prob. 17, 26–45.
Ethier, S. N. (1976). A class of degenerate diffusion processes occurring in population genetics. Comm. Pure Appl. Math. 29, 483–493.
Ethier, S. N. and Kurtz, T. G. (1986). Markov Processes. Wiley, New York.
Flaschka, H. (1974). The Toda lattice I. Phys. Rev. B 9, 1924–1925.
Henon, M. (1974). Integrals of the Toda lattice. Phys. Rev. B 9, 1921–1923.
Itoh, Y. (1979). Random collision models in oriented graphs. J. Appl. Prob. 16, 36–44.
Itoh, Y. (1993). Stochastic model of an integrable nonlinear system. J. Phys. Soc. Japan 62, 1826–1828.
Lax, P. D. (1968). Integrals of nonlinear equations of evolution and solitary waves. Comm. Pure Appl. Math. 21, 467–490.
Liggett, T. M. (1985). Interacting Particle Systems. Springer, New York.
McKean, H. P. (1969). Stochastic Integrals. Academic Press, New York.
Moran, P. A. P. (1958). Random processes in genetics. Proc. Camb. Phil. Soc. 54, 60–71.
Nei, M., Maruyama, T. and Wu, C. (1982). Models of evolution of reproductive isolation. Genetics 103, 557–579.
Sato, K. (1976). Diffusion processes and a class of Markov chains related to population genetics. Osaka J. Math. 13, 631–659.
Whittaker, E. T. and Watson, G. N. (1927). A Course of Modern Analysis, 4th edn. Cambridge University Press, Cambridge, Section 11.8.