A lower bound on the number of linear extensions of random graph orders Vasileios Iliopoulos

June 12, 2013

1 / 17

Abstract

In this presentation, I will discuss the sorting of partial orders arising from random graphs and present a new bound on the number of comparisons needed for the complete sorting.

2 / 17


Motivation

Here, we assume that a partial ordering is given, and we establish a lower bound on the number of further comparisons needed for the complete sorting, using an entropy approach.

Fredman [5] showed that the number of comparisons c(P) needed in the worst case to sort a partially ordered set (poset) P consisting of n elements satisfies
\[
c(P) \le \log_2 e(P) + 2n,
\]
where e(P) is the number of linear extensions of the poset. A linear extension of a poset is a total order compatible with the given partial order. In the case of a random graph order, we shall see that the information-theoretic lower bound is not tight, as it is of the same order of magnitude as n.

4 / 17
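A worked instance (my own, for orientation, not from the slides): for an antichain on n elements every total order is a linear extension, so e(P) = n! and Fredman's bound reads

\[
c(P) \le \log_2 (n!) + 2n = n \log_2 n - n \log_2 e + 2n + O(\log n),
\]

which, by Stirling's approximation, exceeds the information-theoretic lower bound ⌈log₂(n!)⌉ only by a term linear in n.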

Partially ordered sets

Definition 1
A partially ordered set (poset) is a finite set S equipped with a relation ‘≤’ which has the following properties (here, x, y and z are elements of S):

1. x ≤ x, ∀x ∈ S (that is, ≤ is reflexive).

2. If x ≤ y and y ≤ x, then x = y (≤ is antisymmetric).

3. If x ≤ y and y ≤ z, then x ≤ z (≤ is transitive).

Definition 2
In a partially ordered set, two elements x and y are comparable if x ≤ y or y ≤ x; otherwise they are incomparable.

5 / 17


Information theoretic lower bound

Given n distinct numbers to sort by a series of pairwise comparisons, with all n! orderings equiprobable (i.e. no information is given beforehand), the minimum number of comparisons needed is ⌈log₂(n!)⌉.

Proof.
The outcome of the comparison between aᵢ and aⱼ will yield either aᵢ > aⱼ or aᵢ < aⱼ. In A linear extensions we have aᵢ > aⱼ, and in B linear extensions aᵢ < aⱼ. It holds that A + B = n!, so at least max(A, B) ≥ n!/2 candidates remain. Similarly, choosing again two elements among those remaining linear extensions, we have to consider at least n!/2² total orders, and after r comparisons there will be at least n!/2ʳ candidates. This fraction is equal to unity when r = ⌈log₂(n!)⌉.

7 / 17
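As a quick tabulation (my own illustration, not from the slides; the function name is hypothetical), the bound ⌈log₂(n!)⌉ is easy to evaluate:

```python
import math

def comparison_lower_bound(n):
    """Minimum worst-case comparisons to sort n distinct items: ceil(log2(n!))."""
    return math.ceil(math.log2(math.factorial(n)))

# Each comparison at best halves the n! candidate orderings.
for n in (2, 5, 10):
    print(n, comparison_lower_bound(n))  # 1, 7 and 22 comparisons respectively
```

For n = 5 the bound of 7 comparisons is in fact achievable, which is what makes the question of its tightness for posets interesting.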


Random graphs

Definition 3
The Erdős–Rényi random graph G(n, p) has labelled vertex set {1, 2, . . . , n} and, for each pair of vertices, the probability of an edge arising between them is p, independently of all the other pairs of vertices.

The random graph can be seen as a random partial order: when an edge is present between nodes i ∼ j with i < j, we set i ≺ j, so that j lies above i in the graph, and transitivity gives the partial order P(n, p). The question that arises is: how many linear extensions does this random order have?

9 / 17
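To make the construction concrete, here is a small sketch (my own, not from the slides; both function names are hypothetical). It samples the edges of G(n, p), builds the order by transitive closure, and brute-forces e(P) for tiny n:

```python
import itertools
import random

def random_graph_order(n, p, rng):
    """Sample P(n, p): an edge i ~ j with i < j forces i below j; take the transitive closure."""
    below = {j: set() for j in range(n)}  # below[j] = vertices forced below j
    for j in range(n):
        for i in range(j):
            if rng.random() < p:
                # below[i] is already fully closed when i < j, so this closes below[j]
                below[j] |= below[i] | {i}
    return below

def linear_extensions(n, below):
    """Brute-force count e(P): total orders compatible with the partial order."""
    count = 0
    for perm in itertools.permutations(range(n)):
        pos = {v: k for k, v in enumerate(perm)}
        if all(pos[i] < pos[j] for j in range(n) for i in below[j]):
            count += 1
    return count

rng = random.Random(2013)
# p = 1 gives a chain (one extension); p = 0 gives an antichain (n! extensions).
print(linear_extensions(5, random_graph_order(5, 1.0, rng)))  # 1
print(linear_extensions(5, random_graph_order(5, 0.0, rng)))  # 120
```

The brute force is only feasible for very small n, which is exactly why the asymptotic results on ln e(P) in the next slides are needed.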

Bounds of the number of linear extensions

Alon et al. [2] derived the following theorem.

Theorem 4
Let 0 < p < 1 be fixed and consider e(P), where the partial order is from P(n, p). Then there exist µ(p) > 0 and σ²(p) > 0 such that
\[
\frac{\ln e(P) - \mu(p)\, n}{\sigma(p) \sqrt{n}} \xrightarrow{\;D\;} N(0, 1).
\]
The bounds given for µ(p) are
\[
\sum_{j=1}^{\infty} \ln(j)\, p\, (1-p)^{j-1} \;\le\; \mu(p) \;\le\; \ln\!\left(\frac{1}{p}\right). \tag{1}
\]

10 / 17
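Both sides of (1) are easy to evaluate numerically. A small check (my own illustration, not from the slides; the left-hand series is the expectation of ln G for a geometric random variable G with parameter p, truncated here at an arbitrary 10,000 terms):

```python
import math

def mu_lower(p, terms=10_000):
    """Left-hand side of (1): sum over j of ln(j) * p * (1-p)^(j-1)."""
    return sum(math.log(j) * p * (1 - p) ** (j - 1) for j in range(1, terms + 1))

def mu_upper(p):
    """Right-hand side of (1): ln(1/p)."""
    return math.log(1 / p)

for p in (0.1, 0.5, 0.9):
    assert 0 < mu_lower(p) < mu_upper(p)  # the interval in (1) is non-empty
    print(p, mu_lower(p), mu_upper(p))
```

That the left side lies strictly below ln(1/p) also follows from Jensen's inequality, since E[ln G] ≤ ln E[G] = ln(1/p).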

Average height of a random graph order

Generalising the Albert–Frieze argument [1], the overestimate h(p) and the underestimate f(p) of the height increment of G(n, p) are
\[
f(p) = 1 - \left( \sum_{j=1}^{\infty} (1-p)^{j} \prod_{i=1}^{j-1} \frac{p(1-p)^{i}}{1-(1-p)^{i+2}} \right) \Bigg/ \left( \sum_{j=1}^{\infty} \prod_{i=1}^{j-1} \frac{p(1-p)^{i}}{1-(1-p)^{i+2}} \right),
\qquad
h(p) = \frac{1}{\sum_{j=1}^{\infty} (1-p)^{j(j-1)/2}}.
\]
Simple bounds on the increments are
\[
p \;\le\; f(p) \;<\; h(p) \;\le\; \frac{2}{\theta_3(0, 1-p) + 1},
\]
where θ₃ is the Jacobi theta function.

11 / 17
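These series converge quickly, so the chain of bounds can be checked numerically. A sketch (my own, based on the formulas above; the truncation at 60 terms and all function names are my choices):

```python
import math

def f_under(p, terms=60):
    """Underestimate f(p): one minus a ratio of two series (formula on this slide)."""
    q = 1 - p
    num = den = 0.0
    prod = 1.0  # running product over i = 1 .. j-1 (empty product for j = 1)
    for j in range(1, terms + 1):
        num += q ** j * prod
        den += prod
        prod *= p * q ** j / (1 - q ** (j + 2))
    return 1 - num / den

def h_over(p, terms=60):
    """Overestimate h(p): reciprocal of the sum over j of (1-p)^(j(j-1)/2)."""
    q = 1 - p
    return 1 / sum(q ** (j * (j - 1) // 2) for j in range(1, terms + 1))

def theta3(q, terms=60):
    """Jacobi theta function theta_3(0, q) = 1 + 2 * sum over n >= 1 of q^(n^2)."""
    return 1 + 2 * sum(q ** (n * n) for n in range(1, terms + 1))

for p in (0.2, 0.5, 0.8):
    f, h = f_under(p), h_over(p)
    assert p <= f < h <= 2 / (theta3(1 - p) + 1)
    print(p, f, h)
```

For p = 0.8 the four quantities are already quite close to one another, illustrating how tight the simple bounds become as p grows.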

Entropy

A central notion in the analysis is the entropy of the distribution of a random variable X, which is a measure of uncertainty.

Definition 5
The Shannon entropy H of a discrete random variable X taking the values x₁, x₂, . . . , xₙ with probabilities p(x₁), p(x₂), . . . , p(xₙ) is defined by [8]
\[
H(X) = -\sum_{i=1}^{n} p(x_i) \log_b p(x_i).
\]

Körner [6] extended this notion to graphs; the following definition is from Simonyi's survey [9].

Definition 6
Let G(n, p) be a random graph, let X be a uniformly random vertex, and let Y range over the independent sets of the graph containing X. The graph entropy H(G) is
\[
H(G) = \min I(X \wedge Y),
\]
where the minimum is over all such pairs of random variables (X, Y).

12 / 17
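As a concrete touchstone for Definition 5 (my own sketch, not from the slides; the helper name is hypothetical):

```python
import math

def shannon_entropy(probs, base=2):
    """H(X) = -sum_i p(x_i) * log_b p(x_i); zero-probability outcomes contribute 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Uniform distributions maximise uncertainty: entropy log_b(n) for n outcomes.
print(shannon_entropy([0.25] * 4))  # 2 bits
print(shannon_entropy([0.5, 0.5]))  # 1 bit
print(shannon_entropy([1.0]))       # a deterministic outcome carries no information
```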

The mutual information is I(X ∧ Y) = H(X) − H(X|Y). Using Lemma 5 of Cardinal et al. [3], which states that |C| ≥ 2^{−H(P̃)} n, we deduce that
\[
h(p) \ge \mathbb{E}\left[ 2^{-H(\tilde{P})} \right],
\]
where H(P̃) denotes the entropy of the incomparability graph. By Theorem 2 of that paper, we also have that
\[
n\, H(\tilde{P}) \le 2 \log_2 e(P).
\]
Therefore
\[
h(p) \ge \mathbb{E}\left[ e(P)^{-2/n} \right].
\]

13 / 17
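The final step combines the two inequalities above; a brief worked chain (my own filling-in) makes it explicit:

\[
H(\tilde{P}) \le \frac{2}{n} \log_2 e(P)
\;\Longrightarrow\;
2^{-H(\tilde{P})} \ge 2^{-\frac{2}{n} \log_2 e(P)} = e(P)^{-2/n},
\]

and taking expectations preserves the inequality, giving h(p) ≥ E[2^{−H(P̃)}] ≥ E[e(P)^{−2/n}].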

Taking logarithms on both sides and applying Jensen's inequality, we get
\begin{align*}
\log_2 h(p) &\ge \log_2 \mathbb{E}\left[ e(P)^{-2/n} \right] \\
&\ge \mathbb{E}\left[ \log_2 \left( e(P)^{-2/n} \right) \right] \\
&= -\frac{2}{n}\, \mathbb{E}\left[ \log_2 e(P) \right] \\
&= -\frac{2}{\ln(2)}\, \mu(p).
\end{align*}
Thus, µ(p) is bounded below by
\[
\mu(p) \ge -\frac{\ln(2)}{2}\, \log_2 h(p).
\]

14 / 17
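As a numerical sanity check (mine, not in the slides; the truncation at 60 terms is arbitrary), the derived bound is positive and sits below the upper estimate ln(1/p) from (1):

```python
import math

def h_over(p, terms=60):
    """Overestimate h(p) of the height increment, from the earlier slide."""
    q = 1 - p
    return 1 / sum(q ** (j * (j - 1) // 2) for j in range(1, terms + 1))

def mu_new_lower(p):
    """The bound just derived: mu(p) >= -(ln 2 / 2) * log2 h(p) = -ln(h(p)) / 2."""
    return -math.log(2) / 2 * math.log2(h_over(p))

for p in (0.1, 0.5, 0.9):
    print(p, mu_new_lower(p), math.log(1 / p))
```

Since h(p) < 1 for every 0 < p < 1, the right-hand side is strictly positive, so the bound is non-trivial for all fixed p.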

References I

[1] Albert, M. and Frieze, A. (1989) Random Graph Orders. Order 6 (1): 19–30.

[2] Alon, N., Bollobás, B., Brightwell, G. and Janson, S. (1994) Linear Extensions of a Random Partial Order. Ann. Appl. Probab. 4 (1): 108–123.

[3] Cardinal, J., Fiorini, S., Joret, G., Jungers, R. and Munro, I. (2010) Sorting under Partial Information (without the Ellipsoid Algorithm). STOC '10: Proceedings of the 42nd ACM Symposium on Theory of Computing, 359–368.

15 / 17

References II

[4] McEliece, R. (2002) The Theory of Information and Coding: A Mathematical Framework for Communication. Cambridge University Press, 2nd Edition.

[5] Fredman, M. (1976) How Good is the Information Theory Bound in Sorting? Theor. Comput. Sci. 1 (4): 355–361.

[6] Körner, J. (1973) Coding of an Information Source Having Ambiguous Alphabet and the Entropy of Graphs. Proceedings of the 6th Prague Conference on Information Theory, 411–425.

16 / 17

References III

[7] Penner, R. C. (1999) Discrete Mathematics: Proof Techniques and Mathematical Structures. World Scientific Publishing.

[8] Shannon, C. E. (1948) A Mathematical Theory of Communication. Bell Syst. Tech. J. 27 (3): 379–423.

[9] Simonyi, G. (1995) Graph Entropy: A Survey. In Combinatorial Optimization (Eds. W. Cook, L. Lovász, and P. Seymour), DIMACS Series in Discrete Mathematics and Theoretical Computer Science, Amer. Math. Soc., 399–441.

17 / 17
