Recent Developments in Quantitative Graph Theory: Information Inequalities for Networks

Matthias Dehmer*, Lavanya Sivakumar

Institute for Bioinformatics and Translational Research, UMIT, Hall in Tyrol, Austria

Abstract

In this article, we tackle a challenging problem in quantitative graph theory. We establish relations between graph entropy measures representing the structural information content of networks. In particular, we prove formal relations between quantitative network measures based on Shannon's entropy to study the relatedness of those measures. In order to establish such information inequalities for graphs, we focus on graph entropy measures based on information functionals. To prove such relations, we use known graph classes whose instances have been proven useful in various scientific areas. Our results extend the foregoing work on information inequalities for graphs.

Citation: Dehmer M, Sivakumar L (2012) Recent Developments in Quantitative Graph Theory: Information Inequalities for Networks. PLoS ONE 7(2): e31395. doi:10.1371/journal.pone.0031395

Editor: Frank Emmert-Streib, Queen's University Belfast, United Kingdom

Received December 2, 2011; Accepted January 6, 2012; Published February 15, 2012

Copyright: © 2012 Dehmer, Sivakumar. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Funding: Matthias Dehmer and Lavanya Sivakumar thank the Austrian Science Fund (Project No. PN22029-N13) for supporting this work. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Competing Interests: The authors have declared that no competing interests exist.

* E-mail: [email protected]
Introduction

Complexity is an intricate and versatile concept that is associated with the design and configuration of any system [1,2]. For example, complexity can be measured and characterized by quantitative measures often called indices [3-5]. When studying the concept of complexity, information theory has played a pioneering and leading role. Prominent examples are the theory of communication and applied physics, where the famous Shannon entropy [6] has been used extensively. To study issues of complexity in the natural sciences and, in particular, the influence and use of information theory, see [7].

In this paper, we deal with an important aspect of studying the complexity of network-based systems. In particular, we establish relations between information-theoretic complexity measures [3,8-11]. Recall that such entropic measures have been used to quantify the information content of the underlying networks [8,12]. Generally, this relates to exploring the complexity of a graph by taking its structural features into account. Note that numerous measures have been developed to study the structural complexity of graphs [5,8,13-22]. Further, the use and ability of these measures has been demonstrated by solving interdisciplinary problems. As a result, such studies have led to a vast number of contributions dealing with the analysis of complex systems by means of information-theoretic measures, see, e.g., [8,13-22]. Figure 1 shows an exemplary classification scheme of quantitative network measures.

The main contribution of this paper is to study relations between entropy measures. We tackle this problem by means of inequalities involving network information measures. In particular, we study so-called implicit information inequalities, which were introduced by Dehmer et al. [23,24] for studying graph entropies based on information functionals. Generally, an implicit information inequality involves information measures that are present on either side of the inequality. It is important to emphasize that relatively little work has been done to investigate relations between network measures. A classical contribution in this area is due to Bonchev et al. [25], where the relatedness between information-theoretic network measures has been investigated to detect branching in chemical networks. Further, implicit information inequalities have been studied for hierarchical graphs, which turned out to be useful in network biology [26].

The paper is organized as follows. We first present closed form expressions of graph entropies using the graph classes of stars and path graphs. Further, we infer novel information inequalities for the measures based on the j-sphere functional. The section "Implicit Information Inequalities" presents our main results on novel implicit inequalities for networks. We conclude the paper with a summary and some open problems. Before discussing our results, we first present the information-theoretic measures that we investigate in this paper.
[Figure 1. A classification of quantitative network measures. doi:10.1371/journal.pone.0031395.g001]

Methods

In this section, we briefly state the concrete definitions of the information-theoretic complexity measures that are used for characterizing complex network structures [3,6,9,27]. We state measures based on two major classes, namely partition-based and partition-independent measures, and deal mainly with the latter.

Given a simple, undirected graph $G = (V, E)$, let $d(u, v)$ denote the distance between two vertices $u$ and $v$, and let $\rho(G) = \max\{d(u, v) : u, v \in V\}$. Let $S_j(u; G)$ denote the $j$-sphere of a vertex $u$, defined as $S_j(u; G) = \{x \in V : d(u, x) = j\}$. Throughout this article, a graph $G$ represents a simple undirected graph.

Definition 1. Let $G = (V, E)$ be a graph on $n$ vertices and let $X$ be a graph invariant of $G$. Let $\alpha$ be an equivalence relation that partitions $X$ into $k$ subsets $X_1, X_2, \ldots, X_k$ with cardinality $|X_i|$ for $1 \leq i \leq k$. The total structural information content of $G$ is given by

$$I_t(G) = |X| \log_2 |X| - \sum_{i=1}^{k} |X_i| \log_2 |X_i|. \qquad (1)$$
Definition 2. Let $G = (V, E)$ be a graph on $n$ vertices and let $p_i = |X_i|/|X|$, for $1 \leq i \leq k$, be the probability value for each partition. The mean information content of $G$ is

$$I_m(G) = -\sum_{i=1}^{k} p_i \log_2 p_i = -\sum_{i=1}^{k} \frac{|X_i|}{|X|} \log_2 \frac{|X_i|}{|X|}. \qquad (2)$$

In the context of the theory of communication, the above equation is called the Shannon equation of information [28].

Definition 3. Let $G = (V, E)$ be a graph on $n$ vertices. The quantity

$$p(v_i) = \frac{f(v_i)}{\sum_{j=1}^{n} f(v_j)} \qquad (3)$$

is a probability value of $v_i \in V$, where $f : V \to \mathbb{R}_{+}$ is an arbitrary information functional that maps a set of vertices to the non-negative real numbers.

Remark 1. Observe that $p(\cdot)$ defines a probability distribution over the set of vertices, as it satisfies $0 \leq p(v_i) < 1$ for every vertex $v_i$, $1 \leq i \leq n$, and $\sum_{i=1}^{n} p(v_i) = 1$.
Using the resulting probability distribution associated with $G$ leads to families of network information measures [3,9].

Definition 4. The graph entropy of $G$, representing its structural information content, is given by

$$I_f(G) = -\sum_{i=1}^{n} p(v_i) \log_2 p(v_i) = -\sum_{i=1}^{n} \frac{f(v_i)}{\sum_{j=1}^{n} f(v_j)} \log_2 \left( \frac{f(v_i)}{\sum_{j=1}^{n} f(v_j)} \right). \qquad (4)$$

In order to define concrete graph entropies, we reproduce the definitions of some information functionals based on metrical properties of graphs [3,9,27].

Definition 5. Parameterized exponential information functional using $j$-spheres:

$$f_P(v_i) = a^{\sum_{j=1}^{\rho(G)} c_j |S_j(v_i; G)|}, \qquad (5)$$

where $a > 0$ and $c_k > 0$ for $1 \leq k \leq \rho(G)$.

Definition 6. Parameterized linear information functional using $j$-spheres:

$$f'_P(v_i) = \sum_{j=1}^{\rho(G)} c_j |S_j(v_i; G)|, \qquad (6)$$

where $c_k > 0$ for $1 \leq k \leq \rho(G)$.

Remark 2. Observe that when either $a = 1$ or the $c_k$ are all equal, the functionals $f_P$ and $f'_P$ become constant functions and, hence, the probabilities on all vertices are equal, that is, $p_f(v) = \frac{1}{n}$ for $v \in V$. Thus the entropy attains its maximum value, $I_f(G) = \log_2(n)$. Therefore, in all our proofs, we only consider the non-trivial case, namely $a \neq 1$ and/or $c_j \neq c_k$ for at least two coefficients.

Next, we define the local information graph in order to use local centrality measures from [9]. Let $L_G(v, j)$ be the subgraph induced by the shortest paths starting from the vertex $v$ to all the vertices at distance $j$ in $G$; $L_G(v, j)$ is called the local information graph of $v$ with respect to $j$, see [9]. A local centrality measure that can be applied to determine the structural information content of a network [9] is then defined as follows.

Definition 7. The closeness centrality of the local information graph is defined by

$$\beta(v; L_G(v, j)) = \frac{1}{\sum_{x \in L_G(v, j)} d(v, x)}. \qquad (7)$$
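To make Definitions 4-6 concrete, the following short Python sketch (an illustration added here, not part of the original text) computes the vertex probabilities and the entropy $I_f(G)$ for the $j$-sphere functionals on a small star graph. The adjacency list, base $a = 2$, and coefficients are arbitrary demonstration values.

```python
from collections import deque
from math import log2

def bfs_distances(adj, source):
    """Breadth-first search distances from `source` in an unweighted graph."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return dist

def sphere_sizes(adj, v):
    """|S_j(v;G)| for j = 1, ..., eccentricity of v."""
    dist = bfs_distances(adj, v)
    ecc = max(dist.values())
    return [sum(1 for d in dist.values() if d == j) for j in range(1, ecc + 1)]

def entropy(f_values):
    """Graph entropy I_f(G) of Definition 4 for given functional values f(v_i) > 0."""
    total = sum(f_values)
    return -sum((f / total) * log2(f / total) for f in f_values)

# Star S(5): vertex 0 is the centre; coefficients c_1 = 2, c_2 = 1 are arbitrary.
adj = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
c = [2, 1]
f_lin = [sum(cj * s for cj, s in zip(c, sphere_sizes(adj, v))) for v in adj]
print(entropy(f_lin))   # linear j-sphere functional f'_P (Definition 6)
f_exp = [2.0 ** sum(cj * s for cj, s in zip(c, sphere_sizes(adj, v))) for v in adj]
print(entropy(f_exp))   # exponential j-sphere functional f_P with a = 2 (Definition 5)
```

For the star, the computed probability of the central vertex agrees with the closed form values derived later in Theorem 4.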
Remark 3. Note that centrality is an important concept that was introduced for analyzing social networks [29,30]. Many centrality measures have been contributed [30], and in particular various definitions of closeness centrality exist [30-32]. We remark that the above definition was first given by Sabidussi [31] for arbitrary graphs. However, we use the measure as a local invariant defined on the subgraphs induced by the local information graph [9].

Similar to the $j$-sphere functionals, we define further functionals based on the local centrality measure as follows.

Definition 8. Parameterized exponential information functional using the local centrality measure:

$$f_C(v_i) = a^{\sum_{j=1}^{\rho(G)} c_j \beta(v_i; L_G(v_i, j))}, \qquad (8)$$

where $a > 0$ and $c_k > 0$ for $1 \leq k \leq \rho(G)$.
Definition 9. Parameterized linear information functional using the local centrality measure:

$$f'_C(v_i) = \sum_{j=1}^{\rho(G)} c_j \beta(v_i; L_G(v_i, j)), \qquad (9)$$

where $c_k > 0$ for $1 \leq k \leq \rho(G)$.

Note that the coefficients $c_k$ can be chosen arbitrarily. However, the functionals become more meaningful when we choose the coefficients to emphasize certain structural characteristics of the underlying graphs. This also implies that the notion of graph entropy is not unique, because each measure takes different structural features into account; this can be further understood from the fact that a vast number of entropy measures have been developed so far. Importantly, we point out that the measures we explore in this paper are notably different from the notion of graph entropy introduced by Körner [21]. The graph entropy due to Körner [21] is rooted in information theory and based on the well-known stable set problem. For more related work, see the survey papers on graph entropy measures authored by Dehmer et al. [3] and Simonyi [33].
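As an illustration of Definitions 7-9, here is a hedged Python sketch (added for this edition, not from the original paper). The membership test used for $L_G(v, j)$, namely that a vertex $x \neq v$ belongs to $L_G(v, j)$ iff it lies on some shortest path from $v$ to a vertex at distance $j$, is our reading of the definition in [9]; the graph and coefficients are arbitrary demonstration values.

```python
from collections import deque
from math import log2

def dist_from(adj, s):
    d = {s: 0}; q = deque([s])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in d:
                d[w] = d[u] + 1; q.append(w)
    return d

def beta(adj, v, j):
    """Closeness of the local information graph L_G(v, j), Definition 7:
    1 / sum of d(v, x) over vertices x != v lying on a shortest path
    from v to some vertex at distance j (our interpretation of [9])."""
    dv = dist_from(adj, v)
    targets = [w for w, d in dv.items() if d == j]
    members = set()
    for w in targets:
        dw = dist_from(adj, w)
        members.update(x for x in adj if x != v and dv[x] + dw.get(x, float("inf")) == j)
    return 1.0 / sum(dv[x] for x in members)

def f_C_linear(adj, v, coeffs):
    """Linear local-centrality functional f'_C of Definition 9."""
    ecc = max(dist_from(adj, v).values())
    return sum(coeffs[j - 1] * beta(adj, v, j) for j in range(1, ecc + 1))

# Path P_4 labelled 0-1-2-3, with arbitrary coefficients c_1 = 3, c_2 = 2, c_3 = 1.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
f = [f_C_linear(adj, v, [3, 2, 1]) for v in adj]
total = sum(f)
print(-sum(x / total * log2(x / total) for x in f))  # I_{f'_C}(P_4)
```

On a star this routine reproduces the values $\beta = 1/(n-1)$ for the centre and $\beta = 1/(2n-3)$ for a leaf at $j = 2$, which are used in the proofs below.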
Results and Discussion

Closed Form Expressions and Explicit Information Inequalities

When calculating the structural information content of graphs, it is evident that the determination of closed form expressions for arbitrary networks is critical. In this section, we consider simple graphs, namely trees with smallest and largest diameter, and compute the measures defined in the previous section. For arbitrary connected graphs, we also derive explicit information inequalities using the measures based on information functionals (stated in the previous section).

Stars. Star graphs $S(n)$ have been of considerable interest because they represent the trees with smallest possible diameter ($\rho(S(n)) = 2$) among all trees on $n$ vertices. We now present closed form expressions for the graph entropy of star graphs. For this, we apply the information-theoretic measures based on the information functionals defined in the preliminaries section.

Theorem 4. Let $S(n)$ be a star on $n$ vertices and let $f \in \{f_P, f'_P, f_C, f'_C\}$ be one of the information functionals defined before. The information measure is given by

$$I_f(S(n)) = -x \log_2 x - (1 - x) \log_2 \frac{1 - x}{n - 1}, \qquad (10)$$

where $x$ is the probability of the central vertex of $S(n)$:

$$x = \frac{1}{1 + (n-1)\, a^{(c_2 - c_1)(n-2)}}, \quad \text{if } f = f_P, \qquad (11)$$

$$x = \frac{c_1}{2 c_1 + c_2 (n-2)}, \quad \text{if } f = f'_P, \qquad (12)$$

$$x = \frac{1}{1 + (n-1)\, a^{c_1 \frac{n-2}{n-1} + c_2 \frac{1}{2n-3}}}, \quad \text{if } f = f_C, \qquad (13)$$

$$x = \frac{c_1}{c_1 \left(1 + (n-1)^2\right) + c_2 \frac{(n-1)^2}{2n-3}}, \quad \text{if } f = f'_C. \qquad (14)$$
Proof:

• Consider $f(v) = f_P(v) = a^{\sum_{j=1}^{\rho(S(n))} c_j |S_j(v; S(n))|}$, where $a > 0$ and $c_k > 0$ for $1 \leq k \leq \rho(S(n))$. We get

$$f(v) = \begin{cases} a^{c_1 (n-1)}, & \text{if } v \text{ is the central vertex}, \\ a^{c_1 + c_2 (n-2)}, & \text{otherwise}. \end{cases} \qquad (15)$$

Therefore,

$$\sum_{v \in V(S(n))} f(v) = a^{c_1 (n-1)} \left[ 1 + (n-1)\, a^{(c_2 - c_1)(n-2)} \right]. \qquad (16)$$

Hence,

$$p_f(v) = \begin{cases} \dfrac{1}{1 + (n-1)\, a^{(c_2 - c_1)(n-2)}}, & \text{if } v \text{ is the central vertex}, \\[2mm] \dfrac{a^{(c_2 - c_1)(n-2)}}{1 + (n-1)\, a^{(c_2 - c_1)(n-2)}}, & \text{otherwise}. \end{cases} \qquad (17)$$

By substituting the value of $p_f(v)$ into $I_f(S(n))$ and simplifying, we get $I_f(S(n)) = -x \log_2 x - (1-x) \log_2 \frac{1-x}{n-1}$ with $x = \frac{1}{1 + (n-1) a^{(c_2-c_1)(n-2)}}$.

• Consider $f(v) = f'_P(v) = \sum_{j=1}^{\rho(S(n))} c_j |S_j(v; S(n))|$, where $c_k > 0$ for $1 \leq k \leq \rho(S(n))$. We get

$$f(v) = \begin{cases} c_1 (n-1), & \text{if } v \text{ is the central vertex}, \\ c_1 + c_2 (n-2), & \text{otherwise}. \end{cases} \qquad (18)$$

Therefore,

$$\sum_{v \in V(S(n))} f(v) = (n-1)\left[2 c_1 + c_2 (n-2)\right]. \qquad (19)$$

Hence,

$$p_f(v) = \begin{cases} \dfrac{c_1}{2 c_1 + c_2 (n-2)}, & \text{if } v \text{ is the central vertex}, \\[2mm] \dfrac{c_1 + c_2 (n-2)}{(n-1)\left[2 c_1 + c_2 (n-2)\right]}, & \text{otherwise}. \end{cases} \qquad (20)$$

By substituting the value of $p_f(v)$ into $I_f(S(n))$ and simplifying, we get

$$I_f(S(n)) = -x \log_2 x - (1-x) \log_2 \frac{1-x}{n-1}, \quad \text{where } x = \frac{c_1}{2 c_1 + c_2 (n-2)}. \qquad (21)$$

• Consider $f(v) = f_C(v) = a^{\sum_{j=1}^{\rho(S(n))} c_j \beta(v; L_{S(n)}(v, j))}$, where $a > 0$, $c_k > 0$ for $1 \leq k \leq \rho(S(n))$, and $\beta(v; L_G(v, j)) = 1/\sum_{x \in L_G(v, j)} d(v, x)$ denotes the closeness centrality measure of Equation (7). Then we yield

$$f(v) = \begin{cases} a^{c_1 \frac{1}{n-1}}, & \text{if } v \text{ is the central vertex}, \\ a^{c_1 + c_2 \frac{1}{2n-3}}, & \text{otherwise}. \end{cases} \qquad (22)$$

Therefore,

$$\sum_{v \in V(S(n))} f(v) = a^{c_1 \frac{1}{n-1}} + (n-1)\, a^{c_1 + c_2 \frac{1}{2n-3}}. \qquad (23)$$

Hence,

$$p_f(v) = \begin{cases} \dfrac{1}{1 + (n-1)\, a^{c_1 \frac{n-2}{n-1} + c_2 \frac{1}{2n-3}}}, & \text{if } v \text{ is the central vertex}, \\[2mm] \dfrac{a^{c_1 \frac{n-2}{n-1} + c_2 \frac{1}{2n-3}}}{1 + (n-1)\, a^{c_1 \frac{n-2}{n-1} + c_2 \frac{1}{2n-3}}}, & \text{otherwise}. \end{cases} \qquad (24)$$

By substituting the value of $p_f(v)$ into $I_f(S(n))$ and simplifying, we obtain

$$I_f(S(n)) = -x \log_2 x - (1-x) \log_2 \frac{1-x}{n-1}, \quad \text{where } x = \frac{1}{1 + (n-1)\, a^{c_1 \frac{n-2}{n-1} + c_2 \frac{1}{2n-3}}}. \qquad (25)$$

• Consider $f(v) = f'_C(v) = \sum_{j=1}^{\rho(S(n))} c_j \beta(v; L_{S(n)}(v, j))$, where $c_k > 0$ for $1 \leq k \leq \rho(S(n))$ and $\beta$ is defined via Equation (7). We get

$$f(v) = \begin{cases} c_1 \frac{1}{n-1}, & \text{if } v \text{ is the central vertex}, \\ c_1 + c_2 \frac{1}{2n-3}, & \text{otherwise}. \end{cases} \qquad (26)$$

Therefore,

$$\sum_{v \in V(S(n))} f(v) = c_1 \frac{1 + (n-1)^2}{n-1} + c_2 \frac{n-1}{2n-3}. \qquad (27)$$

Hence,

$$p_f(v) = \begin{cases} \dfrac{c_1}{c_1 \left(1 + (n-1)^2\right) + c_2 \frac{(n-1)^2}{2n-3}}, & \text{if } v \text{ is the central vertex}, \\[2mm] \dfrac{c_1 + c_2 \frac{1}{2n-3}}{c_1 \frac{1 + (n-1)^2}{n-1} + c_2 \frac{n-1}{2n-3}}, & \text{otherwise}. \end{cases} \qquad (28)$$

By substituting the value of $p_f(v)$ into $I_f(S(n))$ and simplifying, we obtain

$$I_f(S(n)) = -x \log_2 x - (1-x) \log_2 \frac{1-x}{n-1}, \quad \text{where } x = \frac{c_1}{c_1 \left(1 + (n-1)^2\right) + c_2 \frac{(n-1)^2}{2n-3}}. \qquad (29)$$

By choosing particular values for the parameters involved, we obtain concrete measures from the above stated functionals. For example, consider the functional $f = f'_P$ and set $c_1 := \rho(S(n)) = 2$ and $c_2 := \rho(S(n)) - 1 = 1$. Plugging those values into Equations (10) and (12), we easily derive

$$I_{f'_P}(S(n)) = \frac{2}{n+2} \log_2 \frac{n+2}{2} + \frac{n}{n+2} \log_2 \frac{(n+2)(n-1)}{n}. \qquad (30)$$
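The closed form can be checked mechanically. The following Python sketch (an added illustration, not part of the original proof) compares a direct evaluation of Definition 4 on the star against Theorem 4 with $x$ as in Equation (12):

```python
from math import log2

def star_entropy_bruteforce(n, c1, c2):
    """I_{f'_P}(S(n)) computed directly from Definition 4: the centre has
    sphere sizes (n-1,), each leaf has sphere sizes (1, n-2)."""
    f_centre = c1 * (n - 1)
    f_leaf = c1 + c2 * (n - 2)
    total = f_centre + (n - 1) * f_leaf
    probs = [f_centre / total] + [f_leaf / total] * (n - 1)
    return -sum(p * log2(p) for p in probs)

def star_entropy_closed_form(n, c1, c2):
    """Theorem 4 with x as in Equation (12)."""
    x = c1 / (2 * c1 + c2 * (n - 2))
    return -x * log2(x) - (1 - x) * log2((1 - x) / (n - 1))

for n in (4, 7, 25):
    assert abs(star_entropy_bruteforce(n, 2, 1) - star_entropy_closed_form(n, 2, 1)) < 1e-12
    print(n, star_entropy_closed_form(n, 2, 1))
```

With $c_1 = 2$, $c_2 = 1$ the printed values also match Equation (30).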
Paths. Let $P_n$ be the path graph on $n$ vertices. Path graphs are the only trees with maximum diameter among all trees on $n$ vertices, i.e., $\rho(P_n) = n - 1$. We remark that computing a closed form expression even for path graphs is not always simple. To illustrate this, we present the concrete information measure $I_{f'_P}(P_n)$ obtained by choosing particular values for its coefficients.

Lemma 5. Let $P_n$ be a path graph and consider the functional $f = f'_P$ defined by Equation (6). We set $c_1 := \rho(P_n) = n - 1$, $c_2 := \rho(P_n) - 1, \ldots, c_\rho := 1$. We yield

$$I_{f'_P}(P_n) = 3 \sum_{r=1}^{\lceil n/2 \rceil} \frac{n^2 + n(2r-3) - 2r(r-1)}{n(n-1)(2n-1)} \log_2 \left( \frac{2\, n(n-1)(2n-1)}{3n^2 + 3n(2r-3) - 6r(r-1)} \right). \qquad (31)$$

Proof: Let $P_n$ be a path graph trivially labeled by $v_1, v_2, \ldots, v_n$ (from left to right). Given $f(v) = f'_P(v) = \sum_{j=1}^{n-1} c_j |S_j(v; P_n)|$ with $c_j = n - j$ for $1 \leq j \leq n - 1$. By computing $f$ for $v \in \{v_r, v_{n+1-r}\}$, $1 \leq r \leq \lceil n/2 \rceil$, we infer

$$f(v) = \sum_{j=1}^{r-1} 2 c_j + \sum_{j=r}^{n-r} c_j \qquad (32)$$

$$= 2 \sum_{j=1}^{r-1} (n - j) + \sum_{j=r}^{n-r} (n - j) \qquad (33)$$

$$= \frac{1}{2} \left[ n^2 + n(2r-3) - 2r(r-1) \right]. \qquad (34)$$

Therefore,

$$\sum_{i=1}^{n} f(v_i) = 2 \sum_{j=1}^{n-1} (n - j)\, c_j = \frac{1}{3}\, n(n-1)(2n-1), \qquad (35)$$

and, hence,

$$p_f(v) = \frac{3}{2} \cdot \frac{n^2 + n(2r-3) - 2r(r-1)}{n(n-1)(2n-1)}, \qquad (36)$$

where $v \in \{v_r, v_{n+1-r}\}$, for $1 \leq r \leq \lceil n/2 \rceil$. Substituting these quantities into $I_f(P_n)$ yields the desired result.

Note that when using the same measure with arbitrary coefficients, its computation is intricate. In this regard, we present explicit bounds, or information inequalities, for any connected graph when the measure is based on an information functional using $j$-spheres, that is, either $f = f_P$ or $f = f'_P$.
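Lemma 5 can also be verified numerically. The sketch below (added for illustration; all values are arbitrary) evaluates Definition 4 directly on the path and compares it with Equation (31). We restrict the check to even $n$, where each pair $\{v_r, v_{n+1-r}\}$ consists of two distinct vertices, which is the situation the factor structure of Equation (31) reflects.

```python
from math import log2, ceil

def path_entropy_bruteforce(n):
    """I_{f'_P}(P_n) from Definition 4 with c_j = n - j, as in Lemma 5."""
    f = [sum(n - abs(i - k) for k in range(1, n + 1) if k != i)
         for i in range(1, n + 1)]
    total = sum(f)                    # equals n(n-1)(2n-1)/3, cf. Equation (35)
    return -sum(x / total * log2(x / total) for x in f)

def path_entropy_closed_form(n):
    """Equation (31); each r in 1..ceil(n/2) accounts for the pair v_r, v_{n+1-r}."""
    N = n * (n - 1) * (2 * n - 1)
    return 3 * sum((n * n + n * (2 * r - 3) - 2 * r * (r - 1)) / N
                   * log2(2 * N / (3 * (n * n + n * (2 * r - 3) - 2 * r * (r - 1))))
                   for r in range(1, ceil(n / 2) + 1))

# Even n only, so that v_r and v_{n+1-r} are always distinct vertices.
for n in (6, 10, 14):
    assert abs(path_entropy_bruteforce(n) - path_entropy_closed_form(n)) < 1e-9
    print(n, path_entropy_closed_form(n))
```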
General connected graphs.

Theorem 6. Given any connected graph $G = (V, E)$ on $n$ vertices, let $f = f_P$ be given by Equation (5). Then we infer the following bounds:

$$I_f(G) \leq \begin{cases} a^{X} \log_2(n \cdot a^{X}), & \text{if } a > 1, \\ a^{-X} \log_2(n \cdot a^{-X}), & \text{if } a < 1, \end{cases} \qquad (37)$$

$$I_f(G) \geq \begin{cases} a^{X} \log_2(n \cdot a^{X}), & \text{if } \left(\frac{1}{n}\right)^{1/X} \leq a \leq 1, \\ a^{-X} \log_2(n \cdot a^{-X}), & \text{if } 1 \leq a \leq n^{1/X}, \\ 0, & \text{if } 0 < a \leq \left(\frac{1}{n}\right)^{1/X} \text{ or } a \geq n^{1/X}, \end{cases} \qquad (38)$$

where

$$X = (c_{\max} - c_{\min})(n - 1), \qquad (39)$$

with

$$c_{\max} = \max\{c_j : 1 \leq j \leq \rho(G)\}, \qquad (40)$$

$$c_{\min} = \min\{c_j : 1 \leq j \leq \rho(G)\}. \qquad (41)$$

Proof: Consider

$$f(v) = f_P(v) = a^{\sum_{j=1}^{\rho(G)} c_j |S_j(v; G)|}, \qquad (42)$$

where $a > 0$ and $c_k > 0$ for $1 \leq k \leq \rho(G)$, and let $c_{\max}$ and $c_{\min}$ be defined as above. Recall (see Remark 2) that when either $a = 1$ or all the coefficients $c_k$ are equal, the information functional becomes constant and, hence, the value of $I_f(G)$ equals $\log_2 n$. In the following, we discuss the cases $a > 1$ and $a < 1$, and we also assume that not all $c_k$ are equal.

Case 1: $a > 1$. We first construct bounds for $p_f(v)$ as shown below. Since $\sum_{j} |S_j(v; G)| = n - 1$,

$$f(v) \leq a^{(n-1) c_{\max}}. \qquad (43)$$

Similarly,

$$f(v) \geq a^{(n-1) c_{\min}}. \qquad (44)$$

Therefore, from Equations (43) and (44), we get

$$n\, a^{(n-1) c_{\min}} \leq \sum_{v \in V} f(v) \leq n\, a^{(n-1) c_{\max}}. \qquad (45)$$

Hence,

$$\frac{a^{(n-1) c_{\min}}}{n \cdot a^{(n-1) c_{\max}}} \leq p_f(v) \leq \frac{a^{(n-1) c_{\max}}}{n \cdot a^{(n-1) c_{\min}}}. \qquad (46)$$

Let $X = (n-1)\left[c_{\max} - c_{\min}\right]$. Then the last inequality can be rewritten as

$$\frac{1}{n \cdot a^{X}} \leq p_f(v) \leq \frac{a^{X}}{n}. \qquad (47)$$

Upper bound for $I_f(G)$: Since $X > 0$ and $a > 1$, we have $\frac{1}{n \cdot a^{X}} < 1$. Hence $-\log_2 \frac{1}{n \cdot a^{X}} \geq 0$ and $0 < -\log_2 p_f(v) \leq -\log_2 \frac{1}{n \cdot a^{X}}$. Thus we get

$$-p_f(v) \log_2 p_f(v) \leq -\frac{a^{X}}{n} \log_2 \frac{1}{n \cdot a^{X}}. \qquad (48)$$

By adding over all the vertices of $V$, we obtain

$$I_f(G) \leq -a^{X} \log_2 \frac{1}{n \cdot a^{X}} = a^{X} \log_2 (n \cdot a^{X}). \qquad (49)$$

Lower bound for $I_f(G)$: We have to distinguish two cases, either $a^{X} < n$ or $a^{X} \geq n$.

Case 1.1: $1 < a < n^{1/X}$. We yield $-\log_2 p_f(v) \geq -\log_2 \frac{a^{X}}{n} > 0$. Therefore,

$$-p_f(v) \log_2 p_f(v) \geq -\frac{1}{n \cdot a^{X}} \log_2 \frac{a^{X}}{n}. \qquad (50)$$

By adding over all the vertices of $V$, we get

$$I_f(G) \geq -\frac{1}{a^{X}} \log_2 \frac{a^{X}}{n} = a^{-X} \log_2 (n \cdot a^{-X}). \qquad (51)$$

Case 1.2: $a \geq n^{1/X}$. In this case, we obtain $\log_2 \frac{a^{X}}{n} \geq 0$ and $\log_2 p_f(v) < 0 \leq \log_2 \frac{a^{X}}{n}$. Therefore, by using these bounds in Equation (4), we infer $I_f(G) > 0$.

Case 2: $a < 1$. Consider Equation (42). We get the following bounds for $f(v)$:

$$a^{(n-1) c_{\max}} \leq f(v) \leq a^{(n-1) c_{\min}}. \qquad (52)$$
Therefore,

$$n\, a^{(n-1) c_{\max}} \leq \sum_{v \in V} f(v) \leq n\, a^{(n-1) c_{\min}}. \qquad (53)$$

Hence,

$$\frac{a^{(n-1) c_{\max}}}{n \cdot a^{(n-1) c_{\min}}} \leq p_f(v) \leq \frac{a^{(n-1) c_{\min}}}{n \cdot a^{(n-1) c_{\max}}}. \qquad (54)$$

Set $X = (n-1)\left[c_{\max} - c_{\min}\right]$. Then the last inequality can be rewritten as

$$\frac{a^{X}}{n} \leq p_f(v) \leq \frac{1}{n \cdot a^{X}}. \qquad (55)$$

Upper bound for $I_f(G)$: Since $X > 0$ and $a < 1$, we have $\frac{a^{X}}{n} \leq 1$. Hence $-\log_2 \frac{a^{X}}{n} \geq 0$ and $0 < -\log_2 p_f(v) \leq -\log_2 \frac{a^{X}}{n}$. Thus we obtain

$$-p_f(v) \log_2 p_f(v) \leq -\frac{1}{n \cdot a^{X}} \log_2 \frac{a^{X}}{n}. \qquad (56)$$

By adding over all the vertices of $V$, we get

$$I_f(G) \leq -\frac{1}{a^{X}} \log_2 \frac{a^{X}}{n} = a^{-X} \log_2 (n \cdot a^{-X}). \qquad (57)$$

Lower bound for $I_f(G)$: Again, we consider two cases, either $a^{X} \leq \frac{1}{n}$ or $a^{X} > \frac{1}{n}$.

Case 2.1: $0 < a \leq \left(\frac{1}{n}\right)^{1/X}$. In this case, we have $\log_2 \frac{1}{n \cdot a^{X}} \geq 0$ and $\log_2 p_f(v) < 0 \leq \log_2 \frac{1}{n \cdot a^{X}}$. Therefore, by substituting these bounds into Equation (4), we obtain $I_f(G) > 0$.

Case 2.2: $\left(\frac{1}{n}\right)^{1/X} < a < 1$. We have $-\log_2 p_f(v) \geq -\log_2 \frac{1}{n \cdot a^{X}} > 0$. Therefore,

$$-p_f(v) \log_2 p_f(v) \geq -\frac{a^{X}}{n} \log_2 \frac{1}{n \cdot a^{X}}. \qquad (58)$$

By adding over all the vertices of $V$, we get

$$I_f(G) \geq -a^{X} \log_2 \frac{1}{n \cdot a^{X}} = a^{X} \log_2 (n \cdot a^{X}). \qquad (59)$$

Hence, the theorem follows.

In the next theorem, we obtain explicit bounds when using the information functional given by Equation (6).

Theorem 7. Given any connected graph $G = (V, E)$ on $n$ vertices, let $f = f'_P$ be given as in Equation (6). We yield

$$I_f(G) \leq \frac{c_{\max}}{c_{\min}} \log_2 \frac{n \cdot c_{\max}}{c_{\min}}, \qquad (60)$$

$$I_f(G) \geq \begin{cases} 0, & \text{if } n \leq \frac{c_{\max}}{c_{\min}}, \\[1mm] \frac{c_{\min}}{c_{\max}} \log_2 \frac{n \cdot c_{\min}}{c_{\max}}, & \text{if } n > \frac{c_{\max}}{c_{\min}}, \end{cases} \qquad (61)$$

with

$$c_{\max} = \max\{c_j : 1 \leq j \leq \rho(G)\}, \qquad (62)$$

$$c_{\min} = \min\{c_j : 1 \leq j \leq \rho(G)\}. \qquad (63)$$

Proof: Consider $f(v) = f'_P(v) = \sum_{j=1}^{\rho(G)} c_j |S_j(v; G)|$, where $c_k > 0$ for $1 \leq k \leq \rho(G)$, and let $c_{\max}$ and $c_{\min}$ be as above. We have

$$f(v) = \sum_{j=1}^{\rho(G)} c_j |S_j(v; G)| \leq (n-1)\, c_{\max}. \qquad (64)$$

Similarly,

$$f(v) \geq (n-1)\, c_{\min}. \qquad (65)$$

Therefore, from Equations (64) and (65), we get

$$n (n-1)\, c_{\min} \leq \sum_{v \in V} f(v) \leq n (n-1)\, c_{\max}. \qquad (66)$$

Hence,

$$\frac{c_{\min}}{n \cdot c_{\max}} \leq p_f(v) \leq \frac{c_{\max}}{n \cdot c_{\min}}. \qquad (67)$$

Upper bound for $I_f(G)$: Since $\frac{c_{\min}}{n \cdot c_{\max}} \leq 1$, we have $-\log_2 \frac{c_{\min}}{n \cdot c_{\max}} \geq 0$ and $0 < -\log_2 p_f(v) \leq -\log_2 \frac{c_{\min}}{n \cdot c_{\max}}$. Hence,

$$-p_f(v) \log_2 p_f(v) \leq -\frac{c_{\max}}{n \cdot c_{\min}} \log_2 \frac{c_{\min}}{n \cdot c_{\max}}. \qquad (68)$$

By adding over all the vertices of $V$, we obtain

$$I_f(G) \leq -\frac{c_{\max}}{c_{\min}} \log_2 \frac{c_{\min}}{n \cdot c_{\max}} = \frac{c_{\max}}{c_{\min}} \log_2 \frac{n \cdot c_{\max}}{c_{\min}}. \qquad (69)$$

Lower bound for $I_f(G)$: Let us distinguish two cases.

Case 1: $c_{\max} \geq n \cdot c_{\min}$. We have $\log_2 \frac{c_{\max}}{n \cdot c_{\min}} \geq 0$ and $\log_2 p_f(v) < 0 \leq \log_2 \frac{c_{\max}}{n \cdot c_{\min}}$. Therefore, by applying these bounds to Equation (4), we obtain $I_f(G) > 0$.

Case 2: $c_{\max} < n \cdot c_{\min}$. In this case, we have $-\log_2 p_f(v) \geq -\log_2 \frac{c_{\max}}{n \cdot c_{\min}} > 0$. Therefore,

$$-p_f(v) \log_2 p_f(v) \geq -\frac{c_{\min}}{n \cdot c_{\max}} \log_2 \frac{c_{\max}}{n \cdot c_{\min}}. \qquad (70)$$

By adding over all the vertices of $V$, we obtain the lower bound for $I_f(G)$ given by

$$I_f(G) \geq -\frac{c_{\min}}{c_{\max}} \log_2 \frac{c_{\max}}{n \cdot c_{\min}} = \frac{c_{\min}}{c_{\max}} \log_2 \frac{n \cdot c_{\min}}{c_{\max}}. \qquad (71)$$

Hence, the theorem follows.
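The bounds of Theorem 7 are easy to check numerically. The following sketch (added for illustration; the graph and coefficients are arbitrary choices) computes $I_{f'_P}(G)$ for a small connected graph and asserts Equations (60) and (61):

```python
from collections import deque
from math import log2

def sphere_sizes(adj, v):
    d = {v: 0}; q = deque([v])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in d:
                d[w] = d[u] + 1; q.append(w)
    ecc = max(d.values())
    return [sum(1 for x in d.values() if x == j) for j in range(1, ecc + 1)]

# A small connected graph: a 5-cycle with one chord (0-2); its diameter is 2.
adj = {0: [1, 4, 2], 1: [0, 2], 2: [1, 3, 0], 3: [2, 4], 4: [3, 0]}
n = len(adj)
c = [3.0, 1.0]               # c_1, ..., c_rho(G): arbitrary positive coefficients
f = [sum(cj * s for cj, s in zip(c, sphere_sizes(adj, v))) for v in adj]
total = sum(f)
I = -sum(x / total * log2(x / total) for x in f)

cmax, cmin = max(c), min(c)
upper = (cmax / cmin) * log2(n * cmax / cmin)                               # Equation (60)
lower = 0.0 if n <= cmax / cmin else (cmin / cmax) * log2(n * cmin / cmax)  # Equation (61)
assert lower <= I <= upper
print(lower, I, upper)
```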
Implicit Information Inequalities

Information inequalities describe relations between information measures for graphs. An implicit information inequality is a special type of information inequality in which the entropy of a graph is estimated by a quantity that contains another graph entropy expression. In this section, we present some implicit information inequalities for entropy measures based on information functionals. A first attempt in this direction has been made by Dehmer et al. [23,24,26]. Note that Dehmer et al. [23,26] started from certain conditions on the probabilities when two different information functionals are given. In contrast, we start from certain assumptions which the functionals themselves should satisfy and, finally, derive novel implicit inequalities.

Now, given any graph $G = (V, E)$ with $|V| = n$, let $I_{f_1}(G)$ and $I_{f_2}(G)$ be two mean information measures of $G$ defined using the information functionals $f_1$ and $f_2$, respectively. Let us further define another functional $f(v) = c_1 f_1(v) + c_2 f_2(v)$, $v \in V$. In the following, we study the relation between the information measure $I_f(G)$ and the measures $I_{f_1}(G)$ and $I_{f_2}(G)$.

Theorem 8. Suppose $f_1(v) \leq f_2(v)$ for all $v \in V$. Then the information measure $I_f(G)$ can be bounded by $I_{f_1}(G)$ and $I_{f_2}(G)$ as follows:

$$I_f(G) \geq \frac{(c_1 + c_2) A_1}{A} \left( I_{f_1}(G) - \log_2 \frac{c_1 A_1}{A} \right) - \frac{c_2 (c_1 + c_2) A_2}{c_1 A \ln(2)}, \qquad (72)$$

$$I_f(G) \leq \frac{(c_1 + c_2) A_2}{A} \left( I_{f_2}(G) - \log_2 \frac{c_2 A_2}{A} \right), \qquad (73)$$

where $A = c_1 A_1 + c_2 A_2$, $A_1 = \sum_{v \in V} f_1(v)$, and $A_2 = \sum_{v \in V} f_2(v)$.

Proof: Given $f(v) = c_1 f_1(v) + c_2 f_2(v)$, let $A_1 = \sum_{v \in V} f_1(v)$ and $A_2 = \sum_{v \in V} f_2(v)$. Therefore $\sum_{v \in V} f(v) = c_1 A_1 + c_2 A_2 =: A$. The information measures of $G$ with respect to $f_1$ and $f_2$ are given by

$$I_{f_1}(G) = -\sum_{v \in V} p_{f_1}(v) \log_2 p_{f_1}(v), \qquad (74)$$

where

$$p_{f_1}(v) = \frac{f_1(v)}{\sum_{v \in V} f_1(v)} = \frac{f_1(v)}{A_1}, \qquad (75)$$

and

$$I_{f_2}(G) = -\sum_{v \in V} p_{f_2}(v) \log_2 p_{f_2}(v), \qquad (76)$$

where $p_{f_2}(v) = \frac{f_2(v)}{A_2}$. Now consider the probabilities

$$p_f(v) = \frac{f(v)}{\sum_{v \in V} f(v)} = \frac{c_1 f_1(v) + c_2 f_2(v)}{A} = \frac{c_1 A_1 \cdot p_{f_1}(v) + c_2 A_2 \cdot p_{f_2}(v)}{A} \qquad (77)$$

$$\leq \frac{(c_1 + c_2) A_2 \cdot p_{f_2}(v)}{A}, \quad \text{since } A_1 \cdot p_{f_1}(v) \leq A_2 \cdot p_{f_2}(v). \qquad (78)$$

Further,

$$-\log_2 p_f(v) = -\log_2 \frac{c_1 A_1 \cdot p_{f_1}(v) + c_2 A_2 \cdot p_{f_2}(v)}{A} \geq 0. \qquad (79)$$

Thus,

$$-p_f(v) \log_2 p_f(v) \leq -\frac{(c_1 + c_2) A_2 \cdot p_{f_2}(v)}{A} \log_2 \frac{c_1 A_1\, p_{f_1}(v) + c_2 A_2\, p_{f_2}(v)}{A} \qquad (80)$$

$$= \frac{(c_1 + c_2) A_2}{A} \left[ -p_{f_2}(v) \log_2 p_{f_2}(v) - p_{f_2}(v) \log_2 \frac{c_2 A_2}{A} - p_{f_2}(v) \log_2 \left( 1 + \frac{c_1 A_1\, p_{f_1}(v)}{c_2 A_2\, p_{f_2}(v)} \right) \right]. \qquad (81)$$

Since the last term in the above inequality is positive, we get

$$-p_f(v) \log_2 p_f(v) \leq \frac{(c_1 + c_2) A_2}{A} \left[ -p_{f_2}(v) \log_2 p_{f_2}(v) - p_{f_2}(v) \log_2 \frac{c_2 A_2}{A} \right]. \qquad (82)$$

By adding up the above inequalities over all the vertices of $V$, we get the desired upper bound.

From Equation (77), we also get a lower bound for $p_f(v)$, given by

$$p_f(v) \geq \frac{(c_1 + c_2) A_1 \cdot p_{f_1}(v)}{A}, \quad \text{since } A_1 \cdot p_{f_1}(v) \leq A_2 \cdot p_{f_2}(v). \qquad (83)$$

Proceeding as before with the above inequality for $p_f(v)$, we obtain

$$-p_f(v) \log_2 p_f(v) \geq -\frac{(c_1 + c_2) A_1 \cdot p_{f_1}(v)}{A} \log_2 \frac{c_1 A_1\, p_{f_1}(v) + c_2 A_2\, p_{f_2}(v)}{A} \qquad (84)$$

$$= \frac{(c_1 + c_2) A_1}{A} \left[ -p_{f_1}(v) \log_2 p_{f_1}(v) - p_{f_1}(v) \log_2 \frac{c_1 A_1}{A} - p_{f_1}(v) \log_2 \left( 1 + \frac{c_2 A_2\, p_{f_2}(v)}{c_1 A_1\, p_{f_1}(v)} \right) \right]. \qquad (85)$$

By using the concavity property of the logarithm, that is, $\log_2 \left(1 + \frac{x}{y}\right) \leq \frac{1}{\ln(2)} \cdot \frac{x}{y}$, we yield

$$-p_f(v) \log_2 p_f(v) \geq \frac{(c_1 + c_2) A_1}{A} \left[ -p_{f_1}(v) \log_2 p_{f_1}(v) - p_{f_1}(v) \log_2 \frac{c_1 A_1}{A} \right] - \frac{(c_1 + c_2) \cdot c_2 A_2\, p_{f_2}(v)}{c_1 A \ln(2)}. \qquad (86)$$

By adding the above inequality over all the vertices of $V$, we get the desired lower bound. This proves the theorem.
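A quick numerical sanity check of Theorem 8 is sketched below (added for illustration). The functional values are hypothetical numbers chosen only so that $f_1(v) \leq f_2(v)$ holds for every vertex, as the theorem requires:

```python
from math import log2, log

def entropy(f):
    t = sum(f)
    return -sum(x / t * log2(x / t) for x in f)

# Hypothetical functional values on a 6-vertex graph with f1(v) <= f2(v).
f1 = [1.0, 2.0, 2.0, 3.0, 1.5, 2.5]
f2 = [2.0, 2.5, 4.0, 3.0, 2.0, 3.5]
c1, c2 = 2.0, 1.5
f = [c1 * x + c2 * y for x, y in zip(f1, f2)]

A1, A2 = sum(f1), sum(f2)
A = c1 * A1 + c2 * A2
I, I1, I2 = entropy(f), entropy(f1), entropy(f2)

lower = ((c1 + c2) * A1 / A) * (I1 - log2(c1 * A1 / A)) \
        - c2 * (c1 + c2) * A2 / (c1 * A * log(2))        # Equation (72)
upper = ((c1 + c2) * A2 / A) * (I2 - log2(c2 * A2 / A))  # Equation (73)
assert lower <= I <= upper
print(lower, I, upper)
```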
Corollary 9. The information measure $I_f(G)$, for $f = f_1 + f_2$, is bounded by $I_{f_1}(G)$ and $I_{f_2}(G)$ as follows:

$$I_f(G) \geq \frac{2 A_1}{A_1 + A_2} \left( I_{f_1}(G) - \log_2 \frac{A_1}{A_1 + A_2} \right) - \frac{2 A_2 \log_2 e}{A_1 + A_2}, \qquad (87)$$

$$I_f(G) \leq \frac{2 A_2}{A_1 + A_2} \left( I_{f_2}(G) - \log_2 \frac{A_2}{A_1 + A_2} \right). \qquad (88)$$

Proof: Set $c_1 = c_2$ in Theorem 8; the corollary follows.

Corollary 10. Given two information functionals $f_1, f_2$ such that $f_1(v) \leq f_2(v)$ for all $v \in V$, then

$$I_{f_1}(G) \leq \frac{A_2}{A_1} I_{f_2}(G) + \log_2 \frac{A_1}{A_1 + A_2} - \frac{A_2}{A_1} \log_2 \frac{A_2}{A_1 + A_2} + \frac{A_2}{A_1} \log_2 e. \qquad (89)$$

Proof: Follows from Corollary 9.

The next theorem gives another bound for $I_f$ in terms of both $I_{f_1}$ and $I_{f_2}$ by using the concavity property of the logarithmic function.

Theorem 11. Let $f_1(v)$ and $f_2(v)$ be two arbitrary functionals defined on a graph $G$. If $f(v) = c_1 f_1(v) + c_2 f_2(v)$ for all $v \in V$, we infer

$$I_f(G) \geq \frac{c_1 A_1}{A} \left( I_{f_1}(G) - \log_2 \frac{c_1 A_1}{A} \right) + \frac{c_2 A_2}{A} \left( I_{f_2}(G) - \log_2 \frac{c_2 A_2}{A} \right) - \log_2 e, \qquad (90)$$

and

$$I_f(G) \leq \frac{c_1 A_1}{A} \left( I_{f_1}(G) - \log_2 \frac{c_1 A_1}{A} \right) + \frac{c_2 A_2}{A} \left( I_{f_2}(G) - \log_2 \frac{c_2 A_2}{A} \right), \qquad (91)$$

where $A = c_1 A_1 + c_2 A_2$, $A_1 = \sum_{v \in V} f_1(v)$ and $A_2 = \sum_{v \in V} f_2(v)$.

Proof: Starting from the expression for $p_f(v)$ based on Equation (77), we obtain

$$p_f(v) \log_2 p_f(v) = \frac{c_1 A_1 \cdot p_{f_1}(v) + c_2 A_2 \cdot p_{f_2}(v)}{A} \log_2 \frac{c_1 A_1 \cdot p_{f_1}(v) + c_2 A_2 \cdot p_{f_2}(v)}{A} \qquad (92)$$

$$= \frac{c_1 A_1 \cdot p_{f_1}(v)}{A} \log_2 \left[ \frac{c_1 A_1 \cdot p_{f_1}(v)}{A} \left( 1 + \frac{c_2 A_2 \cdot p_{f_2}(v)}{c_1 A_1 \cdot p_{f_1}(v)} \right) \right] + \frac{c_2 A_2 \cdot p_{f_2}(v)}{A} \log_2 \left[ \frac{c_2 A_2 \cdot p_{f_2}(v)}{A} \left( 1 + \frac{c_1 A_1 \cdot p_{f_1}(v)}{c_2 A_2 \cdot p_{f_2}(v)} \right) \right] \qquad (93)$$

$$= \frac{c_1 A_1 \cdot p_{f_1}(v)}{A} \left\{ \log_2 \frac{c_1 A_1 \cdot p_{f_1}(v)}{A} + \log_2 \left( 1 + \frac{c_2 A_2 \cdot p_{f_2}(v)}{c_1 A_1 \cdot p_{f_1}(v)} \right) \right\} + \frac{c_2 A_2 \cdot p_{f_2}(v)}{A} \left\{ \log_2 \frac{c_2 A_2 \cdot p_{f_2}(v)}{A} + \log_2 \left( 1 + \frac{c_1 A_1 \cdot p_{f_1}(v)}{c_2 A_2 \cdot p_{f_2}(v)} \right) \right\} \qquad (94)$$

$$= \frac{c_1 A_1}{A} \left[ p_{f_1}(v) \log_2 p_{f_1}(v) + p_{f_1}(v) \log_2 \frac{c_1 A_1}{A} \right] + \frac{c_2 A_2}{A} \left[ p_{f_2}(v) \log_2 p_{f_2}(v) + p_{f_2}(v) \log_2 \frac{c_2 A_2}{A} \right] + \frac{c_1 A_1\, p_{f_1}(v)}{A} \log_2 \left( 1 + \frac{c_2 A_2\, p_{f_2}(v)}{c_1 A_1\, p_{f_1}(v)} \right) + \frac{c_2 A_2\, p_{f_2}(v)}{A} \log_2 \left( 1 + \frac{c_1 A_1\, p_{f_1}(v)}{c_2 A_2\, p_{f_2}(v)} \right). \qquad (95)$$

Since each of the last two terms in Equation (95) is positive, we get a lower bound for $p_f(v) \log_2 p_f(v)$, given by

$$p_f(v) \log_2 p_f(v) \geq \frac{c_1 A_1}{A} \left[ p_{f_1}(v) \log_2 p_{f_1}(v) + p_{f_1}(v) \log_2 \frac{c_1 A_1}{A} \right] + \frac{c_2 A_2}{A} \left[ p_{f_2}(v) \log_2 p_{f_2}(v) + p_{f_2}(v) \log_2 \frac{c_2 A_2}{A} \right]. \qquad (96)$$

Applying the last inequality to Equation (4), we get the upper bound given in Equation (91). By further applying the inequality $\log_2 \left(1 + \frac{x}{y}\right) \leq \frac{1}{\ln(2)} \cdot \frac{x}{y}$ to Equation (95), we get an upper bound for $p_f(v) \log_2 p_f(v)$, given by

$$p_f(v) \log_2 p_f(v) \leq \frac{c_1 A_1}{A} \left[ p_{f_1}(v) \log_2 p_{f_1}(v) + p_{f_1}(v) \log_2 \frac{c_1 A_1}{A} \right] + \frac{c_2 A_2}{A} \left[ p_{f_2}(v) \log_2 p_{f_2}(v) + p_{f_2}(v) \log_2 \frac{c_2 A_2}{A} \right] + \frac{c_1 A_1\, p_{f_1}(v)}{A \ln(2)} \cdot \frac{c_2 A_2\, p_{f_2}(v)}{c_1 A_1\, p_{f_1}(v)} + \frac{c_2 A_2\, p_{f_2}(v)}{A \ln(2)} \cdot \frac{c_1 A_1\, p_{f_1}(v)}{c_2 A_2\, p_{f_2}(v)} \qquad (97)$$

$$= \frac{c_1 A_1}{A} \left[ p_{f_1}(v) \log_2 p_{f_1}(v) + p_{f_1}(v) \log_2 \frac{c_1 A_1}{A} \right] + \frac{c_2 A_2}{A} \left[ p_{f_2}(v) \log_2 p_{f_2}(v) + p_{f_2}(v) \log_2 \frac{c_2 A_2}{A} \right] + \frac{1}{\ln(2)} \cdot \frac{c_1 A_1\, p_{f_1}(v) + c_2 A_2\, p_{f_2}(v)}{A}. \qquad (98)$$

Finally, applying this inequality to Equation (4) gives the lower bound stated in Equation (90).

The next theorem is a straightforward extension of the previous statement; here, an information functional is expressed as a linear combination of $k$ arbitrary information functionals.
Theorem 12. Let $k \geq 2$ and let $f_1(v), f_2(v), \ldots, f_k(v)$ be arbitrary information functionals defined on a graph $G$, with $I_{f_1}(G), I_{f_2}(G), \ldots, I_{f_k}(G)$ the corresponding information contents. If $f(v) = c_1 f_1(v) + c_2 f_2(v) + \cdots + c_k f_k(v)$ for all $v \in V$, we infer

$$I_f(G) \geq \sum_{i=1}^{k} \frac{c_i A_i}{A} \left( I_{f_i}(G) - \log_2 \frac{c_i A_i}{A} \right) - (k-1) \log_2 e, \qquad (99)$$

and

$$I_f(G) \leq \sum_{i=1}^{k} \frac{c_i A_i}{A} \left( I_{f_i}(G) - \log_2 \frac{c_i A_i}{A} \right), \qquad (100)$$

where $A = \sum_{i=1}^{k} c_i A_i$ and $A_j = \sum_{v \in V} f_j(v)$ for $1 \leq j \leq k$.
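Theorems 11 and 12 can again be checked numerically. The sketch below (added for illustration; all functional values and weights are hypothetical) verifies Equations (99) and (100) for $k = 3$:

```python
from math import log2, log

def entropy(f):
    t = sum(f)
    return -sum(x / t * log2(x / t) for x in f)

# Three hypothetical functionals on a 5-vertex graph (no ordering required).
fs = [[1.0, 2.0, 3.0, 1.0, 2.0],
      [4.0, 1.0, 2.0, 2.0, 1.0],
      [2.0, 2.0, 1.0, 3.0, 2.0]]
cs = [1.0, 2.0, 0.5]
k = len(fs)

f = [sum(c * fi[v] for c, fi in zip(cs, fs)) for v in range(5)]
As = [sum(fi) for fi in fs]
A = sum(c * Ai for c, Ai in zip(cs, As))

terms = sum((c * Ai / A) * (entropy(fi) - log2(c * Ai / A))
            for c, Ai, fi in zip(cs, As, fs))
I = entropy(f)
assert terms - (k - 1) / log(2) <= I <= terms   # Equations (99) and (100)
print(terms - (k - 1) / log(2), I, terms)
```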
Union of Graphs. In this section, we determine the entropy of the union of two graphs. Let $G_1 = (V_1, E_1)$ and $G_2 = (V_2, E_2)$ be two arbitrary connected graphs on $n_1$ and $n_2$ vertices, respectively. Let $f$ be an information functional defined on these graphs, denoted by $f_{G_1}$ and $f_{G_2}$, and let $I_f(G_1)$ and $I_f(G_2)$ be the information measures on $G_1$ and $G_2$, respectively.

Theorem 13. Let $G = (V, E) = G_1 \cup G_2$ be the disjoint union of the graphs $G_1$ and $G_2$, and let $f$ be an arbitrary information functional. The information measure $I_f(G)$ can be expressed in terms of $I_f(G_1)$ and $I_f(G_2)$ as follows:

$$I_f(G) = \frac{A_1}{A} \left( I_f(G_1) - \log_2 \frac{A_1}{A} \right) + \frac{A_2}{A} \left( I_f(G_2) - \log_2 \frac{A_2}{A} \right), \qquad (101)$$

where $A = A_1 + A_2$ with $A_1 = \sum_{v \in V_1} f_{G_1}(v)$ and $A_2 = \sum_{v \in V_2} f_{G_2}(v)$.

Proof: Let $f$ be the given information functional, and let $A_1 = \sum_{v \in V_1} f_{G_1}(v)$ and $A_2 = \sum_{v \in V_2} f_{G_2}(v)$. The information measures of $G_1$ and $G_2$ are given as follows:

$$I_f(G_1) = -\sum_{v \in V_1} p_{G_1}(v) \log_2 p_{G_1}(v), \quad \text{where } p_{G_1}(v) = \frac{f_{G_1}(v)}{A_1}, \qquad (102)$$

$$I_f(G_2) = -\sum_{v \in V_2} p_{G_2}(v) \log_2 p_{G_2}(v), \quad \text{where } p_{G_2}(v) = \frac{f_{G_2}(v)}{A_2}. \qquad (103)$$

For $v \in V$,

$$f(v) = \begin{cases} f_{G_1}(v), & \text{if } v \in V_1, \\ f_{G_2}(v), & \text{if } v \in V_2, \end{cases} \qquad (104)$$

and

$$\sum_{v \in V} f(v) = \sum_{v \in V_1} f(v) + \sum_{v \in V_2} f(v) = A_1 + A_2 =: A. \qquad (105)$$

Hence,

$$p_G(v) = \frac{f(v)}{\sum_{v \in V} f(v)} = \begin{cases} \dfrac{f_{G_1}(v)}{A}, & \text{if } v \in V_1, \\[2mm] \dfrac{f_{G_2}(v)}{A}, & \text{if } v \in V_2, \end{cases} \qquad (106)$$

$$= \begin{cases} \dfrac{A_1}{A} \cdot p_{G_1}(v), & \text{if } v \in V_1, \\[2mm] \dfrac{A_2}{A} \cdot p_{G_2}(v), & \text{if } v \in V_2. \end{cases} \qquad (107)$$

Using these quantities to determine $I_f(G)$, we obtain

$$I_f(G) = -\sum_{v \in V_1} \frac{A_1 \cdot p_{G_1}(v)}{A} \log_2 \frac{A_1 \cdot p_{G_1}(v)}{A} - \sum_{v \in V_2} \frac{A_2 \cdot p_{G_2}(v)}{A} \log_2 \frac{A_2 \cdot p_{G_2}(v)}{A} \qquad (108)$$

$$= -\frac{A_1}{A} \sum_{v \in V_1} \left[ p_{G_1}(v) \log_2 p_{G_1}(v) + p_{G_1}(v) \log_2 \frac{A_1}{A} \right] - \frac{A_2}{A} \sum_{v \in V_2} \left[ p_{G_2}(v) \log_2 p_{G_2}(v) + p_{G_2}(v) \log_2 \frac{A_2}{A} \right]. \qquad (109)$$

Upon simplification, we get the desired result.

We also immediately obtain a generalization of the previous theorem by taking $k$ disjoint graphs into account.

Theorem 14. Let $G_1 = (V_1, E_1), G_2 = (V_2, E_2), \ldots, G_k = (V_k, E_k)$ be $k$ arbitrary connected graphs on $n_1, n_2, \ldots, n_k$ vertices, respectively. Let $f$ be an information functional defined on these graphs, denoted by $f_{G_1}, f_{G_2}, \ldots, f_{G_k}$. Let $G = (V, E) = G_1 \cup G_2 \cup \cdots \cup G_k$ be the disjoint union of the graphs $G_1, G_2, \ldots, G_k$ for $k \geq 2$. The information measure $I_f(G)$ can be expressed in terms of $I_f(G_1), I_f(G_2), \ldots, I_f(G_k)$ as follows:

$$I_f(G) = \sum_{i=1}^{k} \frac{A_i}{A} \left( I_f(G_i) - \log_2 \frac{A_i}{A} \right), \qquad (110)$$

where $A = A_1 + A_2 + \cdots + A_k$ with $A_i = \sum_{v \in V_i} f_{G_i}(v)$ for $1 \leq i \leq k$.
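Equation (101) is an exact identity, so it can be confirmed directly. The sketch below (added for illustration; the functional values are hypothetical) compares both sides for a disjoint union:

```python
from math import log2

def entropy(f):
    t = sum(f)
    return -sum(x / t * log2(x / t) for x in f)

# Hypothetical functional values on two disjoint graphs G1 and G2.
fG1 = [3.0, 1.0, 2.0]          # values of f on V1
fG2 = [2.0, 2.0, 1.0, 5.0]     # values of f on V2
A1, A2 = sum(fG1), sum(fG2)
A = A1 + A2

direct = entropy(fG1 + fG2)    # I_f(G1 u G2) computed from Definition 4
decomposed = (A1 / A) * (entropy(fG1) - log2(A1 / A)) \
           + (A2 / A) * (entropy(fG2) - log2(A2 / A))   # Equation (101)
assert abs(direct - decomposed) < 1e-9
print(direct, decomposed)
```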
Join of Graphs. Let $G_1 = (V_1, E_1)$ and $G_2 = (V_2, E_2)$ be two arbitrary connected graphs on $n_1$ and $n_2$ vertices, respectively. The join $G_1 + G_2$ is defined as the graph $G = (V, E)$ with vertex set $V = V_1 \cup V_2$ and edge set $E = E_1 \cup E_2 \cup \{(x, y) : x \in V_1, y \in V_2\}$. Let $f = f_P$ be the information functional based on the exponential $j$-sphere functional (given by Equation (5)), defined on these graphs and denoted by $f_{G_1}$, $f_{G_2}$. Let $I_f(G_1)$ and $I_f(G_2)$ be the information measures on $G_1$ and $G_2$, respectively.

Theorem 15. Let $G = (V, E) = G_1 + G_2$ be the join of the graphs $G_1 = (V_1, E_1)$ and $G_2 = (V_2, E_2)$ with $n_1 + n_2$ vertices. The information measure $I_f(G)$ can then be expressed in terms of $I_f(G_1)$ and $I_f(G_2)$ as follows:

$$I_f(G) = \frac{A_1 a^{c_1 n_2}}{A} \left( I_f(G_1) - \log_2 \frac{A_1 a^{c_1 n_2}}{A} \right) + \frac{A_2 a^{c_1 n_1}}{A} \left( I_f(G_2) - \log_2 \frac{A_2 a^{c_1 n_1}}{A} \right), \qquad (111)$$

where $f_H(v) = a^{\sum_{j=1}^{\rho(H)} c_j |S_j(v; H)|}$ for $H \in \{G_1, G_2, G\}$, $c_j > 0$, and $A = A_1 a^{c_1 n_2} + A_2 a^{c_1 n_1}$ with $A_1 = \sum_{v \in V_1} f_{G_1}(v)$ and $A_2 = \sum_{v \in V_2} f_{G_2}(v)$.

Proof: Let $G = G_1 + G_2$ be the join of two connected graphs $G_1$ and $G_2$. Here, $\rho(G) = \max\{\rho(G_1), \rho(G_2)\}$. Let $f_H(v) = a^{\sum_{j=1}^{\rho(H)} c_j |S_j(v; H)|}$ be the information functional defined by using the $j$-sphere functional on $H \in \{G, G_1, G_2\}$. Let $A_1 = \sum_{v \in V_1} f_{G_1}(v)$ and $A_2 = \sum_{v \in V_2} f_{G_2}(v)$. The information measures of $G_1$ and $G_2$ are given as follows:

$$I_f(G_1) = -\sum_{v \in V_1} p_{G_1}(v) \log_2 p_{G_1}(v), \quad \text{where } p_{G_1}(v) = \frac{f_{G_1}(v)}{A_1}, \qquad (112)$$

$$I_f(G_2) = -\sum_{v \in V_2} p_{G_2}(v) \log_2 p_{G_2}(v), \quad \text{where } p_{G_2}(v) = \frac{f_{G_2}(v)}{A_2}. \qquad (113)$$

For $v \in V$,

$$f(v) = \begin{cases} a^{c_1 n_2 + \sum_{j=1}^{\rho(G_1)} c_j |S_j(v; G_1)|}, & \text{if } v \in V_1, \\ a^{c_1 n_1 + \sum_{j=1}^{\rho(G_2)} c_j |S_j(v; G_2)|}, & \text{if } v \in V_2, \end{cases} \qquad (114)$$

$$= \begin{cases} a^{c_1 n_2} f_{G_1}(v), & \text{if } v \in V_1, \\ a^{c_1 n_1} f_{G_2}(v), & \text{if } v \in V_2. \end{cases} \qquad (115)$$

Hence,

$$\sum_{v \in V} f(v) = A_1 a^{c_1 n_2} + A_2 a^{c_1 n_1} =: A, \qquad (116)$$

$$p_G(v) = \begin{cases} \dfrac{a^{c_1 n_2} f_{G_1}(v)}{A}, & \text{if } v \in V_1, \\[2mm] \dfrac{a^{c_1 n_1} f_{G_2}(v)}{A}, & \text{if } v \in V_2, \end{cases} \qquad (117)$$

$$= \begin{cases} \dfrac{a^{c_1 n_2} A_1}{A} \cdot p_{G_1}(v), & \text{if } v \in V_1, \\[2mm] \dfrac{a^{c_1 n_1} A_2}{A} \cdot p_{G_2}(v), & \text{if } v \in V_2. \end{cases} \qquad (118)$$

Using these quantities to determine $I_f(G)$, we infer

$$I_f(G) = -\sum_{v \in V_1} \frac{a^{c_1 n_2} A_1 \cdot p_{G_1}(v)}{A} \log_2 \frac{a^{c_1 n_2} A_1 \cdot p_{G_1}(v)}{A} - \sum_{v \in V_2} \frac{a^{c_1 n_1} A_2 \cdot p_{G_2}(v)}{A} \log_2 \frac{a^{c_1 n_1} A_2 \cdot p_{G_2}(v)}{A} \qquad (119)$$

$$= -\frac{a^{c_1 n_2} A_1}{A} \sum_{v \in V_1} \left[ p_{G_1}(v) \log_2 p_{G_1}(v) + p_{G_1}(v) \log_2 \frac{a^{c_1 n_2} A_1}{A} \right] - \frac{a^{c_1 n_1} A_2}{A} \sum_{v \in V_2} \left[ p_{G_2}(v) \log_2 p_{G_2}(v) + p_{G_2}(v) \log_2 \frac{a^{c_1 n_1} A_2}{A} \right]. \qquad (120)$$

Upon simplification, we get the desired result.

If we consider the linear $j$-sphere functional $f'_P$ (see Equation (6)), inferring an exact expression for the join of two graphs as in Theorem 15 is an intricate problem. With Theorem 16 and Theorem 17, we now present different bounds in terms of $I_{f'_P}(G_1)$ and $I_{f'_P}(G_2)$.

Theorem 16. Let $G = (V, E) = G_1 + G_2$ be the join of the graphs $G_1 = (V_1, E_1)$ and $G_2 = (V_2, E_2)$ on $n_1 + n_2$ vertices. Then we yield

$$I_f(G) \geq \frac{A_1}{A} \left( I_f(G_1) - \log_2 \frac{A_1}{A} \right) + \frac{A_2}{A} \left( I_f(G_2) - \log_2 \frac{A_2}{A} \right) - \frac{2 c_1 n_1 n_2}{A \ln(2)}, \qquad (121)$$

where $f_H(v) = \sum_{j=1}^{\rho(H)} c_j |S_j(v; H)|$ for $H \in \{G_1, G_2, G\}$, $c_j > 0$, and $A = 2 c_1 n_1 n_2 + A_1 + A_2$ with $A_1 = \sum_{v \in V_1} f_{G_1}(v)$ and $A_2 = \sum_{v \in V_2} f_{G_2}(v)$.

Proof: Let $A_1 = \sum_{v \in V_1} f_{G_1}(v)$ and $A_2 = \sum_{v \in V_2} f_{G_2}(v)$. The information measures of $G_1$ and $G_2$ are given as follows:

$$I_f(G_1) = -\sum_{v \in V_1} p_{G_1}(v) \log_2 p_{G_1}(v), \quad \text{where } p_{G_1}(v) = \frac{f_{G_1}(v)}{A_1}, \qquad (122)$$

$$I_f(G_2) = -\sum_{v \in V_2} p_{G_2}(v) \log_2 p_{G_2}(v), \quad \text{where } p_{G_2}(v) = \frac{f_{G_2}(v)}{A_2}. \qquad (123)$$

For $v \in V$,

$$f(v) = \begin{cases} c_1 n_2 + \sum_{j=1}^{\rho(G_1)} c_j |S_j(v; G_1)|, & \text{if } v \in V_1, \\ c_1 n_1 + \sum_{j=1}^{\rho(G_2)} c_j |S_j(v; G_2)|, & \text{if } v \in V_2, \end{cases} \qquad (124)$$

$$= \begin{cases} c_1 n_2 + f_{G_1}(v), & \text{if } v \in V_1, \\ c_1 n_1 + f_{G_2}(v), & \text{if } v \in V_2. \end{cases} \qquad (125)$$

Hence,

$$\sum_{v \in V} f_G(v) = \sum_{v \in V_1} f_G(v) + \sum_{v \in V_2} f_G(v) = 2 c_1 n_1 n_2 + A_1 + A_2 =: A, \qquad (126)$$

$$p_G(v) = \begin{cases} \dfrac{c_1 n_2 + f_{G_1}(v)}{A}, & \text{if } v \in V_1, \\[2mm] \dfrac{c_1 n_1 + f_{G_2}(v)}{A}, & \text{if } v \in V_2, \end{cases} \qquad (127)$$

$$= \begin{cases} \dfrac{A_1 \cdot p_{G_1}(v)}{A} + \dfrac{c_1 n_2}{A}, & \text{if } v \in V_1, \\[2mm] \dfrac{A_2 \cdot p_{G_2}(v)}{A} + \dfrac{c_1 n_1}{A}, & \text{if } v \in V_2. \end{cases} \qquad (128)$$
Since $\frac{c_1 n_2}{A}$ and $\frac{c_1 n_1}{A}$ are positive, we get a lower bound for $p_G(v)$ given as

$$p_G(v) \geq \begin{cases} \dfrac{p_{G_1}(v) \cdot A_1}{A}, & \text{if } v \in V_1, \\[2mm] \dfrac{p_{G_2}(v) \cdot A_2}{A}, & \text{if } v \in V_2. \end{cases} \qquad (129)$$

To infer a lower bound for the information measure $I_f(G)$, we start from Equations (128) and (129) and obtain

$$-p_G(v) \log_2 p_G(v) \geq \begin{cases} -\dfrac{p_{G_1}(v) \cdot A_1}{A} \log_2 \dfrac{p_{G_1}(v) A_1 + c_1 n_2}{A}, & \text{if } v \in V_1, \\[2mm] -\dfrac{p_{G_2}(v) \cdot A_2}{A} \log_2 \dfrac{p_{G_2}(v) A_2 + c_1 n_1}{A}, & \text{if } v \in V_2, \end{cases} \qquad (130)$$

$$= \begin{cases} -\dfrac{A_1 p_{G_1}(v)}{A} \log_2 \left[ \dfrac{A_1 p_{G_1}(v)}{A} \left( 1 + \dfrac{c_1 n_2}{A_1 p_{G_1}(v)} \right) \right], & \text{if } v \in V_1, \\[2mm] -\dfrac{A_2 p_{G_2}(v)}{A} \log_2 \left[ \dfrac{A_2 p_{G_2}(v)}{A} \left( 1 + \dfrac{c_1 n_1}{A_2 p_{G_2}(v)} \right) \right], & \text{if } v \in V_2, \end{cases} \qquad (131)$$

$$= \begin{cases} -\dfrac{A_1}{A} \left[ p_{G_1}(v) \log_2 p_{G_1}(v) + p_{G_1}(v) \log_2 \dfrac{A_1}{A} \right] - \dfrac{A_1 p_{G_1}(v)}{A} \log_2 \left( 1 + \dfrac{c_1 n_2}{A_1 p_{G_1}(v)} \right), & \text{if } v \in V_1, \\[2mm] -\dfrac{A_2}{A} \left[ p_{G_2}(v) \log_2 p_{G_2}(v) + p_{G_2}(v) \log_2 \dfrac{A_2}{A} \right] - \dfrac{A_2 p_{G_2}(v)}{A} \log_2 \left( 1 + \dfrac{c_1 n_1}{A_2 p_{G_2}(v)} \right), & \text{if } v \in V_2. \end{cases} \qquad (132)$$

By using the inequality $\log_2 \left(1 + \frac{x}{y}\right) \leq \frac{1}{\ln(2)} \cdot \frac{x}{y}$ and performing simplification steps, we get

$$-p_G(v) \log_2 p_G(v) \geq \begin{cases} -\dfrac{A_1}{A} \left[ p_{G_1}(v) \log_2 p_{G_1}(v) + p_{G_1}(v) \log_2 \dfrac{A_1}{A} \right] - \dfrac{c_1 n_2}{A \ln(2)}, & \text{if } v \in V_1, \\[2mm] -\dfrac{A_2}{A} \left[ p_{G_2}(v) \log_2 p_{G_2}(v) + p_{G_2}(v) \log_2 \dfrac{A_2}{A} \right] - \dfrac{c_1 n_1}{A \ln(2)}, & \text{if } v \in V_2. \end{cases} \qquad (133)$$

By adding up the above inequality system across all the vertices of $V$ and simplifying, we get the desired lower bound.

Further, an alternate set of bounds can be achieved as follows.

Theorem 17. Let $G = (V, E) = G_1 + G_2$ be the join of the graphs $G_1 = (V_1, E_1)$ and $G_2 = (V_2, E_2)$ on $n_1 + n_2$ vertices. Then we infer

$$I_f(G) \leq \frac{A_1}{A} \left( I_f(G_1) - \log_2 \frac{A_1}{A} \right) + \frac{A_2}{A} \left( I_f(G_2) - \log_2 \frac{A_2}{A} \right) - \frac{c_1 n_1 n_2}{A} \log_2 \frac{c_1^2 n_1 n_2}{A^2}, \qquad (134)$$

and

$$I_f(G) \geq \frac{A_1}{A} \left( I_f(G_1) - \log_2 \frac{A_1}{A} \right) + \frac{A_2}{A} \left( I_f(G_2) - \log_2 \frac{A_2}{A} \right) - \frac{c_1 n_1 n_2}{A} \log_2 \frac{c_1^2 n_1 n_2}{A^2} - \log_2(e), \qquad (135)$$

where $f_H(v) = \sum_{j=1}^{\rho(H)} c_j |S_j(v; H)|$ for $H \in \{G_1, G_2, G\}$, $c_j > 0$, and $A = 2 c_1 n_1 n_2 + A_1 + A_2$ with $A_1 = \sum_{v \in V_1} f_{G_1}(v)$ and $A_2 = \sum_{v \in V_2} f_{G_2}(v)$.

Proof: Starting from Theorem 16, consider the value of $p_G(v)$ given by Equation (128). Using these quantities to calculate $I_f(G)$, we get

$$I_f(G) = -\sum_{v \in V_1} \frac{A_1 \cdot p_{G_1}(v) + c_1 n_2}{A} \log_2 \frac{A_1 \cdot p_{G_1}(v) + c_1 n_2}{A} - \sum_{v \in V_2} \frac{A_2 \cdot p_{G_2}(v) + c_1 n_1}{A} \log_2 \frac{A_2 \cdot p_{G_2}(v) + c_1 n_1}{A} \qquad (136)$$

$$= -\frac{A_1}{A} \sum_{v \in V_1} \left[ p_{G_1}(v) \log_2 p_{G_1}(v) + p_{G_1}(v) \log_2 \frac{A_1}{A} \right] - \frac{A_1}{A} \sum_{v \in V_1} p_{G_1}(v) \log_2 \left( 1 + \frac{c_1 n_2}{A_1 p_{G_1}(v)} \right) - \frac{c_1 n_2}{A} \sum_{v \in V_1} \left[ \log_2 \frac{c_1 n_2}{A} + \log_2 \left( 1 + \frac{p_{G_1}(v) A_1}{c_1 n_2} \right) \right] - \frac{A_2}{A} \sum_{v \in V_2} \left[ p_{G_2}(v) \log_2 p_{G_2}(v) + p_{G_2}(v) \log_2 \frac{A_2}{A} \right] - \frac{A_2}{A} \sum_{v \in V_2} p_{G_2}(v) \log_2 \left( 1 + \frac{c_1 n_1}{A_2 p_{G_2}(v)} \right) - \frac{c_1 n_1}{A} \sum_{v \in V_2} \left[ \log_2 \frac{c_1 n_1}{A} + \log_2 \left( 1 + \frac{p_{G_2}(v) A_2}{c_1 n_1} \right) \right]. \qquad (137)$$
By simplifying and performing the summation, we get

$$I_f(G) = \frac{A_1}{A} \left( I_f(G_1) - \log_2 \frac{A_1}{A} \right) + \frac{A_2}{A} \left( I_f(G_2) - \log_2 \frac{A_2}{A} \right) - \frac{c_1 n_1 n_2}{A} \log_2 \frac{c_1^2 n_1 n_2}{A^2} - \frac{A_1}{A} \sum_{v \in V_1} p_{G_1}(v) \log_2 \left( 1 + \frac{c_1 n_2}{A_1 p_{G_1}(v)} \right) - \frac{c_1 n_2}{A} \sum_{v \in V_1} \log_2 \left( 1 + \frac{p_{G_1}(v) A_1}{c_1 n_2} \right) - \frac{A_2}{A} \sum_{v \in V_2} p_{G_2}(v) \log_2 \left( 1 + \frac{c_1 n_1}{A_2 p_{G_2}(v)} \right) - \frac{c_1 n_1}{A} \sum_{v \in V_2} \log_2 \left( 1 + \frac{p_{G_2}(v) A_2}{c_1 n_1} \right). \qquad (138)$$

An upper bound for the measure $I_f(G)$ can be derived as follows:

$$I_f(G) \leq \frac{A_1}{A} \left( I_f(G_1) - \log_2 \frac{A_1}{A} \right) + \frac{A_2}{A} \left( I_f(G_2) - \log_2 \frac{A_2}{A} \right) - \frac{c_1 n_1 n_2}{A} \log_2 \frac{c_1^2 n_1 n_2}{A^2}, \qquad (139)$$

since each of the remaining terms in Equation (138) is positive. Finally, we infer the lower bound for $I_f(G)$ as follows. By applying the inequality $\log_2 \left(1 + \frac{x}{y}\right) \leq \frac{1}{\ln(2)} \cdot \frac{x}{y}$ to Equation (138), we get

$$I_f(G) \geq \frac{A_1}{A} \left( I_f(G_1) - \log_2 \frac{A_1}{A} \right) + \frac{A_2}{A} \left( I_f(G_2) - \log_2 \frac{A_2}{A} \right) - \frac{c_1 n_1 n_2}{A} \log_2 \frac{c_1^2 n_1 n_2}{A^2} - \frac{A_1}{A} \sum_{v \in V_1} p_{G_1}(v) \frac{c_1 n_2}{\ln(2) \cdot A_1 p_{G_1}(v)} - \frac{c_1 n_2}{A} \sum_{v \in V_1} \frac{p_{G_1}(v) A_1}{\ln(2) \cdot c_1 n_2} - \frac{A_2}{A} \sum_{v \in V_2} p_{G_2}(v) \frac{c_1 n_1}{\ln(2) \cdot A_2 p_{G_2}(v)} - \frac{c_1 n_1}{A} \sum_{v \in V_2} \frac{p_{G_2}(v) A_2}{\ln(2) \cdot c_1 n_1}. \qquad (140)$$

Upon simplification, we get

$$I_f(G) \geq \frac{A_1}{A} \left( I_f(G_1) - \log_2 \frac{A_1}{A} \right) + \frac{A_2}{A} \left( I_f(G_2) - \log_2 \frac{A_2}{A} \right) - \frac{c_1 n_1 n_2}{A} \log_2 \frac{c_1^2 n_1 n_2}{A^2} - \frac{1}{\ln(2)}. \qquad (141)$$

Putting Inequality (139) and Inequality (141) together finishes the proof of the theorem.
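The bounds of Theorem 17, whose gap is exactly $\log_2 e$, can be probed numerically. In the sketch below (added for illustration), the per-graph values of $f'_P$ are hypothetical numbers, and the join values follow the decomposition of Equation (125):

```python
from math import log2, log

def entropy(f):
    t = sum(f)
    return -sum(x / t * log2(x / t) for x in f)

# Hypothetical values of the linear j-sphere functional f'_P on G1 and G2,
# with c_1 the coefficient of the 1-spheres.
c1, n1, n2 = 2.0, 4, 3
fG1 = [5.0, 6.0, 4.0, 5.0]     # f'_P on the n1 vertices of G1
fG2 = [3.0, 4.0, 3.0]          # f'_P on the n2 vertices of G2
A1, A2 = sum(fG1), sum(fG2)
A = 2 * c1 * n1 * n2 + A1 + A2

# In the join, every vertex of G1 gains the n2 vertices of G2 at distance 1
# and vice versa, cf. Equation (125).
f_join = [c1 * n2 + x for x in fG1] + [c1 * n1 + y for y in fG2]
I = entropy(f_join)

base = (A1 / A) * (entropy(fG1) - log2(A1 / A)) \
     + (A2 / A) * (entropy(fG2) - log2(A2 / A)) \
     - (c1 * n1 * n2 / A) * log2(c1**2 * n1 * n2 / A**2)
assert base - 1 / log(2) <= I <= base           # Equations (134) and (135)
print(base - 1 / log(2), I, base)               # the gap equals log2(e) ~ 1.4427
```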
Summary and Conclusion

In this article, we have investigated a challenging problem in quantitative graph theory, namely to establish relations between graph entropy measures. Among the existing graph entropy measures, we have considered those entropies which are based on information functionals. These measures have turned out to be widely applicable and useful when measuring the complexity of networks [3]. In general, finding relations between quantitative network measures is a daunting problem. The results could be used in various branches of science including mathematics, statistics, information theory, biology, chemistry and the social sciences. Further, the determination of analytical relations between measures is of great practical importance when dealing with large scale networks. Also, relations involving quantitative network measures could be fruitful when determining the information content of large complex networks.

Note that our proof technique follows the one proposed in [23]. It is based on three main steps: Firstly, we compute the information functionals and, in turn, the probability values for every vertex of the graph in question. Secondly, we start with certain conditions for the computed functionals and arrive at a system of inequalities. Thirdly, by adding up the corresponding inequality system, we obtain the desired implicit information inequality. Using this approach, we have inferred novel bounds by assuming certain information functionals. It is evident that further bounds could be inferred by taking novel information functionals into account. Further, we explored relations between the involved information measures for general connected graphs and for special classes of graphs such as stars, path graphs, and the union and join of graphs.

At this juncture, it is also relevant to compare the results proved in this paper with those proved in [23]. While we derived the implicit information inequalities by assuming certain properties of the functionals, the implicit information inequalities derived in [23] are based on certain conditions for the calculated vertex probabilities. Interestingly, note that for Theorem 11 and Theorem 17 the range of the corresponding bounds is very small: the difference between the upper and the lower bound equals $\log_2 e \approx 1.442695$.

As noted earlier, relations between entropy-based measures for graphs have not been extensively explored so far. Apart from the results we have gained in this paper, we therefore state a few open problems as future work:

- To find relations between $I_f(G)$ and $I_f(H)$, when $H$ is an induced subgraph of $G$ and $f$ is an arbitrary information functional.
- To find relations between $I_f(G)$ and $\{I_f(T_1), I_f(T_2), \ldots, I_f(T_n)\}$, where $T_i$, $1 \leq i \leq n$, are so-called generalized trees, see [34]. Note that it is always possible to decompose an arbitrary undirected graph into a set of generalized trees [34].
- To find relations between measures based on information functionals and other classical graph measures.
- To derive information inequalities for graph entropy measures using random graphs.
- To derive statements to judge the quality of information inequalities.
Author Contributions

Wrote the paper: MD LS. Performed the mathematical analysis: MD LS.
References

1. Basak SC (1999) Information-theoretic indices of neighborhood complexity and their applications. In: Devillers J, Balaban AT, eds. Topological Indices and Related Descriptors in QSAR and QSPAR. Gordon and Breach Science Publishers, Amsterdam, The Netherlands. pp 563-595.
2. Wang J, Provan G (2009) Characterizing the structural complexity of real-world complex networks. In: Zhou J, ed. Complex Sciences. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, volume 4. Springer, Berlin/Heidelberg, Germany. pp 1178-1189.
3. Dehmer M, Mowshowitz A (2011) A history of graph entropy measures. Information Sciences 181: 57-78.
4. Li M, Vitányi P (1997) An Introduction to Kolmogorov Complexity and Its Applications. Springer.
5. Mowshowitz A (1968) Entropy and the complexity of graphs: I. An index of the relative complexity of a graph. Bulletin of Mathematical Biophysics 30: 175-204.
6. Shannon CE (1948) A mathematical theory of communication. Bell System Technical Journal 27: 379-423 and 623-656.
7. Bonchev D, Rouvray DH (2003) Complexity in Chemistry: Introduction and Fundamentals. Mathematical and Computational Chemistry 7. CRC Press, New York.
8. Bonchev D (1983) Information Theoretic Indices for Characterization of Chemical Structures. Research Studies Press, Chichester.
9. Dehmer M (2008) Information processing in complex networks: Graph entropy and information functionals. Applied Mathematics and Computation 201: 82-94.
10. Emmert-Streib F, Dehmer M (2007) Information theoretic measures of UHG graphs with low computational complexity. Applied Mathematics and Computation 190: 1783-1794.
11. Mehler A, Weiß P, Lücking A (2010) A network model of interpersonal alignment. Entropy 12: 1440-1483.
12. Bonchev D (2003) Complexity in Chemistry: Introduction and Fundamentals. Taylor and Francis, Boca Raton, FL, USA.
13. Anand K, Bianconi G (2009) Entropy measures for networks: Toward an information theory of complex topologies. Physical Review E 80: 045102.
14. Costa LdF, Rodrigues FA, Travieso G, Boas PRV (2007) Characterization of complex networks: A survey of measurements. Advances in Physics 56: 167-242.
15. Kim J, Wilhelm T (2008) What is a complex graph? Physica A: Statistical Mechanics and its Applications 387: 2637-2652.
16. Balaban AT, Balaban TS (1991) New vertex invariants and topological indices of chemical graphs based on information on distances. Journal of Mathematical Chemistry 8: 383-397.
17. Bertz SH (1983) A mathematical model of complexity. In: King R, ed. Chemical Applications of Topology and Graph Theory. Elsevier, Amsterdam. pp 206-221.
18. Basak SC, Magnuson VR, Niemi GJ, Regal RR (1988) Determining structural similarity of chemicals using graph-theoretic indices. Discrete Applied Mathematics 19: 17-44.
19. Bonchev D, Rouvray DH (2005) Complexity in Chemistry, Biology, and Ecology. Mathematical and Computational Chemistry. Springer, New York. doi:10.1007/b136300.
20. Claussen JC (2007) Offdiagonal complexity: A computationally quick complexity measure for graphs and networks. Physica A: Statistical Mechanics and its Applications 375: 365-373.
21. Körner J (1973) Coding of an information source having ambiguous alphabet and the entropy of graphs. Transactions of the 6th Prague Conference on Information Theory. pp 411-425.
22. Butts C (2001) The complexity of social networks: Theoretical and empirical findings. Social Networks 23: 31-71.
23. Dehmer M, Mowshowitz A (2010) Inequalities for entropy-based measures of network information content. Applied Mathematics and Computation 215: 4263-4271.
24. Dehmer M, Mowshowitz A, Emmert-Streib F (2011) Connections between classical and parametric network entropies. PLoS ONE 6: e15733.
25. Bonchev D, Trinajstić N (1977) Information theory, distance matrix, and molecular branching. The Journal of Chemical Physics 67: 4517-4533.
26. Dehmer M, Borgert S, Emmert-Streib F (2008) Entropy bounds for hierarchical molecular networks. PLoS ONE 3: e3079.
27. Skorobogatov VA, Dobrynin AA (1988) Metrical analysis of graphs. MATCH Communications in Mathematical and in Computer Chemistry 23: 105-155.
28. Shannon C, Weaver W (1997) The Mathematical Theory of Communication. University of Illinois Press, Urbana, IL, USA.
29. Freeman LC (1977) A set of measures of centrality based on betweenness. Sociometry 40: 35-41.
30. Freeman LC (1978) Centrality in social networks: Conceptual clarification. Social Networks 1: 215-239.
31. Sabidussi G (1966) The centrality index of a graph. Psychometrika 31: 581-603.
32. Emmert-Streib F, Dehmer M (2011) Networks for systems biology: Conceptual connection of data and function. IET Systems Biology 5: 185-207.
33. Simonyi G (1995) Graph entropy: A survey. In: Cook W, Lovász L, Seymour P, eds. Combinatorial Optimization. DIMACS Series in Discrete Mathematics and Theoretical Computer Science, volume 20. pp 399-441.
34. Emmert-Streib F, Dehmer M, Kilian J (2006) Classification of large graphs by a local tree decomposition. In: Proceedings of DMIN'05, International Conference on Data Mining, Las Vegas, USA. pp 477-482.