ISIT 2008, Toronto, Canada, July 6 - 11, 2008
A New Achievable Rate for Relay Networks Based on Parallel Relaying

Leila Ghabeli, Mohammad R. Aref
Information Systems and Security Lab, Department of Electrical Engineering
Sharif University of Technology, Tehran, Iran
Email: [email protected], [email protected]
Abstract— In this paper, we propose a new concept of relaying, named parallel relaying, to obtain a new achievable rate for relay networks. The proposed encoding scheme is based on partial decoding, in which the relay decodes only part of the transmitted message. The advantage of our method over previous ones is that it can be applied to relay networks with a general structure, not necessarily a feed-forward structure. The proposed achievable rate is also used to establish the capacity of a class of semi-deterministic relay networks that generalizes the Aref network.

This work was supported by the Iranian National Science Foundation (INSF) under contract No. 84,5193-2006.
I. INTRODUCTION

The discrete-memoryless relay network, denoted by $(\mathcal{X}_0 \times \mathcal{X}_1 \times \cdots \times \mathcal{X}_N, p(y_0, y_1, \ldots, y_N | x_0, x_1, \ldots, x_N), \mathcal{Y}_0 \times \mathcal{Y}_1 \times \cdots \times \mathcal{Y}_N)$, consists of a sender $X_0 \in \mathcal{X}_0$, a receiver $Y_0 \in \mathcal{Y}_0$, relay senders $X_1 \in \mathcal{X}_1, \ldots, X_N \in \mathcal{X}_N$, relay receivers $Y_1 \in \mathcal{Y}_1, \ldots, Y_N \in \mathcal{Y}_N$, and a family of conditional probability mass functions $p(y_0, y_1, \ldots, y_N | x_0, x_1, \ldots, x_N)$ on $\mathcal{Y}_0 \times \mathcal{Y}_1 \times \cdots \times \mathcal{Y}_N$, one for each $(x_0, x_1, \ldots, x_N) \in \mathcal{X}_0 \times \mathcal{X}_1 \times \cdots \times \mathcal{X}_N$. A $(2^{nR}, n)$ code for the channel consists of: i) a set of messages $\{1, 2, \ldots, 2^{nR}\}$; ii) an encoding function that maps each message $w$ into a codeword $x^n(w)$ of length $n$; iii) relay encoding functions $x_{1i} = f_i(y_{11}, y_{12}, \ldots, y_{1,i-1})$, for $1 \le i \le n$; and iv) a decoding function that maps each received sequence $y^n$ into an estimate $\hat{w}(y^n)$. A rate $R$ is achievable if there exists a sequence of $(2^{nR}, n)$ codes with $P_e^{(n)} = P(\hat{W} \neq W) \to 0$ as $n \to \infty$. The channel capacity $C$ is defined as the supremum of the set of achievable rates.

The relay channel, first introduced by van der Meulen in [1], describes a single-user communication channel in which a relay helps a sender-receiver pair in their communication. In [2], Cover and El Gamal proved a converse result for the relay channel, the so-called max-flow min-cut upper bound. Additionally, they established two coding approaches and three achievability results for the discrete-memoryless relay channel, and they presented the capacity of the degraded relay channel, the reversely degraded relay channel, and the relay channel with full feedback. In [3], the partial decoding scheme, or generalized block Markov encoding, was defined as a special case of the coding scheme proposed by Cover and El Gamal [2, Theorem 7].
In this encoding scheme, the relay does not completely decode the message transmitted by the sender; instead, the relay decodes only part of it. The partial decoding scheme was used to establish the capacity of two classes of relay channels, the semi-deterministic relay channel [3]-[4] and the orthogonal relay channel [5]. In [6] and [7], the partial decoding scheme was extended to multi-relay networks, and the capacities of multilevel semi-deterministic and orthogonal relay networks were established. In [8], several cooperative strategies for relay networks are reviewed; in addition, the authors generalize the compress-and-forward strategy to relay networks and give an achievable rate for the case in which the relays use either decode-and-forward or compress-and-forward. They also add partial decoding to the latter method when there are two relays: the first relay uses decode-and-forward, the second relay uses compress-and-forward, and the second relay further partially decodes the signal of the first relay before compressing its observation. The second relay output is made statistically independent of the outputs of the first relay and the transmitter. In [9], a mixed strategy consisting of the partial decoding scheme and compress-and-forward is developed for multiple-relay networks.

In this work we propose a new achievable rate for relay networks based on the partial decoding method. In contrast with the achievable rates proposed in [6]-[7] and [9], the rate proposed in this paper can be applied to relay networks with a general relay arrangement, not necessarily a feed-forward structure. The proof involves standard techniques such as random binning [10], regular encoding/sliding window decoding [11], and regular encoding/backward decoding [12]. Additionally, we introduce a special class of relay networks, named semi-deterministic relay networks with no interference at the relays, which generalizes the Aref network (the deterministic relay network with no interference), and we establish its capacity using the proposed achievable rate.

II. PARALLEL RELAYING SCHEME

In [3], generalized block Markov encoding, or partial decoding, is defined as a special case of Theorem 7 in [2].
In this encoding scheme, the relay does not completely decode the message transmitted by the sender; instead, the relay decodes only part of it. A block Markov encoding timeframe is again used, in which the relay decodes part of the message transmitted in the previous block and cooperates with the sender to transmit the decoded part of the message to the sink in the current block. In this section, we apply the partial decoding concept to relay networks with two relays and present the parallel relaying scheme in the following theorem.

[Fig. 1. The relay network, with the individual parts of the messages at the sender and the relays shown.]

Theorem 1: For any relay network $(\mathcal{X}_0 \times \mathcal{X}_1 \times \mathcal{X}_2, p(y_0, y_1, y_2 | x_0, x_1, x_2), \mathcal{Y}_0 \times \mathcal{Y}_1 \times \mathcal{Y}_2)$, the capacity $C$ is lower bounded by
$$
\begin{aligned}
C \ge \sup_{p(v, u_{12}, u_{21}, u_{01}^1, u_{02}^1, u_{01}^2, u_{02}^2, x_0, x_1, x_2)} \min\Big\{\;
& I(X_0 X_1 X_2; Y_0), \\
& I(U_{01}^1 U_{01}^2 U_{21}; Y_1 | X_1 U_{12} V) + I(X_0 X_2; Y_0 | X_1 U_{01}^1 U_{01}^2 U_{02}^2 U_{12} U_{21} V), \\
& I(U_{02}^1 U_{02}^2 U_{12}; Y_2 | X_2 U_{21} V) + I(X_0 X_1; Y_0 | X_2 U_{02}^1 U_{02}^2 U_{01}^2 U_{12} U_{21} V), \\
& I(U_{02}^2 U_{21}; Y_1 | X_1 U_{12} V) + I(U_{01}^2 U_{12}; Y_2 | X_2 U_{21} V) + I(X_0 X_1 X_2; Y_0 | U_{01}^2 U_{02}^2 U_{12} U_{21} V), \\
& I(U_{01}^1 U_{01}^2; Y_1 | X_1 U_{12} U_{21} V) + I(U_{02}^1 U_{02}^2; Y_2 | X_2 U_{12} U_{21} V) \\
& \quad - I(U_{01}^1 U_{01}^2; U_{02}^1 U_{02}^2 | X_1 X_2 U_{12} U_{21} V) + I(X_0; Y_0 | X_1 X_2 U_{01}^1 U_{02}^1 U_{01}^2 U_{02}^2 U_{12} U_{21} V)
\;\Big\} \qquad (1)
\end{aligned}
$$

where the supremum is over all joint probability mass functions $p(x_0, x_1, x_2, u_{01}^1, u_{01}^2, u_{02}^1, u_{02}^2, u_{12}, u_{21}, v)$ on the product set
$$\mathcal{V} \times \mathcal{U}_{12} \times \mathcal{U}_{21} \times \mathcal{U}_{01}^1 \times \mathcal{U}_{01}^2 \times \mathcal{U}_{02}^1 \times \mathcal{U}_{02}^2 \times \mathcal{X}_0 \times \mathcal{X}_1 \times \mathcal{X}_2$$

such that

$$(V U_{12} U_{21} U_{01}^1 U_{01}^2 U_{02}^1 U_{02}^2) \rightarrow (X_0 X_1 X_2) \rightarrow Y_0 \qquad (2)$$

$$U_{01}^1 \rightarrow (X_1, X_0) \rightarrow Y_0 \qquad (3)$$

$$U_{02}^1 \rightarrow (X_2, X_0) \rightarrow Y_0 \qquad (4)$$
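As a numerical illustration (not part of the original development), the following sketch shows one way to evaluate the information terms of (1) for a toy discrete distribution. The tensor `p_full`, the binary alphabets, the axis ordering, and the helper names are assumptions made only for this example, and only the first two min-terms are written out; the remaining terms follow the same pattern.

```python
# Illustrative sketch (not from the paper): numerically evaluating two of the
# mutual-information terms appearing in the bound (1).  The joint pmf is stored
# as a 13-dimensional numpy array whose axes correspond, in order, to
# (V, U12, U21, U01_1, U01_2, U02_1, U02_2, X0, X1, X2, Y0, Y1, Y2);
# all alphabets are binary here purely for brevity.
import numpy as np

AXES = {name: i for i, name in enumerate(
    ["V", "U12", "U21", "U01_1", "U01_2", "U02_1", "U02_2",
     "X0", "X1", "X2", "Y0", "Y1", "Y2"])}

def entropy(p, names):
    """H of the marginal of p over the listed variables (in bits)."""
    if not names:
        return 0.0
    keep = {AXES[n] for n in names}
    drop = tuple(i for i in range(p.ndim) if i not in keep)
    q = p.sum(axis=drop).ravel() if drop else p.ravel()
    q = q[q > 0]
    return float(-(q * np.log2(q)).sum())

def cond_mi(p, X, Y, Z=()):
    """I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)."""
    X, Y, Z = list(X), list(Y), list(Z)
    return entropy(p, X + Z) + entropy(p, Y + Z) - entropy(p, X + Y + Z) - entropy(p, Z)

# A random (unstructured) joint pmf just to exercise the helpers.
rng = np.random.default_rng(0)
p_full = rng.random((2,) * 13)
p_full /= p_full.sum()

term1 = cond_mi(p_full, ["X0", "X1", "X2"], ["Y0"])
term2 = (cond_mi(p_full, ["U01_1", "U01_2", "U21"], ["Y1"], ["X1", "U12", "V"])
         + cond_mi(p_full, ["X0", "X2"], ["Y0"],
                   ["X1", "U01_1", "U01_2", "U02_2", "U12", "U21", "V"]))
print(term1, term2)  # the remaining three terms of (1) follow the same pattern
```

In an actual evaluation, `p_full` would be built from a pmf satisfying the factorization (6) below together with the channel $p(y_0, y_1, y_2 | x_0, x_1, x_2)$, and the supremum in (1) would be taken over such distributions.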
Proof: For encoding we make use of regular block Markov superposition encoding, and for decoding we make use of backward decoding [12]. The message is divided into three independent parts. The first and second parts are decoded by the first and second relays, respectively, and the receiver can only make estimates of them, while the third part is decoded directly by the receiver. The sender and the relays cooperate in the next transmission block to remove the receiver's uncertainty about the first and second parts of the message. The message at each relay is also divided into two parts: the first part is decoded by the other relay, while the second part is decoded directly by the receiver. Fig. 1 shows the individual parts of the messages at the sender and the relays in the proposed coding scheme.

Consider a block Markov encoding scheme with $B$ blocks of transmission, each of $n$ symbols. A sequence of $B-2$ messages $w_i$, $i = 1, 2, \ldots, B-2$, each selected independently and uniformly over $\mathcal{W}$, is to be sent in $nB$ transmissions. (Note that as $B \to \infty$, for fixed $n$, the rate $R(B-2)/B$ is arbitrarily close to $R$.) The same codebook is used in each block of transmission. Each relay tries to send as much of the information about the messages as possible to the sink through its direct link and, if there is additional information, via the other relay.

We show that for any joint probability mass function $p(x_0, x_1, x_2, u_{01}^1, u_{01}^2, u_{02}^1, u_{02}^2, u_{12}, u_{21}, v)$ there exists a sequence of codebooks $\mathcal{C}_n$ such that $P_e^{(n)} \to 0$ as $n \to \infty$ if $R \le C$, where

$$R = R_{00} + R_{01}^1 + R_{01}^2 + R_{02}^1 + R_{02}^2. \qquad (5)$$
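As a small aside (an illustration under assumed values, not the paper's construction), the rate decomposition (5) amounts to splitting each message index into five sub-indices. The sketch below shows one such splitting; the block length, the individual rates, and the helper name `split_message` are arbitrary assumptions.

```python
# Illustrative sketch (not from the paper): splitting one message index of rate
# R = R00 + R01_1 + R01_2 + R02_1 + R02_2, as in (5), into its five sub-indices.
n = 20
rates = {"m00": 0.10, "m01_1": 0.05, "m01_2": 0.05, "m02_1": 0.05, "m02_2": 0.05}
sizes = {k: 2 ** round(n * r) for k, r in rates.items()}   # sub-codebook sizes

def split_message(w, sizes):
    """Map a single index w in [0, prod(sizes)) to the tuple of sub-indices."""
    parts = {}
    for name, size in sizes.items():
        w, parts[name] = divmod(w, size)
    return parts

total = 1
for s in sizes.values():
    total *= s
print(total, split_message(12345 % total, sizes))
```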
Random Coding: For any joint probability mass function

$$
\begin{aligned}
p(x_0, x_1, x_2, u_{01}^1, u_{01}^2, u_{02}^1, u_{02}^2, u_{12}, u_{21}, v) = \;
& p(x_0 | x_1, x_2, u_{01}^1, u_{01}^2, u_{02}^1, u_{02}^2, u_{12}, u_{21}, v)\, p(u_{01}^1 | u_{01}^2, x_1, u_{12}, u_{21}, v)\, p(u_{02}^1 | u_{02}^2, x_2, u_{12}, u_{21}, v) \\
& \times p(u_{01}^2 | u_{12}, u_{21}, v)\, p(u_{02}^2 | u_{12}, u_{21}, v)\, p(x_1 | u_{12}, v)\, p(x_2 | u_{21}, v)\, p(u_{12} | v)\, p(u_{21} | v)\, p(v) \qquad (6)
\end{aligned}
$$

on the product set $\mathcal{V} \times \mathcal{U}_{12} \times \mathcal{U}_{21} \times \mathcal{U}_{01}^1 \times \mathcal{U}_{01}^2 \times \mathcal{U}_{02}^1 \times \mathcal{U}_{02}^2 \times \mathcal{X}_0 \times \mathcal{X}_1 \times \mathcal{X}_2$:

a) Generate $2^{n(R_{01}^2 + R_{02}^2)}$ i.i.d. $v^n$ sequences, each with probability $p(v^n) = \prod_{i=1}^{n} p(v_i)$. Index them as $v^n(m_v)$, where $m_v \in [1, 2^{n(R_{01}^2 + R_{02}^2)}]$.

b) For each $v^n(m_v)$, generate $2^{nR_{01}^2}$ i.i.d. $u_{12}^n$ sequences, each with probability $p(u_{12}^n | v^n) = \prod_{i=1}^{n} p(u_{12,i} | v_i)$. Index them as $u_{12}^n(m_{12} | m_v)$, where $m_{12} \in [1, 2^{nR_{01}^2}]$.

c) For each $v^n(m_v)$, generate $2^{nR_{02}^2}$ i.i.d. $u_{21}^n$ sequences, each with probability $p(u_{21}^n | v^n) = \prod_{i=1}^{n} p(u_{21,i} | v_i)$. Index them as $u_{21}^n(m_{21} | m_v)$, where $m_{21} \in [1, 2^{nR_{02}^2}]$.

d) For each $v^n(m_v)$ and $u_{12}^n(m_{12} | m_v)$, generate $2^{nR_{01}^1}$ i.i.d. $x_1^n$ sequences, each with probability $p(x_1^n | u_{12}^n, v^n) = \prod_{i=1}^{n} p(x_{1,i} | u_{12,i}, v_i)$. Index them as $x_1^n(m_{10} | m_{12}, m_v)$, where $m_{10} \in [1, 2^{nR_{01}^1}]$.

e) For each $v^n(m_v)$ and $u_{21}^n(m_{21} | m_v)$, generate $2^{nR_{02}^1}$ i.i.d. $x_2^n$ sequences, each with probability $p(x_2^n | u_{21}^n, v^n) = \prod_{i=1}^{n} p(x_{2,i} | u_{21,i}, v_i)$. Index them as $x_2^n(m_{20} | m_{21}, m_v)$, where $m_{20} \in [1, 2^{nR_{02}^1}]$.

f) For each $v^n(m_v)$, $u_{12}^n(m_{12} | m_v)$ and $u_{21}^n(m_{21} | m_v)$, generate $2^{nR_{01}^2}$ i.i.d. $u_{01}^{2n}$ sequences, each with probability $p(u_{01}^{2n} | u_{12}^n, u_{21}^n, v^n) = \prod_{i=1}^{n} p(u_{01,i}^2 | u_{12,i}, u_{21,i}, v_i)$. Index them as $u_{01}^{2n}(m_{01}^2 | m_{12}, m_{21}, m_v)$, where $m_{01}^2 \in [1, 2^{nR_{01}^2}]$.
g) For each $v^n(m_v)$, $u_{12}^n(m_{12} | m_v)$ and $u_{21}^n(m_{21} | m_v)$, generate $2^{nR_{02}^2}$ i.i.d. $u_{02}^{2n}$ sequences, each with probability $p(u_{02}^{2n} | u_{12}^n, u_{21}^n, v^n) = \prod_{i=1}^{n} p(u_{02,i}^2 | u_{12,i}, u_{21,i}, v_i)$. Index them as $u_{02}^{2n}(m_{02}^2 | m_{12}, m_{21}, m_v)$, where $m_{02}^2 \in [1, 2^{nR_{02}^2}]$.

h) For each $v^n(m_v)$, $u_{12}^n(m_{12} | m_v)$, $u_{21}^n(m_{21} | m_v)$, $x_1^n(m_{10} | m_{12}, m_v)$ and $u_{01}^{2n}(m_{01}^2 | m_{12}, m_{21}, m_v)$, generate $2^{nR_{01}^1}$ i.i.d. $u_{01}^{1n}$ sequences, each with probability $p(u_{01}^{1n} | u_{01}^{2n}, x_1^n, u_{12}^n, u_{21}^n, v^n) = \prod_{i=1}^{n} p(u_{01,i}^1 | u_{01,i}^2, x_{1,i}, u_{12,i}, u_{21,i}, v_i)$. Index them as $u_{01}^{1n}(m_{01}^1 | m_{01}^2, m_{10}, m_{21}, m_{12}, m_v)$, where $m_{01}^1 \in [1, 2^{nR_{01}^1}]$.

i) For each $v^n(m_v)$, $u_{12}^n(m_{12} | m_v)$, $u_{21}^n(m_{21} | m_v)$, $x_2^n(m_{20} | m_{21}, m_v)$ and $u_{02}^{2n}(m_{02}^2 | m_{12}, m_{21}, m_v)$, generate $2^{nR_{02}^1}$ i.i.d. $u_{02}^{1n}$ sequences, each with probability $p(u_{02}^{1n} | u_{02}^{2n}, x_2^n, u_{12}^n, u_{21}^n, v^n) = \prod_{i=1}^{n} p(u_{02,i}^1 | u_{02,i}^2, x_{2,i}, u_{12,i}, u_{21,i}, v_i)$. Index them as $u_{02}^{1n}(m_{02}^1 | m_{02}^2, m_{20}, m_{21}, m_{12}, m_v)$, where $m_{02}^1 \in [1, 2^{nR_{02}^1}]$.

j) For each $v^n(m_v)$, $u_{12}^n(m_{12} | m_v)$, $u_{21}^n(m_{21} | m_v)$, $x_1^n(m_{10} | m_{12}, m_v)$, $x_2^n(m_{20} | m_{21}, m_v)$, $u_{01}^{2n}(m_{01}^2 | m_{12}, m_{21}, m_v)$, $u_{02}^{2n}(m_{02}^2 | m_{12}, m_{21}, m_v)$, $u_{01}^{1n}(m_{01}^1 | m_{01}^2, m_{10}, m_{21}, m_{12}, m_v)$ and $u_{02}^{1n}(m_{02}^1 | m_{02}^2, m_{20}, m_{21}, m_{12}, m_v)$, generate $2^{nR_{00}}$ i.i.d. $x_0^n$ sequences, each with probability

$$p(x_0^n | u_{01}^{1n}, u_{02}^{1n}, u_{01}^{2n}, u_{02}^{2n}, x_1^n, x_2^n, u_{12}^n, u_{21}^n, v^n) = \prod_{i=1}^{n} p(x_{0,i} | u_{01,i}^1, u_{02,i}^1, u_{01,i}^2, u_{02,i}^2, x_{1,i}, x_{2,i}, u_{12,i}, u_{21,i}, v_i).$$

Index them as $x_0^n(m_{00} | m_{01}^1, m_{01}^2, m_{02}^1, m_{02}^2, m_{10}, m_{20}, m_{12}, m_{21}, m_v)$, where $m_{00} \in [1, 2^{nR_{00}}]$.

The index $m_{10}$ represents the index $m_{01}^1$ of the previous block; $m_{20}$ represents the index $m_{02}^1$ of the previous block; $m_{12}$ represents the index $m_{01}^2$ of the previous block; $m_{21}$ represents the index $m_{02}^2$ of the previous block; and $m_v$ represents the pair of indices $(m_{12}, m_{21})$ of the previous block.

Repeating the above steps a)-j) independently, we generate another random codebook $\mathcal{C}_1$ similar to $\mathcal{C}_0$. We use these two codebooks in a sequential way: in block $b = 1, \ldots, B$, the codebook $\mathcal{C}_{b \bmod 2}$ is used. Hence, in any two consecutive blocks, the codewords are drawn from different, independent codebooks; this property will be used in the analysis of the probability of error. The codebooks and bin assignments are revealed to all parties.
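The nested structure of steps a) through j) is a superposition random-coding construction. The sketch below generates a toy codebook chain in the same spirit; it is illustrative only, with binary alphabets, placeholder conditional pmfs, and only the chain $v^n$, $u_{12}^n$, $x_1^n$ shown, the remaining layers following the same pattern.

```python
# Illustrative sketch (not from the paper): superposition random codebook
# generation in the spirit of steps a)-d).  The conditional pmfs below are
# arbitrary placeholders; R2 plays the role of R01^2 + R02^2, R12 of R01^2,
# and R10 of R01^1, all with assumed values.
import numpy as np

rng = np.random.default_rng(1)
n = 8                               # block length (tiny, for illustration)
R2, R12, R10 = 0.25, 0.25, 0.25     # rates in bits per symbol (assumed)

p_v = np.array([0.5, 0.5])                      # p(v)
p_u12_given_v = np.array([[0.8, 0.2],           # p(u12 | v)
                          [0.3, 0.7]])
p_x1_given_u12_v = rng.dirichlet(np.ones(2), size=(2, 2))  # p(x1 | u12, v)

def draw(pmf_rows, cond):
    """Draw one symbol per position i from pmf_rows[cond[i]]."""
    return np.array([rng.choice(2, p=pmf_rows[c]) for c in cond])

# a) codebook for v^n(m_v)
V = {mv: rng.choice(2, size=n, p=p_v) for mv in range(2 ** int(n * R2))}

# b) for each v^n(m_v), a codebook u12^n(m12 | m_v) superimposed on it
U12 = {(m12, mv): draw(p_u12_given_v, V[mv])
       for mv in V for m12 in range(2 ** int(n * R12))}

# d) for each (v^n, u12^n), a codebook x1^n(m10 | m12, m_v) on top of both
X1 = {(m10, m12, mv): np.array([rng.choice(2, p=p_x1_given_u12_v[u, v])
                                for u, v in zip(U12[(m12, mv)], V[mv])])
      for (m12, mv) in U12 for m10 in range(2 ** int(n * R10))}

print(len(V), len(U12), len(X1))  # 4, 16, 64 codewords at the three layers
```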
Encoding: Encoding is performed in the following Markov fashion. Let $w_i = (m_{00,i}, m_{01,i}^1, m_{01,i}^2, m_{02,i}^1, m_{02,i}^2)$ be the new message to be transmitted in block $i$. Assume that the first and second relays have estimates $(\hat{m}_{01,i-1}^1, \hat{m}_{01,i-1}^2, \hat{m}_{21,i-1})$ and $(\hat{m}_{02,i-1}^1, \hat{m}_{02,i-1}^2, \hat{m}_{12,i-1})$ of the previous indices $(m_{01,i-1}^1, m_{01,i-1}^2, m_{21,i-1})$ and $(m_{02,i-1}^1, m_{02,i-1}^2, m_{12,i-1})$, respectively. Since both relays know $(m_{12,i-1}, m_{21,i-1})$, they try to send this pair to the sink through the cooperation variable $v_i$. In this fashion, the first- and second-relay transmissions in block $i$ are $x_1^n(m_{10,i} | m_{12,i}, m_{v,i})$ and $x_2^n(m_{20,i} | m_{21,i}, m_{v,i})$, respectively, where $m_{12,i} = \hat{m}_{01,i-1}^2$, $m_{21,i} = \hat{m}_{02,i-1}^2$, $m_{10,i} = \hat{m}_{01,i-1}^1$, and $m_{20,i} = \hat{m}_{02,i-1}^1$. The sender at block $i$, knowing $(m_{01,i}^1, m_{01,i}^2)$ and $(m_{02,i}^1, m_{02,i}^2)$ (the parts of the messages transmitted to the relays in block $i$), $(m_{01,i-1}^1, m_{01,i-1}^2)$, $(m_{02,i-1}^1, m_{02,i-1}^2)$, and hence $m_{10,i}$, $m_{20,i}$, $m_{12,i}$, $m_{21,i}$, transmits $x_0^n(m_{00,i} | m_{01,i}^1, m_{01,i}^2, m_{02,i}^1, m_{02,i}^2, m_{10,i}, m_{20,i}, m_{12,i}, m_{21,i}, m_{v,i})$, implicitly including $u_{01}^{1n}(m_{01}^1 | m_{01}^2, m_{10}, m_{21}, m_{12}, m_v)$, $u_{02}^{1n}(m_{02}^1 | m_{02}^2, m_{20}, m_{21}, m_{12}, m_v)$, $u_{01}^{2n}(m_{01}^2 | m_{12}, m_{21}, m_v)$, and $u_{02}^{2n}(m_{02}^2 | m_{12}, m_{21}, m_v)$.

Decoding: Assume that at the end of block $i-1$, the first relay knows $(m_{01,1}^1, \ldots, m_{01,i-1}^1)$, $(m_{01,1}^2, \ldots, m_{01,i-1}^2)$, $(m_{10,1}, \ldots, m_{10,i})$, $(m_{21,1}, \ldots, m_{21,i-1})$, $(m_{12,1}, \ldots, m_{12,i})$, and $(m_{v,1}, \ldots, m_{v,i})$. The second relay knows $(m_{02,1}^2, \ldots, m_{02,i-1}^2)$, $(m_{02,1}^1, \ldots, m_{02,i-1}^1)$, $(m_{21,1}, \ldots, m_{21,i})$, $(m_{20,1}, \ldots, m_{20,i})$, $(m_{12,1}, \ldots, m_{12,i-1})$, and $(m_{v,1}, \ldots, m_{v,i})$. At the end of block $i$:

1) (At the first relay) The first relay declares that $\hat{m}_{21,i} = m_{21,i}$, or equivalently $\hat{m}_{02,i-1}^2 = m_{02,i-1}^2$, was sent if there is a unique index $m_{21}$ such that

$$\big(u_{21}^n(\hat{m}_{21,i} | \hat{m}_{v,i}), u_{12}^n(\hat{m}_{12,i} | \hat{m}_{v,i}), x_1^n(\hat{m}_{10,i} | \hat{m}_{12,i}, \hat{m}_{v,i}), v^n(\hat{m}_{v,i}), y_1^n(i)\big) \in A_\epsilon^n$$

and

$$\big(u_{02}^{2n}(\hat{m}_{02,i}^2 | \hat{m}_{12,i}, \hat{m}_{21,i}, \hat{m}_{v,i}), u_{21}^n(\hat{m}_{21,i} | \hat{m}_{v,i}), u_{12}^n(\hat{m}_{12,i} | \hat{m}_{v,i}), x_1^n(\hat{m}_{10,i} | \hat{m}_{12,i}, \hat{m}_{v,i}), v^n(\hat{m}_{v,i}), y_1^n(i)\big) \in A_\epsilon^n.$$

This can be done with arbitrarily small probability of error if $n$ is sufficiently large and

$$R_{02}^2 < I(U_{02}^2 U_{21}; Y_1 | X_1 U_{12} V). \qquad (7)$$

The proof can be carried out by the regular encoding/sliding window decoding technique and [9, Lemma 1], based on the fact that, according to (6), $I(U_{02}^2 U_{21}; X_1 U_{12} | V) = 0$.
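Each decoding step above tests whether a tuple of codewords together with the received block is jointly typical. The following sketch implements only a simplified weak-typicality test for such a tuple; the set $A_\epsilon^n$ in the paper is the usual jointly typical set, and the joint pmf `p_joint`, the binary alphabets, and the threshold below are assumptions made for illustration.

```python
# Illustrative sketch (not from the paper): a weak joint-typicality test for a
# tuple of sequences, of the kind used in decoding steps 1)-3).  `p_joint` is a
# placeholder per-symbol joint pmf; a real decoder would test typicality with
# respect to the pmf induced by (6) and the channel.
import numpy as np

def empirical_neg_log_prob(seqs, p_joint):
    """-(1/n) log2 p(sequence tuple) for a memoryless p_joint."""
    n = len(seqs[0])
    logp = 0.0
    for symbols in zip(*seqs):          # one position of the tuple at a time
        logp += np.log2(p_joint[symbols])
    return -logp / n

def jointly_typical(seqs, p_joint, eps):
    """True if the tuple's empirical -log-probability is within eps of H(p_joint)."""
    q = p_joint.ravel()
    H = float(-(q[q > 0] * np.log2(q[q > 0])).sum())
    return abs(empirical_neg_log_prob(seqs, p_joint) - H) < eps

# toy example with two binary sequences and an assumed joint pmf
p_joint = np.array([[0.4, 0.1],
                    [0.1, 0.4]])
x = np.array([0, 0, 1, 1, 0, 1, 0, 1])
y = np.array([0, 0, 1, 1, 1, 1, 0, 0])
print(jointly_typical([x, y], p_joint, eps=0.3))
```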
2) (At the first relay) The first relay declares that $\hat{m}_{01,i}^1 = m_{01,i}^1$ and $\hat{m}_{01,i}^2 = m_{01,i}^2$ were sent if there is a unique pair of indices $(m_{01}^1, m_{01}^2)$ such that

$$\big(u_{01}^{1n}(\hat{m}_{01,i}^1 | \hat{m}_{01,i}^2, \hat{m}_{10,i}, \hat{m}_{12,i}, \hat{m}_{21,i}, \hat{m}_{v,i}), u_{01}^{2n}(\hat{m}_{01,i}^2 | \hat{m}_{12,i}, \hat{m}_{21,i}, \hat{m}_{v,i}), x_1^n(\hat{m}_{10,i} | \hat{m}_{12,i}, \hat{m}_{v,i}), u_{21}^n(\hat{m}_{21,i} | \hat{m}_{v,i}), u_{12}^n(\hat{m}_{12,i} | \hat{m}_{v,i}), v^n(\hat{m}_{v,i}), y_1^n(i)\big) \in A_\epsilon^n.$$

This can be done with arbitrarily small probability of error if $n$ is sufficiently large and

$$R_{01}^1 + R_{01}^2 < I(U_{01}^1 U_{01}^2; Y_1 | X_1 U_{12} U_{21} V). \qquad (8)$$
The proof is based on joint decoding of $(\hat{m}_{01}^1, \hat{m}_{01}^2)$. The probability of error can be bounded as

$$P_e^n \stackrel{(a)}{\le} \sum_{(m_{01}^1, m_{01}^2) \in [1, 2^{nR_{01}^1}] \times [1, 2^{nR_{01}^2}]} \;\; \sum_{(u_{01}^1, u_{01}^2, x_1, u_{12}, u_{21}, v, y_1) \in A_\epsilon^n} p(u_{01}^1 | u_{01}^2, x_1, u_{12}, u_{21}, v)\, p(u_{01}^2 | u_{12}, u_{21}, v)\, p(y_1 | x_1, u_{12}, u_{21}, v)\, p(x_1, u_{12}, u_{21}, v)$$

$$\le 2^{n(R_{01}^1 + R_{01}^2)}\, |A_\epsilon^n|\, 2^{-n(H(U_{01}^1 | U_{01}^2 X_1 U_{12} U_{21} V) - \epsilon)}\, 2^{-n(H(U_{01}^2 | U_{12} U_{21} V) - \epsilon)}\, 2^{-n(H(Y_1 | X_1 U_{12} U_{21} V) - \epsilon)}\, 2^{-n(H(X_1 U_{12} U_{21} V) - \epsilon)}$$

$$\le 2^{n(R_{01}^1 + R_{01}^2)}\, 2^{n(H(Y_1 U_{01}^1 U_{01}^2 X_1 U_{12} U_{21} V) + \epsilon)}\, 2^{-n(H(U_{01}^1 | U_{01}^2 X_1 U_{12} U_{21} V) - \epsilon)}\, 2^{-n(H(U_{01}^2 | U_{12} U_{21} V) - \epsilon)}\, 2^{-n(H(Y_1 | X_1 U_{12} U_{21} V) - \epsilon)}\, 2^{-n(H(X_1 U_{12} U_{21} V) - \epsilon)}$$
where (a) follows from the fact that, according to (6), $I(U_{01}^2; X_1 | U_{12} U_{21} V) = 0$. Collecting the exponents gives $P_e^n \le 2^{n(R_{01}^1 + R_{01}^2)}\, 2^{-n(I(U_{01}^1 U_{01}^2; Y_1 | X_1 U_{12} U_{21} V) - \delta(\epsilon))}$ with $\delta(\epsilon) \to 0$ as $\epsilon \to 0$; since $\epsilon > 0$ is arbitrary, (8) implies that $P_e \to 0$ as $n \to \infty$.

3) (At the second relay) Due to the symmetry in the definition of the codewords, the same argument as in the previous steps yields the following rate region:

$$R_{01}^2 + R_{02}^2 < I(U_{01}^2 U_{02}^2 U_{12} U_{21} V; Y_0).$$