Three-Receiver Broadcast Channels with Side Information

Saeed Hajizadeh (Undergraduate Student)
Department of Electrical Engineering
Ferdowsi University of Mashhad
Mashhad, Iran
[email protected]

Ghosheh Abed Hodtani
Department of Electrical Engineering
Ferdowsi University of Mashhad
Mashhad, Iran
[email protected]

Abstract—The three-receiver broadcast channel (BC) is of interest due to its information-theoretic differences from the two-receiver one. In this paper, we derive achievable rate regions for two classes of 3-receiver BC with side information available at the transmitter, the multilevel BC and the 3-receiver less noisy BC, by using superposition coding, the Gel'fand–Pinsker binning scheme, and Nair–El Gamal indirect decoding. Our rate region for the multilevel BC subsumes the Steinberg rate region for the 2-receiver degraded BC with side information as a special case. We also find the capacity region of the 3-receiver less noisy BC when side information is available both at the transmitter and at the receivers.

Keywords: 3-receiver broadcast channel, less noisy, multilevel broadcast channel

I. INTRODUCTION

The k-receiver, $k \ge 3$, broadcast channel (BC) was first studied by Borade et al. in [1], where they surmised that the straightforward extension of the Körner–Marton capacity region for two-receiver BCs with degraded message sets [2] to k-receiver multilevel broadcast networks is optimal. Nair and El Gamal [3] showed that the capacity region of a special class of 3-receiver BCs with two degraded message sets, in which one of the receivers is a degraded version of another, is a superset of the region in [1], thus proving that the direct extension of [2] is not in general optimal. Nair and Wang [4] later established the capacity region of the 3-receiver less noisy BC. Channels with side information (SI) were first studied by Shannon [5], who found the capacity of a single-input single-output channel when SI is causally available at the encoder. Gel'fand and Pinsker [6] found the capacity of a single-user channel when SI is non-causally available at the transmitter while the receiver is kept ignorant of it. Cover and Chiang [7] extended the results of [6] to the case where SI is available at both the encoder and the decoder. Multiple-user channels with side information were studied in [8], where inner and outer bounds for the degraded BC with non-causal SI, and the capacity region of the degraded BC with causal SI, were found. Moreover, in [9] inner and outer bounds were given for general two-user BCs with SI available at the transmitter, and other special cases for both BCs and MACs were also treated.

In this paper, we find achievable rate regions for the multilevel BC and the 3-receiver less noisy BC, both with SI non-causally available at the encoder. Our achievable rate regions reduce to those of [3] and [4] when there is no side information. We also find the capacity region of the latter channel when side information is also available at the receivers. The rest of the paper is organized as follows. In Section II, basic definitions and notation are presented. In Sections III and IV, new achievable rate regions are given for the multilevel BC and the 3-receiver less noisy BC, respectively. Section V concludes the paper.

II. DEFINITIONS
Random variables and their realizations are denoted by uppercase and lowercase letters, respectively; e.g., x is a realization of X. Let $\mathcal{X}, \mathcal{Y}_1, \mathcal{Y}_2, \mathcal{Y}_3$, and $\mathcal{S}$ be finite sets denoting the alphabets of the random variables. The n-sequence of a random variable is denoted by $X^n$, where the superscript is omitted when the choice of n is clear; we then use a boldface letter for the n-sequence itself, i.e., $\mathbf{x} = x^n$. Throughout, $X_i^j$ denotes the sequence $(X_i, X_{i+1}, \dots, X_j)$.

Definition 1: A channel $X \to Z$ is said to be a degraded version of the channel $X \to Y$ with SI if $X - Y - Z$ forms a Markov chain conditioned on every $s \in \mathcal{S}$ for all $p(u, x|s)$.

The multilevel BC with side information, denoted by $(\mathcal{X}, \mathcal{S}, \mathcal{Y}_1, \mathcal{Y}_2, \mathcal{Y}_3, p(y_1, y_3|x, s), p(y_2|y_1))$, is a 3-receiver BC with two degraded message sets, input alphabet $\mathcal{X}$, and output alphabets $\mathcal{Y}_1$, $\mathcal{Y}_2$, and $\mathcal{Y}_3$. The side information is a random variable S distributed over the set $\mathcal{S}$ according to $p(s)$. The transition probability function $p(y_1, y_3|x, s)$ describes the relationship between the channel input X, the side information S, and the channel outputs $Y_1$ and $Y_3$, while the probability function $p(y_2|y_1)$ describes the virtual channel modeling the output $Y_2$ as a degraded version of $Y_1$. Independent messages $m_0 \in \mathcal{M}_0$ and $m_1 \in \mathcal{M}_1$ are to be reliably sent, $m_0$ being the common message for all receivers and $m_1$ the private message for $Y_1$ only. The channel model is depicted in Fig. 1.
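For example, if for every $s \in \mathcal{S}$ the channel $p(y|x, s)$ is a BSC with crossover probability $p_s$ and $Z = Y \oplus N$ with $N \sim \mathrm{Bern}(q)$ independent of $(X, S, Y)$, then $X - Y - Z$ holds conditioned on every $s$, so $X \to Z$ is a degraded version of $X \to Y$ with SI in the sense of Definition 1.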
Definition 2: An $(n, 2^{nR_0}, 2^{nR_1}, \epsilon)$ two-degraded message set code for the multilevel BC with side information $(p(y_1, y_3|x, s), p(y_2|y_1))$ consists of an encoder map

$f: \{1, 2, \dots, M_0\} \times \{1, 2, \dots, M_1\} \times \mathcal{S}^n \to \mathcal{X}^n$

and a tuple of decoding maps

$g_{y_1}: \mathcal{Y}_1^n \to \{1, 2, \dots, M_0\} \times \{1, 2, \dots, M_1\}$
$g_{y_2}: \mathcal{Y}_2^n \to \{1, 2, \dots, M_0\}$
$g_{y_3}: \mathcal{Y}_3^n \to \{1, 2, \dots, M_0\}$

such that $P_e^{(n)} \le \epsilon$, i.e.,

$P_e^{(n)} = \frac{1}{M_0 M_1} \sum_{m_0=1}^{M_0} \sum_{m_1=1}^{M_1} \sum_{\mathbf{s} \in \mathcal{S}^n} p(\mathbf{s}) \, P\{g_{y_1}(\mathbf{Y}_1) \ne (m_0, m_1) \text{ or } g_{y_2}(\mathbf{Y}_2) \ne m_0 \text{ or } g_{y_3}(\mathbf{Y}_3) \ne m_0 \mid \mathbf{s}, \mathbf{x}(m_0, m_1, \mathbf{s}) \text{ sent}\} \le \epsilon.$

Figure 1. Multilevel broadcast channel with side information.

The rate pair of the code is defined as $(R_0, R_1) = \frac{1}{n}(\log M_0, \log M_1)$. A rate pair $(R_0, R_1)$ is said to be $\epsilon$-achievable if for any $\epsilon > 0$ there is an integer $n_0$ such that for all $n \ge n_0$ there exists an $(n, 2^{n(R_0-\epsilon)}, 2^{n(R_1-\epsilon)}, \epsilon)$ code for $(p(y_1, y_3|x, s), p(y_2|y_1))$. The closure of the union of all $\epsilon$-achievable rate pairs is called the capacity region $C_{ML}$.

Definition 3: A channel $X \to Y$ is said to be less noisy than the channel $X \to Z$ in the presence of side information if

$I(U; Y|S = s) \ge I(U; Z|S = s)$

for all $p(u, x, y, z|s) = p(u|s) p(x|u, s) p(y, z|x, s)$ and all $s \in \mathcal{S}$.

The 3-receiver less noisy BC with side information is depicted in Fig. 2, where $Y_1$ is less noisy than $Y_2$ and $Y_2$ is less noisy than $Y_3$, i.e., in the notation of [4], $Y_1 \succeq Y_2 \succeq Y_3$. The messages $m_1 \in \mathcal{M}_1$, $m_2 \in \mathcal{M}_2$, and $m_3 \in \mathcal{M}_3$ are to be reliably sent to receivers $Y_1$, $Y_2$, and $Y_3$, respectively. An $(n, 2^{nR_1}, 2^{nR_2}, 2^{nR_3}, \epsilon)$ code and its rate triple $(R_1, R_2, R_3) = \frac{1}{n}(\log M_1, \log M_2, \log M_3)$ are defined just as for the multilevel BC, as are achievable rate triples, the achievable rate region, and the capacity region $C_{LN}$.

Figure 2. Three-receiver less noisy broadcast channel with side information.

III. MULTILEVEL BROADCAST CHANNEL WITH SIDE INFORMATION
Define $\mathcal{P}$ as the collection of all random variables $(U, V, S, X, Y_1, Y_2, Y_3)$ with finite alphabets such that

$p(u, v, s, x, y_1, y_2, y_3) = p(s) p(u|s) p(v|u, s) p(x|v, s) p(y_1, y_3|x, s) p(y_2|y_1). \quad (1)$

By (1), the following Markov chains hold:

$(U, V) - (X, S) - (Y_1, Y_3) \quad (2)$
$(X, S, Y_3) - Y_1 - Y_2 \quad (3)$
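Chain (3) can be verified directly from the factorization (1): marginalizing (1) over $(u, v)$ and dividing,

$$p(y_2 \mid x, s, y_1, y_3) = \frac{\sum_{u,v} p(u, v, s, x, y_1, y_2, y_3)}{\sum_{u,v,y_2'} p(u, v, s, x, y_1, y_2', y_3)} = p(y_2|y_1),$$

since $p(y_2|y_1)$ is the only factor involving $y_2$; this is exactly $(X, S, Y_3) - Y_1 - Y_2$.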
Theorem 1: A pair of nonnegative numbers $(R_0, R_1)$ is achievable for the multilevel BC with side information non-causally available at the transmitter provided that

$R_0 \le \min\{I(U; Y_2) - I(U; S),\ I(V; Y_3) - I(UV; S)\}$
$R_1 \le I(X; Y_1|U) - I(V; S|U) - I(X; S|V) \quad (4)$
$R_0 + R_1 \le I(V; Y_3) + I(X; Y_1|V) - I(X; S|V) - I(UV; S)$

for some $(U, V, S, X, Y_1, Y_2, Y_3) \in \mathcal{P}$.
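As a quick numerical illustration (our own toy example, not from the paper), the bounds in (4) can be evaluated for any fixed distribution in $\mathcal{P}$ by computing the relevant mutual informations from joint probability tables. The sketch below computes the Gel'fand–Pinsker-type term $I(U; Y_2) - I(U; S)$ of the $R_0$ bound for a hypothetical binary model in which $S \sim \mathrm{Bern}(1/2)$, $U$ agrees with $S$ with probability 0.9, and $Y_2$ observes $U$ through a BSC(0.05); all of these parameters are assumptions chosen for illustration only (in particular, $X = U$ is collapsed for simplicity).

```python
import math

def mutual_information(joint):
    """I(A;B) in bits from a dict {(a, b): p} describing a joint pmf."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# Hypothetical example: S ~ Bern(1/2), U agrees with S w.p. 0.9,
# and Y2 is U passed through a BSC(0.05).
p_us = {}   # joint pmf of (U, S)
p_uy2 = {}  # joint pmf of (U, Y2)
for s in (0, 1):
    for u in (0, 1):
        p_su = 0.5 * (0.9 if u == s else 0.1)
        p_us[(u, s)] = p_us.get((u, s), 0.0) + p_su
        for y2 in (0, 1):
            py_given_u = 0.95 if y2 == u else 0.05
            p_uy2[(u, y2)] = p_uy2.get((u, y2), 0.0) + p_su * py_given_u

I_uy2 = mutual_information(p_uy2)  # = 1 - H(0.05), since U is uniform
I_us = mutual_information(p_us)    # = 1 - H(0.1)
r0_term = I_uy2 - I_us             # first term of the R0 bound in (4)
print(I_uy2, I_us, r0_term)
```

For this choice the term equals $H(0.1) - H(0.05) \approx 0.18$ bits, so a positive common rate is supportable even though the encoder must bin against $S$.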
Corollary 1.1: By setting $S \equiv \emptyset$ in (4), the achievable rate region of Theorem 1 reduces to the capacity region of the multilevel BC given in [3].

Corollary 1.2: By setting $Y_3 = Y_1$ and $V = X$ in (4), our achievable rate region reduces to that of [8] for the two-user degraded BC with side information.

Proof: Fix n and a joint distribution on $\mathcal{P}$. Note that the side information is distributed i.i.d., i.e., $p(\mathbf{s}) = \prod_{i=1}^{n} p(s_i)$. Split the message set $\mathcal{M}_1$ into two independent sub-message sets $\mathcal{M}_{11}$ and $\mathcal{M}_{12}$ so that $R_1 = R_{11} + R_{12}$.
Codebook Generation: First, randomly and independently generate $2^{n(R_0+R_0')}$ sequences $\mathbf{u}(m_0', m_0)$, $m_0' \in \{1, \dots, 2^{nR_0'}\}$, $m_0 \in \{1, \dots, 2^{nR_0}\}$, each i.i.d. according to $\prod_{i=1}^{n} p(u_i)$, and randomly throw them into $2^{nR_0}$ bins; clearly each bin then contains $2^{nR_0'}$ sequences. For each $\mathbf{u}(m_0', m_0)$, randomly and independently generate $2^{n(R_{11}+R_{11}')}$ sequences $\mathbf{v}(m_0', m_0, m_{11}', m_{11})$, $m_{11}' \in \{1, \dots, 2^{nR_{11}'}\}$, $m_{11} \in \{1, \dots, 2^{nR_{11}}\}$, each i.i.d. according to $\prod_{i=1}^{n} p_{V|U}(v_i | u_i(m_0', m_0))$, and randomly throw them into $2^{nR_{11}}$ bins. For each $\mathbf{v}(m_0', m_0, m_{11}', m_{11})$, randomly and independently generate $2^{n(R_{12}+R_{12}')}$ sequences $\mathbf{x}(m_0', m_0, m_{11}', m_{11}, m_{12}', m_{12})$, each i.i.d. according to $\prod_{i=1}^{n} p_{X|V,U}(x_i|v_i, u_i) = \prod_{i=1}^{n} p_{X|V}(x_i|v_i)$, and randomly throw them into $2^{nR_{12}}$ bins. Finally, provide the transmitter and all the receivers with the bins and their codewords.

Encoding: We are given the side information $\mathbf{s}$ and the message pair $(m_0, m_1)$; the messages are bin indices, and $m_1$ determines $(m_{11}, m_{12})$. In bin $m_0$ of the $\mathbf{u}$ sequences, look for an $m_0'$ such that $(\mathbf{u}(m_0', m_0), \mathbf{s}) \in A_\epsilon^{(n)}$, i.e., a $\mathbf{u}$ jointly typical with the given $\mathbf{s}$, where definitions of typical sequences are as in [12]. Then in bin $m_{11}$ of the $\mathbf{v}$ sequences, look for an $m_{11}'$ such that

$(\mathbf{u}(m_0', m_0), \mathbf{v}(m_0', m_0, m_{11}', m_{11}), \mathbf{s}) \in A_\epsilon^{(n)}.$

Then in bin $m_{12}$ of the $\mathbf{x}$ sequences, look for an $m_{12}'$ such that

$(\mathbf{u}(m_0', m_0), \mathbf{v}(m_0', m_0, m_{11}', m_{11}), \mathbf{x}(m_0', m_0, m_{11}', m_{11}, m_{12}', m_{12}), \mathbf{s}) \in A_\epsilon^{(n)}.$

We send the $\mathbf{x}$ sequence so found. Before turning to decoding, assume that the correct indices are found through the encoding procedure, i.e., $\hat{m}_0' = m_0'$, $\hat{m}_{11}' = m_{11}'$, and $\hat{m}_{12}' = m_{12}'$.

Decoding: Since the messages are uniformly distributed over their respective ranges, we can assume, without loss of generality, that the tuple $(m_0, m_{11}, m_{12}) = (1, 1, 1)$ is sent.

The second receiver observes $\mathbf{y}_2$ and thus has the following error events:

$E_{21} = \{(\mathbf{u}(m_0', 1), \mathbf{y}_2) \notin A_\epsilon^{(n)}\}$
$E_{22} = \{(\mathbf{u}(m_0', m_0), \mathbf{y}_2) \in A_\epsilon^{(n)} \text{ for some } m_0 \ne 1 \text{ and } m_0' \ne \hat{m}_0'\}$
$E_{23} = \{(\mathbf{u}(m_0', m_0), \mathbf{y}_2) \in A_\epsilon^{(n)} \text{ for some } m_0 \ne 1\}$

Remark 1: The error event $E_{23}$ leads to a redundant inequality.

By the weak law of large numbers (WLLN) [14], $P(E_{21}) \le \epsilon$ for every $\epsilon > 0$ as $n \to \infty$. For the second error event we have

$P(E_{22}) = \sum_{m_0, m_0'} \sum_{A_\epsilon^{(n)}(U, Y_2)} p(\mathbf{u}) p(\mathbf{y}_2) \le 2^{n(R_0+R_0')} 2^{n(H(U,Y_2)+\epsilon)} 2^{-n(H(U)-\epsilon)} 2^{-n(H(Y_2)-\epsilon)} = 2^{-n(I(U;Y_2) - 3\epsilon - R_0 - R_0')},$

so for every $\epsilon > 0$, $P(E_{22}) \le \epsilon$ as $n \to \infty$ provided that

$R_0 + R_0' \le I(U; Y_2) - 3\epsilon \quad (5)$

The first receiver observes $\mathbf{y}_1$ and needs to decode both $m_0$ and $m_1$. Its error events are

$E_{11} = \{(\mathbf{u}(m_0', 1), \mathbf{v}(m_0', 1, m_{11}', 1), \mathbf{x}(m_0', 1, m_{11}', 1, m_{12}', 1), \mathbf{y}_1) \notin A_\epsilon^{(n)}\}$
$E_{12} = \{(\mathbf{u}(m_0', 1), \mathbf{v}(m_0', 1, m_{11}', 1), \mathbf{x}(m_0', 1, m_{11}', 1, m_{12}', m_{12}), \mathbf{y}_1) \in A_\epsilon^{(n)} \text{ for some } m_{12} \ne 1 \text{ and } m_{12}' \ne \hat{m}_{12}'\}$
$E_{13} = \{(\mathbf{u}(m_0', 1), \mathbf{v}(m_0', 1, m_{11}', m_{11}), \mathbf{x}(m_0', 1, m_{11}', m_{11}, m_{12}', m_{12}), \mathbf{y}_1) \in A_\epsilon^{(n)} \text{ for some } m_{1i} \ne 1 \text{ and } m_{1i}' \ne \hat{m}_{1i}',\ i = 1, 2\}$
$E_{14} = \{(\mathbf{u}(m_0', m_0), \mathbf{v}(m_0', m_0, m_{11}', m_{11}), \mathbf{x}(m_0', m_0, m_{11}', m_{11}, m_{12}', m_{12}), \mathbf{y}_1) \in A_\epsilon^{(n)} \text{ for some } m_0 \ne 1,\ m_0' \ne \hat{m}_0',\ m_{1i} \ne 1, \text{ and } m_{1i}' \ne \hat{m}_{1i}',\ i = 1, 2\}$

The first receiver's probability of error can be made arbitrarily small provided that

$R_{12} + R_{12}' \le I(X; Y_1|V) - 6\epsilon \quad (6)$
$R_{11} + R_{11}' + R_{12} + R_{12}' \le I(X; Y_1|U) - 6\epsilon \quad (7)$
$R_0 + R_0' + R_{11} + R_{11}' + R_{12} + R_{12}' \le I(X; Y_1) - 5\epsilon \quad (8)$

The third receiver observes $\mathbf{y}_3$ and needs to decode only the common message, indirectly, through the satellite codeword $\mathbf{v}$. Its error events are

$E_{31} = \{(\mathbf{u}(m_0', 1), \mathbf{v}(m_0', 1, m_{11}', 1), \mathbf{y}_3) \notin A_\epsilon^{(n)}\}$
$E_{32} = \{(\mathbf{u}(m_0', 1), \mathbf{v}(m_0', 1, m_{11}', m_{11}), \mathbf{y}_3) \in A_\epsilon^{(n)} \text{ for some } m_{11} \ne 1 \text{ and } m_{11}' \ne \hat{m}_{11}'\}$
$E_{33} = \{(\mathbf{u}(m_0', m_0), \mathbf{v}(m_0', m_0, m_{11}', m_{11}), \mathbf{y}_3) \in A_\epsilon^{(n)} \text{ for some } m_0 \ne 1,\ m_{11} \ne 1,\ m_0' \ne \hat{m}_0', \text{ and } m_{11}' \ne \hat{m}_{11}'\}$

Again by the WLLN and the AEP, the third receiver's error probabilities can be made arbitrarily small as $n \to \infty$ provided that

$R_0 + R_0' + R_{11} + R_{11}' \le I(V; Y_3) - 3\epsilon \quad (9)$

Using Gel'fand–Pinsker coding, the encoder can choose proper $m_0'$, $m_{11}'$, and $m_{12}'$ indices with vanishing probability of error provided that for every $\epsilon > 0$ and sufficiently large n

$R_0' \ge I(U; S) + 2\epsilon \quad (10)$
$R_{11}' \ge I(V; S|U) + 2\epsilon \quad (11)$
$R_{12}' \ge I(X; S|V) + 2\epsilon \quad (12)$

Now, combining (5)–(9) with (10)–(12), noting that

$I(V; S|U) + I(U; S) = I(UV; S), \quad (13)$

and then using the Fourier–Motzkin procedure to eliminate $R_{11}$ and $R_{12}$, we obtain (4) as an achievable rate region for the multilevel BC with side information. ∎
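The binning step of the above proof can be visualized with a small simulation. The following sketch (our illustration; the parameters and the agreement-based typicality test are simplifying assumptions, not the paper's definitions) generates a random codebook of $\mathbf{u}$ sequences, throws them uniformly into bins, and has the encoder search the bin indexed by the common message for a sequence that looks jointly typical with $\mathbf{s}$:

```python
import random

def build_binned_codebook(n, R, Rp, seed=0):
    """Generate 2^(n(R+Rp)) i.i.d. Bern(1/2) u-sequences and throw
    them uniformly into 2^(nR) bins, Gel'fand-Pinsker style."""
    rng = random.Random(seed)
    num_seqs = 2 ** round(n * (R + Rp))
    num_bins = 2 ** round(n * R)
    bins = [[] for _ in range(num_bins)]
    for _ in range(num_seqs):
        u = tuple(rng.randint(0, 1) for _ in range(n))
        bins[rng.randrange(num_bins)].append(u)
    return bins

def encode(bins, m0, s, target=0.9, eps=0.15):
    """Search bin m0 for a u whose empirical agreement with s is within
    eps of the target p(u = s); return the closest candidate found."""
    best, best_gap = None, float("inf")
    for u in bins[m0]:
        agreement = sum(ui == si for ui, si in zip(u, s)) / len(s)
        gap = abs(agreement - target)
        if gap < best_gap:
            best, best_gap = u, gap
    return best, best_gap <= eps  # (chosen u, "typical" flag)

n, R, Rp = 16, 0.25, 0.5          # 2^4 bins, 2^12 sequences in total
bins = build_binned_codebook(n, R, Rp)
s = tuple(random.Random(1).randint(0, 1) for _ in range(n))
u, typical = encode(bins, m0=3, s=s)
```

With $R' > I(U; S)$ each bin contains exponentially many candidates, so the search succeeds with high probability as $n$ grows; at these toy sizes the `typical` flag may occasionally be False.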
IV. THREE-RECEIVER LESS NOISY BROADCAST CHANNEL WITH SIDE INFORMATION
Define $\mathcal{P}^*$ as the collection of all random variables $(U, V, S, X, Y_1, Y_2, Y_3)$ with finite alphabets such that

$p(u, v, s, x, y_1, y_2, y_3) = p(s) p(u|s) p(v|u, s) p(x|v, s) p(y_1, y_2, y_3|x, s). \quad (14)$

Theorem 2: A rate triple $(R_1, R_2, R_3)$ is achievable for the 3-receiver less noisy BC with side information non-causally available at the transmitter provided that

$R_1 \le I(X; Y_1|V) - I(X; S|V)$
$R_2 \le I(V; Y_2|U) - I(V; S|U) \quad (15)$
$R_3 \le I(U; Y_3) - I(U; S)$

for some joint distribution on $\mathcal{P}^*$.

Corollary 2.1: By setting $S \equiv \emptyset$ in the above rate region, it reduces to the capacity region of the 3-receiver less noisy BC given in [4].

Proof: The proof uses Cover's superposition coding [15] and the Gel'fand–Pinsker random binning scheme [6] along with Nair's indirect decoding, and is similar to the previous proof, so only an outline is provided. Fix n and a distribution on $\mathcal{P}^*$. Again note that the side information is distributed i.i.d., $p(\mathbf{s}) = \prod_{i=1}^{n} p(s_i)$.

Randomly and independently generate $2^{n(R_3'+R_3)}$ sequences $\mathbf{u}(m_3', m_3)$, each i.i.d. according to $\prod_{i=1}^{n} p(u_i)$, and randomly throw them into $2^{nR_3}$ bins. For each $\mathbf{u}(m_3', m_3)$, randomly and independently generate $2^{n(R_2'+R_2)}$ sequences $\mathbf{v}(m_3', m_3, m_2', m_2)$, each i.i.d. according to $\prod_{i=1}^{n} p_{V|U}(v_i|u_i)$, and randomly throw them into $2^{nR_2}$ bins. For each $\mathbf{v}(m_3', m_3, m_2', m_2)$, randomly and independently generate $2^{n(R_1'+R_1)}$ sequences $\mathbf{x}(m_3', m_3, m_2', m_2, m_1', m_1)$, each i.i.d. according to $\prod_{i=1}^{n} p_{X|V}(x_i|v_i)$, and randomly throw them into $2^{nR_1}$ bins.

Encoding succeeds with small probability of error provided that

$R_3' \ge I(U; S) \quad (16)$
$R_2' \ge I(V; S|U) \quad (17)$
$R_1' \ge I(X; S|V) \quad (18)$

and decoding succeeds provided that

$R_3 + R_3' \le I(U; Y_3) \quad (19)$
$R_2 + R_2' \le I(V; Y_2|U) \quad (20)$
$R_1 + R_1' \le I(X; Y_1|V) \quad (21)$

Combining (16)–(18) with (19)–(21) gives (15). ∎

Theorem 3: The capacity region of the 3-receiver less noisy BC with side information non-causally available at the transmitter and the receivers is the set of all rate triples $(R_1, R_2, R_3)$ such that

$R_1 \le I(X; Y_1|V, S)$
$R_2 \le I(V; Y_2|U, S) \quad (22)$
$R_3 \le I(U; Y_3|S)$

for some joint distribution on $\mathcal{P}^*$.

Proof: Achievability: The direct part follows by setting $\tilde{Y}_i = (Y_i, S)$, $i = 1, 2, 3$, in (15). Converse: The converse part uses an extension of Lemma 1 of [4].
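To see why substituting $\tilde{Y}_i = (Y_i, S)$ into (15) gives the region (22), expand the first bound by the chain rule:

$$I(X; Y_1, S|V) - I(X; S|V) = I(X; S|V) + I(X; Y_1|V, S) - I(X; S|V) = I(X; Y_1|V, S),$$

and similarly $I(V; Y_2, S|U) - I(V; S|U) = I(V; Y_2|U, S)$ and $I(U; Y_3, S) - I(U; S) = I(U; Y_3|S)$.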
Lemma 1 [4]: Let the channel $X \to Y$ be less noisy than the channel $X \to Z$, and let $(W, S^n)$ be any random vector such that

$(W, S^n) - X^n - (Y^n, Z^n)$

forms a Markov chain. Then

1. $I(Y^{i-1}; Z_i|W, S^n) \ge I(Z^{i-1}; Z_i|W, S^n)$
2. $I(Y^{i-1}; Y_i|W, S^n) \ge I(Z^{i-1}; Y_i|W, S^n)$

Proof: First note that since the channel is memoryless, we have

$(M_1, M_2, M_3, Y_1^{i-1}, Y_2^{i-1}, Y_3^{i-1}, S^{i-1}, S_{i+1}^n) - (X_i, S_i) - (Y_{1i}, Y_{2i}, Y_{3i}).$

As in [4], for any $1 \le j \le i-1$,

$I(Y^{j-1}, Z_j^{i-1}; Z_i|W, S^n) = I(Y^{j-1}, Z_{j+1}^{i-1}; Z_i|W, S^n) + I(Z_j; Z_i|W, S^n, Y^{j-1}, Z_{j+1}^{i-1})$
$\le I(Y^{j-1}, Z_{j+1}^{i-1}; Z_i|W, S^n) + I(Y_j; Z_i|W, S^n, Y^{j-1}, Z_{j+1}^{i-1})$
$= I(Y^j, Z_{j+1}^{i-1}; Z_i|W, S^n),$

where the inequality follows from the memorylessness of the channel and the fact that $Y$ is less noisy than $Z$, i.e., $I(Y_j; Z_i|W, S^n, Y^{j-1}, Z_{j+1}^{i-1}) \ge I(Z_j; Z_i|W, S^n, Y^{j-1}, Z_{j+1}^{i-1})$. Iterating over $j$ from $1$ to $i-1$ yields the first part. The proof of the second part follows the first with negligible variations. ∎

We now turn to the proof of the converse.
For the third receiver, by Fano's inequality,

$nR_3 = H(M_3) = H(M_3|S^n) = H(M_3|S^n, Y_3^n) + I(M_3; Y_3^n|S^n)$
$\le H(M_3|Y_3^n) + I(M_3; Y_3^n|S^n)$
$\le n\epsilon_{3n} + \sum_{i=1}^{n} I(M_3; Y_{3i}|S^n, Y_3^{i-1})$
$\le n\epsilon_{3n} + \sum_{i=1}^{n} I(M_3, S^{i-1}, S_{i+1}^n, Y_3^{i-1}; Y_{3i}|S_i)$
$\le n\epsilon_{3n} + \sum_{i=1}^{n} I(M_3, S^{i-1}, S_{i+1}^n, Y_2^{i-1}; Y_{3i}|S_i)$
$= n\epsilon_{3n} + \sum_{i=1}^{n} I(U_i; Y_{3i}|S_i),$

where $U_i \triangleq (M_3, S^{i-1}, S_{i+1}^n, Y_2^{i-1})$ and the last inequality follows from Lemma 1.

For the second receiver, similarly,

$nR_2 = H(M_2) = H(M_2|M_3, S^n) = H(M_2|M_3, S^n, Y_2^n) + I(M_2; Y_2^n|M_3, S^n)$
$\le H(M_2|Y_2^n) + I(M_2; Y_2^n|M_3, S^n)$
$\le n\epsilon_{2n} + \sum_{i=1}^{n} I(M_2; Y_{2i}|M_3, S^{i-1}, S_i, S_{i+1}^n, Y_2^{i-1})$
$= n\epsilon_{2n} + \sum_{i=1}^{n} I(M_2; Y_{2i}|U_i, S_i) \le n\epsilon_{2n} + \sum_{i=1}^{n} I(V_i; Y_{2i}|U_i, S_i),$

where $V_i \triangleq (M_2, M_3, S^{i-1}, S_{i+1}^n, Y_2^{i-1}) = (M_2, U_i)$.

For the first receiver,

$nR_1 = H(M_1|M_2, M_3, S^n) = H(M_1|M_2, M_3, S^n, Y_1^n) + I(M_1; Y_1^n|M_2, M_3, S^n)$
$\le H(M_1|Y_1^n) + I(M_1; Y_1^n|M_2, M_3, S^n)$
$\stackrel{(a)}{\le} n\epsilon_{1n} + \sum_{i=1}^{n} I(M_1; Y_{1i}|M_2, M_3, S^n, Y_1^{i-1})$
$= n\epsilon_{1n} + \sum_{i=1}^{n} \left[ I(M_1, Y_1^{i-1}; Y_{1i}|M_2, M_3, S^n) - I(Y_1^{i-1}; Y_{1i}|M_2, M_3, S^n) \right]$
$\stackrel{(b)}{\le} n\epsilon_{1n} + \sum_{i=1}^{n} \left[ I(M_1, Y_1^{i-1}; Y_{1i}|M_2, M_3, S^n) - I(Y_2^{i-1}; Y_{1i}|M_2, M_3, S^n) \right]$
$\le n\epsilon_{1n} + \sum_{i=1}^{n} I(M_1, Y_1^{i-1}; Y_{1i}|M_2, M_3, S^n, Y_2^{i-1})$
$= n\epsilon_{1n} + \sum_{i=1}^{n} I(M_1, Y_1^{i-1}; Y_{1i}|V_i, S_i) \le n\epsilon_{1n} + \sum_{i=1}^{n} I(X_i; Y_{1i}|V_i, S_i),$

where (a) follows from Fano's inequality, (b) follows from Lemma 1 (with $Y_1$ less noisy than $Y_2$), and the last inequality follows from the memorylessness of the channel. It is clear that for the given choice of $U_i$ and $V_i$, the Markov structure required by (14) is satisfied, again because the channel is memoryless. Now, using the standard time-sharing argument, we conclude that any achievable rate triple for the three-receiver less noisy broadcast channel with side information non-causally available at the transmitter and at the receivers must satisfy (22), and the proof is complete. ∎

V. CONCLUSION

We established achievable rate regions for two special classes of 3-receiver BCs with side information. We also found the capacity region of the 3-receiver less noisy BC when side information is available both at the transmitter and at the receivers.

REFERENCES

[1] S. Borade, L. Zheng, and M. Trott, "Multilevel broadcast networks," Int. Symp. on Inf. Theory, pp. 1151-1155, June 2007.
[2] J. Körner and K. Marton, "General broadcast channels with degraded message sets," IEEE Trans. Inf. Theory, vol. IT-23, pp. 60-64, Jan. 1977.
[3] C. Nair and A. El Gamal, "The capacity region of a class of 3-receiver broadcast channels with degraded message sets," IEEE Trans. Inf. Theory, vol. 55, no. 10, pp. 4479-4493, Oct. 2009.
[4] C. Nair and Z. V. Wang, "The capacity region of the three-receiver less noisy broadcast channel," IEEE Trans. Inf. Theory, vol. 57, pp. 4058-4062, July 2011.
[5] C. E. Shannon, "Channels with side information at the transmitter," IBM J. Res. Develop., vol. 2, pp. 289-293, Oct. 1958.
[6] S. I. Gel'fand and M. S. Pinsker, "Coding for channel with random parameters," Probl. Contr. Inform. Theory, vol. 9, no. 1, pp. 19-31, 1980.
[7] T. M. Cover and M. Chiang, "Duality between channel capacity and rate distortion with two-sided state information," IEEE Trans. Inf. Theory, vol. 48, pp. 1629-1638, June 2002.
[8] Y. Steinberg, "Coding for the degraded broadcast channel with random parameters, with causal and noncausal side information," IEEE Trans. Inf. Theory, vol. 51, no. 8, pp. 2867-2877, Aug. 2005.
[9] R. Khosravi-Farsani and F. Marvasti, "Capacity bounds for multiuser channels with non-causal channel state information at the transmitters," Inf. Theory Workshop (ITW), pp. 195-199, Oct. 2011.
[10] J. Körner and K. Marton, "Comparison of two noisy channels," Topics in Information Theory (ed. I. Csiszár and P. Elias), Keszthely, Hungary, pp. 411-423, Aug. 1975.
[11] C. Nair, "Capacity regions of two new classes of two-receiver broadcast channels," IEEE Trans. Inf. Theory, vol. 56, pp. 4207-4214, Sept. 2010.
[12] T. M. Cover and J. A. Thomas, Elements of Information Theory, John Wiley & Sons, 1991.
[13] A. El Gamal and Y.-H. Kim, Network Information Theory, Cambridge University Press, 2011.
[14] S. M. Ross, A First Course in Probability, 5th ed., Prentice Hall, 1997.
[15] T. M. Cover, "An achievable rate region for the broadcast channel," IEEE Trans. Inf. Theory, vol. IT-21, pp. 399-404, July 1975.
[16] Y. Steinberg and S. Shamai, "Achievable rates for the broadcast channel with states known at the transmitter," Int. Symp. on Inf. Theory, Adelaide, SA, pp. 2184-2188, 2005.