
IEEE COMMUNICATIONS LETTERS, VOL. 5, NO. 10, OCTOBER 2001

Decoding Low-Density Parity-Check Codes With Probabilistic Scheduling

Yongyi Mao, Student Member, IEEE, and Amir H. Banihashemi, Associate Member, IEEE

Manuscript received May 3, 2001. The associate editor coordinating the review of this letter and approving it for publication was Dr. M. Fossorier. This work was supported in part by the Natural Sciences and Engineering Research Council of Canada under a grant and in part by an OGSST scholarship. Y. Mao was with the Department of Systems and Computer Engineering, Carleton University, Ottawa, ON K1S 5B6, Canada. He is now with the Department of Electrical and Computer Engineering, University of Toronto, Toronto, ON M5S 3G4, Canada (e-mail: [email protected]). A. H. Banihashemi is with the Broadband Communication and Wireless Systems (BCWS) Center and the Department of Systems and Computer Engineering, Carleton University, Ottawa, ON K1S 5B6, Canada (e-mail: [email protected]). Publisher Item Identifier S 1089-7798(01)09051-2.

Abstract—In this letter, we present a new message-passing schedule for the decoding of low-density parity-check (LDPC) codes. This approach, designated the "probabilistic schedule," takes into account the structure of the Tanner graph (TG) of the code. We show by simulation that the new schedule offers a much better performance/complexity trade-off. This work also suggests that scheduling plays an important role in iterative decoding and that a schedule that matches the structure of the TG is desirable.

Index Terms—Belief propagation algorithm, codes, codes on graphs, decoding, iterative decoding, iterative methods, LDPC codes, message-passing schedule, sum–product algorithm.

I. INTRODUCTION

THE capacity-achieving low-density parity-check (LDPC) codes [1], [2] are conventionally decoded by an iterative message-passing algorithm, called the sum–product or belief propagation algorithm, operating on the Tanner graph (TG) of the code. The passing of messages in this algorithm follows the so-called flooding schedule: in each iteration, all symbol nodes, and subsequently all check nodes, pass new messages to their neighbors. For a cycle-free TG, applying this schedule along with the sum–product algorithm results in optimal a posteriori probability (APP) decoding [3]. However, cycle-free TGs do not support good codes [4], and the cycles in the TG of LDPC codes make sum–product decoding no longer optimal. This loss of optimality can be more severe for LDPC codes at short block lengths (10 000 bits or shorter), since their TGs often contain many small cycles and the sum–product algorithm is believed to perform further from the optimum in this case.

It has been noted that various schedules may be applied for iterative message-passing decoding (see, e.g., [3] and [5]). In the literature, however, to the best of our knowledge, only the application of the flooding schedule has been reported for the decoding of LDPC codes.

In this letter, we present a new scheduling scheme for the sum–product decoding of LDPC codes, called the probabilistic schedule.

Fundamentally different from the flooding schedule, the probabilistic schedule is tailored to the TG of the code. Roughly speaking, the probabilistic schedule updates the outgoing messages of a symbol node with an average frequency proportional to the length of the shortest cycle passing through that node. We show by simulation that this new scheduling scheme achieves a significant performance improvement over the flooding schedule, with similar or even lower complexity. Moreover, our work suggests that for a given code and a given TG of the code, the message-passing schedule is essential to the performance of the sum–product decoder, and a significant gain in the performance/complexity trade-off may be obtained by carefully designing a schedule based on a graph property relevant to the sub-optimality of the decoder.

II. PROBABILISTIC SCHEDULE

In a previous work [6], we defined the girth of a symbol node s on a TG as the length of the shortest cycle passing through s. We showed in [6] that the distribution of girths is an important entity associated with the TG of a code; this result relates the performance of iterative belief propagation algorithms to the structure of the underlying graph. In this work, we use girths to devise a TG-dependent decoding schedule.

To show how the girths of a TG can be exploited to improve scheduling, we start with a careful examination of the message-passing dynamics in the sum–product algorithm when the flooding schedule is used. Let g_max and g_min be the largest and the smallest girths in the TG, respectively. Assuming that node s in the TG has girth g_s, it takes g_s/2 iterations for a message sent from s to propagate back to node s itself. Before this number of iterations is reached, the message-passing operations at all symbol nodes and check nodes are performed optimally. The reason for this is that, assuming independent information bits and a memoryless channel, for each node, all the incoming messages and the initial message are independent of each other (indeed, it is precisely this independence that guarantees that the sum–product algorithm results in APP decoding for cycle-free graphs). However, at iteration g_min/2, the incoming messages to a node s with girth g_s = g_min (and to any other node with girth g_min) are, along its shortest cycle, dependent on the initial message at that node. At this point, when node s passes messages, the optimality of the algorithm is violated. Notice that at this iteration, the message-passing operations at nodes with larger girths still preserve their optimality. However, the violation of optimality at node s (and at the other nodes with girth g_min) affects the global optimality, in the sense that there is no longer any guarantee that the algorithm will converge to the APP solution.
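The letter does not spell out how the per-node girths are computed. The following is a minimal sketch, not from the paper and with illustrative names, of one way to obtain them, assuming the Tanner graph is given as adjacency lists over symbol and check nodes: the shortest cycle through a symbol node is found by taking, over each of its incident edges, the shortest path that closes that edge.

```python
from collections import deque

def girth_of_symbol_node(sym_adj, chk_adj, v):
    """Length of the shortest cycle through symbol node v in a Tanner graph.

    sym_adj[s]: iterable of check nodes adjacent to symbol node s
    chk_adj[c]: iterable of symbol nodes adjacent to check node c
    Returns float('inf') if no cycle passes through v.
    """
    best = float('inf')
    for c0 in sym_adj[v]:
        # Shortest cycle through edge (v, c0) = 1 + shortest path from v back
        # to c0 that avoids using the edge (v, c0) itself.
        dist = {('s', v): 0}
        queue = deque([('s', v)])
        while queue:
            kind, n = queue.popleft()
            neighbors = sym_adj[n] if kind == 's' else chk_adj[n]
            next_kind = 'c' if kind == 's' else 's'
            for m in neighbors:
                if kind == 's' and n == v and m == c0:
                    continue  # forbid the direct edge being closed into a cycle
                if (next_kind, m) not in dist:
                    dist[(next_kind, m)] = dist[(kind, n)] + 1
                    queue.append((next_kind, m))
        if ('c', c0) in dist:
            best = min(best, dist[('c', c0)] + 1)
    return best

def symbol_node_girths(sym_adj, chk_adj):
    """Girth of every symbol node, as used by the schedule sketched below."""
    return {v: girth_of_symbol_node(sym_adj, chk_adj, v) for v in sym_adj}
```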



A partial remedy to this problem could be to synchronize the times at which different nodes pass non-optimal messages. To implement this, node s (and the other nodes with the smallest girth) would not update their messages at iteration g_min/2 and would instead wait until iteration g_max/2 is reached. By the same token, any other node s', with girth g_s' < g_max, would stop updating its outgoing messages at iteration g_s'/2. Therefore, at iteration i, only nodes with girth greater than 2i pass new information. When iteration g_max/2 is reached, all the nodes are activated again and resume updating their outgoing messages. This strategy has two advantages. First, we maximize the number of iterations in which only "optimal" messages are passed through the graph (the maximum value is g_max/2). Second, at iteration g_max/2, when every node s resumes updating its outgoing messages, the incoming messages to s contain more "optimal" global information than they did at iteration g_s/2. In other words, information from farther away in the graph is input to s when s restarts sending new messages. It is also worth noting that this strategy is similar to the notion of "equally double counting" in [7], which results in an optimal final assignment of bits for TGs with single cycles (although the computed APP values are over-confident).

It is clear that there are many different deterministic ways to implement this strategy, including the one we just described. A more natural approach, however, is to implement the strategy randomly (probabilistically), hence the name "probabilistic schedule": to each symbol node s, we associate a probability p_s = g_s/g_max. At each iteration (except for the first), first each symbol node s updates its outgoing messages with probability p_s and stays idle with probability 1 - p_s; then all check nodes pass messages. For the first iteration, we follow the conventional flooding schedule, i.e., all symbol nodes pass messages and then all check nodes pass messages. This is to input all initial messages to the graph from the beginning of the decoding. With this probabilistic implementation, on average, over every g_max/2 iterations, a message sent by a node propagates back to the node itself once along its smallest cycle. In the following section, we will see how the performance of the iterative sum–product algorithm can be significantly improved using the probabilistic schedule.
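Before turning to the simulations, here is a minimal sketch of a sum–product decoder driven by this schedule. It is not from the letter: the LLR-domain formulation, the names, and the stopping test are illustrative, and it uses the update probability p_s = g_s/g_max discussed above together with the girth routine from the previous sketch.

```python
import math
import random

def decode_probabilistic(llr_ch, sym_adj, chk_adj, girths, is_codeword, max_iter=500):
    """Sum-product decoding with the probabilistic schedule (sketch).

    llr_ch[s]   : channel LLR of symbol node s (positive favors bit 0)
    sym_adj[s]  : checks adjacent to symbol s; chk_adj[c]: symbols adjacent to check c
    girths[s]   : girth of symbol node s (e.g., from symbol_node_girths above)
    is_codeword : callable on a hard-decision bit list, True if all checks are satisfied
    """
    g_max = max(girths.values())
    p = {s: girths[s] / g_max for s in sym_adj}   # assumed update probability p_s = g_s / g_max
    m_sc = {(s, c): 0.0 for s in sym_adj for c in sym_adj[s]}   # symbol -> check messages
    m_cs = {(c, s): 0.0 for c in chk_adj for s in chk_adj[c]}   # check -> symbol messages
    order = sorted(sym_adj)
    bits = [0] * len(order)

    for it in range(max_iter):
        # Symbol-node half-iteration: every node fires in the first iteration
        # (flooding); afterwards node s fires with probability p_s, else stays idle.
        for s in sym_adj:
            if it == 0 or random.random() < p[s]:
                for c in sym_adj[s]:
                    m_sc[(s, c)] = llr_ch[s] + sum(m_cs[(c2, s)] for c2 in sym_adj[s] if c2 != c)
        # Check-node half-iteration: all check nodes pass messages.
        for c in chk_adj:
            for s in chk_adj[c]:
                prod = 1.0
                for s2 in chk_adj[c]:
                    if s2 != s:
                        prod *= math.tanh(m_sc[(s2, c)] / 2.0)
                prod = min(max(prod, -0.999999), 0.999999)   # numerical guard for atanh
                m_cs[(c, s)] = 2.0 * math.atanh(prod)
        # Hard decision and stopping rule: stop as soon as a codeword is reached.
        bits = [0 if llr_ch[s] + sum(m_cs[(c, s)] for c in sym_adj[s]) >= 0 else 1
                for s in order]
        if is_codeword(bits):
            return bits, it + 1
    return bits, max_iter
```

Setting every p_s to 1 recovers the conventional flooding schedule, which is convenient for side-by-side comparisons.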





III. SIMULATION RESULTS

Simulations are performed on an LDPC code constructed using construction 2A of [1]. This code was designed for the transport of ATM cells in an industrial project, and its performance was previously reported in [6], [8]. In our simulations, BPSK-modulated codewords are transmitted through an AWGN channel, and each received vector is decoded by two sum–product decoders, one using the flooding schedule and the other the probabilistic schedule. The decoding is stopped when a codeword is reached or after 500 iterations. For both schedules, the bit error rate (BER) and the message error rate (MER) are plotted in Fig. 1, and the statistics for the number of decoding iterations are shown in Table I. The percentages of undetected and detected errors are listed in Table II. These data were obtained by decoding up to a fixed maximum number of simulated codewords at each signal-to-noise ratio (SNR).

Fig. 1. BER and MER for the probabilistic schedule (solid line) and the flooding schedule (dashed line).

TABLE I
STATISTICS OF THE NUMBER OF DECODING ITERATIONS

TABLE II
BREAKDOWN OF ERROR EVENTS INTO DETECTED AND UNDETECTED

It can be seen that the probabilistic schedule provides a significant improvement in performance over the conventional flooding schedule. As the SNR increases, the BER and MER for the probabilistic schedule decrease much more rapidly than those for the flooding schedule. At an SNR of 2.5 dB, compared with the flooding schedule, the error rates for the probabilistic schedule are reduced by more than an order of magnitude. It is also observed that, at low SNRs, most undetected errors produced by the conventional flooding schedule are removed by the new scheduling scheme (note that both decoders process exactly the same received vectors). This is particularly useful when a feedback channel is available and an ARQ scheme is employed.

We expect the performance improvement due to the probabilistic schedule to be larger for codes with more widely spread girth distributions. In fact, a related result in [9] shows that belief propagation decoding of difference set cyclic (DSC) codes outperforms that of LDPC codes with similar parameters. Noting that the girth distributions of DSC codes are typically much less spread than those of LDPC codes, we expect the new schedule to provide more improvement for the latter than for the former, and possibly to close the performance gap between the two.
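The following is a small sketch, not from the letter, of the Monte-Carlo loop behind such tables, following the setup described at the start of this section: BPSK over AWGN, every decoder run on the same received vector, and errors classified as undetected when a decoder converges to a valid but wrong codeword. The encoder, the codeword check, and the SNR convention (taken here as Eb/N0) are assumptions for illustration.

```python
import math
import random

def simulate_snr_point(encode, is_codeword, decoders, n, k, snr_db, num_words):
    """One SNR point: run every decoder in `decoders` (a dict name -> function
    taking channel LLRs and returning (bits, iterations)) on the same noisy
    received vectors and tally bit errors, word errors, and undetected errors."""
    rate = k / n
    sigma = math.sqrt(1.0 / (2.0 * rate * 10.0 ** (snr_db / 10.0)))  # SNR assumed to be Eb/N0 in dB
    stats = {name: {"bit_err": 0, "word_err": 0, "undetected": 0} for name in decoders}
    for _ in range(num_words):
        info = [random.randint(0, 1) for _ in range(k)]
        cw = encode(info)                      # assumed encoder for the code under test
        # BPSK (0 -> +1, 1 -> -1) over AWGN; channel LLR = 2 r / sigma^2.
        llr = [2.0 * ((1 - 2 * b) + random.gauss(0.0, sigma)) / sigma ** 2 for b in cw]
        for name, decode in decoders.items():
            bits, _ = decode(llr)
            errors = sum(a != b for a, b in zip(bits, cw))
            if errors:
                stats[name]["bit_err"] += errors   # BER = bit_err / (num_words * n)
                stats[name]["word_err"] += 1       # MER = word_err / num_words
                if is_codeword(bits):              # wrong codeword reached: undetected error
                    stats[name]["undetected"] += 1
    return stats
```

Here `decoders` would hold the probabilistic-schedule routine sketched above and a flooding variant of it (the same routine with every p_s set to 1).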

The average number of decoding iterations for the new scheme is slightly greater than that for the flooding schedule (by fewer than four iterations at all SNRs). However, with the new schedule, since not all nodes update messages in each iteration, the average total number of computations is estimated to be comparable to, or even smaller than, that required for the flooding schedule. In fact, in each iteration the new schedule updates only Σ_s p_s d_s symbol-to-check messages on average, as compared to E = Σ_s d_s messages under the conventional flooding schedule, where E is the total number of edges in the TG and d_s is the degree of symbol node s.

Simulations have also been performed on LDPC codes with other rates and block lengths, and similar results are observed.
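Returning to the per-iteration cost just estimated, here is a tiny sketch, under the same assumed form p_s = g_s/g_max, that evaluates the two message counts being compared.

```python
def symbol_update_counts(sym_adj, girths):
    """Average number of symbol-to-check messages updated per iteration:
    flooding touches every edge (E = sum of symbol-node degrees), while the
    probabilistic schedule touches the d_s edges of node s only when s fires,
    i.e. with probability p_s = g_s / g_max."""
    g_max = max(girths.values())
    flooding = sum(len(sym_adj[s]) for s in sym_adj)                        # E
    probabilistic = sum(girths[s] / g_max * len(sym_adj[s]) for s in sym_adj)
    return flooding, probabilistic
```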

IV. DISCUSSION AND CONCLUSION

Taking into account the structure of the TG, we propose a new message-passing schedule, called the "probabilistic schedule," for decoding LDPC codes. We show by simulation that the probabilistic schedule, at no cost in complexity, can improve substantially upon the conventional flooding schedule, both in decreasing the overall BER and MER and in reducing the number of undetected errors. Our work also shows that scheduling plays an important role in the performance of iterative belief-propagation decoders. For a given TG, there appears to be great potential for performance improvement, particularly at short block lengths, through careful design of the message-passing schedule. The results of this work confirm that, in order to push the performance of iterative belief-propagation algorithms to their limit, a properly devised message-passing schedule that matches the underlying graph is a necessity. We hope that this work, as the first attempt in this direction, inspires more interest and research in this area.

REFERENCES

[1] D. J. C. MacKay and R. M. Neal, "Near Shannon limit performance of low density parity check codes," Electron. Lett., vol. 33, no. 6, pp. 457–458, Mar. 1997.
[2] T. J. Richardson, M. A. Shokrollahi, and R. L. Urbanke, "Design of provably good low density parity check codes," IEEE Trans. Inform. Theory, vol. 47, pp. 619–637, Feb. 2001.
[3] G. D. Forney, Jr., "On iterative decoding and the two-way algorithm," in Proc. Int. Symp. on Turbo Codes and Related Topics, Brest, France, Sept. 1997, pp. 12–25.
[4] T. Etzion, A. Trachtenberg, and A. Vardy, "Which codes have cycle-free Tanner graphs?," IEEE Trans. Inform. Theory, vol. 45, pp. 2173–2181, Sept. 1999.
[5] F. R. Kschischang and B. J. Frey, "Iterative decoding of compound codes by probability propagation in graphical models," IEEE J. Select. Areas Commun., vol. 16, pp. 219–230, Feb. 1998.
[6] Y. Mao and A. H. Banihashemi, "A heuristic search for good low-density parity-check codes at short block lengths," in Proc. ICC 2001, Helsinki, Finland, June 2001, pp. 41–44.
[7] Y. Weiss, "Correctness of local probability propagation in graphical models with loops," Neural Comput., vol. 12, pp. 1–41, 2000.
[8] Y. Mao, A. Banihashemi, and M. Landolsi, "Comparison between low-density parity-check codes and turbo product codes for delay and complexity sensitive applications," in Proc. 20th Biennial Symp. Commun., Kingston, Canada, May 2000, pp. 151–153.
[9] R. Lucas, M. P. C. Fossorier, Y. Kou, and S. Lin, "Iterative decoding of one-step majority logic decodable codes based on belief propagation," IEEE Trans. Commun., vol. 48, pp. 931–937, June 2000.
