SCANNING THE ISSUE

Turbo-Information Processing: Algorithms, Implementations & Applications

BY CLAUDE BERROU, Member IEEE, JOACHIM HAGENAUER, Fellow IEEE, MARCO LUISE, Senior Member IEEE, CHRISTIAN SCHLEGEL, AND LUC VANDENDORPE, Fellow IEEE

Over the last 15 years, information theory, the discipline concerned with the reliable transmission and processing of digital information, has been enriched by a novel concept with widespread ramifications. Whether we call it belief propagation, message passing, or, more commonly, the turbo principle, the concept is simple, and probably as old as time itself: rather than trying to treat a complex problem in just one (complex) step, divide it up into several small, simple problems that can be treated in a locally optimal way. If we do this correctly, by getting the local processors to communicate with each other, then the global solution can be close to optimal.

The invention of turbo codes in the early 1990s, and their first public presentation in 1993 by Berrou, (the late) Glavieux, and Thitimajshima, showed the pertinence of this approach for communications. By associating two small convolutional codes with low decoding complexity and applying an iterative decoding mechanism that exchanges probabilities between the two elementary decoders, error correction in close agreement with Shannon's theory was achieved. This revolutionary result took the coding community by storm. A new code construction from well-known components was what was needed to usher in a revolution. The interleaving function, a well-known component in many communications systems, was used to "concatenate" the two simple encoders, and turned out to be an essential ingredient in designing a good turbo code: it plays a key role in important code properties such as the minimum distance and the asymptotic performance behavior.

The ideas enabling turbo coding and decoding were rapidly applied to signal processing functions other than error control coding. Once the notion of iterative decoding with probabilistic information exchange was established, researchers took a new look at signal demodulation, equalization, and synchronization, for example, using such "soft" information exchange to enhance performance. However, it is in error control coding that the principles embraced in turbo coding came to their first and fullest fruition.

The shock wave that turbo codes caused in the scientific community also led to a reexamination of the work of Gallager who, at the beginning of the 1960s, had invented a family of error control codes whose defining characteristic, the low density of their parity-check matrix, turned out to make them ideally suited to probabilistic message-passing decoding. These codes, called low-density parity-check (LDPC) codes, are currently the focus of very active research.

Digital Object Identifier: 10.1109/JPROC.2007.896520

Proceedings of the IEEE | Vol. 95, No. 6, June 2007
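To make the exchange of probabilistic ("soft") information concrete, here is a minimal, self-contained sketch (a toy illustration of ours, not taken from any paper in this issue; the channel values are invented): a length-3 single-parity-check code is decoded with one pass of the sum-product "tanh rule," in which each bit receives extrinsic information computed from the other bits' channel log-likelihood ratios.

```python
import math

def extrinsic_llr(llrs, i):
    """Extrinsic LLR for bit i from a single parity check over all bits:
    L_ext = 2 * atanh( prod_{j != i} tanh(L_j / 2) )  (the 'tanh rule')."""
    prod = 1.0
    for j, L in enumerate(llrs):
        if j != i:
            prod *= math.tanh(L / 2.0)
    return 2.0 * math.atanh(prod)

# Channel LLRs for the all-zero codeword of a length-3 single parity check
# (positive LLR favors bit value 0). The third bit was received in error.
channel = [2.0, 1.5, -0.5]

# A posteriori LLR = channel (intrinsic) + extrinsic information
posterior = [L + extrinsic_llr(channel, i) for i, L in enumerate(channel)]
decoded = [0 if L > 0 else 1 for L in posterior]
print(decoded)  # the parity constraint corrects the third bit -> [0, 0, 0]
```

With several parity checks sharing bits, the same extrinsic computation is simply iterated between checks, which is the message-passing decoding described above.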


Decoding by exchanging messages is a nonlinear dynamic process with a large number of participating variables. This is why a mathematical analysis is difficult and why only a few theoretical performance prediction and assessment tools exist. As Calderbank aptly put it in his article "The art of signaling: Fifty years of coding theory" (IEEE Trans. Inf. Theory, vol. 44, no. 6, pp. 2561–2595, Oct. 1998): "It is interesting to observe that the search for theoretical understanding of turbo codes has transformed coding theorists into experimental scientists." Although understanding turbo decoding or the belief propagation process effectively requires a deep sense of physics, designing codes still remains mainly a question of mathematics and calls on algebra, algebraic geometry, graph theory, and statistics.

This special issue of the Proceedings of the IEEE reflects the variety of concepts used and the practical solutions found in designing good turbo codes and the decoding process for them. It is, at a specific moment, a doubtless incomplete snapshot of a vast field of research that has evolved greatly in the last few years and whose progress continues unabated.

The first paper, by Costello and Forney, takes the reader on an exciting journey through the 60-year history of error control coding. From simple Hamming codes to highly sophisticated turbo and LDPC codes, the road to channel capacity was not a straight line: it meandered between algebraic and convolutional coding, exploring both binary (hard) and probabilistic (soft) decoding. We wish the reader a pleasant journey with Costello and Forney, who have both long been important contributors to the discipline.

In the first part of their paper, Hanzo et al. present the basic principles of turbo coding and decoding, paying particular attention to the maximum a posteriori (MAP) component algorithm, along with its simplified log-MAP and max-log-MAP versions. The importance of various coding/decoding parameters is highlighted and explained. The second part of the paper widens the application of the turbo principle to other transceiver functions, such as modulation, equalization, space-time coding, and multiuser detection.

As in any processing device, high throughput and low complexity in iterative decoders are fundamentally conflicting requirements. Boutillon et al. detail the solutions currently adopted in turbo decoders to obtain favorable compromises. After a general introduction to the iterative decoder structure and its components, the authors develop various forms of parallelism that can speed up the iterative process, along with a complexity evaluation of these techniques. Other aspects of decoding latency, such as stopping criteria, are also discussed in this paper.

The paper by Gracie and Hamon addresses the myriad issues that arise in the actual application of turbo coding to practical systems. They discuss specific codes that have been chosen for various standards, as well as examples of integrated realizations of these codes. Detailed design results for high-minimum-distance codes, together with practical implications such as computational effort and memory requirements, are examined in detail. They present examples of commercially available implementations and the platforms they run on, as well as a comprehensive discussion of standards that employ turbo coding, from satellite communications to terrestrial wireless services. The practicing engineer will find this paper a valuable resource.

The paper by Vucetic et al. on recent advances in turbo code design and applications exposes the interleaver as the "heart" of the code, presented as the code-defining component. An in-depth explanation of the design targets for interleavers is followed by practical design methods and examples. The paper then continues its exploration of the structure of turbo codes by showing their close relationship to LDPC codes.
Recasting turbo codes as LDPC codes and, conversely, viewing LDPC codes in a turbo-code format stirs in the reader the realization that these apparently different code families are two sides of the same coin. The paper thus reinforces the distributed-processing concept of modern error control coding from the interleaver point of view.

Message-passing algorithms, which, as stated above, were introduced for the decoding of LDPC codes, can be used to solve a variety of parameter estimation and signal processing problems. The paper by Loeliger et al. shows how many signal processing problems that can be formulated in terms of a linear state-space model can be treated with suitable recursive message passing. In particular, they show that recursive least squares, linear minimum-mean-squared-error estimation, and Kalman filtering algorithms can be rederived and implemented following this approach (insofar as Gaussian message passing is concerned), and that message passing encourages one to "mix and match" different, apparently unrelated algorithmic techniques, such as steepest descent and expectation-maximization.

The paper by Herzet et al. highlights the important ancillary question of synchronization. Because turbo-coded systems operate at extremely low signal-to-noise ratios, their synchronization is no longer a simple task. This paper presents a compelling exposition of novel code-aided synchronization techniques that interplay naturally with the iterative decoding algorithm of the codes themselves. The paper gives a thorough introduction and leads the reader into a fascinating discussion of synchronization strategies based on expectation-maximization, gradient methods, and the sum-product algorithm. These are examined for their suitability for turbo codes using various illustrative examples. A comprehensive bibliography completes this paper.

Anastasopoulos et al. further elaborate on techniques for iterative signal detection when unknown parameters are embedded in the received signal in addition to the digital data. An example of such a scenario is the detection of a turbo code on a time-varying (fading) wireless communications channel. The authors show how this problem can be cast into one of signal detection on a channel with memory, and how it can be solved with the aid of suitable generalizations of the message-passing algorithm. One of the main results they provide is a proof that such iterative algorithms in some cases have only polynomial complexity.

The "turbo principle" (namely, iterative decoding involving two or more component decoders) can be applied to any combination of data processors as long as sufficient statistical independence can be guaranteed among the messages communicated; this independence is typically provided by the interleaver. The two encoders could be, for example, a source encoder and a channel encoder, respectively. This idea is taken up in the paper by Jaspar et al., who describe turbo joint source-channel coding and give an overview of existing and new work employing lossless compression of discrete-valued sources. The methods used to describe the setup are factor graphs and message passing.

As stated above, the understanding of the turbo mechanism is still a very hot topic. A question that particularly puzzles engineers and modem designers is, "What is actually the objective function targeted by a turbo receiver, or for what criterion is the turbo receiver actually optimum?" The paper by Regalia and Walsh provides, for the particular case of turbo decoding, an excellent overview and comparison of two optimality formulations: one originating in statistical physics and based on Bethe free energy optimization, and the other on constrained maximum-likelihood estimation. The authors offer a very nice tutorial presentation of the two approaches and show the duality link between them.
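As a small aside illustrating the Gaussian message-passing view discussed in connection with the paper by Loeliger et al., the sketch below (our own simplification; the scalar random-walk model and all numerical values are illustrative assumptions, not taken from the paper) shows how forward sum-product message passing with Gaussian messages, each summarized by a mean and a variance, reduces to the familiar Kalman predict/update recursion.

```python
import random

def kalman_step(m, v, y, q, r):
    """One forward Gaussian message-passing step on a scalar random walk
    x_k = x_{k-1} + w_k (Var q), observed as y_k = x_k + n_k (Var r).
    The forward message is summarized by its mean m and variance v."""
    # Predict: pass the message through the state-transition node
    v_pred = v + q
    # Update: multiply by the Gaussian likelihood message from y_k
    k = v_pred / (v_pred + r)            # Kalman gain
    m_new = m + k * (y - m)
    v_new = (1.0 - k) * v_pred
    return m_new, v_new

random.seed(1)
truth, q, r = 0.0, 0.01, 1.0
m, v = 0.0, 100.0                        # vague Gaussian prior
for _ in range(200):
    truth += random.gauss(0.0, q ** 0.5)   # hidden random walk
    y = truth + random.gauss(0.0, r ** 0.5)
    m, v = kalman_step(m, v, y, q, r)
print(round(m - truth, 2), round(v, 3))  # small error, small posterior variance
```

Vector-valued state-space models follow the same pattern, with means and covariance matrices in place of scalars.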

ABOUT THE GUEST EDITORS

Claude Berrou (Member, IEEE) was born in Penmarc'h, France, in 1951. In 1978, he joined the École Nationale Supérieure des Télécommunications (ENST) de Bretagne, Brest Cedex 3, France, where he is currently a Professor in the Electronics Department. In the early 1980s, he started up the training and research activities in VLSI technology and design, to meet the growing demand from industry for microelectronics engineers. Some years later, he took an active interest in the field of algorithm/silicon interaction for digital communications. In collaboration with Prof. Alain Glavieux, he introduced the concept of probabilistic feedback into error-correcting decoders and developed a new family of quasi-optimal error correction codes, which he nicknamed turbo codes. He also pioneered the extension of the turbo principle to joint detection and decoding processing. He is author or coauthor of eight registered patents and about 60 publications in the field of digital communications and electronics. His current research topics, besides algorithm/silicon interaction, are electronics and digital communications at large, error correction codes, turbo codes and iterative processing, soft-in/soft-out (probabilistic) decoders, etc. Prof. Berrou has received several distinctions, among which are the 1997 SEE Médaille Ampère, the 1998 IEEE (Information Theory) Golden Jubilee Award for Technological Innovation, the 2003 IEEE Richard W. Hamming Medal, the 2003 Grand Prix France Télécom de l'Académie des sciences, and the 2005 Marconi Prize.

Joachim Hagenauer (Fellow, IEEE) received degrees from the Technical University of Darmstadt, Darmstadt, Germany. He held a postdoctoral fellowship at the IBM T. J. Watson Research Center, Yorktown Heights, NY, working on error-correction coding for magnetic recording.
Later he became a Director of the Institute for Communications Technology at the German Aerospace Research Center (DLR), and since 1993 he has held a chaired professorship at the Technical University of Munich, Munich, Germany. During 1986–1987 he spent a sabbatical year as an Otto Lilienthal Fellow at Bell Laboratories, Crawford Hill, NJ, working on joint source/channel coding and on trellis-coded modulation for wireless systems. He served as an editor and guest editor for the IEEE and for the European Transactions on Telecommunications (ETT). His research interests concentrate on the turbo principle in communications and on the application of communication principles to genetics. Prof. Hagenauer is a Distinguished Lecturer of the IEEE. He served as President of the IEEE Information Theory Society. Among other awards, he received the E. H. Armstrong Award of IEEE COMSOC in 1996, the IEEE Alexander Graham Bell Medal in 2003, and an Honorary Doctorate from the University of Erlangen-Nuremberg in 2006.

Marco Luise (Senior Member, IEEE) received the M.Eng. and Ph.D. degrees in electronic engineering from the University of Pisa, Pisa, Italy. He was a Research Fellow of the European Space Agency (ESA) at ESTEC, Noordwijk, The Netherlands, and a Researcher of the Italian National Research Council (CNR) at the CSMDR, Pisa. He is currently a Full Professor of Telecommunications at the University of Pisa. He has authored more than 150 publications in international journals and contributions to major international conferences, and holds a few international patents. His main research interests lie in the area of wireless communications, with particular emphasis on CDMA/multicarrier signals and on satellite communications and positioning. Prof. Luise co-chaired four editions of the Tyrrhenian International Workshop on Digital Communications, was the General Chairman of the URSI Symposium ISSSE'98, and was the General Chairman of EUSIPCO 2006 in Florence. He served as Editor for Synchronization of the IEEE TRANSACTIONS ON COMMUNICATIONS and as Editor for Communications Theory of the European Transactions on Telecommunications. He is co-Editor-in-Chief of the recently founded International Journal of Navigation and Observation, and acts as General Secretary of the Italian association GTTI, Gruppo Telecomunicazioni e Teoria dell'Informazione.


Christian Schlegel received the Dipl.El.Ing. ETH degree from the Federal Institute of Technology, Zurich, Switzerland, in 1984, and the M.S. and Ph.D. degrees in electrical engineering from the University of Notre Dame, Notre Dame, IN, in 1986 and 1989, respectively. He held academic positions at the University of South Australia, the University of Texas, and the University of Utah, Salt Lake City. In 2001 he was appointed chaired Professor for Digital Communications at the University of Alberta, Canada, heading a large research laboratory funded by the Alberta Informatics Circle of Research Excellence (iCORE). His research focuses on the areas of error control coding and applications, multiple access communications, modulation, and detection, as well as analog and digital implementations of communications systems. He is the author of the research monographs Trellis Coding and Trellis and Turbo Coding (IEEE/Wiley), as well as of Coordinated Multiple User Communications (Springer), coauthored with Prof. A. Grant. Dr. Schlegel received a 1997 Career Award and a Canada Research Chair in 2001. He is Associate Editor for coding theory and techniques for the IEEE TRANSACTIONS ON COMMUNICATIONS, has served as technical program co-chair of the IEEE Information Theory Workshop (ITW 2001) and of the IEEE International Symposium on Information Theory (ISIT 2005), and as general chair of the IEEE Communications Theory Workshop (CTW 2005). He has served on numerous other IEEE conference program committees and is an IEEE Distinguished Lecturer.

Luc Vandendorpe (Fellow, IEEE) was born in Mouscron, Belgium, in 1962. He received the electrical engineering degree (summa cum laude) and the Ph.D. degree from the Université catholique de Louvain (UCL), Louvain-la-Neuve, Belgium, in 1985 and 1991, respectively. Since 1985, he has been with the Communications and Remote Sensing Laboratory of UCL. In 1992, he was a Visiting Scientist and Research Fellow with the Telecommunications and Traffic Control Systems Group of the Delft University of Technology, The Netherlands. Presently he is a Full Professor and Head of the Electrical Engineering Department of UCL. He is mainly interested in digital communication systems: equalization, joint detection/synchronization for CDMA, OFDM (multicarrier), MIMO and turbo-based communications systems (UMTS, xDSL, WLAN, etc.), and joint source/channel (de)coding. Prof. Vandendorpe was corecipient of the Biennial Alcatel-Bell Award from the Belgian National Science Foundation (NSF) in 1990. In 2000, he was corecipient (with J. Louveaux and F. Deryck) of the Biennial Siemens Award from the Belgian NSF. He is or has been a TPC member for IEEE VTC Fall 1999, the IEEE Globecom 2003 Communications Theory Symposium, the 2003 Turbo Symposium, IEEE VTC Fall 2003, IEEE SPAWC 2005, and IEEE SPAWC 2006. He was co-technical chair (with P. Duhamel) of IEEE ICASSP 2006. He is an elected member of the Sensor Array and Multichannel Signal Processing committee of the IEEE Signal Processing Society and an Associate Editor of the IEEE TRANSACTIONS ON SIGNAL PROCESSING.
