
Guessing Random Additive Noise Decoding (GRAND)

Muriel Médard
Cecil H. Green Professor in Electrical Engineering and Computer Science
Massachusetts Institute of Technology
Mercer Distinguished Lecture Series
https://rensselaer.webex.com/rensselaer/j.php?MTID=m8f5e68d06fb651f5f11fcd6044129f8b
Wed, November 18, 2020 at 4:00 PM

Claude Shannon's 1948 "A Mathematical Theory of Communication" provided the basis for the digital communication revolution. As part of that ground-breaking work, he identified the greatest rate (capacity) at which data can be communicated over a noisy channel. He also provided an algorithm for achieving it, based on random codes and a code-centric Maximum Likelihood (ML) decoding, where channel outputs are compared to all possible codewords to select the most likely candidate given the observed output. Despite its mathematical elegance, his algorithm is impractical from a complexity perspective, and much work in the intervening 70 years has focused on co-designing codes and decoders that enable reliable communication at high rates.
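
As a point of reference for the noise-centric algorithm described next, the following is a minimal sketch of code-centric ML decoding for a binary symmetric channel, where comparing the channel output against all codewords amounts to choosing the codeword at minimum Hamming distance; the function name and the toy codebook are illustrative choices, not part of the lecture material.

    def ml_decode_code_centric(received, codebook):
        """Brute-force ML decoding for a binary symmetric channel with
        crossover probability p < 1/2: scan the whole codebook and keep
        the codeword at minimum Hamming distance from the received word."""
        def hamming(a, b):
            return sum(x != y for x, y in zip(a, b))
        return min(codebook, key=lambda c: hamming(received, c))

    # Toy usage: a repetition code on two message bits (6-bit codewords).
    codebook = {(a, a, a, b, b, b) for a in (0, 1) for b in (0, 1)}
    print(ml_decode_code_centric((1, 0, 1, 0, 0, 1), codebook))  # -> (1, 1, 1, 0, 0, 0)

The work here grows with the size of the codebook, which is what makes this approach impractical at the code lengths and rates of interest.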

We introduce a new algorithm for noise-centric, rather than code-centric, ML decoding. The algorithm is based on the principle that the receiver rank-orders noise sequences from most likely to least likely and guesses them accordingly. Noise sequences are subtracted from the received signal in that order, and the first result that is an element of the code-book is the ML decoding. For common additive noise channels, we establish that the algorithm is capacity-achieving for uniformly selected code-books, providing an intuitive alternate approach to the channel coding theorem. We illustrate the practical usefulness of our approach and the fact that it renders the decoding of random codes feasible. For the sorts of channels generally used in commercial applications, the decoding complexity is quite low, unlike that of code-centric ML.
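
The guessing procedure can be sketched in a few lines. The sketch below assumes a binary symmetric channel with crossover probability below 1/2, so the maximum-likelihood order of noise guesses is simply increasing Hamming weight; the function name, the max_weight abandonment parameter and the toy codebook are illustrative choices, not the authors' implementation.

    from itertools import combinations

    def grand_decode(received, codebook, max_weight=None):
        """Noise-guessing decoding for a binary symmetric channel with
        crossover probability p < 1/2, where lower-weight error patterns
        are more likely.  Error patterns are guessed from most likely to
        least likely; the first guess whose removal leaves a codeword is
        returned, and that codeword is the ML decoding."""
        n = len(received)
        if max_weight is None:
            max_weight = n
        for weight in range(max_weight + 1):              # most likely weight first
            for flips in combinations(range(n), weight):  # all patterns of this weight
                candidate = list(received)
                for i in flips:                           # subtract the guessed noise
                    candidate[i] ^= 1
                if tuple(candidate) in codebook:          # codebook membership query
                    return tuple(candidate)
        return None                                       # abandoned: no codeword found

    # Toy usage with the same repetition code as above.
    codebook = {(a, a, a, b, b, b) for a in (0, 1) for b in (0, 1)}
    print(grand_decode((1, 0, 1, 0, 0, 1), codebook))     # -> (1, 1, 1, 0, 0, 0)

Only a code-book membership query is needed per guess, which is why the same procedure applies to any code, including randomly constructed ones.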


This work is joint with Ken Duffy (Maynooth University).

Muriel Médard

Muriel Médard is the Cecil H. Green Professor in the Electrical Engineering and Computer Science (EECS) Department at MIT and leads the Network Coding and Reliable Communications Group at the Research Laboratory of Electronics at MIT. She has served as editor for many publications of the Institute of Electrical and Electronics Engineers (IEEE), of which she was elected Fellow, and she has served as Editor-in-Chief of the IEEE Journal on Selected Areas in Communications. She was President of the IEEE Information Theory Society in 2012 and served on its Board of Governors for eleven years. She has served as technical program committee co-chair of many of the major conferences in information theory, communications and networking. She received the 2019 Best Paper Award of the IEEE Transactions on Network Science and Engineering, the 2009 IEEE Communications Society and Information Theory Society Joint Paper Award, the 2009 William R. Bennett Prize in the Field of Communications Networking, the 2002 IEEE Leon K. Kirchmayer Prize Paper Award, the 2018 ACM SIGCOMM Test of Time Paper Award and several conference paper awards. She was co-winner of the MIT 2004 Harold E. Edgerton Faculty Achievement Award, received the 2013 EECS Graduate Student Association Mentor Award and served as undergraduate Faculty in Residence for seven years. In 2007 she was named a Gilbreth Lecturer by the U.S. National Academy of Engineering. She received the 2016 IEEE Vehicular Technology James Evans Avant Garde Award, the 2017 Aaron Wyner Distinguished Service Award from the IEEE Information Theory Society and the 2017 IEEE Communications Society Edwin Howard Armstrong Achievement Award. She is a member of the National Academy of Inventors. In 2020 she was elected Member of the National Academy of Engineering for her contributions to the theory and practice of network coding. She has co-founded CodeOn, Steinwurf and Chocolate Cloud for technology transfer of network coding.