Transmitting data over noisy channels is an important and interesting problem with many real-life applications, such as wireless communications, 5G, and the Internet of Things (IoT). To address it, data is encoded with error-correcting codes prior to transmission; the added redundancy allows the receiver to recover the original message. Error-correcting codes are usually constructed in a specific manner to allow for efficient decoding. It is common practice to ignore the added noise and treat it as a nuisance. In this talk, we show how information about the noise can be exploited to aid the decoding process. We introduce Soft Guessing Random Additive Noise Decoding (SGRAND), a GRAND variant that decodes by guessing noise sequences rather than trying to identify the transmitted codeword directly. SGRAND is a Maximum Likelihood (ML) decoder in the soft-detection setting and, as is the case with all GRAND variants, is a universal decoder. We then discuss how Multiple Access Channels (MACs) can be decoded by separating the noise from the interference, which results in a Maximum a-Posteriori (MAP) decoder. Each sub-problem is solved with a dedicated algorithm: a SGRAND variant removes the effects of the noise, and ZigZag decoding resolves the interference on a putative noiseless channel. Finally, we show how noise estimation can be used to improve communications over orthogonal channels that are subject to correlated noise, through a method we call Noise Recycling.
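To give a flavor of the noise-guessing idea behind the GRAND family, the following is a minimal hard-detection sketch in Python. It is an illustration only, not the speaker's SGRAND implementation: the (7,4) Hamming parity-check matrix is an assumed example codebook, and noise patterns are tried in order of increasing Hamming weight, which is maximum-likelihood for a binary symmetric channel with crossover probability below 1/2.

```python
# Illustrative hard-detection GRAND sketch (assumed example, not SGRAND itself).
# Decoding guesses the NOISE: strip candidate noise patterns, most likely first,
# until the remaining word is a member of the codebook.
from itertools import combinations
import numpy as np

# Parity-check matrix of the (7,4) Hamming code, used here as an example codebook.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def is_codeword(word):
    """Codebook membership test: the syndrome must be all-zero."""
    return not (H @ word % 2).any()

def grand_decode(received, max_weight=7):
    """Try noise patterns in order of non-decreasing Hamming weight
    (most likely first on a BSC) and return the first codeword found."""
    n = len(received)
    for w in range(max_weight + 1):
        for positions in combinations(range(n), w):
            guess = received.copy()
            guess[list(positions)] ^= 1   # remove the putative noise
            if is_codeword(guess):
                return guess
    return None  # abandoned: no codeword within the weight budget

# Example: the all-zero codeword with one bit flipped by the channel.
rx = np.array([0, 0, 0, 0, 1, 0, 0])
print(grand_decode(rx))  # recovers the all-zero codeword
```

Because the decoder only queries codebook membership, the same loop works for any code; SGRAND refines the guessing order using soft channel information rather than Hamming weight.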
Amit Solomon is a Ph.D. student in the Department of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology (MIT). He works with Prof. Muriel Médard in the Network Coding and Reliable Communication group at the Research Laboratory of Electronics. His research interests include coding theory, information theory, communications, algorithms, and machine learning, among others. He received a B.Sc. (cum laude) and an M.Sc. in Electrical Engineering from the Technion – Israel Institute of Technology, in 2015 and 2018, respectively.