Chip eliminates the need for code-specific decoding hardware

Researchers at MIT, Boston University, and Maynooth University in Ireland have created the first silicon chip able to decode any code with maximum accuracy, using a universal decoding algorithm called Guessing Random Additive Noise Decoding (GRAND). The algorithm increases efficiency for applications in augmented and virtual reality, gaming, 5G networks, and connected devices that rely on processing a high volume of data with minimal delay.

GRAND works by guessing the noise that affected a message and using that noise pattern to deduce the original information. It generates a series of noise sequences in the order in which they are likely to occur, subtracts each from the received data, and checks whether the resulting codeword appears in a codebook.
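To make that guess-and-check loop concrete, here is a minimal sketch of GRAND over a binary symmetric channel, where lower-weight noise patterns are more likely and are therefore guessed first. The toy codebook, codeword length, and `max_weight` abandonment cutoff are illustrative assumptions, not details of the actual chip.

```python
from itertools import combinations

def grand_decode(received, codebook, max_weight=None):
    """Guess noise patterns from most to least likely (lowest Hamming
    weight first on a binary symmetric channel), XOR each guess off the
    received word, and return the first result found in the codebook."""
    n = len(received)
    if max_weight is None:
        max_weight = n
    for weight in range(max_weight + 1):       # weight 0 = "no noise"
        for flips in combinations(range(n), weight):
            candidate = list(received)
            for i in flips:                    # subtract (XOR) the guessed noise
                candidate[i] ^= 1
            candidate = tuple(candidate)
            if candidate in codebook:          # codebook membership check
                return candidate
    return None                                # abandon past max_weight

# Tiny (4,2) linear code used purely for illustration.
codebook = {(0,0,0,0), (1,1,0,1), (0,1,1,1), (1,0,1,0)}
print(grand_decode((1,1,0,0), codebook))       # -> (1, 1, 0, 1)
```

Note that nothing in the loop depends on how the codebook was constructed; that structure-independence is what makes the approach universal.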

The GRAND chip is designed to switch seamlessly between two codebooks. It contains two static random-access memory chips: one cracks codewords while the other loads a new codebook, and the chip then switches to decoding from the freshly loaded memory with no downtime. The researchers found the chip could effectively decode any moderate-redundancy code up to 128 bits in length with only about a microsecond of latency.
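The two-memory arrangement is essentially a double-buffering (ping-pong) scheme. A minimal software analogue, with illustrative class and method names rather than the chip's actual design, might look like this:

```python
class PingPongCodebooks:
    """One bank serves the decoder while the other is loaded with the
    next codebook; swapping roles takes effect immediately."""
    def __init__(self, initial_codebook):
        self.banks = [set(initial_codebook), set()]  # the two memories
        self.active = 0                              # bank the decoder reads

    def load_next(self, new_codebook):
        # Fill the idle bank while the active one keeps serving lookups.
        self.banks[1 - self.active] = set(new_codebook)

    def switch(self):
        # Swap roles; subsequent lookups hit the freshly loaded bank.
        self.active = 1 - self.active

    def contains(self, word):
        return word in self.banks[self.active]

banks = PingPongCodebooks({(0,0,0,0), (1,1,0,1), (0,1,1,1), (1,0,1,0)})
banks.load_next({(0,0,0,0), (1,1,1,1)})  # stage a different code
banks.switch()                           # seamless handover
print(banks.contains((1,1,1,1)))         # True: new codebook is live
```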

The chip works with legacy codes and could also be used with codes that haven't yet been introduced. During 5G standardization, regulators and communications companies struggled to reach consensus on which codes should be used, ultimately choosing two types of traditional codes for different situations in 5G infrastructure. Using GRAND could eliminate the need for that rigid standardization in the future.

Original Release: MIT
