Electronic data

  • Chatzigeorgiou_and_Savostyanov_TVT_2023_05114

    Rights statement: ©2024 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

    Accepted author manuscript, 2.05 MB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License

Links

Text available via DOI:

Guessing random additive noise decoding of network coded data transmitted over burst error channels

Research output: Contribution to Journal/Magazine › Journal article › peer-review

E-pub ahead of print

Standard

Guessing random additive noise decoding of network coded data transmitted over burst error channels. / Chatzigeorgiou, Ioannis; Savostyanov, Dmitry.
In: IEEE Transactions on Vehicular Technology, 01.04.2024.

Vancouver

Chatzigeorgiou I, Savostyanov D. Guessing random additive noise decoding of network coded data transmitted over burst error channels. IEEE Transactions on Vehicular Technology. 2024 Apr 1. Epub 2024 Apr 1. doi: 10.1109/TVT.2024.3383546

BibTeX

@article{c074919331924150a3509b1dec065c0b,
title = "Guessing random additive noise decoding of network coded data transmitted over burst error channels",
abstract = "We consider a transmitter that encodes data packets using network coding and broadcasts coded packets. A receiver employing network decoding recovers the data packets if a sufficient number of error-free coded packets are gathered. The receiver does not abandon its efforts to recover the data packets if network decoding is unsuccessful; instead, it employs syndrome decoding (SD) in an effort to repair erroneous received coded packets, and then reattempts network decoding. Most decoding techniques, including SD, assume that errors are independently and identically distributed within received coded packets. Motivated by the guessing random additive noise decoding (GRAND) framework, we propose transversal GRAND (T-GRAND): an algorithm that exploits statistical dependence in the occurrence of errors, complements network decoding and recovers all data packets with a higher probability than SD. T-GRAND examines error vectors in order of their likelihood of occurring and altering the transmitted packets. Calculation and sorting of the likelihood values of all error vectors is a simple but computationally expensive process. To reduce the complexity of T-GRAND, we take advantage of the properties of the likelihood function and develop an efficient method, which identifies the most likely error vectors without computing and ordering all likelihood values.",
keywords = "network coding, Random linear codes, burst noise, syndrome decoding, guessing random additive noise decoding (GRAND), Gilbert-Elliott model, algorithm, Complexity analysis",
author = "Ioannis Chatzigeorgiou and Dmitry Savostyanov",
year = "2024",
month = apr,
day = "1",
doi = "10.1109/TVT.2024.3383546",
language = "English",
journal = "IEEE Transactions on Vehicular Technology",
issn = "0018-9545",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
}

RIS

TY - JOUR

T1 - Guessing random additive noise decoding of network coded data transmitted over burst error channels

AU - Chatzigeorgiou, Ioannis

AU - Savostyanov, Dmitry

PY - 2024/4/1

Y1 - 2024/4/1

N2 - We consider a transmitter that encodes data packets using network coding and broadcasts coded packets. A receiver employing network decoding recovers the data packets if a sufficient number of error-free coded packets are gathered. The receiver does not abandon its efforts to recover the data packets if network decoding is unsuccessful; instead, it employs syndrome decoding (SD) in an effort to repair erroneous received coded packets, and then reattempts network decoding. Most decoding techniques, including SD, assume that errors are independently and identically distributed within received coded packets. Motivated by the guessing random additive noise decoding (GRAND) framework, we propose transversal GRAND (T-GRAND): an algorithm that exploits statistical dependence in the occurrence of errors, complements network decoding and recovers all data packets with a higher probability than SD. T-GRAND examines error vectors in order of their likelihood of occurring and altering the transmitted packets. Calculation and sorting of the likelihood values of all error vectors is a simple but computationally expensive process. To reduce the complexity of T-GRAND, we take advantage of the properties of the likelihood function and develop an efficient method, which identifies the most likely error vectors without computing and ordering all likelihood values.

AB - We consider a transmitter that encodes data packets using network coding and broadcasts coded packets. A receiver employing network decoding recovers the data packets if a sufficient number of error-free coded packets are gathered. The receiver does not abandon its efforts to recover the data packets if network decoding is unsuccessful; instead, it employs syndrome decoding (SD) in an effort to repair erroneous received coded packets, and then reattempts network decoding. Most decoding techniques, including SD, assume that errors are independently and identically distributed within received coded packets. Motivated by the guessing random additive noise decoding (GRAND) framework, we propose transversal GRAND (T-GRAND): an algorithm that exploits statistical dependence in the occurrence of errors, complements network decoding and recovers all data packets with a higher probability than SD. T-GRAND examines error vectors in order of their likelihood of occurring and altering the transmitted packets. Calculation and sorting of the likelihood values of all error vectors is a simple but computationally expensive process. To reduce the complexity of T-GRAND, we take advantage of the properties of the likelihood function and develop an efficient method, which identifies the most likely error vectors without computing and ordering all likelihood values.

KW - network coding

KW - Random linear codes

KW - burst noise

KW - syndrome decoding

KW - guessing random additive noise decoding (GRAND)

KW - Gilbert-Elliott model

KW - algorithm

KW - Complexity analysis

U2 - 10.1109/TVT.2024.3383546

DO - 10.1109/TVT.2024.3383546

M3 - Journal article

JO - IEEE Transactions on Vehicular Technology

JF - IEEE Transactions on Vehicular Technology

SN - 0018-9545

ER -
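
The abstract describes the core GRAND principle: test putative noise patterns in decreasing order of likelihood, and accept the first pattern whose removal yields a valid codeword. T-GRAND itself orders error vectors by a Gilbert-Elliott burst-noise likelihood, which is not reproduced here; the following is only a minimal, self-contained sketch of the underlying GRAND loop for a memoryless binary symmetric channel (where likelihood order reduces to increasing Hamming weight), using the [7,4] Hamming code as an illustrative codebook. All names and the example received word are illustrative choices, not taken from the paper.

```python
import itertools

# Parity-check matrix of the [7,4] Hamming code (rows as bit tuples)
H = [(1, 0, 1, 0, 1, 0, 1),
     (0, 1, 1, 0, 0, 1, 1),
     (0, 0, 0, 1, 1, 1, 1)]

def syndrome(word):
    """Syndrome of a binary word under H, computed over GF(2)."""
    return tuple(sum(h * b for h, b in zip(row, word)) % 2 for row in H)

def grand_decode(y, max_weight=3):
    """Basic GRAND: try noise patterns e in order of increasing Hamming
    weight (most likely first on a BSC with crossover probability < 0.5)
    and return the first candidate y XOR e with zero syndrome, together
    with the guessed noise pattern e."""
    n = len(y)
    for w in range(max_weight + 1):
        for idx in itertools.combinations(range(n), w):
            e = [1 if i in idx else 0 for i in range(n)]
            c = [(yb + eb) % 2 for yb, eb in zip(y, e)]
            if syndrome(c) == (0, 0, 0):
                return c, e
    return None, None  # abandon guessing beyond max_weight

# Received word: the codeword 1011010 with its last bit flipped
y = [1, 0, 1, 1, 0, 1, 1]
c, e = grand_decode(y)
# The first zero-syndrome hit is the single-bit correction at position 6
```

The weight-ordered enumeration above is the part T-GRAND replaces: for burst-error channels, errors cluster, so the most likely patterns are no longer simply the lowest-weight ones, and the paper's contribution is an efficient way to visit patterns in the correct burst-aware likelihood order without computing and sorting all likelihood values.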