We propose a data-driven approach to explicitly learn the progressive encoding of a continuous source, which is successively decoded with increasing levels of quality and with the aid of correlated side information. This setup refers to the successive refinement of the Wyner-Ziv coding problem. Assuming ideal Slepian-Wolf coding, our approach employs recurrent neural networks (RNNs) to learn layered encoders and decoders for the quadratic Gaussian case. The models are trained by minimizing a variational bound on the rate-distortion function of the successively refined Wyner-Ziv coding problem. We demonstrate that RNNs can explicitly retrieve layered binning solutions akin to scalable nested quantization. Moreover, the rate-distortion performance of the scheme is on par with the corresponding monolithic Wyner-Ziv coding approach and is close to the rate-distortion bound.
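For the quadratic Gaussian case mentioned in the abstract, the Wyner-Ziv rate-distortion function has a known closed form, R_WZ(D) = (1/2) log2(σ²_{X|Y} / D) for D ≤ σ²_{X|Y}, where σ²_{X|Y} is the conditional variance of the source given the side information (there is no rate loss relative to the case where the encoder also sees the side information). A minimal sketch of this bound, with illustrative variance and distortion values that are assumptions rather than figures from the paper:

```python
import math

def wyner_ziv_rate_gaussian(cond_var: float, distortion: float) -> float:
    """Wyner-Ziv rate-distortion bound in bits per sample for a Gaussian
    source with conditional variance `cond_var` given the side information.
    Returns 0.5*log2(cond_var/D) for D <= cond_var, and 0 otherwise
    (no rate is needed once the allowed distortion exceeds cond_var)."""
    if distortion >= cond_var:
        return 0.0
    return 0.5 * math.log2(cond_var / distortion)

# Illustrative values (not from the paper): unit conditional variance,
# target distortion one quarter of it -> 1 bit per sample.
print(wyner_ziv_rate_gaussian(1.0, 0.25))
```

Since the Gaussian source is successively refinable in this setting, a layered scheme like the one in the paper can in principle approach this bound at every refinement stage.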
Joukovsky, B, De Weerdt, B & Deligiannis, N 2024, Learned layered coding for Successive Refinement in the Wyner-Ziv Problem. in 2024 IEEE International Conference on Acoustics, Speech and Signal Processing. ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings, IEEE, pp. 6020-6024, 2024 IEEE International Conference on Acoustics, Speech and Signal Processing, Seoul, Korea, Republic of, 14/04/24. https://doi.org/10.1109/ICASSP48485.2024.10446574
Joukovsky, B., De Weerdt, B., & Deligiannis, N. (2024). Learned layered coding for Successive Refinement in the Wyner-Ziv Problem. In 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (pp. 6020-6024). (ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings). IEEE. https://doi.org/10.1109/ICASSP48485.2024.10446574
@inproceedings{b7b090c64be547c2ad3703950e1c2603,
title = "Learned layered coding for Successive Refinement in the Wyner-Ziv Problem",
abstract = "We propose a data-driven approach to explicitly learn the progressive encoding of a continuous source, which is successively decoded with increasing levels of quality and with the aid of correlated side information. This setup refers to the successive refinement of the Wyner-Ziv coding problem. Assuming ideal Slepian-Wolf coding, our approach employs recurrent neural networks (RNNs) to learn layered encoders and decoders for the quadratic Gaussian case. The models are trained by minimizing a variational bound on the rate-distortion function of the successively refined Wyner-Ziv coding problem. We demonstrate that RNNs can explicitly retrieve layered binning solutions akin to scalable nested quantization. Moreover, the rate-distortion performance of the scheme is on par with the corresponding monolithic Wyner-Ziv coding approach and is close to the rate-distortion bound.",
keywords = "cs.LG, cs.IT, math.IT",
author = "Boris Joukovsky and {De Weerdt}, Brent and Nikos Deligiannis",
note = "5 pages, submitted to ICASSP 2024; 2024 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP ; Conference date: 14-04-2024 Through 19-04-2024",
year = "2024",
month = mar,
day = "18",
doi = "10.1109/ICASSP48485.2024.10446574",
language = "English",
series = "ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings",
publisher = "IEEE",
pages = "6020--6024",
booktitle = "2024 IEEE International Conference on Acoustics, Speech and Signal Processing",
}