Designing recurrent neural networks by unfolding an L1-L1 minimization algorithm
Host Publication: IEEE International Conference on Image Processing
Authors: H. Le, H. Van Luong and N. Deligiannis
Publication Year: 2019
Number of Pages: 5
We propose a new deep recurrent neural network (RNN) architecture for sequential signal reconstruction. The network is designed by unfolding the iterations of the proximal gradient method that solves the l1-l1 minimization problem. By construction, it therefore exploits the prior knowledge that each signal has a sparse representation and that the difference between the representations of consecutive signals is also sparse. We evaluate the proposed model on the task of reconstructing video frames from compressive measurements and show that it outperforms several state-of-the-art RNN models.
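To make the unfolding idea concrete, the sketch below (not the authors' code; function names, the step size, and the regularization weights are illustrative assumptions) implements a proximal gradient iteration for the l1-l1 problem min_x 0.5||y - Ax||^2 + lam1*||x||_1 + lam2*||x - s_prev||_1, where s_prev is the representation of the previous frame. Each iteration of this loop corresponds to one recurrent step/layer of the unfolded network:

```python
import numpy as np

def prox_l1l1(v, s, lam1, lam2):
    """Elementwise proximal operator of lam1*|x| + lam2*|x - s| (closed form)."""
    sign = np.where(s >= 0, 1.0, -1.0)   # reduce to the s >= 0 case by symmetry
    v, s = sign * v, sign * s
    x = np.where(v > s + lam1 + lam2, v - lam1 - lam2,   # region x > s
        np.where(v >= s + lam1 - lam2, s,                # sticks at x = s
        np.where(v > lam1 - lam2, v - lam1 + lam2,       # region 0 < x < s
        np.where(v >= -lam1 - lam2, 0.0,                 # sticks at x = 0
                 v + lam1 + lam2))))                     # region x < 0
    return sign * x

def l1l1_proximal_gradient(y, A, s_prev, lam1=0.1, lam2=0.1, n_iter=50):
    """Proximal gradient for 0.5||y - Ax||^2 + lam1||x||_1 + lam2||x - s_prev||_1."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the data-fit gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):              # unfolding: each iteration = one network layer
        x = prox_l1l1(x - A.T @ (A @ x - y) / L, s_prev, lam1 / L, lam2 / L)
    return x
```

In the unfolded RNN, the fixed matrices (I - A^T A / L and A^T / L) become learnable weights and the thresholds lam1/L, lam2/L become learnable parameters, while s_prev enters through the recurrent connection between time steps.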