Depth-based view synthesis can produce realistic novel images of a scene by view warping and image inpainting. This paper presents a depth-based view synthesis approach that performs pixel-level image inpainting. The proposed approach provides great flexibility in pixel manipulation and prevents random artifacts in texture propagation. By analyzing how image holes are generated during view warping, we first classify such areas into simple holes and disocclusion areas. Based on depth-information constraints and different random-propagation strategies, an approximate nearest-neighbor match based pixel-level inpainting is introduced to complete holes of both classes. Experimental results demonstrate that the proposed view synthesis method effectively produces smooth textures and reasonable structure propagation. The proposed depth-based pixel-level inpainting is well suited to multi-view video and other higher-dimensional view synthesis settings.
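As a rough illustration of the pipeline the abstract describes (not the paper's actual implementation), the Python sketch below shows how a depth-image-based forward warp to a horizontally shifted virtual view leaves holes, and how those holes might be split into narrow sampling cracks ("simple holes") versus wider disocclusion areas. The function names, the inverse-depth disparity model, and the crack-width threshold are illustrative assumptions.

import numpy as np

def forward_warp_horizontal(color, depth, baseline, focal):
    """Warp a color image to a horizontally shifted virtual view.

    Illustrative sketch only: disparity is modeled as
    baseline * focal / depth, each source pixel is splatted to
    x - disparity, and a z-buffer keeps the closest contribution.
    Pixels that receive no contribution remain holes.
    """
    h, w = depth.shape
    warped = np.zeros_like(color)
    zbuf = np.full((h, w), -np.inf)        # larger disparity = closer to camera
    hole = np.ones((h, w), dtype=bool)     # True where nothing was splatted

    disparity = baseline * focal / np.maximum(depth, 1e-6)
    for y in range(h):
        for x in range(w):
            xt = int(round(x - disparity[y, x]))
            if 0 <= xt < w and disparity[y, x] > zbuf[y, xt]:
                zbuf[y, xt] = disparity[y, x]
                warped[y, xt] = color[y, x]
                hole[y, xt] = False
    return warped, hole

def classify_holes(hole, max_crack_width=2):
    """Split the hole mask into narrow 'simple holes' (sampling cracks)
    and wider 'disocclusion areas', using run length along each scanline."""
    simple = np.zeros_like(hole)
    disocclusion = np.zeros_like(hole)
    h, w = hole.shape
    for y in range(h):
        x = 0
        while x < w:
            if not hole[y, x]:
                x += 1
                continue
            start = x
            while x < w and hole[y, x]:
                x += 1
            target = simple if (x - start) <= max_crack_width else disocclusion
            target[y, start:x] = True
    return simple, disocclusion

In the paper's approach, a depth-constrained, approximate nearest-neighbor match based inpainting then completes these hole regions at pixel level; that stage is not sketched here.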
Lu, S-P, Hanca, J, Munteanu, A & Schelkens, P 2013, Depth-Based View Synthesis Using Pixel-level Image Inpainting. in 2013 18th International Conference on Digital Signal Processing. Proceedings of the International Conference on Digital Signal Processing, IEEE, pp. 240-245, 18th International Conference on Digital Signal Processing (DSP 2013), Santorini, Greece, 1/07/13.
Lu, S.-P., Hanca, J., Munteanu, A., & Schelkens, P. (2013). Depth-Based View Synthesis Using Pixel-level Image Inpainting. In 2013 18th International Conference on Digital Signal Processing (pp. 240-245). (Proceedings of the International Conference on Digital Signal Processing). IEEE.
@inproceedings{86a5b123885544418318ea20273989dc,
title = "Depth-Based View Synthesis Using Pixel-level Image Inpainting",
abstract = "Depth-based view synthesis can produce novel realistic images of a scene by view warping and image inpainting. This paper presents a depth-based view synthesis approach performing pixel-level image inpainting. The proposed approach provides great flexibility in pixel manipulation and prevents random effects in texture propagation. By analyzing the process generating image holes in view warping, we firstly classify such areas into simple holes and disocclusion areas. Based on depth information constraints and different strategies for random propagation, an approximate nearest-neighbor match based pixel-level inpainting is introduced to complete holes from the two classes. Experimental results demonstrate that the proposed view synthesis method can effectively produce smooth textures and reasonable structure propagation. The proposed depth-based pixel-level inpainting is well suitable to multi-view video and other higher dimensional view synthesis settings.",
keywords = "depth image based rendering, view synthesis, approximate nearest-neighbor match, pixel-level inpainting",
author = "Shao-Ping Lu and Jan Hanca and Adrian Munteanu and Peter Schelkens",
year = "2013",
month = jul,
day = "1",
language = "English",
isbn = "978-1-4673-5806-4",
series = "Proceedings of the International Conference on Digital Signal Processing",
publisher = "IEEE",
pages = "240--245",
booktitle = "2013 18th International Conference on Digital Signal Processing",
note = "18th International Conference on Digital Signal Processing (DSP 2013) ; Conference date: 01-07-2013 Through 03-07-2013",
url = "http://dsp2013.dspconferences.org/",
}