SUCRe: Leveraging Scene Structure for Underwater Color Restoration
Underwater images are altered by the physical characteristics of the medium through which light rays pass before reaching the optical sensor. Scattering and wavelength-dependent absorption significantly modify the captured colors depending on the distance of observed elements to the image plane. In this paper, we aim to recover an image of the scene as if the water had no effect on light propagation. We introduce SUCRe, a novel method that exploits the scene’s 3D structure for underwater color restoration. By following points in multiple images and tracking their intensities at different distances to the sensor, we constrain the optimization of the parameters in an underwater image formation model and retrieve unattenuated pixel intensities. We conduct extensive quantitative and qualitative analyses of our approach in a variety of conditions ranging from natural light to deep-sea environments, using three underwater datasets acquired in real-world settings and one synthetic dataset. We also compare the performance of the proposed approach with that of a wide range of existing state-of-the-art methods. The results demonstrate a consistent benefit of exploiting multiple views across a spectrum of objective metrics. Our code is publicly available at github.com/clementinboittiaux/sucre.
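To illustrate the idea of constraining a physical model with multi-view observations, the sketch below fits a standard underwater image formation model, I(z) = J·exp(−β·z) + B·(1 − exp(−γ·z)), to intensities of a single scene point observed at several distances z, and reads off J as the restored intensity. This is a minimal illustration based on the abstract's description, not the authors' implementation; the model parameterization, the variable names (J, β, B, γ) and the synthetic data are assumptions.

```python
# Illustrative sketch (not SUCRe's code): recover the unattenuated intensity J
# of one scene point from observations at several camera-to-point distances,
# assuming the formation model I(z) = J*exp(-beta*z) + B*(1 - exp(-gamma*z)).
import numpy as np
from scipy.optimize import least_squares

def formation_model(params, z):
    J, beta, B, gamma = params
    return J * np.exp(-beta * z) + B * (1.0 - np.exp(-gamma * z))

def residuals(params, z, observed):
    # Difference between predicted and observed intensities at each distance.
    return formation_model(params, z) - observed

# Synthetic observations of one point seen from 20 distances (assumed values).
rng = np.random.default_rng(0)
true_params = np.array([0.8, 0.35, 0.25, 0.5])  # J, beta, B, gamma
z = np.linspace(1.0, 10.0, 20)                   # distances in metres
observed = formation_model(true_params, z) + rng.normal(0.0, 0.005, z.shape)

# Least-squares fit of the model parameters; J is the water-free intensity.
fit = least_squares(
    residuals,
    x0=np.array([0.5, 0.1, 0.1, 0.1]),
    args=(z, observed),
    bounds=([0, 0, 0, 0], [1, np.inf, 1, np.inf]),
)
print(f"restored intensity J = {fit.x[0]:.3f} (true value {true_params[0]})")
```

In practice, one such set of per-point observations would come from tracking the same 3D point across the registered images of a Structure-from-Motion reconstruction, which is what provides the distances z.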
Keyword(s)
Underwater color restoration, Structure-from-Motion, Texturing
Full Text
File | Pages | Size
---|---|---
Publisher's official version | 10 | 4 Mo
Preprint - 10.48550/arXiv.2212.09129 | 12 | 6 Mo