Combined use of a frame and a linear pushbroom camera for deep-sea 3D hyperspectral mapping
Hyperspectral (HS) imaging captures an image of an object across a large range of the visible spectrum, rather than just the primary colors (R, G, B) of conventional cameras. It can provide valuable information for object detection and for the analysis of materials and processes in deep-sea environmental science, especially the study of benthic environments and pollution monitoring. In this paper, we address the problem of camera calibration for 3D hyperspectral mapping in settings where GPS is unavailable and the platform's navigational sensors are not accurate enough to allow direct georeferencing of linear sensors, as is done with traditional aerial platforms. Our approach is a preliminary method for 3D hyperspectral mapping that relies only on image processing techniques, reducing dependence on GPS and navigation sensors. The method couples a standard RGB frame camera with the hyperspectral pushbroom camera. The main contribution is the implementation and preliminary testing of a method that relates the two cameras using image information alone. The experiments presented in this paper analyze the estimation of the relative orientation and time synchronization parameters of the two cameras, using epipolar geometry and Monte Carlo simulation. All methods are designed to work with real-world data.
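The abstract refers to relating the two cameras through epipolar geometry. As an illustrative aside (not the authors' implementation), the sketch below shows the standard relative-orientation step this involves: estimating an essential matrix from matched image points and decomposing it into a rotation and a translation direction. The OpenCV calls, the intrinsic matrix, and the synthetic correspondences are assumptions made only for the demonstration.

```python
# Minimal sketch of relative-orientation estimation via epipolar geometry.
# Not the paper's method: intrinsics, thresholds, and data are placeholders.
import numpy as np
import cv2


def relative_orientation(pts1, pts2, K):
    """Recover rotation R and translation direction t between two views
    from (N, 2) arrays of matched pixel coordinates and intrinsics K."""
    # Essential matrix from the epipolar constraint x2^T E x1 = 0,
    # estimated robustly with RANSAC to reject mismatches.
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    # Decompose E into R and t (t is recovered only up to scale).
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    return R, t


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    K = np.array([[800.0, 0.0, 512.0],
                  [0.0, 800.0, 384.0],
                  [0.0, 0.0, 1.0]])
    # Synthetic scene: 3D points viewed by two cameras with a known pose.
    X = rng.uniform([-1, -1, 4], [1, 1, 8], size=(100, 3))
    R_true, _ = cv2.Rodrigues(np.array([0.0, 0.1, 0.0]))  # small rotation
    t_true = np.array([[0.5], [0.0], [0.0]])              # baseline along x
    x1 = (K @ X.T).T
    x1 = x1[:, :2] / x1[:, 2:]
    X2 = (R_true @ X.T + t_true).T
    x2 = (K @ X2.T).T
    x2 = x2[:, :2] / x2[:, 2:]
    R, t = relative_orientation(x1, x2, K)
    print("Estimated R:\n", R)
    print("Estimated t (up to scale):\n", t.ravel())
```

In the deep-sea setting described above, the correspondences would come from feature matching between the RGB frames and reconstructed pushbroom scanlines rather than from a synthetic scene; the decomposition step itself is unchanged.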
Keyword(s)
Geometry, Sea surface, Three-dimensional displays, Navigation, Nonlinear distortion, Cameras, Sensors
Full Text
| File | Pages | Size |
|---|---|---|
| Author's final draft | 9 | 895 KB |
| Publisher's official version | 9 | 800 KB |