Analysis of the accuracy of building 3D coordinates of the underwater robot workspace

B.A. Skorohod, P.V. Zhiyakov, A.V. Statsenko, S.I. Fateev

Sevastopol State University, 33 Universitetskaya St., Sevastopol, Russian Federation

E-mail: boris.skorohod@mail.ru, yany@mail.ru, lex00x1@mail.ru, fateev-si@ya.ru

DOI: 10.33075/2220-5861-2020-3-163-170

UDC 004.9:004.41 

Abstract:

   Distortion of underwater images can impair both the accuracy and robustness of 3D scene reconstruction algorithms. The difficulties stem from the sensitivity of these methods to changes in the underwater environment and from the physics of transmitting and receiving signals under water: uneven illumination of the underwater scene, rapid attenuation, scattering and refraction of light passing through the inhomogeneous air-glass-water interface, and narrowing of the transmitted light spectrum, in which low-frequency components are absorbed more strongly than light of higher frequencies. Together these effects seriously complicate the extraction of information about the scene as a whole and about objects of interest in the underwater environment, limit the applicability of standard image processing algorithms, and require their significant modification.

   This article proposes a new approach to analyzing the accuracy of reconstructing the 3D coordinates of the workspace of an underwater robot. The approach is based on underwater camera calibration and estimation of the camera image centers with allowance for the waterproof housing. We use statistical analysis that evaluates the impact of all sources of disturbance (both hardware and software) from experimental data alone. In particular, we show how to obtain the error distribution from the measured coordinates of a calibration sample and the corresponding coordinates reconstructed by triangulation under water. This makes it possible to estimate simultaneously the systematic error and the distribution characteristics of the random component of the error in reconstructing the 3D coordinates of the workspace. An important feature of the proposed approach is that the influence of all disturbance sources, including the design of the waterproof housing, is assessed in the aggregate using only experimental data obtained in the underwater environment. In addition, the same approach yields estimates of the positions of the camera image centers that account for the waterproof housing, which improves the accuracy of the image processing algorithms. The proposed approach was tested on real data.
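The error decomposition described above can be sketched in a few lines: given the known 3D coordinates of calibration-target points and the coordinates recovered by stereo triangulation under water, the per-axis mean of the differences estimates the systematic error, and their standard deviation characterizes the random component. This is an illustrative sketch only; the variable names and the toy data below are assumptions, not taken from the paper.

```python
import numpy as np

def error_statistics(ground_truth, triangulated):
    """Per-axis systematic error (bias) and spread of the random component.

    ground_truth : (N, 3) known 3D coordinates of calibration points.
    triangulated : (N, 3) coordinates reconstructed by triangulation.
    """
    errors = np.asarray(triangulated) - np.asarray(ground_truth)  # (N, 3)
    bias = errors.mean(axis=0)           # systematic component per axis
    sigma = errors.std(axis=0, ddof=1)   # std. dev. of the random component
    return bias, sigma

# Toy data: true points plus a constant 5 cm offset along x and 1 cm noise.
rng = np.random.default_rng(0)
truth = rng.uniform(-1.0, 1.0, size=(100, 3))
meas = truth + np.array([0.05, 0.0, 0.0]) + rng.normal(0.0, 0.01, size=(100, 3))
bias, sigma = error_statistics(truth, meas)
```

With enough calibration points, `bias` recovers the injected offset and `sigma` the noise level, which is exactly the separation of systematic and random error the abstract refers to.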

Keywords: underwater robots, stereo vision, perspective camera model, 3D reconstruction of the workspace of an underwater robot.

Full text in PDF (RUS)


 
