Toward an artificial intelligence-assisted counting of sharks on baited video

Type Article
Date 2024-05
Language English
Author(s) Villon Sébastien1, Iovan Corina1, Mangeas Morgan1, Vigliola Laurent1
Affiliation(s) 1 : ENTROPIE, Institut de Recherche pour le Développement (IRD), UR, UNC, CNRS, IFREMER, Centre IRD de Nouméa, 98000 Noumea, New Caledonia, France
Source Ecological Informatics (1574-9541) (Elsevier BV), 2024-05, Vol. 80, P. 102499 (9p.)
DOI 10.1016/j.ecoinf.2024.102499
Keyword(s) Deep learning, Neural network, Coral reef, Marine ecology, Shark conservation
Abstract

Given the global biodiversity crisis, there is an urgent need for new tools to monitor populations of endangered marine megafauna such as sharks. To this end, Baited Remote Underwater Video Stations (BRUVS) are the most effective tools for estimating shark abundance, measured using the MaxN metric. However, manually computing MaxN from extensive BRUVS video data is a major bottleneck. Although artificial intelligence methods are capable of solving this problem, their effectiveness is typically evaluated with AI metrics such as the F-measure rather than with ecologically informative metrics used by ecologists, such as MaxN. In this study, we present both an automated and a semi-automated deep learning approach designed to produce the MaxN abundance metric for three reef shark species: the grey reef shark (Carcharhinus amblyrhynchos), the blacktip reef shark (C. melanopterus), and the whitetip reef shark (Triaenodon obesus). Our approach was applied to one-hour baited underwater videos recorded in New Caledonia (South Pacific). Our fully automated model achieved F-measures of 0.85, 0.43, and 0.72 for the three species, respectively. It also generated MaxN abundance values that correlated strongly with manually derived data for C. amblyrhynchos (R = 0.88); for the other two species, correlations were significant but weak (R = 0.35–0.44). Our semi-automated method raised F-measures to 0.97, 0.86, and 0.82, yielding high-quality MaxN abundance estimates while drastically reducing video processing time. To our knowledge, this is the first study to estimate MaxN with a deep learning approach. In our discussion, we explore the implications of this novel tool and underscore its potential to produce innovative metrics for estimating fish abundance in videos, thereby addressing current limitations and paving the way for comprehensive ecological assessments.
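To illustrate the two metrics discussed in the abstract, the short Python sketch below shows how MaxN (the maximum number of individuals of a species visible in any single frame) and the F-measure could be computed from per-frame detection counts produced by an object detector. This is not the authors' implementation: the function names, the (frame, species) detection layout, and the example values are assumptions made purely for illustration.

```python
# Illustrative sketch (not the authors' code): deriving MaxN and the F-measure
# from automated per-frame detections of reef sharks in a BRUVS video.
from collections import defaultdict

def max_n(detections):
    """detections: iterable of (frame_index, species_label) pairs,
    one entry per detected individual. Returns {species: MaxN}."""
    per_frame = defaultdict(lambda: defaultdict(int))
    for frame, species in detections:
        per_frame[frame][species] += 1          # count individuals per frame
    result = defaultdict(int)
    for frame_counts in per_frame.values():
        for species, n in frame_counts.items():
            result[species] = max(result[species], n)  # keep the per-frame maximum
    return dict(result)

def f_measure(tp, fp, fn):
    """F1 score from true positive, false positive, and false negative counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0

# Hypothetical detections across three frames of a one-hour baited video
dets = [
    (0, "C. amblyrhynchos"), (0, "C. amblyrhynchos"),
    (1, "C. amblyrhynchos"), (1, "T. obesus"),
    (2, "C. amblyrhynchos"), (2, "C. amblyrhynchos"), (2, "C. amblyrhynchos"),
]
print(max_n(dets))          # {'C. amblyrhynchos': 3, 'T. obesus': 1}
print(f_measure(85, 15, 15))  # 0.85, with made-up counts
```

In this framing, the fully automated approach would compute MaxN directly from detector output, while the semi-automated approach would let an operator verify or correct detections before MaxN is derived.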

Licence CC-BY
Full Text
Publisher's official version: 9 pages, 3 MB, open access
Supplementary material: 66 KB, open access