FN Archimer Export Format
PT J
TI Toward an artificial intelligence-assisted counting of sharks on baited video
BT
AF Villon, Sébastien
   Iovan, Corina
   Mangeas, Morgan
   Vigliola, Laurent
AS 1:1;2:1;3:1;4:1;
FF 1:;2:;3:;4:;
C1 ENTROPIE, Institut de Recherche pour le Développement (IRD), UR, UNC, CNRS, IFREMER, Centre IRD de Nouméa, 98000 Noumea, New Caledonia, France
C2 IRD, FRANCE
UM ENTROPIE
IN WOS Cotutelle UMR
IF 5.1
TC 0
UR https://archimer.ifremer.fr/doc/00875/98726/108231.pdf
   https://archimer.ifremer.fr/doc/00875/98726/108232.docx
LA English
DT Article
DE Deep learning;Neural network;Coral reef;Marine ecology;Shark conservation
AB Given the global biodiversity crisis, there is an urgent need for new tools to monitor populations of endangered marine megafauna such as sharks. To this end, Baited Remote Underwater Video Stations (BRUVS) stand as the most effective tools for estimating shark abundance, measured using the MaxN metric. However, manually computing MaxN from extensive BRUVS video data is a bottleneck. Although artificial intelligence methods are capable of solving this problem, their effectiveness is typically evaluated with AI metrics such as the F-measure rather than with the ecologically informative metrics, such as MaxN, that ecologists actually use. In this study, we present an automated and a semi-automated deep-learning approach designed to produce the MaxN abundance metric for three reef shark species: the grey reef shark (Carcharhinus amblyrhynchos), the blacktip reef shark (C. melanopterus), and the whitetip reef shark (Triaenodon obesus). We applied our approach to one-hour baited underwater videos recorded in New Caledonia (South Pacific). Our fully automated model achieved F-measures of 0.85, 0.43, and 0.72 for the three species, respectively. It also generated MaxN abundance values that correlated highly with manually derived data for C. amblyrhynchos (R = 0.88). For the two other species, correlations were significant but weak (R = 0.35–0.44). Our semi-automated method substantially improved F-measures to 0.97, 0.86, and 0.82, yielding high-quality MaxN abundance estimates while drastically reducing video processing time. To our knowledge, this is the first study to estimate MaxN with a deep-learning approach. In the discussion, we explore the implications of this novel tool and underscore its potential to produce innovative metrics for estimating fish abundance in videos, thereby addressing current limitations and paving the way for comprehensive ecological assessments.
PY 2024
PD MAY
SO Ecological Informatics
SN 1574-9541
PU Elsevier BV
VL 80
UT 001174108700001
DI 10.1016/j.ecoinf.2024.102499
ID 98726
ER

EF