Expert, Crowd, Students or Algorithm: who holds the key to deep-sea imagery ‘big data’ processing?

1. Recent technological developments have increased our capacity to study the deep sea and the marine benthic realm, particularly through multidisciplinary seafloor observatories. Since 2006, Ocean Networks Canada cabled observatories have acquired nearly 65 TB and over 90,000 hours of video data from seafloor cameras and Remotely Operated Vehicles (ROVs). Manual processing of these data is time-consuming and highly labour-intensive, and cannot be comprehensively undertaken by individual researchers. These videos are a crucial source of information for assessing natural variability and ecosystem responses to increasing human activity in the deep sea.

2. We compared the performance of three groups of human observers and one computer vision algorithm in counting individuals of the commercially important sablefish (or black cod), Anoplopoma fimbria, in video recorded by a cabled camera platform at 900 m depth in a submarine canyon in the Northeast Pacific. The first group of human observers comprised untrained volunteers recruited via a crowdsourcing platform, and the second comprised experienced university students who performed the task for their ichthyology class. Results were validated against counts obtained from a scientific expert.

3. All groups produced relatively accurate results in comparison to the expert, and all succeeded in detecting patterns and periodicities in the fish abundance data. Trained volunteers displayed the highest accuracy and the algorithm the lowest.

4. As seafloor observatories increase in number around the world, this study demonstrates the value of a hybrid combination of crowdsourcing and computer vision techniques as a tool to help process large volumes of imagery in support of basic research and environmental monitoring. Reciprocally, by engaging large numbers of online participants in deep-sea research, this approach can contribute significantly to ocean literacy and informed citizen input to policy development.
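For illustration only, the Python sketch below shows one simple way to score an observer group's counts against the expert baseline and to look for periodicity in an abundance time series. It is not the authors' actual analysis pipeline, and the count arrays and sampling interval are hypothetical placeholders.

import numpy as np

# Hypothetical per-video-segment fish counts (placeholders, not study data).
expert = np.array([3, 5, 2, 8, 6, 4, 7, 5], dtype=float)  # expert baseline counts
crowd = np.array([3, 4, 2, 9, 6, 4, 6, 5], dtype=float)   # crowd counts, same segments

# Agreement with the expert baseline: mean absolute error and Pearson correlation.
mae = np.mean(np.abs(crowd - expert))
r = np.corrcoef(crowd, expert)[0, 1]
print(f"Mean absolute error vs expert: {mae:.2f} fish; Pearson r: {r:.2f}")

# Simple FFT periodogram of the abundance series (assumes evenly spaced samples):
# the dominant frequency hints at cyclical signals such as tidal or diel rhythms.
counts = expert - expert.mean()
power = np.abs(np.fft.rfft(counts)) ** 2
freqs = np.fft.rfftfreq(len(counts), d=1.0)  # d = sampling interval between videos
dominant = freqs[1:][np.argmax(power[1:])]   # skip the zero-frequency term
print(f"Dominant frequency: {dominant:.3f} cycles per sampling interval")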

Keyword(s)

computer vision algorithms, crowdsourcing, deep-sea imagery, Digital Fishers, fish counting, Ocean Networks Canada, seafloor observatories, underwater video

Full Text

Author's final draft
Video S1. Example of video from the project dataset recorded in Barkley Canyon (British Columbia, Canada), using the Ocean Networks Canada Observatory.
Fig. S1. Ocean Networks Canada's annotation system used by the students to count the number of sablefish in the videos (http://dmas.uvic.ca/SeaTube).
Fig. S2. Tutorial provided to the Crowd participants through the web interface Digital Fishers (http://dmas.uvic.ca/DigitalFishers).
Fig. S3. Summary of the automated analysis method used to detect fish in the Barkley Canyon videos recorded by the Ocean Networks Canada observatory.
Publisher's official version (in press)
How to cite
Matabos Marjolaine, Hoeberechts Maia, Doya Carol, Aguzzi Jacopo, Nephin Jessica, Reimchen Thomas E., Leaver Steve, Marx Roswitha M., Albu Alexandra Branzan, Fier Ryan, Fernandez-Arcaya Ulla, Juniper S. Kim (2017). Expert, Crowd, Students or Algorithm: who holds the key to deep-sea imagery ‘big data’ processing? Methods in Ecology and Evolution. 8 (8). 996-1004. https://doi.org/10.1111/2041-210X.12746, https://archimer.ifremer.fr/doc/00369/47978/