
      Towards the fully automated monitoring of ecological communities

      Review article


          Abstract

          High‐resolution monitoring is fundamental to understand ecosystem dynamics in an era of global change and biodiversity declines. While real‐time and automated monitoring of abiotic components has been possible for some time, monitoring biotic components—for example, individual behaviours and traits, and species abundance and distribution—is far more challenging. Recent technological advancements offer potential solutions to achieve this through: (i) increasingly affordable high‐throughput recording hardware, which can collect rich multidimensional data, and (ii) increasingly accessible artificial intelligence approaches, which can extract ecological knowledge from large datasets. However, automating the monitoring of facets of ecological communities via such technologies has primarily been achieved at low spatiotemporal resolutions within limited steps of the monitoring workflow. Here, we review existing technologies for data recording and processing that enable automated monitoring of ecological communities. We then present novel frameworks that combine such technologies, forming fully automated pipelines to detect, track, classify and count multiple species, and record behavioural and morphological traits, at resolutions which have previously been impossible to achieve. Based on these rapidly developing technologies, we illustrate a solution to one of the greatest challenges in ecology: the ability to rapidly generate high‐resolution, multidimensional and standardised data across complex ecologies.

          Abstract

          Monitoring living organisms at high resolution and across multiple dimensions is a complex and labour‐intensive task, yet it is fundamental to understanding and predicting the dynamics of ecological communities in an era of global change and biodiversity declines. Here, we review existing technologies for automated data recording and processing, and we present novel frameworks that combine these technologies into automated monitoring pipelines that detect, track, classify and count multiple species, and even record behavioural and morphological traits, at resolutions which have previously been impossible to achieve. We illustrate a solution to one of the greatest challenges in ecology and conservation: the ability to rapidly generate high‐resolution, multidimensional and, critically, standardised data across complex ecologies.
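          The pipelines described above chain several stages: detect individuals in each recorded frame, link detections into tracks, classify each tracked individual to species, and derive counts and traits from the tracks. Purely as an illustrative sketch, and not the authors' implementation, the Python skeleton below shows how such stages might be composed; the detect, assign_track and classify helpers are hypothetical placeholders for trained models.

          from collections import Counter
          from dataclasses import dataclass

          @dataclass
          class Detection:
              frame: int       # frame index in the recording
              box: tuple       # (x, y, w, h) bounding box in pixels
              track_id: int    # identity assigned by the tracker
              species: str     # label assigned by the classifier

          def detect(frame_idx, image):
              """Hypothetical detector: bounding boxes of animals in one frame."""
              return []  # stand-in for the output of a trained detection network

          def assign_track(box, tracks):
              """Hypothetical tracker: link a box to an existing track or open a new one."""
              return len(tracks)  # placeholder identity

          def classify(image, box):
              """Hypothetical classifier: species label for one cropped detection."""
              return "unknown"

          def run_pipeline(frames):
              """Chain detection, tracking and classification, then count individuals."""
              detections, tracks = [], {}
              for idx, image in enumerate(frames):
                  for box in detect(idx, image):
                      track_id = assign_track(box, tracks)
                      species = classify(image, box)
                      tracks.setdefault(track_id, species)
                      detections.append(Detection(idx, box, track_id, species))
              # Count unique tracked individuals per species, not raw detections
              return detections, Counter(tracks.values())

          In practice each placeholder would be a trained model (e.g. an object detector, a multi-object tracker and a species classifier), and the per-track detections would feed downstream estimates of behaviour and morphology.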

          Related collections

          Most cited references: 297


          ImageNet: A large-scale hierarchical image database


            DeepLabCut: markerless pose estimation of user-defined body parts with deep learning

            Quantifying behavior is crucial for many applications in neuroscience. Videography provides easy methods for the observation and recording of animal behavior in diverse settings, yet extracting particular aspects of a behavior for further analysis can be highly time consuming. In motor control studies, humans or other animals are often marked with reflective markers to assist with computer-based tracking, but markers are intrusive, and the number and location of the markers must be determined a priori. Here we present an efficient method for markerless pose estimation based on transfer learning with deep neural networks that achieves excellent results with minimal training data. We demonstrate the versatility of this framework by tracking various body parts in multiple species across a broad collection of behaviors. Remarkably, even when only a small number of frames are labeled (~200), the algorithm achieves excellent tracking performance on test frames that is comparable to human accuracy.
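            For orientation, the snippet below sketches the typical DeepLabCut workflow described in the package documentation: create a project around example videos, label a small set of frames, fine-tune a pretrained network by transfer learning, and then analyse new videos. The project name and video path are placeholders, and the labelling step opens an interactive tool.

            import deeplabcut

            # Create a project around one or more behaviour videos (paths are placeholders)
            config = deeplabcut.create_new_project(
                "animal-pose-demo", "lab", ["videos/tank01.mp4"], copy_videos=True
            )

            # Extract and manually label a modest number of frames (~200 often suffices)
            deeplabcut.extract_frames(config, mode="automatic", userfeedback=False)
            deeplabcut.label_frames(config)  # opens the labelling GUI

            # Fine-tune a pretrained network on the labelled frames (transfer learning)
            deeplabcut.create_training_dataset(config)
            deeplabcut.train_network(config)
            deeplabcut.evaluate_network(config)

            # Apply the trained model to new videos; body-part coordinates are written out
            deeplabcut.analyze_videos(config, ["videos/tank01.mp4"], save_as_csv=True)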

              Using Deep Learning for Image-Based Plant Disease Detection

              Crop diseases are a major threat to food security, but their rapid identification remains difficult in many parts of the world due to the lack of the necessary infrastructure. The combination of increasing global smartphone penetration and recent advances in computer vision made possible by deep learning has paved the way for smartphone-assisted disease diagnosis. Using a public dataset of 54,306 images of diseased and healthy plant leaves collected under controlled conditions, we train a deep convolutional neural network to identify 14 crop species and 26 diseases (or absence thereof). The trained model achieves an accuracy of 99.35% on a held-out test set, demonstrating the feasibility of this approach. Overall, the approach of training deep learning models on increasingly large and publicly available image datasets presents a clear path toward smartphone-assisted crop disease diagnosis on a massive global scale.
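              As a rough illustration of that approach, and not the authors' exact architecture, training regime or dataset split, the sketch below fine-tunes an ImageNet-pretrained convolutional network on a folder of labelled leaf images using PyTorch; the dataset path and class count are placeholders.

              import torch
              from torch import nn
              from torchvision import datasets, models, transforms

              NUM_CLASSES = 38  # placeholder: one class per crop-disease combination

              # Assumed layout: one sub-folder of leaf images per class (path is a placeholder)
              transform = transforms.Compose([
                  transforms.Resize((224, 224)),
                  transforms.ToTensor(),
                  transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                       std=[0.229, 0.224, 0.225]),
              ])
              train_set = datasets.ImageFolder("data/plant_leaves/train", transform=transform)
              loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

              # Transfer learning: start from ImageNet weights, replace the final layer
              model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
              model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

              criterion = nn.CrossEntropyLoss()
              optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

              model.train()
              for epoch in range(5):
                  for images, labels in loader:
                      optimizer.zero_grad()
                      loss = criterion(model(images), labels)
                      loss.backward()
                      optimizer.step()

              A held-out set of images from the same folders would then be used to estimate accuracy, as in the study's reported 99.35% on its test split.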

                Author and article information

                Contributors
                marc.besson@obs-banyuls.fr
                Journal
                Ecology Letters (Ecol Lett)
                10.1111/(ISSN)1461-0248
                John Wiley and Sons Inc. (Hoboken)
                ISSN: 1461-023X (print), 1461-0248 (electronic)
                20 October 2022 (online), December 2022 (issue)
                Volume 25, Issue 12 (doi: 10.1111/ele.v25.12), pages 2753-2775
                Affiliations
                [1] School of Biological Sciences, University of Bristol, Bristol, UK
                [2] Sorbonne Université, CNRS, UMR Biologie des Organismes Marins (BIOM), Banyuls‐sur‐Mer, France
                [3] Department of Ecoscience, Aarhus University, Aarhus, Denmark
                [4] UK Centre for Ecology & Hydrology, Bangor, UK
                [5] Department of Electrical and Computer Engineering, Aarhus University, Aarhus, Denmark
                [6] BrisEngBio, School of Chemistry, University of Bristol, Cantock's Close, Bristol BS8 1TS, UK
                [7] Arctic Research Centre, Aarhus University, Aarhus, Denmark
                Author notes
                [*] Correspondence
                Marc Besson, Sorbonne Université, CNRS, UMR Biologie des Organismes Marins (BIOM), Banyuls‐sur‐Mer, France.
                Email: marc.besson@obs-banyuls.fr

                Author information
                https://orcid.org/0000-0003-3381-322X
                https://orcid.org/0000-0002-6787-6192
                https://orcid.org/0000-0001-6742-9504
                https://orcid.org/0000-0003-1702-786X
                https://orcid.org/0000-0001-5387-3284
                https://orcid.org/0000-0002-0751-6312
                https://orcid.org/0000-0002-4768-4767
                https://orcid.org/0000-0001-5677-5401
                Article
                ELE14123, ELE-00423-2022.R1
                DOI: 10.1111/ele.14123
                PMCID: PMC9828790
                PMID: 36264848
                © 2022 The Authors. Ecology Letters published by John Wiley & Sons Ltd.

                This is an open access article under the terms of the Creative Commons Attribution 4.0 (CC BY 4.0) License (http://creativecommons.org/licenses/by/4.0/), which permits use, distribution and reproduction in any medium, provided the original work is properly cited.

                History
                Received: 22 April 2022
                Revised: 09 August 2022
                Accepted: 06 September 2022
                Page count
                Figures: 7, Tables: 0, Pages: 23, Words: 18498
                Funding
                Funded by: Danmarks Frie Forskningsfond (doi: 10.13039/501100004836)
                Award ID: 8021‐00423B
                Funded by: Engineering and Physical Sciences Research Council (doi: 10.13039/501100000266)
                Award ID: BB/W013959/1
                Award ID: EP/N510129/1
                Funded by: Natural Environment Research Council (doi: 10.13039/501100000270)
                Award ID: NE/T003502/1
                Award ID: NE/T006579/1
                Award ID: NE/S01537X/1
                Funded by: Royal Society (doi: 10.13039/501100000288)
                Award ID: UF160357
                Funded by: Royal Society (doi: 10.13039/501100000288)
                Award ID: RGS\R2\192033
                Categories
                Synthesis

                Ecology
                community ecology, computer vision, deep learning, high‐resolution monitoring, remote sensing
