      Open Access

      Perspectives in machine learning for wildlife conservation

      review-article


          Abstract

          Inexpensive and accessible sensors are accelerating data acquisition in animal ecology. These technologies hold great potential for large-scale ecological understanding, but are limited by current processing approaches which inefficiently distill data into relevant information. We argue that animal ecologists can capitalize on large datasets generated by modern sensors by combining machine learning approaches with domain knowledge. Incorporating machine learning into ecological workflows could improve inputs for ecological models and lead to integrated hybrid modeling tools. This approach will require close interdisciplinary collaboration to ensure the quality of novel approaches and train a new generation of data scientists in ecology and conservation.

          Animal ecologists are increasingly limited by constraints in data processing. Here, Tuia and colleagues discuss how collaboration between ecologists and data scientists can harness machine learning to capitalize on the data generated from technological advances and lead to novel modeling approaches.

          Related collections

          Most cited references (99)


          Deep learning.

          Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction. These methods have dramatically improved the state-of-the-art in speech recognition, visual object recognition, object detection and many other domains such as drug discovery and genomics. Deep learning discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer. Deep convolutional nets have brought about breakthroughs in processing images, video, speech and audio, whereas recurrent nets have shone light on sequential data such as text and speech.
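As a concrete illustration of the layered-representation and backpropagation ideas described in this abstract, here is a minimal sketch (not taken from the cited paper): a two-layer network trained on the XOR problem, where the error signal is propagated back from the output layer to adjust the weights of each layer in turn. All names and values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # inputs
y = np.array([[0.], [1.], [1.], [0.]])                  # XOR targets

W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)     # hidden layer
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)     # output layer

def forward(inputs):
    h = np.tanh(inputs @ W1 + b1)                   # first-layer representation
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))        # second-layer representation
    return h, p

initial_loss = float(np.mean((forward(X)[1] - y) ** 2))

for _ in range(5000):
    h, p = forward(X)
    dp = (p - y) * p * (1.0 - p)            # gradient at the output layer
    dW2, db2 = h.T @ dp, dp.sum(0)
    dh = (dp @ W2.T) * (1.0 - h ** 2)       # gradient propagated back one layer
    dW1, db1 = X.T @ dh, dh.sum(0)
    W1 -= 0.5 * dW1; b1 -= 0.5 * db1        # gradient-descent updates
    W2 -= 0.5 * dW2; b2 -= 0.5 * db2

final_loss = float(np.mean((forward(X)[1] - y) ** 2))
```

The same backward pass scales to the deep convolutional and recurrent architectures the abstract mentions; only the number and type of layers change.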

            Random Forests
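No abstract is available for this reference (Breiman's Random Forests), so as a hedged illustration of the technique it introduces, here is a minimal scikit-learn sketch on synthetic data: an ensemble of decision trees, each grown on a bootstrap sample with a random subset of features considered at every split. The dataset and parameter values are hypothetical, not from the paper.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for tabular ecological feature data.
X, y = make_classification(n_samples=500, n_features=10, n_informative=5,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

# 100 bootstrapped trees; predictions are aggregated by majority vote.
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)
accuracy = forest.score(X_test, y_test)
```

Averaging many decorrelated trees is what gives the method its robustness to overfitting relative to a single deep decision tree.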


              DeepLabCut: markerless pose estimation of user-defined body parts with deep learning

              Quantifying behavior is crucial for many applications in neuroscience. Videography provides easy methods for the observation and recording of animal behavior in diverse settings, yet extracting particular aspects of a behavior for further analysis can be highly time consuming. In motor control studies, humans or other animals are often marked with reflective markers to assist with computer-based tracking, but markers are intrusive, and the number and location of the markers must be determined a priori. Here we present an efficient method for markerless pose estimation based on transfer learning with deep neural networks that achieves excellent results with minimal training data. We demonstrate the versatility of this framework by tracking various body parts in multiple species across a broad collection of behaviors. Remarkably, even when only a small number of frames are labeled (~200), the algorithm achieves excellent tracking performance on test frames that is comparable to human accuracy.
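The transfer-learning idea in this abstract, reusing a frozen pretrained feature extractor and fitting only a small head on ~200 labeled frames, can be sketched schematically. This is purely illustrative and is not the DeepLabCut implementation: the "pretrained network" is stood in for by a fixed random projection, and the head is a ridge regression predicting one 2D keypoint.

```python
import numpy as np

rng = np.random.default_rng(1)

def pretrained_features(frames, W_frozen):
    """Frozen feature extractor: fixed weights, ReLU nonlinearity."""
    return np.maximum(frames @ W_frozen, 0.0)

n_pixels, n_feat = 64, 32
W_frozen = rng.normal(0.0, 1.0 / np.sqrt(n_pixels), (n_pixels, n_feat))

# ~200 "labeled frames" whose keypoint (x, y) depends on the frozen features.
frames = rng.normal(0.0, 1.0, (200, n_pixels))
true_head = rng.normal(0.0, 1.0, (n_feat, 2))
keypoints = pretrained_features(frames, W_frozen) @ true_head

# Train only the small head (ridge regression); the extractor stays fixed.
F = pretrained_features(frames, W_frozen)
head = np.linalg.solve(F.T @ F + 1e-3 * np.eye(n_feat), F.T @ keypoints)

# Evaluate on held-out frames.
test_frames = rng.normal(0.0, 1.0, (50, n_pixels))
pred = pretrained_features(test_frames, W_frozen) @ head
truth = pretrained_features(test_frames, W_frozen) @ true_head
max_error = float(np.abs(pred - truth).max())
```

Because only the small head is trained, a few hundred labeled examples suffice, which is the key to the minimal-training-data result the abstract reports.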

                Author and article information

                Contributors
                devis.tuia@epfl.ch
                Journal
                Nat Commun
                Nature Communications
                Nature Publishing Group UK (London)
                2041-1723
                9 February 2022
                2022
                Volume: 13
                Article number: 792
                Affiliations
                [1] School of Architecture, Civil and Environmental Engineering, Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
                [2] Department of Computing and Mathematical Sciences, California Institute of Technology (Caltech), Pasadena, CA, USA
                [3] Max Planck Institute of Animal Behavior, Radolfzell, Germany
                [4] Centre for the Advanced Study of Collective Behaviour, University of Konstanz, Konstanz, Germany
                [5] Department of Biology, University of Konstanz, Konstanz, Germany
                [6] Institute for Applied Mathematics and Information Technologies, IMATI-CNR, Pavia, Italy
                [7] Computer Science Department, University of Münster, Münster, Germany
                [8] School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
                [9] Environmental Sciences Group, Wageningen University, Wageningen, Netherlands
                [10] Computer Science Department, University of Bristol, Bristol, UK
                [11] Department of Forestry and Environmental Resources, North Carolina State University, Raleigh, NC, USA
                [12] North Carolina Museum of Natural Sciences, Raleigh, NC, USA
                [13] Cornell Lab of Ornithology, Cornell University, Ithaca, NY, USA
                [14] Department of Computer Science, Rensselaer Polytechnic Institute, Troy, NY, USA
                [15] Translational Data Analytics Institute, The Ohio State University, Columbus, OH, USA
                [16] Departments of Computer Science and Engineering; Electrical and Computer Engineering; Evolution, Ecology, and Organismal Biology, The Ohio State University, Columbus, OH, USA
                Author information
                http://orcid.org/0000-0003-0374-2459
                http://orcid.org/0000-0001-5291-788X
                http://orcid.org/0000-0003-1358-0828
                http://orcid.org/0000-0001-5691-4029
                http://orcid.org/0000-0002-3777-2202
                http://orcid.org/0000-0001-7368-4456
                http://orcid.org/0000-0001-8870-0797
                http://orcid.org/0000-0002-2947-6665
                http://orcid.org/0000-0002-9790-7025
                http://orcid.org/0000-0001-8556-4558
                http://orcid.org/0000-0001-7610-1412
                Article
                27980
                DOI: 10.1038/s41467-022-27980-y
                PMCID: 8828720
                PMID: 35140206
                © The Author(s) 2022

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.

                History
                Received: 25 May 2021
                Accepted: 8 December 2021
                Funding
                Funded by: FundRef https://doi.org/10.13039/100000001, National Science Foundation (NSF);
                Award ID: 1745301
                Award ID: IIS 1514174
                Award ID: IOS 1250895
                Award ID: 1453555
                Award ID: 1550853
                Award Recipient :
                Funded by: FundRef https://doi.org/10.13039/501100001659, Deutsche Forschungsgemeinschaft (German Research Foundation);
                Award ID: EXC 2117-422037984
                Award Recipient :
                Funded by: FundRef https://doi.org/10.13039/100009152, Fondation Bertarelli (Bertarelli Foundation);
                Funded by: FundRef https://doi.org/10.13039/100000008, David and Lucile Packard Foundation (David & Lucile Packard Foundation);
                Award ID: 2016-65130
                Award Recipient :
                Categories
                Perspective
                Uncategorized
                conservation biology, computer science
