
Removing Human Bottlenecks in Bird Classification Using Camera Trap Images and Deep Learning

Chalmers, C, Fergus, P, Wich, SA, Longmore, SN, Walsh, ND, Stephens, PA, Sutherland, C, Matthews, N, Mudde, J and Nuseibeh, A (2023) Removing Human Bottlenecks in Bird Classification Using Camera Trap Images and Deep Learning. Remote Sensing, 15 (10). p. 2638.

remotesensing-15-02638.pdf - Published Version
Available under License Creative Commons Attribution.


Abstract

Birds are important indicators for monitoring both biodiversity and habitat health; they also play a crucial role in ecosystem management. Declines in bird populations can result in reduced ecosystem services, including seed dispersal, pollination and pest control. Accurate and long-term monitoring of birds to identify species of concern while measuring the success of conservation interventions is essential for ecologists. However, monitoring is time-consuming, costly and often difficult to manage over long durations and at meaningfully large spatial scales. Technologies such as camera traps, acoustic monitors and drones provide methods for non-invasive monitoring. There are two main problems with using camera traps for monitoring: (a) cameras generate many images, making it difficult to process and analyse the data in a timely manner; and (b) the high proportion of false positives hinders the processing and analysis for reporting. In this paper, we outline an approach for overcoming these issues by utilising deep learning for real-time classification of bird species and automated removal of false positives in camera trap data. Images are classified in real time using a Faster R-CNN architecture. Images are transmitted from 3G/4G-enabled cameras and processed using Graphics Processing Units (GPUs) to provide conservationists with key detection metrics, thereby removing the requirement for manual observations. Our models achieved an average sensitivity of 88.79%, a specificity of 98.16% and an accuracy of 96.71%, demonstrating the effectiveness of deep learning for automatic bird monitoring.
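The record itself contains no code, so the following is only a minimal sketch of the kind of pipeline the abstract describes: a torchvision Faster R-CNN detector applied to a camera-trap image, with the reported metrics (sensitivity, specificity, accuracy) computed from confusion-matrix counts. The checkpoint path, class names, score threshold and image file below are hypothetical illustrations, not the authors' implementation.

# Sketch: Faster R-CNN inference on a camera-trap image plus the metrics
# named in the abstract. Assumes PyTorch/torchvision; the checkpoint path,
# class list and threshold are hypothetical, not the authors' setup.
import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

NUM_CLASSES = 3  # background + two hypothetical bird classes
CLASS_NAMES = ["__background__", "curlew", "godwit"]  # hypothetical labels
SCORE_THRESHOLD = 0.8  # low-confidence detections are discarded as false positives

# Faster R-CNN with a ResNet-50 FPN backbone, head resized to the dataset's classes.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(
    weights=None, num_classes=NUM_CLASSES)
model.load_state_dict(torch.load("bird_frcnn.pt", map_location="cpu"))  # hypothetical checkpoint
model.eval()

def classify_image(path: str):
    """Return (label, score) pairs for confident detections in one image."""
    image = convert_image_dtype(read_image(path), torch.float)  # CHW float in [0, 1]
    with torch.no_grad():
        prediction = model([image])[0]  # dict with 'boxes', 'labels', 'scores'
    keep = prediction["scores"] >= SCORE_THRESHOLD
    labels = [CLASS_NAMES[i] for i in prediction["labels"][keep].tolist()]
    return list(zip(labels, prediction["scores"][keep].tolist()))

def detection_metrics(tp: int, tn: int, fp: int, fn: int):
    """Sensitivity, specificity and accuracy from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    return sensitivity, specificity, accuracy

if __name__ == "__main__":
    print(classify_image("camera_trap_frame.jpg"))  # hypothetical image file

In this sketch the score threshold does the false-positive filtering that the abstract attributes to the pipeline; on a GPU the same code runs with model.to("cuda") and the image tensor moved to the same device.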

Item Type: Article
Uncontrolled Keywords: 0203 Classical Physics; 0406 Physical Geography and Environmental Geoscience; 0909 Geomatic Engineering
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Q Science > QH Natural history > QH301 Biology
T Technology > T Technology (General)
Divisions: Computer Science & Mathematics
Publisher: MDPI
SWORD Depositor: A Symplectic
Date Deposited: 05 Jun 2023 14:34
Last Modified: 05 Jun 2023 14:34
DOI or ID number: 10.3390/rs15102638
URI: https://researchonline.ljmu.ac.uk/id/eprint/19663