Preprint Article, Version 1 (preserved in Portico). This version is not peer-reviewed.

Image-Based Insect Counting Embedded in E-traps that Learn Without Manual Image Annotation and Self-Dispose Captured Insects

Version 1 : Received: 7 March 2023 / Approved: 8 March 2023 / Online: 8 March 2023 (10:19:45 CET)

A peer-reviewed article of this Preprint also exists.

Saradopoulos, I.; Potamitis, I.; Konstantaras, A.I.; Eliopoulos, P.; Ntalampiras, S.; Rigakis, I. Image-Based Insect Counting Embedded in E-Traps That Learn without Manual Image Annotation and Self-Dispose Captured Insects. Information 2023, 14, 267.

Abstract

This study describes the development of an image-based insect trap that diverges from the plug-in-camera insect trap paradigm. In short, (a) it does not require manual annotation of images to learn how to count targeted pests, and (b) it disposes of captured insects automatically, and is therefore suitable for long-term deployment. The device consists of an imaging sensor integrated with Raspberry Pi microcontroller units running embedded deep learning algorithms that count agricultural pests inside a pheromone-based funnel trap. The device also receives commands from the server, which configures its operation, while an embedded servomotor can automatically rotate the detachable bottom of the bucket to dispose of dehydrated insects as they begin to pile up. It thus overcomes a major limitation of camera-based insect traps: the inevitable overlap and occlusion caused by the decay and layering of insects during long-term operation, thereby extending the autonomous operational capability. We study cases that are underrepresented in the literature, such as counting under congestion and significant debris, using crowd-counting algorithms originally developed for human surveillance. Finally, we perform a comparative analysis of the results from different deep-learning approaches (YOLOv7/v8, crowd counting, deep learning regression), and we open-source the code and a large database of Lepidopteran plant pests.
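The crowd-counting approach mentioned above typically predicts a density map whose integral (sum) equals the estimated count, rather than detecting individual insects. The following is a minimal sketch of that convention, assuming each target contributes one normalized Gaussian of "mass" to the map; the actual models in the paper are learned CNNs, and the kernel size and sigma here are illustrative, not the authors' settings.

```python
import numpy as np

def gaussian_kernel(size=15, sigma=3.0):
    # 2-D Gaussian normalized to sum to 1, so each insect contributes
    # exactly one unit of mass to the density map.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def density_count(density_map):
    # Crowd-counting convention: the predicted count is the
    # integral (sum) of the density map.
    return float(density_map.sum())

# Build a synthetic "prediction" with three insects at known centres
# (hypothetical coordinates, for illustration only).
H, W, size = 64, 64, 15
half = size // 2
dmap = np.zeros((H, W))
for cy, cx in [(16, 16), (32, 40), (50, 20)]:
    dmap[cy - half:cy + half + 1, cx - half:cx + half + 1] += gaussian_kernel(size)

print(round(density_count(dmap)))  # → 3
```

Because the count is the map's integral, this formulation degrades gracefully under the congestion and occlusion scenarios the abstract highlights, where per-instance detectors such as YOLO tend to merge overlapping targets.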

Keywords

edge computing; e-traps; insect monitoring

Subject

Engineering, Electrical and Electronic Engineering

