LeAF: Leveraging Deep Learning for Agricultural Pest Detection and Classification for Farmers
Aditya Sengupta
DOI: http://dx.doi.org/10.15439/2024F2492
Citation: Proceedings of the 19th Conference on Computer Science and Intelligence Systems (FedCSIS), M. Bolanowski, M. Ganzha, L. Maciaszek, M. Paprzycki, D. Ślęzak (eds). ACSIS, Vol. 39, pages 525–530 (2024)
Abstract. Farmers face many challenges while growing crops, such as monitoring and maintaining plant health. Key indicators of poor plant health are plant anomalies such as pests, plant diseases, and weeds, which can decrease crop yield. Over 40% of global crop production is lost to plant anomalies, costing $220 billion annually. As the global population and the demand for food increase, farmers will have to grow more food, making manual surveying for plant anomalies increasingly difficult. This forces farmers to apply fertilizers and pesticides excessively and indiscriminately across whole fields, treating healthy and unhealthy plants alike, wasting acres' worth of chemicals, increasing chemical contamination of food, and enlarging agriculture's environmental footprint as the chemicals release greenhouse gases after application and leak into ecosystems. Recent advances in deep learning with Convolutional Neural Networks (CNNs) make it possible to address this problem with imaging data. LeAF aims to provide farmers with an end-to-end system to survey crops in the field and take targeted actions to maintain plant health. Focusing on agricultural pests, this paper demonstrates the following capabilities of LeAF's visual perception sub-system: (1) applying CNNs to field images to obtain plant-specific information about anomalies through bounding-box detection and classification at human-level accuracy, and (2) combining detection and classification into a single compact distilled model that runs with low latency and high throughput on farmer-accessible mobile phones or on embedded devices in agricultural tractors and robots, enabling real-time processing of video feeds. With lightweight and accurate plant anomaly detection and classification, LeAF addresses the plant health management challenges faced by farmers, empowering them with actionable insights to enhance productivity while minimizing chemical usage and its environmental impact.
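As a rough illustration of capability (2), the sketch below shows how a single compact model could produce both bounding boxes and pest class labels from one field image in a single forward pass. It is a minimal example, not the paper's implementation: it assumes a YOLO-family detector served through the Ultralytics Python API, and the weights file `leaf_pests.pt` and the image path are hypothetical placeholders for a model fine-tuned on pest classes.

```python
# Minimal sketch (not the paper's implementation): one compact model that
# performs detection and classification of pests in a single forward pass.
# Assumes the Ultralytics YOLO Python API; the weights file and image path
# are hypothetical placeholders for a model fine-tuned on pest classes.
from ultralytics import YOLO

# Load a small model variant suitable for mobile/embedded deployment.
model = YOLO("leaf_pests.pt")  # hypothetical fine-tuned pest weights

# Run inference on one field image; each returned box carries both a
# location and a predicted pest class with a confidence score.
results = model("field_image.jpg")  # hypothetical field image

for result in results:
    for box in result.boxes:
        cls_name = result.names[int(box.cls)]      # predicted pest class
        confidence = float(box.conf)               # prediction confidence
        x1, y1, x2, y2 = box.xyxy[0].tolist()      # bounding box in pixels
        print(f"{cls_name} ({confidence:.2f}) at "
              f"({x1:.0f}, {y1:.0f}, {x2:.0f}, {y2:.0f})")
```

Because one forward pass yields both localization and class labels, per-plant treatment decisions can be driven by a single lightweight model rather than a separate detector and classifier, which is what keeps latency low on phones and embedded hardware.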