Integrating Real-ESRGAN with CNN Models for UAV Image Based Plant Disease Detection
Sravya Malladi, Pranav Kulkarni
DOI: http://dx.doi.org/10.15439/2025F2775
Citation: Position Papers of the 20th Conference on Computer Science and Intelligence Systems, M. Bolanowski, M. Ganzha, L. Maciaszek, M. Paprzycki, D. Ślęzak (eds). ACSIS, Vol. 44, pages 69–73 (2025)
Abstract. The integration of deep learning models with UAV-captured imagery for plant disease detection has been explored in many papers and has the potential to revolutionize commercial precision agriculture by allowing early and efficient detection and classification of crop disease stages. To address the limitations posed by low-resolution aerial imaging, this paper proposes the additional integration of an Enhanced Super-Resolution Generative Adversarial Network (ESRGAN) with a convolutional neural network (CNN) model for field monitoring through UAV-captured imagery. UAVs are a cost-effective means of monitoring large swaths of agricultural land; however, it is difficult to capture images of high enough quality and clarity to be adequately analyzed by a CNN. The images typically lack the resolution needed for accurate classification, especially for diseases with smaller, less noticeable symptoms. The Real-ESRGAN model is employed to generate a dataset of high-resolution images from low-resolution inputs, allowing the disease detection CNN to more accurately and effectively identify and classify disease stages in Armillaria-afflicted cherry trees. This approach addresses the shortcomings of traditional UAV-based methods and enhances classification accuracy even in suboptimal conditions. Through this integrated approach, the model reached a higher validation accuracy and significantly lower loss values, as the ESRGAN-enhanced imagery allowed clearer detection of early-stage Armillaria symptoms. The integrated system provides a practical, scalable solution for commercial agriculture, enabling more comprehensive and efficient crop disease monitoring. Future research can optimize the model's architecture and expand its applicability to other crops and environmental conditions, enabling more efficient precision agriculture and paving the way for more sustainable farming practices.
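The two-stage pipeline the abstract describes (super-resolve each low-resolution UAV tile, then classify disease stage with a CNN) can be sketched as follows. This is a minimal illustration, not the authors' implementation: `upscale_4x` stands in for Real-ESRGAN inference (which would load pretrained weights rather than repeat pixels), `classify` stands in for the trained disease-stage CNN, and the stage labels are assumed for illustration.

```python
SCALE = 4  # Real-ESRGAN's typical upscaling factor
# Hypothetical disease-stage labels, assumed for this sketch:
CLASSES = ["healthy", "early", "intermediate", "advanced"]

def upscale_4x(tile):
    """Placeholder for Real-ESRGAN inference: nearest-neighbour 4x upsampling
    of a 2-D pixel grid. The real model reconstructs plausible high-frequency
    detail instead of merely repeating pixels."""
    return [[px for px in row for _ in range(SCALE)]
            for row in tile for _ in range(SCALE)]

def classify(image):
    """Placeholder for the trained disease-stage CNN: a trivial rule on mean
    intensity, used only to make the pipeline runnable end to end."""
    flat = [px for row in image for px in row]
    mean = sum(flat) / len(flat)
    return CLASSES[min(int(mean // 64), len(CLASSES) - 1)]

def detect(low_res_tile):
    """Full pipeline: super-resolve the UAV tile, then classify the result."""
    return classify(upscale_4x(low_res_tile))

tile = [[10, 20], [30, 40]]        # a tiny 2x2 "low-resolution" crop
enhanced = upscale_4x(tile)        # becomes an 8x8 grid
print(len(enhanced), len(enhanced[0]))  # 8 8
print(detect(tile))
```

In the real system, the super-resolution step would run Real-ESRGAN with pretrained weights over each aerial image before it ever reaches the classifier, so the CNN only sees enhanced inputs at both training and inference time.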
References
- C. H. Bock, G. H. Poole, P. E. Parker, and T. R. Gottwald, “Plant disease severity estimated visually, by digital photography and image analysis, and by hyperspectral imaging,” Crit. Rev. Plant Sci., vol. 29, no. 2, pp. 59–107, 2010. [Online]. Available: https://doi.org/10.1080/07352681003617285
- J. G. A. Barbedo, “Factors influencing the use of deep learning for plant disease recognition,” Biosyst. Eng., vol. 172, pp. 84–91, 2018. [Online]. Available: https://doi.org/10.1016/j.biosystemseng.2018.05.013
- K. P. Ferentinos, “Deep learning models for plant disease detection and diagnosis,” Comput. Electron. Agric., vol. 145, pp. 311–318, 2018. [Online]. Available: https://doi.org/10.1016/j.compag.2018.01.009
- X. Wang et al., “ESRGAN: Enhanced super-resolution generative adversarial networks,” in Proc. Eur. Conf. Comput. Vis. Workshops, 2018, pp. 63–79. [Online]. Available: https://doi.org/10.1007/978-3-030-11021-5_5
- X. Wang, L. Xie, C. Dong, and Y. Shan, “Real-ESRGAN: Training real-world blind super-resolution with pure synthetic data,” in Proc. IEEE/CVF Int. Conf. Comput. Vis. Workshops, 2021, pp. 1905–1914. [Online]. Available: https://doi.org/10.1109/ICCVW54120.2021.00217
- X. Zeng and Y. Ma, “GANs-based data augmentation for citrus disease severity detection using deep learning,” Agronomy, vol. 10, no. 12, p. 1939, 2020. [Online]. Available: https://www.mdpi.com/2073-4395/10/12/1939
- C. Chaschatzis et al., “Detection and characterization of stressed sweet cherry tissues using machine learning,” Remote Sens., vol. 12, no. 3, p. 531, 2020. [Online]. Available: https://www.mdpi.com/2072-4292/12/3/531
- S. Zhang and J. M. Kovacs, “The application of small unmanned aerial systems for precision agriculture: A review,” Precision Agric., vol. 13, pp. 693–712, 2012. [Online]. Available: https://link.springer.com/article/10.1007/s11119-012-9274-5
- H. Zhu et al., “Intelligent agriculture: Deep learning in UAV-based remote sensing imagery for crop diseases and pests detection,” Front. Plant Sci., vol. 15, 2024. [Online]. Available: https://doi.org/10.3389/fpls.2024.1435016
- A. D. Boursianis et al., “Internet of Things (IoT) and agricultural unmanned aerial vehicles (UAVs) in smart farming: A comprehensive review,” Internet Things, vol. 18, p. 100187, 2022. [Online]. Available: https://doi.org/10.1016/j.iot.2020.100187
- J. Agrawal and M. Y. Arafat, “Transforming farming: A review of AI-powered UAV technologies in precision agriculture,” Drones, vol. 8, no. 11, p. 664, 2024. [Online]. Available: https://doi.org/10.3390/drones8110664
- O. Bongomin et al., “UAV image acquisition and processing for high-throughput phenotyping in agricultural research and breeding programs,” Plant Phenome J., vol. 7, no. 1, e20096, 2024. [Online]. Available: https://doi.org/10.1002/ppj2.20096
- S. A. Wahabzada et al., “Millimeter-level plant disease detection from aerial photographs via deep learning and crowdsourced data,” Front. Plant Sci., vol. 9, p. 1453, 2018. [Online]. Available: https://www.frontiersin.org/articles/10.3389/fpls.2018.01453/full
- K. Sathya and M. Rajalakshmi, “RDA-CNN: Enhanced Super Resolution Method for Rice Plant Disease Classification,” Comput. Syst. Sci. Eng., vol. 42, no. 1, pp. 33–47, Jul. 2022. [Online]. Available: https://www.techscience.com/csse/v42n1/45755/html
- A. ul Haq and S. Kaur, “Super resolution image based plant disease detection and classification using deep learning techniques,” Propuls. Tech. J., vol. 45, no. 1, pp. 1020–1022, 2024. [Online]. Available: https://www.propulsiontechjournal.com/index.php/journal/article/view/4108
- L. Bi and G. Hu, “Improving image-based plant disease classification with generative adversarial network under limited training set,” Front. Plant Sci., vol. 11, 2020. [Online]. Available: https://doi.org/10.3389/fpls.2020.583438
- J. Wen, Y. Shi, X. Zhou, and Y. Xue, “Crop disease classification on inadequate low-resolution target images,” Sensors, vol. 20, no. 16, p. 4601, 2020. [Online]. Available: https://doi.org/10.3390/s20164601
- Ş. B. Çetin, “Real-ESRGAN: A deep learning approach for general image restoration and its application to aerial images,” Advanced Remote Sensing, vol. 3, no. 2, pp. 90–99, 2023. [Online]. Available: https://publish.mersin.edu.tr/index.php/arsej/article/view/1072
- F. Rezapoor Nikroo et al., “A comparative analysis of SRGAN models,” arXiv preprint arXiv:2307.09456, 2023. [Online]. Available: https://arxiv.org/abs/2307.09456
- J. Redmon, S. Divvala, R. Girshick, and A. Farhadi, “You only look once: Unified, real-time object detection,” in Proc. IEEE Conf. Comput. Vis. Pattern Recognit., 2016, pp. 779–788. [Online]. Available: https://doi.org/10.1109/CVPR.2016.91