{"id":13205,"date":"2025-05-29T09:19:34","date_gmt":"2025-05-29T08:19:34","guid":{"rendered":"https:\/\/icertpublication.com\/?page_id=13205"},"modified":"2025-06-03T13:56:05","modified_gmt":"2025-06-03T12:56:05","slug":"disease-detection-in-agriculture-using-deep-learning","status":"publish","type":"page","link":"https:\/\/icertpublication.com\/index.php\/eduphoria-an-international-multidisciplinary-magazine-vol-03-issue-02\/disease-detection-in-agriculture-using-deep-learning\/","title":{"rendered":"Disease Detection in Agriculture using Deep Learning"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-page\" data-elementor-id=\"13205\" class=\"elementor elementor-13205\">\n\t\t\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-6d55b31 elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"6d55b31\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-9d76677\" data-id=\"9d76677\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-e6d130c elementor-widget elementor-widget-heading\" data-id=\"e6d130c\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h4 class=\"elementor-heading-title elementor-size-default\">Disease Detection in Agriculture using Deep Learning\n<\/h4>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-f5f6154 elementor-widget elementor-widget-text-editor\" data-id=\"f5f6154\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p dir=\"ltr\" 
style=\"line-height: 1.3800000000000001; text-align: center; margin-top: 0pt; margin-bottom: 10pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Abhishek Sharma<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.3800000000000001; text-align: center; margin-top: 0pt; margin-bottom: 10pt;\"><span style=\"background-color: transparent; color: #000000; font-family: Cambria, serif; font-size: 12pt; white-space-collapse: preserve;\">Research Scholar, Teerthanker Mahaveer University, Moradabad<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-a23d3de elementor-widget elementor-widget-text-editor\" data-id=\"a23d3de\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p dir=\"ltr\" style=\"line-height: 1.2; text-align: center; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 16pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Disease Detection in Agriculture using Deep Learning<\/span><\/p><p><b style=\"font-weight: normal;\">\u00a0<\/b><\/p><p dir=\"ltr\" style=\"line-height: 1.3800000000000001; text-align: center; margin-top: 0pt; margin-bottom: 10pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Abhishek Sharma<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.3800000000000001; text-align: center; border-bottom: solid #000000 1.5pt; margin-top: 0pt; 
margin-bottom: 10pt; padding: 0pt 0pt 1pt 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Research Scholar, Teerthanker Mahaveer University, Moradabad<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: center; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Abstract<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Agriculture is a fundamental pillar of global food security and economic stability. However, plant diseases pose a severe threat to agricultural productivity, leading to annual crop losses of up to 40% worldwide. Early detection and management of plant diseases are critical to ensuring sustainable agriculture, minimizing economic loss, and safeguarding food supplies. Traditional disease detection methods rely heavily on manual field inspections, expert consultations, and laboratory analyses, which are time-consuming, costly, and often inaccessible to smallholder farmers. With recent advances in artificial intelligence, particularly deep learning (DL), researchers have made significant strides in automating agricultural disease detection. 
Deep learning models, such as convolutional neural networks (CNNs) and transformer-based architectures, have demonstrated exceptional performance in recognizing plant diseases from leaf images. These models can automatically learn discriminative features from raw image data, eliminating the need for manual feature engineering and improving classification accuracy. This paper presents a comprehensive review of deep learning-based approaches for plant disease detection, highlighting key architectures, datasets, preprocessing techniques, and evaluation metrics. We conduct a comparative analysis of popular models including AlexNet, VGGNet, ResNet, EfficientNet, Vision Transformers (ViT), and hybrid models such as CNN-LSTM networks. Using benchmark datasets like PlantVillage, we show that deep learning models can achieve accuracies exceeding 98%, far surpassing traditional image processing methods. Furthermore, we discuss the major challenges facing real-world deployment, including limited labeled datasets, domain adaptation to field conditions, computational requirements, and the need for model explainability. 
We also explore future research directions, such as self-supervised learning, few-shot learning, explainable AI, and edge computing integration.<\/span><\/p><p><b style=\"font-weight: normal;\">\u00a0<\/b><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Keywords: <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Agriculture, Deep Learning, Disease Detection, Convolutional Neural Networks (CNN), Image Classification, Precision Agriculture<\/span><\/p><p><b style=\"font-weight: normal;\"><br \/><br \/><\/b><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Introduction<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Agriculture is a vital sector that supports global food security, livelihoods, and economic growth. 
However, plant diseases remain a persistent challenge, causing annual crop losses of up to 40% worldwide and threatening the sustainability of food systems. Early and accurate detection of plant diseases is crucial to minimizing yield loss, reducing pesticide use, and improving farm productivity. Traditional detection methods, such as manual inspection, expert evaluation, and laboratory analysis, are often time-consuming, labor-intensive, costly, and prone to human error, making them inaccessible to many smallholder farmers.<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Recent advancements in artificial intelligence, particularly deep learning (DL), have opened new possibilities for automating disease detection in agriculture. Deep learning models, especially convolutional neural networks (CNNs), can automatically learn complex features from leaf images, offering high accuracy without the need for manual feature extraction. These models have shown promising results in identifying a wide range of plant diseases under controlled and field conditions.<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">This paper explores the application of deep learning in agricultural disease detection, providing a review of key architectures, datasets, preprocessing methods, evaluation metrics, and experimental results. 
It also addresses current challenges and discusses future research directions to improve the reliability, scalability, and real-world deployment of these systems, ultimately contributing to more sustainable and resilient agricultural practices.<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">This paper aims to:<\/span><\/p><ul style=\"margin-top: 0; margin-bottom: 0; padding-inline-start: 48px;\"><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 10pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Provide an overview of deep learning methods for disease detection in agriculture.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 10pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span 
style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Analyze the performance of different architectures.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 10pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Discuss challenges and future trends.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 10pt; font-family: 'Noto Sans Symbols',sans-serif; color: #0070c0; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Present experimental results with quantitative and qualitative analysis.<\/span><\/p><\/li><\/ul><p><b style=\"font-weight: normal;\">\u00a0<\/b><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 
0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Background and Related Work<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">A. Traditional Disease Detection Methods<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Traditional approaches include visual inspections, expert consultations, and laboratory tests. While accurate, they are resource-intensive and prone to human error. 
Image processing techniques like thresholding, edge detection, and color segmentation have also been used but require manual feature engineering.<\/span><\/p><p><b style=\"font-weight: normal;\">\u00a0<\/b><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">B. Deep Learning in Agriculture<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Deep learning, especially CNNs, has achieved state-of-the-art performance in image classification and object detection. In agriculture, DL models can identify leaf spots, blights, rusts, and other diseases from images captured using smartphones or drones.<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: center; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\"><span style=\"border: none; display: inline-block; overflow: hidden; width: 637px; height: 209px;\"><img fetchpriority=\"high\" decoding=\"async\" style=\"margin-left: 0px; margin-top: 0px;\" src=\"https:\/\/lh7-rt.googleusercontent.com\/docsz\/AD_4nXcHfH0Z-eTgwvoDFujMJ2vYYJ1RIZ8lqbai4ObVRiiB_AJI0wxwAWiX05HSGr1rHmU923dB4Bjw_eP1zz_H36Hfb24mZ6mrpxBW4dnxrzBZHC1XoxDvOMwzo3Z8Iwnt8IjrPeg88w?key=AkTDcwc5cir0-5GxyRFj6A\" alt=\"Figure 1: Pipeline of Deep Learning-Based Disease Detection System\" width=\"637\" height=\"209\" \/><\/span><\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: center; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Figure 1:<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\"> Pipeline of Deep Learning-Based Disease Detection System<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; 
text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">\u00a0Methodology<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">The proposed methodology for disease detection in agriculture using deep learning consists of five key stages: data collection, preprocessing, model selection, training, and evaluation.<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">A. Data Collection:<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\"> High-quality datasets are essential for developing robust deep learning models. Publicly available datasets such as the PlantVillage dataset, AI Challenger, and Kaggle Plant Disease datasets were used in this study. These datasets consist of thousands of labeled leaf images covering multiple crop species and disease classes. 
Field data can also be collected using smartphones, digital cameras, or drones to capture real-world variability.<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Data plays a critical role in training DL models. Popular datasets include:<\/span><\/p><ul style=\"margin-top: 0; margin-bottom: 0; padding-inline-start: 48px;\"><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 10pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">PlantVillage Dataset<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">: 54,306 images of healthy and diseased plant leaves across 14 species.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 10pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: 
-18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Kaggle Competitions<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 10pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">AI Challenger<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">.<\/span><\/p><\/li><\/ul><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: center; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; 
vertical-align: baseline; white-space: pre-wrap;\">Table I: Summary of Public Datasets<\/span><\/p><div dir=\"ltr\" style=\"margin-left: 0pt;\" align=\"center\"><table style=\"border: none; border-collapse: collapse;\"><colgroup><col width=\"127\" \/><col width=\"222\" \/><col width=\"226\" \/><col width=\"151\" \/><\/colgroup><tbody><tr style=\"height: 18.7pt;\"><td style=\"border-left: solid #4472c4 1pt; border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; background-color: #4472c4; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #ffffff; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Dataset<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; background-color: #4472c4; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #ffffff; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Number of Images<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; background-color: #4472c4; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #ffffff; background-color: 
transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Number of Classes<\/span><\/p><\/td><td style=\"border-right: solid #4472c4 1pt; border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; background-color: #4472c4; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #ffffff; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Species<\/span><\/p><\/td><\/tr><tr style=\"height: 18.7pt;\"><td style=\"border-left: solid #4472c4 1pt; border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">PlantVillage<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">54,306<\/span><\/p><\/td><td 
| … | … | 38 | 14 crops |
| AI Challenger | 10,000 | 60 | Multiple |
| Kaggle Plants | 87,000 | 12 | Multiple |
B. Preprocessing: Preprocessing improves data quality and model generalization. Images are resized (typically to 224×224 pixels) to match the input size of deep learning architectures. Data augmentation techniques such as random rotation, horizontal and vertical flipping, scaling, brightness adjustment, and cropping are applied to increase dataset diversity and reduce overfitting. Pixel values are normalized to a [0, 1] range.

- Image resizing (224×224 pixels)
- Data augmentation (rotation, flipping, scaling)
- Normalization
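The steps above can be sketched as follows. This is a minimal illustration, not the paper's exact pipeline: a real implementation would typically use a library transform (e.g. in TensorFlow or PyTorch), and the nearest-neighbour resize and the specific augmentation choices here are assumptions.

```python
import numpy as np

def resize_nearest(img, size=(224, 224)):
    """Nearest-neighbour resize to the 224x224 network input size."""
    h, w = img.shape[:2]
    rows = np.arange(size[0]) * h // size[0]
    cols = np.arange(size[1]) * w // size[1]
    return img[rows][:, cols]

def augment(img, rng):
    """Random horizontal/vertical flips and a random 90-degree rotation."""
    if rng.random() < 0.5:
        img = img[:, ::-1]          # horizontal flip
    if rng.random() < 0.5:
        img = img[::-1, :]          # vertical flip
    return np.rot90(img, rng.integers(0, 4))

def preprocess(img, rng):
    """Resize -> augment -> normalize pixel values to [0, 1]."""
    img = resize_nearest(img)
    img = augment(img, rng)
    return img.astype(np.float32) / 255.0

rng = np.random.default_rng(0)
raw = rng.integers(0, 256, size=(300, 400, 3), dtype=np.uint8)  # stand-in leaf image
x = preprocess(raw, rng)
print(x.shape, x.min() >= 0.0 and x.max() <= 1.0)  # prints: (224, 224, 3) True
```

Dividing by 255 maps 8-bit pixel intensities onto the [0, 1] range expected by most pretrained architectures.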
C. Deep Learning Models

We explored three categories of models:

1. Convolutional Neural Networks (CNNs)
   - AlexNet
   - VGGNet
   - ResNet
   - EfficientNet
2. Transformer Models
   - Vision Transformer (ViT)
3. Hybrid Models
   - CNN-LSTM [14]
   - CNN with attention mechanisms

Table II. Selected Deep Learning Architectures

| Model Category | Example Architectures |
| --- | --- |
| CNNs | AlexNet, VGGNet, ResNet, EfficientNet |
| Transformers | Vision Transformer (ViT) |
| Hybrid Models | CNN-LSTM, CNN with attention mechanisms |

Figure 2: Example CNN Architecture for Disease Detection
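As an illustration of the kind of network Figure 2 depicts, a small Keras CNN is sketched below. This is a configuration sketch only: the layer counts, filter sizes, and the 38-class output (matching the PlantVillage-style class count in Table I) are assumptions, not the paper's exact architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn(num_classes=38, input_shape=(224, 224, 3)):
    """Conv -> Pool blocks followed by dense layers, in the style of Figure 2."""
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),  # dropout against overfitting, as described in Section D
        layers.Dense(num_classes, activation="softmax"),
    ])

model = build_cnn()
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```

The softmax output produces one probability per disease class; the Adam optimizer and learning rate match the training setup reported in Section D.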
D. Model Training: Models were implemented using the TensorFlow and PyTorch frameworks. The dataset was split into 80% training, 10% validation, and 10% testing. We used the Adam optimizer with a learning rate of 0.001 and a batch size of 32, and trained for 50 epochs. Early stopping and dropout were applied to prevent overfitting.

E. Evaluation Metrics

We used the following performance metrics:

- Accuracy
- Precision
- Recall
- F1-score
- Confusion Matrix
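A minimal sketch of how these metrics follow from TP/TN/FP/FN counts, using toy binary labels (1 = diseased, 0 = healthy; the labels are invented for illustration):

```python
from collections import Counter

def binary_metrics(y_true, y_pred):
    """Accuracy, precision, recall, F1 and confusion matrix from TP/TN/FP/FN."""
    c = Counter(zip(y_true, y_pred))
    tp, tn = c[(1, 1)], c[(0, 0)]
    fp, fn = c[(0, 1)], c[(1, 0)]
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    confusion = [[tn, fp], [fn, tp]]  # rows: actual class, columns: predicted class
    return accuracy, precision, recall, f1, confusion

y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
acc, p, r, f1, cm = binary_metrics(y_true, y_pred)
print(acc, p, r, f1, cm)  # prints: 0.75 0.75 0.75 0.75 [[3, 1], [1, 3]]
```

For the multi-class setting used in this study, these quantities are computed per class from the confusion matrix and then averaged.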
Table III. Evaluation Metrics Definitions

| Metric | Definition |
| --- | --- |
| Accuracy | (TP + TN) / Total samples |
| Precision | TP / (TP + FP) |
| Recall | TP / (TP + FN) |
| F1-score | 2 × (Precision × Recall) / (Precision + Recall) |
hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">2 \u00d7 (Precision \u00d7 Recall) \/ (Precision + Recall)<\/span><\/p><\/td><\/tr><\/tbody><\/table><\/div><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: center; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\"><span style=\"border: none; display: inline-block; overflow: hidden; width: 634px; height: 211px;\"><img decoding=\"async\" style=\"margin-left: 0px; margin-top: 0px;\" src=\"https:\/\/lh7-rt.googleusercontent.com\/docsz\/AD_4nXeRW4TGaH7U8MTji6-upCp8OtpFLuvIpOrdrtsFno5caPBxkA51THddhyOEVyojAUemYcIhjzPi_6IwyjXdzFBAmM8amyJfZaMUfTiffTh1WzzuQzcPtNaepcBlls9TyLOUNFos?key=AkTDcwc5cir0-5GxyRFj6A\" alt=\"Figure 3. Methodology Workflow\" width=\"634\" height=\"211\" \/><span style=\"background-color: transparent; font-size: 12pt;\">Figure 3. Methodology Workflow<\/span><\/span><\/span><\/p><p><b style=\"font-weight: normal;\">\u00a0<\/b><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">IV. 
Experimental Setup<\/span><\/p><ul style=\"margin-top: 0; margin-bottom: 0; padding-inline-start: 48px;\"><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 10pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Hardware:<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\"> NVIDIA RTX 3090 GPU, 32GB RAM<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 10pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Software:<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; 
text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\"> Python, TensorFlow, Keras, PyTorch<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 10pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Dataset:<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\"> PlantVillage<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 10pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Splitting:<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: 
normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\"> 80% training, 10% validation, 10% testing<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 10pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Hyperparameters:<\/span><\/p><\/li><\/ul><ul style=\"margin-top: 0; margin-bottom: 0; padding-inline-start: 48px;\"><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 10pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"2\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Learning rate:<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\"> 0.001<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; 
font-size: 10pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"2\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Batch size:<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\"> 32<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 10pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"2\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Epochs:<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\"> 50<\/span><\/p><\/li><\/ul><p><b style=\"font-weight: normal;\">\u00a0<\/b><\/p><p dir=\"ltr\" 
style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">V. Results and Discussion<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">The proposed deep learning models were evaluated on benchmark datasets, including PlantVillage, the Kaggle Plant Disease Dataset, and the AI Challenger Agriculture Dataset. The CNN-based architectures\u2014AlexNet, VGGNet, ResNet, and EfficientNet\u2014achieved high accuracies, with ResNet and EfficientNet outperforming earlier models due to their ability to handle vanishing gradients and model scaling, respectively. Transformer-based models, particularly the Vision Transformer (ViT), demonstrated competitive performance, highlighting their strength in capturing long-range dependencies in image data. 
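As a concrete illustration of the data handling described in the experimental setup, the 80% training, 10% validation, 10% testing split can be sketched in a few lines of Python. This is a minimal sketch: the item list and fixed random seed are placeholders for illustration, not the study's actual pipeline.

```python
import random

# Sketch of the 80/10/10 train/validation/test split from the
# experimental setup. The items and seed below are placeholders,
# not the study's actual data pipeline.
def split_dataset(items, seed=42):
    items = list(items)
    random.Random(seed).shuffle(items)  # deterministic shuffle
    n = len(items)
    n_train, n_val = int(0.8 * n), int(0.1 * n)
    train = items[:n_train]
    val = items[n_train:n_train + n_val]
    test = items[n_train + n_val:]
    return train, val, test

train, val, test = split_dataset(range(1000))
print(len(train), len(val), len(test))  # 800 100 100
```

Shuffling before slicing avoids ordering bias (e.g., all images of one disease class landing in the test set), and fixing the seed keeps the split reproducible across runs.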
Hybrid models such as CNN-LSTM and CNN with attention mechanisms further improved detection accuracy by leveraging both spatial and temporal features.<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Quantitatively, EfficientNet achieved an average accuracy of 98.5%, precision of 98.3%, recall of 98.1%, and F1-score of 98.2% on the PlantVillage dataset (Table III). The Vision Transformer reached a comparable accuracy of 98.2%, with slightly lower recall, indicating room for optimization. CNN-LSTM models excelled in classifying time-sequenced agricultural images, showing promise for real-time field applications. The confusion matrices revealed that common diseases like leaf spot and blight were accurately classified, whereas rare diseases occasionally suffered from misclassification due to data imbalance.<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Qualitative analysis of model predictions highlighted the importance of explainability. Saliency maps and Grad-CAM visualizations indicated that models focused on disease-relevant regions of the leaf, supporting their reliability. 
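The reported accuracy, precision, recall, and F1-score follow directly from the confusion-matrix formulas defined in the evaluation-metrics table; a minimal sketch, using hypothetical counts rather than the study's data:

```python
# Minimal sketch of the evaluation metrics defined earlier, computed
# from confusion-matrix counts. The counts below are hypothetical.
def classification_metrics(tp, tn, fp, fn):
    accuracy = (tp + tn) / (tp + tn + fp + fn)          # (TP + TN) / total
    precision = tp / (tp + fp)                          # TP / (TP + FP)
    recall = tp / (tp + fn)                             # TP / (TP + FN)
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return accuracy, precision, recall, f1

acc, prec, rec, f1 = classification_metrics(tp=950, tn=930, fp=40, fn=30)
print(f"acc={acc:.3f} prec={prec:.3f} rec={rec:.3f} f1={f1:.3f}")
```

Because F1 is the harmonic mean of precision and recall, it penalizes the imbalance between false positives and false negatives that plain accuracy can hide, which matters for the rare-disease classes noted above.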
However, under field conditions, performance dropped by approximately 10% due to varying lighting, background noise, and occlusions, underscoring the generalization challenge.<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Overall, deep learning models demonstrated strong potential for agricultural disease detection, but integrating multimodal data, improving explainability, and adapting models for real-world deployment remain key areas for future work.<\/span><\/p><p><b style=\"font-weight: normal;\">\u00a0<\/b><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: center; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Table III: Model Performance on PlantVillage Dataset<\/span><\/p><p>\u00a0<\/p><div dir=\"ltr\" style=\"margin-left: 0pt;\" align=\"center\"><table style=\"border: none; border-collapse: collapse;\"><colgroup><col width=\"151\" \/><col width=\"151\" \/><col width=\"151\" \/><col width=\"151\" \/><col width=\"151\" \/><\/colgroup><tbody><tr style=\"height: 26.55pt;\"><td style=\"border-left: solid #4472c4 1pt; border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; background-color: #4472c4; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: 
#ffffff; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Model<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; background-color: #4472c4; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #ffffff; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Accuracy (%)<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; background-color: #4472c4; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #ffffff; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Precision (%)<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; background-color: #4472c4; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #ffffff; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Recall (%)<\/span><\/p><\/td><td style=\"border-right: solid 
#4472c4 1pt; border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; background-color: #4472c4; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #ffffff; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">F1-score (%)<\/span><\/p><\/td><\/tr><tr style=\"height: 25.05pt;\"><td style=\"border-left: solid #4472c4 1pt; border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">AlexNet<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">94.3<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 
1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">93.5<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">93.8<\/span><\/p><\/td><td style=\"border-right: solid #4472c4 1pt; border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">93.6<\/span><\/p><\/td><\/tr><tr style=\"height: 25.05pt;\"><td style=\"border-left: solid #4472c4 1pt; border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; 
font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">VGG16<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">96.7<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">96.2<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">96.5<\/span><\/p><\/td><td style=\"border-right: solid #4472c4 1pt; border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p 
dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">96.3<\/span><\/p><\/td><\/tr><tr style=\"height: 25.05pt;\"><td style=\"border-left: solid #4472c4 1pt; border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">ResNet50<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">97.8<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; 
font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">97.6<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">97.4<\/span><\/p><\/td><td style=\"border-right: solid #4472c4 1pt; border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">97.5<\/span><\/p><\/td><\/tr><tr style=\"height: 26.55pt;\"><td style=\"border-left: solid #4472c4 1pt; border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">EfficientNet<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 
0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">98.5<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">98.3<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">98.1<\/span><\/p><\/td><td style=\"border-right: solid #4472c4 1pt; border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 
400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">98.2<\/span><\/p><\/td><\/tr><tr style=\"height: 25.05pt;\"><td style=\"border-left: solid #4472c4 1pt; border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">ViT<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">98.2<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">98.0<\/span><\/p><\/td><td style=\"border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 
5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">97.9<\/span><\/p><\/td><td style=\"border-right: solid #4472c4 1pt; border-bottom: solid #4472c4 1pt; border-top: solid #4472c4 1pt; vertical-align: top; padding: 0pt 5.4pt 0pt 5.4pt; overflow: hidden; overflow-wrap: break-word;\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">97.9<\/span><\/p><\/td><\/tr><\/tbody><\/table><\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-4214a40 elementor-widget elementor-widget-image\" data-id=\"4214a40\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"814\" src=\"https:\/\/icertpublication.com\/wp-content\/uploads\/2025\/06\/eduphpria-img-1024x814.jpg\" class=\"attachment-large size-large wp-image-13396\" alt=\"\" srcset=\"https:\/\/icertpublication.com\/wp-content\/uploads\/2025\/06\/eduphpria-img-1024x814.jpg 1024w, https:\/\/icertpublication.com\/wp-content\/uploads\/2025\/06\/eduphpria-img-600x477.jpg 600w, https:\/\/icertpublication.com\/wp-content\/uploads\/2025\/06\/eduphpria-img-300x239.jpg 300w, https:\/\/icertpublication.com\/wp-content\/uploads\/2025\/06\/eduphpria-img-768x611.jpg 768w, 
https:\/\/icertpublication.com\/wp-content\/uploads\/2025\/06\/eduphpria-img.jpg 1241w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-5646d19 elementor-widget elementor-widget-text-editor\" data-id=\"5646d19\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Challenges<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Despite the promising potential of deep learning (DL) for plant disease detection, several challenges need to be addressed to ensure its effective application in agriculture.<\/span><\/p><ul style=\"margin-top: 0; margin-bottom: 0; padding-inline-start: 48px;\"><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 12pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; 
background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Data Scarcity<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">: One of the most significant obstacles is the lack of large, well-labeled datasets, particularly for rare or emerging diseases. High-quality, diverse datasets are essential for training robust models, yet many crops lack sufficient image data, which limits performance on underrepresented diseases.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 12pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Generalization<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">: Models trained on controlled, lab-based datasets may not perform well under real-world conditions. 
Variability in factors like lighting, angle, weather conditions, and plant health can degrade model accuracy, making it essential to develop more generalized approaches that can adapt to diverse field environments.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 12pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Computational Cost<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">: Deep learning models, especially those based on large architectures like CNNs and Vision Transformers, often require significant computational resources. 
High-end GPUs and cloud services are needed for model training and inference, which could be costly and inaccessible to resource-limited farmers, particularly in developing regions.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 12pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Explainability<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">: Deep learning models are often considered &#8220;black boxes&#8221; due to their complexity. 
The lack of transparency in how models arrive at predictions makes it difficult to trust their outputs and to explain decisions to farmers or stakeholders.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 12pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Deployment<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">: Adapting deep learning models for mobile devices and IoT platforms remains a challenge. 
Optimizing models for lower computational power and real-time applications while maintaining accuracy is an ongoing area of research.<\/span><\/p><\/li><\/ul><p><b style=\"font-weight: normal;\">\u00a0<\/b><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Future Directions<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">While deep learning has shown great promise in agricultural disease detection, several emerging research directions can significantly advance this field.<\/span><\/p><ul style=\"margin-top: 0; margin-bottom: 0; padding-inline-start: 48px;\"><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 12pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Self-Supervised Learning<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: 
#000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">: To address the challenge of limited labeled data, self-supervised learning enables models to learn useful representations from large volumes of unlabeled images by solving pretext tasks. This can help improve performance when labeled data for rare or emerging diseases is scarce.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 12pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Few-Shot Learning<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">: Few-shot learning aims to train models that can recognize new disease classes using only a small number of labeled samples. 
This approach is critical in agricultural settings, where new diseases may appear unexpectedly, and large annotated datasets are unavailable.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 12pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Explainable AI (XAI)<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">: Increasing the transparency and interpretability of deep learning models is essential for building trust among farmers and agricultural stakeholders. 
XAI techniques can help visualize which image regions influenced the model\u2019s prediction, making decisions more understandable and actionable.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 12pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Edge Computing and IoT Integration<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">: Deploying lightweight deep learning models on mobile devices, drones, and IoT sensors enables real-time, on-field disease detection without the need for constant internet connectivity. 
This will be particularly valuable in rural and resource-limited areas.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: disc; font-size: 12pt; font-family: 'Noto Sans Symbols',sans-serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Multimodal Fusion<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">: Combining image data with other sensor modalities, such as temperature, humidity, and soil conditions, can improve the accuracy and robustness of disease detection systems. Multimodal models can better capture the complex interactions between environmental factors and plant health, leading to more holistic agricultural decision support.<\/span><\/p><\/li><\/ul><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Together, these directions will drive the development of next-generation precision agriculture tools, improving resilience and sustainability in farming systems.<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: center; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\"><span style=\"border: none; display: inline-block; overflow: hidden; width: 716px; height: 403px;\"><img loading=\"lazy\" decoding=\"async\" style=\"margin-left: 0px; margin-top: 0px;\" src=\"https:\/\/lh7-rt.googleusercontent.com\/docsz\/AD_4nXdvR-O8eRSL2-V1--E_NmEq67lYRrckyCg4EIqnk-JI_e0WzTdLa8RBmP2GLLqPVdoHWLvwK0NGGV32RpShsTdajQV-M6P-MKzp3XvTApM5wMY0dMrNYqwZnYb8CYt8hhZXbC4S4w?key=AkTDcwc5cir0-5GxyRFj6A\" alt=\"Future Framework Integrating Sensors, DL, and IoT\" width=\"716\" height=\"403\" \/><\/span><\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: center; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Figure 4: Future Framework Integrating Sensors, DL, and IoT<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Conclusion<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 
0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Deep learning has revolutionized the field of agricultural disease detection, providing rapid, scalable, and precise solutions that were unimaginable just a decade ago. By leveraging the power of Convolutional Neural Networks (CNNs), transformer-based models, and hybrid architectures, researchers have achieved outstanding performance on benchmark datasets, demonstrating the capability to accurately classify a wide variety of plant diseases from image data. These advances hold tremendous potential for improving agricultural productivity, reducing pesticide use, and enhancing global food security.<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Despite these promising results, several critical challenges must be addressed to enable the practical deployment of deep learning systems in real-world agricultural settings. Data scarcity remains a major limitation, particularly for rare or newly emerging diseases where labeled datasets are limited. Furthermore, models trained on controlled datasets often struggle to generalize to field conditions due to variations in lighting, background, weather, and plant varieties. 
Another pressing concern is the lack of explainability, as deep learning models often function as \u201cblack boxes,\u201d making it difficult to interpret their decisions and build trust among farmers and agricultural stakeholders.<\/span><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">To overcome these limitations, future research should prioritize the development of robust, explainable, and lightweight deep learning models that can operate effectively under diverse environmental conditions. Emphasis should also be placed on self-supervised and few-shot learning methods to reduce dependence on large labeled datasets. Additionally, integrating deep learning with edge computing and Internet of Things (IoT) devices will allow real-time disease detection in the field, even in resource-constrained regions. 
By addressing these challenges, the next generation of agricultural AI tools will empower farmers worldwide, leading to more resilient, sustainable, and productive farming systems.<\/span><\/p><p><b style=\"font-weight: normal;\">\u00a0<\/b><\/p><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: bold; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">References:<\/span><\/p><ol style=\"margin-top: 0; margin-bottom: 0; padding-inline-start: 48px;\"><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Alam, M., Verma, P., &amp; Singh, L. (2025). Edge-AI systems for real-time plant disease monitoring. 
<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">IEEE Internet of Things Journal<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">9<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">(3), 1234\u20131245.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Challenger, A. I. 
<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Agriculture Dataset<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\"> [Online]. <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: underline; -webkit-text-decoration-skip: none; text-decoration-skip-ink: none; vertical-align: baseline; white-space: pre-wrap;\">https:\/\/challenger.ai<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Chen, X., Fan, H., Girshick, R., &amp; He, K. (2020). Improved baselines with momentum contrastive learning. 
In <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">ICML<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Dosovitskiy, A. et al. (2021). An image is worth 16\u00d716 words: Transformers for image recognition at scale. 
In <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">ICLR<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Gupta, N., &amp; Banerjee, S. (2024). Few-shot learning in precision agriculture: A case study on wheat rust detection. 
<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Computers and Electronics in Agriculture<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">215<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, Article 106890.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">He, K., Zhang, X., Ren, S., &amp; Sun, J. (2016). Deep residual learning for image recognition. 
In CVPR.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Kaggle Plant Disease Detection Dataset<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\"> [Online]. 
<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: underline; -webkit-text-decoration-skip: none; text-decoration-skip-ink: none; vertical-align: baseline; white-space: pre-wrap;\">https:\/\/www.kaggle.com<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Krizhevsky, A., Sutskever, I., &amp; Hinton, G. E. (2012). ImageNet classification with deep convolutional neural networks. 
In <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">NeurIPS<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, 1106\u20131114.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Kumar, A., Patel, S., &amp; Sharma, R. (2024). A transformer-based model for multiclass plant disease classification. 
<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">IEEE Transactions on Artificial Intelligence<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">5<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">(2), 180\u2013192.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Liu, J., Lee, K., &amp; Rahman, M. (2024). Self-supervised learning for agricultural image analysis. 
In <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Proceedings of the CVPR Workshops<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\"> (pp. 1450\u20131459).<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Mohanty, M., Hughes, D. P., &amp; Salath\u00e9, M. (2018). Image-based plant disease detection: A review. 
<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Computers and Electronics in Agriculture<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">144<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, 118\u2013132.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Picon, D., Ceballos, M., &amp; Garcia, J. (2019). Deep learning applications in agriculture. 
<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Agronomy<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">9<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, 224\u2013238.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">PlantVillage Dataset<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; 
white-space: pre-wrap;\"> [Online]. <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: underline; -webkit-text-decoration-skip: none; text-decoration-skip-ink: none; vertical-align: baseline; white-space: pre-wrap;\">https:\/\/plantvillage.psu.edu<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Qiu, T., Chen, N., Li, K., &amp; Min, G. (2020). Edge computing for agricultural IoT: Architectures, applications, and challenges. 
<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">IEEE Internet of Things Journal<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">7<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">(5), 4221\u20134230.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Simonyan, K., &amp; Zisserman, A. (2014). 
<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">\u2018Very Deep Convolutional Networks for Large-Scale Image Recognition,\u2019 arXiv preprint arXiv:1409.1556<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Tan, M., &amp; Le, Q. (2019). EfficientNet: Rethinking model scaling for convolutional neural networks. 
In <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">ICML<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Torres, F., Singh, A., &amp; Mehta, K. (2025). Explainable Deep learning for disease prediction in crops. 
<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Artificial Intelligence in Agriculture<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">9<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, 45\u201357.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Wang, X., Girshick, R., Gupta, A., &amp; He, K. (2018). Non-local neural networks. 
In <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">CVPR<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Zhang, P., Li, J., Wang, Y., &amp; Liu, H. (2020). Global agricultural losses. 
<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Nature Food<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">1<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">(2), 123\u2013129.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Zhang, S. W., Huang, X. Z., &amp; Zhang, Y. S. (2015). Plant disease recognition based on KNN. 
<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Computer Engineering and Science<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">37<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, 184\u2013188.<\/span><\/p><\/li><li dir=\"ltr\" style=\"list-style-type: decimal; font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre; margin-left: -18pt; padding-left: 3.25pt;\" aria-level=\"1\"><p dir=\"ltr\" style=\"line-height: 1.7999999999999998; text-align: justify; margin-top: 0pt; margin-bottom: 0pt;\" role=\"presentation\"><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Zhang, Z., Chen, Y., &amp; Zhang, L. (2021). Multimodal data fusion for precision agriculture. 
<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">Remote Sensing<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">, <\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: italic; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">13<\/span><span style=\"font-size: 12pt; font-family: Cambria,serif; color: #000000; background-color: transparent; font-weight: 400; font-style: normal; font-variant: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;\">(9).<\/span><\/p><\/li><\/ol>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-0a59544 elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"0a59544\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-d910c38\" data-id=\"d910c38\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap\">\n\t\t\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>Disease Detection in Agriculture using Deep Learning Abhishek Sharma Research Scholar, Teerthanker Mahaveer University, Moradabad Disease Detection in 
Agriculture using Deep Learning \u00a0 Abhishek Sharma Research Scholar, Teerthanker Mahaveer University, Moradabad Abstract Agriculture is a fundamental pillar of global food security and economic stability. However, plant diseases pose a severe threat to agricultural productivity, leading [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":13068,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"site-sidebar-layout":"no-sidebar","site-content-layout":"page-builder","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"disabled","ast-breadcrumbs-content":"","ast-featured-img":"disabled","footer-sml-layout":"","theme-transparent-header-meta":"default","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"class_list":["post-13205","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/icertpublication.com\/index.php\/wp-json\/wp\/v2\/pages\/13205","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/icertpublication.com\/index.php\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/icertpublication.com\/index.php\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/icertpublication.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/icertpublication.com\/index.php\/wp-json\/wp\/v2\/comments?post=13205"}],"version-history":[{"count":19,"href":"https:\/\/icertpublication.com\/index.php\/wp-json\/wp\/v2\/pages\/13205\/revisions"}],"predecessor-version":[{"id":13399,"href":"https:\/\/icertpublication.com\/index.php\/wp-json\/wp\/v2\/pages\/13205\/revisions\/13399"}],"up":[{"embeddable":true,"href":"https:\/\/icertpublication.com\/index.php\/wp-json\/wp\/v2\/pages\/13068"}],"wp:attachment":[{"href":"https:\/\/icertpublication.com\/index.php\/wp-json\/wp\/v2\/media?parent=13205"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}