Convolutional Neural Network for Automatic Identification of Plant Diseases with Limited Data.
crop disease classification
few-shot learning
metric learning
transfer learning
Journal
Plants (Basel, Switzerland)
ISSN: 2223-7747
Abbreviated title: Plants (Basel)
Country: Switzerland
NLM ID: 101596181
Publication information
Publication date: 24 Dec 2020
History:
received: 02 Dec 2020
revised: 19 Dec 2020
accepted: 21 Dec 2020
entrez: 30 Dec 2020
pubmed: 31 Dec 2020
medline: 31 Dec 2020
Status:
epublish
Abstract
Automated identification of plant diseases is very important for crop protection. Most automated approaches aim to build classification models from leaf or fruit images. These approaches usually require collecting and annotating many images, which is a difficult and costly process, especially for new or rare diseases. Therefore, in this study, we developed and evaluated several methods for identifying plant diseases from little data. Convolutional Neural Networks (CNNs) were used because of their strong transfer-learning ability. Three CNN architectures (ResNet18, ResNet34, and ResNet50) were used to build two baseline models, a Triplet network, and a deep adversarial metric learning (DAML) approach. These approaches were trained on a large source-domain dataset and then tuned to identify new diseases from few images, ranging from 5 to 50 images per disease. The proposed approaches were also evaluated on identifying the disease and plant species together, or identifying the disease alone regardless of the affected plant. The evaluation results demonstrated that a baseline model trained on a large set of source field images can be adapted to classify new diseases from a small number of images, and that it can also take advantage of a larger number of images when available. In addition, by comparing it with the metric learning methods, we found that the baseline model has better transferability when the source-domain images differ significantly from the target-domain images or were captured under different conditions. It achieved an accuracy of 99% when the shift from source domain to target domain was small and 81% when that shift was large, outperforming all other competitive approaches.
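The Triplet network mentioned in the abstract learns an embedding space in which images of the same disease lie closer together than images of different diseases. A minimal sketch of the standard triplet (hinge) loss such a network optimizes is shown below; the function names, the Euclidean distance choice, and the margin value are illustrative assumptions, not details taken from the paper.

```python
def euclidean(a, b):
    """Euclidean distance between two embedding vectors (plain Python lists)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Hinge-style triplet loss: push the positive (same disease) to be at
    least `margin` closer to the anchor than the negative (different disease).
    Returns 0 when the constraint is already satisfied."""
    return max(euclidean(anchor, positive) - euclidean(anchor, negative) + margin, 0.0)

# Toy 2-D embeddings. The negative is already far enough away, so the loss is 0:
print(triplet_loss([0.0, 0.0], [0.0, 1.0], [3.0, 0.0]))  # → 0.0
# Here the negative is too close: loss = 1.0 - 1.5 + 1.0 = 0.5
print(triplet_loss([0.0, 0.0], [0.0, 1.0], [0.0, 1.5]))  # → 0.5
```

In practice the embeddings would come from one of the ResNet backbones named in the abstract, and the loss would be minimized over many sampled (anchor, positive, negative) triplets during training.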
Identifiers
pubmed: 33374398
pii: plants10010028
doi: 10.3390/plants10010028
pmc: PMC7823428
Publication types
Journal Article
Languages
eng
Grants
Agency: Deputyship for Research & Innovation, Ministry of Education in Saudi Arabia
ID: IFT20139