Date of Award
Spring 2021
Degree Type
Open Access Dissertation
Degree Name
Management of Information Systems and Technology, PhD
Advisor/Supervisor/Committee Chair
Brian Hilton
Dissertation or Thesis Committee Member
Lorne Olfman
Dissertation or Thesis Committee Member
Zachary Dodds
Rights Information
© 2021 Joseph K Mbugua
Keywords
Deep Learning, Early Detection and Identification of Plant Disease, Multispectral Aerial Imagery, Plant and Crop Diseases, Spatiotemporal Analysis, Transfer Learning
Abstract
Production of food crops is hampered by the proliferation of crop diseases, which cause substantial harvest losses. Current crop-health monitoring programs involve the deployment of scouts and experts to detect and identify crop diseases through visual observation. These monitoring schemes are expensive and too slow to offer timely remedial recommendations for preventing the spread of these crop-damaging diseases. There is thus a need for cheaper and faster methods of identifying and monitoring crop diseases. Recent advances in deep learning have enabled the development of automatic and accurate image classification systems. These advances, coupled with the widespread availability of multispectral aerial imagery, provide a cost-effective means of developing crop-disease classification tools. However, deep learning models require large training datasets, which may be costly and difficult to obtain. Fortunately, models trained on one task can be repurposed for different tasks (with limited data) using transfer learning. The purpose of this research was to develop and implement an end-to-end deep learning framework for early detection and continuous monitoring of crop diseases using transfer learning and high-resolution, multispectral aerial imagery. In the first study, transfer learning was used to compare the performance of five pre-trained deep convolutional neural networks (VGG16, VGG19, ResNet50, Inception V3, and Xception) in classifying crop diseases for apples, grapes, and tomatoes. The results of the study show that the best performing crop-disease classification models were those trained on the VGG16 network, while those trained on the ResNet50 network performed worst. The other studies used transfer learning to compare how different three-band color combinations performed in training single- and multiple-crop classification models. The results of these studies show that models combining the red, near-infrared, and blue bands performed better than models trained on the traditional visible-spectrum combination of red, green, and blue, while the worst performing models were those combining the near-infrared, green, and blue bands. This research recommends that further studies be undertaken to determine the best band combinations for training single- and multi-label classification models for crops and plants and the diseases that affect them.
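The transfer-learning approach summarized above can be illustrated with a minimal sketch: a convolutional network pre-trained on a large dataset (here VGG16 on ImageNet) is frozen and a small classification head is trained on the limited crop-disease imagery. This is not the dissertation's actual code; it assumes TensorFlow/Keras, and the class count, image size, and dataset names are hypothetical placeholders.

# Illustrative transfer-learning sketch (hypothetical values, not the author's implementation)
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 10            # hypothetical number of crop-disease classes
IMG_SHAPE = (224, 224, 3)   # three-band input (e.g., red, near-infrared, blue)

# Load VGG16 pre-trained on ImageNet and freeze its convolutional base.
base = tf.keras.applications.VGG16(include_top=False, weights="imagenet",
                                   input_shape=IMG_SHAPE)
base.trainable = False

# Add a small classification head to be trained on the limited crop-disease data.
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # train_ds/val_ds are hypothetical datasets

Swapping the base network (VGG19, ResNet50, Inception V3, or Xception) or the three-band combination fed into IMG_SHAPE is how the comparisons described in the abstract could be set up under these assumptions.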
DOI
10.5642/cguetd/218
ISBN
9798738627163
Recommended Citation
Mbugua, Joseph Kimani. (2021). Deep Learning for Early Detection, Identification, and Spatiotemporal Monitoring of Plant Diseases Using Multispectral Aerial Imagery. CGU Theses & Dissertations, 218. https://scholarship.claremont.edu/cgu_etd/218. doi: 10.5642/cguetd/218