Multispectral imaging systems combined with deep learning classification models can be cost-effective tools for the early detection of apple scab (Venturia inaequalis) disease in commercial orchards. Near-infrared (NIR) imagery can reveal apple scab symptoms earlier, and with greater apparent severity, than visible-spectrum (RGB) imagery. Early apple scab diagnosis based on NIR imagery may therefore be automated using deep learning convolutional neural networks (CNNs). CNN models have previously been used to classify a range of apple diseases accurately but have primarily focused on late-stage rather than early-stage detection. This study fine-tunes CNN models to classify apple scab symptoms as they progress from the early to the late stages of infection, using a novel multispectral (RGB-NIR) time-series dataset created specifically for this purpose.
This novel multispectral dataset was used in conjunction with a large Apple Disease Identification (ADID) dataset compiled from publicly available, pre-existing disease datasets and containing 29,000 images of infection symptoms across six disease classes. Two CNN models, the lightweight MobileNetV2 and the heavyweight EfficientNetV2L, were fine-tuned and used to classify each disease class in a testing dataset, with performance assessed through metrics derived from confusion matrices. The models achieved scab-prediction accuracies of 97.13 % and 97.57 % for MobileNetV2 and EfficientNetV2L, respectively, on the ADID test data, but only 74.12 % and 78.91 % when applied to the multispectral dataset in isolation. These lower scores were attributed to a higher proportion of false-positive scab predictions in the multispectral dataset. Time-series analyses revealed that both models could detect apple scab infections earlier than the manual classification techniques, which accounted for many of the apparent false positives, and could accurately distinguish healthy from infected samples as early as 7 days post-inoculation in NIR imagery.
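To illustrate the fine-tuning and confusion-matrix evaluation described above, the sketch below adapts an ImageNet-pretrained MobileNetV2 to a six-class apple disease task in TensorFlow/Keras and computes overall accuracy from the resulting confusion matrix. This is a minimal, illustrative example rather than the study's actual pipeline: the directory names (adid/train, adid/test), image size, and training hyperparameters are assumptions, and only the model family and number of disease classes follow the text.

```python
# Minimal sketch (not the authors' code): fine-tuning a pretrained MobileNetV2
# for six-class apple disease classification and deriving overall accuracy
# from a confusion matrix. Paths, image size, and hyperparameters are assumed.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 6          # six disease classes, as in the ADID dataset
IMG_SIZE = (224, 224)    # assumed input resolution

# ImageNet-pretrained backbone, frozen for the initial fine-tuning stage.
backbone = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
backbone.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # scale pixels to [-1, 1]
    backbone,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Hypothetical image folders, one sub-directory per disease class.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "adid/train", image_size=IMG_SIZE, batch_size=32)
test_ds = tf.keras.utils.image_dataset_from_directory(
    "adid/test", image_size=IMG_SIZE, batch_size=32, shuffle=False)

model.fit(train_ds, epochs=5)

# Accuracy derived from the confusion matrix, as used to assess the models.
y_true = np.concatenate([y.numpy() for _, y in test_ds])
y_pred = np.argmax(model.predict(test_ds), axis=1)
cm = tf.math.confusion_matrix(y_true, y_pred, num_classes=NUM_CLASSES).numpy()
overall_accuracy = np.trace(cm) / cm.sum()
print(f"Overall accuracy: {overall_accuracy:.4f}")
```

The same confusion matrix can also yield the per-class precision and recall figures that distinguish genuine misclassifications from the early, manually unconfirmed scab detections discussed above; swapping the backbone for EfficientNetV2L would follow the same pattern with a larger input resolution.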