Funding: This work was financially supported by the National Natural Science Foundation of China (Grant Nos. 61601034 and 31871525). The authors acknowledge Kimberly Moravec, PhD, from Liwen Bianji, Edanz Editing China (www.liwenbianji.cn/ac), for editing the English text of a draft of this manuscript.
Abstract: Recognition and counting of greenhouse pests are important for monitoring and forecasting pest population dynamics. This study used image processing techniques to recognize and count whiteflies and thrips on a sticky trap located in a greenhouse environment. Digital images of the sticky traps were collected with an image-acquisition system under different greenhouse conditions. If a single color space is used, it is difficult to segment the small pests correctly because of the detrimental effects of non-uniform illumination in complex scenes. Therefore, a method was proposed that first segments the target pests in two color spaces, applying the Prewitt operator to the I component of the hue-saturation-intensity (HSI) color space and the Canny operator to the B component of the Lab color space. The segmentation results from the two color spaces were then combined, achieving 91.57% segmentation accuracy. Next, because different features contribute differently to the classification of pest species, multiple features (e.g., color and shape features) were extracted in different color spaces for each segmented pest region to improve recognition performance. Twenty decision trees were combined into a strong ensemble classifier with a majority voting mechanism, which obtained 95.73% recognition accuracy. The proposed method is a feasible and effective way to process greenhouse pest images, and the system accurately recognized and counted pests in sticky trap images captured under real greenhouse conditions.
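The two-color-space segmentation step described above can be sketched with OpenCV as follows. This is a minimal illustration rather than the authors' implementation: the I component is assumed to be the mean of the R, G, and B channels, and the Prewitt kernels, Canny thresholds, and morphological clean-up values are illustrative choices.

```python
import cv2
import numpy as np


def segment_pests(bgr_image):
    """Sketch of two-color-space segmentation (Prewitt on HSI-I, Canny on Lab-B)."""
    img = bgr_image.astype(np.float32)

    # I component of HSI, taken here as the mean of the R, G, B channels (assumption).
    b, g, r = cv2.split(img)
    intensity = ((r + g + b) / 3.0).astype(np.uint8)

    # Prewitt edges on the I component (OpenCV has no built-in Prewitt,
    # so the kernels are applied explicitly with filter2D).
    kx = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]], dtype=np.float32)
    ky = kx.T
    gx = cv2.filter2D(intensity, cv2.CV_32F, kx)
    gy = cv2.filter2D(intensity, cv2.CV_32F, ky)
    prewitt_mag = cv2.convertScaleAbs(cv2.magnitude(gx, gy))
    _, prewitt_edges = cv2.threshold(prewitt_mag, 0, 255,
                                     cv2.THRESH_BINARY | cv2.THRESH_OTSU)

    # Canny edges on the B component of the Lab color space (thresholds are assumed).
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB)
    b_channel = lab[:, :, 2]
    canny_edges = cv2.Canny(b_channel, 50, 150)

    # Combine the two edge maps and close small gaps so that each pest
    # forms a connected region that can be counted.
    combined = cv2.bitwise_or(prewitt_edges, canny_edges)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    mask = cv2.morphologyEx(combined, cv2.MORPH_CLOSE, kernel)
    num_labels, _ = cv2.connectedComponents(mask)
    return mask, num_labels - 1  # exclude the background label
```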
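The majority-voting ensemble of twenty decision trees can likewise be sketched with scikit-learn. Bootstrap resampling is used here only as an assumed way to diversify the trees; the feature vectors would hold the color and shape descriptors extracted from each segmented pest region.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def train_voting_ensemble(features, labels, n_trees=20, seed=0):
    """Train n_trees decision trees on bootstrap samples (assumed diversification)."""
    rng = np.random.default_rng(seed)
    n = len(labels)
    trees = []
    for _ in range(n_trees):
        idx = rng.integers(0, n, size=n)  # bootstrap sample of the training set
        trees.append(DecisionTreeClassifier().fit(features[idx], labels[idx]))
    return trees


def predict_majority(trees, features):
    """Each tree votes for a pest class; the most frequent label wins."""
    votes = np.stack([t.predict(features) for t in trees])  # (n_trees, n_samples)
    predictions = []
    for column in votes.T:
        values, counts = np.unique(column, return_counts=True)
        predictions.append(values[np.argmax(counts)])
    return np.array(predictions)
```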
Funding: This work was partially supported by the Research Clusters Program of Tokushima University and JSPS KAKENHI (No. 19K20345).
Abstract: Insect pest control is a significant factor in the yield of commercial crops, so a valid method for insect pest recognition is needed to avoid economic losses. In this paper, we proposed a feature fusion residual block for the insect pest recognition task. Based on the original residual block, we fused the feature map from a previous layer between the two 1×1 convolution layers of the residual branch to improve the capacity of the block. Furthermore, we explored the contribution of each residual group to model performance and found that adding residual blocks to the earlier residual groups improves performance significantly and strengthens the model's generalization. By stacking the feature fusion residual block, we constructed the Deep Feature Fusion Residual Network (DFF-ResNet). To demonstrate the validity and adaptability of our approach, we built it on two common residual networks (Pre-ResNet and the Wide Residual Network (WRN)) and validated these models on the Canadian Institute For Advanced Research (CIFAR) and Street View House Number (SVHN) benchmark datasets. The experimental results indicate that our models achieve lower test error than the baseline models. We then applied our models to insect pest recognition and validated them on the IP102 benchmark dataset. The results show that our models outperform the original ResNet and other state-of-the-art methods.
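A minimal PyTorch sketch of the feature fusion idea is given below, assuming a pre-activation bottleneck block and additive fusion of the earlier feature map between the two 1×1 convolutions; the paper's exact fusion operation, channel widths, and downsampling handling are not specified in the abstract, so they are assumptions here.

```python
import torch
import torch.nn as nn


class FeatureFusionResidualBlock(nn.Module):
    """Pre-activation bottleneck (1x1 -> 3x3 -> 1x1) with an assumed additive
    fusion of an earlier feature map between the two 1x1 convolutions."""

    def __init__(self, channels, bottleneck_channels):
        super().__init__()
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv1 = nn.Conv2d(channels, bottleneck_channels, kernel_size=1, bias=False)
        self.bn2 = nn.BatchNorm2d(bottleneck_channels)
        self.conv2 = nn.Conv2d(bottleneck_channels, bottleneck_channels,
                               kernel_size=3, padding=1, bias=False)
        self.bn3 = nn.BatchNorm2d(bottleneck_channels)
        self.conv3 = nn.Conv2d(bottleneck_channels, channels, kernel_size=1, bias=False)
        # Projects the earlier feature map (assumed to have `channels` channels and the
        # same spatial size) to the bottleneck width so it can be added in the branch.
        self.fuse = nn.Conv2d(channels, bottleneck_channels, kernel_size=1, bias=False)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x, earlier_feature):
        out = self.conv1(self.relu(self.bn1(x)))
        out = out + self.fuse(earlier_feature)   # feature fusion step (assumed additive)
        out = self.conv2(self.relu(self.bn2(out)))
        out = self.conv3(self.relu(self.bn3(out)))
        return x + out                            # identity shortcut


# Example usage: fuse the block's own input as the "earlier" feature.
block = FeatureFusionResidualBlock(channels=64, bottleneck_channels=16)
x = torch.randn(1, 64, 32, 32)
y = block(x, x)
```

Stacking such blocks within each residual group, with an earlier group's output carried along as `earlier_feature`, would give a DFF-ResNet-style network in spirit.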