Funding: supported by the National Key R&D Program of China (nos. 2021YFD2000105 and 2022YFE0116200), the Jiangsu Funding Program for Excellent Postdoctoral Talent (no. 2022ZB349), the Young Scientists Fund of the Natural Science Foundation of Jiangsu Province, China (no. BK20210411), the Young Scientists Fund of the National Natural Science Foundation of China (no. 42201437), the Fundamental Research Funds for the Central Universities of Ministry of Education of China (no. KYCXJC2022005), and the Project of Seed Industry Revitalization in Jiangsu Province, China (JBGS[2021]007).
Abstract: The number of leaves at a given time is important to characterize plant growth and development. In this work, we developed a high-throughput method to count the number of leaves by detecting leaf tips in RGB images. The digital plant phenotyping platform was used to simulate a large and diverse dataset of RGB images and corresponding leaf tip labels of wheat plants at seedling stages (150,000 images with over 2 million labels). The realism of the images was then improved using domain adaptation methods before training deep learning models. The results demonstrate the efficiency of the proposed method when evaluated on a diverse test dataset collected from 5 countries under different environments, growth stages, and lighting conditions with different cameras (450 images with 2,162 labels). Among the 6 combinations of deep learning models and domain adaptation techniques, the Faster R-CNN model with the cycle-consistent generative adversarial network (CycleGAN) adaptation technique provided the best performance (R² = 0.94, root mean square error = 8.7). Complementary studies show that it is essential to simulate images with sufficient realism (background, leaf texture, and lighting conditions) before applying domain adaptation techniques. Furthermore, the spatial resolution should be better than 0.6 mm per pixel to identify leaf tips. The method is self-supervised in the sense that no manual labeling is required for model training. The self-supervised phenotyping approach developed here offers great potential for addressing a wide range of plant phenotyping problems. The trained networks are available at https://github.com/YinglunLi/Wheat-leaf-tip-detection.
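Because the counting pipeline and the reported metrics follow a standard object-detection recipe, the minimal Python sketch below illustrates the idea: detect leaf tips with a Faster R-CNN detector, take the number of detections above a score threshold as the leaf count, and compare predicted and observed counts with R² and RMSE. This is not the authors' released code; the torchvision backbone, checkpoint file name, score threshold, and single "leaf tip" class are assumptions made for illustration.

```python
# Sketch only: count leaf tips with a generic torchvision Faster R-CNN and
# compute R^2 / RMSE between predicted and observed leaf counts.
import numpy as np
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

def count_leaf_tips(model, image_path, score_thresh=0.5, device="cpu"):
    """Run the detector on one RGB image and count boxes above the score threshold."""
    image = to_tensor(Image.open(image_path).convert("RGB")).to(device)
    model.eval()
    with torch.no_grad():
        pred = model([image])[0]          # dict with "boxes", "labels", "scores"
    return int((pred["scores"] > score_thresh).sum().item())

def r2_and_rmse(observed, predicted):
    """Coefficient of determination and root mean square error for leaf counts."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    residuals = observed - predicted
    rmse = float(np.sqrt(np.mean(residuals ** 2)))
    ss_res = float(np.sum(residuals ** 2))
    ss_tot = float(np.sum((observed - observed.mean()) ** 2))
    return 1.0 - ss_res / ss_tot, rmse

if __name__ == "__main__":
    # Hypothetical checkpoint: Faster R-CNN with 2 classes (background, leaf tip).
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(
        weights=None, num_classes=2
    )
    model.load_state_dict(torch.load("leaf_tip_fasterrcnn.pth", map_location="cpu"))
    counts = [count_leaf_tips(model, p) for p in ["plot_001.jpg", "plot_002.jpg"]]
    print(counts)
```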
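The 0.6 mm per pixel requirement can be checked for a given camera setup from the ground sampling distance (pixel pitch × object distance / focal length). The small sketch below is my own illustration with hypothetical camera parameters, not values taken from the paper.

```python
# Sketch only: check whether an imaging setup meets the <= 0.6 mm/pixel resolution
# recommended for leaf-tip detection, using the ground sampling distance formula.
def ground_sampling_distance_mm(pixel_pitch_um: float,
                                focal_length_mm: float,
                                distance_m: float) -> float:
    """GSD (mm/pixel) = pixel pitch * object distance / focal length."""
    pixel_pitch_mm = pixel_pitch_um / 1000.0
    distance_mm = distance_m * 1000.0
    return pixel_pitch_mm * distance_mm / focal_length_mm

if __name__ == "__main__":
    # e.g. a 4.5 um pixel pitch, 35 mm lens, camera held 3 m above the canopy
    gsd = ground_sampling_distance_mm(4.5, 35.0, 3.0)
    verdict = "OK" if gsd <= 0.6 else "too coarse"
    print(f"GSD = {gsd:.2f} mm/pixel -> {verdict} for leaf tips")
```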