Knowledge Distillation and Student-Teacher Learning for Weed Detection in Turf
Published online by Cambridge University Press: 29 October 2024
Abstract
Machine vision-based herbicide application relying on deep convolutional neural networks (DCNNs) for object detection or image classification demands substantial memory and computational resources, resulting in lengthy inference times. To address these challenges, this study assessed the effectiveness of three teacher models, each trained on datasets of varying sizes, including D-20k (comprising 10,000 true-positive and 10,000 true-negative images) and D-10k (comprising 5,000 true-positive and 5,000 true-negative images). Knowledge distillation was then performed to train their corresponding student models across a range of temperature settings. After student-teacher learning, the parameter counts of all student models were reduced. ResNet18 not only achieved higher accuracy (ACC ≥ 0.989) but also maintained a higher frame rate (FPS ≥ 742.9) at its optimal temperature (T = 1). Overall, the results suggest that applying knowledge distillation to machine vision models enables accurate and reliable weed detection in turf while reducing the demand for computational resources, thereby facilitating real-time weed detection and contributing to the development of smart, machine vision-based sprayers.
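As a point of reference for the temperature-based knowledge distillation the abstract describes, the sketch below shows a standard Hinton-style distillation loss in PyTorch. It is a minimal illustration under assumed conventions (loss weighting `alpha`, function and variable names are hypothetical), not the authors' exact implementation.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=1.0, alpha=0.5):
    """Combine a temperature-softened KL term with the usual cross-entropy term.

    T is the distillation temperature; alpha balances the soft (teacher) and
    hard (label) supervision. Both are illustrative hyperparameters.
    """
    # Soft targets: teacher class probabilities at temperature T.
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    soft_student = F.log_softmax(student_logits / T, dim=1)
    # KL divergence between softened distributions, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    kd_term = F.kl_div(soft_student, soft_targets, reduction="batchmean") * (T * T)
    # Standard supervised loss on the true-positive / true-negative labels.
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term
```

In this setup, a compact student (e.g., ResNet18) is trained against the logits of a larger teacher, which is why the distilled students retain accuracy while using fewer parameters and achieving higher inference speed.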
Type: Research Article
Copyright: © Weed Science Society of America 2024
Footnotes
Danlan Zhai and Teng Liu contributed equally to this work.