
A Method Based on Knowledge Distillation for Fish School Stress State Recognition in Intensive Aquaculture (Cited by: 1)

Abstract: Fish behavior analysis for recognizing stress is very important for fish welfare and production management in aquaculture. Recent advances have been made in fish behavior analysis based on deep learning. However, most existing methods with top performance rely on considerable memory and computational resources, which is impractical in real-world scenarios. To overcome these limitations, a new method based on knowledge distillation is proposed to identify the stress states of fish schools. The knowledge distillation architecture transfers additional inter-class information via a mixed relative loss function, forcing a lightweight network (GhostNet) to mimic the soft probability outputs of a well-trained fish stress state recognition network (ResNeXt101). With this method, the accuracy of the fish school stress state recognition model improves from 94.17% to 98.12%. The proposed model has about 5.18 M parameters and requires 0.15 G FLOPs (floating-point operations) to process an image of size 224×224. Furthermore, fish behavior images were collected in a land-based factory, and a dataset was constructed and extended through flip, rotation, and color jitter augmentation techniques. The proposed method is also compared with other state-of-the-art methods. The experimental results show that the proposed model is better suited for deployment on resource-constrained devices or real-time applications, facilitating real-time monitoring of fish behavior.
Source: Computer Modeling in Engineering & Sciences (SCIE, EI), 2022, No. 6, pp. 1315–1335 (21 pages).
Funding: Supported by the National Natural Science Foundation of China, "Analysis and feature recognition on feeding behavior of fish school in facility farming based on machine vision" (No. 62076244), and the National Key R&D Program of China, "Next generation precision aquaculture: R&D on intelligent measurement, control and equipment technologies" (Grant No. 2017YFE0122100).
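
The abstract describes response-based knowledge distillation: a lightweight GhostNet student is trained to mimic the soft probability outputs of a well-trained ResNeXt101 teacher. The sketch below shows a generic distillation loss of this kind in PyTorch; it is not the paper's exact mixed relative loss, and the temperature T, the weighting alpha, and the usage comments are illustrative assumptions.

# Minimal PyTorch sketch of response-based knowledge distillation, used here as a
# stand-in for the paper's mixed relative loss (the exact formulation is the authors' own).
# T (softmax temperature) and alpha (soft/hard weighting) are assumed values.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of soft-target KL divergence and hard-label cross-entropy."""
    # Soften both distributions with temperature T so the inter-class similarity
    # information in the teacher's output is preserved for the student.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so the gradient stays comparable to the hard-label term
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Usage sketch for one training step (teacher frozen, student trainable):
#   teacher.eval()
#   with torch.no_grad():
#       t_logits = teacher(images)   # e.g., ResNeXt101 soft targets
#   s_logits = student(images)       # e.g., GhostNet predictions
#   loss = distillation_loss(s_logits, t_logits, labels)
#   loss.backward(); optimizer.step()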