Fish fry body length measurement with improved YOLOv8n-pose and biomechanics
Abstract
Accurate measurement of fish fry body length is crucial in biomechanical research and the development of intelligent aquaculture, as body length directly reflects the growth, locomotion, and ecological adaptability of fish. Traditional manual methods are time-consuming, labor-intensive, and may harm the fry. Accurate, rapid, and non-destructive measurement of large quantities of fish fry is therefore highly important in aquaculture. This study used 20–100 mm grass carp fry (Ctenopharyngodon idella) as test subjects. An image acquisition platform was developed to obtain RGB-D data from a top view of the fry. We propose ROS-YOLO, which replaces the original C2f module of YOLOv8n-Pose with reparameterized convolution-based shuffle one-shot aggregation (RCS-OSA) and introduces a simple attention module (SimAM) into the main feature extraction layer, to detect key body length points of fish fry. Depth information for the 3D keypoint coordinate transformation was obtained from the depth map. Additionally, biomechanical principles were incorporated to study the movement patterns, muscle activity, and hydrodynamic efficiency of fish fry. High-speed cameras and motion tracking software were used to analyze swimming kinematics and dynamics, while biomechanical modeling was employed to simulate the effects of water flow on growth and development. Finally, fish fry body lengths were calculated from the keypoint coordinates. In experiments, ROS-YOLO achieved an average keypoint detection accuracy of 99.2%, with 3.97 M parameters and 125 FPS. Compared with manual measurements, the overall average error of the automatic measurements was 2.87 mm (5.85%). The proposed method therefore meets real-time measurement requirements for fish fry body length and provides insights into the biomechanics of fish fry growth and movement.
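The measurement pipeline described above (2D keypoints from ROS-YOLO, depth values from the aligned depth map, then a 3D coordinate transformation and length calculation) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the pinhole intrinsics (fx, fy, cx, cy), keypoint positions, and depth values are all hypothetical placeholders.

```python
import numpy as np

def keypoints_to_3d(keypoints_px, depths_mm, fx, fy, cx, cy):
    """Back-project 2D pixel keypoints (u, v) into camera-frame 3D
    coordinates using per-keypoint depth and pinhole intrinsics.
    Intrinsics here are illustrative, not from the paper."""
    points = []
    for (u, v), z in zip(keypoints_px, depths_mm):
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        points.append((x, y, z))
    return np.asarray(points)

def body_length(points_3d):
    """Approximate body length as the summed Euclidean distance
    between consecutive keypoints (e.g. snout -> mid-body -> tail),
    which follows the body's curvature better than a single
    snout-to-tail segment."""
    segments = np.diff(points_3d, axis=0)
    return float(np.sum(np.linalg.norm(segments, axis=1)))

# Hypothetical example: three collinear keypoints at 300 mm depth.
kps = [(320.0, 320.0), (380.0, 320.0), (440.0, 320.0)]
pts = keypoints_to_3d(kps, [300.0, 300.0, 300.0],
                      fx=600.0, fy=600.0, cx=320.0, cy=320.0)
print(body_length(pts))  # -> 60.0 (mm)
```

Summing consecutive segment lengths rather than taking a single endpoint-to-endpoint distance matters for fry, whose bodies are rarely perfectly straight in a top-view frame.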
Copyright (c) 2025 Author(s)

This work is licensed under a Creative Commons Attribution 4.0 International License.
Copyright on all articles published in this journal is retained by the author(s), while the author(s) grant the publisher, as the original publisher, the right to publish the article.
Articles published in this journal are licensed under a Creative Commons Attribution 4.0 International License, which means they can be shared, adapted, and distributed provided that the original published version is cited.