Badminton striking motion recognition based on dense trajectory algorithm
Abstract
With growing attention to health and quality of life, more and more people exercise through sports such as badminton, which suits all ages. As artificial intelligence and robotics enter daily life, the badminton robot has emerged as an intelligent robot that can play against a human in real time. Such a robot must not only capture and track a shuttlecock moving at high speed, but also drive a high-speed motion control system to complete the striking action accurately. The main research content of this paper is therefore the capture, recognition, and analysis of badminton players' striking actions based on the vision system of a badminton robot. The main research work is as follows. First, a method is proposed for extracting video segments of striking actions according to the flying direction and position of the shuttlecock, and a dataset containing eight common striking actions is constructed. Second, the dense trajectory algorithm is improved to recognize striking actions more effectively. Experiments show that extracting the trajectories of feature points only within a small region around the player both reduces the complexity of the dense trajectory algorithm and enhances its robustness. Experiments further verify the effectiveness of non-fixed-length trajectories, which also improve the recognition rate of badminton striking actions.
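The two ideas highlighted in the abstract, dense sampling restricted to a small region around the player and trajectories that are allowed to end early rather than run for a fixed number of frames, can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the paper's implementation: the function name, the ROI convention, and the `max_disp` termination threshold are all illustrative choices, and per-pixel flow fields are taken as given input (in practice they would come from an optical flow estimator).

```python
import numpy as np

def track_dense_trajectories(flows, roi, step=4, max_disp=5.0):
    """Track densely sampled points through a sequence of flow fields.

    flows: list of (H, W, 2) arrays giving per-pixel (dx, dy) motion
           between consecutive frames.
    roi:   (x0, y0, x1, y1) player bounding box; points are sampled and
           kept only inside this region, which shrinks the search space.
    Returns a list of trajectories, each a list of (x, y) points. The
    lengths may differ: a trajectory ends early when its point drifts
    out of the ROI or its per-frame displacement exceeds max_disp.
    """
    x0, y0, x1, y1 = roi
    # Dense sampling on a regular grid restricted to the player ROI.
    xs, ys = np.meshgrid(np.arange(x0, x1, step), np.arange(y0, y1, step))
    trajs = [[(float(x), float(y))] for x, y in zip(xs.ravel(), ys.ravel())]
    active = list(range(len(trajs)))

    for flow in flows:
        still_active = []
        for i in active:
            x, y = trajs[i][-1]
            dx, dy = flow[int(round(y)), int(round(x))]
            # Non-fixed-length rule: stop if the step is implausibly
            # large (a likely tracking failure) ...
            if np.hypot(dx, dy) > max_disp:
                continue
            nx, ny = x + dx, y + dy
            # ... or if the point leaves the player region.
            if not (x0 <= nx < x1 and y0 <= ny < y1):
                continue
            trajs[i].append((nx, ny))
            still_active.append(i)
        active = still_active
    return trajs
```

With a uniform rightward flow, for example, points near the right edge of the ROI terminate after a few frames while interior points survive the whole sequence, which is exactly the variable-length behavior the paper exploits.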
References
1. Yang T, Li D, Bai Y, et al. Multiple-Object-Tracking Algorithm Based on Dense Trajectory Voting in Aerial Videos. Remote Sensing, 2019, 11 (19): 2278.
2. Zhao H, Dang J, Wang S, et al. Dense Trajectory Action Recognition Algorithm Based on Improved SURF. IOP Conference Series: Earth and Environmental Science, 2019, 252: 032179.
3. Nguyen T T, Nguyen T P, Bouchara F. Directional Dense-Trajectory-based Patterns for Dynamic Texture Recognition. IET Computer Vision, 2020, 14 (4).
4. Chandra R, Bhattacharya U, Bera A, et al. TraPHic: Trajectory Prediction in Dense and Heterogeneous Traffic Using Weighted Interactions. In 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, 2019, 8475-8484.
5. Gu J, Sun C, Zhao H. DenseTNT: End-to-end Trajectory Prediction from Dense Goal Sets. In 2021 IEEE/CVF International Conference on Computer Vision (ICCV), 2021, 15283-15292.
6. Li F, Xiong J, Lan X, et al. Hybrid Vehicle Trajectory Prediction Algorithm Based on Hough Transform. Chinese Journal of Electronics, 2021, 30(5): 918-930.
7. Wang H, Zhang W, Wu X, et al. A double-layered non-linear model predictive control based control algorithm for local trajectory planning for automated trucks under integrated road adhesion coefficient conditions. Frontiers of Information Technology & Electronic Engineering, 2020, 21: 1059-1073.
8. Zhuang Y, Zhang J, Xiao J, et al. 3-dimensional dynamic facial expression modeling method based on video streaming. Chinese invention patent, CN200610053393.8.
9. Liang Y. Research on target handover under multi-camera environment. Hebei University of Technology, 2010, 20-30.
10. Wang L, Tan T N, Ning H. Fusion of static and dynamic body biometrics for gait recognition. IEEE Transactions on Circuits and Systems for Video Technology, 2004, 14 (2): 149-158.
11. Liu H, Wang Y, Tan T N. Multi-modal data fusion for person authentication using importenn. Chinese Journal of Automation, 2004, 30 (1): 78-85.
12. Pan S. A Method of Key Posture Detection and Motion Recognition in Sports Based on Deep Learning. Mobile Information Systems, 2022.
13. Wang Y, Zhang P, Wang Y. Detection Method of Limb Movement in Competitive Sports Training Based on Deep Learning. J. Comput. Methods Sci. Eng., 2023, 23: 1667-1678.
14. Zhen J I, Tian Y. IoT Based Dance Movement Recognition Model Based on Deep Learning Framework. Scalable Computing: Practice & Experience, 2024, 25(2).
15. Suzuki K, Ito H, Yamada T, et al. Deep Predictive Learning: Motion Learning Concept Inspired by Cognitive Robotics. 2023.
16. Akber S M A, Kazmi S N, Mohsin M, et al. Deep Learning-Based Motion Style Transfer Tools, Techniques and Future Challenges. Sensors, 2023, 23(5).
17. Gu B, Sidhu S, Weinreb R N, et al. Review of Visualization Approaches in Deep Learning Models of Glaucoma. Asia-Pacific Journal of Ophthalmology, 2023, 12(4): 392-401.
18. Duan X. Abnormal Behavior Recognition for Human Motion Based on Improved Deep Reinforcement Learning. International Journal of Image & Graphics, 2024, 24(1).
19. Ting J, Ting H Y, Tan D, et al. Kinect-Based Badminton Motion Analysis Using Intelligent Adaptive Range of Movement Index. IOP Conference Series: Materials Science and Engineering, 2019, 491(1): 012017.
20. Zhao X, Gu Y. Single leg landing movement differences between male and female badminton players after overhead stroke in the backhand-side court. Human Movement Science, 2019, 142-148.
21. Zhang X, Duan H, Zhang M, et al. Wrist MEMS Sensor for Movements Recognition in Ball Games. In 2019 IEEE 9th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER). Suzhou. 2019. 1663-1667.
22. Lei Y, Xi Y E, et al. Real Time Recognition of Badminton Action Based on Hidden Markov Model. Computer & Digital Engineering, 2019.
23. Steels T, Herbruggen B, Fontaine J, et al. Badminton Activity Recognition Using Accelerometer Data. Sensors, 2020, 20 (17): 4685.
Copyright (c) 2025 Youfeng Yang, Yin Zhang, Chang Li
This work is licensed under a Creative Commons Attribution 4.0 International License.
Copyright on all articles published in this journal is retained by the author(s), who grant the publisher the right of first publication.
Articles published in this journal are licensed under a Creative Commons Attribution 4.0 International License, which means they may be shared, adapted, and distributed provided the original published version is cited.