Characteristic extraction of Tai Chi movement data—Based on self-powered wearable sensors

  • Ruijie Zhang Graduate School of Jeonju University, Jeonju 55069, Korea; Department of Physical Education, Tangshan Normal University, Tangshan 063000, China
  • Chunlei Xue Department of Physical Education, Tangshan Normal University, Tangshan 063000, China
  • Zijie Sun Department of Physical Education, Tangshan Normal University, Tangshan 063000, China
  • Kim Junhee Graduate School of Jeonbuk National University, Jeonju 54896, Korea
  • Yunna Liu Sports & Health College, Shanghai Lixin University of Accounting and Finance, Shanghai 201620, China
Keywords: Taijiquan; sensor; CM-WOA; feature extraction; recognition rate
Article ID: 848

Abstract

Although vision-based recognition achieves good accuracy, it carries a serious risk of privacy leakage; and although signal-based recognition has the advantages of being device-free and privacy-preserving, it is sensitive to environmental noise and unsuitable for crowded environments. Sensor-based human behavior recognition is therefore a more practical choice. This paper proposes a multi-level decision behavior recognition method based on the fusion of self-powered wearable sensors. In behavior recognition based on wearable sensors, different deployment schemes of self-powered sensors lead to different recognition accuracies, yet the traditional empirical deployment scheme cannot guarantee the best sensor layout. To further improve recognition accuracy, this paper proposes a CM-WOA-based automatic dynamic sensor deployment optimization method for the feature extraction of Tai Chi movement data, which seeks a balance between recognition accuracy and sensor deployment cost: deploying as few sensors as possible while maximizing recognition accuracy. Finally, the proposed scheme is compared with seven alternative schemes. The feature extraction and recognition rate of Taijiquan movement data based on self-powered wearable sensors reaches 94%, demonstrating that the proposed CM-WOA-based multi-sensor deployment optimization method is effective in improving the overall recognition rate of the recognition model.
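The deployment optimization the abstract describes, searching over candidate sensor placements for the smallest set that maximizes recognition accuracy, can be sketched as a whale-optimization loop over binary deployment vectors. The sketch below is illustrative only: the logistic chaotic map used for initialization, the sigmoid transfer function, the cost weight, and the toy accuracy model are all assumptions for the sake of a runnable example, not the paper's actual CM-WOA formulation or its recognition model.

```python
import math
import random

def chaotic_logistic_sequence(n, x0=0.7, r=4.0):
    """Logistic map, a chaotic initializer commonly used in CM-WOA
    variants (the paper's exact chaotic map is not specified here)."""
    seq, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        seq.append(x)
    return seq

def fitness(deployment, accuracy_fn, cost_weight=0.05):
    """Trade accuracy against deployment cost: reward recognition
    accuracy, penalize every sensor that is placed."""
    return accuracy_fn(deployment) - cost_weight * sum(deployment)

def cm_woa_deploy(accuracy_fn, n_sites, pop_size=20, iters=100):
    """Sketch of chaotic-map whale optimization over binary
    deployment vectors (1 = sensor placed at that body site)."""
    # Chaotic initialization of the whale population.
    chaos = chaotic_logistic_sequence(pop_size * n_sites)
    pop = [[1 if chaos[i * n_sites + j] > 0.5 else 0
            for j in range(n_sites)] for i in range(pop_size)]
    best = max(pop, key=lambda d: fitness(d, accuracy_fn))
    for t in range(iters):
        a = 2.0 * (1 - t / iters)  # linearly decreasing WOA coefficient
        for i, whale in enumerate(pop):
            A = a * (2 * random.random() - 1)
            new = []
            for j in range(n_sites):
                if random.random() < 0.5:  # encircling / exploring move
                    step = A * (best[j] - whale[j])
                    # Sigmoid transfer maps the continuous move to a bit flip.
                    prob = 1 / (1 + math.exp(-(whale[j] + step)))
                    new.append(1 if random.random() < prob else 0)
                else:  # exploit: copy the best whale's bit
                    new.append(best[j])
            if fitness(new, accuracy_fn) > fitness(whale, accuracy_fn):
                pop[i] = whale = new
            if fitness(whale, accuracy_fn) > fitness(best, accuracy_fn):
                best = list(whale)
    return best

# Toy accuracy model with diminishing returns per added sensor
# (purely illustrative; real accuracy comes from the classifier).
toy_accuracy = lambda d: 1 - 0.5 ** (1 + sum(d))
best = cm_woa_deploy(toy_accuracy, n_sites=8)
print(best, round(fitness(best, toy_accuracy), 3))
```

With the toy accuracy model, the penalty makes each additional sensor worthwhile only while its marginal accuracy gain exceeds the cost weight, so the search settles on a small subset of the eight candidate sites rather than deploying all of them, which is the stated goal of the method.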

References

1. Kuo C T, Chen C Y, Chang Y T, et al. CIC signal processing embedded system: a modularizable platform for multi-domain signal processing. In 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, 2012, 2849-2852.

2. Chen Y, Shen C. Performance Analysis of Smartphone-Sensor Behavior for Human Activity Recognition. IEEE Access, 2017, 5: 3095-3110.

3. Imai T, Kazuki M. Extensible Activity Recognition System for Behavior Support. In 2017 31st International Conference on Advanced Information Networking and Applications Workshops (WAINA). Taipei, 2017, 606-611.

4. Li W, Zhang R. A Hybrid Forecasting Model for Wind Energy Based on the Complementary Ensemble Empirical Mode Decomposition and Whale Optimized Back Propagation Neural Network. In IEEE 4th Conference on Energy Internet and Energy System Integration (EI2), Wuhan, 2020, 1084-1089.

5. Dai C, Liu X, Lai J, et al. Human Behavior Deep Recognition Architecture for Smart City Applications in the 5G Environment. IEEE Network, 2019, 33 (5): 206-211.

6. Vrigkas M, Nikou C, Kakadiaris I. Identifying Human Behaviors Using Synchronized Audio-Visual Cues. IEEE Transactions on Affective Computing, 2017, 8 (1): 54-56.

7. Sepas-Moghaddam A, Etemad A. View-Invariant Gait Recognition with Attentive Recurrent Learning of Partial Representations. IEEE Transactions on Biometrics Behavior and Identity Science, 2020, 3 (1): 124-137.

8. Liu H L, Taniguchi T, Tanaka Y, et al. Visualization of Driving Behavior Based on Hidden Feature Extraction by Using Deep Learning. IEEE Transactions on Intelligent Transportation Systems, 2017, 18 (9): 2477-2489.

9. Wang Q, Jiao W, Wang P, et al. Digital Twin for Human-Robot Interactive Welding and Welder Behavior Analysis. IEEE/CAA Journal of Automatica Sinica, 2020, 8 (2): 334-343.

10. Zhang L, Liang R, Yin J, et al. Scene Categorization by Deeply Learning Gaze Behavior in a Semi-Supervised Context. IEEE Transactions on Cybernetics, 2021, 51 (8): 4265-4276.

11. Jiang Z, Crookes D, Green B D, et al. Context-Aware Mouse Behavior Recognition Using Hidden Markov Models. IEEE Transactions on Image Processing, 2019, 28 (3): 1133-1148.

12. Sun G, Shi C, Liu J, et al. Behavior Recognition and Maternal Ability Evaluation for Sows Based on Triaxial Acceleration and Video Sensors. IEEE Access, 2021, 9: 65346-65360.

13. Martin M, Roitberg A, Haurilet M, et al. Drive & Act: A Multi-Modal Dataset for Fine-Grained Driver Behavior Recognition in Autonomous Vehicles. In 2019 IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, 2019, 2801-2810.

14. Hao F F, Liu J, Chen X D. A Review of Human Behavior Recognition Based on Deep Learning. In 2020 International Conference on Artificial Intelligence and Education (ICAIE), Tianjin, 2020, 19-23.

15. Tang Z, Zhu A, Wang Z, et al. Human Behavior Recognition Based on WiFi Channel State Information. In 2020 Chinese Automation Congress (CAC), Shanghai, 2020, 1157-1162.

16. Ayed M B, Elkosantini S, Alshaya S A, et al. Suspicious Behavior Recognition Based on Face Features. IEEE Access, 2019, 7: 149952-149958.

17. Lin Y Y. Characterization of Taijiquan movement posture based on MEMS: taking the human waist and head as examples. Thesis, Fujian Normal University, Fujian, 2021.

18. Wang H Y, Yang S L, Wu J H. Recognition method of taijiquan based on fusion information, terminal device and storage medium, CN202111301208.3, 2024.

19. Wang Y, Jing J. Segmentation and recognition of taijiquan trajectory based on multi-sensor data fusion, Control Science and Engineering, 2024.

20. Ye S, Liang Y, Xie Y, et al. A method and system for taijiquan movement correction based on generative adversarial network, CN202111201371.2, 2024.

21. Yin Y, Sun N, Ren G, et al. Kinect-based taijiquan movement determination and guidance system and its guidance method, CN201610374146.1, 2024.

22. Chi C, Ren L. Taiji fixed-step push hand movement recognition system, CN201520291274.0, 2024.

23. Xue Z, Zhang L, Cheng Z, et al. Kinect-based in situ taijiquan auxiliary training system. Journal of Hebei University of Science and Technology, 2017, 038 (002): 183-189.

24. Xu Z. Assisted teaching and evaluation method of taijiquan based on whole-body motion capture. Thesis, Zhengzhou University, Zhengzhou, 2024.

25. Ren H C, Duan H F, Li Q M, et al. A wearable Taiji exercise gait evaluation and training system based on cloud platform, CN201910415945, 2024.

26. Shuaibu A N, Malik A S, Faye I. Adaptive feature learning CNN for behavior recognition in crowd scene. In 2017 IEEE International Conference on Signal and Image Processing Applications (ICSIPA), Kuching, 2017, 357-361.

27. Wen C, Yuan H, Gao Y, et al. The Abnormal Behavior Recognition Based on the Smart Mobile Sensors. In 2016 9th International Symposium on Computational Intelligence and Design (ISCID), Hangzhou, 2016, 390-393.

28. Zhan H, Liu Y, Cui Z, et al. Pedestrian Detection and Behavior Recognition Based on Vision. In 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, 2019, 771-776.

29. Shi F, Chen Z, Cheng X. Behavior Modeling and Individual Recognition of Sonar Transmitter for Secure Communication in UASNs. IEEE Access, 2020, 8: 2447-2454.

30. Zhou Z, Duan G, Lei H, et al. Human behavior recognition method based on double-branch deep convolution neural network. In 2018 Chinese Control and Decision Conference (CCDC), Shenyang, 2018, 5520-5524.

31. Nan Y, Shen Y, Jin W, et al. Four-channel behavior recognition algorithm based on DRN. In 2018 13th IEEE Conference on Industrial Electronics and Applications (ICIEA), Wuhan, 2018, 1217-1221.

32. Wang C, Wang Z, Yu Y, et al. Rapid Recognition of Human Behavior Based on Micro-Doppler Feature. In International Conference on Control, Automation and Information Sciences (ICCAIS), Chengdu, 2019, 1-5.

33. Nassuna H, Eyobu O S, Kim J H, et al. Feature Selection Based on Variance Distribution of Power Spectral Density for Driving Behavior Recognition. In 2019 14th IEEE Conference on Industrial Electronics and Applications (ICIEA), Xi’an, 2019, 335-338.

34. Ma Y, Zhang Z, Chen S, et al. A Comparative Study of Aggressive Driving Behavior Recognition Algorithms Based on Vehicle Motion Data. IEEE Access, 2019, 7: 8028-8038.

35. Zhou S, Xu L. Mouse Behavior Recognition Based on Convolution Neural Network. In 2018 IEEE 8th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER), Tianjin, 2018, 635-639.

36. Sultana M, Polash P, Gavrilova M. Authority recognition of tweets: A comparison between social behavior and linguistic profiles. In IEEE International Conference on Systems, Man and Cybernetics (SMC), Banff, 2017, 471-476.

37. Bo L, Bouachir W, Gouiaa R, et al. Real-time recognition of suicidal behavior using an RGB-D camera. In 2017 Seventh International Conference on Image Processing Theory, Tools and Applications (IPTA), Montreal, 2017, 1-6.

38. Brattoli B, Büchler U, Wahl A S. LSTM Self-Supervision for Detailed Behavior Analysis. In 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Honolulu, 2017, 3747-3756.

39. An J, Cheng Y, He X, et al. Multiuser Behavior Recognition Module Based on DC-DMN. IEEE Sensors Journal, 2022, 22 (3): 2802-2813.

Published
2024-12-09
How to Cite
Zhang, R., Xue, C., Sun, Z., Junhee, K., & Liu, Y. (2024). Characteristic extraction of Tai Chi movement data—Based on self-powered wearable sensors. Molecular & Cellular Biomechanics, 21(4), 848. https://doi.org/10.62617/mcb848
Section
Article