Pencak Silat Movement Classification Using CNN Based On Body Pose
DOI: https://doi.org/10.12962/jaree.v7i2.369

Abstract
Pencak silat, besides being useful for self-defense, offers many other benefits, such as increasing physical strength, maintaining posture, and supporting heart health. Due to the recent pandemic, practicing pencak silat together has become difficult. Even when pencak silat is part of the school curriculum, it is hard for sports teachers to teach the movements in person, and practicing alone without a coach can cause injury if the movements are performed incorrectly. Therefore, this study builds a system to recognize pencak silat movements. The system uses a CNN operating on body-pose features: a pose-estimation model detects human body keypoints, and these keypoints are then fed as features to the CNN, which classifies the movement in each frame. A CNN was chosen because it requires fewer parameters and less computing power, making it easier to build on in further studies. The accuracy reaches 77% when tested on data that was never used during training. This model can serve as a starting point for an easy-to-use system that helps people practice pencak silat with correctly recognized moves.
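The pipeline the abstract describes, pose keypoints converted into a per-frame feature vector for a CNN, can be sketched as follows. This is a minimal illustration, not the paper's exact preprocessing: the landmark count (33, as in MediaPipe Pose) and the mid-hip normalization are assumptions made for the sketch.

```python
# Sketch: turn per-frame pose keypoints into a CNN-ready feature vector.
# Assumes 33 (x, y) landmarks per frame, as produced by MediaPipe Pose;
# the hip-centred normalization here is illustrative, not the paper's
# documented scheme.

NUM_LANDMARKS = 33

def frame_features(landmarks):
    """Normalize keypoints relative to the mid-hip and flatten to a vector.

    landmarks: list of 33 (x, y) tuples in image coordinates.
    Returns a flat list of 66 floats, invariant to the person's
    position in the frame.
    """
    if len(landmarks) != NUM_LANDMARKS:
        raise ValueError(f"expected {NUM_LANDMARKS} landmarks, got {len(landmarks)}")
    # MediaPipe Pose indices 23 and 24 are the left and right hips.
    hip_x = (landmarks[23][0] + landmarks[24][0]) / 2
    hip_y = (landmarks[23][1] + landmarks[24][1]) / 2
    feats = []
    for x, y in landmarks:
        feats.extend([x - hip_x, y - hip_y])
    return feats

# Example: a dummy frame of landmarks along a diagonal line.
dummy = [(i * 0.01, i * 0.02) for i in range(NUM_LANDMARKS)]
vec = frame_features(dummy)
print(len(vec))  # 66 features per frame, the CNN's input size
```

A classifier would consume one such vector per frame (or a stack of them over a short window) and output a probability per pencak silat movement class.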
License
Copyright
Submission of a manuscript implies that the submitted work has not been published before (except as part of a thesis, report, or abstract); that it is not under consideration for publication elsewhere; and that its publication has been approved by all co-authors. If and when the manuscript is accepted for publication, the authors retain the copyright and full publishing rights without restrictions. Authors or others may reproduce the article for non-commercial purposes. For new inventions, authors are advised to secure a patent before publication. The license type is CC-BY-NC 4.0.
Disclaimer
No responsibility is assumed by the publisher, co-publishers, or editors for any injury and/or damage to persons or property arising from any actual or alleged libelous statements, infringement of intellectual property or privacy rights, or products liability, whether resulting from negligence or otherwise, or from any use or operation of any ideas, instructions, procedures, products, or methods contained in the material herein.


