Document Type

Conference Paper



Publication Details

Sunney, J., Jilani, M. & Pathak, P. (2023). A Real-time Machine Learning Framework for Smart Home-based Yoga Teaching System. The 7th International Conference on Machine Vision and Information Technology (CMVIT 2023), Xiamen, China | Mar. 24-26, 2023



Practicing yoga in a home-based environment has increased due to COVID-19. Performing yoga poses without a trainer can be challenging, and incorrect poses can cause muscle damage. Smart home-based yoga teaching systems may aid in performing accurate yoga poses; however, the challenge with such systems is the computational time required to detect poses. This research proposes a real-time machine learning framework for teaching accurate yoga poses. It combines a pose estimation model, a pose classification model, and a real-time feedback mechanism. The dataset consists of five popular yoga poses, namely the downdog pose, the tree pose, the goddess pose, the plank pose, and the warrior pose. The BlazePose model was used for yoga pose estimation, transforming the image data into 3D landmark points. The output of the pose estimation model was then passed to the pose classification model for yoga pose detection. Four machine learning classifiers, namely Random Forest, Support Vector Machine, XGBoost, and Decision Tree, and two neural network classifiers, LSTM and CNN, were evaluated on accuracy, latency, and model size. Results demonstrate that XGBoost outperforms the other models, with an accuracy of 95.14%, a latency of 8 ms, and a size of 513 KB. The output of the XGBoost classifier was then used to correct yoga poses by displaying real-time feedback to the user. This novel framework has the potential to be integrated into mobile applications for the unsupervised practice of yoga at home.
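The pipeline described in the abstract (BlazePose landmarks flattened into a feature vector, then fed to a classifier) can be sketched as follows. This is a minimal, dependency-free illustration, not the paper's implementation: the landmark data is fabricated, and a nearest-centroid classifier stands in for the trained XGBoost model.

```python
import math
import random

# BlazePose emits 33 3D landmarks per frame; the classifiers described in
# the paper consume these as a flat feature vector (33 * 3 = 99 values).
NUM_LANDMARKS = 33

def landmarks_to_features(landmarks):
    """Flatten a list of (x, y, z) landmark tuples into a 99-dim vector."""
    assert len(landmarks) == NUM_LANDMARKS
    return [coord for point in landmarks for coord in point]

class NearestCentroidPoseClassifier:
    """Stand-in for the XGBoost classifier: predicts the pose whose mean
    training vector (centroid) is closest to the input features."""

    def __init__(self):
        self.centroids = {}  # pose name -> mean feature vector

    def fit(self, features_by_pose):
        for pose, vectors in features_by_pose.items():
            n = len(vectors)
            self.centroids[pose] = [sum(col) / n for col in zip(*vectors)]

    def predict(self, features):
        return min(self.centroids,
                   key=lambda p: math.dist(features, self.centroids[p]))

# Hypothetical stand-in for real BlazePose output: landmarks clustered
# around a per-pose offset, so the two classes are separable.
random.seed(0)

def fake_pose(offset):
    return [(offset + random.random() * 0.01,) * 3
            for _ in range(NUM_LANDMARKS)]

train = {
    "downdog": [landmarks_to_features(fake_pose(0.1)) for _ in range(5)],
    "tree": [landmarks_to_features(fake_pose(0.9)) for _ in range(5)],
}
clf = NearestCentroidPoseClassifier()
clf.fit(train)
print(clf.predict(landmarks_to_features(fake_pose(0.1))))  # prints "downdog"
```

In the real framework the feature vectors would come from MediaPipe's BlazePose output and the classifier would be the trained XGBoost model; the predicted label would then drive the real-time corrective feedback shown to the user.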



This research received no external funding.

Creative Commons License

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.