Visualization of Interest Level using OpenPose with Class Videos
DOI: https://doi.org/10.58190/imiens.2023.47
Keywords: interest level, visualization, OpenPose, classroom video, face direction
Abstract
From the teacher's side, improving the classroom means establishing learning discipline and methods of improvement; there are many ways to do this, and it is an important part of a teacher's work. In this paper, we propose a system that estimates the level of interest from class videos as a means of knowing the learning status of individual students. Using OpenPose, the system detects each person in a class video and extracts feature data of his or her joints. Assuming that a person's face is directed toward the target of attention when he or she is concentrating, we measure the concentration level based on posture, face direction, and other movements.
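The abstract's face-direction idea can be sketched as follows. This is not the authors' code: it is a minimal illustration, assuming OpenPose COCO-18 keypoint output of shape (18, 3) per person, and a hypothetical scoring heuristic (nose centred between the ears means the face points toward the camera/board).

```python
import numpy as np

# Assumed COCO-18 keypoint indices (nose and ears).
NOSE, R_EAR, L_EAR = 0, 16, 17

def face_direction_score(kpts):
    """kpts: (18, 3) array of (x, y, confidence) for one detected person.

    Returns a value in [0, 1]: ~1 when the nose lies midway between the
    ears (face roughly frontal), ~0 when it is far to one side.
    This scoring rule is an illustrative assumption, not the paper's method.
    """
    nose, r_ear, l_ear = kpts[NOSE], kpts[R_EAR], kpts[L_EAR]
    if min(nose[2], r_ear[2], l_ear[2]) < 0.1:  # a keypoint was not detected
        return 0.0
    mid_x = (r_ear[0] + l_ear[0]) / 2.0
    half_width = abs(l_ear[0] - r_ear[0]) / 2.0
    if half_width == 0:
        return 0.0
    offset = abs(nose[0] - mid_x) / half_width  # 0 = centred, 1 = at an ear
    return float(max(0.0, 1.0 - offset))

# Example: a frontal face, nose centred between the ears.
frontal = np.zeros((18, 3))
frontal[NOSE] = [100.0, 50.0, 0.9]
frontal[R_EAR] = [80.0, 55.0, 0.9]
frontal[L_EAR] = [120.0, 55.0, 0.9]
print(face_direction_score(frontal))  # 1.0
```

Averaging such per-frame scores over a time window would give one simple per-student concentration signal of the kind the abstract describes.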
References
Yukimasa Nishimura and Yoshito Tobe, "Suggestion of Simple Wireless Sensors to Measure the Concentration of Teaching," The 16th National Convention of IEICE, 2011.
Tatsuma Muramatsu and Akihiko Sugiura, "Effect Verification of a Concentration Measurement System that Uses Face Information," The 74th National Convention of IPSJ, 2012.
Z. Cao, G. Hidalgo, T. Simon, S.-E. Wei, and Y. Sheikh, "OpenPose: Realtime Multi-Person 2D Pose Estimation Using Part Affinity Fields," arXiv preprint arXiv:1812.08008, 2018.
Zhe Cao, Tomas Simon, Shih-En Wei, and Yaser Sheikh, "Realtime Multi-Person 2D Pose Estimation Using Part Affinity Fields," in CVPR, 2017.
Yunkai Zhang, Yinghong Tian, Pingyi Wu, and Dongfan Chen, "Application of Skeleton Data and Long Short-Term Memory in Action Recognition of Children with Autism Spectrum Disorder," Sensors, vol. 21, no. 2, 411, 2021.
Google, "Google Colaboratory," https://colab.google, accessed 28 September 2023.
License
Copyright (c) 2023 Intelligent Methods In Engineering Sciences
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.