Visualization of Interest Level using OpenPose with Class Videos

  • Dongshik Kang, University of the Ryukyus, Faculty of Engineering, Japan https://orcid.org/0009-0006-5465-0780
  • Yoshiaki Sasazawa, University of the Ryukyus, Faculty of Education, Japan
  • Minoru Kobayashi, Bunkyo University, Faculty of Education, Japan

Keywords

interest level, visualization, OpenPose, classroom video, face direction

Abstract

Improving a class from the teacher's side requires establishing a learning discipline and a method of improvement. There are many ways to do this, and it is an important part of a teacher's work. In this paper, we propose a system that estimates the level of interest from class videos as a means of understanding the learning status of individual students. Using OpenPose, the system detects each person in a class video and extracts feature data of his or her joints. Then, assuming that a person's face turns toward the target of attention when he or she is concentrating, the system measures the concentration level from posture, face direction, and other movements.
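The face-direction cue in the abstract can be illustrated with a minimal sketch. It assumes OpenPose-style keypoints given as (x, y, confidence) triples for the nose and both ears; the function names, thresholds, and the normalized-offset heuristic below are illustrative assumptions of this sketch, not details taken from the paper.

```python
# Minimal sketch: classify horizontal face direction from 2D keypoints
# and aggregate a per-student "interest" ratio over video frames.
# Keypoints are assumed to be OpenPose-style (x, y, confidence) triples;
# all thresholds are illustrative, not values from the paper.

def face_direction(nose, ear_l, ear_r, conf_th=0.1, offset_th=0.25):
    """Return 'front', 'turned', or 'unknown' for one frame.

    Uses the nose's horizontal offset from the midpoint between the two
    ears, normalized by the ear-to-ear distance: a nose near the midpoint
    suggests the face points toward the camera/target.
    """
    nx, _, nc = nose
    lx, _, lc = ear_l
    rx, _, rc = ear_r
    if min(nc, lc, rc) < conf_th:
        return "unknown"  # at least one keypoint was not reliably detected
    mid = (lx + rx) / 2.0
    width = max(abs(lx - rx), 1e-6)  # avoid division by zero
    offset = abs(nx - mid) / width
    return "front" if offset <= offset_th else "turned"


def interest_ratio(frames, target="front"):
    """Fraction of usable frames in which the face points at the target.

    `frames` is a list of (nose, ear_l, ear_r) keypoint tuples; equating
    "facing the target" with interest mirrors the abstract's assumption.
    """
    labels = [face_direction(*f) for f in frames]
    known = [label for label in labels if label != "unknown"]
    return sum(label == target for label in known) / len(known) if known else 0.0
```

For example, with ears at x = 80 and x = 120, a nose at x = 100 (the midpoint) is classified as `front`, while a nose at x = 115 (offset 0.375 of the ear distance) is classified as `turned`; averaging such labels over a video yields a simple per-student interest score.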


References

  • Yukimasa Nishimura and Yoshito Tobe, “Suggestion of Simple Wireless Sensors to Measure the Concentration of Teaching,” The 16th National Convention of IEICE, 2011.
  • Tatsuma Muramatsu and Akihiko Sugiura, “Effect Verification of Concentration Measurement System that Uses Face Informations,” The 74th National Convention of IPSJ, 2012.
  • Z. Cao, G. Hidalgo, T. Simon, S.-E. Wei, and Y. Sheikh, “Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields,” arXiv preprint arXiv:1812.08008, 2018.
  • Zhe Cao, Tomas Simon, Shih-En Wei, and Yaser Sheikh, “Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields,” in CVPR, 2017.
  • Yunkai Zhang, Yinghong Tian, Pingyi Wu, and Dongfan Chen, “Application of Skeleton Data and Long Short-Term Memory in Action Recognition of Children with Autism Spectrum Disorder,” Sensors 2021, 21, 411.
  • Google, “Google Colaboratory,” https://colab.google, accessed 28 September 2023.
Published: 2023-09-30

Issue: Vol. 2 No. 3 (2023)

    Section: Research Articles

    How to cite:
    [1]
    D. Kang, Y. Sasazawa, and M. Kobayashi, “Visualization of Interest Level using OpenPose with Class Videos”, Intell Methods Eng Sci, vol. 2, no. 3, pp. 84–89, Sep. 2023.

    All papers should be submitted electronically. All submitted manuscripts must be original work that is not under submission at another journal or under consideration for publication in another form, such as a monograph or chapter of a book. Authors of submitted papers are obligated not to submit their paper for publication elsewhere until an editorial decision is rendered on their submission. Further, authors of accepted papers are prohibited from publishing the results in other publications that appear before the paper is published in the Journal unless they receive approval for doing so from the Editor-In-Chief.

    IMIENS open access articles are licensed under a Creative Commons Attribution-ShareAlike 4.0 International License. This license lets readers share the material provided they give appropriate credit, provide a link to the license, and indicate if changes were made; if they remix, transform, or build upon the material, they must distribute their contributions under the same license as the original.