|Title:||Automatic detection of students' affective states in classroom environment using hybrid convolutional neural networks|
|Authors:||Ashwin, T. S.|
Ram Mohana Reddy, Guddeti
|Citation:||Education and Information Technologies, 2019, Vol., , pp.-|
|Abstract:||Predicting students' emotional and behavioral engagement using computer vision techniques is a challenging task. Though there are several state-of-the-art techniques for analyzing a student's affective states in an e-learning environment (a single person's engagement detected in a single image frame), very few works are available for analyzing students' affective states in a classroom environment (multiple people in a single image frame). Hence, in this paper, we propose a novel hybrid convolutional neural network (CNN) architecture for analyzing students' affective states in a classroom environment. The proposed architecture consists of two models: the first model (CNN-1) is designed to analyze the affective states of a single student in a single image frame, and the second model (CNN-2) handles multiple students in a single image frame. Thus, our proposed hybrid architecture predicts the overall affective state of the entire class. The proposed architecture uses students' facial expressions, hand gestures and body postures to analyze their affective states. Further, due to the unavailability of standard datasets for students' affective state analysis, we created, annotated and tested on our own dataset of over 8000 single-face and 12000 multiple-face image frames, labeled with three different affective states, namely engaged, boredom and neutral. The experimental results demonstrate accuracies of 86% and 70% for posed and spontaneous affective states of classroom data, respectively. © 2019, Springer Science+Business Media, LLC, part of Springer Nature.|
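The abstract states that per-student predictions are combined into an overall affective state for the entire class, but does not specify the fusion rule. The sketch below illustrates one simple, hypothetical way such an aggregation step could work (majority vote over per-student labels); the function name, the default label for an empty frame, and the voting rule are all assumptions for illustration, not the paper's actual method.

```python
from collections import Counter

# Affective-state labels used in the paper's dataset annotation.
AFFECTIVE_STATES = ("engaged", "boredom", "neutral")

def overall_class_state(per_student_states):
    """Aggregate per-student affective-state predictions (e.g. one label
    per detected student in a classroom frame) into a single label for
    the whole class, by majority vote.

    Note: majority voting is an illustrative assumption; the paper does
    not describe its exact fusion rule.
    """
    if not per_student_states:
        return "neutral"  # assumed default when no students are detected
    counts = Counter(per_student_states)
    # most_common(1) returns [(label, count)] for the most frequent label.
    return counts.most_common(1)[0][0]

# Example: a frame with five detected students.
print(overall_class_state(["engaged", "engaged", "boredom", "neutral", "engaged"]))
```

Running the example prints `engaged`, since three of the five students carry that label.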
|Appears in Collections:||1. Journal Articles|