Faculty Publications
Permanent URI for this community: https://idr.nitk.ac.in/handle/123456789/18736
Publications by NITK Faculty
4 results
Item
Unobtrusive Behavioral Analysis of Students in Classroom Environment Using Non-Verbal Cues (Institute of Electrical and Electronics Engineers Inc., 2019) Ashwin, T.S.; Guddeti, G.R.
Pervasive intelligent learning environments can be made more personalized by adapting teaching strategies to the students' emotional and behavioral engagement. Student engagement analysis helps to foster the emotions and behavioral patterns that are beneficial to learning, thus improving the effectiveness of the teaching-learning process. Unobtrusive student engagement analysis is performed using the students' non-verbal cues such as facial expressions, hand gestures, and body postures. Though several techniques exist for classifying the engagement of a single student in a single image frame, there is limited work on students' engagement analysis in a classroom environment. In this paper, we propose a convolutional neural network architecture for unobtrusive student engagement analysis using non-verbal cues. The proposed architecture is trained and tested on in-the-wild faces, hand gestures and body postures of more than 350 students in a classroom environment, with each test image containing multiple students in a single frame. The data annotation was performed using a gold-standard study, and the annotators reliably agree with Cohen's κ = 0.43. We obtained 71% accuracy for students' engagement-level classification. Further, a pre-test/post-test analysis showed a positive correlation between students' engagement and their test performance. © 2013 IEEE.

Item
Affective database for e-learning and classroom environments using Indian students' faces, hand gestures and body postures (Elsevier B.V., 2020) Ashwin, T.S.; Guddeti, R.M.R.
Automatic recognition of the students' affective states is a challenging task.
These affective states are recognized from their facial expressions, hand gestures, and body postures. An intelligent tutoring system or smart classroom environment can be made more personalized using students' affective-state analysis, which is performed using machine or deep learning techniques. Effective recognition of affective states depends mainly on the quality of the database used, but very few standard databases exist for students' affective-state recognition and analysis that work for both e-learning and classroom environments. In this paper, we propose a new affective database for both e-learning and classroom environments using students' facial expressions, hand gestures, and body postures. The database contains both posed (acted) and spontaneous (natural) expressions, with single and multiple persons in a single image frame, and more than 4000 manually annotated image frames with object localization. Classification was done manually using a gold-standard study for both Ekman's basic emotions and learning-centered emotions, including neutral. The annotators reliably agree when discriminating among the recognized affective states, with Cohen's κ = 0.48. The created database is more robust as it covers various image variants such as occlusion, background clutter, pose, illumination, cultural & regional background, intra-class variations, cropped images, multiple viewpoints, and deformations. Further, we analyzed the classification accuracy of our database using a few state-of-the-art machine and deep learning techniques. Experimental results demonstrate that the convolutional neural network based architecture achieved accuracies of 83% and 76% for detection and classification, respectively. © 2020 Elsevier B.V.

Item
Impact of inquiry interventions on students in e-learning and classroom environments using affective computing framework (Springer Science and Business Media B.V., 2020) Ashwin, T.S.; Guddeti, R.M.R.
Effective teaching strategies improve the students' learning rate within academic learning time. Inquiry-based instruction is one of the effective teaching strategies used in classrooms, but such strategies are not adopted in other learning environments such as intelligent tutoring systems, including auto-tutors. In this paper, we propose an automatic inquiry-based instruction teaching strategy, i.e., inquiry intervention using students' affective states. The proposed model contains two modules: the first is a framework for predicting unobtrusive multi-modal students' affective states (teacher-centric attentive and inattentive states) using facial expressions, hand gestures and body postures; the second is an automated inquiry-based instruction teaching strategy that compares learning outcomes with and without inquiry intervention using affective-state transitions for both an individual student and a group of students. The proposed system is tested on four different learning environments, namely e-learning, flipped classroom, classroom and webinar. Unobtrusive recognition of students' affective states is performed using deep learning architectures. After student-independent tenfold cross-validation, we obtained an affective-state classification accuracy of 77% and an object-localization accuracy of 81% using students' faces, hand gestures and body postures. The overall experimental results demonstrate a positive correlation (r = 0.74) between students' affective states and their performance. The proposed inquiry intervention improved students' performance, with decreases of 65%, 43%, 43%, and 53% in overall inattentive affective-state instances in the e-learning, flipped classroom, classroom and webinar environments, respectively.
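The correlation reported in the item above is a standard Pearson coefficient between paired per-student measurements. A minimal sketch of how such a value is computed; the data here is hypothetical, not from the study:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical pairing: fraction of attentive instances vs. test score per student
attentive = [0.62, 0.80, 0.45, 0.90, 0.70]
scores = [55, 72, 40, 85, 66]
print(pearson_r(attentive, scores))
```

A value near +1 indicates that students with more attentive instances also tend to score higher, which is the kind of relationship the r = 0.74 result summarizes.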
© 2020, Springer Nature B.V.

Item
Surveillance video analysis for student action recognition and localization inside computer laboratories of a smart campus (Springer, 2021) Rashmi, M.; Ashwin, T.S.; Guddeti, G.R.M.
In the era of the smart campus, unobtrusive monitoring of students is a challenging task. The monitoring system must be able to recognize and localize the actions performed by students. Recently, many deep-neural-network-based approaches have been proposed to automate Human Action Recognition (HAR) in different domains, but these have not been explored in learning environments. HAR can be used in classrooms, laboratories, and libraries to make the teaching-learning process more effective. To make the learning process in computer laboratories more effective, in this study we propose a system for recognition and localization of student actions from still images extracted from Closed-Circuit Television (CCTV) videos. The proposed method uses YOLOv3 (You Only Look Once), a state-of-the-art real-time object detection technique, for localization and recognition of students' actions. Further, an image template matching method is used to decrease the number of image frames, thus speeding up video processing. As actions performed by humans are domain specific and no standard dataset is available for student action recognition in smart computer laboratories, we created the STUDENT ACTION dataset using image frames obtained from CCTV cameras placed in the computer laboratory of a university campus. The proposed method recognizes various actions performed by students at different locations within an image frame. It shows excellent performance in identifying actions with more samples compared to actions with fewer samples. © 2020, Springer Science+Business Media, LLC, part of Springer Nature.
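The first two items report inter-annotator agreement as Cohen's κ. A minimal sketch of how that statistic is computed for two annotators; the label sequences below are illustrative, not the studies' data:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two annotators' equal-length label sequences:
    observed agreement corrected for chance agreement."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    # Chance agreement: probability both annotators pick the same label at random
    expected = sum((ca[l] / n) * (cb[l] / n) for l in set(ca) | set(cb))
    return (observed - expected) / (1 - expected)

ann1 = ["engaged", "engaged", "bored", "engaged", "bored", "confused"]
ann2 = ["engaged", "bored", "bored", "engaged", "bored", "engaged"]
print(round(cohens_kappa(ann1, ann2), 2))  # → 0.43
```

Values around 0.4-0.5, as reported above, are conventionally read as moderate agreement, which is common for subjective affect labels.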

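The frame-reduction step in the last item (matching against the previous frame to skip near-duplicate CCTV frames before running the detector) can be sketched with a simple mean-absolute-difference test; the threshold and frame sizes are illustrative assumptions, not values from the paper:

```python
import numpy as np

def is_near_duplicate(prev, curr, threshold=5.0):
    """Treat two grayscale frames as near-duplicates when their
    mean absolute pixel difference falls below a threshold."""
    diff = np.abs(prev.astype(np.float32) - curr.astype(np.float32))
    return float(diff.mean()) < threshold

def select_frames(frames, threshold=5.0):
    """Keep only frames that differ noticeably from the last kept frame."""
    kept = [frames[0]]
    for frame in frames[1:]:
        if not is_near_duplicate(kept[-1], frame, threshold):
            kept.append(frame)
    return kept

# Synthetic video: a static scene, then a visible change
static = np.zeros((48, 64), dtype=np.uint8)
changed = static.copy()
changed[10:30, 20:40] = 200  # a bright region appears in the scene
video = [static, static, static, changed, changed]
print(len(select_frames(video)))  # → 2
```

Only frames that pass this filter would then be sent to the (much more expensive) action detector, which is the speed-up the item describes.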