Faculty Publications

Permanent URI for this community: https://idr.nitk.ac.in/handle/123456789/18736

Publications by NITK Faculty

Search Results

Now showing 1 - 5 of 5
  • Item
    An E-Learning System with Multifacial Emotion Recognition Using Supervised Machine Learning
    (Institute of Electrical and Electronics Engineers Inc., 2016) Ashwin, T.S.; Jose, J.; Raghu, G.; Guddeti, G.R.
    E-learning systems based on affective computing are popularly used for emotional/behavioral analysis of users. The emotions expressed by the user are identified by detecting the user's facial expressions, and the teaching strategies are changed accordingly. Present e-learning systems mainly focus on single-user face detection. Hence, in this paper, we propose a multi-user face detection based e-learning system using a support vector machine based supervised machine learning technique. Experimental results demonstrate that the proposed system provides an accuracy of 89% to 100% across different datasets (LFW, FDDB, and YFD). Further, to improve the speed of emotional feature processing, we used a GPU along with the CPU, thereby achieving a speedup factor of 2. © 2015 IEEE.
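
A minimal sketch of the pipeline this abstract describes: detect every face in a frame (multi-user, not just one), then classify each crop with an SVM. The detector (a Haar cascade), the raw-pixel feature representation, and the emotion label set are illustrative assumptions, not the authors' implementation; the demo classifier is trained on random vectors only to keep the example self-contained.

```python
# Sketch: multi-face detection + SVM emotion classification.
# The detector, features, and labels are assumptions; the paper's
# exact pipeline and datasets (LFW, FDDB, YFD) are not reproduced.
import cv2
import numpy as np
from sklearn.svm import SVC

EMOTIONS = ["neutral", "happy", "sad", "angry"]  # assumed label set

def detect_faces(gray):
    """Detect all faces in a grayscale frame (multi-user)."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

def train_demo_svm(n=200, size=48):
    """Train an SVC on random vectors as a stand-in for real labelled crops."""
    rng = np.random.default_rng(0)
    X = rng.random((n, size * size))
    y = rng.integers(0, len(EMOTIONS), n)
    return SVC(kernel="rbf").fit(X, y)

def classify_frame(frame, clf, size=48):
    """Return one predicted emotion per detected face in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in detect_faces(gray):
        crop = cv2.resize(gray[y:y + h, x:x + w], (size, size))
        label = clf.predict(crop.reshape(1, -1) / 255.0)[0]
        results.append(((x, y, w, h), EMOTIONS[label]))
    return results
```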
  • Item
    Unobtrusive students' engagement analysis in computer science laboratory using deep learning techniques
    (Institute of Electrical and Electronics Engineers Inc., 2018) Ashwin, T.S.; Guddeti, R.M.
    Nowadays, analysing students' engagement using non-verbal cues is popular and effective. There are several web-camera-based applications for predicting students' engagement in an e-learning environment, but there is very limited work on analysing students' engagement using video surveillance cameras in a teaching laboratory. In this paper, we propose a convolutional neural network based methodology for analysing students' engagement using video surveillance cameras in a teaching laboratory. The proposed system is tested on five different computer science and information technology courses with 243 students at NITK Surathkal, Mangalore, India. The experimental results demonstrate a positive correlation between students' engagement and learning, and the proposed system outperforms existing systems. © 2018 IEEE.
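
As a rough, hedged illustration of a CNN-based engagement classifier (the paper's actual architecture and training data are not available here), the small Keras model below maps a cropped frame region to one of a few assumed engagement levels.

```python
# Illustrative CNN for frame-level engagement classification; the
# architecture and the 3-level label set are assumptions.
import tensorflow as tf

def build_engagement_cnn(num_levels=3, input_shape=(96, 96, 1)):
    """Small CNN mapping a face/posture crop to an engagement level."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dropout(0.5),
        tf.keras.layers.Dense(num_levels, activation="softmax"),
    ])

model = build_engagement_cnn()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```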
  • Item
    Unobtrusive Behavioral Analysis of Students in Classroom Environment Using Non-Verbal Cues
    (Institute of Electrical and Electronics Engineers Inc., 2019) Ashwin, T.S.; Guddeti, G.R.
    Pervasive intelligent learning environments can be made more personalized by adapting the teaching strategies according to the students' emotional and behavioral engagement. Students' engagement analysis helps to foster the emotions and behavioral patterns that are beneficial to learning, thus improving the effectiveness of the teaching-learning process. Unobtrusive student engagement analysis is performed using the students' non-verbal cues, such as facial expressions, hand gestures, and body postures. Though several techniques exist for classifying the engagement of a single student in a single image frame, there is limited work on students' engagement analysis in a classroom environment. In this paper, we propose a convolutional neural network architecture for unobtrusive students' engagement analysis using non-verbal cues. The proposed architecture is trained and tested on faces, hand gestures, and body postures in the wild of more than 350 students in a classroom environment, with each test image containing multiple students in a single frame. The data annotation is performed using a gold-standard study, and the annotators reliably agree with Cohen's κ = 0.43. We obtained 71% accuracy for students' engagement level classification. Further, a pre-test/post-test analysis was performed, and a positive correlation was observed between the students' engagement and their test performance. © 2013 IEEE.
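
The abstract reports inter-annotator agreement as Cohen's κ = 0.43. The snippet below shows how such a score is computed with scikit-learn; the two annotators' engagement labels are made up purely for illustration.

```python
# Inter-annotator agreement via Cohen's kappa; labels are hypothetical
# engagement levels (0 = disengaged, 1 = nominal, 2 = engaged).
from sklearn.metrics import cohen_kappa_score

annotator_a = [0, 1, 2, 1, 0, 2, 1, 1, 0, 2]
annotator_b = [0, 1, 1, 1, 0, 2, 2, 1, 0, 1]

# Values around 0.4 are usually read as moderate agreement.
print(cohen_kappa_score(annotator_a, annotator_b))
```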
  • Item
    Impact of inquiry interventions on students in e-learning and classroom environments using affective computing framework
    (Springer Science and Business Media B.V., 2020) Ashwin, T.S.; Guddeti, R.M.R.
    Effective teaching strategies improve the students' learning rate within academic learning time. Inquiry-based instruction is one of the effective teaching strategies used in classrooms, but it is not adopted in other learning environments such as intelligent tutoring systems, including auto tutors. In this paper, we propose an automatic inquiry-based instruction teaching strategy, i.e., inquiry intervention using students' affective states. The proposed model contains two modules: the first is a framework for unobtrusively predicting multi-modal students' affective states (teacher-centric attentive and in-attentive states) using facial expressions, hand gestures, and body postures; the second is an automated inquiry-based instruction teaching strategy that compares learning outcomes with and without inquiry intervention using affective state transitions, for both an individual student and a group of students. The proposed system is tested on four different learning environments: e-learning, flipped classroom, classroom, and webinar. Unobtrusive recognition of students' affective states is performed using deep learning architectures. After student-independent tenfold cross-validation, we obtained a students' affective state classification accuracy of 77% and an object localization accuracy of 81% using students' faces, hand gestures, and body postures. The overall experimental results demonstrate a positive correlation (r = 0.74) between students' affective states and their performance. The proposed inquiry intervention improved the students' performance, with decreases of 65%, 43%, 43%, and 53% in overall in-attentive affective state instances in the e-learning, flipped classroom, classroom, and webinar environments, respectively. © 2020, Springer Nature B.V.
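
A toy illustration of the two headline numbers in this abstract: the Pearson correlation between per-student attentiveness and test performance, and the percentage decrease of in-attentive instances after an intervention. All values below are invented; only the computations mirror the reported measures.

```python
# Hypothetical data; only the two computations match the abstract's measures.
from scipy.stats import pearsonr

attentive_fraction = [0.9, 0.7, 0.8, 0.5, 0.6, 0.95]  # per-student, made up
test_score = [88, 70, 75, 52, 61, 91]                  # made up

r, p = pearsonr(attentive_fraction, test_score)
print(f"correlation r = {r:.2f} (p = {p:.3f})")

def inattentive_drop(before, after):
    """Percentage decrease in in-attentive instances after an intervention."""
    return 100.0 * (before - after) / before

print(inattentive_drop(before=120, after=42))  # 65.0, i.e. a 65% decrease
```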
  • Item
    Affective Feedback Synthesis Towards Multimodal Text and Image Data
    (Association for Computing Machinery, 2023) Kumar, P.; Bhatt, G.; Ingle, O.; Goyal, D.; Raman, B.
    In this article, we define a novel task of affective feedback synthesis: generating feedback for input text and corresponding images in a way similar to how humans respond to multimodal data. A feedback synthesis system has been proposed and trained using ground-truth human comments along with image-text input. We have also constructed a large-scale dataset consisting of images, text, Twitter user comments, and the number of likes for the comments by crawling news articles through Twitter feeds. The proposed system extracts textual features using a transformer-based textual encoder, and visual features using a Faster region-based convolutional neural network model. The textual and visual features are concatenated to construct multimodal features that the decoder uses to synthesize the feedback. We compared the results of the proposed system with baseline models using quantitative and qualitative measures. The synthesized feedback has been analyzed using automatic and human evaluation and found to be semantically similar to the ground-truth comments and relevant to the given text-image input. © 2023 Copyright held by the owner/author(s). Publication rights licensed to ACM.
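
A sketch of the fusion step this abstract describes: a pooled text embedding from a transformer encoder concatenated with a pooled visual embedding from a Faster R-CNN backbone. The specific checkpoints (bert-base-uncased, ResNet-50 FPN) and the mean-pooling are assumptions, and the feedback decoder is omitted; this is not the authors' implementation.

```python
# Sketch of multimodal fusion: transformer text features + Faster R-CNN
# visual features, concatenated. Checkpoints and pooling are assumptions.
import torch
from transformers import AutoTokenizer, AutoModel
from torchvision.models.detection import fasterrcnn_resnet50_fpn

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
text_encoder = AutoModel.from_pretrained("bert-base-uncased")
detector = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

def multimodal_features(text, image):
    """Concatenate a pooled text embedding with a pooled visual embedding."""
    tokens = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        text_feat = text_encoder(**tokens).last_hidden_state.mean(dim=1)  # (1, 768)
        fmap = detector.backbone(image.unsqueeze(0))["pool"]  # coarsest FPN level
        vis_feat = fmap.mean(dim=(2, 3))                      # (1, 256)
    # The fused vector would feed the (omitted) feedback decoder.
    return torch.cat([text_feat, vis_feat], dim=1)            # (1, 1024)

fused = multimodal_features("breaking news headline", torch.rand(3, 224, 224))
```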