A Framework for Human Activity and Behavioural Pattern Recognition in Multimodal Sensor Smart Home Environment

dc.contributor.author: Kolkar, Ranjit
dc.contributor.author: Geetha V.
dc.date.accessioned: 2026-01-23T15:09:54Z
dc.date.issued: 2024
dc.description.abstract: Human Activity Recognition (HAR) has become a subject of significant interest due to its potential applications in fields including healthcare, sports, and user profiling. Sensor-based HAR falls into four main types: wearable, ambient, camera, and hybrid sensor-based recognition. Smartphones, with their built-in sensors, have emerged as valuable tools for HAR, and other sensors such as Passive Infrared (PIR) sensors, load sensors, smart switches, and smartwatches are used extensively in HAR systems alongside vision-based sensors. Despite advancements, accurately recognizing human activities remains challenging because of the complexity and diversity of the sensors used and the intricate nature of human activities. Each sensor type has advantages and limitations, making the selection of appropriate sensors a challenging task that requires a comprehensive understanding of their characteristics. While HAR applications already exist, significant opportunities remain to address open challenges. This work addresses several of them: improving recognition efficiency, integrating multimodal sensors, synchronizing heterogeneous sensors, collecting long-hour data with these sensors, and developing a cost-effective framework for recognizing human activities and behavioural patterns in the daily life of an elderly person. The thesis develops a framework for HAR and behavioural pattern recognition using multimodal sensors in a smart home environment. First, we design and develop a deep learning-based solution that recognizes activities from the sensors present in a smartphone. We then create and curate a dataset of long-hour human activities in a multimodal sensor-equipped smart home, and finally design and develop a human behavioural pattern recognition system for that environment.
The first work compares the performance of several deep learning models for HAR with smartphone-based sensors: Convolutional Neural Networks (CNN), Long Short-Term Memory (LSTM), and Gated Recurrent Units (GRU). The study explored various datasets and recognition models, providing valuable insights into the overall HAR architecture. The primary objective of this research is to accurately recognize basic human activities such as walking, sitting, standing, going upstairs, going downstairs, and lying down. The models were trained and evaluated on well-known datasets such as Wireless Sensor Data Mining (WISDM) and the University of California, Irvine, Human Activity Recognition (UCI-HAR) dataset. Through rigorous experimentation, performance on these datasets was significantly improved using the GRU model, laying the foundation for the subsequent research objectives. Additionally, the thesis proposes a novel Spider Monkey Optimization (SMO)-based deep neural network to further enhance HAR accuracy and precision. The proposed system was evaluated on datasets covering similar activities, including UCI-HAR, WISDM, the Royal Institute of Technology (KTH) action dataset, and Physical Activity Monitoring using Accelerometers, Gyroscopes, and Magnetometers (PAMAP2). The optimization improved performance and reduced training time, making the approach practical for real-world applications. The second work in the thesis involves collecting long-hour datasets using a multimodal approach. The literature and our previous work show that understanding human behaviour patterns solely from basic activities and smartphone sensors is challenging; therefore, in this work we combined smartphone sensor data with ambient sensors to better capture the user's context.
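As a minimal sketch of the GRU recurrence that such smartphone-sensor HAR models rely on, the following folds one fixed-length sensor window into a class score vector. The window size, channel count, hidden size, and randomly initialized weights are illustrative assumptions, not the thesis's actual architecture or trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
T, C, H, K = 128, 9, 16, 6   # timesteps, sensor channels, hidden units, activity classes

# Randomly initialized weights stand in for trained parameters.
Wz, Wr, Wh = (rng.normal(0, 0.1, (H, C)) for _ in range(3))
Uz, Ur, Uh = (rng.normal(0, 0.1, (H, H)) for _ in range(3))
Wo = rng.normal(0, 0.1, (K, H))

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_encode(x):
    """Fold a (T, C) sensor window into a hidden state via the GRU equations."""
    h = np.zeros(H)
    for t in range(x.shape[0]):
        z = sigmoid(Wz @ x[t] + Uz @ h)             # update gate
        r = sigmoid(Wr @ x[t] + Ur @ h)             # reset gate
        h_cand = np.tanh(Wh @ x[t] + Uh @ (r * h))  # candidate state
        h = (1 - z) * h + z * h_cand                # interpolate old and new state
    return h

window = rng.normal(size=(T, C))   # one accelerometer/gyroscope window
logits = Wo @ gru_encode(window)   # one score per activity class
print(logits.shape)                # (6,)
```

In a trained model the weights would come from gradient descent on labelled windows (e.g. from WISDM or UCI-HAR), and the argmax of the logits would give the predicted activity.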
The context includes room occupancy detection using PIR sensors, water bottle level indication using load sensors, and monitoring the status of the TV, bathroom lights, and mirror bulb lights using smart switches. By combining these sources, and by proposing a hybrid sensor-based data collection approach applied to two individuals over an extended period, we derived a broader range of activities beyond the basic ones. The third work in the thesis proposes a novel priority-based labelling technique for data segmentation that retains user context during labelling. This enhanced dataset enables valuable insights into human behaviour patterns in day-to-day life. Additionally, through a comprehensive analysis of user data, we can derive the user's personality and provide feedback on behaviour patterns so that activities performed over time can be improved or analyzed. The research identifies various applications, such as elderly monitoring systems, personality identification, and behaviour analysis, all aimed at improving health and well-being. KEYWORDS: HAR, SMO, Wearable sensors, Smartphone sensors, Deep learning, Ambient sensors, Internet of Things (IoT), PIR, Elderly monitoring, User profiling, Behaviour patterns.
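One plausible reading of priority-based labelling, sketched below, is that when several sensor-derived events overlap within a single time window, the window keeps the label of the highest-priority event, so the more informative ambient context is not drowned out by generic motion events. The priority table and event names here are hypothetical illustrations; the thesis's actual priority scheme is not reproduced.

```python
# Hypothetical priority table: larger number = more specific context wins.
PRIORITY = {"bathroom_light": 3, "tv_on": 2, "pir_motion": 1, "idle": 0}

def label_window(events):
    """Assign a window the label of its highest-priority overlapping event."""
    if not events:
        return "idle"                       # no sensor fired in this window
    return max(events, key=lambda e: PRIORITY.get(e, 0))

print(label_window(["pir_motion", "tv_on"]))  # tv_on
print(label_window([]))                       # idle
```

Under this scheme a window where both the PIR sensor and the TV smart switch fire is labelled by the TV event, preserving the richer context for downstream behaviour-pattern analysis.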
dc.identifier.uri: https://idr.nitk.ac.in/handle/123456789/18798
dc.language.iso: en
dc.publisher: National Institute of Technology Karnataka, Surathkal
dc.subject: SMO
dc.subject: HAR
dc.subject: Wearable sensors
dc.subject: Smartphone sensors
dc.subject: Deep learning
dc.subject: Ambient sensors
dc.subject: Internet of Things (IoT)
dc.subject: PIR
dc.subject: Elderly monitoring
dc.subject: User profiling
dc.subject: Behaviour patterns
dc.title: A Framework for Human Activity and Behavioural Pattern Recognition in Multimodal Sensor Smart Home Environment
dc.type: Thesis

Files

Original bundle:
  187059-IT002-RanjitKolkar.pdf (4.11 MB, Adobe Portable Document Format)

License bundle:
  license.txt (1.71 KB, Item-specific license agreed upon to submission)