Please use this identifier to cite or link to this item:
https://idr.nitk.ac.in/jspui/handle/123456789/11983
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Anusha, R. | |
dc.contributor.author | Jaidhar, C.D. | |
dc.date.accessioned | 2020-03-31T08:36:03Z | - |
dc.date.available | 2020-03-31T08:36:03Z | - |
dc.date.issued | 2020 | |
dc.identifier.citation | Multimedia Tools and Applications, 2020, Vol. -, pp. - | en_US |
dc.identifier.uri | http://idr.nitk.ac.in/jspui/handle/123456789/11983 | - |
dc.description.abstract | Gait recognition is an evolving technology in the biometric domain; it aims to recognize people through an analysis of their walking pattern. One of the significant challenges for an appearance-based gait recognition system is to improve its performance by using a distinctive low-dimensional feature vector. This study therefore proposes low-dimensional features capable of effectively capturing spatial, gradient, and texture information. These features are obtained by computing the histogram of oriented gradients, followed by the sum variance Haralick texture descriptor, over nine cells of the gait gradient magnitude image (a sketch of this pipeline is given below the metadata record). The performance of the proposed method is validated on five widely used gait databases: the CASIA A gait database, the CASIA B gait database, the OU-ISIR D gait database, the CMU MoBo database, and the KTH video database. The experimental results demonstrate that the proposed approach selects significant discriminatory features for individual identification and, consequently, outperforms certain state-of-the-art methods in recognition performance. © 2020, Springer Science+Business Media, LLC, part of Springer Nature. | en_US |
dc.title | Human gait recognition based on histogram of oriented gradients and Haralick texture descriptor | en_US |
dc.type | Article | en_US |
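The abstract describes a concrete feature-extraction pipeline: a HOG descriptor combined with a sum variance Haralick texture feature computed over nine cells of the gait gradient magnitude image. The Python sketch below is a minimal, hypothetical rendering of that pipeline; the 3x3 cell grid, the 32-level quantization, and the GLCM and HOG parameter choices are illustrative assumptions, not the authors' published configuration.

```python
# Hypothetical sketch of the HOG + sum-variance pipeline from the abstract.
# Cell grid, quantization levels, and GLCM/HOG parameters are assumptions,
# not the authors' published settings.
import numpy as np
from scipy import ndimage
from skimage.feature import hog, graycomatrix


def gradient_magnitude(image):
    """Sobel gradient magnitude of a grayscale gait image (e.g. a GEI)."""
    img = image.astype(float)
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    return np.hypot(gx, gy)


def sum_variance(cell, levels=32):
    """Haralick sum-variance feature computed from a normalized GLCM."""
    # Quantize the (nonnegative) gradient magnitudes to `levels` gray levels.
    q = np.floor(cell / (cell.max() + 1e-9) * levels).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]
    # p_{x+y}(k): probability that two paired pixels' gray levels sum to k.
    ii, jj = np.indices((levels, levels))
    k = np.arange(2 * levels - 1)
    p_xy = np.array([p[ii + jj == kk].sum() for kk in k])
    sum_avg = (k * p_xy).sum()
    # Variance is shift-invariant, so the 0-based level convention is harmless.
    return float(((k - sum_avg) ** 2 * p_xy).sum())


def gait_feature_vector(image, grid=(3, 3)):
    """HOG of the gradient-magnitude image, concatenated with one
    sum-variance value per cell of a 3x3 grid (nine cells in total)."""
    gm = gradient_magnitude(image)
    hog_vec = hog(gm, orientations=9, pixels_per_cell=(16, 16),
                  cells_per_block=(2, 2), feature_vector=True)
    h, w = gm.shape
    rows, cols = grid
    cells = [gm[i * h // rows:(i + 1) * h // rows,
                j * w // cols:(j + 1) * w // cols]
             for i in range(rows) for j in range(cols)]
    texture = np.array([sum_variance(c) for c in cells])
    return np.concatenate([hog_vec, texture])
```

For a typical pre-aligned gait image (e.g. a gait energy image), `gait_feature_vector` returns a fixed-length vector that could feed a nearest-neighbour or SVM classifier, the usual back end in appearance-based gait recognition.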
Appears in Collections: 1. Journal Articles
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.