Please use this identifier to cite or link to this item: https://idr.nitk.ac.in/jspui/handle/123456789/7031
Full metadata record
DC Field | Value | Language
dc.contributor.author | Rajan, J. | -
dc.contributor.author | Den Dekker, A.J. | -
dc.contributor.author | Juntu, J. | -
dc.contributor.author | Sijbers, J. | -
dc.date.accessioned | 2020-03-30T09:58:26Z | -
dc.date.available | 2020-03-30T09:58:26Z | -
dc.date.issued | 2013 | -
dc.identifier.citation | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2013, Vol.8251 LNCS, pp.451-458 | en_US
dc.identifier.uri | http://idr.nitk.ac.in/jspui/handle/123456789/7031 | -
dc.description.abstract | Denoising of Magnetic Resonance images is important for proper visual analysis, accurate parameter estimation, and for further preprocessing of these images. Maximum Likelihood (ML) estimation methods have proved to be very effective in denoising Magnetic Resonance (MR) images. Among the ML-based methods, the recently proposed Non Local Maximum Likelihood (NLML) approach has gained much attention. In the NLML method, the samples for the ML estimation of the true underlying intensity are selected in a non-local way based on the intensity similarity of the pixel neighborhoods. This similarity is generally measured using the Euclidean distance. A drawback of this approach is the use of a fixed sample size for the ML estimation; as a result, optimal results cannot be achieved because of over- or under-smoothing. In this work, we propose an NLML estimation method for denoising MR images in which the samples are selected in an adaptive way using the Kolmogorov-Smirnov (KS) similarity test. The method has been tested on both simulated and real data, showing its effectiveness. © Springer-Verlag 2013. | en_US
dc.title | A new nonlocal maximum likelihood estimation method for denoising magnetic resonance images | en_US
dc.type | Book chapter | en_US
Appears in Collections: 2. Conference Papers

Files in This Item:
There are no files associated with this item.
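The abstract above outlines the method: for each pixel, similar pixels are gathered non-locally by comparing neighbourhood patches with the Euclidean distance, the fixed sample size of standard NLML is replaced by an adaptive selection driven by a two-sample Kolmogorov-Smirnov test, and the noise-free intensity is then obtained by maximum-likelihood estimation under a Rician noise model for the magnitude data. The following is a minimal, illustrative Python sketch of such a pipeline; it is not the authors' implementation, and the function names, the significance level alpha, and the fixed patch/search radii are assumptions chosen for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import i0e
from scipy.stats import ks_2samp


def rician_ml_estimate(samples, sigma):
    """ML estimate of the underlying intensity A from Rician-distributed samples."""
    samples = np.asarray(samples, dtype=float)

    def neg_log_likelihood(a):
        x = a * samples / sigma ** 2
        log_i0 = np.log(i0e(x)) + x          # stable evaluation of log I0(x)
        return -np.sum(-(samples ** 2 + a ** 2) / (2 * sigma ** 2) + log_i0)

    res = minimize_scalar(neg_log_likelihood,
                          bounds=(0.0, samples.max() + 3.0 * sigma),
                          method="bounded")
    return res.x


def nlml_denoise(img, sigma, patch_radius=1, search_radius=5, alpha=0.05):
    """Denoise a 2-D magnitude image pixel by pixel (illustrative, unoptimized)."""
    pad = patch_radius + search_radius
    padded = np.pad(img.astype(float), pad, mode="reflect")
    out = np.empty_like(img, dtype=float)

    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            ci, cj = i + pad, j + pad
            ref = padded[ci - patch_radius:ci + patch_radius + 1,
                         cj - patch_radius:cj + patch_radius + 1].ravel()

            # Rank the pixels in the search window by the Euclidean distance
            # between their neighbourhood patch and the reference patch.
            candidates = []
            for di in range(-search_radius, search_radius + 1):
                for dj in range(-search_radius, search_radius + 1):
                    ni, nj = ci + di, cj + dj
                    patch = padded[ni - patch_radius:ni + patch_radius + 1,
                                   nj - patch_radius:nj + patch_radius + 1].ravel()
                    candidates.append((np.linalg.norm(ref - patch),
                                       padded[ni, nj], patch))
            candidates.sort(key=lambda c: c[0])

            # Adaptive sample selection: a candidate contributes to the ML
            # estimate only if a two-sample KS test does not reject that its
            # patch values share the reference patch's distribution.
            samples = [value for _, value, patch in candidates
                       if ks_2samp(ref, patch).pvalue >= alpha]
            if not samples:                  # fall back to the pixel itself
                samples = [candidates[0][1]]

            out[i, j] = rician_ml_estimate(samples, sigma)
    return out
```

A call such as nlml_denoise(noisy_slice, sigma=20.0) would process a single 2-D magnitude slice; the noise level sigma is assumed known or pre-estimated (for example from the image background), as is common for ML-based MR denoising.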