Automated Colorization of Grayscale Images Using Superpixels and K-Means Clustering

dc.contributor.authorKulkarni, B.C.
dc.contributor.authorTeja, B.
dc.contributor.authorHegde, A.R.
dc.contributor.authorBhat, P.
dc.contributor.authorPatil, N.
dc.date.accessioned2026-02-06T06:33:26Z
dc.date.issued2025
dc.description.abstractThe process of transforming grayscale photos into aesthetically pleasing color images is called colorization. The primary objective of colorization is to convince the viewer of the realism of the result. Most grayscale photographs that require colorization depict natural scenery. A broad range of colorization techniques has been developed over the past 20 years, ranging from algorithmically simple procedures that demand time and effort because of unavoidable human participation to more sophisticated, more automated approaches. Automatic colorization is a complex field that blends deep learning, machine learning, and art. Most earlier deep-learning works train their models on every pixel value, which is computationally expensive. We present a methodology for colorizing grayscale images using a convolutional neural network (CNN); our method combines superpixel segmentation and K-Means clustering to significantly reduce the number of pixel values to be processed. The process begins with the segmentation of grayscale images into superpixels, which are perceptually uniform regions that aid efficient colorization. K-Means clustering is then applied within each superpixel to identify dominant color clusters, followed by quantization of the color information to simplify its representation. The prepared input, comprising grayscale images and quantized color information, is fed into a CNN for colorization, which leverages spatial coherence and semantic context to predict plausible colors for grayscale pixels. The proposed methodology is evaluated on a diverse set of grayscale images, demonstrating its effectiveness in producing vibrant and visually appealing colorized outputs. Through experiments and analysis, we showcase the potential applications and benefits of the proposed approach in historical photograph restoration, movie colorization, and other domains requiring accurate and efficient grayscale image colorization.
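The superpixel-plus-quantization preprocessing described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' exact pipeline: it assumes SLIC superpixels (scikit-image) and per-superpixel K-Means (scikit-learn), and the function name `quantize_colors` and the parameters `n_segments` and `n_colors` are hypothetical choices for the sketch.

```python
import numpy as np
from skimage.segmentation import slic
from sklearn.cluster import KMeans

def quantize_colors(rgb_image, n_segments=100, n_colors=4):
    """Segment an RGB image into superpixels, then quantize the colors
    inside each superpixel with K-Means, keeping only dominant colors."""
    # SLIC produces perceptually uniform regions (superpixels)
    segments = slic(rgb_image, n_segments=n_segments, start_label=0)
    quantized = np.zeros_like(rgb_image)
    for label in np.unique(segments):
        mask = segments == label
        pixels = rgb_image[mask].astype(float)  # (n_pixels, 3)
        k = min(n_colors, len(pixels))
        km = KMeans(n_clusters=k, n_init=5, random_state=0).fit(pixels)
        # Replace every pixel with its cluster centroid (dominant color)
        quantized[mask] = km.cluster_centers_[km.labels_].astype(rgb_image.dtype)
    return segments, quantized
```

The quantized image, paired with the grayscale input, would then form the training target for the CNN, with far fewer distinct color values than the raw image.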
We use SSIM and PSNR as our evaluation metrics. SSIM is computed from the similarity of the luminance, contrast, and structure of the reference and colorized RGB images, and PSNR is computed from the Mean Squared Error (MSE) relative to the peak signal value of the images. Our methodology's SSIM and PSNR for the considered flower class are 81.5 and 25.6, respectively. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2025.
dc.identifier.citationLecture Notes in Networks and Systems, 2025, Vol. 1265 LNNS, p. 545-555
dc.identifier.issn23673370
dc.identifier.urihttps://doi.org/10.1007/978-981-96-2299-3_37
dc.identifier.urihttps://idr.nitk.ac.in/handle/123456789/28647
dc.publisherSpringer Science and Business Media Deutschland GmbH
dc.subjectColorization
dc.subjectConvolutional neural networks (CNNs)
dc.subjectDeep learning
dc.subjectGrayscale images
dc.subjectImage enhancement
dc.subjectK-Means clustering
dc.subjectSuperpixel segmentation
dc.titleAutomated Colorization of Grayscale Images Using Superpixels and K-Means Clustering
