Detection and Classification of Ships using a self-attention Residual Network

Date

2022

Publisher

Institute of Electrical and Electronics Engineers Inc.

Abstract

The detection and classification of ships is crucial in maritime applications such as marine surveillance and traffic monitoring, which are in turn important for national security. Extensive research has been carried out in this area over the last decade, driven by growing computational power and the increasing number of satellite images collected over the years. However, high-resolution images of ships that capture the texture details needed for effective classification are still very limited. In recent years, researchers have assembled images from various sources into datasets that are made publicly available. In our work, we use the recently released ShipRSImageNet dataset. Many state-of-the-art object detection methods have been applied to this dataset at different levels of ship detection, but owing to the dataset's complexity, these models are not capable of fully categorizing the ships it contains. In this work, a residual attention-based convolutional neural network (CNN) is employed as a feature-extraction backbone whose output can be fed into state-of-the-art object detection models. The backbone networks usually employed for this purpose are complex and consist of many layers; because they need a large number of training parameters, these models have heavy-weight designs and therefore require more storage and costlier training. © 2022 IEEE.
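The abstract does not give implementation details, but the core idea it names, a self-attention operation wrapped in a residual connection over CNN feature maps, can be sketched as follows. This is a minimal illustration in NumPy, not the authors' model: the projection matrices `w_q`, `w_k`, `w_v`, the single-head formulation, and all shapes are assumptions for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def residual_self_attention_block(x, w_q, w_k, w_v):
    """Single-head self-attention over the spatial positions of a
    feature map x of shape (H, W, C), added back residually.
    w_q, w_k, w_v are (C, C) projection matrices (illustrative)."""
    h, w, c = x.shape
    tokens = x.reshape(h * w, c)            # flatten grid to HW tokens
    q, k, v = tokens @ w_q, tokens @ w_k, tokens @ w_v
    attn = softmax(q @ k.T / np.sqrt(c))    # (HW, HW) attention weights
    out = attn @ v                          # attend over all positions
    return x + out.reshape(h, w, c)         # residual connection

# Toy usage on a random 4x4 feature map with 8 channels.
rng = np.random.default_rng(0)
c = 8
x = rng.standard_normal((4, 4, c))
w_q, w_k, w_v = (rng.standard_normal((c, c)) * 0.1 for _ in range(3))
y = residual_self_attention_block(x, w_q, w_k, w_v)
print(y.shape)
```

The residual sum keeps the original convolutional features intact while the attention term lets every spatial position aggregate context from the whole map, which is the property such backbones exploit for cluttered maritime scenes.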

Keywords

Attention model, Deep learning, Satellite images, Ship detection and classification

Citation

2022 IEEE 6th Conference on Information and Communication Technology (CICT 2022), 2022.
