MARU-Net: Multiscale Attention Gated Residual U-Net With Contrastive Loss for SAR-Optical Image Matching
Peer reviewed journal article, published version.
Date: 2023
Original version: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2023, 16, 4891–4899. DOI: 10.1109/JSTARS.2023.3277550

Abstract
Accurate synthetic aperture radar (SAR)-optical image matching is essential for combining the complementary information captured by the two sensors. The main challenge is the heterogeneous imaging characteristics of the two modalities. In this article, we propose an end-to-end machine learning pipeline inspired by recent advances in image segmentation. We develop a Siamese multiscale attention-gated residual U-Net for feature extraction from satellite images. The Siamese architecture shares weights between its branches and transforms the heterogeneous images into a common feature space. The fast Fourier transform is used to compute the cross-correlation between the feature maps and produce a similarity map, and a contrastive loss is introduced to maximize the discriminability of the learned features during training. Experimental results on a benchmark dataset show that the proposed method achieves superior matching accuracy and precision compared with other state-of-the-art methods.
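The two computational ingredients named in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names and the specific contrastive-loss form (a margin-based loss on a similarity score) are illustrative assumptions; the paper's actual loss and similarity-map construction may differ in detail.

```python
import numpy as np

def fft_cross_correlation(feat_a: np.ndarray, feat_b: np.ndarray) -> np.ndarray:
    """Cross-correlate two single-channel feature maps via the FFT.

    By the cross-correlation theorem, correlation in the spatial domain
    equals elementwise multiplication with the complex conjugate in the
    frequency domain, which is far cheaper than direct correlation for
    large feature maps.
    """
    fa = np.fft.fft2(feat_a)
    fb = np.fft.fft2(feat_b)
    corr = np.fft.ifft2(fa * np.conj(fb)).real
    # Shift the zero-lag term to the centre so the peak location
    # directly encodes the translational offset between the maps.
    return np.fft.fftshift(corr)

def contrastive_loss(similarity: float, label: int, margin: float = 1.0) -> float:
    """Illustrative margin-based contrastive loss on a similarity score.

    label = 1 for a matching SAR-optical pair, 0 for a non-matching pair.
    Matching pairs are pulled toward similarity 1; non-matching pairs are
    pushed apart until their distance exceeds the margin.
    """
    d = 1.0 - similarity  # treat dissimilarity as a distance
    return label * d**2 + (1 - label) * max(margin - d, 0.0)**2
```

The similarity map returned by `fft_cross_correlation` peaks at the offset where the two feature maps align best; in a matching pipeline, the location of that peak gives the estimated translation between the SAR and optical patches.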