DOI: 10.3390/rs17132241 ISSN: 2072-4292

SAR-DeCR: Latent Diffusion for SAR-Fused Thick Cloud Removal

Meilin Wang, Shihao Hu, Yexing Song, Yukai Shi

Current methods for removing thick clouds from remote-sensing images face significant challenges: integrating cloud-covered optical images with synthetic aperture radar (SAR) ground information, extracting meaningful guidance from SAR data, and accurately reconstructing textures in cloud-covered regions. To overcome these challenges, we introduce SAR-DeCR, a novel method for thick cloud removal in satellite remote-sensing images. SAR-DeCR combines a diffusion model with a transformer architecture to synthesize accurate texture details guided by SAR ground information. The method comprises three phases, each aimed at improving the effectiveness of thick cloud removal: coarse cloud removal (CCR), SAR-Fusion (SAR-F), and cloud-free diffusion (CF-D). In CCR, we exploit the transformer’s capacity for long-range information interaction, which significantly strengthens the cloud removal process. To recover ground information lost beneath clouds and keep the reconstructed content consistent with SAR observations, we introduce SAR-F, a module that incorporates the rich ground information in SAR into the output of CCR. Finally, to achieve superior texture reconstruction, we introduce prior supervision based on the output of CCR, using a pre-trained visual-text diffusion model named cloud-free diffusion (CF-D). This diffusion model is encouraged to follow the visual prompts, producing visually appealing, high-quality results. Qualitative and quantitative experiments on the large-scale SEN12MS-CR dataset demonstrate the effectiveness and superiority of SAR-DeCR over other state-of-the-art (SOTA) thick cloud removal methods.
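The three-phase design described above can be sketched as a simple data-flow composition. The stubs below are purely illustrative placeholders (the real stages are a transformer, a fusion module, and a pre-trained visual-text diffusion model, none of which are reproduced here); function names, the blending rule, and array shapes are all assumptions for the sketch, not the paper's implementation.

```python
import numpy as np

# Hypothetical stage stubs illustrating the SAR-DeCR data flow:
# cloudy optical image + SAR image -> CCR -> SAR-F -> CF-D.

def coarse_cloud_removal(cloudy: np.ndarray) -> np.ndarray:
    """CCR: transformer-based coarse estimate of the cloud-free scene (stub)."""
    return cloudy  # placeholder; a real CCR removes most cloud cover

def sar_fusion(coarse: np.ndarray, sar: np.ndarray) -> np.ndarray:
    """SAR-F: inject SAR ground structure into the coarse result (stub)."""
    return 0.5 * coarse + 0.5 * sar  # placeholder blend, not the paper's fusion

def cloud_free_diffusion(fused: np.ndarray, visual_prompt: np.ndarray) -> np.ndarray:
    """CF-D: diffusion refinement, prior-supervised by the CCR output (stub)."""
    return fused  # placeholder; a real CF-D runs a conditioned diffusion model

def sar_decr(cloudy: np.ndarray, sar: np.ndarray) -> np.ndarray:
    coarse = coarse_cloud_removal(cloudy)        # phase 1: CCR
    fused = sar_fusion(coarse, sar)              # phase 2: SAR-F
    return cloud_free_diffusion(fused, coarse)   # phase 3: CF-D, CCR output as prior

# Example: a 3-channel optical patch and a 1-channel SAR patch.
restored = sar_decr(np.zeros((3, 64, 64)), np.ones((1, 64, 64)))
```

The key point the sketch conveys is that the CCR output is used twice: once as the input to SAR-F, and again as the visual prompt supervising CF-D.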
