DAM-Net: Global Flood Detection from SAR Imagery Using Differential Attention Metric-Based Vision Transformers
The detection of flooded areas from high-resolution synthetic aperture radar (SAR) imagery is a critical task with applications in crisis and disaster management, as well as environmental resource planning. However, the complex nature of SAR images often leads to overestimation of the flood extent. To address this issue, we propose a novel differential attention metric-based network (DAM-Net). DAM-Net comprises two key components: a weight-sharing Siamese backbone that extracts multi-scale change features from multi-temporal images, together with tokens carrying high-level semantic information about water-body changes; and a temporal differential fusion (TDF) module that integrates the semantic tokens and change features to generate flood maps with reduced speckle noise. Specifically, the backbone is split into multiple stages. Each stage contains three modules, namely temporal-wise feature extraction (TWFE), cross-temporal change attention (CTCA), and temporal-aware change enhancement (TACE), which together extract the change features effectively. In the TACE module of the last stage, we introduce a class token that records high-level semantic information about water-body changes via the attention mechanism. A further challenge for data-driven deep learning methods is the limited availability of flood detection datasets. To overcome this, we created the open-source S1GFloods dataset, a global-scale collection of high-resolution Sentinel-1 SAR image pairs covering 46 flood events worldwide between 2015 and 2022. In experiments on S1GFloods, the proposed DAM-Net achieved top results compared to state-of-the-art methods in overall accuracy, F1-score, and IoU, with overall accuracy reaching 97.8%. Our dataset and code will be available online at https://github.com/Tamer-Saleh/S1GFlood-Detection.
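To make the described pipeline concrete, below is a minimal PyTorch sketch of a DAM-Net-style forward pass: a weight-sharing Siamese backbone whose stages pair convolutional feature extraction (TWFE) with cross-temporal attention (CTCA), a class token that attends over the last stage's change features (TACE), and a simple multi-scale fusion head standing in for the TDF module. All layer sizes, the absolute-difference change features, and the class names (`SiameseStage`, `DAMNetSketch`) are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class SiameseStage(nn.Module):
    """One backbone stage: temporal-wise feature extraction (TWFE) via a
    weight-shared conv block, then cross-temporal change attention (CTCA)
    in which each temporal epoch attends to the other."""

    def __init__(self, in_ch, out_ch, num_heads=4):
        super().__init__()
        # TWFE: the same weights process both epochs (Siamese design).
        self.twfe = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, stride=2, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.GELU(),
        )
        self.ctca = nn.MultiheadAttention(out_ch, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(out_ch)

    def forward(self, x1, x2):
        f1, f2 = self.twfe(x1), self.twfe(x2)
        b, c, h, w = f1.shape
        t1 = f1.flatten(2).transpose(1, 2)  # (B, HW, C) tokens
        t2 = f2.flatten(2).transpose(1, 2)
        a1, _ = self.ctca(t1, t2, t2)       # epoch 1 queries epoch 2
        a2, _ = self.ctca(t2, t1, t1)       # epoch 2 queries epoch 1
        t1, t2 = self.norm(t1 + a1), self.norm(t2 + a2)
        f1 = t1.transpose(1, 2).reshape(b, c, h, w)
        f2 = t2.transpose(1, 2).reshape(b, c, h, w)
        return f1, f2


class DAMNetSketch(nn.Module):
    """Siamese stages -> class-token attention (TACE-like, last stage only)
    -> fusion of multi-scale change features into a 2-class flood map
    (a simple stand-in for the paper's TDF module)."""

    def __init__(self, chans=(16, 32, 64)):
        super().__init__()
        dims = (1,) + chans  # single-band SAR input assumed
        self.stages = nn.ModuleList(
            SiameseStage(dims[i], dims[i + 1]) for i in range(len(chans))
        )
        d = chans[-1]
        self.cls_token = nn.Parameter(torch.zeros(1, 1, d))
        self.tace = nn.MultiheadAttention(d, 4, batch_first=True)
        self.head = nn.Sequential(
            nn.Conv2d(sum(chans), 32, 3, padding=1),
            nn.GELU(),
            nn.Conv2d(32, 2, 1),  # change / no-change logits
        )

    def forward(self, x1, x2):
        size = x1.shape[-2:]
        diffs = []
        for stage in self.stages:
            x1, x2 = stage(x1, x2)
            diffs.append(torch.abs(x1 - x2))  # multi-scale change features
        # Class token summarizes high-level water-body-change semantics
        # by attending over the last stage's change tokens.
        b, c, h, w = diffs[-1].shape
        tokens = diffs[-1].flatten(2).transpose(1, 2)
        cls, _ = self.tace(self.cls_token.expand(b, -1, -1), tokens, tokens)
        # Broadcast the semantic token back onto the change features.
        diffs[-1] = diffs[-1] + cls.transpose(1, 2).reshape(b, c, 1, 1)
        up = [
            nn.functional.interpolate(d, size=size, mode="bilinear",
                                      align_corners=False)
            for d in diffs
        ]
        return self.head(torch.cat(up, dim=1))


# Usage: a pre-/post-event Sentinel-1 tile pair yields per-pixel change logits.
net = DAMNetSketch()
pre, post = torch.randn(1, 1, 256, 256), torch.randn(1, 1, 256, 256)
print(net(pre, post).shape)  # torch.Size([1, 2, 256, 256])
```

The weight sharing in `SiameseStage` is what makes the design metric-like: both epochs pass through identical parameters, so differences in the outputs reflect changes in the scene rather than asymmetries in the network.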