Learning U-Net without forgetting for near real-time wildfire monitoring by the fusion of SAR and optical time series

Type: Journal Article
Author(s): Puzhao Zhang; Yifang Ban; Andrea Nascetti
Publication Date: 2021

Wildfires are increasing in intensity and frequency across the globe due to climate change and rising global temperatures. Developing novel approaches to monitor wildfire progression in near real-time is therefore of critical importance for emergency response. The objective of this research is to investigate continuous learning with U-Net, exploiting both Sentinel-1 SAR and Sentinel-2 MSI time series to increase the frequency and accuracy of wildfire progression mapping. In this study, optical-based burned areas prior to each SAR acquisition (when available) were accumulated into SAR-based pseudo progression masks to train a deep residual U-Net model. Unlike multi-temporal fusion of SAR and optical data, the temporal fusion of progression masks allows as many wildfire progressions as possible to be tracked. Specifically, two approaches were investigated to train the deep residual U-Net model for continuous learning: 1) continuous joint training (CJT) with all historical data (including both SAR and optical data); 2) learning without forgetting (LwF) based on newly incoming data alone (SAR or optical). For LwF, a mean squared loss was integrated to preserve previously learned capabilities and prevent the model from overfitting to the newly incoming data alone. By fusing optical-based burned areas, SAR-based pseudo progression masks improve significantly, which benefits both data sampling and model training, given the challenges of SAR-based change extraction caused by variability in the SAR backscatter of surrounding environments. A pre-trained ResNet was frozen as the encoder of the U-Net model, and the decoder was trained to further refine the derived burned area maps in a progression-wise manner.
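The abstract does not give the LwF objective explicitly; as a rough illustration only (the function name, the λ weighting, and the use of binary cross-entropy as the task term are assumptions, not details from the paper), an LwF-style loss that combines a segmentation term on new data with a mean squared "do not forget" term against the frozen pre-update model's outputs can be sketched as:

```python
import numpy as np

def lwf_loss(new_logits, old_logits, labels, lam=1.0):
    """Illustrative learning-without-forgetting style loss.

    new_logits : outputs of the model being updated, on newly incoming data
    old_logits : outputs of the frozen pre-update model on the same data
    labels     : binary burned-area (pseudo progression) mask
    lam        : weight of the distillation term (hypothetical parameter)
    """
    eps = 1e-7
    p = 1.0 / (1.0 + np.exp(-new_logits))  # sigmoid probabilities
    # task term: binary cross-entropy against the pseudo progression mask
    bce = -np.mean(labels * np.log(p + eps) + (1 - labels) * np.log(1 - p + eps))
    # forgetting-prevention term: mean squared difference to the old model's
    # outputs, penalizing drift away from previously learned behavior
    mse = np.mean((new_logits - old_logits) ** 2)
    return bce + lam * mse

# toy check: when the new model agrees with the old one, only the task term remains
labels = np.array([1.0, 0.0])
logits = np.array([5.0, -5.0])
loss_no_drift = lwf_loss(logits, logits, labels)
loss_drifted = lwf_loss(logits, np.zeros(2), labels)
```

The distillation term grows as the updated model's outputs diverge from the frozen model's, which is what discourages catastrophic forgetting when only new SAR or optical data are available.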
The experimental results demonstrated that LwF has the potential to match CJT in terms of the agreement between SAR-based results and optical-based ground truth, achieving an F1 score of 0.8423 on the Sydney Fire (2019-2020) and 0.7807 on the Chuckegg Creek Fire (2019). We also observed that the SAR cross-polarization ratio (VH/VV) is effective at suppressing multiplicative noise and detecting burned areas when VH and VV exhibit diverse temporal behaviors.
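Why the VH/VV ratio suppresses multiplicative noise follows from the arithmetic of dB units: a division in linear power units becomes a subtraction in dB, so a speckle factor common to both polarizations largely cancels. A minimal sketch (the function name and the toy backscatter values are illustrative assumptions, not data from the paper):

```python
import numpy as np

def cross_pol_ratio_db(vh_linear, vv_linear):
    """Cross-polarization ratio VH/VV expressed in dB.

    In linear units speckle acts multiplicatively; taking the ratio
    cancels any factor common to VH and VV before the dB conversion.
    """
    return 10.0 * np.log10(vh_linear / vv_linear)

# toy example: the same multiplicative speckle factor applied to VH and VV
speckle = np.array([0.8, 1.3, 1.0])
vh = 0.02 * speckle   # hypothetical VH backscatter (linear power)
vv = 0.10 * speckle   # hypothetical VV backscatter (linear power)
ratio = cross_pol_ratio_db(vh, vv)
# the shared speckle factor cancels, so every pixel yields the same ratio
```

Changes in this ratio between pre- and post-fire acquisitions are then informative when burning alters VH and VV differently, which matches the paper's observation about diverse temporal behaviors of the two channels.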

Online Links
Citation: Zhang, Puzhao; Ban, Yifang; Nascetti, Andrea. 2021. Learning U-Net without forgetting for near real-time wildfire monitoring by the fusion of SAR and optical time series. Remote Sensing of Environment 261:112467.

Cataloging Information

Keywords:
  • Alberta
  • Arizona
  • Australia
  • burned area
  • Canada
  • change detection
  • Chuckegg Creek Fire
  • deep learning
  • fire progression
  • Mangum Fire
  • SAR - synthetic aperture radar
  • Sentinel-1
  • Sentinel-2
  • Sydney Fire
  • U-Net
  • wildfires
Record Maintained By: FRAMES Staff (https://www.frames.gov/contact)
FRAMES Record Number: 63599