This digital tool is part of the Inter-American Development Bank's catalog of tools. You can learn more about the IDB initiative at code.iadb.org.
Detection of burned areas using deep learning from satellite images.
• Description • Notebooks • About Dymaxion Labs • Contributing • License •
The burned-area-detection project aims to identify and analyze areas affected by fire incidents. Mapping these areas helps us understand how an incident behaved so that action can be taken promptly.
The number of uncontrolled fires has increased significantly in recent years. This kind of environmental catastrophe affects habitats and communities on several levels. In the short term, the impact can be seen in the well-being of the communities living in affected areas and in the evacuation processes they must undergo; in the long term, it shows up as lasting damage to nature and local economies. Measuring these affected areas is one of the project's principal goals.
This project uses Sentinel-2 public satellite images. Sentinel-2 has a high revisit frequency at no cost, which allows the evolution of the affected area to be studied over time. The images can be downloaded from Google Earth Engine. Several reflectance bands are available, and combinations of them can be more sensitive for detecting burned areas.
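As an illustration of how such imagery might be retrieved, the sketch below filters a Sentinel-2 surface reflectance collection with the Google Earth Engine Python API; the area of interest, dates, and cloud threshold are placeholders, not the project's actual configuration.

```python
import ee

ee.Initialize()

# Hypothetical area of interest and fire date; adjust to your study case.
aoi = ee.Geometry.Rectangle([-62.5, -33.5, -62.0, -33.0])
fire_date = "2020-09-15"

def composite(start, end):
    """Median composite of mostly cloud-free Sentinel-2 SR scenes over the AOI."""
    return (
        ee.ImageCollection("COPERNICUS/S2_SR")
        .filterBounds(aoi)
        .filterDate(start, end)
        .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 20))
        .median()
    )

pre_fire = composite("2020-08-15", fire_date)   # composite before the fire
post_fire = composite(fire_date, "2020-10-15")  # composite after the fire
```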
The Normalized Burn Ratio (NBR) is an index that highlights burnt areas in large fire zones. The formula combines the near-infrared (NIR) and shortwave infrared (SWIR) wavelengths: NBR = (NIR − SWIR) / (NIR + SWIR).
Healthy vegetation shows very high reflectance in the NIR and low reflectance in the SWIR portion of the spectrum (see figure below). The opposite happens for areas destroyed by fire: recently burnt areas show low reflectance in the NIR and high reflectance in the SWIR. Therefore, the normalized difference between the NIR and the SWIR is a good discriminant for this kind of phenomenon.
The difference between the pre-fire and post-fire NBR obtained from the images is called the delta NBR (dNBR). Higher dNBR values indicate more severe damage, while negative dNBR values may indicate regrowth following a fire.
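As a minimal sketch of this computation, assuming the pre- and post-fire NIR and SWIR bands are already loaded as co-registered arrays (for Sentinel-2, B8 and B12 are commonly used), NBR and dNBR can be computed with numpy; the threshold below is purely illustrative, not the project's actual cutoff.

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR)."""
    nir = nir.astype("float32")
    swir = swir.astype("float32")
    return (nir - swir) / np.clip(nir + swir, 1e-6, None)

# Toy pre- and post-fire band arrays standing in for co-registered
# Sentinel-2 B8 (NIR) and B12 (SWIR) rasters.
pre_nir, pre_swir = np.array([[0.45, 0.50]]), np.array([[0.10, 0.12]])
post_nir, post_swir = np.array([[0.15, 0.48]]), np.array([[0.30, 0.11]])

# dNBR: the pre-fire NBR minus the post-fire NBR.
dnbr = nbr(pre_nir, pre_swir) - nbr(post_nir, post_swir)

# Higher dNBR suggests more severe burn damage; 0.27 is only an
# illustrative severity threshold.
burned_mask = dnbr > 0.27
```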
This project uses the satproc and unetseg Python packages.
This repository contains a set of Jupyter Notebooks describing the steps for building a semantic segmentation model based on the U-Net architecture for detecting fire-burned areas from optical satellite imagery. A post-processing sketch follows the list below.
- Pre-process: Image and ground truth data preprocessing and dataset generation
- Training: Model training and evaluation
- Prediction: Model prediction on new imagery
- Post-process: Post-processing of prediction results
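For instance, one common post-processing step is vectorizing the predicted burn mask into polygons. The sketch below does this with rasterio and shapely; the file path, probability threshold, and minimum-area filter are illustrative assumptions, not necessarily the exact procedure used in the notebooks.

```python
import rasterio
from rasterio import features
from shapely.geometry import shape

# Hypothetical path to a burn-probability raster written by the model.
with rasterio.open("predictions/burned_prob.tif") as src:
    prob = src.read(1)
    transform = src.transform

# Threshold probabilities into a binary mask and polygonize it.
mask = (prob > 0.5).astype("uint8")
polygons = [
    shape(geom)
    for geom, value in features.shapes(mask, transform=transform)
    if value == 1
]

# Drop tiny polygons that are likely noise (area in CRS units; illustrative).
polygons = [p for p in polygons if p.area > 1000]
```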
Dymaxion Labs leverages AI and computer vision to analyze petabytes of geospatial data and understand the physical world. These data include optical, SAR, and aerial imagery, climate data, and IoT sensors. With our grounded, data-science-based methodology, private companies and the public sector accelerate strategic, data-driven decisions about their remote targets.
- María Roberta Devesa [email protected]
- Damián Silvani [email protected]
Bug reports and pull requests are welcome on GitHub at the issues page. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the Contributor Covenant code of conduct.
This project is licensed under Apache 2.0. Refer to LICENSE.txt.