Pulls data from Google Earth Engine, syncs it to Google Drive, and downloads files.

Todo List

  • Catch timeout when downloading data.

  • Tie into Jesse's Google Drive; this might be awkward because it's a shared folder.

Project Summary

Earth-Engine-Wildfire-Data is a Python command-line utility and library for extracting and transforming wildfire-related geospatial data from Google Earth Engine. It supports:

  • Access to MODIS, VIIRS, GRIDMET, and other remote sensing datasets.

  • Filtering wildfire perimeters by date, size, and region.

  • Combining daily and final fire perimeters.

  • Generating YAML config files for use in simulation or prediction tools.

  • Command-line configurability with persistent YAML-based settings.

This tool is intended for researchers, data scientists, and modelers working with wildfire data pipelines, particularly those integrating Earth Engine datasets into geospatial ML workflows.

Prerequisite

Requires Python 3.10 or later.

As of mid-2023, Google Earth Engine access must be linked to a Google Cloud project, even for free/non-commercial use, so sign up for a non-commercial Earth Engine account.

Google API Instructions

Create a service account and grant it these roles:

  • Owner
  • Service Usage Admin
  • Service Usage Consumer
  • Storage Admin
  • Storage Object Creator

Grant your main account these roles:

  • Owner
  • Service Usage Admin
  • Service Usage Consumer

Next, we need to create an OAuth client for Google Drive access. In the top-right hamburger menu select:

  • APIs & Services/Credentials/+Create credentials/OAuth client ID
    • First configure the OAuth consent screen. Select Desktop App and give it a name.
    • Keep track of the Client ID and Client secret; we will need those later.
    • Click Download JSON from this screen; these are your credentials.

Now we need to enable the APIs. In the top-right hamburger menu select:

  • APIs & Services. From this menu select Google Drive API and click Enable API. Do the same for the Google Earth Engine API.

Finally, add yourself as a test user. In Google Cloud, navigate to APIs & Services/OAuth consent screen/Audience, scroll down, and under Test users click + Add users. Select your main account.
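
Before moving on, it can help to sanity-check the credentials file you downloaded. Desktop App OAuth clients store their client ID and secret under an "installed" key in the JSON; the sketch below (a hypothetical helper, not part of ee-wildfire) just checks that layout:

```python
import json

def check_oauth_credentials(path: str) -> bool:
    """Return True if the file looks like a Desktop App OAuth client secrets file."""
    with open(path) as fh:
        data = json.load(fh)
    # Desktop App clients keep their ID/secret under the "installed" key.
    client = data.get("installed", {})
    return bool(client.get("client_id")) and bool(client.get("client_secret"))
```

Point this at the JSON you downloaded (the same path you will later use for the credentials setting) before attempting an export.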

Install Instructions

For the stable build:

pip install ee-wildfire

For the experimental build:

git clone git@github.com:KylesCorner/Earth-Engine-Wildfire-Data.git
cd Earth-Engine-Wildfire-Data
pip install -e .

Configuration

There are two ways to configure this tool: use command-line arguments to alter the internal YAML file, or supply your own YAML. Here's a template:

year: '2020'
min_size: 1000000
geojson: /home/kyle/NRML/data/perims/
output: /home/kyle/NRML/data/tiff/
drive_dir: EarthEngine_WildfireSpreadTS_2020
credentials: /home/kyle/NRML/OAuth/credentials.json
project_id: project_id_here
download: false
export_data: false
show_config: true
force_new_geojson: false
sync_year: false
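
Command-line flags override the values stored in this file. The sketch below shows one plausible way such a layered configuration can be resolved (defaults, then file, then CLI); the key names come from the template above, but the merge logic is an illustrative assumption, not the tool's actual implementation, and the file contents appear as an already-parsed dict to keep the example dependency-free:

```python
# Defaults mirror a subset of the keys in the YAML template above.
DEFAULTS = {
    "year": "2020",
    "min_size": 1000000,
    "download": False,
    "export_data": False,
    "show_config": True,
}

def resolve_config(file_config: dict, cli_overrides: dict) -> dict:
    """Later layers win: defaults < YAML file < CLI flags."""
    merged = dict(DEFAULTS)
    merged.update(file_config)
    # Only apply CLI values the user actually set (None means "not given").
    merged.update({k: v for k, v in cli_overrides.items() if v is not None})
    return merged
```

For example, resolve_config({"year": "2019"}, {"download": True, "min_size": None}) keeps the default min_size, takes year from the file, and takes download from the command line.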

Command-Line Interface (CLI)

This tool can be run from the command line to generate fire configuration YAML files from GeoJSON data. Configuration can be passed directly via flags or through a YAML file using --config.

| Argument | Type | Description |
| --- | --- | --- |
| --config | str | Path to a YAML configuration file. Defaults to ./config_options.yml. |
| --year | str | The year of the fire events to process. |
| --min-size | float | Minimum fire size (in square meters) to include. |
| --output | str | Local directory to store generated TIFF files. |
| --drive-dir | str | Google Drive directory where TIFFs are uploaded to or downloaded from. |
| --credentials | str | Path to the Google OAuth2 credentials JSON file. Required for GEE export. |
| --project-id | str | Google Cloud project ID associated with your Earth Engine access. |
| --geojson | str | Path to the input or output GeoJSON file containing fire perimeter data. |
| --download | flag | If set, the tool will download TIFF files from Google Drive. |
| --export-data | flag | If set, data will be exported to Google Drive using Earth Engine. |
| --show-config | flag | Print the currently loaded configuration and exit. Useful for debugging. |
| --force-new-geojson | flag | Force the script to generate a new GeoJSON file even if one exists. |
| --sync-year | flag | Sync all config and output files to the year in the config. |
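
The flags above follow the usual argparse convention of mapping dashes to underscored config keys (e.g. --min-size to min_size). A hedged sketch of how a subset of them could be declared; this mirrors the documented flags for illustration and is not the tool's actual parser:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Declare a few of the documented flags; dashes map to underscored names."""
    p = argparse.ArgumentParser(prog="ee-wildfire")
    p.add_argument("--config", default="./config_options.yml",
                   help="Path to a YAML configuration file.")
    p.add_argument("--year", help="Year of the fire events to process.")
    p.add_argument("--min-size", type=float, dest="min_size",
                   help="Minimum fire size in square meters.")
    p.add_argument("--download", action="store_true",
                   help="Download TIFF files from Google Drive.")
    p.add_argument("--export-data", action="store_true", dest="export_data",
                   help="Export data to Google Drive using Earth Engine.")
    return p
```

Parsing ["--year", "2020", "--download"] then yields a namespace with year="2020", download=True, and the default config path.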

Basic Usage

ee-wildfire --config ./config_options.yml --year 2020 --geojson data/perims/combined_fires_2020.geojson

Acknowledgements

This project builds on work from the WildfireSpreadTSCreateDataset. Credit to the original authors for providing data, methods, and insights.
