
Creating maps with machine learning models and earth observation data.

Project description

OpenMapFlow 🌍


Rapid map creation with machine learning and earth observation data.

Examples: Cropland, Buildings, Maize

(Animated GIF: example cropland, buildings, and maize maps)

Tutorial

Colab notebook tutorial demonstrating data exploration, model training, and inference over a small region. (video)

Prerequisites:

How it works

To create your own maps with OpenMapFlow, you need to:

  1. Generate your own OpenMapFlow project, which will allow you to:
  2. Add your own labeled data,
  3. Train a model using that labeled data, and
  4. Create a map using the trained model.

(Diagram: the OpenMapFlow pipeline)
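
For orientation, the four steps condense to the commands below, each of which is covered in detail in the following sections (placeholder names in angle brackets are illustrative):

pip install openmapflow
openmapflow generate            # 1. Generate your OpenMapFlow project
# 2. Copy raw labels into data/raw_labels/ and register them in datasets.py
openmapflow create-features     #    Combine labels with earth observation data
python train.py --model_name <YOUR MODEL NAME>   # 3. Train a model
# 4. Create a map: merge the model pull request so the deploy.yaml GitHub Action
#    deploys it to Google Cloud, then run inference from the Colab notebook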

Generating a project

Prerequisites:

Once all prerequisites are satisfied, run the following inside your GitHub repository:

pip install openmapflow
openmapflow generate

The command will prompt for project configuration such as the project name and Google Cloud project ID. Several prompts show defaults in square brackets; these are used if nothing is entered.

After all configuration is set, the following project structure will be generated:

<YOUR PROJECT NAME>
│   README.md
│   datasets.py             # Dataset definitions (how labels should be processed)
│   evaluate.py             # Template script for evaluating a model
│   openmapflow.yaml        # Project configuration file
│   train.py                # Template script for training a model
│   
└─── .dvc/                  # https://dvc.org/doc/user-guide/what-is-dvc
│       
└─── .github
│   │
│   └─── workflows          # GitHub Actions
│       │   deploy.yaml     # Automated Google Cloud deployment of trained models
│       │   test.yaml       # Automated integration tests of labeled data
│       
└─── data
    │   raw_labels/                     # User added labels
    │   processed_labels/               # Labels standardized to common format
    │   features/                       # Labels combined with satellite data
│   compressed_features.tar.gz      # Allows faster feature downloads
    │   models/                         # Models trained using features
│   raw_labels.dvc                  # Reference to a version of raw_labels/
│   processed_labels.dvc            # Reference to a version of processed_labels/
    │   compressed_features.tar.gz.dvc  # Reference to a version of features/
    │   models.dvc                      # Reference to a version of models/
    

This project contains all the code necessary for: Adding data ➞ Training a model ➞ Creating a map.

Adding data

Prerequisites:

Move raw labels into the project:

export RAW_LABEL_DIR=$(openmapflow datapath RAW_LABELS)
mkdir $RAW_LABEL_DIR/<my dataset name>
cp -r <path to my raw data files> $RAW_LABEL_DIR/<my dataset name>

Add a reference to the data using a LabeledDataset object in datasets.py, for example:

datasets = [
    LabeledDataset(
        dataset="example_dataset",
        country="Togo",
        raw_labels=(
            RawLabels(
                filename="Togo_2019.csv",
                longitude_col="longitude",
                latitude_col="latitude",
                class_prob=lambda df: df["crop"],
                start_year=2019,
            ),
        ),
    ),
    ...
]
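
If a label file stores classes as strings rather than probabilities, class_prob can derive a float from the relevant column. A minimal sketch of another entry added to the same datasets list (the file name, column names, and "crop" value are hypothetical, not part of the generated template):

LabeledDataset(
    dataset="example_string_labels",
    country="Kenya",
    raw_labels=(
        RawLabels(
            filename="Kenya_2020.csv",   # hypothetical label file
            longitude_col="lon",
            latitude_col="lat",
            # Map string classes to the binary probability expected by class_prob
            class_prob=lambda df: (df["label"] == "crop").astype(float),
            start_year=2020,
        ),
    ),
),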

Run feature creation:

earthengine authenticate    # For getting new earth observation data
gcloud auth login           # For getting cached earth observation data

openmapflow create-features # Initiates or checks progress of feature creation
openmapflow datasets        # Shows the status of datasets

dvc commit && dvc push      # Push new data to data version control

git add .
git commit -m 'Created new features'
git push

Important: When new data is pushed to the repository, a GitHub Action runs to verify data integrity. This action pulls data using dvc and therefore needs access to remote storage (your Google Drive). To allow the GitHub Action to access the data, add a new repository secret (instructions).

  • In step 5 of the instructions, name the secret: GDRIVE_CREDENTIALS_DATA
  • In step 6, enter the contents of .dvc/tmp/gdrive-user-credentials.json (in your repository)

After this, the GitHub Action should run successfully if the data is valid.
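
If you prefer the command line over the web UI, the same secret can also be set with the GitHub CLI (assuming gh is installed and authenticated for the repository):

gh secret set GDRIVE_CREDENTIALS_DATA < .dvc/tmp/gdrive-user-credentials.json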

Training a model

Prerequisites:

# Pull in latest data
dvc pull    
tar -xzf $(openmapflow datapath COMPRESSED_FEATURES) -C data

# Set model name, train model, record test metrics
export MODEL_NAME=<YOUR MODEL NAME>              
python train.py --model_name $MODEL_NAME    
python evaluate.py --model_name $MODEL_NAME 

# Push new models to data version control
dvc commit 
dvc push  

# Make a Pull Request to the repository
git checkout -b "$MODEL_NAME"
git add .
git commit -m "$MODEL_NAME"
git push --set-upstream origin "$MODEL_NAME"

Important: When a new model is pushed to the repository, a GitHub Action runs to deploy the model to Google Cloud. To allow the GitHub Action to access Google Cloud, add a new repository secret (instructions).

  • In step 5 of the instructions, name the secret: GCP_SA_KEY
  • In step 6, enter a Google Cloud Service Account key (how to create)

After the pull request is merged, the model will be deployed to Google Cloud.
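
For reference, the Service Account key from step 6 can be created with the gcloud CLI. A minimal sketch, assuming a service account named openmapflow-deploy (the name is a placeholder; the roles your deployment needs are described in the linked instructions):

gcloud iam service-accounts create openmapflow-deploy --project=<YOUR GCP PROJECT ID>
gcloud iam service-accounts keys create key.json \
    --iam-account=openmapflow-deploy@<YOUR GCP PROJECT ID>.iam.gserviceaccount.com
# Paste the contents of key.json as the value of the GCP_SA_KEY secret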

Creating a map

Prerequisites:

Map creation is currently only available through Colab. The cloud architecture must first be deployed using the deploy.yaml GitHub Action.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

openmapflow-0.0.2.tar.gz (55.5 kB)

Uploaded Source

Built Distribution

openmapflow-0.0.2-py3-none-any.whl (67.2 kB)

Uploaded Python 3

File details

Details for the file openmapflow-0.0.2.tar.gz.

File metadata

  • Download URL: openmapflow-0.0.2.tar.gz
  • Upload date:
  • Size: 55.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.13

File hashes

Hashes for openmapflow-0.0.2.tar.gz

Algorithm     Hash digest
SHA256        51108f69ef843ee8d9413a10524615a5993d7e094ac8b008a0968b63c8df5f55
MD5           94e23c34bc79c02ac065e3dbed09913b
BLAKE2b-256   64c6505608a7a5299959f8edc76a877339349ae3c06c7f8a617898681b59bbda


File details

Details for the file openmapflow-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: openmapflow-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 67.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.13

File hashes

Hashes for openmapflow-0.0.2-py3-none-any.whl

Algorithm     Hash digest
SHA256        cd3fb8042fb0496f650e853b4202d7b365486eb5b4388f2514d7040d0de0342f
MD5           4a47bbe0e033d14ce8262d0fef68ece4
BLAKE2b-256   b419ec3b13666fafffbfa5b59164ca2c4c605dc24225b912f728425f047c9496

