
A library for developing earth system foundation models

Project description

OlmoEarth Logo


The OlmoEarth models are a flexible, multi-modal, spatio-temporal family of foundation models for Earth Observations.

The OlmoEarth models are part of the OlmoEarth Platform, an end-to-end solution for scalable planetary intelligence that provides everything needed to go from raw data through R&D to fine-tuning and production deployment.

Installation

We recommend Python 3.12 and using uv to manage dependencies. To install dependencies with uv, run:

git clone git@github.com:allenai/olmoearth_pretrain.git
cd olmoearth_pretrain
uv sync --locked --all-groups --python 3.12
# only necessary for development
uv tool install pre-commit --with pre-commit-uv --force-reinstall

uv installs everything into a virtual environment, so to keep using plain python commands you can activate uv's venv with source .venv/bin/activate; otherwise, run commands through uv run python.
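
As a quick sanity check that the environment is set up, you can try importing the package (the import name olmoearth_pretrain is an assumption based on the distribution name; run this with uv run python or inside the activated venv):

# assumed import name, taken from the distribution name; adjust if needed
import olmoearth_pretrain
print(olmoearth_pretrain.__name__)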

OlmoEarth is built using OLMo-core. OLMo-core's published Docker images contain all core and optional dependencies.

Model Summary

Model Architecture Diagram

The OlmoEarth models are trained on three satellite modalities (Sentinel-2, Sentinel-1, and Landsat) and six derived maps (OpenStreetMap, WorldCover, USDA Cropland Data Layer, SRTM DEM, WRI Canopy Height Map, and WorldCereal).

Model Size   Weights   Encoder Params   Decoder Params
Nano         link      1.4M             800K
Tiny         link      6.2M             1.9M
Base         link      89M              30M
Large        link      308M             53M

Using OlmoEarth

InferenceQuickstart shows how to initialize the OlmoEarth model and apply it to a satellite image.
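
For orientation, here is a minimal PyTorch sketch of that workflow using a stand-in encoder. It is illustrative only: the actual OlmoEarth model classes, checkpoint loading, and input preprocessing are covered in InferenceQuickstart, and none of the names or shapes below come from the library.

import torch
import torch.nn as nn

# Stand-in encoder (a single conv plus pooling) used only to make the sketch
# runnable; with olmoearth_pretrain you would construct the OlmoEarth encoder
# and load one of the published checkpoints instead.
encoder = nn.Sequential(
    nn.Conv2d(12, 64, kernel_size=16, stride=16),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
)
encoder.eval()

# Toy input: one 256x256 tile with 12 spectral bands (illustrative shapes).
# Real inputs need the band ordering, normalization, and temporal stacking
# described in the quickstart.
tile = torch.randn(1, 12, 256, 256)

with torch.no_grad():
    embedding = encoder(tile)

print(embedding.shape)  # torch.Size([1, 64])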

We also have several more in-depth tutorials for computing OlmoEarth embeddings and fine-tuning OlmoEarth on downstream tasks:

Additionally, olmoearth_projects has several examples of active OlmoEarth deployments.
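
As a generic illustration of the fine-tuning pattern (independent of the OlmoEarth tutorials), the sketch below trains a linear probe on frozen, precomputed embeddings; the embedding dimension, class count, and data are placeholders.

import torch
import torch.nn as nn

embed_dim, num_classes = 768, 10            # placeholder sizes
embeddings = torch.randn(1024, embed_dim)   # stand-in for precomputed embeddings
labels = torch.randint(0, num_classes, (1024,))

probe = nn.Linear(embed_dim, num_classes)
optimizer = torch.optim.AdamW(probe.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# A few full-batch steps; a real setup would use a DataLoader and a held-out split.
for step in range(5):
    optimizer.zero_grad()
    loss = loss_fn(probe(embeddings), labels)
    loss.backward()
    optimizer.step()
    print(f"step {step}: loss {loss.item():.3f}")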

Data Summary

Our pretraining dataset contains 285,288 samples of 2.56 km × 2.56 km regions from around the world, although many samples contain only a subset of the timesteps and modalities.

The distribution of the samples is available below:

Training sample distribution

The dataset can be downloaded here.

Detailed instructions on how to make your own pretraining dataset are available in the dataset README.
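
To make the "subset of timesteps and modalities" point concrete, here is one hypothetical way such a sample could be represented in code; the field names, modalities shown, and shapes are illustrative, not the dataset's actual on-disk format (see the dataset README for that).

from dataclasses import dataclass
import torch

@dataclass
class PretrainSample:
    # modality name -> (timesteps, bands, height, width) tensor; a modality is
    # simply absent from the dict if it was not collected for this region
    modalities: dict[str, torch.Tensor]
    # modality name -> boolean mask marking which timesteps are actually present
    valid_timesteps: dict[str, torch.Tensor]

# Example: a sample with Sentinel-2 only, where one of four timesteps is missing.
sample = PretrainSample(
    modalities={"sentinel2": torch.randn(4, 12, 64, 64)},
    valid_timesteps={"sentinel2": torch.tensor([True, True, False, True])},
)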

Training scripts

Detailed instructions on how to pretrain your own OlmoEarth model are available in Pretraining.md.

Evaluations

Detailed instructions on how to replicate our evaluations are available here:

License

This code is licensed under the OlmoEarth Artifact License.

