Minimal package for loading and initializing OlmoEarth models

Project description

OlmoEarth Pretrain Minimal

A minimal package for loading and initializing OlmoEarth v1 models. This package contains only the code necessary to load models from Hugging Face or initialize them with random weights, without training or evaluation dependencies.

Installation

Install uv if you haven't already:

curl -LsSf https://astral.sh/uv/install.sh | sh

To install dependencies:

git clone git@github.com:allenai/olmoearth_pretrain_minimal.git
cd olmoearth_pretrain_minimal
# Install with CPU-only PyTorch (works on all platforms including Linux)
uv sync --locked --python 3.12 --extra torch-cpu
# Or install with CUDA 12.8 PyTorch
uv sync --locked --python 3.12 --extra torch-cu128

uv installs everything into a virtual environment, so to keep using plain python commands you can activate uv's venv with source .venv/bin/activate. Otherwise, prefix commands with uv run (for example, uv run python).

Note: You must specify either --extra torch-cpu or --extra torch-cu128 to install PyTorch. This allows you to explicitly choose the CPU or GPU version regardless of your platform, which is especially useful for CI environments that need CPU-only builds on Linux.

Model Summary

Model Architecture Diagram

The OlmoEarth models are trained on three satellite modalities (Sentinel-2, Sentinel-1, and Landsat) and six derived maps (OpenStreetMap, WorldCover, USDA Cropland Data Layer, SRTM DEM, WRI Canopy Height Map, and WorldCereal).

Model   Weights   Encoder Params   Decoder Params
Nano    link      1.4M             800K
Tiny    link      6.2M             1.9M
Base    link      89M              30M
Large   link      308M             53M

Usage

Loading Models from Hugging Face

The recommended way to load models is using the model loader, which downloads the model configuration from Hugging Face:

from olmoearth_pretrain_minimal import ModelID, load_model_from_id

# Load a model from Hugging Face with pre-trained weights
# - ModelID.OLMOEARTH_V1_NANO - 1.4M encoder params, 800K decoder params
# - ModelID.OLMOEARTH_V1_TINY - 6.2M encoder params, 1.9M decoder params
# - ModelID.OLMOEARTH_V1_BASE - 89M encoder params, 30M decoder params
# - ModelID.OLMOEARTH_V1_LARGE - 308M encoder params, 53M decoder params
model = load_model_from_id(ModelID.OLMOEARTH_V1_BASE, load_weights=True)

# Load with randomly initialized weights
model_random_init = load_model_from_id(ModelID.OLMOEARTH_V1_NANO, load_weights=False)
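If the model size comes from a config string at runtime, the ModelID enum can be looked up by member name. A minimal sketch using a stand-in enum (the member names mirror those listed above, but the real enum's values may differ; the helper is hypothetical):

```python
from enum import Enum


class ModelID(Enum):
    """Stand-in mirroring the member names listed above."""
    OLMOEARTH_V1_NANO = "nano"
    OLMOEARTH_V1_TINY = "tiny"
    OLMOEARTH_V1_BASE = "base"
    OLMOEARTH_V1_LARGE = "large"


def model_id_from_size(size: str) -> ModelID:
    """Map a size string like "base" to the corresponding enum member."""
    return ModelID[f"OLMOEARTH_V1_{size.upper()}"]


print(model_id_from_size("base"))  # ModelID.OLMOEARTH_V1_BASE
```

An unrecognized size raises a KeyError immediately, which is usually preferable to failing later during a model download.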

Direct Model Initialization (Custom Configuration)

For custom configurations (e.g., custom modalities), you can directly instantiate the model class:

from olmoearth_pretrain_minimal import OlmoEarthPretrain_v1

# Initialize with custom modalities and settings
model = OlmoEarthPretrain_v1(
    model_size="nano",
    supported_modality_names=["sentinel2_l2a", "sentinel1", "landsat"],
    max_patch_size=8,
    max_sequence_length=12,
    drop_path=0.1,
)
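The names passed to supported_modality_names must match identifiers the package recognizes. A small guard like the following can catch typos before model construction; this is a hypothetical helper, and the set below includes only the three satellite identifiers shown in the example above (the derived-map identifiers are not listed in this README):

```python
# Satellite modality identifiers taken from the example above; the
# derived-map identifiers would need to be added from the package docs.
KNOWN_MODALITIES = {"sentinel2_l2a", "sentinel1", "landsat"}


def check_modalities(names):
    """Raise early if any requested modality name is not recognized."""
    unknown = [n for n in names if n not in KNOWN_MODALITIES]
    if unknown:
        raise ValueError(f"Unknown modalities: {unknown}")
    return names


check_modalities(["sentinel2_l2a", "sentinel1"])  # passes silently
```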

Manual Weight Loading

If you have pre-trained weights in a separate file, you can load them manually:

from olmoearth_pretrain_minimal import ModelID, load_model_from_id
import torch

# Load model without weights
model = load_model_from_id(ModelID.OLMOEARTH_V1_NANO, load_weights=False)

# Load pre-trained weights from a separate file
# (map_location="cpu" avoids requiring a GPU just to deserialize)
weights = torch.load("path/to/weights.pth", map_location="cpu")
model.load_state_dict(weights)
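Checkpoint keys do not always match the model's state dict exactly; a common case is a leading prefix added by wrappers such as DataParallel. A minimal sketch of normalizing keys before load_state_dict, using plain dicts (the prefix and helper name are assumptions, not part of this package's API):

```python
def normalize_state_dict(state_dict, prefix="module."):
    """Strip a leading prefix from checkpoint keys, if present."""
    return {
        (k[len(prefix):] if k.startswith(prefix) else k): v
        for k, v in state_dict.items()
    }


ckpt = {"module.encoder.weight": 1, "head.bias": 2}
print(normalize_state_dict(ckpt))  # {'encoder.weight': 1, 'head.bias': 2}
```

The normalized dict can then be passed to model.load_state_dict as usual.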

Note

For the full package with training and evaluation capabilities, see the main olmoearth_pretrain package.

