Pretrained remote sensing models for the rest of us.
[Read The Docs] - [Quick Start] - [Website]
What is Moonshine?
Moonshine is a Python package that makes it easier to train models on remote sensing data like satellite imagery. Using Moonshine's pretrained models, you can reduce the amount of labeled data required and reduce the training compute needed.
For more info and examples, read the docs.
Why use Moonshine?
- Pretrained on multispectral data: Many existing packages are pretrained on ImageNet or similar RGB images. With Moonshine you can unlock the full power of satellites that capture many channels of multispectral data.
- Pretrained on remote sensing data: Pretraining in the domain of your data matters, and most off-the-shelf pretrained models are fit to natural images such as ImageNet.
- Focus on usability: While some academic remote sensing pretrained models are available, they are often difficult to use and lack support. Moonshine is designed to be easy to use and offers community support via GitHub and Slack.
Installation
PyPI version:
pip install moonshine
Latest version from source:
pip install git+https://github.com/moonshinelabs-ai/moonshine
Quick Start
The Moonshine Python package offers a light wrapper around our pretrained PyTorch models. You can load the pretrained weights into your own model architecture and fine-tune it on your own data:
import torch.nn as nn

from moonshine.models.unet import UNet


class SegmentationModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Create a blank model based on the available architectures.
        self.backbone = UNet(name="unet50_fmow_rgb")

        # If we are using pretrained weights, load them here. In
        # general, using the decoder weights isn't preferred unless
        # your downstream task is also a reconstruction task. We suggest
        # trying only the encoder first.
        self.backbone.load_weights(
            encoder_weights="unet50_fmow_rgb", decoder_weights=None
        )

        # Run a per-pixel classifier on top of the output vectors.
        self.classifier = nn.Conv2d(32, 2, (1, 1))

    def forward(self, x):
        x = self.backbone(x)
        return self.classifier(x)
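A common way to fine-tune a model like this is to freeze the pretrained backbone and train only the new classifier head. The sketch below illustrates one training step of that pattern using plain PyTorch; the single Conv2d "backbone" is a stand-in for the Moonshine UNet (it only mimics the 32-channel per-pixel output), and all shapes and hyperparameters are illustrative assumptions, not values from the package.

```python
import torch
import torch.nn as nn

# Stand-in for a pretrained backbone: produces 32-channel per-pixel
# features, matching what the classifier head above expects.
backbone = nn.Conv2d(3, 32, kernel_size=3, padding=1)
classifier = nn.Conv2d(32, 2, (1, 1))

# Freeze the "pretrained" weights; only the classifier head trains.
for p in backbone.parameters():
    p.requires_grad = False
optimizer = torch.optim.Adam(classifier.parameters(), lr=1e-3)

# One dummy training step on random data.
x = torch.randn(4, 3, 64, 64)               # batch of RGB tiles
target = torch.randint(0, 2, (4, 64, 64))   # per-pixel class labels
logits = classifier(backbone(x))            # shape (4, 2, 64, 64)
loss = nn.functional.cross_entropy(logits, target)
loss.backward()
optimizer.step()
```

Once the head has converged, you can unfreeze the backbone and continue training at a lower learning rate if your dataset is large enough to support it.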
You can also configure data preprocessing to ensure your data is formatted the same way the model was pretrained:
from moonshine.preprocessing import get_preprocessing_fn
preprocess_fn = get_preprocessing_fn(model="unet", dataset="fmow_rgb")
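Conceptually, a preprocessing function of this kind normalizes each channel to the statistics of the pretraining dataset. A minimal NumPy sketch follows; the per-channel mean and std values here are made up for illustration, as the real statistics ship with each Moonshine model.

```python
import numpy as np

# Hypothetical per-channel statistics of the pretraining dataset.
MEAN = np.array([0.40, 0.45, 0.39], dtype=np.float32)
STD = np.array([0.20, 0.19, 0.21], dtype=np.float32)

def preprocess(image: np.ndarray) -> np.ndarray:
    """Normalize an (H, W, C) float image channel-wise."""
    return (image - MEAN) / STD

tile = np.random.rand(64, 64, 3).astype(np.float32)
out = preprocess(tile)
```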
Citing
@misc{Harada:2023,
Author = {Nate Harada},
Title = {Moonshine},
Year = {2023},
Publisher = {GitHub},
Journal = {GitHub repository},
Howpublished = {\url{https://github.com/moonshinelabs-ai/moonshine}}
}
License
This project is released under the MIT License.