Tungstenkit
Tungstenkit is an open-source tool for building and using versatile and standardized ML model containers, Tungsten models. The key features of Tungsten models are:
- Easy: Requires only a few lines of Python code.
- Versatile: Supports multiple usages:
  - RESTful API server
  - GUI application
  - Serverless function
  - CLI application (coming soon)
  - Python function (coming soon)
- Abstracted: User-defined JSON input/output.
- Standardized: Supports advanced workflows.
- Scalable: Supports adaptive batching and clustering (coming soon).
Learn More
Take the tour
Build a Tungsten model
Building a Tungsten model is easy. All you have to do is write a simple tungsten_model.py like the one below:
from typing import List

import torch

from tungstenkit import io, model


class Input(io.BaseIO):
    prompt: str


class Output(io.BaseIO):
    image: io.Image


@model.config(
    gpu=True,
    python_packages=["torch", "torchvision"],
    batch_size=4,
    description="Text to image",
)
class Model(model.TungstenModel[Input, Output]):
    def setup(self):
        # Runs once when the container starts: load the weights into memory.
        weights = torch.load("./weights.pth")
        self.model = load_torch_model(weights)

    def predict(self, inputs: List[Input]) -> List[Output]:
        # Receives a batch of inputs (up to batch_size) and returns one output per input.
        input_tensor = preprocess(inputs)
        output_tensor = self.model(input_tensor)
        outputs = postprocess(output_tensor)
        return outputs
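The snippet above calls three helpers that tungstenkit does not provide: load_torch_model, preprocess, and postprocess. They are entirely model-specific. The following is only a rough sketch of what they might look like in the same tungsten_model.py for a text-to-image model; the stand-in network, the toy featurization, and the io.Image.from_path constructor are all assumptions to replace with your own code and to verify against the tungstenkit docs:

from typing import List

import torch
from torchvision.utils import save_image

from tungstenkit import io


class StandInGenerator(torch.nn.Module):
    # Stand-in for a real text-to-image network: returns random 64x64 RGB images.
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.rand(x.shape[0], 3, 64, 64)


def load_torch_model(weights) -> torch.nn.Module:
    net = StandInGenerator()
    # A real model would restore its parameters here, e.g. net.load_state_dict(weights).
    net.eval()
    return net


def preprocess(inputs: List[Input]) -> torch.Tensor:
    # Toy featurization: one number per prompt (its word count).
    return torch.tensor([[float(len(inp.prompt.split()))] for inp in inputs])


def postprocess(output_tensor: torch.Tensor) -> List[Output]:
    outputs = []
    for i, img in enumerate(output_tensor):
        path = f"output_{i}.png"
        save_image(img, path)
        # Assumption: io.Image offers a from_path constructor; check the tungstenkit
        # docs for the exact way to build an io.Image in your version.
        outputs.append(Output(image=io.Image.from_path(path)))
    return outputs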
Now, you can start a build process with the following command:
$ tungsten build
✅ Successfully built tungsten model: 'text-to-image:latest'
Run it as a RESTful API server
You can start a prediction with a REST API call.
Start a server:
$ docker run -p 3000:3000 --gpus all text-to-image:latest
INFO: Setting up the model
INFO: Getting inputs from the input queue
INFO: Starting the prediction service
INFO: Started server process [1]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:3000 (Press CTRL+C to quit)
Send a prediction request with a JSON payload:
$ curl -X 'POST' 'http://localhost:3000/predict' \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '[{"prompt": "a professional photograph of an astronaut riding a horse"}]'
{
"status": "success",
"outputs": [{"image": "data:image/png;base64,..."}],
"error_message": null
}
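Any HTTP client works the same way. For example, a short Python script (using the third-party requests package, installed separately with pip install requests) can send the same request and decode the base64-encoded image in the response:

import base64

import requests

resp = requests.post(
    "http://localhost:3000/predict",
    json=[{"prompt": "a professional photograph of an astronaut riding a horse"}],
)
resp.raise_for_status()
result = resp.json()

if result["status"] == "success":
    # The image arrives as a data URI: "data:image/png;base64,<payload>".
    data_uri = result["outputs"][0]["image"]
    b64_payload = data_uri.split(",", 1)[1]
    with open("astronaut.png", "wb") as f:
        f.write(base64.b64decode(b64_payload))
else:
    print("Prediction failed:", result["error_message"])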
Run it as a GUI application
If you need a more user-friendly way to make predictions, start a GUI app with the following command:
$ tungsten demo text-to-image:latest -p 8080
INFO: Uvicorn running on http://localhost:8080 (Press CTRL+C to quit)
Run it as a serverless function
We support remote, serverless executions via a Tungsten server.
Push a model:
$ tungsten push exampleuser/exampleproject -n text-to-image:latest
✅ Successfully pushed to 'https://server.tungsten-ai.com'
Now, you can start a remote prediction on the Tungsten server.
Prerequisites
- Python 3.7+
- Docker
- (Optional) nvidia-docker, for running GPU models locally. It enables GPU access inside Docker containers; you can still build and push GPU models without it.
Installation
pip install tungstenkit
File details
Details for the file tungstenkit-0.0.1a4.tar.gz.
File metadata
- Download URL: tungstenkit-0.0.1a4.tar.gz
- Upload date:
- Size: 974.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.4.2 CPython/3.10.6 Linux/5.19.0-38-generic
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5c75e9b9de91302484acecc1df2a4e15dd80a383c8140bf13ca9d8841859d38f
MD5 | 194e8a167c212881f049ca6b464c85d0
BLAKE2b-256 | 0c32c5e49f93f0282aab56397e67cf252d75ac18d7ef76e6e2cd341939454955
File details
Details for the file tungstenkit-0.0.1a4-py3-none-any.whl.
File metadata
- Download URL: tungstenkit-0.0.1a4-py3-none-any.whl
- Upload date:
- Size: 1.0 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.4.2 CPython/3.10.6 Linux/5.19.0-38-generic
File hashes
Algorithm | Hash digest
---|---
SHA256 | 402f726ad4aa05a793c662c4bbbdf743a3d0b88ab0a5dc0a0b9c0a56d487509c
MD5 | 52c6fbe542d47091c8dc3053fce442a9
BLAKE2b-256 | 2c4a26f80970a23ed698bf914137a06787ea0e1c0e8c89df72bd1b718ff8aad3