
orign-py

A Python client for Orign

Installation

pip install orign

Install the Orign CLI

curl -fsSL -H "Cache-Control: no-cache" https://storage.googleapis.com/orign/releases/install.sh | bash

Log in to Orign

$ orign login

Usage

Get a list of available models

$ orign get models

Chat

Define which model we would like to use

from orign import ChatModel

model = ChatModel(model="allenai/Molmo-7B-D-0924", provider="vllm")

Open a socket connection to the model

model.connect()

Chat with the model

model.chat(msg="What's in this image?", image="https://tinyurl.com/2fz6ms35")

Stream tokens from the model

for response in model.chat(msg="What is the capital of France?", stream_tokens=True):
    print(response)
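When streaming, it is often handy to reassemble the chunks into the full reply. This is a minimal sketch assuming each streamed response is a plain text chunk; the actual Orign response objects may carry more structure, so adapt the join step accordingly.

```python
# Sketch: accumulate streamed chunks into one reply string.
# Assumes each response from the stream is a plain text chunk;
# the real Orign stream may yield richer objects.

def collect_stream(chunks):
    """Join streamed text chunks into a single string."""
    parts = []
    for chunk in chunks:
        parts.append(chunk)
    return "".join(parts)

# Simulated stream standing in for model.chat(..., stream_tokens=True)
simulated = iter(["The capital ", "of France ", "is Paris."])
print(collect_stream(simulated))
```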

Send a thread of messages to the model

model.chat(prompt=[
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "Paris"},
    {"role": "user", "content": "When was it built?"}
])

Send a batch of threads to the model

model.chat(batch=[
    [{"role": "user", "content": "What is the capital of France?"}, {"role": "assistant", "content": "Paris"}, {"role": "user", "content": "When was it built?"}],
    [{"role": "user", "content": "What is the capital of Spain?"}, {"role": "assistant", "content": "Madrid"}, {"role": "user", "content": "When was it built?"}]
])
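For larger batches, the threads can be built programmatically. The role/content dict format matches the examples above; `capital_qa` is a hypothetical helper for this sketch, not part of the Orign client.

```python
# Sketch: build the batch payload programmatically. The thread format
# (lists of role/content dicts) follows the examples above.

def capital_qa(country, capital):
    """Build one chat thread asking a follow-up about a capital city."""
    return [
        {"role": "user", "content": f"What is the capital of {country}?"},
        {"role": "assistant", "content": capital},
        {"role": "user", "content": "When was it built?"},
    ]

batch = [capital_qa("France", "Paris"), capital_qa("Spain", "Madrid")]
# model.chat(batch=batch)
```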

Use the async API

from orign import AsyncChatModel

model = AsyncChatModel(model="allenai/Molmo-7B-D-0924", provider="vllm")
await model.connect()

async for response in model.chat(
    msg="What is the capital of France?", stream_tokens=True
):
    print(response)

Embeddings

Define which model we would like to use

from orign import EmbeddingModel

model = EmbeddingModel(provider="sentence-tf", model="clip-ViT-B-32")

Embed a text

model.embed(text="What is the capital of France?")

Embed an image

model.embed(image="https://example.com/image.jpg")

Embed text and image

model.embed(text="What is the capital of France?", image="https://example.com/image.jpg")
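A common use of CLIP-style text and image embeddings is measuring how similar they are. This sketch assumes `embed` ultimately yields vectors as lists of floats; the exact response shape of the Orign client is not shown here, so treat the input format as an assumption.

```python
# Sketch: compare two embedding vectors with cosine similarity.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # identical vectors -> 1.0
```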

Use the async API

from orign import AsyncEmbeddingModel

model = AsyncEmbeddingModel(provider="sentence-tf", model="clip-ViT-B-32")
await model.connect()

await model.embed(text="What is the capital of France?")

OCR

Define which model we would like to use

from orign import OCRModel

model = OCRModel(provider="easyocr")

Detect text in an image

model.detect(image="https://example.com/image.jpg")
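OCR results usually need filtering before use. The detection shape below (dicts with "text" and "confidence" fields) is hypothetical; check the actual Orign response for the real field names before adapting this.

```python
# Sketch: keep only confidently detected strings. The input shape is
# hypothetical -- a list of {"text": ..., "confidence": ...} dicts.

def confident_text(detections, threshold=0.5):
    """Return detected strings whose confidence clears the threshold."""
    return [d["text"] for d in detections if d["confidence"] >= threshold]

sample = [
    {"text": "Hello", "confidence": 0.97},
    {"text": "w0rld", "confidence": 0.31},
]
print(confident_text(sample))  # ['Hello']
```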

Use the async API

from orign import AsyncOCRModel

model = AsyncOCRModel(provider="doctr")
await model.connect()

await model.detect(image="https://example.com/image.jpg")

Examples

See the examples directory for more usage examples.
