
oshepherd

The Oshepherd guiding the Ollama(s) inference orchestration.


A centralized FastAPI service that uses Celery and Redis to orchestrate multiple Ollama servers as workers.

Install

pip install oshepherd

Usage

  1. Setup Redis:

    Celery uses Redis as its message broker and result backend. You'll need a Redis instance, which you can provision for free at redislabs.com (see the connectivity sketch after the usage steps to verify your credentials).

  2. Setup FastAPI Server:

    # define configuration env file
    # use credentials for redis as broker and backend
    cp .api.env.template .api.env
    
    # start api
    oshepherd start-api --env-file .api.env
    
  3. Setup Celery/Ollama Worker(s):

    # install ollama https://ollama.com/download
    # optionally pull the model
    ollama pull mistral
    
    # define configuration env file
    # use credentials for redis as broker and backend
    cp .worker.env.template .worker.env
    
    # start worker
    oshepherd start-worker --env-file .worker.env
    
  4. Now you're ready to execute Ollama completions remotely. Point your Ollama client at the oshepherd API server by setting the host, and it will return your requested completions from any of the workers:

    • Python (ollama-python):

    import ollama
    
    client = ollama.Client(host="http://127.0.0.1:5001")
    ollama_response = client.generate(model="mistral", prompt="Why is the sky blue?")
    
    • JavaScript (ollama-js):

    import { Ollama } from "ollama/browser";
    
    const ollama = new Ollama({ host: "http://127.0.0.1:5001" });
    const ollamaResponse = await ollama.generate({
        model: "mistral",
        prompt: "Why is the sky blue?",
    });
    
    • Raw HTTP request:
    curl -X POST -H "Content-Type: application/json" -L http://127.0.0.1:5001/api/generate/ -d '{
        "model": "mistral",
        "prompt":"Why is the sky blue?"
    }'
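
Before starting the API server and workers, it can help to confirm that the Redis credentials in your env files actually work. Below is a minimal connectivity sketch using redis-py (an assumption on this sketch's part: you have run pip install redis; the host, port, and password are placeholders for your own instance):

    import redis

    # placeholders: substitute the credentials of your Redis instance
    r = redis.Redis(
        host="your-redis-host.example.com",
        port=6379,
        password="your-password",
    )

    # True means Celery will be able to reach its broker/result backend
    print(r.ping())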
    

Disclaimers 🚨

This package is in alpha; its architecture and API might change in the near future. It is currently being tested in a controlled environment by real users, but it hasn't been audited or tested thoroughly. Use it at your own risk.

As this is an alpha version, support and responses might be limited. We'll do our best to address questions and issues as quickly as possible.

API server parity

  • Generate a completion: POST /api/generate
  • Generate a chat completion: POST /api/chat
  • Generate Embeddings: POST /api/embeddings
  • List Local Models: GET /api/tags
  • Version: GET /api/version
  • Show Model Information: POST /api/show (pending)
  • List Running Models: GET /api/ps (pending)

The oshepherd API server has been designed to maintain compatibility with the endpoints defined by Ollama, ensuring that any official client (e.g., ollama-python, ollama-js) can use this server as host and receive the expected responses. For more details on the full API specifications, refer to the official Ollama API documentation.
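
For example, the chat and embeddings endpoints can be exercised through ollama-python exactly as they would be against a local Ollama server. A short sketch, assuming the API server from the usage steps is running at 127.0.0.1:5001 and a worker has pulled mistral:

import ollama

client = ollama.Client(host="http://127.0.0.1:5001")

# POST /api/chat, routed through oshepherd to a worker
chat_response = client.chat(
    model="mistral",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)

# POST /api/embeddings
emb_response = client.embeddings(model="mistral", prompt="Why is the sky blue?")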

Contribution guidelines

We welcome contributions! If you find a bug or have suggestions for improvements, please open an issue or submit a pull request against the development branch. Before creating a new issue or pull request, take a moment to search the existing ones to avoid duplicates.

Conda Support

To run and build locally, you can use conda:

conda create -n oshepherd python=3.8
conda activate oshepherd
pip install -r requirements.txt

# install oshepherd
pip install -e .

Tests

Follow the usage instructions above to start the API server and a Celery worker using a local Ollama, then run the tests:

pytest -s tests/
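
As a rough illustration of what the suite exercises, here is a minimal smoke test in the same spirit (a hypothetical example, not part of the shipped suite; it assumes the server at 127.0.0.1:5001 and a worker with mistral pulled):

# tests/test_smoke.py (hypothetical example)
import ollama


def test_generate_smoke():
    client = ollama.Client(host="http://127.0.0.1:5001")
    response = client.generate(model="mistral", prompt="Why is the sky blue?")
    # the generate endpoint returns the completion under "response"
    assert response["response"]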

Author

This is a project developed and maintained by mnemonica.ai.

License

MIT
