
Start an API server like OpenAI's

Project description

FastOAI (OpenAI-like API Server)

Motivation

This project is a simple API server that can be used to serve language models. It is designed to be simple to use and easy to deploy. It is built using FastAPI, Pydantic and openai-python.

Quick Start

The following example shows how to override the original streaming response by appending " [ollama]" to every chunk's content.

from fastapi.responses import StreamingResponse
from fastoai import FastOAI
from fastoai.requests import CompletionCreateParams
from fastoai.routers.ollama import router
from openai import AsyncOpenAI

app = FastOAI()


@app.post_chat_completions
async def create_chat_completions(params: CompletionCreateParams):
    client = AsyncOpenAI(api_key="ollama", base_url="http://localhost:11434/v1")
    response = await client.chat.completions.create(**params.model_dump())
    if params.stream:

        async def _stream():
            async for chunk in response:
                for choice in chunk.choices:
                    # Guard against chunks whose delta carries no content
                    # (e.g. the final chunk with only a finish_reason).
                    if choice.delta.content:
                        choice.delta.content += " [ollama]"
                yield f"data: {chunk.model_dump_json()}\n\n"
            yield "data: [DONE]\n\n"

        return StreamingResponse(_stream(), media_type="text/event-stream")
    return response

# Order matters: define your own endpoint functions first, then include the
# existing router so that your handlers take precedence over the originals.
app.include_router(router)

And that's it! You can now run the server using uvicorn:

uvicorn examples.ollama:app --reload
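Since the handler above streams Server-Sent Events, it can help to see the exact wire format each chunk takes. Below is a minimal stdlib-only sketch; the plain dict is a simplified stand-in for openai's ChatCompletionChunk model, and to_sse is a hypothetical helper, not part of FastOAI:

```python
import json


def to_sse(chunk: dict, marker: str = " [ollama]") -> str:
    """Append a marker to each non-empty delta and frame the chunk as SSE."""
    for choice in chunk["choices"]:
        delta = choice["delta"]
        if delta.get("content"):
            delta["content"] += marker
    # SSE frames are "data: <payload>\n\n", matching the handler above.
    return f"data: {json.dumps(chunk)}\n\n"


chunk = {"choices": [{"delta": {"content": "Hello"}}]}
print(to_sse(chunk))
# data: {"choices": [{"delta": {"content": "Hello [ollama]"}}]}
```

A stream of such frames, terminated by "data: [DONE]\n\n", is what an OpenAI-compatible client expects to consume.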

Architecture

graph TD
    client["`*openai client* (Python/Node.js/RESTful)`"]-->fastoai[FastOAI server]
    fastoai-->ollama[Ollama API]
    fastoai-->openai[OpenAI API]
    fastoai-->others[Any other API]
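The diagram shows FastOAI fanning out to multiple upstream APIs. One hypothetical way a handler could pick a backend is by model-name prefix; the routing table below is an illustrative assumption (only the Ollama base URL comes from the example above), not part of FastOAI:

```python
# Hypothetical prefix-based routing table; entries are examples only.
BACKENDS = {
    "gpt-": "https://api.openai.com/v1",
    "llama": "http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
}


def base_url_for(model: str, default: str = "http://localhost:11434/v1") -> str:
    """Return the upstream base URL whose prefix matches the model name."""
    for prefix, url in BACKENDS.items():
        if model.startswith(prefix):
            return url
    return default
```

Inside a handler, the chosen URL would be passed as base_url when constructing the AsyncOpenAI client, so one FastOAI server can front several providers.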

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

fastoai-1.0.1.tar.gz (3.4 kB)

Uploaded Source

Built Distribution

fastoai-1.0.1-py3-none-any.whl (5.3 kB)

Uploaded Python 3

File details

Details for the file fastoai-1.0.1.tar.gz.

File metadata

  • Download URL: fastoai-1.0.1.tar.gz
  • Upload date:
  • Size: 3.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.12.0

File hashes

Hashes for fastoai-1.0.1.tar.gz

  • SHA256: a676ff5e4403e46575056d6af97ff13915ed6560caff1b73edf71554ee7d4fc5
  • MD5: 0efebcd8912c10717705f8b39230dd12
  • BLAKE2b-256: d6f4c57251c135c5a11a5b7f7546d04ea08dc76c84d030f30760f860be6ea9df


File details

Details for the file fastoai-1.0.1-py3-none-any.whl.

File metadata

  • Download URL: fastoai-1.0.1-py3-none-any.whl
  • Upload date:
  • Size: 5.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.12.0

File hashes

Hashes for fastoai-1.0.1-py3-none-any.whl

  • SHA256: 4677803b8959fd6f9c06a5f682cb799514f1f0361b08eb6e7fb4def0bd8512bf
  • MD5: 1e815c92ce1dc562af7c0bda123ede2b
  • BLAKE2b-256: 5c93a7adcf27e889c9138c720ff9e483125d957592bd421312a6cec46c58c127

