Start an API server like OpenAI's

Project description

FastOAI (OpenAI-like API Server)

Motivation

FastOAI is a lightweight API server for serving language models behind an OpenAI-compatible interface. It is designed to be easy to use and easy to deploy, and is built on FastAPI, Pydantic, and openai-python.

Quick Start

The following example shows how to override the original streaming response by appending " [ollama]" to every chunk's content.

from fastapi.responses import StreamingResponse
from fastoai import FastOAI
from fastoai.requests import CompletionCreateParams
from fastoai.routers import router
from openai import AsyncOpenAI

app = FastOAI()


@app.post("/chat/completions")
async def create_chat_completions(params: CompletionCreateParams):
    client = AsyncOpenAI(api_key="ollama", base_url="http://localhost:11434/v1")
    response = await client.chat.completions.create(**params.model_dump())
    if params.stream:

        async def _stream():
            async for chunk in response:
                for choice in chunk.choices:
                    # delta.content can be None (e.g. role-only or final chunks)
                    if choice.delta.content is not None:
                        choice.delta.content += " [ollama]"
                yield f"data: {chunk.model_dump_json()}\n\n"
            yield "data: [DONE]\n\n"

        return StreamingResponse(_stream(), media_type="text/event-stream")
    return response

# Order matters: define your own endpoint functions first, then include the
# built-in router, so that your routes take precedence over the defaults.
app.include_router(router)

And that's it! You can now run the server using uvicorn:

uvicorn examples.main:app --reload
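For reference, the handler above frames each streamed chunk as a server-sent event (`data: <json>` followed by a blank line, terminated by `data: [DONE]`). A minimal stdlib-only sketch of parsing such a stream on the client side (the sample payload below is illustrative, not a real API response):

```python
import json


def parse_sse(stream: str) -> list[dict]:
    """Extract JSON payloads from an SSE stream, stopping at [DONE]."""
    events = []
    for block in stream.split("\n\n"):
        if not block.startswith("data: "):
            continue
        payload = block[len("data: "):]
        if payload == "[DONE]":
            break
        events.append(json.loads(payload))
    return events


# Illustrative stream mimicking the server's output above
sample = (
    'data: {"choices": [{"delta": {"content": "Hello [ollama]"}}]}\n\n'
    "data: [DONE]\n\n"
)
events = parse_sse(sample)
print(events[0]["choices"][0]["delta"]["content"])  # Hello [ollama]
```

In practice you would let the openai client (or any OpenAI-compatible client pointed at your server) do this parsing for you.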

Architecture

graph TD
    client["`*openai client* (Python/Node.js/RESTful)`"]-->fastoai[FastOAI server]
    fastoai-->ChatCompletion[OpenAI Chat Completion API]
    fastoai-->openai[OpenAI API]
    fastoai-->others[Any other API]
