
GULL-API

A REST API for running Large Language Models


GULL-API is a web application backend that can be used to run Large Language Models (LLMs). The interface between the front-end and the back-end is a JSON REST API.

Features

  • Exposes a /api route that returns a JSON description of the LLM's parameters.
  • Provides a /llm route that accepts POST requests with JSON payloads to run the LLM with the specified parameters.

Installation

Using Docker

  1. Build the Docker image:

    docker build -t gull-api .
    
  2. Run the Docker container:

    docker run -p 8000:8000 gull-api
    

The API will be available at http://localhost:8000.

Docker Test Mode

To build and run the Docker container in test mode, use the following commands:

docker build -t gull-api .
docker run -v $(pwd)/data:/app/data -v $(pwd)/example_cli.json:/app/cli.json -p 8000:8000 gull-api

In test mode, an included script echo_args.sh is used instead of a real LLM. This script echoes the arguments it receives, which can be helpful for local testing.

Local Installation

  1. Clone the repository:

    git clone https://github.com/yourusername/gull-api.git
    
  2. Change to the project directory:

    cd gull-api
    
  3. Install the dependencies:

    pip install poetry
    poetry install
    
  4. Configure Environment Variables (Optional):

    GULL-API can be configured using environment variables. To do this, create a file named .env in the root of the project directory, and set the environment variables there. For example:

    DB_URI=sqlite:///mydatabase.db
    CLI_JSON_PATH=/path/to/cli.json
    

    GULL-API uses the python-dotenv package to load these environment variables when the application starts.

  5. Run the application:

    uvicorn gull_api.main:app --host 0.0.0.0 --port 8000
    

The API will be available at http://localhost:8000.
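The .env loading described in step 4 can be sketched in plain Python. The minimal parser below only illustrates what python-dotenv's load_dotenv does (read KEY=VALUE lines into the process environment); it is not the library's actual implementation:

```python
import os

def load_env_file(path):
    """Minimal illustration of python-dotenv's load_dotenv:
    read KEY=VALUE lines and place them into os.environ
    (values already set in the environment take precedence)."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):  # skip blanks and comments
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

# Write a sample .env like the one shown above, then load it
with open(".env", "w") as f:
    f.write("DB_URI=sqlite:///mydatabase.db\nCLI_JSON_PATH=/path/to/cli.json\n")
load_env_file(".env")
print(os.environ["DB_URI"])  # sqlite:///mydatabase.db
```

In the real application this work is done for you by python-dotenv when the app starts; no manual loading is needed.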

Usage

/api Route

Send a GET request to the /api route to retrieve a JSON description of the LLM's parameters:

GET http://localhost:8000/api

/llm Route

Send a POST request to the /llm route with a JSON payload containing the LLM parameters:

POST http://localhost:8000/llm
Content-Type: application/json

{
  "Prompt": "Once upon a time",
  "Top P": 0.5
}

Example Requests

curl -X POST "http://localhost:8000/llm" -H "accept: application/json" -H "Content-Type: application/json" -d '{"Instruct mode": false, "Maximum length": 256, "Prompt": "Hello, world", "Stop sequences": "Goodbye, world", "Temperature": 0.7, "Top P": 0.95}'
curl -X GET "http://localhost:8000/api" -H "accept: application/json" | python -mjson.tool
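The same requests can be made from Python with the standard library's urllib; this is only a sketch, assuming the server is running locally on port 8000 as in the examples above:

```python
import json
import urllib.request

BASE = "http://localhost:8000"

# Build the POST request for the /llm route
payload = {"Prompt": "Once upon a time", "Top P": 0.5}
req = urllib.request.Request(
    f"{BASE}/llm",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json", "accept": "application/json"},
    method="POST",
)

# Uncomment once the server is up:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
#
# GET /api works the same way:
# with urllib.request.urlopen(f"{BASE}/api") as resp:
#     print(json.loads(resp.read()))
```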

Example CLI JSON

An example CLI JSON file, example_cli.json, is provided in the repository; it shows the expected structure for defining the LLM's command-line arguments.

License

See LICENSE

Download files

Download the file for your platform.

Source Distribution

gull_api-0.0.15.tar.gz (6.9 kB)


Built Distribution


gull_api-0.0.15-py3-none-any.whl (8.3 kB)


File details

Details for the file gull_api-0.0.15.tar.gz.

File metadata

  • Download URL: gull_api-0.0.15.tar.gz
  • Size: 6.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.1 CPython/3.11.4

File hashes

Hashes for gull_api-0.0.15.tar.gz

  • SHA256: 885a7e2a6e94843c1160edf1cd11097f61bec8039c25c56887ea20a0757d7d42
  • MD5: 4b9192074940b5f9a17abd0e40997c1c
  • BLAKE2b-256: ce802460f474eb0f1573a592ad3bb2e74db592dad97c0b0ecd16710084301b5c


File details

Details for the file gull_api-0.0.15-py3-none-any.whl.

File metadata

  • Download URL: gull_api-0.0.15-py3-none-any.whl
  • Size: 8.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.1 CPython/3.11.4

File hashes

Hashes for gull_api-0.0.15-py3-none-any.whl

  • SHA256: cf1676278f2eee525736e40bf4a85aae2fc0b5509f2e80bf75c91600508db560
  • MD5: 71b9f91095f1039b7ca2b948f56cb2fc
  • BLAKE2b-256: b4a13bbe2325265f94de23d07e2ffcd1614ec14b04edec87ad561d5ca90a431e

