🚀 Kiro OpenAI Gateway

OpenAI-compatible proxy gateway for Kiro IDE API (AWS CodeWhisperer)

License: AGPL v3 • Python 3.10+ • FastAPI

Use Claude models through any tool that supports the OpenAI API

Features • Quick Start • Configuration • API Reference • License


✨ Features

| Feature | Description |
|---|---|
| 🔌 OpenAI-compatible API | Works with any OpenAI client out of the box |
| 💬 Full message history | Passes complete conversation context |
| 🛠️ Tool Calling | Supports function calling in OpenAI format |
| 📡 Streaming | Full SSE streaming support |
| 🔄 Retry Logic | Automatic retries on errors (403, 429, 5xx) |
| 📋 Extended model list | Including versioned models |
| 🔐 Smart token management | Automatic refresh before expiration |
| 🧩 Modular architecture | Easy to extend with new providers |
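The retry behavior above can be sketched as an exponential-backoff loop. This is a hypothetical illustration (the function and parameter names are invented; the gateway's actual logic lives in `kiro_gateway/http_client.py`):

```python
import time

# Status codes the gateway retries on, per the feature table
RETRYABLE = {403, 429, 500, 502, 503, 504}

def request_with_retries(send, max_attempts=3, base_delay=0.5):
    """Call send() until it returns a non-retryable status or attempts run out.

    `send` is any callable returning an object with a `.status_code`.
    The delay doubles after each failed attempt (exponential backoff).
    """
    for attempt in range(max_attempts):
        response = send()
        if response.status_code not in RETRYABLE:
            return response
        if attempt < max_attempts - 1:
            time.sleep(base_delay * 2 ** attempt)
    return response  # last response, even if still an error
```

On repeated failure the last error response is returned rather than raised, so the caller can inspect the status code.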

🚀 Quick Start

Prerequisites

  • Python 3.10+
  • Kiro IDE with a logged-in account

Installation

# Clone the repository
git clone https://github.com/Jwadow/kiro-openai-gateway.git
cd kiro-openai-gateway

# Install dependencies
pip install -r requirements.txt

# Configure (see Configuration section)
cp .env.example .env
# Edit .env with your credentials

# Start the server
python main.py

The server will be available at http://localhost:8000


⚙️ Configuration

Option 1: JSON Credentials File

Specify the path to the credentials file:

KIRO_CREDS_FILE="~/.aws/sso/cache/kiro-auth-token.json"

# Password to protect YOUR proxy server (make up any secure string)
# You'll use this as api_key when connecting to your gateway
PROXY_API_KEY="my-super-secret-password-123"
📄 JSON file format:
{
  "accessToken": "eyJ...",
  "refreshToken": "eyJ...",
  "expiresAt": "2025-01-12T23:00:00.000Z",
  "profileArn": "arn:aws:codewhisperer:us-east-1:...",
  "region": "us-east-1"
}
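A client can use the expiresAt field of this file to anticipate a refresh. Here is a minimal sketch (token_expired and the margin parameter are hypothetical names, not part of the gateway's API):

```python
from datetime import datetime, timezone

def token_expired(creds: dict, margin_seconds: int = 60) -> bool:
    """True if the accessToken expires within margin_seconds.

    expiresAt is an ISO-8601 UTC timestamp with a trailing "Z",
    as in the JSON format shown above.
    """
    expires_at = datetime.fromisoformat(creds["expiresAt"].replace("Z", "+00:00"))
    remaining = (expires_at - datetime.now(timezone.utc)).total_seconds()
    return remaining < margin_seconds
```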

Option 2: Environment Variables (.env file)

Create a .env file in the project root:

# Required
REFRESH_TOKEN="your_kiro_refresh_token"

# Password to protect YOUR proxy server (make up any secure string)
PROXY_API_KEY="my-super-secret-password-123"

# Optional
PROFILE_ARN="arn:aws:codewhisperer:us-east-1:..."
KIRO_REGION="us-east-1"
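A settings loader matching the variables above might look like this (load_settings is an invented name; the gateway's real configuration lives in kiro_gateway/config.py and may differ):

```python
import os

def load_settings() -> dict:
    """Read the gateway's environment variables, enforcing the required ones."""
    refresh_token = os.environ.get("REFRESH_TOKEN")
    proxy_api_key = os.environ.get("PROXY_API_KEY")
    if not refresh_token or not proxy_api_key:
        raise RuntimeError("REFRESH_TOKEN and PROXY_API_KEY must be set")
    return {
        "refresh_token": refresh_token,
        "proxy_api_key": proxy_api_key,
        "profile_arn": os.environ.get("PROFILE_ARN"),          # optional
        "region": os.environ.get("KIRO_REGION", "us-east-1"),  # optional, with default
    }
```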

Getting the Refresh Token

The refresh token can be obtained by intercepting Kiro IDE traffic. Look for requests to:

  • prod.us-east-1.auth.desktop.kiro.dev/refreshToken

📡 API Reference

Endpoints

| Endpoint | Method | Description |
|---|---|---|
| `/` | GET | Health check |
| `/health` | GET | Detailed health check |
| `/v1/models` | GET | List available models |
| `/v1/chat/completions` | POST | Chat completions |

Available Models

| Model | Description |
|---|---|
| claude-opus-4-5 | Top-tier model |
| claude-opus-4-5-20251101 | Top-tier model (versioned) |
| claude-sonnet-4-5 | Enhanced model |
| claude-sonnet-4-5-20250929 | Enhanced model (versioned) |
| claude-sonnet-4 | Balanced model |
| claude-sonnet-4-20250514 | Balanced model (versioned) |
| claude-haiku-4-5 | Fast model |
| claude-3-7-sonnet-20250219 | Legacy model |

💡 Usage Examples

🔹 Simple cURL Request
curl http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer my-super-secret-password-123" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-5",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": false
  }'

Note: Replace my-super-secret-password-123 with the PROXY_API_KEY you set in your .env file.

🔹 Streaming Request
curl http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer my-super-secret-password-123" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-5",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "What is 2+2?"}
    ],
    "stream": true
  }'
🔹 With Tool Calling
curl http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer my-super-secret-password-123" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-5",
    "messages": [{"role": "user", "content": "What is the weather in London?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get weather for a location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {"type": "string", "description": "City name"}
          },
          "required": ["location"]
        }
      }
    }]
  }'
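When the model decides to call the function, the response message carries a tool_calls array; the client executes the function and sends the result back as a "tool" message. A minimal round-trip helper might look like this (run_tool_calls is a hypothetical client-side name; the message shapes follow the standard OpenAI format):

```python
import json

def run_tool_calls(assistant_message: dict, tools: dict) -> list:
    """Execute each tool call and build the follow-up messages.

    `assistant_message` is the OpenAI-format response message containing
    `tool_calls`; `tools` maps function names to Python callables.
    Returns the messages to append before the next completion request.
    """
    messages = [assistant_message]
    for call in assistant_message.get("tool_calls", []):
        fn = tools[call["function"]["name"]]
        # Arguments arrive as a JSON-encoded string
        args = json.loads(call["function"]["arguments"])
        result = fn(**args)
        messages.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": json.dumps(result),
        })
    return messages
```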
๐Ÿ Python OpenAI SDK
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="my-super-secret-password-123"  # Your PROXY_API_KEY from .env
)

response = client.chat.completions.create(
    model="claude-sonnet-4-5",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ],
    stream=True
)

for chunk in response:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
🦜 LangChain
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="http://localhost:8000/v1",
    api_key="my-super-secret-password-123",  # Your PROXY_API_KEY from .env
    model="claude-sonnet-4-5"
)

response = llm.invoke("Hello, how are you?")
print(response.content)

๐Ÿ“ Project Structure

kiro-openai-gateway/
├── main.py                    # Entry point, FastAPI app creation
├── requirements.txt           # Python dependencies
├── .env.example               # Environment configuration example
│
├── kiro_gateway/              # Main package
│   ├── __init__.py            # Package exports
│   ├── config.py              # Configuration and constants
│   ├── models.py              # Pydantic models for OpenAI API
│   ├── auth.py                # KiroAuthManager - token management
│   ├── cache.py               # ModelInfoCache - model caching
│   ├── utils.py               # Helper utilities
│   ├── converters.py          # OpenAI <-> Kiro conversion
│   ├── parsers.py             # AWS SSE stream parsers
│   ├── streaming.py           # Response streaming logic
│   ├── http_client.py         # HTTP client with retry logic
│   ├── debug_logger.py        # Debug logging (optional)
│   └── routes.py              # FastAPI routes
│
├── tests/                     # Tests
│   ├── unit/                  # Unit tests
│   └── integration/           # Integration tests
│
└── debug_logs/                # Debug logs (generated when enabled)

🔧 Debugging

Debug logging is disabled by default. To enable, add to your .env:

# Debug logging mode:
# - off: disabled (default)
# - errors: save logs only for failed requests (4xx, 5xx) - recommended for troubleshooting
# - all: save logs for every request (overwrites on each request)
DEBUG_MODE=errors

Debug Modes

| Mode | Description | Use Case |
|---|---|---|
| off | Disabled (default) | Production |
| errors | Save logs only for failed requests (4xx, 5xx) | Recommended for troubleshooting |
| all | Save logs for every request | Development/debugging |
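The mode selection reduces to a small predicate; this sketch (should_log is an invented name, not the gateway's internal function) mirrors the behavior described above:

```python
def should_log(mode: str, status_code: int) -> bool:
    """Decide whether a request's debug logs are written.

    "all" logs everything, "errors" logs 4xx/5xx responses only,
    and "off" (or any unknown value) logs nothing.
    """
    if mode == "all":
        return True
    if mode == "errors":
        return status_code >= 400
    return False
```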

Debug Files

When enabled, requests are logged to the debug_logs/ folder:

| File | Description |
|---|---|
| request_body.json | Incoming request from client (OpenAI format) |
| kiro_request_body.json | Request sent to Kiro API |
| response_stream_raw.txt | Raw stream from Kiro |
| response_stream_modified.txt | Transformed stream (OpenAI format) |
| app_logs.txt | Application logs for the request |
| error_info.json | Error details (only on errors) |

🧪 Testing

# Run all tests
pytest

# Run unit tests only
pytest tests/unit/

# Run with coverage
pytest --cov=kiro_gateway

🔌 Extending with New Providers

The modular architecture makes it easy to add support for other providers:

  1. Create a new module kiro_gateway/providers/new_provider.py
  2. Implement the required classes:
    • NewProviderAuthManager — token management
    • NewProviderConverter — format conversion
    • NewProviderParser — response parsing
  3. Add routes to routes.py or create a separate router
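The three classes could share small abstract interfaces like the following. This is only a sketch: the class and method names here are illustrative, not the gateway's actual base classes, which the repository does not currently define.

```python
from abc import ABC, abstractmethod
from typing import Optional

class ProviderAuthManager(ABC):
    """Token acquisition and refresh for a provider (illustrative interface)."""
    @abstractmethod
    def get_access_token(self) -> str:
        """Return a valid access token, refreshing it first if needed."""

class ProviderConverter(ABC):
    """Translate between OpenAI-format and provider-format payloads."""
    @abstractmethod
    def openai_to_provider(self, request: dict) -> dict: ...
    @abstractmethod
    def provider_to_openai(self, response: dict) -> dict: ...

class ProviderParser(ABC):
    """Turn raw provider stream events into OpenAI-style chunks."""
    @abstractmethod
    def parse_stream_event(self, raw: bytes) -> Optional[dict]: ...
```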

📜 License

This project is licensed under the GNU Affero General Public License v3.0 (AGPL-3.0).

This means:

  • ✅ You can use, modify, and distribute this software
  • ✅ You can use it for commercial purposes
  • ⚠️ You must disclose source code when you distribute the software
  • ⚠️ Network use is distribution — if you run a modified version on a server and let others interact with it, you must make the source code available to them
  • ⚠️ Modifications must be released under the same license

See the LICENSE file for the full license text.

Why AGPL-3.0?

AGPL-3.0 ensures that improvements to this software benefit the entire community. If you modify this gateway and deploy it as a service, you must share your improvements with your users.

Contributor License Agreement (CLA)

By submitting a contribution to this project, you agree to the terms of our Contributor License Agreement (CLA). This ensures that:

  • You have the right to submit the contribution
  • You grant the maintainer rights to use and relicense your contribution
  • The project remains legally protected

👤 Author

Jwadow — @Jwadow


⚠️ Disclaimer

This project is not affiliated with, endorsed by, or sponsored by Amazon Web Services (AWS), Anthropic, or Kiro IDE. Use at your own risk and in compliance with the terms of service of the underlying APIs.

