# Kiro OpenAI Gateway

OpenAI-compatible proxy gateway for the Kiro IDE API (AWS CodeWhisperer).

Use Claude models through any tool that supports the OpenAI API.

Features • Quick Start • Configuration • API Reference • License
## Features

| Feature | Description |
|---|---|
| OpenAI-compatible API | Works with any OpenAI client out of the box |
| Full message history | Passes complete conversation context |
| Tool calling | Supports function calling in OpenAI format |
| Streaming | Full SSE streaming support |
| Retry logic | Automatic retries on errors (403, 429, 5xx) |
| Extended model list | Including versioned models |
| Smart token management | Automatic refresh before expiration |
| Modular architecture | Easy to extend with new providers |
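The retry behavior listed above can be pictured as a simple exponential-backoff loop. The sketch below is illustrative only, not the gateway's actual implementation (the `request_with_retries` helper and its signature are hypothetical):

```python
import time

# HTTP statuses the gateway treats as retryable (from the feature table above)
RETRYABLE = {403, 429, 500, 502, 503, 504}

def request_with_retries(send, max_attempts=3, base_delay=1.0):
    """Call `send()` (returning an object with a .status_code attribute)
    and retry on retryable statuses with exponential backoff."""
    for attempt in range(max_attempts):
        response = send()
        if response.status_code not in RETRYABLE:
            return response
        if attempt < max_attempts - 1:
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
    return response  # last response, still an error after all attempts
```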
## Quick Start

### Prerequisites

- Python 3.10+
- Kiro IDE with a logged-in account

### Installation

```bash
# Clone the repository
git clone https://github.com/Jwadow/kiro-openai-gateway.git
cd kiro-openai-gateway

# Install dependencies
pip install -r requirements.txt

# Configure (see the Configuration section)
cp .env.example .env
# Edit .env with your credentials

# Start the server
python main.py
```

The server will be available at http://localhost:8000.
## Configuration

### Option 1: JSON Credentials File

Specify the path to the credentials file:

```env
KIRO_CREDS_FILE="~/.aws/sso/cache/kiro-auth-token.json"

# Password to protect YOUR proxy server (make up any secure string).
# You'll use this as api_key when connecting to your gateway.
PROXY_API_KEY="my-super-secret-password-123"
```

JSON file format:

```json
{
  "accessToken": "eyJ...",
  "refreshToken": "eyJ...",
  "expiresAt": "2025-01-12T23:00:00.000Z",
  "profileArn": "arn:aws:codewhisperer:us-east-1:...",
  "region": "us-east-1"
}
```
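To sanity-check a credentials file before pointing the gateway at it, you can parse it and compare `expiresAt` against the current time. A minimal sketch using the field names from the JSON format above (the helper itself is not part of the gateway):

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def load_kiro_creds(path):
    """Load a Kiro credentials JSON file and report whether the
    access token has already expired."""
    creds = json.loads(Path(path).expanduser().read_text())
    # expiresAt is an ISO-8601 timestamp like "2025-01-12T23:00:00.000Z"
    expires_at = datetime.fromisoformat(creds["expiresAt"].replace("Z", "+00:00"))
    expired = expires_at <= datetime.now(timezone.utc)
    return creds, expired
```

If `expired` is true, the gateway will need the `refreshToken` to obtain a fresh access token on startup.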
### Option 2: Environment Variables (.env file)

Create a `.env` file in the project root:

```env
# Required
REFRESH_TOKEN="your_kiro_refresh_token"

# Password to protect YOUR proxy server (make up any secure string)
PROXY_API_KEY="my-super-secret-password-123"

# Optional
PROFILE_ARN="arn:aws:codewhisperer:us-east-1:..."
KIRO_REGION="us-east-1"
```
### Getting the Refresh Token

The refresh token can be obtained by intercepting Kiro IDE traffic. Look for requests to:

```
prod.us-east-1.auth.desktop.kiro.dev/refreshToken
```
## API Reference

### Endpoints

| Endpoint | Method | Description |
|---|---|---|
| `/` | GET | Health check |
| `/health` | GET | Detailed health check |
| `/v1/models` | GET | List available models |
| `/v1/chat/completions` | POST | Chat completions |
### Available Models

| Model | Description |
|---|---|
| `claude-opus-4-5` | Top-tier model |
| `claude-opus-4-5-20251101` | Top-tier model (versioned) |
| `claude-sonnet-4-5` | Enhanced model |
| `claude-sonnet-4-5-20250929` | Enhanced model (versioned) |
| `claude-sonnet-4` | Balanced model |
| `claude-sonnet-4-20250514` | Balanced model (versioned) |
| `claude-haiku-4-5` | Fast model |
| `claude-3-7-sonnet-20250219` | Legacy model |
## Usage Examples

### Simple cURL Request

```bash
curl http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer my-super-secret-password-123" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-5",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": false
  }'
```

Note: Replace `my-super-secret-password-123` with the `PROXY_API_KEY` you set in your `.env` file.
### Streaming Request

```bash
curl http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer my-super-secret-password-123" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-5",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "What is 2+2?"}
    ],
    "stream": true
  }'
```
### With Tool Calling

```bash
curl http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer my-super-secret-password-123" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-5",
    "messages": [{"role": "user", "content": "What is the weather in London?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get weather for a location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {"type": "string", "description": "City name"}
          },
          "required": ["location"]
        }
      }
    }]
  }'
```
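When the model decides to call a tool, the assistant message carries a `tool_calls` array instead of plain content. Below is a minimal sketch of dispatching such calls, assuming responses shaped like the OpenAI chat-completions schema; the `tools` dict mapping tool names to local functions is an assumption for illustration:

```python
import json

def dispatch_tool_calls(message, tools):
    """Run each tool call in an assistant message and return the
    `tool`-role messages to append on the follow-up request."""
    results = []
    for call in message.get("tool_calls", []):
        name = call["function"]["name"]
        # The arguments arrive as a JSON-encoded string, not a dict
        args = json.loads(call["function"]["arguments"])
        output = tools[name](**args)  # invoke the matching local function
        results.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": json.dumps(output),
        })
    return results
```

The returned messages are appended to the conversation and sent back to `/v1/chat/completions` so the model can produce its final answer.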
### Python OpenAI SDK

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="my-super-secret-password-123"  # Your PROXY_API_KEY from .env
)

response = client.chat.completions.create(
    model="claude-sonnet-4-5",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ],
    stream=True
)

for chunk in response:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```
### LangChain

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="http://localhost:8000/v1",
    api_key="my-super-secret-password-123",  # Your PROXY_API_KEY from .env
    model="claude-sonnet-4-5"
)

response = llm.invoke("Hello, how are you?")
print(response.content)
```
## Project Structure

```
kiro-openai-gateway/
├── main.py              # Entry point, FastAPI app creation
├── requirements.txt     # Python dependencies
├── .env.example         # Environment configuration example
│
├── kiro_gateway/        # Main package
│   ├── __init__.py      # Package exports
│   ├── config.py        # Configuration and constants
│   ├── models.py        # Pydantic models for OpenAI API
│   ├── auth.py          # KiroAuthManager - token management
│   ├── cache.py         # ModelInfoCache - model caching
│   ├── utils.py         # Helper utilities
│   ├── converters.py    # OpenAI <-> Kiro conversion
│   ├── parsers.py       # AWS SSE stream parsers
│   ├── streaming.py     # Response streaming logic
│   ├── http_client.py   # HTTP client with retry logic
│   ├── debug_logger.py  # Debug logging (optional)
│   └── routes.py        # FastAPI routes
│
├── tests/               # Tests
│   ├── unit/            # Unit tests
│   └── integration/     # Integration tests
│
└── debug_logs/          # Debug logs (generated when enabled)
```
## Debugging

Debug logging is disabled by default. To enable it, add to your `.env`:

```env
# Debug logging mode:
# - off: disabled (default)
# - errors: save logs only for failed requests (4xx, 5xx) - recommended for troubleshooting
# - all: save logs for every request (overwrites on each request)
DEBUG_MODE=errors
```
### Debug Modes

| Mode | Description | Use Case |
|---|---|---|
| `off` | Disabled (default) | Production |
| `errors` | Save logs only for failed requests (4xx, 5xx) | Recommended for troubleshooting |
| `all` | Save logs for every request | Development/debugging |
### Debug Files

When enabled, requests are logged to the `debug_logs/` folder:

| File | Description |
|---|---|
| `request_body.json` | Incoming request from client (OpenAI format) |
| `kiro_request_body.json` | Request sent to Kiro API |
| `response_stream_raw.txt` | Raw stream from Kiro |
| `response_stream_modified.txt` | Transformed stream (OpenAI format) |
| `app_logs.txt` | Application logs for the request |
| `error_info.json` | Error details (only on errors) |
## Testing

```bash
# Run all tests
pytest

# Run unit tests only
pytest tests/unit/

# Run with coverage
pytest --cov=kiro_gateway
```
## Extending with New Providers

The modular architecture makes it easy to add support for other providers:

1. Create a new module `kiro_gateway/providers/new_provider.py`
2. Implement the required classes:
   - `NewProviderAuthManager` for token management
   - `NewProviderConverter` for format conversion
   - `NewProviderParser` for response parsing
3. Add routes to `routes.py` or create a separate router
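As a sketch of what step 2 might look like, the skeleton below shows one plausible shape for the three classes; the method names and signatures are illustrative assumptions, not interfaces prescribed by the gateway:

```python
class NewProviderAuthManager:
    """Manages provider tokens, refreshing them before expiration."""

    def __init__(self, refresh_token):
        self.refresh_token = refresh_token
        self._access_token = None

    def get_access_token(self):
        # A real implementation would also refresh when close to expiry.
        if self._access_token is None:
            self._access_token = self._refresh()
        return self._access_token

    def _refresh(self):
        raise NotImplementedError("call the provider's token endpoint here")


class NewProviderConverter:
    """Converts between the OpenAI request format and the provider's."""

    def to_provider(self, openai_request):
        # Map OpenAI fields onto the provider's (hypothetical) schema.
        return {
            "model": openai_request["model"],
            "conversation": openai_request["messages"],
        }


class NewProviderParser:
    """Parses the provider's response stream into OpenAI-style chunks."""

    def parse_chunk(self, raw_chunk):
        raise NotImplementedError("translate one provider chunk here")
```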
## License

This project is licensed under the GNU Affero General Public License v3.0 (AGPL-3.0).

This means:

- ✅ You can use, modify, and distribute this software
- ✅ You can use it for commercial purposes
- ⚠️ You must disclose source code when you distribute the software
- ⚠️ Network use is distribution: if you run a modified version on a server and let others interact with it, you must make the source code available to them
- ⚠️ Modifications must be released under the same license

See the LICENSE file for the full license text.

### Why AGPL-3.0?

AGPL-3.0 ensures that improvements to this software benefit the entire community. If you modify this gateway and deploy it as a service, you must share your improvements with your users.
### Contributor License Agreement (CLA)
By submitting a contribution to this project, you agree to the terms of our Contributor License Agreement (CLA). This ensures that:
- You have the right to submit the contribution
- You grant the maintainer rights to use and relicense your contribution
- The project remains legally protected
## Author

Jwadow ([@Jwadow](https://github.com/Jwadow))
## ⚠️ Disclaimer
This project is not affiliated with, endorsed by, or sponsored by Amazon Web Services (AWS), Anthropic, or Kiro IDE. Use at your own risk and in compliance with the terms of service of the underlying APIs.
## Project details
### File details: kiro_openai_gateway-1.0.8.tar.gz

File metadata:

- Download URL: kiro_openai_gateway-1.0.8.tar.gz
- Upload date:
- Size: 58.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.2 CPython/3.12.9 Darwin/24.5.0

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `42a5c46fd1ef4a42cce33ca9999955ea2788ba06db447dd62655037e9c3be044` |
| MD5 | `8fb93a91ffa04eb49ed43ea43164b129` |
| BLAKE2b-256 | `c0a6457c4db936154a36f283e39d53b0a7f078df54103e94b6b13faa04cf9231` |
### File details: kiro_openai_gateway-1.0.8-py3-none-any.whl

File metadata:

- Download URL: kiro_openai_gateway-1.0.8-py3-none-any.whl
- Upload date:
- Size: 70.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.2 CPython/3.12.9 Darwin/24.5.0

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `45cb295a24079b7beac4680e4560027e9722822baf5a5191a2092269133b2ece` |
| MD5 | `6fedac5cb628a605b7c92350bd80d9bb` |
| BLAKE2b-256 | `2c970d9ea02de7fbc91a1694c91cd7903ee4391e1a8f93a09ba242df60484bf9` |