# CheckThat AI Python SDK
A Python SDK for the CheckThat AI platform's unified LLM API with built-in fact-checking and claim normalization capabilities.
## Features
- 🔄 Unified LLM Access: Access 11+ models from OpenAI, Anthropic, Google Gemini, xAI, and Together AI through a single API
- 🔍 Claim Normalization: Standardize and structure claims for analysis
- ✅ Fact-Checking: Built-in claim verification and evidence sourcing
- 🔌 OpenAI Compatible: Drop-in replacement for OpenAI Python SDK
- ⚡ Async Support: Full async/await support for high-performance applications
- 🛡️ Type Safety: Complete type hints for better development experience
## Installation

```bash
pip install checkthat-ai
```
## Quick Start

### Basic Usage
```python
import os

from checkthat_ai import CheckThatAI

# Initialize the client
api_key = os.environ.get("OPENAI_API_KEY")  # or your provider's API key
client = CheckThatAI(api_key=api_key)

# Use exactly like OpenAI's client
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "user", "content": "Fact-check this claim: The Earth is flat"}
    ]
)

print(response.choices[0].message.content)
```
### Async Usage
```python
import asyncio

from checkthat_ai import AsyncCheckThatAI

async def main():
    client = AsyncCheckThatAI(api_key="your-api-key")
    response = await client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "user", "content": "What is the capital of France?"}
        ]
    )
    print(response.choices[0].message.content)

asyncio.run(main())
```
### Streaming Responses
```python
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True,
)

for chunk in response:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```
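The accumulation pattern above can be exercised without a network call. In the sketch below, `Delta`, `Choice`, and `Chunk` are simplified stand-ins for the SDK's streaming objects, not its actual types:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Delta:
    content: Optional[str]

@dataclass
class Choice:
    delta: Delta

@dataclass
class Chunk:
    choices: list

def collect_stream(chunks):
    """Accumulate streamed deltas into the full message text."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:  # some chunks carry no content (e.g. role-only deltas)
            parts.append(delta)
    return "".join(parts)

# Simulated stream: two content chunks around an empty one
stream = [
    Chunk([Choice(Delta("Once "))]),
    Chunk([Choice(Delta(None))]),
    Chunk([Choice(Delta("upon a time."))]),
]
print(collect_stream(stream))  # Once upon a time.
```

Checking `delta.content` for `None` before appending matters: real streams interleave metadata-only chunks with content chunks.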
## Supported Models
The SDK provides access to models from multiple providers:
- OpenAI: GPT-5, GPT-5 nano, o3, o4-mini
- Anthropic: Claude Sonnet 4, Claude Opus 4.1
- Google: Gemini 2.5 Pro, Gemini 2.5 Flash
- xAI: Grok 4, Grok 3, Grok 3 Mini
- Together AI: Llama 3.3 70B, DeepSeek R1 Distill Llama 70B
## API Reference

### CheckThatAI Client
```python
client = CheckThatAI(
    api_key="your-api-key",                      # Required: your API key
    base_url="https://api.checkthat-ai.com/v1",  # Optional: custom base URL
    timeout=30.0,                                # Optional: request timeout in seconds
    max_retries=3,                               # Optional: max retry attempts
)
```
### Chat Completions
Compatible with OpenAI's chat completions API:
```python
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[...],
    temperature=0.7,
    max_tokens=1000,
    stream=False,
    # ... other OpenAI parameters
)
```
### Model Information
```python
# List available models
models = client.models.list()
for model in models:
    print(f"Model: {model.id}")
```
Sample response:

```json
{
  "models_list": [
    {
      "provider": "OpenAI",
      "available_models": [
        {
          "name": "GPT-4o",
          "model_id": "gpt-4o-2024-11-20"
        },
        {
          "name": "GPT-4.1",
          "model_id": "gpt-4.1-2025-04-14"
        },
        {
          "name": "o4-mini",
          "model_id": "o4-mini-2025-04-16"
        }
      ]
    }
  ]
}
```
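Flattening that payload into plain model IDs takes one dict comprehension. The snippet below simply hard-codes the sample response shown above:

```python
# The sample models-list payload, as a plain Python dict
models_response = {
    "models_list": [
        {
            "provider": "OpenAI",
            "available_models": [
                {"name": "GPT-4o", "model_id": "gpt-4o-2024-11-20"},
                {"name": "GPT-4.1", "model_id": "gpt-4.1-2025-04-14"},
                {"name": "o4-mini", "model_id": "o4-mini-2025-04-16"},
            ],
        }
    ]
}

# Map each provider to the model_ids it exposes
ids_by_provider = {
    entry["provider"]: [m["model_id"] for m in entry["available_models"]]
    for entry in models_response["models_list"]
}
print(ids_by_provider["OpenAI"])  # ['gpt-4o-2024-11-20', 'gpt-4.1-2025-04-14', 'o4-mini-2025-04-16']
```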
## Authentication
Set your API key as an environment variable:
```bash
export OPENAI_API_KEY="your-openai-key"
export ANTHROPIC_API_KEY="your-anthropic-key"
export GEMINI_API_KEY="your-gemini-key"
export XAI_API_KEY="your-xai-key"
export TOGETHER_API_KEY="your-together-key"
```
Then pass your API key to the client:

```python
client = CheckThatAI(api_key="your-api-key")
```
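When several provider keys might be set, a small helper can select the first one that is present. `resolve_api_key` and its ordering are an illustrative utility, not part of the SDK:

```python
import os

# Environment variable names checked in order of preference (illustrative)
PROVIDER_KEY_VARS = [
    "OPENAI_API_KEY",
    "ANTHROPIC_API_KEY",
    "GEMINI_API_KEY",
    "XAI_API_KEY",
    "TOGETHER_API_KEY",
]

def resolve_api_key(env=os.environ, candidates=PROVIDER_KEY_VARS):
    """Return the first non-empty key found among the candidate variables."""
    for name in candidates:
        value = env.get(name)
        if value:
            return value
    raise RuntimeError("No provider API key found in the environment")

# Hypothetical usage:
# client = CheckThatAI(api_key=resolve_api_key())
```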
## Error Handling
The SDK uses the same exception types as the OpenAI SDK:
```python
from openai import APITimeoutError, OpenAIError, RateLimitError

try:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Hello!"}]
    )
except RateLimitError:
    print("Rate limit exceeded")
except APITimeoutError:
    print("Request timed out")
except OpenAIError as e:
    print(f"API error: {e}")
```
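Rate limits and timeouts are transient, so a common pattern is to retry the `create` call with exponential backoff rather than just printing a message. The stdlib-only sketch below demonstrates the pattern with a stand-in flaky function; in real code you would wrap the completion call and pass `(RateLimitError, APITimeoutError)` as the retryable exceptions:

```python
import random
import time

def with_retries(fn, retryable=(Exception,), max_retries=3, base_delay=0.1):
    """Call fn, retrying on the given exceptions with exponential backoff."""
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except retryable:
            if attempt == max_retries:
                raise  # out of retries: surface the last error
            # Backoff with jitter: ~0.1s, ~0.2s, ~0.4s, ...
            time.sleep(base_delay * (2 ** attempt) * random.uniform(0.5, 1.0))

# Demo: a function that fails twice before succeeding
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("transient")
    return "ok"

print(with_retries(flaky, retryable=(TimeoutError,), base_delay=0.01))  # ok
```

The jitter spreads concurrent clients' retries apart, which avoids hammering the API at the same instant after a shared rate-limit window.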
## Contributing
We welcome contributions! Please see our Contributing Guide for details.
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Support
- 📧 Email: kadapalanikhil@gmail.com
- 🐛 Issues: GitHub Issues
- 🌐 Website: checkthat-ai.com
## Changelog
See CHANGELOG.md for a history of changes.
## File details

Details for the file checkthat_ai-0.1.0.tar.gz.

File metadata:

- Download URL: checkthat_ai-0.1.0.tar.gz
- Upload date:
- Size: 10.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | c922c937a7f5b45034e3c0f4acca177164e5e97f60f1b3c5c25e4e1e60d12dc4 |
| MD5 | 51805330f8d49a40b86c268556b17d55 |
| BLAKE2b-256 | 9bca9e3d1b586db165bb31f83b443911457fa3076de0bd6538d747d7accc85d3 |
### Provenance

The following attestation bundles were made for checkthat_ai-0.1.0.tar.gz:

Publisher: python-publish.yml on Nikhil-Kadapala/checkthat-ai

Statement:

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: checkthat_ai-0.1.0.tar.gz
- Subject digest: c922c937a7f5b45034e3c0f4acca177164e5e97f60f1b3c5c25e4e1e60d12dc4
- Sigstore transparency entry: 499041847
- Sigstore integration time:
- Permalink: Nikhil-Kadapala/checkthat-ai@3be21066a9b9b0a647ab35d5d1fb3531e172f804
- Branch / Tag: refs/tags/v0.1.0
- Owner: https://github.com/Nikhil-Kadapala
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@3be21066a9b9b0a647ab35d5d1fb3531e172f804
- Trigger Event: release
## File details

Details for the file checkthat_ai-0.1.0-py3-none-any.whl.

File metadata:

- Download URL: checkthat_ai-0.1.0-py3-none-any.whl
- Upload date:
- Size: 10.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | 5be5a9d8ddbe7bace96d6334d0e5f23a2225e087b98a2e37a7a4f852a0437aef |
| MD5 | c31f2e918f5cf6b057493b34ba9b3a00 |
| BLAKE2b-256 | 4cb3a91d528a0e36be12ddeb9c25aee015d2f29af3f7e4f0b973d8703430b688 |
### Provenance

The following attestation bundles were made for checkthat_ai-0.1.0-py3-none-any.whl:

Publisher: python-publish.yml on Nikhil-Kadapala/checkthat-ai

Statement:

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: checkthat_ai-0.1.0-py3-none-any.whl
- Subject digest: 5be5a9d8ddbe7bace96d6334d0e5f23a2225e087b98a2e37a7a4f852a0437aef
- Sigstore transparency entry: 499041852
- Sigstore integration time:
- Permalink: Nikhil-Kadapala/checkthat-ai@3be21066a9b9b0a647ab35d5d1fb3531e172f804
- Branch / Tag: refs/tags/v0.1.0
- Owner: https://github.com/Nikhil-Kadapala
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@3be21066a9b9b0a647ab35d5d1fb3531e172f804
- Trigger Event: release