Fast, minimalist, multi-model terminal-based SDK for building, testing, and interacting with LLMs via cloud APIs.

Project description

FastCCG (Fast Conversational & Completion Gateway)


FastCCG is a simple, powerful, and developer-friendly Python library for interacting with Large Language Models (LLMs). It provides a clean, unified API to work with models from leading providers like OpenAI, Google, Anthropic, and Mistral, making it easy to build, test, and deploy AI-powered applications.

🚀 Key Features

  • 🔄 Unified API: Switch between different LLM providers with minimal code changes
  • ⚡ Async Support: Built-in asynchronous operations for high-performance applications
  • 🌊 Streaming: Real-time response streaming for interactive experiences
  • 💾 Session Management: Save and restore conversation history
  • 🖥️ CLI Interface: Powerful command-line tools for quick testing and interaction
  • 🔧 Easy Configuration: Chainable methods for clean, readable code
  • 🛡️ Error Handling: Robust error handling with custom exceptions
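The "unified API" idea can be illustrated with a minimal, standalone sketch. This is not FastCCG's actual internals — the class and method names below are invented for illustration — but it shows why application code that depends only on a shared interface can swap providers with a one-line change:

```python
from abc import ABC, abstractmethod

class ChatModel(ABC):
    """Provider-agnostic interface: every backend exposes one ask() method."""
    @abstractmethod
    def ask(self, prompt: str) -> str: ...

class FakeOpenAI(ChatModel):
    def ask(self, prompt: str) -> str:
        return f"[openai] {prompt}"

class FakeMistral(ChatModel):
    def ask(self, prompt: str) -> str:
        return f"[mistral] {prompt}"

def answer(model: ChatModel, prompt: str) -> str:
    # Application code depends only on the interface,
    # so switching providers means changing one constructor call.
    return model.ask(prompt)

print(answer(FakeOpenAI(), "hi"))   # [openai] hi
print(answer(FakeMistral(), "hi"))  # [mistral] hi
```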

🏗️ Supported Providers

Provider  | Models                           | Status
--------- | -------------------------------- | ------------------
OpenAI    | GPT-4o, GPT-3.5 Turbo            | ✅ Fully Supported
Google    | Gemini 1.5 Pro, Gemini 1.5 Flash | ✅ Fully Supported
Mistral   | Mistral Tiny, Small, Medium      | ✅ Fully Supported
Anthropic | Claude 3 Sonnet                  | ✅ Fully Supported

📦 Installation

pip install fastccg

⚡ Quick Start

import fastccg
from fastccg.models.gpt import gpt_4o

# Add your API key
api_key = fastccg.add_openai_key("sk-...")

# Initialize the model
model = fastccg.init_model(gpt_4o, api_key=api_key)

# Ask a question
response = model.ask("What is the best thing about Large Language Models?")
print(response.content)

🖥️ CLI Usage

FastCCG comes with a powerful CLI for quick interactions:

# List available models
fastccg models

# Ask a single question
fastccg ask "What is the capital of France?" --model gpt-4o

# Start an interactive chat session
fastccg chat --model gpt-4o

🔄 Advanced Features

Asynchronous Operations

import asyncio

# Assumes `model` was initialized as in the Quick Start above.

async def main():
    # Schedule multiple prompts concurrently
    task1 = model.ask_async("What is the speed of light?")
    task2 = model.ask_async("What is the capital of Australia?")

    responses = await asyncio.gather(task1, task2)
    for response in responses:
        print(response.content)

asyncio.run(main())
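The fan-out pattern above is plain `asyncio.gather`; a self-contained sketch with stub coroutines (no API calls, `fake_ask` is a stand-in for `ask_async`) shows the shape:

```python
import asyncio

async def fake_ask(prompt: str) -> str:
    # Stand-in for model.ask_async(): yields control, then echoes.
    await asyncio.sleep(0)
    return f"answer to: {prompt}"

async def main() -> list:
    # Both coroutines are scheduled, then awaited together.
    return await asyncio.gather(
        fake_ask("What is the speed of light?"),
        fake_ask("What is the capital of Australia?"),
    )

results = asyncio.run(main())
print(results)
```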

Streaming Responses

import asyncio

# Assumes `model` was initialized as in the Quick Start above.

async def stream_example():
    async for chunk in model.ask_stream("Tell me a story"):
        print(chunk.content, end="", flush=True)

asyncio.run(stream_example())
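Under the hood, consuming a stream is just iterating an async generator. A standalone sketch (with `fake_stream` standing in for `ask_stream`, which is assumed here to be an async generator):

```python
import asyncio

async def fake_stream(text: str):
    # Stand-in for model.ask_stream(): yields the response in chunks.
    for word in text.split():
        await asyncio.sleep(0)
        yield word + " "

async def collect() -> str:
    chunks = []
    async for chunk in fake_stream("Once upon a time"):
        chunks.append(chunk)  # in a UI, print each chunk as it arrives
    return "".join(chunks)

story = asyncio.run(collect())
print(story)
```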

Session Management

# Save conversation
model.save("my_session.json")

# Load conversation later
loaded_model = fastccg.load_model("my_session.json", api_key=api_key)
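The value of persisting a session is that conversation history survives a restart. The sketch below round-trips a history list through JSON; the message shape is hypothetical — FastCCG's real on-disk format may differ:

```python
import json
import os
import tempfile

# Hypothetical history shape; FastCCG's actual session format may differ.
history = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "Paris."},
]

path = os.path.join(tempfile.mkdtemp(), "my_session.json")
with open(path, "w") as f:
    json.dump(history, f)

# Later (e.g. after a restart), restore the conversation.
with open(path) as f:
    restored = json.load(f)

assert restored == history  # the conversation survives the round trip
```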

📚 Documentation

Comprehensive documentation is available in the docs/ directory of the repository.

🤝 Contributing

We welcome contributions! Please see our Contributing Guidelines for details.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🌟 Why FastCCG?

  • Developer Experience: Clean, intuitive API that just works
  • Performance: Built with async-first architecture for scalable applications
  • Flexibility: Easy to switch between providers and models
  • Reliability: Comprehensive error handling and testing
  • Community: Open source with active development and support

📖 Read the Full Documentation | 🚀 Get Started Now | 💬 Join the Discussion

Project details


Download files

Download the file for your platform.

Source Distribution

fastccg-0.2.0.tar.gz (21.3 kB)

Uploaded Source

Built Distribution


fastccg-0.2.0-py3-none-any.whl (26.6 kB)

Uploaded Python 3

File details

Details for the file fastccg-0.2.0.tar.gz.

File metadata

  • Download URL: fastccg-0.2.0.tar.gz
  • Upload date:
  • Size: 21.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for fastccg-0.2.0.tar.gz
Algorithm   | Hash digest
----------- | -----------
SHA256      | 89761e6d6e99bd85a80d348d5c976502402b49345e62ba7e82c0be2cc653a98b
MD5         | 7bd1319fa16f945cbc2be7511d7567cc
BLAKE2b-256 | 9e1720b7a6aaaf7dead19c75c51cbf142632efcf9219fb7b47c5e32614a95e0e


File details

Details for the file fastccg-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: fastccg-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 26.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.5

File hashes

Hashes for fastccg-0.2.0-py3-none-any.whl
Algorithm   | Hash digest
----------- | -----------
SHA256      | 96c427ec8e6d2a391812a2aa7de64dd45199ecff07ebc66445486835be3ca932
MD5         | 3ae77b0c93f92170298a0071621ad55d
BLAKE2b-256 | 1da6f3b9dfb57d02fc9928ce32c958a4d7c4f43592038c2e76772434b3ca90ce

