A custom wrapper for the Langchain Python package with modified response handling

Project description

custom_openai

custom_openai is a drop-in replacement for the official OpenAI Python SDK that lets you customize, standardize, and enrich API responses for your own post-processing, logging, RAG pipelines, or monitoring. It supports both synchronous and asynchronous usage, including streaming, while remaining fully compatible with the OpenAI API.


🚀 Features

  • Plug-and-play: Fully compatible with the OpenAI client interface. Just swap your import and go.
  • Custom response fields: Every response returned by .create() (sync, async, or streaming) carries a .flashquery attribute for easy access to processed or enriched response data.
  • Streaming support: Automatically injects your custom data into the final item in async generator streams.
  • No monkey-patching: Cleanly extends OpenAI’s classes without risky global side effects.
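
The "no monkey-patching" point can be illustrated with a toy sketch: subclass the client resource and override .create(), rather than patching the base class globally. The class names below are invented for illustration; only the .flashquery attribute name comes from this package:

```python
from types import SimpleNamespace


class BaseCompletions:
    """Stand-in for an SDK's completions resource (illustrative only)."""

    def create(self, **kwargs):
        # A real SDK would call the API here; we fake a response object.
        return SimpleNamespace(content="hello", **kwargs)


class EnrichedCompletions(BaseCompletions):
    """Overrides create() and attaches a custom field on the instance --
    the base class and any other code using it are untouched."""

    def create(self, **kwargs):
        response = super().create(**kwargs)
        response.flashquery = {"processed": response.content.upper()}
        return response


response = EnrichedCompletions().create(model="toy-model")
print(response.flashquery)  # {'processed': 'HELLO'}
```

Because only the subclass is changed, code that still instantiates the base class sees no side effects.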

📦 Installation

pip install flashquery

🔥 Quickstart

Synchronous Example

from flashquery.client import CustomLangchainClient

client = CustomLangchainClient(
    provider="openai",
    model="gpt-4o-mini",
    temperature=0,
    api_key="sk-..."
)

response = client.generate([{"role": "user", "content": "Say hello?"}])

print(response.flashquery)  # Your custom field!

Streaming Example

import asyncio
from flashquery.client import CustomLangchainClient

async def main():
    client = CustomLangchainClient(
        provider="openai",
        model="gpt-4o-mini",
        temperature=0,
        api_key="sk-..."
    )
    stream = client.astream([{"role": "user", "content": "Hi, how are you?"}])

    async for chunk in stream:
        # Only the final chunk carries .flashquery (see "How it Works")
        print("Chunk:", getattr(chunk, "flashquery", None))

asyncio.run(main())

⚙️ How it Works

  • CustomOpenAIClient and CustomAsyncOpenAIClient inherit from the official OpenAI clients and override the .create() methods of the responses and chat.completions resources.
  • After each response is created, a new .flashquery attribute is attached.
  • In streaming mode, .flashquery is set on the last chunk yielded from the generator.
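
The streaming behavior in the last bullet — tagging only the final chunk — can be sketched with a plain async generator. This is a toy illustration, not the package's actual implementation; only the .flashquery name comes from this README:

```python
import asyncio
from types import SimpleNamespace


async def stream_with_flashquery(chunks):
    """Yield chunks unchanged, attaching .flashquery to the last one.

    Holding one chunk back lets us detect the final item of the stream.
    """
    previous = None
    async for chunk in chunks:
        if previous is not None:
            yield previous
        previous = chunk
    if previous is not None:
        previous.flashquery = {"final": True}  # enrichment goes here
        yield previous


async def fake_upstream():
    """Stand-in for an SDK stream (illustrative only)."""
    for text in ("Hel", "lo", "!"):
        yield SimpleNamespace(delta=text)


async def main():
    async for chunk in stream_with_flashquery(fake_upstream()):
        print(chunk.delta, getattr(chunk, "flashquery", None))


asyncio.run(main())  # prints: Hel None / lo None / ! {'final': True}
```

The one-chunk lookahead is needed because an async generator has no way to know it is on the last item until the upstream iterator is exhausted.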

🤝 Contributing

Pull requests are welcome! For major changes, please open an issue first to discuss what you would like to change.


📄 License

MIT License


Need help? Open an issue or contribute on GitHub!

Download files

Download the file for your platform.

Source Distribution

flashquery-0.0.2.tar.gz (4.2 kB)

Built Distribution


flashquery-0.0.2-py3-none-any.whl (4.5 kB)

File details

Details for the file flashquery-0.0.2.tar.gz.

File metadata

  • Download URL: flashquery-0.0.2.tar.gz
  • Upload date:
  • Size: 4.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

Hashes for flashquery-0.0.2.tar.gz

  • SHA256: 399d73aef0d04c3e81af5c7e3eef90daf537b1a5f23ac001ff29220a3f0ce6b3
  • MD5: 66c474869e4eafdeef152e17f3f944f5
  • BLAKE2b-256: b41f30c4110c427143d588f1313581dce98f89e48cbe56ef77b85a67f23539c4


File details

Details for the file flashquery-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: flashquery-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 4.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

Hashes for flashquery-0.0.2-py3-none-any.whl

  • SHA256: 93bd2b5403dee58b57192a8389f6366391a881112450784860cffb191372b180
  • MD5: 72b9a483bd6a8208ee970d2c98272b2c
  • BLAKE2b-256: 7ee28b4369679a2cac7d29e17c08244b4fd625214edcd28481cce65add6f4371

