
A custom wrapper for the Langchain Python package with modified response handling

Project description

custom_openai

custom_openai is a drop-in replacement for the official OpenAI Python SDK that lets you customize, standardize, and enrich API responses for your own post-processing, logging, RAG pipelines, or monitoring needs. It supports both synchronous and asynchronous usage, including streaming, while remaining fully compatible with the OpenAI API.


🚀 Features

  • Plug-and-play: Fully compatible with the OpenAI client interface. Just swap your import and go.
  • Custom response fields: Every .create() call (sync/async/streaming) includes a .flashquery attribute for easy extraction of processed or enriched response data.
  • Streaming support: Automatically injects your custom data into the final item in async generator streams.
  • No monkey-patching: Cleanly extends OpenAI’s classes without risky global side effects.
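The last-chunk injection that the streaming feature describes can be sketched as an async-generator wrapper. This is an illustrative pattern, not flashquery's published internals; the `_inject_into_last` helper and the `fake_stream` source are hypothetical stand-ins:

```python
import asyncio
from types import SimpleNamespace

async def _inject_into_last(stream, payload):
    # Hypothetical helper: yield every item unchanged, holding one item
    # back so the custom payload can be attached to the final item.
    previous = None
    async for item in stream:
        if previous is not None:
            yield previous
        previous = item
    if previous is not None:
        previous.flashquery = payload  # custom field on the last chunk only
        yield previous

async def _collect():
    async def fake_stream():
        # Stand-in for an OpenAI streaming response.
        for text in ("Hel", "lo", "!"):
            yield SimpleNamespace(delta=text)
    return [chunk async for chunk in _inject_into_last(fake_stream(), {"done": True})]

chunks = asyncio.run(_collect())
```

Only the final chunk carries the injected field; earlier chunks pass through untouched, so consumers can stream normally and read the enrichment at the end.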

📦 Installation

pip install flashquery

🔥 Quickstart

Synchronous Example

from flashquery.client import CustomLangchainClient

client = CustomLangchainClient(
    provider="openai",
    model="gpt-4o-mini",
    temperature=0,
    api_key="sk-..."
)

response = client.generate([{"role": "user", "content": "Say hello?"}])

print(response.flashquery)  # Your custom field!

Streaming Example

import asyncio
from flashquery.client import CustomLangchainClient

async def main():
    client = CustomLangchainClient(
        provider="openai",
        model="gpt-4o-mini",
        temperature=0,
        api_key="sk-..."
    )
    stream = client.astream([{"role": "user", "content": "Hi, how are you?"}])

    async for chunk in stream:
        print("Chunk:", chunk.flashquery)

asyncio.run(main())

⚙️ How it Works

  • CustomOpenAIClient and CustomAsyncOpenAIClient inherit from the official OpenAI clients and override the .create() methods of the responses and chat.completions resources.
  • After each response is created, a .flashquery attribute is attached to it.
  • In streaming mode, .flashquery is set on the last chunk yielded by the generator.
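Conceptually, the subclassing step looks like the toy sketch below. The `BaseChatCompletions` stand-in and the `_enrich` hook are hypothetical; the real classes inherit from the official OpenAI client resources, whose internals are not reproduced here:

```python
from types import SimpleNamespace

class BaseChatCompletions:
    # Stand-in for the vendor's chat.completions resource.
    def create(self, **kwargs):
        return SimpleNamespace(choices=[SimpleNamespace(message="hello")])

class CustomChatCompletions(BaseChatCompletions):
    # Override .create() and attach the custom field to the response,
    # with no monkey-patching of the base class.
    def create(self, **kwargs):
        response = super().create(**kwargs)
        response.flashquery = self._enrich(response)
        return response

    def _enrich(self, response):
        # Hypothetical post-processing hook; real logic is library-defined.
        return {"n_choices": len(response.choices)}

resp = CustomChatCompletions().create(model="gpt-4o-mini")
```

Because the override calls `super().create()` and only decorates the return value, the vendor's behavior is untouched and other users of the base class are unaffected.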

🤝 Contributing

Pull requests are welcome! For major changes, please open an issue first to discuss what you would like to change.


📄 License

MIT License


Need help? Open an issue or contribute on GitHub!

Project details


Download files

Download the file for your platform.

Source Distribution

flashquery-0.0.3.tar.gz (4.2 kB)


Built Distribution


flashquery-0.0.3-py3-none-any.whl (4.5 kB)


File details

Details for the file flashquery-0.0.3.tar.gz.

File metadata

  • Download URL: flashquery-0.0.3.tar.gz
  • Size: 4.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

Hashes for flashquery-0.0.3.tar.gz

  • SHA256: 36cdf61b1bae7997b5aaa009d5b81e0f78f525ead5eda6338a715b6205b8ecc9
  • MD5: 86413e09c016abfd349617c6bc127621
  • BLAKE2b-256: 02d136126192464a80d8584910c2bf46039c163d00b3f4d509e42b0805b4cb40


File details

Details for the file flashquery-0.0.3-py3-none-any.whl.

File metadata

  • Download URL: flashquery-0.0.3-py3-none-any.whl
  • Size: 4.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

Hashes for flashquery-0.0.3-py3-none-any.whl

  • SHA256: 01fdb29a4ce90486d93984e1371f91e1ca9df23dc6768495487e81afc0feaeb6
  • MD5: 97aae81dcecf0a6de3609fcbc276ab3d
  • BLAKE2b-256: da05659d5c82aa249aec04e4a03b8a38d67e450db6e8f5d146a6091185228606

