
GPTQuery – Modular Tool Framework

GPTQuery is a modular Python framework for building, orchestrating, and deploying AI-powered tools across domains. It provides a clear, scalable architecture for integrating multiple AI providers (OpenAI, Perplexity, Claude, etc.) while maintaining robust error handling, throttling, and dynamic prompting.

The goal of this project is to let researchers (or anyone) create tools derived from custom prompts that can be easily integrated into research pipelines.

🚀 Features

  • Multi-Provider Support: Unified interface for OpenAI, Perplexity, Claude, and other AI APIs.
  • Modular Tool Design: Organize AI functionality into independent, reusable tool families.
  • Task Step Separation: Each tool can have multiple steps or functional modules.
  • Smart Defaults: Automatically selects optimal models or parameters per provider.
  • Safe Data Operations: Uses @requires_columns for safe DataFrame operations.
  • Advanced Throttling: Token bucket and adaptive rate-limiters for API management.
  • Robust Error Handling: Preserves partial results and gracefully handles failures.
  • Cost Optimization: Integrates token management to reduce API costs.
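The @requires_columns guard mentioned above can be sketched roughly as follows (a hypothetical reconstruction, not GPTQuery's actual code); the check works with any DataFrame-like object exposing a `.columns` attribute:

```python
import functools

def requires_columns(*required):
    """Fail fast if the DataFrame passed as the first argument lacks any
    of the required columns (illustrative sketch; the real decorator's
    signature and behavior may differ)."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(df, *args, **kwargs):
            missing = [c for c in required if c not in df.columns]
            if missing:
                raise ValueError(
                    f"{func.__name__} requires missing columns: {missing}"
                )
            return func(df, *args, **kwargs)
        return wrapper
    return decorator

@requires_columns("text")
def char_counts(df):
    # Works on a pandas DataFrame; any object with `.columns` and
    # item access behaves the same for the check above.
    return [len(t) for t in df["text"]]
```

Failing before the API call is made (rather than midway through a batch) is what makes the DataFrame operation "safe": no tokens are spent on malformed input.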

🏗️ Architecture Overview

GPTQuery is organized into three main layers:

  1. Core Infrastructure (core/)

    • Clients, logging, and utilities.
  2. Processing Utilities (processing/)

    • Helpers for throttling, token management, and miscellaneous runtime functions.
  3. Task-Specific Tools (tools/)

    • Each tool lives in its own namespace (e.g., tool_name).
    • Tools are subdivided into submodules/steps:
      • task.py → user-facing functions (run_*)
      • prompt.py → AI prompt definitions
      • log.py → logging utilities
    • The tool’s __init__.py exposes the main public API.
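The throttling utilities in processing/ are described as token-bucket rate limiters; a minimal, self-contained sketch of that technique (illustrative only, not GPTQuery's actual implementation):

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter (sketch; GPTQuery's throttling
    helpers in processing/ may be implemented differently).

    `rate` tokens accrue per second, up to `capacity`; each API call
    consumes one token, blocking until one is available."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def acquire(self, n=1):
        while True:
            now = time.monotonic()
            # Refill proportionally to elapsed time, capped at capacity.
            self.tokens = min(
                self.capacity, self.tokens + (now - self.last) * self.rate
            )
            self.last = now
            if self.tokens >= n:
                self.tokens -= n
                return
            # Sleep just long enough for the missing tokens to accrue.
            time.sleep((n - self.tokens) / self.rate)
```

The "adaptive" variant mentioned in the features would additionally adjust `rate` in response to provider feedback (e.g., HTTP 429 responses), which is omitted here.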

Example: A generic AI tool (tool_example)

tool_example/
├── step_one/
│   ├── task.py      # run_step_one
│   ├── prompt.py
│   └── log.py
├── step_two/
│   ├── task.py      # run_step_two
│   ├── prompt.py
│   └── log.py
└── __init__.py      # exposes run_step_one, run_step_two
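To make the layout concrete, a hypothetical step_one/task.py might look like this (everything beyond the run_step_one name is invented for illustration; the real client and prompt APIs may differ):

```python
# tool_example/step_one/task.py (illustrative sketch)

# In the real layout this template would live in step_one/prompt.py.
PROMPT_TEMPLATE = "Summarize the following text:\n\n{text}"

def run_step_one(texts, client):
    """User-facing entry point for step one.

    `client` stands in for one of GPTQuery's provider clients and is
    assumed here to expose a `.complete(prompt) -> str` method."""
    results = []
    for text in texts:
        prompt = PROMPT_TEMPLATE.format(text=text)
        try:
            results.append(client.complete(prompt))
        except Exception as exc:
            # Preserve partial results instead of failing the whole batch,
            # matching the "robust error handling" goal above.
            results.append(f"<error: {exc}>")
    return results
```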

🛠️ Installation

Option 1: Install from PyPI (recommended)

Install the latest stable release directly from PyPI using pip:

pip install gptquerytools

Option 2: Clone the repository (for development and custom tools)

If you want to contribute or build custom tools based on GPTQuery, clone the repo and install dependencies:

# 1. Clone the repo:
git clone https://github.com/mauriciomm7/gptquery.git
cd gptquery
# 2. Install dependencies:
pip install -r requirements.txt

🎓 Citation

If you use this framework in academic research, please cite:

Mandujano Manríquez, M. (2025). GPTQuery: Modular framework for building and orchestrating AI-powered research tools. GitHub: https://github.com/mauriciomm7/gptquery

@misc{mandujano2025gptquery,
  author       = {Mauricio Mandujano Manríquez},
  title        = {GPTQuery: Modular framework for building and orchestrating AI-powered research tools},
  year         = {2025},
  howpublished = {\url{https://github.com/mauriciomm7/gptquery}},
  note         = {GitHub repository}
}

🤝 Contributing

To integrate your tool into GPTQuery, please:

  • Place your tool inside the tools/ directory following the existing modular structure.

  • Organize your tool with submodules for each step or feature, including:

    • task.py for core functionality,
    • prompt.py for AI prompts,
    • log.py for logging.
  • Expose your tool’s API in its __init__.py.

  • Write tests and update documentation as needed.

  • Submit a pull request with clear explanations of your additions.
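For the testing step, a tool's tests can stub the provider client so they run offline without API keys; a sketch (all names hypothetical):

```python
class EchoClient:
    """Stub provider client so tests run offline, without API keys."""
    def complete(self, prompt):
        return prompt.upper()

def run_my_step(texts, client):
    """Hypothetical run_* entry point for a contributed tool."""
    return [client.complete(t) for t in texts]

def test_run_my_step():
    # Deterministic stub output makes the assertion exact.
    assert run_my_step(["hi there"], EchoClient()) == ["HI THERE"]

test_run_my_step()
```

Accepting the client as a parameter (rather than constructing it inside run_*) is what makes this substitution possible.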

Thank you for contributing to GPTQuery!

🙏 Acknowledgments

  • CI/CD automation using GitHub Actions — Automating build, test, and deployment workflows
  • tokencost — Token cost estimation for language models
  • tiktoken — Tokenization library used for accurate token counts
  • IPython — Interactive computing environment and enhanced Python shell

📄 License

This project is licensed under the MIT License.
