
A Python package for getting responses from different LLMs


🔷 Kynex

Kynex is a modular, pluggable Python framework that simplifies integrating multiple LLM providers such as Google Gemini, Groq, and Ollama through a unified, flexible interface.

Whether you're building AI workflows, chatbots, or prompt-based tools, Kynex allows seamless integration with different LLMs — all through a single API.


🚀 Features

  • Multi-LLM Support: Easily switch between Gemini, Groq, and Ollama with a single interface.
  • Dynamic Inputs: Accept LLM type, model name, API keys, and host dynamically at runtime.
  • LangChain Prompt Templates: Built-in LangChain PromptTemplate for clean prompt formatting.
  • Pluggable Architecture: Easily extend to new LLMs in the future.
  • Ready for Local & Remote Deployments: Supports local Ollama and remote LLM services.
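The pluggable architecture described above can be sketched as a simple provider registry: each LLM type maps to a handler function, and a new provider plugs in by registering one. This is an illustrative sketch only; the names `PROVIDERS` and `dispatch` are hypothetical and not part of the kynex API.

```python
# Illustrative sketch of a pluggable provider registry (not kynex internals).

def gemini_handler(prompt, **kwargs):
    # Stand-in for a call to the Gemini backend.
    return f"[gemini] {prompt}"

def groq_handler(prompt, **kwargs):
    # Stand-in for a call to the Groq backend.
    return f"[groq] {prompt}"

# Registering a new provider is just adding an entry here.
PROVIDERS = {
    "gemini": gemini_handler,
    "groq": groq_handler,
}

def dispatch(prompt, llm_type="gemini", **kwargs):
    try:
        handler = PROVIDERS[llm_type]
    except KeyError:
        raise ValueError(f"Unsupported llm_type: {llm_type!r}")
    return handler(prompt, **kwargs)

print(dispatch("hello", llm_type="groq"))  # → [groq] hello
```

The benefit of this shape is that callers only ever see one entry point, which matches the single-API design the features list describes.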

📦 Installation

pip install kynex

Example Usage:

Create a Python file and import the connector:

from kynex.LLMTools import LLMConnector

Google Gemini Example:


if __name__ == "__main__":
    # Simulated input from a frontend
    request = {
        "prompt": "what is FastAPI",
        "model_name": "your_model",   # e.g. "gemini-1.5-flash"
        "api_key": "your_api_key",
        "llm_type": LLMConnector.LLM_GEMINI
    }

    response = LLMConnector.get_llm_response(
        prompt=request["prompt"],
        model_name=request["model_name"],
        api_key=request["api_key"],
        llm_type=request.get("llm_type")  # May be None; defaults to Gemini
    )

    print("\n🔹 Response:\n")
    print(response)
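The fallback behavior hinted at by `request.get("llm_type")` can be made explicit with a small helper that substitutes the default provider when the field is missing. `resolve_llm_type` is a hypothetical name used for illustration, not part of kynex.

```python
def resolve_llm_type(llm_type, default="gemini"):
    """Return the requested LLM type, or the default when it is None or empty."""
    return llm_type or default

# A request that omits llm_type falls back to the default provider.
request = {"prompt": "what is FastAPI"}
print(resolve_llm_type(request.get("llm_type")))  # prints "gemini"
```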



Groq Example:

from kynex.LLMTools import LLMConnector


if __name__ == "__main__":
    request = {
        "prompt": "your_prompt",
        "model_name": "your_model",  # e.g. a Groq-hosted model
        "api_key": "your_api_key",
        "llm_type": "groq"
    }

    response = LLMConnector.get_llm_response(
        prompt=request["prompt"],
        model_name=request["model_name"],
        api_key=request["api_key"],
        llm_type=request.get("llm_type")
    )

    print("\n🔹 Groq Response:\n")
    print(response)

Ollama Example (Local or Remote):

from kynex.LLMTools import LLMConnector

response = LLMConnector.get_llm_response(
    prompt="your_prompt",
    model_name="your_model",  # e.g. "llama3"
    llm_type=LLMConnector.LLM_OLLAMA,
    host="your_host"  # or a remote URL if exposed via proxy
)

print(response)
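For the local case, Ollama's daemon listens on http://localhost:11434 by default, so host selection can be reduced to a one-line fallback. The helper below is a hypothetical sketch, not part of the kynex API.

```python
def ollama_host(host=None):
    # Fall back to the standard local Ollama daemon address when no
    # explicit host (e.g. a remote proxy URL) is supplied.
    return host or "http://localhost:11434"

print(ollama_host())                          # local default
print(ollama_host("https://ollama.example"))  # explicit remote host
```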



Download files

Download the file for your platform.

Source Distribution

kynex-0.2.5.tar.gz (6.8 kB)

Uploaded Source

Built Distribution


kynex-0.2.5-py3-none-any.whl (7.8 kB)

Uploaded Python 3

File details

Details for the file kynex-0.2.5.tar.gz.

File metadata

  • Download URL: kynex-0.2.5.tar.gz
  • Upload date:
  • Size: 6.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.11

File hashes

Hashes for kynex-0.2.5.tar.gz:

  • SHA256: e7eb81cc757d426ed00864a080128c27c1cec0bd2990934f32b56e49edf2fbdc
  • MD5: 88e8d142372b427428165b537aab32b6
  • BLAKE2b-256: e8f17b31e4fee4cfea2c1c61533a47c8f1274efa6107f37a18efb7a8cde01052


File details

Details for the file kynex-0.2.5-py3-none-any.whl.

File metadata

  • Download URL: kynex-0.2.5-py3-none-any.whl
  • Upload date:
  • Size: 7.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.11

File hashes

Hashes for kynex-0.2.5-py3-none-any.whl:

  • SHA256: 6b29055cd32a28426ab2f22071c754a6f28d595a5730dda8f86f6479ce942bce
  • MD5: 9dce5df2364ade3de2951e848a39685e
  • BLAKE2b-256: 50a5a0b8bebc7ff162770b023300bcfbb83cc5ee1dcd8553d98d6a2929961c5c

