A Python package for getting responses from different LLMs

Project description

🔷 Kynex

Kynex is a modular, pluggable Python framework that simplifies integrating multiple LLM providers such as Google Gemini, Groq, and Ollama through a unified, flexible interface.

Whether you're building AI workflows, chatbots, or prompt-based tools, Kynex allows seamless integration with different LLMs — all through a single API.


🚀 Features

  • Multi-LLM Support: Easily switch between Gemini, Groq, and Ollama with a single interface.
  • Dynamic Inputs: Accept LLM type, model name, API keys, and host dynamically at runtime.
  • LangChain Prompt Templates: Built-in LangChain PromptTemplate for clean prompt formatting (see the sketch after this list).
  • Pluggable Architecture: Easily extend to new LLMs in the future.
  • Ready for Local & Remote Deployments: Supports local Ollama and remote LLM services.
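
The PromptTemplate mentioned above is LangChain's public prompt-formatting class. As a minimal standalone sketch of what that formatting looks like, the snippet below uses LangChain directly and does not touch Kynex's internals (depending on your LangChain version, the import may live in langchain_core.prompts instead):

from langchain.prompts import PromptTemplate

# A template with one input variable; format() fills it in before the
# prompt is sent to an LLM.
template = PromptTemplate.from_template("Explain {topic} in two sentences.")
print(template.format(topic="FastAPI"))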

📦 Installation

pip install kynex
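
To confirm the install, the connector import used throughout the examples below can be checked from the command line:

python -c "from kynex.LLMTools import LLMConnector"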

Example Usage:

Create a Python file and import the connector:

from kynex.LLMTools import LLMConnector

Google Gemini Example:


if __name__ == "__main__":
    # Simulated input from a frontend
    request = {
        "prompt": "What is FastAPI?",
        "model_name": "gemini-1.5-flash",
        "api_key": "your_api_key",
        "llm_type": LLMConnector.LLM_GEMINI  # the constant itself, not a string
    }

    response = LLMConnector.get_llm_response(
        prompt=request["prompt"],
        model_name=request["model_name"],
        api_key=request["api_key"],
        llm_type=request.get("llm_type")  # Can be None; defaults to Gemini
    )

    print("\n🔹 Response:\n")
    print(response)



Groq Example:

from kynex.LLMTools import LLMConnector


if __name__ == "__main__":
    request = {
        "prompt": "Give 3 project use cases for a full-stack web developer.",
        "model_name": "meta-llama/llama-4-scout-17b-16e-instruct",  # ✅ Groq model
        "api_key": "your_api_key",
        "llm_type": "groq"
    }

    response = LLMConnector.get_llm_response(
        prompt=request["prompt"],
        model_name=request["model_name"],
        api_key=request["api_key"],
        llm_type=request.get("llm_type")
    )

    print("\n🔹 Groq LLaMA-4 Response:\n")
    print(response)

Ollama Example (Local or Remote):

from kynex.LLMTools import LLMConnector

response = LLMConnector.get_llm_response(
    prompt="your_prompt",
    model_name="llama3",
    llm_type=LLMConnector.LLM_OLLAMA,
    host="your_host"  # e.g. "http://localhost:11434" for a local instance, or a remote URL if exposed via proxy
)

print(response)
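
Since all three providers go through the same get_llm_response entry point, the provider can also be selected at runtime, for example from environment variables. Below is a minimal sketch assuming the parameter names shown above; the KYNEX_* variable names are illustrative, not part of the package:

import os
from kynex.LLMTools import LLMConnector

# Illustrative environment variables; any config source would work.
response = LLMConnector.get_llm_response(
    prompt="Summarize the benefits of a unified LLM interface.",
    model_name=os.environ.get("KYNEX_MODEL_NAME", "gemini-1.5-flash"),
    api_key=os.environ.get("KYNEX_API_KEY", "your_api_key"),
    llm_type=os.environ.get("KYNEX_LLM_TYPE")  # None falls back to Gemini
)
print(response)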

