Intelligent Research and Experimentation AI for LLM experimentation and production.

Project description

Intura-AI: Intelligent Research and Experimentation AI

PyPI version LangChain Compatible

intura-ai is a Python package designed to streamline LLM experimentation and production. It provides tools for logging LLM usage and managing experiment predictions, with seamless LangChain compatibility.

Dashboard: dashboard.intura.co

Features

  • Experiment Prediction:
    • ChatModelExperiment: Facilitates the selection and execution of LangChain models based on experiment configurations.
  • LangChain Compatibility:
    • Designed to integrate smoothly with LangChain workflows.

Installation

pip install intura-ai

Usage

Initialization

Before using intura-ai, you need to initialize the client with your API key.

import os
from intura_ai.client import intura_initialization

INTURA_API_KEY = "..."
intura_initialization(INTURA_API_KEY)
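Hard-coding the key is shown above for brevity; a common pattern (not specific to intura-ai) is to load it from an environment variable so it stays out of source control. A minimal sketch, assuming the variable name INTURA_API_KEY:

```python
import os

# Read the Intura API key from the environment; the fallback placeholder
# is only for illustration and will not authenticate.
INTURA_API_KEY = os.environ.get("INTURA_API_KEY", "<YOUR_API_KEY>")
```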

Experiment Prediction

Use ChatModelExperiment to fetch and execute pre-configured LangChain models.

from intura_ai.experiments import ChatModelExperiment

EXPERIMENT_ID = "..."
client = ChatModelExperiment(EXPERIMENT_ID)

choiced_model, model_config, chat_prompts = client.build(
    features={
        "user_id": "Rama12345", 
        "membership": "FREE", 
        "employment_type": "FULL_TIME",
        "feature_x": "your custom features"
    }
)
chat_prompts.append(('human', "give me today's quote for programmers"))

print(client.choiced_model)  # The selected model, e.g. claude-3-5-sonnet-20240620

# Set API keys as environment variables
import os

os.environ["GOOGLE_API_KEY"] = "xxx"
os.environ["ANTHROPIC_API_KEY"] = "xxx"
os.environ["DEEPSEEK_API_KEY"] = "xxx"
os.environ["OPENAI_API_KEY"] = "xxx"

model = choiced_model(**model_config)

# Or pass the API key as a parameter

model = choiced_model(**model_config, api_key="<YOUR_API_KEY>")

# Run inference
response = model.invoke(chat_prompts)
print(response.content)
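The chat_prompts value used above is a list of (role, content) tuples, the shorthand message format LangChain chat models accept. A standalone sketch of building such a list (the roles and wording here are illustrative, not part of intura-ai):

```python
# Build a prompt as (role, content) tuples; "system" sets behaviour,
# "human" carries the user's message.
chat_prompts = [
    ("system", "You are a helpful assistant for software engineers."),
]
chat_prompts.append(("human", "give me today's quote for programmers"))

# Each entry is a 2-tuple of strings, in conversation order.
roles = [role for role, _ in chat_prompts]
```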

Contributing

Contributions are welcome! Please feel free to submit pull requests or open issues for bug reports or feature requests.

License

This project is licensed under the MIT License.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

intura_ai-0.0.3.4.tar.gz (10.1 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

intura_ai-0.0.3.4-py3-none-any.whl (12.6 kB)

Uploaded Python 3

File details

Details for the file intura_ai-0.0.3.4.tar.gz.

File metadata

  • Download URL: intura_ai-0.0.3.4.tar.gz
  • Upload date:
  • Size: 10.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.16

File hashes

Hashes for intura_ai-0.0.3.4.tar.gz

  • SHA256: 7e5d57537c8b8971d3ef1723134dd50f8ef2be5f7dc3f3ae91d1e920f5baed1f
  • MD5: 08677de07cb0c65c6087b8d1a64021d3
  • BLAKE2b-256: 9ab83f68bb5cbf5e68ca0046e468add43f127436bf1635472af8f350f2bb48d7

See more details on using hashes here.

File details

Details for the file intura_ai-0.0.3.4-py3-none-any.whl.

File metadata

  • Download URL: intura_ai-0.0.3.4-py3-none-any.whl
  • Upload date:
  • Size: 12.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.16

File hashes

Hashes for intura_ai-0.0.3.4-py3-none-any.whl

  • SHA256: e4081247b032cb87b96583630b8d8f3127d83e335ee5f9c6aca4da65ff1028e4
  • MD5: b13d783f1447d01e103311f327d07b40
  • BLAKE2b-256: bd54723292add57e399271af515f73e0ec53d3c63608a9ad942f1d0825b11a3d

See more details on using hashes here.
