Intelligent Research and Experimentation AI for LLM experimentation and production.
Project description
Intura-AI: Intelligent Research and Experimentation AI
intura-ai is a Python package designed to streamline LLM experimentation and production. It provides tools for logging LLM usage and managing experiment predictions, with seamless LangChain compatibility.
Dashboard: dashboard.intura.co
Features
- Experiment Prediction:
  - ChatModelExperiment: facilitates the selection and execution of LangChain models based on experiment configurations.
- LangChain Compatibility:
  - Designed to integrate smoothly with LangChain workflows.
Installation
pip install intura-ai
Usage
Initialization
Before using intura-ai, you need to initialize the client with your API key.
from intura_ai.client import intura_initialization

INTURA_API_KEY = "..."
intura_initialization(INTURA_API_KEY)
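Rather than hardcoding the key, a common pattern is to read it from the environment before calling intura_initialization. A minimal sketch, assuming you export a variable named INTURA_API_KEY in your shell (the variable name is a convention, not mandated by the package):

```python
import os

# Read the Intura API key from the environment instead of hardcoding it.
# An empty string here means the variable has not been exported yet.
INTURA_API_KEY = os.environ.get("INTURA_API_KEY", "")
```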
Experiment Prediction
Use ChatModelExperiment to fetch and execute pre-configured LangChain models.
from intura_ai.experiments import ChatModelExperiment
EXPERIMENT_ID = "..."
client = ChatModelExperiment(EXPERIMENT_ID)
choiced_model, model_config, chat_prompts = client.build(
    features={
        "user_id": "Rama12345",
        "membership": "FREE",
        "employment_type": "FULL_TIME",
        "feature_x": "your custom features"
    }
)
chat_prompts.append(('human', "Give me today's quote for programmers"))
print(client.choiced_model)  # The chosen model, e.g. claude-3-5-sonnet-20240620
# Option 1: set the provider API key as an environment variable
import os
os.environ["GOOGLE_API_KEY"] = "xxx"
os.environ["ANTHROPIC_API_KEY"] = "xxx"
os.environ["DEEPSEEK_API_KEY"] = "xxx"
os.environ["OPENAI_API_KEY"] = "xxx"
model = choiced_model(**model_config)
# Option 2: pass the API key as a parameter
model = choiced_model(**model_config, api_key="<YOUR_API_KEY>")
# Inference
model.invoke(chat_prompts)
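LangChain chat models accept a list of (role, content) tuples, which is the shape of the chat_prompts list built above. A minimal, dependency-free sketch of that message format (the content strings here are illustrative, not part of the intura-ai API):

```python
# chat_prompts, as returned by client.build(), is a list of (role, content)
# tuples; appending a "human" turn mirrors the snippet above.
chat_prompts = [
    ("system", "You are a helpful assistant for developers."),
]
chat_prompts.append(("human", "Give me today's quote for programmers"))

# The first element of each tuple is the message role.
roles = [role for role, _ in chat_prompts]
```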
Contributing
Contributions are welcome! Please feel free to submit pull requests or open issues for bug reports or feature requests.
License
This project is licensed under the MIT License.
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file intura_ai-0.0.3.4.tar.gz.
File metadata
- Download URL: intura_ai-0.0.3.4.tar.gz
- Upload date:
- Size: 10.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.10.16
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 7e5d57537c8b8971d3ef1723134dd50f8ef2be5f7dc3f3ae91d1e920f5baed1f |
| MD5 | 08677de07cb0c65c6087b8d1a64021d3 |
| BLAKE2b-256 | 9ab83f68bb5cbf5e68ca0046e468add43f127436bf1635472af8f350f2bb48d7 |
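To check a downloaded file against the digests listed above, you can compute its SHA-256 locally. A short sketch using only the standard library (the commented filename and digest are the ones from this release):

```python
import hashlib

def sha256_hex(path):
    """Return the SHA-256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# After downloading, compare against the table above:
# assert sha256_hex("intura_ai-0.0.3.4.tar.gz") == \
#     "7e5d57537c8b8971d3ef1723134dd50f8ef2be5f7dc3f3ae91d1e920f5baed1f"
```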
File details
Details for the file intura_ai-0.0.3.4-py3-none-any.whl.
File metadata
- Download URL: intura_ai-0.0.3.4-py3-none-any.whl
- Upload date:
- Size: 12.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.10.16
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e4081247b032cb87b96583630b8d8f3127d83e335ee5f9c6aca4da65ff1028e4 |
| MD5 | b13d783f1447d01e103311f327d07b40 |
| BLAKE2b-256 | bd54723292add57e399271af515f73e0ec53d3c63608a9ad942f1d0825b11a3d |