# Regolo.ai Python Client

A simple Python client for interacting with Regolo.ai's LLM-based API.
## Installation

Ensure you have the `regolo` module installed. If not, install it using:

```shell
pip install regolo
```
## Basic Usage

### 1. Import the regolo module

```python
import regolo
```
### 2. Set Up a Default API Key and Model

To avoid manually passing the API key and model in every request, you can set them globally:

```python
regolo.default_key = "<EXAMPLE_KEY>"
regolo.default_model = "meta-llama/Llama-3.3-70B-Instruct"
```

This ensures that all `RegoloClient` instances and static functions use the specified API key and model. You can still override them on individual calls by passing the model and key directly.
### 3. Perform a Basic Request

Completion:

```python
print(regolo.static_completions(prompt="Tell me something about Rome."))
```

Chat completion:

```python
print(regolo.static_chat_completions(messages=[{"role": "user", "content": "Tell me something about Rome."}]))
```
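The `messages` argument follows the common chat-completions shape: a list of dicts, each with a `role` and a `content` key. A minimal sketch of building a multi-turn message list (plain Python, no API call; the reply text is illustrative):

```python
# Start a chat history in the role/content format shown above.
messages = [{"role": "user", "content": "Tell me something about Rome."}]

# Append the assistant's reply and a follow-up question to continue the conversation.
messages.append({"role": "assistant", "content": "Rome is the capital of Italy."})
messages.append({"role": "user", "content": "What about its history?"})

for msg in messages:
    print(f'{msg["role"]}: {msg["content"]}')
```

Passing the full list back on each request is what preserves context across turns.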
## Other Usages

### Handling Streams

With full output:

```python
import regolo

regolo.default_key = "<EXAMPLE_KEY>"
regolo.default_model = "meta-llama/Llama-3.3-70B-Instruct"

# Completions
client = regolo.RegoloClient()
response = client.completions("Tell me about Rome in a concise manner", full_output=True, stream=True)

while True:
    try:
        print(next(response))
    except StopIteration:
        break

# Chat completions
client = regolo.RegoloClient()
response = client.run_chat(user_prompt="Tell me about Rome in a concise manner", full_output=True, stream=True)

while True:
    try:
        print(next(response))
    except StopIteration:
        break
```
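Since the streamed response is an iterator, the explicit `while`/`next()`/`StopIteration` pattern above is equivalent to a plain `for` loop. A minimal sketch using a stand-in generator in place of the real streamed response (no API call):

```python
def fake_stream():
    # Stand-in for the generator returned by a streaming call such as
    # client.completions(..., stream=True); yields text chunks.
    yield "Rome "
    yield "is "
    yield "ancient."

# A for loop drains the generator and stops automatically at StopIteration.
chunks = []
for chunk in fake_stream():
    chunks.append(chunk)

print("".join(chunks))  # Rome is ancient.
```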
Without full output:

```python
import regolo

regolo.default_key = "<EXAMPLE_KEY>"
regolo.default_model = "meta-llama/Llama-3.3-70B-Instruct"

# Completions
client = regolo.RegoloClient()
response = client.completions("Tell me about Rome in a concise manner", full_output=False, stream=True)

while True:
    try:
        print(next(response), end='', flush=True)
    except StopIteration:
        break

# Chat completions
client = regolo.RegoloClient()
response = client.run_chat(user_prompt="Tell me about Rome in a concise manner", full_output=False, stream=True)

while True:
    try:
        res = next(response)
        if res[0]:
            print(res[0] + ":")
        print(res[1], end="", flush=True)
    except StopIteration:
        break
```
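As the chat loop above suggests, each chunk from the chat stream without full output carries a role field (`res[0]`) and a text field (`res[1]`), with the role typically present only on the first chunk. A stand-in sketch of accumulating such a stream into a single string (the tuple shape is an assumption inferred from the loop above, not the documented API):

```python
def fake_chat_stream():
    # Stand-in for the (role, text) tuples the loop above unpacks as res[0]/res[1];
    # the role arrives with the first chunk only.
    yield ("assistant", "Rome ")
    yield ("", "is ")
    yield ("", "ancient.")

role, parts = None, []
for chunk_role, text in fake_chat_stream():
    if chunk_role:          # keep the role from the first chunk that carries one
        role = chunk_role
    parts.append(text)      # accumulate the text of every chunk

print(f'{role}: {"".join(parts)}')  # assistant: Rome is ancient.
```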
### Handling Chat Through add_prompt_to_chat()

```python
import regolo

client = regolo.RegoloClient()

# Make a request
client.add_prompt_to_chat(role="user", prompt="Tell me about Rome!")
print(client.run_chat())

# Continue the conversation
client.add_prompt_to_chat(role="user", prompt="Tell me something more about it!")
print(client.run_chat())

# You can print the whole conversation if needed
print(client.instance.get_conversation())
```

Note that using the `user_prompt` parameter in `run_chat()` is equivalent to adding a prompt with `role="user"` through `add_prompt_to_chat()`.
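Conceptually, `add_prompt_to_chat()` appends one turn to a running role/content history that is resent on each `run_chat()` call. A minimal stand-in sketch of that bookkeeping (a hypothetical `Chat` class for illustration, not the regolo implementation):

```python
class Chat:
    """Hypothetical stand-in for the conversation state a client keeps."""

    def __init__(self):
        self.history = []

    def add_prompt(self, role, prompt):
        # Mirrors the idea of add_prompt_to_chat(): append one turn to the history.
        self.history.append({"role": role, "content": prompt})

    def get_conversation(self):
        # Mirrors the idea of get_conversation(): return every turn so far.
        return self.history


chat = Chat()
chat.add_prompt("user", "Tell me about Rome!")
chat.add_prompt("assistant", "Rome is the capital of Italy.")
chat.add_prompt("user", "Tell me something more about it!")
print(chat.get_conversation())
```

Keeping the full history is what lets a follow-up like "Tell me something more about it!" resolve "it" to Rome.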
## File Details

### regolo-1.0.0.tar.gz

- Download URL: regolo-1.0.0.tar.gz
- Upload date:
- Size: 10.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.8

| Algorithm | Hash digest |
|---|---|
| SHA256 | dff8567475781f72caf71a423312806aceca73aedd8855e0493d89795829af19 |
| MD5 | 5d4eedcc5143bad771de95142d2291ce |
| BLAKE2b-256 | 722a573a48b6d266f4cf72ddbbb08be1f987535223add998aaa71a025a7044ca |
### regolo-1.0.0-py3-none-any.whl

- Download URL: regolo-1.0.0-py3-none-any.whl
- Upload date:
- Size: 10.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.8

| Algorithm | Hash digest |
|---|---|
| SHA256 | 3da64e2829675db61260e66ea48db25e913f97fb902abda1639ab5b00033f50f |
| MD5 | 795c9675fb838b590dadc1698516c00f |
| BLAKE2b-256 | 5e12db419c7898126205df1e7ad0637f78762bec393aee4060a6072e89e53cbd |