An unofficial async/sync client library for Straico API
Async/Sync Client Library for the Straico API
A client-side implementation of the Straico API.
Installation
# install from PyPI
pip install aio-straico
Usage
Please see the official Straico API documentation: https://documenter.getpostman.com/view/5900072/2s9YyzddrR
Basic Prompt Completion
from aio_straico import straico_client
from aio_straico.utils import cheapest_model


def main():
    with straico_client(API_KEY="ko-11111111111111111111111111") as client:
        user_info = client.user()
        print(user_info)
        """
        {'coins': 100000.00,
         'first_name': 'User',
         'last_name': 'Name',
         'plan': 'License Tier 1'}
        """

        models = client.models()
        cheapest_chat_model = cheapest_model(models)
        print(cheapest_chat_model)
        """
        {'name': 'Google: Gemma 2 27B',
         'model': 'google/gemma-2-27b-it',
         'word_limit': 3072,
         'pricing': {'coins': 0.4, 'words': 100}}
        """

        reply = client.prompt_completion(cheapest_chat_model, "Hello there")
        print(reply["completion"]["choices"][0]["message"]["content"])
        """
        General Kenobi! 👋
        What can I do for you today? 😊
        """


if __name__ == "__main__":
    main()
Async Basic Prompt Completion
import asyncio

from aio_straico import aio_straico_client
from aio_straico.utils import cheapest_model


async def main():
    async with aio_straico_client(API_KEY="ko-11111111111111111111111111") as client:
        user_info = await client.user()
        print(user_info)
        """
        {'coins': 100000.00,
         'first_name': 'User',
         'last_name': 'Name',
         'plan': 'License Tier 1'}
        """

        models = await client.models()
        cheapest_chat_model = cheapest_model(models)
        print(cheapest_chat_model)
        """
        {'name': 'Google: Gemma 2 27B',
         'model': 'google/gemma-2-27b-it',
         'word_limit': 3072,
         'pricing': {'coins': 0.4, 'words': 100}}
        """

        reply = await client.prompt_completion(cheapest_chat_model, "Hello there")
        print(reply["completion"]["choices"][0]["message"]["content"])
        """
        General Kenobi! 👋
        What can I do for you today? 😊
        """


asyncio.run(main())
When API_KEY is not passed to aio_straico_client, the client uses the value of the STRAICO_API_KEY environment variable. If the environment variable is not set either, the client raises an error.
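A minimal sketch of the environment-variable fallback (assuming STRAICO_API_KEY has already been exported in your shell):

import asyncio
import os

from aio_straico import aio_straico_client

# Assumes the key was exported beforehand, e.g.:
#   export STRAICO_API_KEY="ko-11111111111111111111111111"
assert "STRAICO_API_KEY" in os.environ


async def main():
    # No API_KEY argument; the environment variable is used instead
    async with aio_straico_client() as client:
        print(await client.user())


asyncio.run(main())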
You can also set the model name manually instead of using cheapest_model:
reply = await client.prompt_completion("openai/gpt-4o-mini", "Hello there")
print(reply["completion"]["choices"][0]["message"]["content"])
"""
General Kenobi! 👋
What can I do for you today? 😊
"""
Example Async Code
The remaining examples are written as async code. They can also be run in non-async mode by removing await and using straico_client instead, as shown in the "Basic Prompt Completion" section; a synchronous sketch follows below.
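For instance, here is a minimal synchronous sketch of the manual model-name call above:

from aio_straico import straico_client

with straico_client(API_KEY="ko-11111111111111111111111111") as client:
    reply = client.prompt_completion("openai/gpt-4o-mini", "Hello there")
    print(reply["completion"]["choices"][0]["message"]["content"])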
Add File Attachment and Transcript
from pathlib import Path

mp3_files = [*Path("test_data/audio/").glob("*.mp3")]
response = await client.prompt_completion(
    "openai/gpt-4o-mini",
    "summarize the main points",
    files=mp3_files,
    display_transcripts=True,
)
print("## Summary")
print(
    response["completions"]["openai/gpt-4o-mini"]["completion"]["choices"][0][
        "message"
    ]["content"]
)
print("## Transcript")
for transcript in response["transcripts"]:
    print("Name:", transcript["name"])
    print("Transcript:", transcript["text"])
    print()
"""
## Summary
The . . .
## Transcript
Name: . . .
Transcript: . . .
"""
Add YouTube URL and Transcript
youtube_url = "https://www.youtube.com/watch?v=zWPe_CUR4yU"
response = await client.prompt_completion(
    "openai/gpt-4o-mini",
    "summarize the main points",
    youtube_urls=youtube_url,
    display_transcripts=True,
)
print("## Summary")
print(
    response["completions"]["openai/gpt-4o-mini"]["completion"]["choices"][0][
        "message"
    ]["content"]
)
print("## Transcript")
for transcript in response["transcripts"]:
    print("Name:", transcript["name"])
    print("Transcript:", youtube_trasncript_to_plain_text(transcript["text"]))
    print()
"""
## Summary
The . . .
## Transcript
Name: . . .
Transcript: . . .
"""
Image Generation
Generate images and download zip file to local directory
model ="openai/dall-e-3"
directory = Path(".")
zip_file_path = await client.image_generation_as_zipfile(
model=model,
description="A cute cat",
size=ImageSize.square,
variations=4,
destination_zip_path=directory,
)
Generate images and download image files to local directory
model ="openai/dall-e-3"
directory = Path(".")
image_paths = await client.image_generation_as_images(
model=model,
description="A cute cat",
size=ImageSize.landscape,
variations=4,
destination_zip_path=directory,
)
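The snippets above assume an already-open client plus Path and the package's ImageSize enum in scope. Below is a minimal end-to-end sketch; the ImageSize import path is an assumption, so check the package for its actual location:

import asyncio
from pathlib import Path

from aio_straico import aio_straico_client
from aio_straico.utils import ImageSize  # assumed import path; adjust to the actual package layout


async def generate():
    # Relies on STRAICO_API_KEY being set in the environment, as described above
    async with aio_straico_client() as client:
        zip_file_path = await client.image_generation_as_zipfile(
            model="openai/dall-e-3",
            description="A cute cat",
            size=ImageSize.square,
            variations=4,
            destination_zip_path=Path("."),
        )
        print(zip_file_path)


asyncio.run(generate())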