loam-ai
CLI tool for AWS Bedrock model invocation
A command-line interface (CLI) tool for interacting with AWS Bedrock foundation models and inference profiles. LoamAI provides a streamlined way to invoke AI models, generate embeddings, and manage conversations through AWS Bedrock.
Features
- List available foundation models and inference profiles
- Generate text and image embeddings
- Stream model responses for real-time output
- Support for conversation-based model interactions
- Manage multiple AWS profiles and regions
- Rich terminal output formatting
Installation
Requires Python 3.10 or higher.
pip install loam-ai
Configuration
AWS SSO Login
If you're using AWS SSO, first configure your AWS profile and log in:
export AWS_PROFILE=your-sso-profile
aws sso login
After successful login, you can run loam-ai commands using your SSO credentials.
Usage
List Available Models
loam list-models
Filter by provider or output type:
loam list-models --provider anthropic
loam list-models --output TEXT
Check Session Information
loam session
Generate Embeddings
Generate text embeddings:
loam generate-embeddings \
--model-id amazon.titan-embed-text-v2:0 \
--input-file texts.txt
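The expected layout of texts.txt is not specified above; assuming one text per line (a common convention for batch embedding inputs, not confirmed by the loam-ai docs), the input file can be prepared like this:

```python
# Prepare an input file for `loam generate-embeddings --input-file texts.txt`.
# NOTE: the one-text-per-line layout is an assumption, not confirmed by loam-ai.
texts = [
    "Renewable energy reduces carbon emissions.",
    "Solar panels convert sunlight into electricity.",
]
with open("texts.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(texts))
```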
Generate image embeddings:
loam generate-embeddings \
--model-id amazon.titan-embed-image-v1 \
--image image.jpg \
--texts "Image description"
Invoke Models
Simple text generation:
loam invoke \
-m amazon.nova-lite-v1:0 \
-p "What are the benefits of renewable energy?"
Conversation Mode
Use conversation mode for chat-based interactions:
loam converse \
--model-id "anthropic.claude-3-sonnet-20240229-v1:0" \
--messages-file conversation.json
Example messages file format:
[
{
"role": "user",
"content": [{ "text": "What is the capital of France?" }]
}
]
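For longer conversations, it can be easier to build the messages file programmatically. A minimal sketch using only the standard library, following the message schema shown above:

```python
import json

# Each message has a "role" and a list of content blocks with a "text" field,
# matching the example messages file format above.
messages = [
    {"role": "user", "content": [{"text": "What is the capital of France?"}]},
]

with open("conversation.json", "w", encoding="utf-8") as f:
    json.dump(messages, f, indent=2)
```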
List Inference Profiles
loam list-inference-profiles
Command Options
Global Options
- --profile: AWS profile name
- --region: AWS region
- --debug: Enable debug output
Model Invocation Options
- --temperature: Control response randomness (0-1)
- --max-tokens: Maximum response length
- --top-p: Control response diversity (0-1)
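Invocation options can be combined with the global options in a single call. For example (the model ID, prompt, profile, and region below are illustrative):

```shell
loam invoke \
  -m amazon.nova-lite-v1:0 \
  -p "Summarize the benefits of solar power in two sentences." \
  --temperature 0.2 \
  --max-tokens 256 \
  --top-p 0.9 \
  --profile my-profile \
  --region us-east-1
```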
Error Handling
LoamAI provides clear error messages with rich terminal formatting. Common issues include:
- Invalid AWS credentials
- Unsupported model configurations
- Rate limiting
- Input validation errors
Python Usage
LoamAI can also be used as a Python library. First, make sure loam-ai is installed in your Python environment (pip install loam-ai). Here's an example that sets up a client and invokes a model with streaming output:
from loam_ai.client import BedrockClient
client = BedrockClient(profile_name="default", region_name="us-east-1")
model_id = "amazon.nova-lite-v1:0"
response = client.invoke_model(
model_id=model_id,
prompt="Write a poem about the sea",
temperature=0.7,
max_tokens=200,
stream=True
)
for chunk in response:
print(chunk, end="")
# Beneath the boundless azure sky,
# The sea unfolds her silent plea,
# A vast expanse where dreams can fly,
# And secrets whispered tenderly.
# Her waves, like ancient, rhythmic prose,
# Converse in tones both soft and strong,
# A timeless dance, a mystic flow,
# Where endless stories are prolong.
# The sea, a canvas of the blue,
# With brushstrokes of the silver moon,
# A tapestry of hues anew,
# As daylight yields to night's cocoon.
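When streaming, it is often useful to both print chunks as they arrive and keep the assembled text. A small helper sketch; the fake_stream generator below is a stand-in for the streaming BedrockClient response shown above, which is assumed to yield text chunks:

```python
from typing import Iterable

def collect_stream(chunks: Iterable[str]) -> str:
    """Print each chunk as it arrives and return the assembled text."""
    parts = []
    for chunk in chunks:
        print(chunk, end="", flush=True)
        parts.append(chunk)
    return "".join(parts)

# Stand-in for a streaming response; a real call would use
# client.invoke_model(..., stream=True) as shown above.
def fake_stream():
    yield from ["Beneath ", "the ", "boundless ", "sky"]

full_text = collect_stream(fake_stream())
```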
License
MIT License - See LICENSE file for details.
File details
Details for the file loam_ai-0.1.1.tar.gz.
File metadata
- Download URL: loam_ai-0.1.1.tar.gz
- Upload date:
- Size: 31.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.10.16
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 978b995369f09ee1acc9b471937c06f132a7853244144202d675b80d974ee270 |
| MD5 | 9ab5571ff651982a2a01e05ead7d2bdb |
| BLAKE2b-256 | dbe186f133b5dc6963a5ca3eb1045beb1c0ad0bd470935d7a486c3f6ad90aecf |
File details
Details for the file loam_ai-0.1.1-py3-none-any.whl.
File metadata
- Download URL: loam_ai-0.1.1-py3-none-any.whl
- Upload date:
- Size: 12.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.10.16
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | a514b2750ed22328e82e893db2de0f925ffe5fbc3e467eab0baa2af585241020 |
| MD5 | fd0dde48384a3913c1a410387f6dce80 |
| BLAKE2b-256 | 5d15a9388fc7eb6f84a4875e0b1f44705ff3f6c1e4dd203446608d8fda20ef5f |