
CLI tool for AWS Bedrock model invocation


loam-ai

A command-line interface (CLI) tool for interacting with AWS Bedrock foundation models and inference profiles. LoamAI provides a streamlined way to invoke AI models, generate embeddings, and manage conversations through AWS Bedrock.

Features

  • List available foundation models and inference profiles
  • Generate text and image embeddings
  • Stream model responses for real-time output
  • Support for conversation-based model interactions
  • Manage multiple AWS profiles and regions
  • Rich terminal output formatting

Installation

Requires Python 3.10 or higher.

pip install loam-ai

Configuration

AWS SSO Login

If you're using AWS SSO, first configure your AWS profile and log in:

export AWS_PROFILE=your-sso-profile
aws sso login

After successful login, you can run loam-ai commands using your SSO credentials.

Usage

List Available Models

loam list-models

Filter by provider or output type:

loam list-models --provider anthropic
loam list-models --output TEXT

Check Session Information

loam session

Generate Embeddings

Generate text embeddings:

loam generate-embeddings \
    --model-id amazon.titan-embed-text-v2:0 \
    --input-file texts.txt
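
The contents of texts.txt are not specified here; assuming the common convention of one input text per line, a hypothetical input file might look like:

```
Renewable energy reduces carbon emissions.
Solar panels convert sunlight into electricity.
Wind turbines generate power from moving air.
```

Whether each line is embedded separately or the file is treated as a single input depends on the tool; check loam generate-embeddings --help for the exact behavior.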

Generate image embeddings:

loam generate-embeddings \
    --model-id amazon.titan-embed-image-v1 \
    --image image.jpg \
    --texts "Image description"

Invoke Models

Simple text generation:

loam invoke \
    -m amazon.nova-lite-v1:0 \
    -p "What are the benefits of renewable energy?"

Conversation Mode

Use conversation mode for chat-based interactions:

loam converse \
    --model-id "anthropic.claude-3-sonnet-20240229-v1:0" \
    --messages-file conversation.json

Example messages file format:

[
  {
    "role": "user",
    "content": [{ "text": "What is the capital of France?" }]
  }
]
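
The same schema supports multi-turn history: each entry has a "role" ("user" or "assistant") and a list of content blocks. A minimal Python sketch that builds a conversation in this format and writes it to the file passed via --messages-file:

```python
import json

# Multi-turn conversation in the message format shown above:
# each message has a "role" and a list of content blocks.
messages = [
    {"role": "user", "content": [{"text": "What is the capital of France?"}]},
    {"role": "assistant", "content": [{"text": "The capital of France is Paris."}]},
    {"role": "user", "content": [{"text": "Roughly how many people live there?"}]},
]

# Write the history so it can be passed to `loam converse --messages-file`.
with open("conversation.json", "w") as f:
    json.dump(messages, f, indent=2)
```

Roles should alternate between "user" and "assistant", ending with a "user" turn for the model to answer.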

List Inference Profiles

loam list-inference-profiles

Command Options

Global Options

  • --profile: AWS profile name
  • --region: AWS region
  • --debug: Enable debug output

Model Invocation Options

  • --temperature: Sampling temperature; higher values produce more random output (0-1)
  • --max-tokens: Maximum number of tokens in the response
  • --top-p: Nucleus sampling threshold; lower values restrict output to more likely tokens (0-1)
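
These options can be combined in a single invocation. For example, a relatively deterministic call might look like this (the parameter values are illustrative, not recommendations):

```shell
loam invoke \
    -m amazon.nova-lite-v1:0 \
    -p "Summarize the water cycle in two sentences." \
    --temperature 0.2 \
    --max-tokens 256 \
    --top-p 0.9
```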

Error Handling

LoamAI provides clear error messages with rich terminal formatting. Common issues include:

  • Invalid AWS credentials
  • Unsupported model configurations
  • Rate limiting
  • Input validation errors

Python Usage

LoamAI can also be used as a Python library. First, make sure loam-ai is installed in your Python environment (pip install loam-ai). Here's an example that sets up a client and invokes a model with streaming output:

from loam_ai.client import BedrockClient

client = BedrockClient(profile_name="default", region_name="us-east-1")
model_id = "amazon.nova-lite-v1:0"

response = client.invoke_model(
    model_id=model_id,
    prompt="Write a poem about the sea",
    temperature=0.7,
    max_tokens=200,
    stream=True
)
for chunk in response:
    print(chunk, end="")

# Beneath the boundless azure sky,
# The sea unfolds her silent plea,
# A vast expanse where dreams can fly,
# And secrets whispered tenderly.

# Her waves, like ancient, rhythmic prose,
# Converse in tones both soft and strong,
# A timeless dance, a mystic flow,
# Where endless stories are prolong.

# The sea, a canvas of the blue,
# With brushstrokes of the silver moon,
# A tapestry of hues anew,
# As daylight yields to night's cocoon.

Development

Built with:

  • Click for CLI interface
  • Rich for terminal formatting
  • boto3 for AWS SDK

License

MIT License - See LICENSE file for details.

