A framework for stable AI agents

StableAgents AI

A framework for building the "Linux kernel" of AI agents: the core infrastructure and system-level capabilities that enable reliable, secure, and efficient AI agent operation.

Installation

pip install stableagents-ai

Or with Poetry:

poetry add stableagents-ai

For local LLM support:

pip install stableagents-ai[local]
# or with Poetry
poetry add stableagents-ai -E local

Quick Start

# Using the Python API
from stableagents import StableAgents

agent = StableAgents()
agent.set_api_key('openai', 'your-api-key')
agent.set_active_ai_provider('openai')

response = agent.generate_text("Tell me about AI agents")
print(response)

# Using with a local model
agent = StableAgents()
agent.set_local_model()  # Uses default model location
# or specify a model path
# agent.set_local_model("/path/to/your/model.gguf")
response = agent.generate_text("Tell me about AI agents")
print(response)

Command Line Interface

StableAgents comes with a simple CLI that you can run from anywhere:

# Run the CLI directly with any of these commands
stableagents
stableagents-ai
run-stableagents

# Run with a specific model and API key
stableagents --model openai --key your-api-key

# Run with a local model
stableagents --local

# Run with a local model from a specific path
stableagents --local --model-path /path/to/your/model.gguf

Once in the CLI, you can:

  • Chat with the AI directly by typing any text
  • Use commands like memory, control, and provider
  • Type help to see all available commands

Features

  • Multiple AI provider support (OpenAI, Anthropic, etc.)
  • Local model support for offline usage (via llama-cpp-python)
  • Memory management
  • Computer control capabilities
  • Simple but powerful CLI
  • Logging system

Local Models

StableAgents supports running LLM inference locally using llama-cpp-python. To use local models:

  1. Install the local dependencies: pip install stableagents-ai[local]
  2. Download a compatible GGUF model file (e.g., from TheBloke on Hugging Face)
  3. Place the model file in ~/.stableagents/models/default/ or specify the path when loading
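Step 3 can be done from the shell; the directory layout comes from the list above, and the model filename in the comment is only a placeholder:

```shell
# Create the default model directory used by StableAgents.
MODELS_DIR="$HOME/.stableagents/models/default"
mkdir -p "$MODELS_DIR"
# Copy your downloaded GGUF model into place (example filename):
# cp ~/Downloads/model.gguf "$MODELS_DIR/"
```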
# Load a model from a specific path
agent.set_local_model("/path/to/your/model.gguf")

# Or run the CLI with local mode
# stableagents --local

# Or specify a custom model path in the CLI
# stableagents --local --model-path /path/to/your/model.gguf

The framework will automatically search for a .gguf file in the default directory if no path is specified.
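The automatic lookup described above can be sketched as follows. This is an illustrative reimplementation using pathlib, not the framework's actual code; the function name and sort order are assumptions:

```python
from pathlib import Path

def find_default_model(base=Path.home() / ".stableagents" / "models" / "default"):
    """Return the first .gguf file found in the default model directory, or None."""
    if not base.is_dir():
        return None
    candidates = sorted(base.glob("*.gguf"))
    return candidates[0] if candidates else None
```

A call like `find_default_model()` mirrors what `agent.set_local_model()` does with no arguments: fall back to whatever .gguf file sits in the default directory.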

License

MIT



Download files

Download the file for your platform.

Source Distribution

stableagents_ai-0.1.4.tar.gz (23.2 kB)

Built Distribution

stableagents_ai-0.1.4-py3-none-any.whl (27.6 kB)

File details

Details for the file stableagents_ai-0.1.4.tar.gz.

File metadata

  • Download URL: stableagents_ai-0.1.4.tar.gz
  • Upload date:
  • Size: 23.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.11.11 Darwin/24.1.0

File hashes

Hashes for stableagents_ai-0.1.4.tar.gz
  • SHA256: 1004d79933cc32647830ac34d3ff27a585ead0a454baff75073e141558973e4e
  • MD5: d886143c5cc2e8c2cb340cf710b37a4b
  • BLAKE2b-256: cd020add591e02ba05689fdb1772c7502a9dd76119b7f4ca240560ef2f7a9c9f

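The published digests can be checked locally with Python's standard hashlib. This helper is a generic sketch, not part of stableagents-ai:

```python
import hashlib

def sha256_of(path):
    """Compute the hex SHA-256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against the SHA256 value listed above, e.g.:
# sha256_of("stableagents_ai-0.1.4.tar.gz") == "1004d79933cc32647830ac34d3ff27a585ead0a454baff75073e141558973e4e"
```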

File details

Details for the file stableagents_ai-0.1.4-py3-none-any.whl.

File metadata

  • Download URL: stableagents_ai-0.1.4-py3-none-any.whl
  • Upload date:
  • Size: 27.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.11.11 Darwin/24.1.0

File hashes

Hashes for stableagents_ai-0.1.4-py3-none-any.whl
  • SHA256: c42371aef066a18d6acb81925f33cc298e597d30ecfe7370e7fb2cead6dc53fd
  • MD5: 94aca1a3fe84eba01bdf494ef9b4c178
  • BLAKE2b-256: 6397ca390f79a0cc2b2d2eb0373498cc1b2d8dcc532debd37536ca02f1bc0b00

