
Backend API service for the PyLama ecosystem


APILama

PyLama Ecosystem Navigation

  • APILama: API service for code generation (GitHub · Docs)
  • GetLLM: LLM model management and code generation (GitHub · PyPI · Docs)
  • DevLama: Python code generation with Ollama (GitHub · Docs)
  • LogLama: Centralized logging and environment management (GitHub · PyPI · Docs)
  • BEXY: Sandbox for executing generated code (GitHub · Docs)
  • JSLama: JavaScript code generation (GitHub · NPM · Docs)
  • JSBox: JavaScript sandbox for executing code (GitHub · NPM · Docs)
  • SheLLama: Shell command generation (GitHub · PyPI · Docs)
  • WebLama: Web application generation (GitHub · Docs)

Author

Tom Sapletta — DevOps Engineer & Systems Architect

  • 💻 15+ years in DevOps, Software Development, and Systems Architecture
  • 🏢 Founder & CEO at Telemonit (Portigen - edge computing power solutions)
  • 🌍 Based in Germany | Open to remote collaboration
  • 📚 Passionate about edge computing, hypermodularization, and automated SDLC

GitHub · LinkedIn · ORCID · Portfolio

Support This Project

If you find this project useful, please consider supporting it.


APILama is the API gateway for the PyLama ecosystem. It serves as the central communication layer between the frontend WebLama interface and the various backend services (PyLama, BEXY, PyLLM, SheLLama). APILama integrates with LogLama as the primary service for centralized logging, environment management, and service orchestration.

Features

  • Unified API Gateway: Single entry point for all PyLama ecosystem services
  • Service Routing: Intelligent routing of requests to appropriate backend services (BEXY, PyLLM, SheLLama, PyLama)
  • RESTful API Design: Clean and consistent API endpoints for all operations
  • Cross-Origin Resource Sharing (CORS): Support for cross-origin requests from the WebLama frontend
  • Authentication and Authorization: Secure access to API endpoints
  • Response Standardization: Consistent response format across all services
  • Error Handling: Comprehensive error handling and reporting
  • LogLama Integration: Integrates with LogLama for centralized logging, environment management, and service orchestration
  • Structured Logging: All API requests and responses are logged with component context for better debugging and monitoring
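
The "Response Standardization" feature implies that every proxied service replies in a common envelope. The exact schema is not documented in this README; the sketch below is a hypothetical illustration of such an envelope, with field names (status, data, error) chosen for the example rather than taken from APILama's code.

```python
def make_response(status, data=None, error=None):
    """Wrap a backend service result in a common envelope.

    The keys "status", "data", and "error" are illustrative assumptions,
    not APILama's documented schema.
    """
    return {"status": status, "data": data, "error": error}

# A successful response from any proxied service has the same shape:
ok = make_response("success", data={"files": ["README.md"]})

# A failure carries a machine-readable error in that same shape,
# so the frontend can handle all services uniformly:
fail = make_response("error", error="SheLLama service unreachable")
```

Standardizing on one envelope is what lets WebLama treat responses from BEXY, PyLLM, SheLLama, and PyLama interchangeably.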

Installation

# Create a virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install the package in editable (development) mode
pip install -e .

IMPORTANT: Always run pip install -e . before starting the project. This ensures all dependencies are installed and the package is importable in development mode.

Usage

Running the API Gateway

# Start the API server with default settings
python -m apilama.app --port 8080 --host 127.0.0.1

# Or with environment variables
export PORT=8080
export HOST=127.0.0.1
python -m apilama.app

Using the Makefile

# Start the API server with default settings
make run

# Start with custom port
make run PORT=8090

Environment Variables

APILama uses the following environment variables for configuration:

  • PORT: The port to run the server on (default: 8080)
  • HOST: The host to bind to (default: 127.0.0.1)
  • DEBUG: Enable debug mode (default: False)
  • BEXY_API_URL: URL of the BEXY API (default: http://localhost:8000)
  • GETLLM_API_URL: URL of the PyLLM API (default: http://localhost:8001)
  • SHELLAMA_API_URL: URL of the SheLLama API (default: http://localhost:8002)
  • DEVLAMA_API_URL: URL of the PyLama API (default: http://localhost:8003)

You can set these variables in a .env file or pass them directly when starting the server.
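
The variable names and defaults above can be collected with plain os.environ lookups; the helper below is a minimal sketch (the function name and returned dict keys are this example's choices, not APILama's internals), but the environment variable names and fallback values are exactly the ones documented.

```python
import os

def load_config(env=os.environ):
    """Collect APILama settings, falling back to the documented defaults."""
    return {
        "port": int(env.get("PORT", "8080")),
        "host": env.get("HOST", "127.0.0.1"),
        "debug": env.get("DEBUG", "False").lower() in ("1", "true", "yes"),
        "bexy_api_url": env.get("BEXY_API_URL", "http://localhost:8000"),
        "getllm_api_url": env.get("GETLLM_API_URL", "http://localhost:8001"),
        "shellama_api_url": env.get("SHELLAMA_API_URL", "http://localhost:8002"),
        "devlama_api_url": env.get("DEVLAMA_API_URL", "http://localhost:8003"),
    }

# With no variables set, all documented defaults apply:
config = load_config({})
print(config["host"], config["port"])
```

Passing a plain dict instead of os.environ also makes the configuration easy to unit-test.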

API Documentation

Health Check

GET /api/health

Returns the health status of the APILama service.
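
A client can probe this endpoint with nothing beyond the standard library; the sketch below assumes the service replies with a JSON body (plausible for a health route, but not specified here) and treats any connection failure as "not healthy".

```python
import json
import urllib.request
import urllib.error

def check_health(base_url="http://127.0.0.1:8080", timeout=2.0):
    """Return the parsed /api/health payload, or None if the gateway is unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/health", timeout=timeout) as resp:
            return json.loads(resp.read().decode("utf-8"))
    except (urllib.error.URLError, OSError, ValueError):
        return None

# Prints the health payload if the gateway is running, otherwise a fallback:
print(check_health() or "gateway not reachable")
```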

SheLLama Endpoints

File Operations

GET /api/shellama/files?directory=/path/to/dir   # List files in a directory
GET /api/shellama/file?filename=/path/to/file     # Get file content
POST /api/shellama/file                           # Create/update a file
DELETE /api/shellama/file?filename=/path/to/file  # Delete a file
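
Creating or updating a file goes through the POST route above. The README does not document the request body, so the payload keys ("filename", "content") in this sketch are assumptions; the URL, method, and JSON content type follow the endpoint listing.

```python
import json
import urllib.request

def put_file(filename, content, base_url="http://127.0.0.1:8080"):
    """Build a create/update request for POST /api/shellama/file.

    The payload keys "filename" and "content" are assumptions; the
    endpoint's real schema is not documented in this README.
    """
    payload = json.dumps({"filename": filename, "content": content}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/api/shellama/file",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Send with urllib.request.urlopen(req) once the gateway is running:
req = put_file("/tmp/hello.md", "# Hello")
print(req.get_method(), req.full_url)
```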

Directory Operations

GET /api/shellama/directory?directory=/path/to/dir  # Get directory information
POST /api/shellama/directory                        # Create a directory
DELETE /api/shellama/directory?directory=/path      # Delete a directory

Shell Operations

POST /api/shellama/shell  # Execute a shell command

BEXY Endpoints

GET /api/bexy/health     # Check BEXY health
POST /api/bexy/execute   # Execute Python code

PyLLM Endpoints

GET /api/getllm/health     # Check PyLLM health
POST /api/getllm/generate  # Generate code or text

PyLama Endpoints

GET /api/devlama/health    # Check PyLama health
GET /api/devlama/models    # List available models
POST /api/devlama/execute  # Execute a model
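
Since each backend exposes a health route through the gateway, a quick liveness sweep can poll them all in one pass. The paths below come straight from the endpoint listings above; an unreachable service is simply reported as down rather than raising.

```python
import urllib.request
import urllib.error

HEALTH_PATHS = [
    "/api/health",          # APILama itself
    "/api/bexy/health",     # BEXY
    "/api/getllm/health",   # PyLLM
    "/api/devlama/health",  # PyLama
]

def sweep(base_url="http://127.0.0.1:8080", timeout=2.0):
    """Return {path: bool} for each health endpoint behind the gateway."""
    status = {}
    for path in HEALTH_PATHS:
        try:
            with urllib.request.urlopen(base_url + path, timeout=timeout):
                status[path] = True
        except (urllib.error.URLError, OSError):
            status[path] = False
    return status

print(sweep())
```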

Development

# Install development dependencies
pip install -e ".[dev]"

# Run tests
python -m pytest

# Format code
black apilama tests

# Lint code
flake8 apilama tests

License

MIT
