
AutoMagik Spark - Automation Engine with LangFlow integration

Project description

AutoMagik Spark Logo

AutoMagik Spark

**Because magic shouldn't be complicated.**

AutoMagik Spark is an automation engine that integrates with multiple LangFlow instances. Deploy AI-driven flows, schedule one-time or recurring tasks, and monitor everything with minimal fuss—no coding required.

🚀 Installation

AutoMagik Spark provides two setup options:

Prerequisites

  • Linux-based system (Ubuntu/Debian recommended)
  • Docker and Docker Compose (automatically installed on Ubuntu/Debian if not present)

Local Production Setup

For a production-ready local environment:

./scripts/setup_local.sh

Development Setup

For development with PostgreSQL and Redis Docker containers:

./scripts/setup_dev.sh

What Happens During Setup

Both setup scripts will:

  • Create necessary environment files
  • Install Docker if needed (on Ubuntu/Debian)
  • Set up all required services
  • Install the CLI tool (optional)
  • Guide you through the entire process

After Installation

You'll have access to:

  • AutoMagik Spark API: Running at http://localhost:8883
  • PostgreSQL Database: Available at localhost:15432
  • Worker Service: Running and ready to process tasks
  • CLI Tool: Installed (if chosen during setup)
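The two always-on network services above (the API on port 8883 and PostgreSQL on port 15432) can be spot-checked with a small script. This is a sketch, not part of the official tooling; it only tests whether each port accepts TCP connections.

```python
import socket

def is_listening(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP service accepts connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Ports from the setup above: API on 8883, PostgreSQL on 15432.
services = {"API": 8883, "PostgreSQL": 15432}
for name, port in services.items():
    status = "up" if is_listening("localhost", port) else "down"
    print(f"{name} ({port}): {status}")
```

A port that reports "up" only proves something is listening; use the API docs endpoint (next section) to confirm the service itself is healthy.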

Verifying Your Installation

The setup automatically verifies all services, but you can also check manually:

# Access API documentation
open http://localhost:8883/api/v1/docs  # Interactive Swagger UI
open http://localhost:8883/api/v1/redoc # ReDoc documentation

# List flows (requires CLI installation)
source .venv/bin/activate
automagik-spark flow list

🧩 System Components

  • API Server: Handles all HTTP requests and core logic
  • Worker: Processes tasks and schedules
  • Database: PostgreSQL with all required tables automatically created
  • LangFlow (optional): Visual flow editor for creating AI workflows
  • CLI Tool (optional): Command-line interface for managing flows and tasks

🏗️ System Architecture

flowchart LR
    subgraph Services
      DB[PostgreSQL]
      LF1[LangFlow Instance 1]
      LF2[LangFlow Instance 2]
    end
    subgraph AutoMagik Spark
      CLI[CLI]
      API[API Server]
      CW[Celery Worker]
      W[Worker]
    end
    API -- uses --> DB
    API -- triggers --> CW
    W -- processes --> API
    API -- integrates with --> LF1
    API -- integrates with --> LF2
    CLI -- controls --> API
    API -- has UI --> UI[Automagik UI]

Core Components Explained

  • API: Core service handling requests and business logic
  • Worker: Processes tasks and schedules
  • CLI: Command-line tool for managing flows and tasks
  • PostgreSQL: Stores flows, tasks, schedules, and other data
  • LangFlow: Optional service for creating and editing flows

📚 API Documentation

For complete API documentation, visit http://localhost:8883/api/v1/docs (Swagger UI) or http://localhost:8883/api/v1/redoc (ReDoc).

🛠️ Next Steps

  1. If you installed LangFlow, visit http://localhost:17860 to create your first flow
  2. Use the API at http://localhost:8883/api/v1/docs to manage your flows and tasks
  3. Try out the CLI commands with automagik-spark --help
  4. Monitor task execution through logs and API endpoints
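The API at http://localhost:8883/api/v1 can also be driven programmatically. The helper below is a minimal sketch using only the standard library; the `/flows` path is a hypothetical example endpoint mirroring `automagik-spark flow list`, not confirmed by the docs excerpt here—check the Swagger UI for the real paths.

```python
import json
import urllib.request

# Base URL from the installation above.
BASE_URL = "http://localhost:8883/api/v1"

def get_json(path: str, base: str = BASE_URL):
    """GET a path under the API base and decode the JSON response."""
    with urllib.request.urlopen(f"{base}{path}", timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example (requires a running instance; endpoint path is an assumption):
# flows = get_json("/flows")
```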

🗺️ Roadmap

AutoMagik Spark's future development focuses on:

  • TBA

AutoMagik Spark: Bringing AI Automation to Life



Download files

Download the file for your platform.

Source Distribution

automagik_spark-0.3.0.tar.gz (83.2 kB)

Uploaded Source

Built Distribution


automagik_spark-0.3.0-py3-none-any.whl (90.3 kB)

Uploaded Python 3

File details

Details for the file automagik_spark-0.3.0.tar.gz.

File metadata

  • Download URL: automagik_spark-0.3.0.tar.gz
  • Upload date:
  • Size: 83.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.7.13

File hashes

Hashes for automagik_spark-0.3.0.tar.gz

  • SHA256: 0a73d2c981c366f66ac44b70d70e6299e9e440be2eb938e457983e1f458affa0
  • MD5: 7b67a09639c302eab5f3abfd6bd0d684
  • BLAKE2b-256: 50d3c5c688d6eec61067ddf0410760252f29fd9c556b072447a18cdee91ebbef
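After downloading, you can verify the archive against the published SHA256 digest above. A minimal sketch using the standard library:

```python
import hashlib

# Published SHA256 for the 0.3.0 source distribution (from the table above).
EXPECTED_SHA256 = "0a73d2c981c366f66ac44b70d70e6299e9e440be2eb938e457983e1f458affa0"

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Usage after downloading the sdist:
# assert sha256_of("automagik_spark-0.3.0.tar.gz") == EXPECTED_SHA256
```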


File details

Details for the file automagik_spark-0.3.0-py3-none-any.whl.

File metadata

File hashes

Hashes for automagik_spark-0.3.0-py3-none-any.whl

  • SHA256: 3cbc67fce8783516d33fb800933eadf3557a171cbe3cf8ded2deb3e3ee0cb775
  • MD5: c8df2a7d3c2d4f486f05594b59387118
  • BLAKE2b-256: b2551aa022bfede0125d4622c3391c22ffdfc63b7a8e27097e0868c0af2ee920

