
Multi-Level Solution Architecture Generator powered by Ollama Mistral:7b


🌀 Inceptor

AI-Powered Multi-Level Solution Architecture Generator

Note: This project has been refactored for better maintainability and organization. The core functionality remains the same, but the code is now more modular and easier to extend.


Inceptor is a powerful AI-powered tool that helps you design, generate, and implement complex software architectures using natural language. Built with Ollama's Mistral:7b model, it creates multi-level architecture designs that evolve from high-level concepts to detailed implementation plans.

✨ Key Features

  • 🤖 AI-Powered: Leverages Ollama's Mistral:7b for intelligent architecture generation
  • 🏗️ Multi-Level Design: Creates 5 distinct architecture levels (LIMBO → DREAM → REALITY → DEEPER → DEEPEST)
  • 🔍 Context-Aware: Understands requirements from natural language descriptions
  • 💻 Interactive CLI: Command-line interface with autocomplete and suggestions
  • 📊 Structured Output: Exports to Markdown, JSON, YAML, and more
  • 🚀 Zero-Setup: Works out of the box with local Ollama installation
  • 🔌 Extensible: Plugin system for custom generators and templates

🚀 Quick Start

Prerequisites

  • Python 3.8 or higher
  • Ollama with Mistral:7b model
  • 4GB RAM (minimum)

Installation

# Install from PyPI
pip install inceptor

# Or install from source
git clone https://github.com/wronai/inceptor.git
cd inceptor
make install  # Installs in development mode with all dependencies

# Pull the Mistral model and start the Ollama server (if not already running)
ollama pull mistral:7b
ollama serve

Basic Usage

# Generate architecture from a description
inceptor "I need a REST API for a todo app with user authentication"

# Start interactive shell
inceptor shell

Using the Python API

from inceptor import DreamArchitect, Solution, ArchitectureLevel

# Create an architect instance
architect = DreamArchitect()

# Generate a solution
problem = """
I need a task management system for a small development team.
The team consists of 5 people and uses Python, FastAPI, and PostgreSQL.
The system should have a web interface and REST API.
"""

# Generate solution with 3 levels of detail
solution = architect.inception(problem, max_levels=3)

# Access solution components
print(f"Problem: {solution.problem}")
print(f"Components: {len(solution.architecture.get('limbo', {}).get('components', []))}")
print(f"Tasks: {len(solution.tasks)}")

# Save to JSON
import json
from dataclasses import asdict, is_dataclass
from enum import Enum

def convert_dataclass(obj):
    if is_dataclass(obj):
        return {k: convert_dataclass(v) for k, v in asdict(obj).items()}
    elif isinstance(obj, (list, tuple)):
        return [convert_dataclass(x) for x in obj]
    elif isinstance(obj, dict):
        return {k: convert_dataclass(v) for k, v in obj.items()}
    elif isinstance(obj, Enum):  # serialize enums by name, not by value
        return obj.name
    return obj

with open("solution.json", "w") as f:
    json.dump(convert_dataclass(solution), f, indent=2, ensure_ascii=False)
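Markdown is another advertised export target. Below is a minimal, hedged sketch of rendering the converted solution dict as Markdown. The key names (`problem`, `architecture`, `limbo`, `components`) follow the access pattern shown above, but `solution_to_markdown` is a hypothetical helper for illustration, not part of the inceptor API.

```python
def solution_to_markdown(data: dict) -> str:
    # Render a converted solution dict as simple Markdown:
    # the problem becomes the title, each level a section, each component a bullet.
    lines = [f"# {data.get('problem', 'Untitled solution')}", ""]
    for level, spec in data.get("architecture", {}).items():
        lines.append(f"## {level.upper()}")
        for comp in spec.get("components", []):
            lines.append(f"- {comp}")
        lines.append("")
    return "\n".join(lines)

# Demo with a hand-written dict shaped like the access pattern above
demo = {
    "problem": "Todo REST API",
    "architecture": {"limbo": {"components": ["API server", "Auth service"]}},
}
print(solution_to_markdown(demo))
```

Write the returned string to a `.md` file the same way the JSON example writes `solution.json`.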

🏗️ Project Structure

After refactoring, the project has a cleaner, more modular structure:

src/inceptor/
├── __init__.py           # Package exports and version
├── inceptor.py           # Compatibility layer
└── core/                 # Core functionality
    ├── __init__.py       # Core package exports
    ├── enums.py          # ArchitectureLevel enum
    ├── models.py         # Solution and Task dataclasses
    ├── context_extractor.py # Context extraction utilities
    ├── ollama_client.py  # Ollama API client
    ├── prompt_templates.py # Prompt templates for each level
    ├── dream_architect.py # Main architecture generation logic
    └── utils.py          # Utility functions

🏗️ Multi-Level Architecture

Inceptor structures architectures across 5 levels of detail:

Level | Name    | Description                      | Output
------|---------|----------------------------------|---------------------------
1     | LIMBO   | Problem analysis & decomposition | High-level components
2     | DREAM   | Component design & interactions  | API contracts, Data flows
3     | REALITY | Implementation details           | Code structure, Tech stack
4     | DEEPER  | Integration & deployment         | CI/CD, Infrastructure
5     | DEEPEST | Optimization & scaling           | Performance, Monitoring
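These five levels map onto the `ArchitectureLevel` enum exported by the package. A minimal stand-in looks like this (the level names come from the table above; the numeric values are an assumption based on depth order):

```python
from enum import Enum

class ArchitectureLevel(Enum):
    # Names per the table above; values assumed to follow depth order.
    LIMBO = 1
    DREAM = 2
    REALITY = 3
    DEEPER = 4
    DEEPEST = 5

# Walk the levels from shallowest to deepest
for level in sorted(ArchitectureLevel, key=lambda l: l.value):
    print(f"{level.value}. {level.name}")
```

Under this assumption, `max_levels=3` in the earlier Python API example would presumably stop generation at REALITY.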

🛠️ Development

Setup

  1. Clone the repository:

    git clone https://github.com/wronai/inceptor.git
    cd inceptor
    
  2. Set up the development environment:

    # Install Python dependencies
    make install
    
    # Install pre-commit hooks
    pre-commit install
    
    # Start Ollama server (in a separate terminal)
    ollama serve
    

Common Tasks

# Install development dependencies
make install

# Run tests
make test

# Run tests with coverage
make test-cov

# Check code style
make lint

# Format code
make format

# Build documentation
make docs

# Run documentation server (http://localhost:8001)
make serve-docs

# Build package
make build

# Clean up
make clean

# Run a local example
python -m src.inceptor.inceptor

📚 Documentation

For full documentation, please visit https://wronai.github.io/inceptor/

🤝 Contributing

Contributions are welcome! Please read our Contributing Guide to get started.

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • Ollama for the powerful AI models
  • Mistral AI for the 7B model
  • The open-source community for invaluable tools and libraries
📖 Documentation Setup

# 1. Install MkDocs
pip install mkdocs-material mkdocstrings[python] mkdocs-awesome-pages-plugin

# 2. Create the docs/ structure
mkdir -p docs/{guide,architecture,api,development,examples,about,assets/{css,js,images}}

# 3. Run the development server
mkdocs serve

# 4. Build and deploy
mkdocs build
mkdocs gh-deploy  # GitHub Pages

📚 Documentation structure:

  • Home: Installation, Quick Start, Features
  • User Guide: Getting Started, CLI Reference, Examples
  • Architecture: Multi-Level Design, Prompts, Ollama Integration
  • API Reference: Auto-generated from the code
  • Development: Contributing, Testing, Release Process
  • Examples: Real-world use cases, troubleshooting
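A minimal `mkdocs.yml` matching the setup above might look like this (the plugin options and site URL are illustrative assumptions, not the project's actual config):

```yaml
site_name: Inceptor
site_url: https://wronai.github.io/inceptor/
theme:
  name: material
plugins:
  - search
  - awesome-pages
  - mkdocstrings:
      handlers:
        python:
          paths: [src]
```

With mkdocs-awesome-pages-plugin installed, the nav can be inferred from the docs/ directory structure rather than listed explicitly.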

🎨 Customization:

  • Theme: Material Design with custom colors
  • Logo: Inception-inspired rotating animation
  • Terminal: Animated code examples
  • Social: GitHub, PyPI, Docker links

🔧 Plugin features:

  • Search: Advanced search with language-aware separators
  • Git dates: Automatic creation/modification dates
  • Minify: Optimized HTML/CSS/JS
  • Privacy: GDPR-compliant
  • Tags: Content categorization

Now just add content to the folders under docs/ and you have professional documentation ready to deploy! 🎯

Example command to start the docs server:

mkdocs serve  # http://localhost:8000
