# 🌀 Inceptor

AI-Powered Multi-Level Solution Architecture Generator, powered by Ollama's Mistral:7b

> **Note:** This project has been refactored for better maintainability and organization. The core functionality remains the same, but the code is now more modular and easier to extend.
Inceptor is a powerful AI-powered tool that helps you design, generate, and implement complex software architectures using natural language. Built with Ollama's Mistral:7b model, it creates multi-level architecture designs that evolve from high-level concepts to detailed implementation plans.
## ✨ Key Features
- 🤖 AI-Powered: Leverages Ollama's Mistral:7b for intelligent architecture generation
- 🏗️ Multi-Level Design: Creates 5 distinct architecture levels (LIMBO → DREAM → REALITY → DEEPER → DEEPEST)
- 🔍 Context-Aware: Understands requirements from natural language descriptions
- 💻 Interactive CLI: Command-line interface with autocomplete and suggestions
- 📊 Structured Output: Exports to Markdown, JSON, YAML, and more
- 🚀 Zero-Setup: Works out of the box with a local Ollama installation
- 🔌 Extensible: Plugin system for custom generators and templates
## 🚀 Quick Start

### Prerequisites

- Python 3.8 or higher
- Ollama with the Mistral:7b model
- 4 GB RAM (minimum)
### Installation

```bash
# Install from PyPI
pip install inceptor

# Or install from source
git clone https://github.com/wronai/inceptor.git
cd inceptor
make install  # Installs in development mode with all dependencies

# Start the Ollama server (if not already running)
ollama serve
```
### Basic Usage

```bash
# Generate an architecture from a description
inceptor "I need a REST API for a todo app with user authentication"

# Start the interactive shell
inceptor shell
```
### Using the Python API

```python
import json
from dataclasses import asdict, is_dataclass

from inceptor import DreamArchitect, Solution, ArchitectureLevel

# Create an architect instance
architect = DreamArchitect()

# Describe the problem to solve
problem = """
I need a task management system for a small development team.
The team consists of 5 people and uses Python, FastAPI, and PostgreSQL.
The system should have a web interface and REST API.
"""

# Generate a solution with 3 levels of detail
solution = architect.inception(problem, max_levels=3)

# Access solution components
print(f"Problem: {solution.problem}")
print(f"Components: {len(solution.architecture.get('limbo', {}).get('components', []))}")
print(f"Tasks: {len(solution.tasks)}")

# Save to JSON
def convert_dataclass(obj):
    if is_dataclass(obj):
        return {k: convert_dataclass(v) for k, v in asdict(obj).items()}
    elif isinstance(obj, (list, tuple)):
        return [convert_dataclass(x) for x in obj]
    elif isinstance(obj, dict):
        return {k: convert_dataclass(v) for k, v in obj.items()}
    elif hasattr(obj, 'name'):  # For Enums
        return obj.name
    return obj

with open("solution.json", "w") as f:
    json.dump(convert_dataclass(solution), f, indent=2, ensure_ascii=False)
```
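The serialization helper can be exercised end-to-end without Inceptor installed. In this self-contained sketch, `Level`, `Task`, and `Plan` are hypothetical stand-ins for Inceptor's real models, and the enum check is tightened to `isinstance(obj, Enum)`:

```python
import json
from dataclasses import dataclass, asdict, is_dataclass
from enum import Enum

class Level(Enum):  # hypothetical stand-in for ArchitectureLevel
    LIMBO = 1
    DREAM = 2

@dataclass
class Task:  # hypothetical stand-in for Inceptor's Task model
    name: str
    level: Level

@dataclass
class Plan:  # hypothetical stand-in for Solution
    title: str
    tasks: list

def convert_dataclass(obj):
    """Recursively convert dataclasses, containers, and enums to JSON-safe values."""
    if is_dataclass(obj):
        return {k: convert_dataclass(v) for k, v in asdict(obj).items()}
    elif isinstance(obj, (list, tuple)):
        return [convert_dataclass(x) for x in obj]
    elif isinstance(obj, dict):
        return {k: convert_dataclass(v) for k, v in obj.items()}
    elif isinstance(obj, Enum):  # stricter than hasattr(obj, 'name')
        return obj.name
    return obj

plan = Plan(title="todo-api", tasks=[Task("design schema", Level.LIMBO)])
print(json.dumps(convert_dataclass(plan)))
# → {"title": "todo-api", "tasks": [{"name": "design schema", "level": "LIMBO"}]}
```

Note that `asdict` already recurses through nested dataclasses; the helper's job is the final pass that turns leftover enum members into their names.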
## 🏗️ Project Structure

After refactoring, the project has a cleaner, more modular structure:

```text
src/inceptor/
├── __init__.py                  # Package exports and version
├── inceptor.py                  # Compatibility layer
└── core/                        # Core functionality
    ├── __init__.py              # Core package exports
    ├── enums.py                 # ArchitectureLevel enum
    ├── models.py                # Solution and Task dataclasses
    ├── context_extractor.py     # Context extraction utilities
    ├── ollama_client.py         # Ollama API client
    ├── prompt_templates.py      # Prompt templates for each level
    ├── dream_architect.py       # Main architecture generation logic
    └── utils.py                 # Utility functions
```
## 🏗️ Multi-Level Architecture
Inceptor structures architectures across 5 levels of detail:
| Level | Name | Description | Output |
|---|---|---|---|
| 1 | LIMBO | Problem analysis & decomposition | High-level components |
| 2 | DREAM | Component design & interactions | API contracts, Data flows |
| 3 | REALITY | Implementation details | Code structure, Tech stack |
| 4 | DEEPER | Integration & deployment | CI/CD, Infrastructure |
| 5 | DEEPEST | Optimization & scaling | Performance, Monitoring |
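The level names in the table map naturally onto an ordered enum. A minimal sketch of what `core/enums.py` might contain (the actual definition may differ):

```python
from enum import IntEnum

class ArchitectureLevel(IntEnum):
    """Hypothetical sketch of the five design levels, coarse to fine."""
    LIMBO = 1    # problem analysis & decomposition
    DREAM = 2    # component design & interactions
    REALITY = 3  # implementation details
    DEEPER = 4   # integration & deployment
    DEEPEST = 5  # optimization & scaling

# With an ordered enum, max_levels=3 in inception() would cover LIMBO..REALITY:
selected = [lvl for lvl in ArchitectureLevel if lvl <= 3]
print([lvl.name for lvl in selected])  # → ['LIMBO', 'DREAM', 'REALITY']
```

Using `IntEnum` (rather than a plain `Enum`) makes the `lvl <= max_levels` comparison work directly.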
## 🛠️ Development

### Setup

1. Clone the repository:

   ```bash
   git clone https://github.com/wronai/inceptor.git
   cd inceptor
   ```

2. Set up the development environment:

   ```bash
   # Install Python dependencies
   make install

   # Install pre-commit hooks
   pre-commit install

   # Start the Ollama server (in a separate terminal)
   ollama serve
   ```
### Common Tasks

```bash
# Install development dependencies
make install

# Run tests
make test

# Run tests with coverage
make test-cov

# Check code style
make lint

# Format code
make format

# Build documentation
make docs

# Run the documentation server (http://localhost:8001)
make serve-docs

# Build the package
make build

# Clean up build artifacts
make clean

# Run a local example
python -m src.inceptor.inceptor
```
## 📚 Documentation
For full documentation, please visit https://wronai.github.io/inceptor/
## 🤝 Contributing

Contributions are welcome! Please read our Contributing Guide to get started.

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## 📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

## 🙏 Acknowledgments
- Ollama for the powerful AI models
- Mistral AI for the 7B model
- The open-source community for invaluable tools and libraries
## 📖 Documentation Setup (MkDocs)

```bash
# 1. Install MkDocs and plugins
pip install mkdocs-material mkdocstrings[python] mkdocs-awesome-pages-plugin

# 2. Create the docs/ structure
mkdir -p docs/{guide,architecture,api,development,examples,about,assets/{css,js,images}}

# 3. Run the development server
mkdocs serve

# 4. Build and deploy
mkdocs build
mkdocs gh-deploy  # GitHub Pages
```

📚 Documentation structure:

- Home: Installation, Quick Start, Features
- User Guide: Getting Started, CLI Reference, Examples
- Architecture: Multi-Level Design, Prompts, Ollama Integration
- API Reference: Auto-generated from the code
- Development: Contributing, Testing, Release Process
- Examples: Real-world use cases, troubleshooting

🎨 Customization:

- Theme: Material Design with custom colors
- Logo: Inception-inspired rotating animation
- Terminal: Animated code examples
- Social: GitHub, PyPI, Docker links

🔧 Plugin features:

- Search: Advanced, with language-aware word separators
- Git dates: Automatic creation/modification dates
- Minify: Optimized HTML/CSS/JS
- Privacy: GDPR-compliant
- Tags: Content categorization

Now just add content to the folders under docs/ and you have professional documentation ready to deploy! 🎯

Example command:

```bash
mkdocs serve  # http://localhost:8000
```
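Tying the steps above together, a `mkdocs.yml` along these lines would wire up the theme and plugins mentioned. Plugin keys, palette values, and the `nav` layout here are illustrative assumptions, not the project's actual configuration:

```yaml
site_name: Inceptor
theme:
  name: material
  palette:
    primary: deep purple   # illustrative custom color
plugins:
  - search
  - awesome-pages
  - tags
  - minify
  - git-revision-date-localized   # automatic creation/modification dates
  - mkdocstrings:
      handlers:
        python:
          paths: [src]            # auto-generate the API reference from code
nav:
  - Home: index.md
  - User Guide: guide/
  - Architecture: architecture/
  - API Reference: api/
  - Development: development/
  - Examples: examples/
```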