# local-AI-infra-generation

A tool to generate infrastructure files (Dockerfile, docker-compose.yml, etc.) for codebases using local LLMs.

local-AI-infra-generation leverages local Large Language Models (LLMs) to analyze code repositories and automatically generate infrastructure files such as Dockerfiles and docker-compose.yml. All processing is performed locally, ensuring privacy and control over your codebase.
## Features
- Codebase Embedding: Index and embed your codebase for semantic search and retrieval.
- Natural Language Q&A: Ask questions about your codebase and receive context-aware answers.
- Automated Infrastructure Generation: Generate Dockerfiles and docker-compose.yml files tailored to your project.
- Multi-language Support: Works with Python, JavaScript, TypeScript, and Go projects.
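The embedding and Q&A features follow the standard retrieval-augmented generation pattern: chunk the codebase, embed each chunk, and at question time retrieve the most relevant chunks to include in the LLM prompt. A minimal sketch of the retrieval step, using a toy bag-of-words overlap in place of the tool's actual embedding model and ChromaDB store (an illustrative assumption, not the real implementation):

```python
# Toy sketch of the retrieval step behind "ask". The real tool embeds chunks
# with a local model and stores them in ChromaDB; here a bag-of-words overlap
# score stands in for vector similarity, purely for illustration.
import re

def tokenize(text: str) -> set[str]:
    """Lowercase word tokens, punctuation stripped."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve(question: str, chunks: list[str], k: int = 2) -> list[str]:
    """Return the k chunks sharing the most tokens with the question."""
    q = tokenize(question)
    return sorted(chunks, key=lambda c: len(q & tokenize(c)), reverse=True)[:k]

def build_prompt(question: str, context: list[str]) -> str:
    """Assemble a context-augmented prompt for the local LLM."""
    return "Context:\n" + "\n---\n".join(context) + f"\n\nQuestion: {question}\nAnswer:"

chunks = [
    "def render_chart(data): plot the values",
    "authentication middleware verifies the session token",
    "def save_report(path): write results to disk",
]
top = retrieve("How does authentication work?", chunks)
print(build_prompt("How does authentication work?", top))
```

Swapping the toy overlap score for real embedding vectors (and the list for a ChromaDB collection) gives the shape of the actual pipeline.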
## Getting Started
### 1. Prerequisites

- **Python 3.11+**: Ensure you have Python 3.11 or higher installed. Check with `python --version`.
- **Ollama**: Download and install Ollama for local LLM inference.
- **C/C++ Build Tools**: Required for building tree-sitter-languages.
- **Git** (optional, for cloning repositories).
### 2. Installation

#### a. Clone the Repository

```bash
git clone https://github.com/yourusername/local-AI-infra-generation.git
cd local-AI-infra-generation
```

#### b. Create a Virtual Environment

```bash
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
```

#### c. Install Dependencies

This project uses uv for fast dependency management, but you can use pip as well.

With uv:

```bash
uv pip install -r requirements.txt
uv pip install -e .
```

Or with pip:

```bash
pip install -r requirements.txt
pip install -e .
```

Alternatively, install via pyproject.toml:

```bash
pip install .
```
## Usage

### 1. Start Ollama

Make sure the Ollama server is running:

```bash
ollama serve
```

### 2. Run the CLI

```bash
uv run python -m src.main --help
```
### Common Commands

- **Embed a Project:**

  ```bash
  uv run python -m src.main embed /path/to/your/project
  uv run python -m src.main embed ../PlotTwister
  ```

- **Ask a Question:**

  ```bash
  uv run python -m src.main ask "How does authentication work?" --project your_project_name
  uv run python -m src.main ask "what does the main.py do" --project "PlotTwister"
  ```

- **List Embedded Projects:**

  ```bash
  uv run python -m src.main list
  ```

- **Generate a Dockerfile:**

  ```bash
  uv run python -m src.main generate-docker --project your_project_name

  # generate full infra for a multi-service repo
  uv run python -m src.main generate-infra /path/to/repo --output infra

  # generate just the Dockerfile for a single service folder
  uv run python -m src.main generate-docker /path/to/repo/service
  ```

- **Generate docker-compose.yml:**

  ```bash
  uv run python -m src.main generate-compose --project your_project_name
  ```
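A generated Dockerfile is, in the end, just text derived from the project's detected language and entry point. As a rough illustration of the shape of the output, here is a hypothetical template-based sketch; the real tool drives generation through an LLM prompt, and the base images and fields below are assumptions for the example:

```python
# Hypothetical fallback template for Dockerfile generation. The real tool
# prompts a local LLM for this; the base images below are illustrative
# assumptions, not the tool's actual defaults.

BASE_IMAGES = {
    "python": "python:3.11-slim",
    "javascript": "node:20-slim",
    "go": "golang:1.22",
}

def render_dockerfile(language: str, cmd: list[str], port: int = 8000) -> str:
    """Render a minimal single-stage Dockerfile for the detected language."""
    lines = [f"FROM {BASE_IMAGES[language]}", "WORKDIR /app", "COPY . ."]
    if language == "python":
        lines.append("RUN pip install --no-cache-dir -r requirements.txt")
    elif language == "javascript":
        lines.append("RUN npm ci")
    quoted = ", ".join(f'"{part}"' for part in cmd)
    lines += [f"EXPOSE {port}", f"CMD [{quoted}]"]
    return "\n".join(lines)

print(render_dockerfile("python", ["python", "main.py"], port=8080))
```

Review any generated Dockerfile before use; LLM output can miss project-specific build steps.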
## Configuration

Edit `src/config.yaml` to customize:
- Model names and versions
- Supported languages and file extensions
- ChromaDB storage directory
- Ollama server URL
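For reference, a config covering those options might look like the following. The key names here are assumptions for illustration; check `src/config.yaml` in your checkout for the actual schema:

```yaml
# Illustrative only: the real keys in src/config.yaml may differ.
models:
  chat: "llama3"
  embedding: "nomic-embed-text"
languages:
  python: [".py"]
  javascript: [".js", ".jsx"]
  typescript: [".ts", ".tsx"]
  go: [".go"]
chroma_dir: "data/chroma_index"
ollama_url: "http://localhost:11434"
```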
## Dependencies
See `pyproject.toml` for the full list of key packages.
## Development

- All source code is in `src/`.
- Prompt templates are in `prompt/`.
- Data and ChromaDB indexes are stored in `data/`.
### Running Tests

TODO: Add unit tests and instructions for running them.

### Inspecting the ChromaDB Index

```bash
uv run python -m src.chroma_manager --db_dir data/chroma_index --list
uv run python -m src.chroma_manager --db_dir data/chroma_index --preview PlotTwister
```
## Troubleshooting

- **Ollama not found:** Ensure Ollama is installed and available in your PATH.
- **tree-sitter language `.so` files missing:** If you encounter errors about missing `.so` files, ensure tree-sitter-languages is installed and built correctly.
- **Model download issues:** The first run will download the required models. Ensure you have a stable internet connection.
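When debugging Ollama connection errors, it helps to confirm the server is actually reachable. Ollama listens on port 11434 by default; a small stdlib check (the URL is the default and may differ in your setup):

```python
# Quick reachability check for a local Ollama server (default port 11434).
from urllib.request import urlopen
from urllib.error import URLError

def ollama_reachable(url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at the given URL."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (URLError, OSError):
        return False

if not ollama_reachable():
    print("Ollama server not reachable; try running `ollama serve`.")
```

If this returns False while `ollama serve` is running, check the Ollama server URL configured in `src/config.yaml`.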
## TODO
- Add comprehensive unit and integration tests.
- Improve error handling and user feedback.
- Add support for more programming languages (e.g., Java, Rust).
- Enhance prompt templates for better infrastructure generation.
- Add web or GUI interface.
- Document API for programmatic usage.
- Support for private model registries and custom LLMs.
- Optimize embedding and retrieval for large codebases.
- Add CI/CD pipeline for automated testing and deployment.
- Generate Terraform artifacts.
## License

This project is licensed under the Apache 2.0 License.

## Acknowledgements

## Contact

For questions or contributions, please open an issue or pull request on GitHub.